This application claims benefit of priority to Japanese Patent Application No. 2017-115510 filed on Jun. 12, 2017, which is hereby incorporated by reference in its entirety.
The present disclosure relates to a user interface device that controls a display on a screen of a display device according to an input operation, and a display control method and a program for the user interface device.
Recently, apparatuses having an input interface, for example a touch pad or a touch panel, for detecting a contact position of an object such as a finger or a pen, have come to be widely used. Japanese Unexamined Patent Application Publication No. 2010-134938 discloses a mobile information apparatus that identifies a type of operation, on the basis of a movement history of the finger contacting the touch panel and, for example, enlarges or reduces a map image according to the type of operation identified.
The apparatuses that include a touch panel as the input interface, like the mobile information apparatus according to the cited document, are advantageous in allowing intuitive operation, compared with apparatuses accompanied with a mouse or the like. However, when a complicated operation has to be performed, a set of mouse and keyboard is often easier to operate than the touch pad or touch panel, and therefore the input interface such as the touch pad or touch panel is desired to be more user-friendly.
A first aspect of the present disclosure relates to a user interface device that controls a display on a screen of a display device, according to a contact on an input surface. The user interface device includes a detector configured to detect a contact position on the input surface and a pressing force applied to the input surface owing to the contact, and a controller configured to control a display on the screen according to a detection result provided by the detector. The controller is configured to identify, as a designated object, at least an object on the screen designated by the contact made on the input surface, on the basis of the detection result of the contact position provided by the detector, select at least one designated object as an object to be moved, according to the pressing force detected by the detector, and move, when the contact position moves, with the object to be moved kept selected, the object to be moved on the screen according to the movement of the contact position.
In the mentioned user interface device, at least an object on the screen, designated by the contact made on the input surface, is identified as the designated object, on the basis of the detection result of the contact position provided by the detector. Then, at least one designated object is selected as the object to be moved, according to the pressing force detected by the detector. Thus, the object to be moved is selected out of the objects on the screen, on the basis of the detection result of the contact position and the pressing force on the input surface. Such an arrangement facilitates the selection of the object to be moved.
A second aspect of the present disclosure relates to a user interface device that controls a display on a screen of a display device, according to a contact on an input surface. The user interface device includes a detector configured to detect a contact position on the input surface and a pressing force applied to the input surface owing to the contact, and a controller configured to control a display on the screen according to a detection result provided by the detector. The controller is configured to move, when the contact position detected by the detector moves, at least part of the objects displayed on the screen according to the movement of the contact position, and change, when moving the at least part of the objects, a relation between an operation stroke corresponding to a movement distance of the contact position and an object travel corresponding to a movement distance of the at least part of the objects, according to the pressing force detected by the detector.
In the mentioned user interface device, when at least part of the objects displayed on the screen is to be moved according to the movement of the contact position, the relation between the operation stroke and the object travel is changed according to the pressing force. When the moving speed of the contact position is constant, the longer the object travel is with respect to the operation stroke, the faster the object moves, and the shorter the object travel is with respect to the operation stroke, the slower the object moves. Therefore, the user can control the moving speed of the object by adjusting the pressing force.
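By way of illustration only, the relation described above can be sketched as a force-dependent gain applied to the operation stroke. The function name and the constants below are hypothetical and do not appear in the disclosure:

```python
# Illustrative sketch (not part of the disclosure): map an operation
# stroke on the input surface to an object travel on the screen, with
# a gain that grows with the detected pressing force.

def object_travel(stroke_px: float, pressing_force: float,
                  base_gain: float = 1.0, force_gain: float = 0.5) -> float:
    """Return the on-screen travel for a given stroke on the input surface.

    A larger pressing force yields a larger gain, so the object moves
    farther for the same stroke (i.e., faster at a constant stroke
    speed); a lighter touch yields finer, slower movement.
    """
    gain = base_gain + force_gain * pressing_force
    return stroke_px * gain

# A firmer press moves the object farther for the same 10-px stroke.
assert object_travel(10.0, 0.0) == 10.0
assert object_travel(10.0, 2.0) == 20.0
```

The linear gain is merely one possible mapping; any monotonic relation between pressing force and gain would realize the behavior described.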
A third aspect of the present disclosure relates to a user interface device that controls a display on a screen of a display device, according to a contact on an input surface. The user interface device includes a detector configured to detect a contact position on the input surface and a pressing force applied to the input surface owing to the contact, and a controller configured to control a display on the screen according to a detection result provided by the detector. The controller is configured to identify, as a designated object, at least an object on the screen designated by the contact made on the input surface, on the basis of a detection result of the contact position provided by the detector, select at least one designated object as an object to be moved, according to the pressing force detected by the detector, and change at least one of a display size of the at least one designated object and displayed details of information accompanying the at least one designated object, according to the pressing force detected by the detector.
In the mentioned user interface device, at least an object on the screen is identified as the designated object, on the basis of the detection result of the contact position provided by the detector. In addition, at least one of the display size of the designated object and the displayed details of the information accompanying the designated object is changed, according to the pressing force. Thus, the display size of the designated object, and/or the displayed details of the accompanying information are changed, on the basis of the contact position and the pressing force on the input surface. Such an arrangement facilitates the changing of the display size of the designated object, and/or the displayed details of the accompanying information.
A fourth aspect of the present disclosure relates to a display control method for controlling a display on a screen of a display device, according to a contact on an input surface. The display control method includes acquiring a detection result from a detector configured to detect a contact position on the input surface and a pressing force applied to the input surface owing to the contact, identifying, as a designated object, at least an object on the screen designated by the contact made on the input surface, on the basis of the detection result of the contact position provided by the detector, selecting at least one designated object as an object to be moved, according to the pressing force detected by the detector, and moving, when the contact position moves, with the object to be moved kept selected, the object to be moved on the screen according to the movement of the contact position.
Hereafter, a user interface device according to a first embodiment will be described with reference to the drawings.
The detector or detection unit 20 detects, when for example a finger of a user contacts the input surface 21, the contact position of the finger on the input surface 21 and the pressing force applied to the input surface 21 owing to the contact.
The pressure sensor 23 serves to detect the pressing force imposed from the input surface 21 through the support member and, for example, includes a piezoelectric element. The pressure sensor 23 is located, for example as shown in
The electrostatic sensor 22 includes, as shown in
The contact position calculation unit 24 detects the change in electrostatic capacitance generated in each of the capacitive sensor elements S of the electrostatic sensor 22, owing to the contact of, for example, a finger on the input surface 21, and calculates the contact position of the finger on the input surface 21, on the basis of the detection result.
For example, the contact position calculation unit 24 sequentially applies a drive voltage to each of the electrodes Ex, and detects a charge supplied to the capacitive sensor element S from the electrode Ey because of the application of the drive voltage, to thereby detect the electrostatic capacitance of the capacitive sensor element S, which is proportional to the charge. The contact position calculation unit 24 decides whether the finger has contacted the input surface 21 with respect to each of a plurality of positions, on the basis of data of a plurality of electrostatic capacitance values detected with respect to the plurality of capacitive sensor elements S. The contact position calculation unit 24 identifies the contact range of the finger on the input surface 21 from the decision results on whether a contact has been made, and calculates the contact position of the finger on the basis of the contact range identified as above. The contact position calculation unit 24 includes, for example, a drive circuit that supplies the drive voltage to the electrostatic sensor 22, a charge amplifier that detects the charge of each capacitive sensor element S, an AD converter that converts an output signal of the charge amplifier to a digital value, and a signal processing circuit (e.g., a computer or a dedicated logic circuit) that calculates the contact position on the basis of the electrostatic capacitance values obtained from the AD converter.
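For illustration, one common way to turn per-element capacitance changes into a contact position is a thresholded weighted centroid. The sketch below is an assumption about how such a calculation might look, not the disclosed circuitry:

```python
# Hypothetical sketch of the contact-position calculation: each
# capacitive sensor element S reports a capacitance change; cells at or
# above a threshold form the contact range, and the contact position is
# taken as the capacitance-weighted centroid of that range.

def contact_position(cap_delta, threshold):
    """cap_delta: 2-D list of capacitance changes, indexed [y][x].
    Returns the (x, y) centroid of above-threshold cells, or None."""
    total = sx = sy = 0.0
    for y, row in enumerate(cap_delta):
        for x, c in enumerate(row):
            if c >= threshold:          # cell belongs to the contact range
                total += c
                sx += c * x
                sy += c * y
    if total == 0.0:
        return None                     # no contact detected
    return (sx / total, sy / total)

grid = [[0, 0, 0],
        [0, 4, 4],
        [0, 0, 0]]
assert contact_position(grid, 1) == (1.5, 1.0)
```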
Although the mentioned electrostatic sensor 22 is configured to detect an approaching object on the basis of a change in electrostatic capacitance taking place between the electrodes (Ex, Ey) (mutual capacitance), the approaching of an object may be detected by different methods. For example, the electrostatic sensor 22 may be based on a self-capacitance method, to detect the electrostatic capacitance generated between the electrode and the ground, when an object comes close.
The detection signal generation unit 25 generates a detection signal indicating the value of the pressing force, on the basis of a physical quantity detected by the pressure sensor 23. The detection signal generation unit 25 includes, for example, a charge amplifier that detects a charge generated by the piezoelectric element of the pressure sensor 23, an AD converter that converts an output signal of the charge amplifier to a digital value, and a signal processing circuit (e.g., a computer or a dedicated logic circuit) that corrects the digital value and generates the detection signal of the pressing force.
The tactile presentation unit 30 presents tactile feeling to the user's finger brought into contact with the input surface 21. The tactile presentation unit 30 includes an actuator, such as a piezoelectric oscillator or a solenoid. In the example shown in
Here, the tactile feeling to be presented by the tactile presentation unit 30 is not limited to the oscillation but, for example, an electrostatic force or heat (warm or cool effect) may be presented as the tactile feeling.
The controller or control unit 40 serves to control the overall operation of the user interface device 1, and includes, for example, a computer that executes processing according to a program 51 (e.g., operating system, application software, and device driver) stored in the storage unit 50. The control unit 40 may also include a dedicated logic circuit configured to execute predetermined processing.
The control unit 40 may utilize the computer to execute all of the processing related to the display control of the screen 11, to be subsequently described, or utilize the dedicated logic circuit to execute at least a part of the processing.
The control unit 40 controls the display on the screen 11, according to the detection result (contact position and pressing force) provided by the detection unit 20. More specifically, the control unit 40 identifies, as a designated object, at least one object on the screen 11 designated by a contact made on the input surface 21, on the basis of the detection result of the contact position of the finger, provided by the detection unit 20. For example, the control unit 40 updates, when the detection unit 20 detects a contact of the finger on the input surface 21, the position of a cursor (pointer) displayed on the screen 11 of the display device 10, according to the detection result of the contact position. In this case, the control unit 40 identifies, as the designated object, an object such as an icon located at the position of the cursor, which is moved on the screen 11 by the contact made on the input surface 21. The control unit 40 may identify the designated object each time the position of the cursor is updated, or when the cursor remains at a given position for a predetermined time or longer. Alternatively, the control unit 40 may identify the designated object when the pressing force detected by the detection unit 20 is larger than a predetermined threshold.
The control unit 40 also selects the designated object as an object to be moved, according to the pressing force detected by the detection unit 20. For example, when the pressing force detected by the detection unit 20 is larger than the predetermined threshold, the control unit 40 selects the designated object identified on the basis of the contact position, as the object to be moved. When the contact position detected by the detection unit 20 moves, with at least one designated object kept selected as the object to be moved, the control unit 40 moves such object to be moved on the screen 11, according to the movement of the contact position.
The control unit 40 may identify a plurality of objects on the screen 11 as the designated object, on the basis of the detection result of the contact position of the finger, detected by the detection unit 20. The control unit 40 selects at least one designated object as the object to be moved out of the plurality of designated objects, according to the pressing force detected by the detection unit 20. For example, the control unit 40 increases the number of the designated objects to be selected as the object to be moved out of the plurality of designated objects, with an increase in the pressing force detected by the detection unit 20. More specifically, when the plurality of designated objects identified as above overlap on the screen 11, the control unit 40 expands the range of the designated objects to be selected as the object to be moved, from the designated object on the front side toward another one on the rear side, with the increase in the pressing force detected by the detection unit 20.
When at least one designated object is selected as the object to be moved, the control unit 40 controls the tactile presentation unit 30 so as to present continuous tactile feeling, to notify that the object to be moved has been selected. For example, the control unit 40 controls the tactile presentation unit 30 so as to present heavier tactile feeling as a larger number of designated objects are selected as the object to be moved. More specifically, the control unit 40 reduces the frequency, and increases the amplitude, of the oscillation transmitted as the tactile feeling, with the increase in the number of designated objects selected as the object to be moved. The control unit 40 may control the frequency and amplitude of the oscillation, for example by selectively driving one or more oscillators, out of a plurality of oscillators provided in the tactile presentation unit 30.
When at least one designated object is selected as the object to be moved, the control unit 40 may deselect the designated object as the object to be moved, depending on the pressing force detected by the detection unit 20. For example, when at least one designated object is selected as the object to be moved, the control unit 40 deselects the designated object as the object to be moved, in the case where the pressing force detected by the detection unit 20 is below a predetermined threshold.
In addition, when at least one designated object is selected as the object to be moved, the control unit 40 deselects the designated object as the object to be moved, in the case where the detection unit 20 stops detecting the contact position.
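The two deselection rules above can be summarized in a small predicate. This is an illustrative sketch with assumed names; the threshold value is hypothetical:

```python
# Sketch of the deselection rules described above: the selection of the
# object to be moved is kept only while a contact is still detected AND
# the pressing force stays at or above the threshold.

def keep_selection(contact_detected: bool, pressing_force: float,
                   a1: float = 1.0) -> bool:
    """Return True if the current object(s) to be moved stay selected."""
    return contact_detected and pressing_force >= a1

assert keep_selection(True, 1.5)
assert not keep_selection(True, 0.2)    # force dropped below the threshold
assert not keep_selection(False, 1.5)   # finger lifted off the surface
```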
The storage unit 50 stores therein the program 51 configured to cause the computer of the control unit 40 to execute the processing, and data to be used for the processing executed by the control unit 40. The storage unit 50 includes, for example, volatile memories such as a DRAM and an SRAM, non-volatile memories such as a flash memory, and a hard disk.
The program 51 may be downloaded from an external apparatus (e.g., server apparatus) through a non-illustrated communication interface, or inputted from a physical non-transitory medium (e.g., optical disk and USB memory), through a non-illustrated input device.
Hereunder, an operation of the user interface device 1 configured as above according to the first embodiment will be described.
First, the control unit 40 acquires the detection result of the contact position and the pressing force on the input surface 21, from the detection unit 20 (ST100). Upon acquiring the detection result from the detection unit 20, the control unit 40 selects the object to be moved out of the objects displayed on the screen 11, and also deselects the object as the object to be moved, on the basis of the detection result (ST105). Further details of step ST105 will be subsequently described, with reference to
After selecting or deselecting the object to be moved, the control unit 40 updates, in the case where any object to be moved remains selected through the previous and the current process (Yes at ST110), the position of such object to be moved (ST120). For example, the control unit 40 calculates a direction and a distance, in and by which the contact position has moved on the input surface 21, on the basis of the previously detected contact position on the input surface 21 and the currently detected contact position on the input surface 21. The control unit 40 calculates a coordinate on the screen 11 to which the object to be moved is supposed to move, on the basis of the direction and the distance in and by which the contact position has moved, and moves the object to be moved to the coordinate.
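The position update of step ST120 can be sketched as follows. The data layout and function name are assumptions made for illustration only:

```python
# Sketch of step ST120 as described above: the displacement of the
# contact position between the previous and the current detection is
# applied to the screen coordinates of each selected object.

def move_objects(objects, prev_contact, curr_contact):
    """objects: list of dicts with 'x'/'y' screen coordinates.
    Shifts each object by the contact-position displacement."""
    dx = curr_contact[0] - prev_contact[0]
    dy = curr_contact[1] - prev_contact[1]
    for obj in objects:
        obj["x"] += dx
        obj["y"] += dy
    return objects

# The contact moved by (+4, -3), so the object does too.
moved = move_objects([{"x": 100, "y": 50}], (10, 10), (14, 7))
assert moved == [{"x": 104, "y": 47}]
```

A real implementation would additionally scale the displacement from input-surface coordinates to screen coordinates; that scaling is omitted here.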
The control unit 40 decides whether a contact has been made on the input surface 21, on the basis of the detection result of the contact position provided by the detection unit 20 (ST200). In the case where a contact has been made on the input surface 21 (Yes at ST200), the control unit 40 checks whether any designated object has been selected as the object to be moved (ST205). In the case where a designated object has been selected as the object to be moved (Yes at ST205), the control unit 40 proceeds to steps ST235 and ST250.
In the case where no designated object has been selected as the object to be moved (No at ST205), the control unit 40 identifies the object on the screen 11 designated by the contact made on the input surface 21, as the designated object (ST210). For example, the control unit 40 identifies the object located at the position overlapping the cursor (pointer) indicating the pointed object, as the designated object. When a plurality of objects are located at the position overlapping the cursor, the control unit 40 may identify each of the plurality of objects as the designated object.
In the case where a touch panel, in which the screen 11 of the display device 10 and the input surface 21 of the detection unit 20 are integrated, is employed, the control unit 40 may identify, for example, an object displayed at the contact position as the designated object.
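The identification of step ST210 amounts to hit-testing the objects against the cursor position. The following sketch assumes a simple bounding-box representation and a z-order field, neither of which is specified in the disclosure:

```python
# Hypothetical sketch of step ST210: every object whose bounding box
# contains the cursor position is identified as a designated object.
# Results are returned front to back (smaller 'z' = nearer the front).

def identify_designated(objects, cursor):
    """objects: list of dicts with 'rect' = (x, y, w, h) and 'z'."""
    cx, cy = cursor
    hits = [o for o in objects
            if o["rect"][0] <= cx < o["rect"][0] + o["rect"][2]
            and o["rect"][1] <= cy < o["rect"][1] + o["rect"][3]]
    return sorted(hits, key=lambda o: o["z"])   # frontmost first

objs = [{"name": "window", "rect": (0, 0, 100, 100), "z": 2},
        {"name": "pattern", "rect": (10, 10, 20, 20), "z": 1}]
names = [o["name"] for o in identify_designated(objs, (15, 15))]
assert names == ["pattern", "window"]   # both overlap the cursor
```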
The control unit 40 then decides whether any designated object (object designated by the contact made on the input surface 21) has been identified at step ST210 (ST215). In the case where a designated object has been identified at step ST210 (Yes at ST215), the control unit 40 proceeds to steps ST235 and ST250. In contrast, in the case where no designated object has been identified at step ST210 (No at ST215), the control unit 40 finishes the operation instead of proceeding to steps ST235 and ST250, because the control unit 40 is unable to select or deselect the object to be moved.
At step ST235, the control unit 40 selects and deselects the object to be moved, according to the detection result of the pressing force provided by the detection unit 20. Further details of step ST235 will be subsequently described, with reference to
After step ST235, the control unit 40 controls the tactile presentation unit 30 so as to present the tactile feeling that matches the number of designated objects selected as the object to be moved (ST250). Further details of step ST250 will be subsequently described, with reference to
Upon deciding at step ST200 that no contact has been made on the input surface 21 (No at ST200), the control unit 40 checks whether any designated object has been selected as the object to be moved (ST255). In the case where no designated object has been selected as the object to be moved (No at ST255), the control unit 40 finishes the operation. In contrast, in the case where a designated object has been selected as the object to be moved (Yes at ST255), the control unit 40 deselects such designated object as the object to be moved (ST260), and causes the tactile presentation unit 30 to stop presenting the tactile feeling (ST265). Therefore, the object on the screen 11 can be deselected as the object to be moved, simply by stopping touching the input surface 21.
Although the flowchart of
The control unit 40 compares the pressing force detected by the detection unit 20 with a threshold A1 (ST300). A code “F” in the flowchart of
When the pressing force F is equal to or larger than the threshold A1 (No at ST300), the control unit 40 compares the pressing force F with a threshold A2 (A2>A1) (ST320). When the pressing force F is smaller than the threshold A2 (Yes at ST320), the control unit 40 proceeds to a “first mode” (ST330). In the first mode, the control unit 40 selects the frontmost designated object as the object to be moved, out of the designated objects (objects designated by the contact made on the input surface 21) identified at step ST210 (ST335). In the case where, for example, one designated object has been identified at step ST210, the control unit 40 selects the one designated object as the object to be moved. In the case where two or more designated objects have been identified at step ST210, the control unit 40 selects the frontmost designated object as the object to be moved, but not the remaining designated objects.
When the pressing force F is equal to or larger than the threshold A2 (No at ST320), the control unit 40 compares the pressing force F with a threshold A3 (A3>A2) (ST340). When the pressing force F is smaller than the threshold A3 (Yes at ST340), the control unit 40 proceeds to a “second mode” (ST350). In the second mode, the control unit 40 selects the frontmost and second frontmost designated objects as the object to be moved, out of the designated objects identified at step ST210 (ST355). In the case where, for example, one designated object has been identified at step ST210, the control unit 40 selects the one designated object as the object to be moved. In the case where two designated objects have been identified at step ST210, the control unit 40 selects the two designated objects as the object to be moved. In the case where three or more designated objects have been identified at step ST210, the control unit 40 selects the frontmost and second frontmost designated objects as the object to be moved, but not the remaining designated objects.
When the pressing force F is equal to or larger than the threshold A3 (No at ST340), the control unit 40 proceeds to a "third mode" (ST370). In the third mode, the control unit 40 selects all the designated objects identified at step ST210, as the object to be moved (ST375).
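The non-selection, first, second, and third modes described above reduce to comparing the pressing force F against the thresholds A1 < A2 < A3 and taking a front-to-back prefix of the designated objects. The following is an illustrative sketch; the function name, list ordering, and threshold values are assumptions:

```python
# Sketch of the mode selection at steps ST300-ST375: designated objects
# are ordered frontmost first, and a larger pressing force selects a
# deeper range starting from the front.

def select_objects_to_move(designated_front_to_back, pressing_force,
                           thresholds=(1.0, 2.0, 3.0)):
    """thresholds = (A1, A2, A3): F < A1 selects nothing (non-selection
    mode); A1 <= F < A2 selects one object (first mode); A2 <= F < A3
    selects two (second mode); F >= A3 selects all (third mode)."""
    a1, a2, a3 = thresholds
    if pressing_force < a1:
        return []                                   # non-selection mode
    if pressing_force < a2:
        return designated_front_to_back[:1]         # first mode
    if pressing_force < a3:
        return designated_front_to_back[:2]         # second mode
    return list(designated_front_to_back)           # third mode

objs = ["pattern", "front window", "rear window"]
assert select_objects_to_move(objs, 0.5) == []
assert select_objects_to_move(objs, 1.5) == ["pattern"]
assert select_objects_to_move(objs, 9.0) == objs
```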
Three objects 201 to 203 are located at the position overlapping a cursor 101. The three objects 201 to 203 are each identified as the designated object. The objects 202 and 203 are windows, and the object 201 is a pattern located inside the object 202 (window). The object 201 (pattern) is at the frontmost position, the object 202 (window) is at the second frontmost position, and the object 203 (window) is at the rearmost position.
The control unit 40 decides the number of designated objects selected as the object to be moved (ST400, ST410, and ST420). In the case where no designated object has been selected as the object to be moved (Yes at ST400), the control unit 40 causes the tactile presentation unit 30 to stop presenting the tactile feeling (ST405). In the case where one designated object has been selected as the object to be moved (Yes at ST410), the control unit 40 causes the tactile presentation unit 30 to present relatively light tactile feeling (ST415). In the case where two designated objects have been selected as the object to be moved (Yes at ST420), the control unit 40 causes the tactile presentation unit 30 to present medium tactile feeling (ST425). The medium tactile feeling (ST425) is lower in oscillation frequency and larger in oscillation amplitude than the light tactile feeling (ST415). In the case where three or more designated objects have been selected as the object to be moved (No at ST400, ST410, and ST420), the control unit 40 causes the tactile presentation unit 30 to present heavy tactile feeling (ST430). The heavy tactile feeling (ST430) is lower in oscillation frequency and larger in oscillation amplitude than the medium tactile feeling (ST425).
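Steps ST400 to ST430 map the number of selected objects to oscillation parameters. A minimal sketch follows; the specific frequency and amplitude values are invented for illustration and only their ordering (lower frequency, larger amplitude for heavier feeling) reflects the description above:

```python
# Sketch of steps ST400-ST430: the oscillation presented as tactile
# feeling gets lower in frequency and larger in amplitude as more
# designated objects are selected. Values are illustrative only.

def tactile_params(num_selected: int):
    """Return (frequency_hz, amplitude) for the oscillator, or None
    when no object to be moved is selected (tactile feeling stops)."""
    if num_selected == 0:
        return None                    # ST405: stop presenting
    if num_selected == 1:
        return (200.0, 0.2)            # ST415: light tactile feeling
    if num_selected == 2:
        return (120.0, 0.5)            # ST425: medium tactile feeling
    return (60.0, 1.0)                 # ST430: heavy tactile feeling

assert tactile_params(0) is None
light, medium = tactile_params(1), tactile_params(2)
assert light[0] > medium[0] and light[1] < medium[1]
```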
As described thus far, in the user interface device 1 according to the first embodiment, the object on the screen 11 designated by the contact made on the input surface 21 is identified as the designated object, on the basis of the detection result of the contact position of the finger or the like, provided by the detection unit 20. In addition, the identified designated object is selected as the object to be moved, according to the pressing force detected by the detection unit 20. Thus, the object to be moved is selected out of the objects on the screen 11, on the basis of the detection result of the contact position and the pressing force on the input surface 21. The mentioned arrangement enables the object to be moved to be selected through an operation as simple as touching and pressing the input surface 21, thereby significantly facilitating the selection of the object to be moved, and improving the user-friendliness.
In the user interface device 1 according to the first embodiment, at least one designated object is selected as the object to be moved, out of the plurality of designated objects, according to the pressing force detected by the detection unit 20. Such an arrangement enables the object to be moved to be selected out of the plurality of designated objects, through an operation as simple as adjusting the pressing force, thereby improving the user-friendliness.
In the user interface device 1 according to the first embodiment, the number of the designated objects to be selected as the object to be moved, out of the plurality of designated objects, is increased with the increase in the pressing force detected by the detection unit 20. Accordingly, the number of objects to be moved is increased, with the increase in the pressing force applied to the input surface 21. Such an arrangement simplifies the operation to select the object to be moved out of the plurality of designated objects, thereby improving the user-friendliness.
In the user interface device 1 according to the first embodiment, the designated object at the frontmost position, among the plurality of designated objects overlapping each other on the screen 11, is selected as the object to be moved, when the pressing force is relatively small. As the pressing force increases, the selection range is expanded from the designated object at the frontmost position toward the designated objects at the rear position. Accordingly, the number of objects to be moved overlapping each other is increased, with the increase in the pressing force applied to the input surface 21. Such an arrangement simplifies the operation to select the object to be moved out of the plurality of designated objects overlapping on the screen 11, thereby improving the user-friendliness.
With the user interface device 1 according to the first embodiment, at least one designated object is deselected as the object to be moved, according to the pressing force detected by the detection unit 20, and therefore the deselection as the object to be moved can be easily performed.
With the user interface device 1 according to the first embodiment, at least one designated object is deselected as the object to be moved, by making the pressing force detected by the detection unit 20 smaller than the threshold A1, and therefore the deselection as the object to be moved can be easily performed.
With the user interface device 1 according to the first embodiment, at least one designated object is deselected as the object to be moved by stopping touching the input surface 21, and therefore the deselection as the object to be moved can be easily performed.
With the user interface device 1 according to the first embodiment, the user can perceive whether at least one object on the screen 11 has been selected (not in the non-selection mode), depending on whether the tactile presentation unit 30 is presenting the continuous tactile feeling. Such an arrangement enables the user to perceive the situation through the tactile feeling, without the need to constantly watch the objects on the screen 11, thereby making the operation to select the object to be moved more comfortable.
Hereunder, some variations of the user interface device 1 according to the first embodiment will be described.
The flowchart of
In the first mode (ST330), the control unit 40 selects the designated object smallest in area, as the object to be moved (ST336), out of the designated objects identified at step ST210. In the case where, for example, two or more designated objects have been identified at step ST210, the control unit 40 selects the designated object smallest in area as the object to be moved, but not the remaining designated objects.
In the second mode (ST350), the control unit 40 selects the designated objects smallest and second smallest in area, as the object to be moved (ST356), out of the designated objects identified at step ST210. In the case where, for example, three or more designated objects have been identified at step ST210, the control unit 40 selects the designated objects smallest and second smallest in area as the object to be moved, but not the remaining designated objects.
In
With the mentioned variation, the designated object smallest in area is selected as the object to be moved, when the pressing force is relatively small. As the pressing force increases, the selection range is expanded from the designated object smallest in area toward the designated objects larger in area. Accordingly, the area of the object to be selected as the object to be moved is increased, with the increase in the pressing force applied to the input surface 21. Such an arrangement simplifies the operation to select the object to be moved out of the plurality of designated objects that are different in area, thereby improving the user-friendliness.
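This area-based variant replaces the front-to-back ordering with an ordering by area. As a sketch only, with assumed names and threshold values:

```python
# Sketch of the area-based variant (steps ST336/ST356): designated
# objects are ranked by area, and a larger pressing force expands the
# selection from the smallest object toward the larger ones.

def select_by_area(designated, pressing_force, thresholds=(1.0, 2.0, 3.0)):
    """designated: list of dicts with an 'area' key."""
    a1, a2, a3 = thresholds
    by_area = sorted(designated, key=lambda o: o["area"])
    if pressing_force < a1:
        return []                # non-selection mode
    if pressing_force < a2:
        return by_area[:1]       # first mode: smallest object only
    if pressing_force < a3:
        return by_area[:2]       # second mode: smallest two objects
    return by_area               # third mode: all designated objects

objs = [{"name": "big", "area": 900}, {"name": "small", "area": 100}]
assert [o["name"] for o in select_by_area(objs, 1.5)] == ["small"]
assert [o["name"] for o in select_by_area(objs, 9.0)] == ["small", "big"]
```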
First, a difference between the flowchart of
The flowchart of
In addition, the flowchart of
According to the flowchart of
By the method according to the flowchart of
In this variation, the control unit 40 repeatedly decides which of the plurality of conditions (first condition, second condition, and third condition), corresponding to the respective selection criteria (first mode, second mode, and third mode) is satisfied. The control unit 40 also counts, when no object to be moved has been selected (in the non-selection mode), the number of times that the condition has been decided to be satisfied, as “the number of decision-making times”, with respect to each of the plurality of conditions regarding the pressing force F. When the number of decision-making times counted with respect to a given condition regarding the pressing force F exceeds a predetermined number of times (first number of decisions), the control unit 40 selects at least one designated object as the object to be moved, according to the selection criterion corresponding to that condition. When none of the three conditions regarding the pressing force F are satisfied, in other words when the pressing force is smaller than the threshold A1, the control unit 40 proceeds to the non-selection mode in which no object to be moved is selected, and resets the number of decision-making times counted with respect to each of the conditions, to an initial value.
In this variation, further, when the number of decision-making times counted with respect to a given condition regarding the pressing force F exceeds a predetermined number of times, equal to or fewer than the first number of decisions (second number of decisions), the control unit 40 resets the number of decision-making times counted with respect to the remaining conditions, to the initial value.
Referring to
The control unit 40 compares the pressing force detected by the detection unit 20 with the threshold A1 (ST300). When the pressing force F is smaller than the threshold A1 (Yes at ST300), the control unit 40 proceeds to the “non-selection mode” (ST310). In the non-selection mode, the control unit 40 does not select the object to be moved (ST315). In this case, in addition, the control unit 40 resets the number of decision-making times CT1 counted with respect to the first condition, the number of decision-making times CT2 counted with respect to the second condition, and the number of decision-making times CT3 counted with respect to the third condition, to the initial value (e.g., zero) (ST301).
When the pressing force F satisfies the first condition “A1≤F<A2” (No at ST300, Yes at ST320), the control unit 40 decides whether the non-selection mode is set (ST321), and performs the operation of steps ST322 to ST325, ST330, and ST335, in the case where the non-selection mode is set (Yes at ST321). In the case where the non-selection mode is not set (No at ST321), the control unit 40 skips the operation of steps ST322 to ST325, ST330, and ST335, and maintains the current mode.
At step ST322, the control unit 40 increments the number of decision-making times CT1 for the first condition (e.g., increases the value by 1). Upon incrementing the number of decision-making times CT1, the control unit 40 compares the number of decision-making times CT1 with a second number of decisions M1 (ST323). In the case where the number of decision-making times CT1 is larger than the second number of decisions M1 (Yes at ST323), the control unit 40 resets the number of decision-making times CT2 counted with respect to the second condition, and the number of decision-making times CT3 counted with respect to the third condition, to the initial value (ST324).
After steps ST323 and ST324, the control unit 40 compares the number of decision-making times CT1 with a first number of decisions N1 (ST325). The first number of decisions N1 has a value equal to or larger than the second number of decisions M1. In the case where the number of decision-making times CT1 is larger than the first number of decisions N1 (Yes at ST325), the control unit 40 proceeds to the first mode (ST330). In the first mode, the control unit 40 selects the frontmost designated object as the object to be moved, out of the designated objects identified at step ST210 (
When the pressing force F satisfies the second condition “A2≤F<A3” (No at ST320, Yes at ST340), the control unit 40 decides whether the non-selection mode is set (ST341), and performs the operation of steps ST342 to ST345, ST350, and ST355, in the case where the non-selection mode is set (Yes at ST341). In the case where the non-selection mode is not set (No at ST341), the control unit 40 skips the operation of steps ST342 to ST345, ST350, and ST355, and maintains the current mode.
At step ST342, the control unit 40 increments the number of decision-making times CT2 for the second condition. Upon incrementing the number of decision-making times CT2, the control unit 40 compares the number of decision-making times CT2 with a second number of decisions M2 (ST343). In the case where the number of decision-making times CT2 is larger than the second number of decisions M2 (Yes at ST343), the control unit 40 resets the number of decision-making times CT1 counted with respect to the first condition, and the number of decision-making times CT3 counted with respect to the third condition, to the initial value (ST344).
After steps ST343 and ST344, the control unit 40 compares the number of decision-making times CT2 with a first number of decisions N2 (ST345). The first number of decisions N2 has a value equal to or larger than the second number of decisions M2. In the case where the number of decision-making times CT2 is larger than the first number of decisions N2 (Yes at ST345), the control unit 40 proceeds to the second mode (ST350). In the second mode, the control unit 40 selects the frontmost and the second frontmost designated objects as the object to be moved, out of the designated objects identified at step ST210 (
When the pressing force F satisfies the third condition “A3≤F” (No at ST340), the control unit 40 decides whether the non-selection mode is set (ST361), and performs the operation of steps ST362 to ST365, ST370, and ST375, in the case where the non-selection mode is set (Yes at ST361). In the case where the non-selection mode is not set (No at ST361), the control unit 40 skips the operation of steps ST362 to ST365, ST370, and ST375, and maintains the current mode.
At step ST362, the control unit 40 increments the number of decision-making times CT3 for the third condition. Upon incrementing the number of decision-making times CT3, the control unit 40 compares the number of decision-making times CT3 with a second number of decisions M3 (ST363). In the case where the number of decision-making times CT3 is larger than the second number of decisions M3 (Yes at ST363), the control unit 40 resets the number of decision-making times CT1 counted with respect to the first condition, and the number of decision-making times CT2 counted with respect to the second condition, to the initial value (ST364).
After steps ST363 and ST364, the control unit 40 compares the number of decision-making times CT3 with a first number of decisions N3 (ST365). The first number of decisions N3 has a value equal to or larger than the second number of decisions M3. In the case where the number of decision-making times CT3 is larger than the first number of decisions N3 (Yes at ST365), the control unit 40 proceeds to the third mode (ST370). In the third mode, the control unit 40 selects all of the designated objects identified at step ST210 (
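The counter-based debouncing of steps ST300 to ST375 can be sketched as follows. This is one illustrative reading of the flow, not the specification's implementation; the thresholds A1 to A3 and the counts M1 to M3 and N1 to N3 are assumed values.

```python
# Illustrative sketch (not from the specification) of the debounced mode
# selection: a mode is entered only after its condition has been decided
# to be satisfied more than N times, and the other counters are reset once
# one counter exceeds M (M <= N).

A1, A2, A3 = 0.5, 1.0, 1.5   # hypothetical force thresholds
N = [3, 3, 3]                # first numbers of decisions N1..N3
M = [2, 2, 2]                # second numbers of decisions M1..M3

class ModeSelector:
    def __init__(self):
        self.mode = "non-selection"
        self.ct = [0, 0, 0]  # decision counts CT1..CT3

    def update(self, force):
        if force < A1:                    # Yes at ST300
            self.mode = "non-selection"   # ST310/ST315
            self.ct = [0, 0, 0]           # ST301: reset all counters
            return self.mode
        if self.mode != "non-selection":  # No at ST321/ST341/ST361
            return self.mode              # maintain the current mode
        i = 0 if force < A2 else 1 if force < A3 else 2
        self.ct[i] += 1                   # ST322/ST342/ST362
        if self.ct[i] > M[i]:             # ST323/ST343/ST363
            for j in range(3):            # reset the other counters
                if j != i:
                    self.ct[j] = 0
        if self.ct[i] > N[i]:             # ST325/ST345/ST365
            self.mode = ("first", "second", "third")[i]
        return self.mode

sel = ModeSelector()
for f in (0.7, 0.7, 0.7, 0.7):  # four consecutive decisions with A1 <= F < A2
    mode = sel.update(f)
print(mode)  # the first mode is entered once CT1 exceeds N1
```

Because the counters of the competing conditions are cleared as soon as one count passes M, a fluctuating force cannot drive two counters past their N thresholds in close succession.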
With the mentioned variation, in order for the object to be moved to be selected according to one of the selection criteria, the number of decision-making times (CT1, CT2, CT3) counted for the condition corresponding to that selection criterion has to exceed the first number of decisions (N1, N2, N3). Therefore, the selection criteria are prevented from switching at short time intervals, even when the decision result on the conditions related to the pressing force F varies at short time intervals.
With the mentioned variation, in addition, when the number of decision-making times (CT1, CT2, or CT3) counted with respect to a given condition exceeds the second number of decisions (M1, M2, M3), which is equal to or fewer than the first number of decisions (N1, N2, N3), the numbers of decision-making times counted with respect to the remaining conditions are reset to the initial value. Accordingly, even when the numbers of decision-making times for the respective conditions each increase owing to the variation of the pressing force F, the counts for the conditions other than the one whose count has first exceeded the second number of decisions are prevented from exceeding the first number of decisions. For example, when the number of decision-making times CT1 for the first condition and the number of decision-making times CT2 for the second condition are each increasing, and CT2 is the first to exceed the second number of decisions M2, the numbers of decision-making times CT1 and CT3 are reset to the initial value (e.g., zero). Therefore, CT1 is restricted from exceeding the first number of decisions N1, and CT3 is restricted from exceeding the first number of decisions N3. Thus, even when the decision result on the conditions fluctuates along with the pressing force F, the count for one condition readily exceeds the first number of decisions earlier than the counts for the remaining conditions, and consequently the selection criterion with respect to the object to be moved can be stably established.
In another variation of this embodiment, when the number of decision-making times (CT1, CT2, CT3) counted with respect to a given condition exceeds the second number of decisions (M1, M2, M3) equal to or fewer than the first number of decisions (N1, N2, N3), the control unit 40 may decrease, rather than reset, the numbers of decision-making times counted with respect to the remaining conditions. Such an arrangement also helps the number of decision-making times for a given condition exceed the first number of decisions earlier than the numbers counted for the remaining conditions.
Further, the control unit 40 may employ an output of a timer circuit, as the count value of the number of decision-making times (CT1, CT2, CT3). In other words, the control unit 40 may use the count value incremented by the timer circuit at predetermined time intervals, from a time point where it is decided that one of the first to the third conditions is satisfied, as the number of decision-making times (CT1, CT2, CT3). The number of decision-making times (CT1, CT2, CT3) thus counted may be approximately regarded as the number of decision-making times counted when the decision on which of the first to the third conditions is satisfied is made, at the predetermined time intervals.
At step ST236, the control unit 40 selects the object to be moved according to the pressing force, but does not deselect the object to be moved according to the pressing force. The control unit 40 deselects the object to be moved at step ST260, reached when the contact on the input surface 21 is suspended. In other words, upon selecting the object to be moved according to the pressing force, the control unit 40 maintains the selection of the object to be moved, until the contact on the input surface 21 is suspended.
When the pressing force F is smaller than the threshold A1 at step ST300 (Yes at ST300), the control unit 40 enters the non-selection mode in the case where none of the first to the third modes is set (No at ST306), but proceeds to step ST320 in the case where one of the first to the third modes is set (Yes at ST306). Then, when the pressing force F is smaller than the threshold A2 at step ST320 (Yes at ST320), the control unit 40 enters the first mode in the case where neither the second nor the third mode is set (No at ST326), but proceeds to step ST340 in the case where one of the second and the third modes is set (Yes at ST326). Further, when the pressing force F is smaller than the threshold A3 at step ST340 (Yes at ST340), the control unit 40 enters the second mode in the case where the third mode is not set (No at ST346), but remains in the third mode in the case where the third mode is set (Yes at ST346). Thus, once a mode that selects a larger number of objects to be moved has been entered by applying a larger pressing force, that mode is maintained even when the pressing force is reduced thereafter. Therefore, the objects to be moved are prevented from being unintentionally deselected.
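The "sticky" mode behavior of steps ST306, ST326, and ST346 reduces to keeping whichever of the current mode and the force-implied mode selects more objects. A minimal sketch, assuming the same hypothetical thresholds A1 to A3 as above:

```python
# Illustrative sketch (not from the specification): once a mode selecting
# more objects has been entered, a reduced pressing force does not demote
# it; deselection happens only when the contact is released (ST260).

A1, A2, A3 = 0.5, 1.0, 1.5  # hypothetical force thresholds

ORDER = ["non-selection", "first", "second", "third"]

def next_mode(current, force):
    """Mode after one decision cycle, given the current mode and force F."""
    if force < A1:
        wanted = "non-selection"
    elif force < A2:
        wanted = "first"
    elif force < A3:
        wanted = "second"
    else:
        wanted = "third"
    # ST306/ST326/ST346: keep whichever mode selects more objects
    return max(current, wanted, key=ORDER.index)

print(next_mode("third", 0.2))   # a light press does not leave the third mode
print(next_mode("first", 2.0))   # a strong press still promotes the mode
```

Promotion to a higher mode remains possible at any time; only demotion is suppressed until the finger leaves the input surface.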
With the mentioned variation, when a larger number of objects to be moved are selected by increasing the pressing force, the selection of the objects to be moved is maintained despite the pressing force being reduced thereafter. Therefore, a plurality of objects can be collectively moved easily, with a small pressing force.
Upon selecting at least one designated object as the object to be moved, the control unit 40 controls the tactile presentation unit 30 so as to present a temporary tactile feeling for notifying that the selection has been made. To be more detailed, upon selecting a new object to be moved at step ST235 (Yes at ST240), where the object to be moved is selected and deselected according to the pressing force, the control unit 40 controls the tactile presentation unit 30 so as to present a temporary tactile feeling (e.g., temporary oscillation) for notifying that the new object to be moved has been selected (ST245).
The arrangement according to the mentioned variation enables the user to perceive that the new object to be moved has been selected, with the temporary tactile feeling. Therefore, the user can perceive the situation through the tactile feeling, without the need to constantly watch the objects on the screen 11. Thus, the operation to select the object to be moved can be more comfortably performed.
Hereafter, the user interface device 1 according to a second embodiment will be described. In the user interface device 1 according to the second embodiment, the moving speed of the object is changed according to the pressing force. The configuration of the user interface device 1 according to the second embodiment is generally the same as that of the user interface device 1 according to the first embodiment shown in
When the contact position detected by the detection unit 20 moves, the control unit 40 moves at least a part of the objects displayed on the screen 11, according to the movement of the contact position. When moving the object on the screen 11 according to the movement of the contact position, the control unit 40 changes the relation between an operation stroke L and an object travel M, according to the pressing force detected by the detection unit 20. The operation stroke L corresponds to a movement distance of the contact position on the input surface 21, and the object travel M corresponds to a movement distance of the object on the screen 11.
For example, the control unit 40 determines the object travel M with respect to the operation stroke L, so that a ratio M/L of the object travel M to the operation stroke L becomes a predetermined value. When the pressing force detected by the detection unit 20 varies, the control unit 40 changes the ratio M/L according to the change of the pressing force.
The control unit 40 reduces the object travel M with respect to a certain fixed operation stroke L, with an increase in the pressing force detected by the detection unit 20. In other words, the control unit 40 decreases the ratio M/L, with the increase in the pressing force.
First, the control unit 40 acquires a detection result of the contact position and the pressing force on the input surface 21, from the detection unit 20 (ST500). Upon acquiring the detection result from the detection unit 20, the control unit 40 selects, or deselects, the object to be moved out of the objects displayed on the screen 11, on the basis of the detection result (ST505).
At step ST505, the control unit 40 selects and deselects the object to be moved, for example in the same manner as step ST105 (
Alternatively, the control unit 40 may select and deselect the object to be moved by a different method, instead of utilizing the detection result of the pressing force. For example, the control unit 40 may identify an object on the screen 11 as the designated object in the same manner as step ST210 (
After selecting or deselecting the object to be moved, the control unit 40 updates, in the case where any object to be moved remains selected through the previous and the current process (Yes at ST510), the position of such object to be moved (ST525). For example, the control unit 40 calculates a direction and a distance, in and by which the contact position has moved on the input surface 21, on the basis of the previously detected contact position on the input surface 21 and the currently detected contact position on the input surface 21. The control unit 40 calculates a coordinate on the screen 11 to which the object to be moved is supposed to move, on the basis of the direction and the distance in and by which the contact position has moved, and moves the object to be moved to the coordinate.
To update the position of the object to be moved at step ST525, the control unit 40 determines the relation between the operation stroke L and the object travel M, according to the pressing force F (ST515). The control unit 40 calculates the coordinate on the screen 11 to which the object to be moved is supposed to move, according to the relation between the operation stroke L and the object travel M determined at step ST515 (ST525).
The control unit 40 compares the pressing force detected by the detection unit 20 with a threshold B1 (ST600). When the pressing force F is smaller than the threshold B1 (Yes at ST600), the control unit 40 sets a “normal speed”, by adjusting the value of the ratio M/L to “K0” (ST605). The value “K0” is larger than “K1” to “K4” to be subsequently referred to. In the normal speed, the speed of the object with respect to a fixed speed of the contact position on the input surface 21 (hereinafter simply the “object speed”) is fastest.
When the pressing force F is equal to or larger than the threshold B1 (No at ST600), the control unit 40 compares the pressing force F with a threshold B2 (B2>B1) (ST610). When the pressing force F is smaller than the threshold B2 (Yes at ST610), the control unit 40 sets a “first speed”, by adjusting the value of the ratio M/L to “K1” (K1<K0) (ST615). In the first speed, the object speed is second fastest.
When the pressing force F is equal to or larger than the threshold B2 (No at ST610), the control unit 40 compares the pressing force F with a threshold B3 (B3>B2) (ST620). When the pressing force F is smaller than the threshold B3 (Yes at ST620), the control unit 40 sets a “second speed”, by adjusting the value of the ratio M/L to “K2” (K2<K1) (ST625). In the second speed, the object speed is third fastest.
When the pressing force F is equal to or larger than the threshold B3 (No at ST620), the control unit 40 compares the pressing force F with a threshold B4 (B4>B3) (ST630). When the pressing force F is smaller than the threshold B4 (Yes at ST630), the control unit 40 sets a “third speed”, by adjusting the value of the ratio M/L to “K3” (K3<K2) (ST635). In the third speed, the object speed is second slowest.
When the pressing force F is equal to or larger than the threshold B4 (No at ST630), the control unit 40 sets a “fourth speed”, by adjusting the value of the ratio M/L to “K4” (K4<K3) (ST645). In the fourth speed, the object speed is slowest.
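The threshold chain of steps ST600 to ST645 can be sketched as a piecewise choice of the ratio M/L. The threshold values B1 to B4 and ratios K0 to K4 below are assumed for illustration; only the ordering (B1<B2<B3<B4, K0>K1>K2>K3>K4) comes from the description.

```python
# Illustrative sketch (not from the specification): choosing the ratio M/L
# from the pressing force F, then deriving the object travel M from the
# operation stroke L. All numeric values are assumed.

B1, B2, B3, B4 = 0.5, 1.0, 1.5, 2.0         # hypothetical force thresholds
K0, K1, K2, K3, K4 = 1.0, 0.5, 0.25, 0.1, 0.05  # hypothetical ratios M/L

def ratio_for_force(force):
    if force < B1:
        return K0   # normal speed: fastest object speed
    if force < B2:
        return K1   # first speed
    if force < B3:
        return K2   # second speed
    if force < B4:
        return K3   # third speed
    return K4       # fourth speed: slowest, for fine positioning

def object_travel(stroke_l, force):
    """Object travel M on the screen for an operation stroke L."""
    return ratio_for_force(force) * stroke_l

print(object_travel(10.0, 0.2))  # full travel at the normal speed
print(object_travel(10.0, 2.5))  # greatly reduced travel under a strong press
```

Because the finger speed stays whatever the user makes it, scaling M against L is what makes a hard press slow the object down.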
As described above, in the user interface device 1 according to the second embodiment, the relation between the operation stroke L and the object travel M is changed according to the pressing force detected by the detection unit 20, when at least a part of the objects displayed on the screen 11 moves so as to follow the movement of the contact position on the input surface 21. On the assumption that the moving speed of the contact position on the input surface 21 is constant, the longer the object travel M is with respect to the operation stroke L, the faster the object moves, and the shorter the object travel M is with respect to the operation stroke L, the slower the object moves. Therefore, the moving speed of the object can be controlled by adjusting the pressing force. Thus, since the moving speed of the object can be easily adjusted, without the need for a troublesome environment setting, the user-friendliness in terms of movement of the object can be improved.
With the user interface device 1 according to the second embodiment, the larger the pressing force is, the shorter the object travel M becomes with respect to a fixed operation stroke L. On the assumption that the moving speed of the contact position on the input surface 21 is constant, the larger the pressing force is, the slower the object moves. Therefore, the object can be easily made to move a minute distance.
Hereunder, a variation of the user interface device 1 according to the second embodiment will be described.
Upon determining the relation between the operation stroke L and the object travel M at step ST515, the control unit 40 controls the tactile presentation unit 30 so as to change the tactile feeling according to the relation between the operation stroke L and the object travel M. More specifically, the control unit 40 controls the tactile presentation unit 30 so as to change the frequency of the click feeling repeatedly transmitted as the tactile feeling, according to the relation between the operation stroke L and the object travel M. For example, the control unit 40 causes the tactile presentation unit 30 to generate periodical click feeling, while the user is moving the object. When the relation between the operation stroke L and the object travel M is changed according to the pressing force F, the control unit 40 also changes the frequency of the click feeling, according to the change of the relation.
The control unit 40 checks the state of the ratio M/L determined at step ST515 (ST700, ST710, ST720, ST730). In the case of the normal speed (Yes at ST700), the control unit 40 causes the tactile presentation unit 30 to stop presenting the tactile feeling (ST705). In the case of the first speed (Yes at ST710), the control unit 40 sets the cycle of the click feeling generated by the tactile presentation unit 30 to “T1”. The cycle “T1” is shorter than “T2” to “T4” to be subsequently referred to, and therefore the tempo of the click feeling is fastest at the first speed. In the case of the second speed (Yes at ST720), the control unit 40 sets the cycle of the click feeling to T2 (T2>T1); it sets the cycle to T3 (T3>T2) in the case of the third speed (Yes at ST730), and to T4 (T4>T3) in other cases (No at all of ST700, ST710, ST720, and ST730). In other words, the control unit 40 reduces the frequency of the click feeling (lengthens the cycle, thereby slowing down the tempo) generated by the tactile presentation unit 30, with a decrease in the value of the ratio M/L.
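Treating T1 to T4 as click cycles (periods between clicks), the mapping of steps ST700 to ST730 can be sketched as a simple lookup. The numeric cycle values below are assumed for illustration; only their ordering T1<T2<T3<T4 follows from the description.

```python
# Illustrative sketch (not from the specification): mapping the speed mode
# determined from the ratio M/L to a click-feeling cycle. A longer cycle
# means a slower click tempo; None means the clicks are stopped.

T1, T2, T3, T4 = 0.05, 0.1, 0.2, 0.4  # hypothetical cycles in seconds

def click_cycle(speed_mode):
    """Click cycle for a speed mode, or None to stop the tactile feeling."""
    return {
        "normal": None,  # ST705: no tactile feeling at the normal speed
        "first": T1,     # fastest click tempo
        "second": T2,
        "third": T3,
        "fourth": T4,    # slowest click tempo
    }[speed_mode]
```

The slower the object moves relative to the finger, the more widely spaced the clicks become, so the user can feel the current speed setting without looking at the screen.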
With the mentioned variation, the user can perceive the relation between the operation stroke L and the object travel M determined according to the pressing force, from the frequency of the click feeling transmitted as the tactile feeling. Such an arrangement enables the user to perceive the situation through the tactile feeling, without the need to constantly watch the objects on the screen 11, thereby making the operation to select the object to be moved more comfortable.
Although the frequency of the click feeling transmitted as the tactile feeling is changed in the foregoing variation, the tactile feeling may be changed in different manners. For example, the control unit 40 may control the tactile presentation unit 30 so as to change the frequency or amplitude of the oscillation transmitted as the tactile feeling, according to the relation between the operation stroke L and the object travel M. More specifically, the control unit 40 may reduce the frequency, or increase the amplitude, of the oscillation generated by the tactile presentation unit 30, with a decrease in the value of the ratio M/L (decrease in the object speed). In this case also, the user can perceive the relation between the operation stroke L and the object travel M from the tactile feeling, and therefore the operation becomes more comfortable, compared with the situation where the user has to constantly watch the screen.
Hereafter, the user interface device 1 according to a third embodiment will be described. In the user interface device 1 according to the third embodiment, the display size of the object is changed according to the pressing force. The configuration of the user interface device 1 according to the third embodiment is generally the same as that of the user interface device 1 according to the first embodiment shown in
When a contact position of a finger or the like is detected by the detection unit 20, the control unit 40 identifies at least one object on the screen 11 designated by the contact made on the input surface 21, as the designated object, on the basis of the detected contact position. Upon identifying the designated object, the control unit 40 changes the display size of the designated object, according to the pressing force detected by the detection unit 20.
In an example, the designated object the display size of which is to be changed may be an icon, for example representing a file. Upon identifying the icon on the screen 11 on the basis of the contact position detected by the detection unit 20, the control unit 40 changes the display size of the icon, according to the pressing force detected by the detection unit 20.
In another example, the designated object the display size of which is to be changed may be at least one of icons included in the same folder. Upon identifying a window of the folder on the screen 11 on the basis of the contact position detected by the detection unit 20, the control unit 40 changes the display size of the at least one icon included in the window of the folder, according to the pressing force detected by the detection unit 20.
In still another example, the designated object the display size of which is to be changed may be contents (e.g., image) of a file displayed in a preview window. Upon identifying the file the contents of which are displayed in the preview window as the designated object, or identifying the preview window as the designated object, the control unit 40 changes the display size of the contents of the file in the preview window, according to the pressing force detected by the detection unit 20.
For example, upon identifying the designated object on the basis of the contact position detected by the detection unit 20, the control unit 40 increases the display size of the designated object, with an increase in the pressing force detected by the detection unit 20.
In addition, when changing the display size of the designated object according to the pressing force detected by the detection unit 20, the control unit 40 may control the tactile presentation unit 30 so as to change at least one of the frequency and the amplitude of the oscillation transmitted as the tactile feeling, according to the display size of the designated object. For example, the control unit 40 reduces the frequency of the oscillation transmitted as the tactile feeling, with an increase in the display size of the designated object.
First, the control unit 40 acquires the detection result of the contact position and the pressing force on the input surface 21, from the detection unit 20 (ST800). Upon acquiring the detection result from the detection unit 20, the control unit 40 decides whether a contact has been made on the input surface 21, on the basis of the detection result of the contact position provided by the detection unit 20 (ST805). In the case where a contact has been made on the input surface 21 (Yes at ST805), the control unit 40 identifies the object on the screen 11 designated by the contact, as the designated object (ST810). For example, the control unit 40 identifies the object located at the position overlapping the cursor (pointer) indicating the pointed object, as the designated object. When a plurality of objects are located at the position overlapping the cursor, the control unit 40 may identify each of the plurality of objects, or only the frontmost object, as the designated object.
After step ST810, the control unit 40 decides whether any designated object (object designated by the contact made on the input surface 21) has been identified at step ST810 (ST815). In the case where a designated object has been identified at step ST810 (Yes at ST815), the control unit 40 proceeds to steps ST820 and ST835. In contrast, in the case where no designated object has been identified at step ST810 (No at ST815), the control unit 40 finishes the operation instead of proceeding to steps ST820 and ST835, because there is no designated object whose display size is to be changed.
At step ST820, the control unit 40 determines the display size of the designated object, according to the detection result of the pressing force provided by the detection unit 20. Further details of step ST820 will be subsequently described, with reference to
After step ST820, the control unit 40 controls the tactile presentation unit 30 so as to present the tactile feeling according to the display size of the designated object determined at step ST820 (ST835). Further details of step ST835 will be subsequently described, with reference to
Upon deciding at step ST805 that no contact has been made on the input surface 21 (No at ST805), the control unit 40 decides whether the display size of the designated object is a normal size (ST850). In the case where the display size of the designated object is the normal size (Yes at ST850), the control unit 40 finishes the operation. In contrast, in the case where the display size of the designated object is not the normal size (No at ST850), the control unit 40 returns the display size of the designated object to the normal size (ST855), and causes the tactile presentation unit 30 to stop presenting the tactile feeling (ST860). Therefore, the display size of the designated object can be returned to the normal size, simply by stopping touching the input surface 21.
The control unit 40 compares the pressing force detected by the detection unit 20 with a threshold C1 (ST900). When the pressing force F is smaller than the threshold C1 (Yes at ST900), the control unit 40 sets the display size of the designated object to the normal size (ST905). In this embodiment, the normal size is smaller than a medium size, a large size, and an extra-large size to be subsequently referred to.
When the pressing force F is equal to or larger than the threshold C1 (No at ST900), the control unit 40 compares the pressing force F with a threshold C2 (C2>C1) (ST910). When the pressing force F is smaller than the threshold C2 (Yes at ST910), the control unit 40 sets the display size of the designated object to the medium size (ST915). When the pressing force F is equal to or larger than the threshold C2 (No at ST910), the control unit 40 compares the pressing force F with a threshold C3 (C3>C2) (ST920). When the pressing force F is smaller than the threshold C3 (Yes at ST920), the control unit 40 sets the display size of the designated object to the large size (ST925). When the pressing force F is equal to or larger than the threshold C3 (No at ST920), the control unit 40 sets the display size of the designated object to the extra-large size (ST930).
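The size selection of steps ST900 to ST930 is another threshold chain and can be sketched the same way. The threshold values C1 to C3 below are assumed; only their ordering C1<C2<C3 comes from the description.

```python
# Illustrative sketch (not from the specification): choosing the display
# size of the designated object from the pressing force F (ST900-ST930).

C1, C2, C3 = 0.5, 1.0, 1.5  # hypothetical force thresholds

def display_size(force):
    if force < C1:
        return "normal"       # ST905: smallest of the four sizes
    if force < C2:
        return "medium"       # ST915
    if force < C3:
        return "large"        # ST925
    return "extra-large"      # ST930

print(display_size(0.2))  # a light touch leaves the normal size
print(display_size(1.8))  # a strong press gives the extra-large size
```

Releasing the contact returns the object to the normal size (ST850 to ST860), so the enlargement is a transient preview rather than a persistent resize.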
Here, when changing the display size of the icon according to the pressing force, the control unit 40 may also change the display size of the information expressed in characters (e.g., file name, application name) accompanying the icon, in proportion to the icon size.
Here, although the designated object designated by the contact made on the input surface 21 (object pointed by the cursor) is the icon 261 in the examples of
The control unit 40 checks the display size of the object determined at step ST820 (ST1000, ST1010, and ST1020). When the object is set to the normal size (Yes at ST1000), the control unit 40 causes the tactile presentation unit 30 to stop presenting the tactile feeling (ST1005). When the object is set to the medium size (Yes at ST1010), the control unit 40 causes the tactile presentation unit 30 to present a relatively light tactile feeling (ST1015). When the object is set to the large size (Yes at ST1020), the control unit 40 causes the tactile presentation unit 30 to present a medium tactile feeling (ST1025). The medium tactile feeling (ST1025) is lower in frequency of the oscillation, than the light tactile feeling (ST1015). When the object is set to the extra-large size (No at all of ST1000, ST1010, and ST1020), the control unit 40 causes the tactile presentation unit 30 to present a heavy tactile feeling (ST1030). The heavy tactile feeling (ST1030) is lower in frequency of the oscillation, than the medium tactile feeling (ST1025).
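The mapping of steps ST1000 to ST1030, in which a larger display size corresponds to a lower oscillation frequency (a heavier tactile feeling), can be sketched as follows. The frequency values are assumptions; the disclosure specifies only their relative ordering.

```python
# Sketch of steps ST1000-ST1030: larger display sizes map to
# lower-frequency oscillation. Frequencies are assumed values;
# only their ordering (light > medium > heavy) is given.
TACTILE_FREQUENCY_HZ = {
    "normal": None,       # ST1005: stop presenting the tactile feeling
    "medium": 200,        # ST1015: relatively light tactile feeling
    "large": 120,         # ST1025: medium tactile feeling
    "extra-large": 60,    # ST1030: heavy tactile feeling
}

def tactile_frequency(size: str):
    """Return the oscillation frequency for a display size, or None to stop."""
    return TACTILE_FREQUENCY_HZ[size]
```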
As described above, in the user interface device 1 according to the third embodiment, when at least one object on the screen 11 is identified as the designated object on the basis of the contact position detected by the detection unit 20, the display size of the designated object is changed, according to the pressing force detected by the detection unit 20. In other words, the display size of the object on the screen 11 is changed, on the basis of the contact position and the pressing force on the input surface. The mentioned arrangement enables the display size of the object to be changed through an operation as simple as touching and pressing the input surface 21, thereby significantly facilitating the selection of the object to be moved, and improving the user-friendliness.
Here, the designated object is not limited to the icon, but may be an image or a map displayed in a predetermined area on the screen 11. In such cases, the image or map displayed in the predetermined area can be easily enlarged or reduced, or made to appear farther or closer, according to the pressing force applied to the input surface 21.
With the user interface device 1 according to the third embodiment, the display size of the designated object is increased, with the increase in the pressing force. In other words, the display size is increased with the increase in the pressing force applied to the input surface 21. Such an arrangement simplifies the operation to increase the display size of the object, thereby improving the user-friendliness.
With the user interface device 1 according to the third embodiment, the user can decide whether the display size of the designated object has been changed according to the pressing force, from the oscillation transmitted as the tactile feeling. Such an arrangement enables the user to perceive the situation through the tactile feeling, without the need to constantly watch the objects on the screen 11, thereby making the operation to change the display size of the objects more comfortable.
Hereunder, some variations of the user interface device 1 according to the third embodiment will be described.
In the case where the display size of the designated object is set according to the pressing force at step ST820, the control unit 40 decides whether the display size of the designated object has been changed by the mentioned setting (ST825). In the case where the display size of the designated object has been changed at step ST820 (Yes at ST825), the control unit 40 changes the frequency of the oscillation generated by the tactile presentation unit 30 as the tactile feeling (ST830). More specifically, the control unit 40 controls the tactile presentation unit 30 so as to reduce the frequency of the oscillation, when increasing the display size of the designated object according to the pressing force detected by the detection unit 20. Conversely, when reducing the display size of the designated object according to the pressing force detected by the detection unit 20, the control unit 40 controls the tactile presentation unit 30 so as to increase the frequency of the oscillation.
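The inverse relationship of steps ST825 and ST830, in which enlarging the designated object lowers the oscillation frequency and reducing it raises the frequency, can be sketched as follows. The size ranking and the frequency step are illustrative assumptions not stated in the disclosure.

```python
# Sketch of steps ST825-ST830: shift the oscillation frequency in the
# direction opposite to the size change. The rank ordering follows the
# embodiment; the per-step frequency delta is an assumed value.
SIZE_RANK = {"normal": 0, "medium": 1, "large": 2, "extra-large": 3}
FREQ_STEP_HZ = 40  # assumed frequency change per size step

def adjust_frequency(current_hz: float, old_size: str, new_size: str) -> float:
    """Return the new oscillation frequency after a display-size change."""
    delta = SIZE_RANK[new_size] - SIZE_RANK[old_size]
    # Enlarging (delta > 0) lowers the frequency; reducing raises it.
    return current_hz - delta * FREQ_STEP_HZ
```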
The mentioned variation enables the user to perceive that the display size of the designated object has been increased, from the reduction in the frequency of the oscillation transmitted as the tactile feeling. Conversely, the increase in the frequency of the oscillation transmitted as the tactile feeling leads the user to perceive that the display size of the designated object has been reduced. Such an arrangement enables the user to perceive the situation through the tactile feeling, without the need to constantly watch the objects on the screen 11, thereby making the operation related to changing the display size of the objects more comfortable.
When the contact position of a finger or the like is detected by the detection unit 20 (Yes at ST800), the control unit 40 identifies the object on the screen 11 designated by the contact on the input surface 21 as the designated object, on the basis of the contact position where the finger has been detected (ST810). Upon identifying the designated object (Yes at ST815), the control unit 40 changes the displayed details of the information accompanying the identified designated object, according to the pressing force detected by the detection unit 20 (ST870).
Examples of the accompanying information of the designated object include information related to the properties of the file (e.g., file name, file making date and time, file updating date and time, and file size), and information related to the contents (e.g., image size in an image file, and duration in a music file).
In an example, the accompanying information the displayed details of which are to be changed is the information displayed in the accompanying information window. Upon identifying a file in which the accompanying information is displayed in the accompanying information window as the designated object, or identifying the accompanying information window as the designated object, the control unit 40 changes the displayed details of the accompanying information in the accompanying information window, according to the pressing force detected by the detection unit 20.
For example, upon identifying the designated object on the basis of the detection result of the contact position provided by the detection unit 20, the control unit 40 increases the displayed details of the accompanying information of the designated object, with an increase in the pressing force detected by the detection unit 20.
Upon deciding at step ST805 that the input surface 21 has not been contacted (No at ST805), the control unit 40 decides whether the amount of the accompanying information of the designated object is “few” to be subsequently described (step ST1105 in
The control unit 40 compares the pressing force detected by the detection unit 20 with a threshold D1 (ST1100). When the pressing force F is smaller than the threshold D1 (Yes at ST1100), the control unit 40 sets the amount of displayed details of the accompanying information of the object to “few” (ST1105). When the pressing force F is equal to or larger than the threshold D1 (No at ST1100), the control unit 40 compares the pressing force F with a threshold D2 (D2>D1) (ST1110). When the pressing force F is smaller than the threshold D2 (Yes at ST1110), the control unit 40 sets the amount of the displayed details of the accompanying information of the object to “medium” (ST1115). A larger number of items are displayed in the “medium” state, than in the “few” state. When the pressing force F is equal to or larger than the threshold D2 (No at ST1110), the control unit 40 sets the amount of the displayed details of the accompanying information of the object to “many” (ST1120). A larger number of items are displayed in the “many” state, than in the “medium” state.
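The two-threshold cascade of steps ST1100 to ST1120 can be sketched in the same manner. The threshold values and the example item lists are assumptions for illustration; the disclosure fixes only that D1 < D2 and that "many" shows more items than "medium", which shows more than "few".

```python
# Sketch of steps ST1100-ST1120: choose the amount of accompanying
# information to display from the pressing force, using thresholds
# D1 < D2. Threshold values and item lists are assumed examples.
D1, D2 = 0.5, 1.2  # assumed threshold values

DETAIL_ITEMS = {
    "few": ["file name"],
    "medium": ["file name", "file size", "updating date and time"],
    "many": ["file name", "file size", "updating date and time",
             "making date and time", "duration"],
}

def detail_level(force: float) -> str:
    """Return the accompanying-information detail level for a pressing force."""
    if force < D1:
        return "few"      # ST1105
    if force < D2:
        return "medium"   # ST1115
    return "many"         # ST1120
```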
Here, although the designated object designated by the contact made on the input surface 21 (object pointed by the cursor 171) is the icon 271 in the examples of
With the mentioned variation, when at least one object on the screen 11 is identified as the designated object on the basis of the contact position detected by the detection unit 20, the displayed details of the accompanying information of the designated object are changed, according to the pressing force detected by the detection unit 20. In other words, the displayed details of the accompanying information of the object are changed, on the basis of the contact position and the pressing force on the input surface. The mentioned arrangement enables the displayed details of the accompanying information of the object to be changed through an operation as simple as touching and pressing the input surface 21. Thus, the displayed details of the accompanying information of the object can be easily changed, and therefore the user-friendliness can be improved.
With the mentioned variation, in addition, a larger number of items of the accompanying information of the designated object are displayed, with the increase in the pressing force. In other words, the displayed details of the accompanying information are increased, with the increase in the pressing force applied to the input surface 21. Such an arrangement simplifies the operation to increase the amount of the displayed details of the accompanying information, thereby improving the user-friendliness.
The present invention is not limited to the foregoing embodiments, but encompasses a broad range of variations.
Although the foregoing embodiments represent the case where the detection unit is configured to detect the contact position and the pressing force, the detection unit may also detect the type of the object that has contacted the input surface. For example, the detection unit may detect whether the object that has contacted the input surface is a finger or another object (e.g., palm). The finger may be detected, for example, on the basis of the contact area of the object on the input surface. When a contact on the input surface by an object other than a finger is detected, the control unit may suspend the display control of the screen based on the pressing force, performed according to the foregoing embodiments. Such an arrangement prevents an unintended display control of the screen (e.g., moving the object, change of the display size of the object, and so forth) from being performed, owing to a contact or pressing by an object other than the finger (e.g., palm).
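The variation above, classifying the contacting object by its contact area and suspending the pressure-based display control for non-finger contacts, can be sketched as follows. The area threshold is an assumed illustrative value.

```python
# Sketch of the finger/palm variation: classify the contacting object
# from its contact area and suspend pressure-based display control for
# non-finger contacts (e.g., a palm). The area bound is an assumed value.
MAX_FINGER_AREA_MM2 = 150.0  # assumed upper bound for a fingertip contact

def is_finger(contact_area_mm2: float) -> bool:
    """Classify a contact as a finger when its area is small enough."""
    return contact_area_mm2 <= MAX_FINGER_AREA_MM2

def pressure_for_display_control(contact_area_mm2: float, force: float):
    """Return the force to use for display control, or None to suspend it."""
    if not is_finger(contact_area_mm2):
        return None  # suspend display control for a palm or other object
    return force
```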
Although the detection of the contact position on the input surface is based on the electrostatic capacitance in the foregoing embodiments, the contact position may be detected by different methods. To detect the contact position, at least one of the methods known to persons skilled in the art may be employed, such as the electrostatic capacitance method, an electromagnetic induction method, a resistive film method, a surface acoustic wave method, and an infrared light method.
Although the piezoelectric elements are employed to detect the pressing force applied to the input surface in the foregoing embodiments, the pressing force may be detected by different methods. To detect the pressing force, at least one of the methods known to persons skilled in the art may be employed, such as the piezoelectric method, a distortion gauge method, and an electromagnetic induction method. Alternatively, an electrostatic sensor may be employed so as to detect the pressing force on the basis of information on the contact area of a finger on the sensor, or any two or more of the cited detection methods may be combined, to detect the pressing force.
Although the screen of the display device and the input surface of the detection unit are independent from each other in the foregoing embodiments, a known touch panel may be employed, so as to integrate the screen of the display device and the input surface of the detection unit.
Although the user interface device is exemplified by the laptop personal computer in the foregoing embodiments, the user interface device is not limited thereto. The user interface device according to the embodiments is applicable to various apparatuses having a user interface function, examples of which include a desktop PC, a tablet computer, a telephone, a calculator, a game machine, a car navigation system, an automatic vending machine, a ticket vending machine, an ATM, and an industrial machine with a control panel.
Number | Date | Country | Kind |
---|---|---|---|
2017-115510 | Jun 2017 | JP | national |