This application claims priority from Japanese Patent Application No. 2012-010316 filed on Jan. 20, 2012 including the specification, drawings and abstract, the disclosure of which is incorporated herein by reference in its entirety.
1. Field of the Invention
Aspects of the present invention relate to an operation input system including a touch pad serving as a pointing device.
2. Description of the Related Art
Devices including an operation input system as standard equipment, such as laptop personal computers, are in common use. The operation input system includes a touch pad serving as a pointing device. In these types of devices, a user performs various slide operations using a fingertip, the tip of a stylus pen, or the like on an operation surface provided on a surface of the touch pad to move an operation cursor displayed on a display screen that is communicably connected to the touch pad. In addition, the user may perform a predetermined operation on the operation surface when the operation cursor displayed on the display screen is located over an operation figure (such as an operation icon, for example) to achieve a function associated with the operation figure. This type of operation input system including the touch pad may also be utilized to input a predetermined operation to in-vehicle navigation apparatuses.
The in-vehicle navigation apparatuses are often operated by a driver of a vehicle. In such a case, the user (the driver of the vehicle) operates the navigation apparatus while driving. When driving, it is difficult to perform these operations while closely watching the display screen, and thus a desired operation may not be performed accurately. In view of this, there have been proposed operation input systems that permit a user to perform operation input utilizing tactile sensation (a tactile feel) without closely watching a display screen. For example, Japanese Patent Application Publication No. 2006-268068 (JP 2006-268068 A) discloses a technology by which the entirety of an operation surface is covered with fiber hair, and fiber hair provided at a position on the operation surface corresponding to the position of an operation figure displayed on a display device is caused to stand up. In the system according to JP 2006-268068 A, however, the entirety of the operation surface is covered with fiber hair. Thus, it is difficult to discriminate through tactile sensation between standing fiber hair and non-standing fiber hair.
In some navigation apparatuses, a screen displayed on a display device is divided so as to display a plurality of different screens such as a map and route information, a map and a television broadcast screen, or the like. In the case where such a plurality of different screens is provided, it may be difficult to distinguish between the screens on an operation surface, and it may be difficult for the user to know which of the screens he/she is operating. Simply providing a partition line on the operation surface may hinder operations performed through tactile sensation. From the viewpoint of convenience to the user, it is preferable that operation input can be performed more intuitively. The operation input system according to the related art leaves room for improvement in this regard.
In view of the foregoing, it is desired to provide an operation input system that enables a user to perform more reliable operation input compared to the related art without closely watching a display screen, and that enables operation input to be performed in a highly convenient manner.
In view of the foregoing issue, an aspect of the present invention provides an operation input system including:
According to the aspect, a predetermined operation can be input to another device communicably connected to the operation input system in accordance with the position of the object to be sensed in contact with or in proximity to the operation surface of the touch pad. In the case where the display screen of the display device is divided into a plurality of screen regions, in general, the screen regions are often configured to receive different types of input as well. Thus, in the case where the user performs a predetermined operation in a particular screen region, an operation is preferably performed in a region corresponding to the particular screen region also on the operation surface of the touch pad. According to the aspect, the operation surface is divided into a plurality of operation surface regions in correspondence with the screen regions formed by dividing the display screen. The protrusion members penetrate through the operation plate on the surface of the touch pad to protrude from the operation surface along the boundary between the operation surface regions. As a result, the stereoscopic boundary formed by the protrusion members protruded from the operation surface makes each operation surface region clearly distinguishable by the user. That is, the user can clearly recognize the operation surface region corresponding to the particular screen region for which it is desired to input a predetermined operation.
According to the aspect, further, the area of each operation surface region is set in accordance with the display content of each screen region, irrespective of the ratio in area between the plurality of screen regions. The type of a predetermined operation input by the user, the number of operation locations that should be distinguished during a predetermined operation, and so forth differ depending on the display content of each screen region. Therefore, the area required for the operation surface region also differs depending on the display content of the corresponding screen region. With the area of each operation surface region set in accordance with the display content of each screen region, it is possible to appropriately secure the area of each operation surface region required for a predetermined operation, which makes it possible to favorably detect a predetermined operation performed by the user in each operation surface region. That is, the user can perform more reliable operation input compared to the related art without closely watching the display screen, and an operation input system that enables operation input to be performed in a highly convenient manner is provided.
In the case where a particular operation figure is displayed in at least one of the screen regions, the protrusion control section of the operation input system according to an aspect of the present invention may set the boundary between the operation surface regions such that the area of each of the operation surface regions corresponds to the number of the operation figures provided in the corresponding screen region. That is, in the case where the display screen is divided into a plurality of screen regions that display independent images and a particular operation figure is displayed in at least one of the screen regions, the protrusion control section may divide the operation surface into a plurality of operation surface regions corresponding to respective ones of the plurality of screen regions, the number of the operation surface regions being the same as the number of the screen regions, set the boundary between the operation surface regions such that the area of each of the operation surface regions corresponds to the number of operation figures provided in the corresponding screen region, and cause the protrusion members to be protruded along the boundary. In one more specific aspect, that is, the area of each operation surface region, which is set in accordance with the display content of each screen region, is set in accordance with the number of operation figures provided in each screen region. In other words, the area of each operation surface region is set in accordance with the number of operation locations that should be distinguished during a predetermined operation. This allows the operation figures in each screen region to be appropriately distributed in the corresponding operation surface region, which makes it possible to favorably detect a predetermined operation performed by the user in each operation surface region.
That is, the user can perform more reliable operation input compared to the related art without closely watching the display screen, and an operation input system that enables operation input to be performed in a highly convenient manner is provided.
In the case where a particular operation figure is displayed in at least one of the screen regions, the protrusion control section of the operation input system according to an aspect of the present invention may set the boundary such that the operation surface region corresponding to the screen region containing the larger number of the operation figures has a larger area, irrespective of a ratio in area between the plurality of screen regions. According to the aspect, an operation surface region corresponding to a screen region with a larger number of operation figures has a larger area, and thus the operation figures in each screen region are appropriately distributed in the corresponding operation surface region. For example, an appropriate gap that is uniform over the entire touch pad may be set between the positions corresponding to the operation figures in the operation surface regions. This enables the user to perform more reliable operation input.
In the case where a particular operation figure is displayed in at least one of the screen regions, the protrusion control section of the operation input system according to an aspect of the present invention may set the area of the operation surface region corresponding to each of the plurality of screen regions such that a ratio in area between the operation surface regions matches a ratio in number of the operation figures contained in each of the screen regions, irrespective of a ratio in area between the plurality of screen regions. With the area of each of the operation surface regions thus set such that the ratio in area between the operation surface regions matches the ratio in number of operation figures contained in the screen regions, the operation figures in each screen region are appropriately distributed in the corresponding operation surface region. That is, the positions corresponding to the operation figures in the operation surface regions are distributed uniformly in a well-balanced manner substantially over the entire operation surface. As a result, the user can perform more reliable operation input.
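As a purely illustrative sketch (not part of the disclosed embodiment), the proportional division described above, in which the ratio in area between the operation surface regions matches the ratio in number of operation figures, might be computed as follows. The overall width value and the function names are assumptions introduced only for illustration:

```python
# Hypothetical sketch: dividing the operation surface between operation
# surface regions so that the ratio in area matches the ratio in the number
# of operation figures, irrespective of the screen-region area ratio.
# OP_SURFACE_WIDTH_MM and figure_counts are illustrative names/values.

OP_SURFACE_WIDTH_MM = 90.0  # total width of the operation surface (assumed)

def region_widths(figure_counts):
    """Return a width for each operation surface region, proportional to
    the number of operation figures in the corresponding screen region."""
    total = sum(figure_counts)
    if total == 0:
        # No operation figures anywhere: fall back to equal division.
        return [OP_SURFACE_WIDTH_MM / len(figure_counts)] * len(figure_counts)
    return [OP_SURFACE_WIDTH_MM * n / total for n in figure_counts]

# Example: the left screen region has 3 operation figures and the right has
# 1, so the left operation surface region receives 3/4 of the width.
print(region_widths([3, 1]))  # → [67.5, 22.5]
```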
The type of a predetermined operation input by the user may differ depending on the display content of each screen region. In this case, the operation method acceptable by each of the operation surface regions differs between the operation surface regions, and the area required for the operation surface region also differs depending on the acceptable operation method. With the area of each operation surface region set in accordance with the operation method, it is possible to appropriately secure the area of each operation surface region required for a predetermined operation. In one preferred aspect in which the boundary between the operation surface regions is set such that the area of each of the operation surface regions corresponds to the display content of the corresponding screen region, the protrusion control section of the operation input system according to an aspect of the present invention may set the area of each of the operation surface regions in accordance with an operation method corresponding to the display content of each of the screen regions and acceptable by the operation surface region corresponding to each screen region. As a result, it is possible to favorably detect a predetermined operation performed by the user in each operation surface region, which enables the user to perform more reliable operation input.
In one aspect in which the boundary between the operation surface regions is set such that the area of each of the operation surface regions corresponds to the display content of the corresponding screen region, the protrusion control section of the operation input system according to an aspect of the present invention may set the area of each of the operation surface regions in accordance with whether or not the operation method corresponding to the display content of each of the screen regions and acceptable by the operation surface region corresponding to each screen region includes a touch operation in which the object to be sensed is brought into contact with or into proximity to the operation surface region, or whether or not the operation method includes both a slide operation in which the object to be sensed is slid with the object to be sensed in contact with or in proximity to the operation surface region and the touch operation. While the touch operation is performed at substantially one point on the operation surface, the slide operation is performed at least linearly, or planarly, on the operation surface. An operation performed at a point and a linear or planar operation require different areas of the operation surface region. As a matter of course, a linear or planar operation requires a larger area of the operation surface region than an operation performed at a point. Thus, with the area of each operation surface region set as in the aspect, it is possible to more appropriately secure the area of each operation surface region required for a predetermined operation. As a result, the user can perform more reliable operation input.
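The distinction above between a point-like touch operation and a linear or planar slide operation could be sketched, again purely for illustration, as a weighting scheme in which a region accepting slide operations is allotted more area. The weight values are invented assumptions, not values from the disclosure:

```python
# Hypothetical weighting: an operation surface region whose acceptable
# operation method includes slide operations (linear/planar) is given a
# larger area than one accepting only touch operations (point-like).
# The weights 3 and 1 are arbitrary illustrative values.

def method_weight(accepts_slide):
    """Slide operations need more surface than point-like touches (assumed)."""
    return 3 if accepts_slide else 1

def region_areas(total_area, methods):
    """methods: per-region flags, True if the region accepts slide operations.
    Return each region's share of total_area according to its weight."""
    weights = [method_weight(m) for m in methods]
    return [total_area * w / sum(weights) for w in weights]

# Left region accepts slide + touch, right region accepts touch only.
print(region_areas(100.0, [True, False]))  # → [75.0, 25.0]
```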
In one aspect in which the boundary between the operation surface regions is set such that the area of each of the operation surface regions corresponds to the display content of the corresponding screen region, the display screen of the operation input system according to an aspect of the present invention may have a function of sensing an object to be sensed in contact with or in proximity to the display screen to receive input corresponding to a position of the sensed object, and the protrusion control section may set the area of each of the operation surface regions in accordance with an operation method corresponding to the display content of each of the screen regions on the display screen and acceptable by the screen region. The operation method acceptable by each of the screen regions on the display screen differs between the screen regions. In addition, the area required for the operation surface region also differs depending on the acceptable operation method. With the area of each operation surface region set in accordance with the operation method acceptable by each of the screen regions on the display screen, it is possible to appropriately secure the area of each operation surface region required for a predetermined operation. As a result, it is possible to favorably detect a predetermined operation performed by the user in each operation surface region, which enables the user to perform more reliable operation input.
In one aspect in which the boundary between the operation surface regions is set such that the area of each of the operation surface regions corresponds to the display content of the corresponding screen region, the display screen of the operation input system according to an aspect of the present invention may have a function of sensing an object to be sensed in contact with or in proximity to the display screen to receive input corresponding to a position of the sensed object, and the protrusion control section may set the area of each of the operation surface regions in accordance with whether or not the operation method corresponding to the display content of each of the screen regions on the display screen and acceptable by the screen region includes a touch operation in which the object to be sensed is brought into contact with or into proximity to the screen region, or whether or not the operation method includes both a slide operation in which the object to be sensed is slid with the object to be sensed in contact with or in proximity to the screen region and the touch operation. As discussed above, a slide operation requires a larger area of the operation surface region than a touch operation. With the area of each operation surface region set as in the aspect, it is possible to more appropriately secure the area of each operation surface region required for a predetermined operation, which enables the user to perform more reliable operation input.
As discussed above, the plurality of protrusion members can penetrate through the operation plate on the surface of the touch pad to protrude from the operation surface. The protrusion members can be moved between the protruded state and the retracted state by the protrusion control section. When the protrusion member is in the retracted state, a portion of the operation surface around the protrusion member is flat. When the protrusion member is in the protruded state, in contrast, the distal end portion of the protrusion member is distinctly protruded from the operation surface so as to be directly recognizable by a user through tactile sensation using a fingertip or the like. By bringing the protrusion member positioned at the coordinates on the operation surface corresponding to the coordinates of an operation figure displayed on a display device into the protruded state, for example, the user may easily perform operation input to the operation surface at that position in reliance on the protrusion member in the protruded state. In the case where a particular operation figure is displayed on the display screen, the protrusion control section of the operation input system according to an aspect of the present invention may cause the protrusion members provided in the operation surface region corresponding to each of the plurality of screen regions to be protruded from the operation surface such that an arrangement of the protruded protrusion members corresponds to an arrangement of the operation figures in each of the screen regions.
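The coordinate correspondence described above, in which a protrusion member positioned at the coordinates on the operation surface corresponding to the coordinates of an operation figure on the display screen is brought into the protruded state, might be sketched as a simple proportional mapping followed by selection of the nearest member in a regular grid. The screen resolution, pad dimensions, and grid pitch below are assumed values for illustration only:

```python
# Illustrative sketch: map a display-screen position proportionally onto the
# operation surface, then pick the nearest protrusion member in a regular
# grid. All dimensions are assumptions, not values from the disclosure.

SCREEN_W, SCREEN_H = 800, 480   # display screen resolution in pixels (assumed)
PAD_W, PAD_H = 90.0, 54.0       # operation surface size in mm (assumed)
PITCH = 3.0                     # spacing of protrusion members in mm (assumed)

def to_pad(screen_x, screen_y):
    """Map display-screen coordinates to operation-surface coordinates."""
    return screen_x * PAD_W / SCREEN_W, screen_y * PAD_H / SCREEN_H

def nearest_member(pad_x, pad_y):
    """Return the (column, row) index of the closest protrusion member."""
    return round(pad_x / PITCH), round(pad_y / PITCH)

# An operation figure at screen (400, 240) maps to the centre of the pad,
# selecting the protrusion member at grid index (15, 9).
print(nearest_member(*to_pad(400, 240)))  # → (15, 9)
```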
The operation input system according to an aspect of the present invention may further include a depiction control section that controls depiction of the image to be displayed on the display screen, and in the case where a particular operation figure is displayed on the display screen, the boundary is set within a predetermined distance from an outer periphery of the operation surface or from another boundary, and a plurality of the operation figures are provided in the screen region corresponding to a narrow operation surface region which is the operation surface region set between the boundary and the outer periphery or between the boundary and the other boundary, the protrusion control section may dispose the protrusion members provided in the narrow operation surface region and protruded from the operation surface in correspondence with the plurality of operation figures so as to be in parallel with the boundary, and the depiction control section may dispose the plurality of operation figures in the screen region corresponding to the narrow operation surface region such that an arrangement of the operation figures corresponds to an arrangement of the protrusion members established by the protrusion control section. According to the aspect, the arrangement of the operation figures in the screen region is changed in accordance with the arrangement of the regions corresponding to the operation figures in the operation surface region to make the arrangement of the operation figures in the screen region and the arrangement of the regions corresponding to the operation figures in the operation surface region common. As a result, the user can easily correlate the operation figures in the screen region with the regions corresponding to the operation figures in the operation surface region, which enables the user to perform more reliable operation input.
An operation input system according to an embodiment of the present invention will be described with reference to the drawings. In the embodiment, a system formed using an operation input device 4 configured to perform predetermined (prescribed) operational input to an in-vehicle navigation apparatus 1 (see
1. Schematic Configuration of Navigation Apparatus
A schematic configuration of the navigation apparatus 1 will be described with reference to
The GPS receiver 81 receives GPS signals from Global Positioning System (GPS) satellites. The orientation sensor 82 detects the orientation of travel of the vehicle or variations in the orientation of travel of the vehicle. The distance sensor 83 detects the vehicle speed and the travel distance of the vehicle. As is known in the related art, the navigation computation section 70 can derive an estimated vehicle position on the basis of information obtained from the GPS receiver 81, the orientation sensor 82, and the distance sensor 83, and further on the basis of map matching.
The map database 85 stores map data divided for each predetermined partition. The map data include road network data describing the connection relationship between a plurality of nodes corresponding to intersections and a plurality of links corresponding to roads connecting adjacent nodes. Each node has information on its position on the map expressed by latitude and longitude. Each link has information such as the road type, the length of the link, and the road width as its attribute information. The map database 85 is referenced by the navigation computation section 70 during execution of processes such as displaying a map, searching for a route, and map matching. The map database 85 is stored in a storage medium such as a hard disk drive, a flash memory, or a DVD-ROM.
The display input device 40 is formed by integrating a display device such as a liquid crystal display device and an input device such as a touch panel. The display input device 40 includes a display screen 41 which displays a map of an area around the vehicle, images such as an operation
As shown in
The display input device 40 is disposed at a position at which it may be seen without the need for the user (in particular, the driver of the vehicle) to significantly change his/her viewing direction while driving, so that it is easily visible to the user. In the example shown in
The sound input device 87 receives voice input from the user. The sound input device 87 includes a microphone or the like. The navigation computation section 70 may achieve functions such as searching for a destination through voice recognition and making a handsfree call on the basis of voice commands received through the sound input device 87. The sound input device 87 functions as a third operation input unit. The sound output device 88 includes a speaker or the like. The navigation computation section 70 may achieve functions such as providing voice guidance via the sound output device 88.
In the present embodiment, the specific configuration of the touch pad 10 serving as the second operation input unit, among various devices communicably connected to the navigation apparatus 1, has a novel feature in contrast to its counterpart according to the related art. Thus, the configuration of the operation input device 4 formed to include the touch pad 10 and the configuration of the operation input system 3 formed to include the operation input device 4 are described in detail below.
2. Configuration of Operation Input Device
As shown in
As shown in
The operation plate 11 is provided with a hole portion 12 that penetrates through the operation plate 11. In the embodiment, as shown in
A detailed configuration of the operation input device 4 will be described below. For ease of description, a simplified structure illustrated in
As shown in
As shown in
The piezoelectric element 31 is a passive element that utilizes a piezoelectric effect, and converts a voltage applied to a piezoelectric body into a force, or converts an external force applied to the piezoelectric body into a voltage. The piezoelectric element 31 is provided to vibrate in the advancing/retracting operation direction Z. A coupling member 33 is coupled to the piezoelectric element 31 to vibrate together with the piezoelectric element 31. The coupling member 33 is formed in the shape of an elongated circular column (pin). The distal end portion of the coupling member 33 opposite to the side on which the coupling member 33 is coupled to the piezoelectric element 31 is inserted into a space inside the tubular member 22. The diameter of the coupling member 33 is substantially equal to the inside diameter of the tubular member 22. The outer peripheral surface of the coupling member 33 and the inner peripheral surface of the tubular member 22 contact each other.
A spring member 34 is provided at a position at which the coupling member 33 and the tubular member 22 contact each other so as to surround the tubular member 22 from the outer peripheral side. The spring member 34 provides an inward preliminary pressure having a predetermined magnitude to cause a predetermined friction force between the coupling member 33 and the tubular member 22 forming the protrusion member 20. The preliminary pressure applied by the spring member 34 is set such that the static friction force between the coupling member 33 and the tubular member 22 is at least larger than a component of a gravitational force acting on the protrusion member 20 in the advancing/retracting operation direction Z. In addition, the preliminary pressure is set such that the coupling member 33 and the tubular member 22 can slide with respect to each other with a dynamic friction force caused between the coupling member 33 and the tubular member 22 along with vibration of the piezoelectric element 31. In the embodiment, a slide mechanism 32 is formed by a slide section formed by the tubular member 22 and the coupling member 33 and the spring member 34 serving as a preliminary pressure application unit.
In addition, the magnitude of the difference between the speed of vibration of the piezoelectric element 31 to one direction side along the advancing/retracting operation direction Z and the speed of vibration to the other side can be adjusted by the protrusion control section 52 (see
On the other hand, when the speed of vibration to the retraction direction side is lower than the speed of vibration to the protrusion direction side, the protrusion member 20 is moved to the retraction direction side. That is, the protrusion member 20 may be brought into a state (retracted state) in which the distal end portion of the protrusion member 20 is retracted to the back surface side with respect to the operation surface 11a. The “retracted state” includes a state in which the distal end portion of the pin member 21 of the protrusion member 20 is flush with the level of the operation surface 11a.
In the embodiment, the drive mechanism 30 is formed by the piezoelectric element 31 and the slide mechanism 32. The drive mechanism 30 may include the protrusion control section 52 which provides the piezoelectric element 31 with a pulsed drive signal. The plurality of protrusion members 20 can be independently moved between the protruded state and the retracted state by the drive mechanism 30. The operation input device 4 according to the embodiment thus includes a combination of the touch pad 10 and the plurality of protrusion members 20 provided so as to freely appear and disappear from the operation surface 11a of the touch pad 10.
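The stick-slip behavior of the drive mechanism described above can be modeled, purely for illustration, as follows: during the slow phase of each vibration cycle, static friction carries the tubular member with the coupling member (stick), while during the fast phase the member slips and retains most of its position. The gain values and amplitude are invented for this sketch and do not come from the disclosure:

```python
# Illustrative stick-slip model of the drive mechanism: asymmetric vibration
# speeds produce a net displacement per cycle. STICK_GAIN and SLIP_GAIN are
# arbitrary assumed fractions of the per-phase displacement transferred.

STICK_GAIN = 1.0   # fraction transferred during the slow (stick) phase
SLIP_GAIN = 0.1    # fraction transferred during the fast (slip) phase

def step(position_um, amplitude_um, protrude=True):
    """Advance the protrusion member by one asymmetric vibration cycle."""
    net = amplitude_um * (STICK_GAIN - SLIP_GAIN)
    if protrude:
        # slow extension (stick) then fast retraction (slip): net advance
        return position_um + net
    # slow retraction (stick) then fast extension (slip): net retreat
    return position_um - net

pos = 0.0
for _ in range(100):          # 100 drive pulses toward the protruded state
    pos = step(pos, 1.0)
print(round(pos, 1))          # → 90.0 (net 0.9 µm per cycle, assumed values)
```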
3. Configuration of Operation Input System
As shown in
The operation input computation section 50 includes a status determination section 51, the protrusion control section 52, a position sensing section 53, a depiction control section 54, and a select operation determination section 55. In the embodiment, in addition, the operation input computation section 50 further includes a protrusion state sensing section 56 and an input reception section 57.
The status determination section 51 determines a protrusion status representing the state of protrusion of each of the protrusion members 20 in accordance with the image content displayed on the display screen 41. In the embodiment, the protrusion status includes the “protruded state” and the “retracted state”. The “retracted state” is a state in which the protrusion member 20 is at the minimally displaced position within its movable range in the advancing/retracting operation direction Z (with the distal end portion of the pin member 21 flush with the level of the operation surface 11a). The “protruded state” is a state in which the protrusion member 20 is at the maximally displaced position within its movable range in the advancing/retracting operation direction Z. In the embodiment, the status determination section 51 determines which one of the protruded state and the retracted state each of the protrusion members 20 is brought into.
As discussed above, the display screen 41 may display an image of the operation
The status determination section 51 correlates the coordinates of the display screen 41 and the coordinates of the operation surface 11a, and determines that the protrusion status of one or more protrusion members 20 positioned at the coordinates on the operation surface 11a corresponding to the coordinates on the display screen 41 of the operation
In the case where the protrusion members 20 are arranged regularly over the entire operation surface 11a as illustrated in
In the case where the image displayed on the display screen 41 is changed, the status determination section 51 determines a difference between the protrusion status corresponding to the image before the change and the protrusion status corresponding to the image after the change for each of the protrusion members 20. The status determination section 51 determines which one of “not changed”, “transitioned to the protruded state”, and “transitioned to the retracted state” is applied to each of the protrusion members 20. In the case where the operation
The status determination section 51 outputs information on the protrusion status, or the difference in protrusion status, determined for each of the protrusion members 20 to the protrusion control section 52.
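The difference determination performed by the status determination section 51 can be sketched, for illustration only, as a per-member comparison of the protrusion status before and after the displayed image changes; the identifiers below are assumptions:

```python
# Illustrative sketch of the status-difference determination: for each
# protrusion member, compare the protrusion status corresponding to the image
# before the change with that after the change, and classify the transition
# as one of the three cases named in the description above.

PROTRUDED, RETRACTED = "protruded", "retracted"

def diff_status(before, after):
    """Classify each member as unchanged or transitioned between states."""
    result = []
    for b, a in zip(before, after):
        if b == a:
            result.append("not changed")
        elif a == PROTRUDED:
            result.append("transitioned to the protruded state")
        else:
            result.append("transitioned to the retracted state")
    return result

before = [RETRACTED, PROTRUDED, RETRACTED]
after = [PROTRUDED, PROTRUDED, RETRACTED]
print(diff_status(before, after))
# → ['transitioned to the protruded state', 'not changed', 'not changed']
```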
The protrusion control section 52 controls the position of the protrusion member 20 with respect to the operation surface 11a in the protrusion direction (which coincides with the advancing/retracting operation direction Z). The protrusion control section 52 controls the drive mechanism 30 on the basis of the information received from the status determination section 51. In the embodiment, the protrusion control section 52 vibrates the piezoelectric element 31 by applying a pulsed voltage. The protrusion control section 52 is configured to adjust the difference between the speed of vibration to one direction side along the advancing/retracting operation direction Z and the speed of vibration to the other side. Such a configuration may be achieved by changing the duty ratio in accordance with the direction of vibration of the piezoelectric element 31. The protrusion control section 52 moves the protrusion member 20 to the protrusion direction side by making the speed of vibration to the protrusion direction side lower than the speed of vibration to the retraction direction side. On the other hand, the protrusion control section 52 moves the protrusion member 20 to the retraction direction side by making the speed of vibration to the retraction direction side lower than the speed of vibration to the protrusion direction side.
As discussed above, the results of the determination performed by the status determination section 51 are based on whether or not the operation
In addition, the protrusion control section 52 brings the protrusion members 20 positioned at the coordinates on the operation surface 11a corresponding to the coordinates on the display screen 41 of a region in which the operation
The protrusion control section 52 vibrates the piezoelectric element 31 for a predetermined time longer than the time required to switch the protrusion member 20 between the protruded state and the retracted state, and thereafter stops the vibration. That is, a voltage is applied to the piezoelectric element 31 only for the predetermined time, and thereafter application of the voltage is stopped. Even after application of the voltage is stopped, the protrusion member 20 maintains its position in the advancing/retracting operation direction Z through static friction between the coupling member 33 and the tubular member 22.
In the embodiment, the protrusion height of the protrusion member 20 which is brought into the protruded state (height of the distal end portion of the protrusion member 20 with reference to the operation surface 11a) is set to be relatively small. In the case where the object to be sensed D is a fingertip of the user as shown in
The position sensing section 53 acquires a sensed position of the object to be sensed D on the operation surface 11a of the touch pad 10. The position sensing section 53 specifies the position of an electrode most proximal to the object to be sensed D on the basis of variations in capacitance of the electrodes caused when the object to be sensed D such as a fingertip is brought into contact with or into proximity to the operation surface 11a. Then, the position sensing section 53 acquires the specified position of the electrode as the sensed position on the operation surface 11a. The touch pad 10 may receive input corresponding to the sensed position on the operation surface 11a through such a function of the position sensing section 53. The position sensing section 53 outputs information on the acquired sensed position to the depiction control section 54 and the select operation determination section 55.
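The electrode-selection logic of the position sensing section 53 might be sketched as follows. The threshold value and the dictionary representation of the electrode grid are assumptions; an actual capacitance-type touch pad would read the electrodes through its controller hardware.

```python
def sensed_position(capacitance_delta, threshold: float = 0.5):
    """capacitance_delta maps (x, y) electrode coordinates to the measured
    change in capacitance. Returns the coordinates of the electrode most
    proximal to the object to be sensed D (largest change), or None when no
    electrode exceeds the assumed contact/proximity threshold."""
    best = max(capacitance_delta, key=capacitance_delta.get, default=None)
    if best is None or capacitance_delta[best] < threshold:
        return None
    return best
```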
The depiction control section 54 controls depiction of an image to be displayed on the display screen 41. The depiction control section 54 generates a plurality of layers containing images of a background, roads, names of places, etc. around the vehicle position. In addition, the depiction control section 54 generates a layer containing an image of a vehicle position mark representing the vehicle position, and a layer containing an image of a route for guidance to a destination in the case where such a destination is set. Further, the depiction control section 54 generates a layer containing images of the predetermined operation figures 44, and a layer containing an image of the predetermined operation cursor 45. Then, the depiction control section 54 superimposes the generated layers to generate a single display image, and causes the display screen 41 to display the generated image.
The depiction control section 54 causes the main operation figures 44 to be displayed in the operation figure display region R set in the display screen 41 (see
In addition, the depiction control section 54 appropriately displays and hides the operation cursor 45 in accordance with a request from the user. In the embodiment, in the case where the position sensing section 53 does not sense contact of the object to be sensed D with or proximity of the object to be sensed D to the operation surface 11a, the depiction control section 54 hides the operation cursor 45. In the case where the position sensing section 53 senses contact of the object to be sensed D with or proximity of the object to be sensed D to the operation surface 11a, on the other hand, the depiction control section 54 displays the operation cursor 45, which has a circular shape in the example, at a position on the display screen 41 corresponding to the sensed position on the operation surface 11a. In the example, the operation cursor 45 is displayed such that the sensed position and the center position of the operation cursor 45 coincide with each other. In the case where the object to be sensed D in contact with or in proximity to the operation surface 11a is slid and the sensed position is also slid, the operation cursor 45 being displayed is also moved on the display screen 41 synchronously.
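The show/hide behavior of the operation cursor 45 and the correlation between the coordinates of the operation surface 11a and the display screen 41 can be sketched as below. The proportional mapping and the dimensions are illustrative assumptions; the specification only states that the two coordinate systems are correlated.

```python
def surface_to_screen(pos, surface_size=(100.0, 60.0), screen_size=(800.0, 480.0)):
    """Map a sensed position on the operation surface 11a to the corresponding
    position on the display screen 41 (simple proportional mapping assumed)."""
    (sx, sy), (w1, h1), (w2, h2) = pos, surface_size, screen_size
    return (sx * w2 / w1, sy * h2 / h1)

def cursor_center(sensed_pos):
    """None hides the operation cursor 45; otherwise the cursor is displayed
    centered at the screen position corresponding to the sensed position."""
    if sensed_pos is None:
        return None
    return surface_to_screen(sensed_pos)
```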
The select operation determination section 55 determines whether or not a select operation is performed for the operation figure 44.
In the embodiment, two protrusion members 20 are assigned to one operation figure 44.
In the embodiment, the coordinates of the display screen 41 and the coordinates of the operation surface 11a are correlated with each other as discussed above, and only the protrusion members 20 corresponding to a particular operation figure 44 are brought into the protruded state.
In the embodiment, in addition, each of the operation figures 44 displayed on the display screen 41 is expressed by a pair of (two) protrusion members 20 in the form of two protrusion portions arranged side by side. Therefore, the user may easily grasp the position of the operation figure assignment region I on the operation surface 11a by recognizing the two points at the same location through tactile sensation. In addition, the configuration of the drive mechanism 30 can be advantageously relatively simplified without increasing the number of protrusion members 20 more than necessary.
In the case where it is determined that a select operation for the operation
The protrusion state sensing section 56 senses the protruded state and the retracted state of the protrusion members 20. The protrusion state sensing section 56 is configured to acquire information from a position sensor (not shown), for example. The protrusion state sensing section 56 senses whether the actual protrusion status of each protrusion member 20 is the protruded state or the retracted state on the basis of the acquired information on the position of the protrusion member 20 in the advancing/retracting operation direction Z. The protrusion state sensing section 56 outputs information on the sensing results to the input reception section 57 of the select operation determination section 55.
In the case where the protrusion state sensing section 56 senses that the protrusion member 20 has been changed from the protruded state to the retracted state, the input reception section 57 receives input to the protrusion member 20. In the embodiment, as described above, the protrusion members 20 corresponding to a particular operation figure 44 are brought into the protruded state.
In the embodiment, in which the input reception section 57 is provided, a select operation for the operation
4. Process Procedures of Operation Input Reception Process
The process procedures of the operation input reception process performed by the operation input system 3 according to the embodiment will be described with reference to
In the operation input reception process, as shown in
In the input determination process, as shown in
In the case where a touch operation is sensed in step #14 (step #14: Yes), it is determined whether or not the position at which the touch operation is sensed falls within the operation figure assignment region I (step #15). In the case where it is determined that the sensed position falls within the operation figure assignment region I (step #15: Yes) or in the case where it is determined in step #13 that a depression operation for the protrusion member 20 has been sensed (step #13: Yes), the type of the operation
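The branching of the input determination process (steps #13 to #15) can be sketched as follows. The rectangular shape of the operation figure assignment regions I, the return values, and the placeholder for the figure tied to a depressed protrusion member are all assumptions for illustration.

```python
def input_determination(depression_sensed: bool, touch_pos, assignment_regions):
    """assignment_regions maps a figure id to its rectangle (x0, y0, x1, y1) on
    the operation surface 11a. Returns the id of the selected operation figure,
    or None when no select operation is recognized."""
    if depression_sensed:           # step #13: depression of a protrusion member sensed
        return "depressed-figure"   # placeholder for the figure tied to that member
    if touch_pos is None:           # step #14: no touch operation sensed
        return None
    for fig_id, (x0, y0, x1, y1) in assignment_regions.items():  # step #15
        if x0 <= touch_pos[0] <= x1 and y0 <= touch_pos[1] <= y1:
            return fig_id           # sensed position within an assignment region I
    return None
```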
When the input determination process is terminated, the process returns to
5. Procedures of Control Process for Operation Surface of Touch Pad of Operation Input Device
The display screen 41 of the display input device 40 is occasionally divided into a plurality of screen regions 48 as shown in
In the case where the display screen 41 is divided into a plurality of screen regions 48 that display independent images, the protrusion control section 52 divides the operation surface 11a of the touch pad 10 into the number of operation surface regions 18, the number of the operation surface regions 18 being the same as the number of the screen regions 48. That is, the protrusion control section 52 sets a boundary 19 between the operation surface regions 18, and causes the protrusion members 20 to be protruded from the operation surface 11a along the boundary 19. As a result, the boundary 19 formed by the protruded protrusion members 20 makes each operation surface region 18 clearly distinguishable by the user. Thus, the user can clearly recognize an operation surface region 18 corresponding to a particular screen region 48 for which it is desired to input a predetermined operation. In this event, the protrusion control section 52 sets the boundary 19 between the operation surface regions 18 such that the area of each of the operation surface regions 18 corresponds to the display content of the corresponding screen region 48, irrespective of the ratio in area between the plurality of screen regions 48. That is, as one viewpoint, the protrusion control section 52 may set the boundary 19 between the operation surface regions 18 such that the area of each of the operation surface regions 18 corresponds to the number of operation figures 44 in the corresponding screen region 48 (see a first example etc. to be discussed later). As another viewpoint, the protrusion control section 52 may set the area of each of the operation surface regions 18 in accordance with an operation method corresponding to the display content of each of the screen regions 48 and acceptable by the operation surface region 18 corresponding to each screen region 48 (see a second example etc. to be discussed later).
More particularly, after acquiring information on partitioning between the screen regions 48 on the display screen 41 of the display input device 40 from the depiction control section 54, the status determination section 51 determines that the protrusion status of the protrusion members 20 assigned to the boundary 19 is the protruded state. The protrusion control section 52 causes the corresponding protrusion members 20 to be protruded on the basis of the determination results. A plurality of functional sections thus cooperate with each other to set the boundary 19. In the following description, unless otherwise noted, it is assumed that the protrusion control section 52 principally performs control (it may be considered that the operation input computation section 50 principally performs control).
With reference to
In a first example shown in
Thus, in one aspect, the protrusion control section 52 sets the boundary 19 such that an operation surface region 18 corresponding to a screen region 48 containing the larger number of operation figures 44 has a larger area, irrespective of the ratio in area between the plurality of screen regions 48. That is, an operation surface region 18 with the larger number of operation figures 44 has a larger area, and thus the operation figures 44 in each screen region 48 are appropriately distributed in the corresponding operation surface region 18. For example, an appropriate gap that is uniform over the entire touch pad 10 may be set between the operation figure assignment regions I corresponding to the operation figures 44 in the operation surface regions 18. This enables the user to perform more reliable operation input.
Preferably, the protrusion control section 52 sets the boundary 19 quantitatively. For example, in one aspect, the protrusion control section 52 may set the area of the operation surface region 18 corresponding to each of the screen regions 48 such that the ratio in area between the operation surface regions 18 matches the ratio in number of operation figures 44 contained in each of the plurality of screen regions 48, irrespective of the ratio in area between the plurality of screen regions 48. For example, the ratio in area between the first operation surface region 18a and the second operation surface region 18b illustrated in
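For a rectangular operation surface divided by one boundary running across its width, the quantitative setting described above reduces to a simple proportion. A minimal sketch under that assumption:

```python
def boundary_position(n1: int, n2: int, surface_width: float) -> float:
    """Position of the boundary 19 along the division direction such that the
    ratio in area between the first and second operation surface regions 18a
    and 18b matches the ratio N1:N2 of operation figures 44 contained in the
    corresponding screen regions (first example)."""
    return surface_width * n1 / (n1 + n2)
```

For example, with N1 = 6 figures in the first screen region and N2 = 2 in the second, the first operation surface region would occupy three quarters of the surface width.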
The procedures of setting the operation surface regions 18 and the boundary 19 will be described below with additional reference to the flowcharts of
In the operation surface region boundary setting process through operation figure number acquisition, as shown in
The operation surface region boundary setting process through operation figure number acquisition will be described using a specific example with reference to
On the other hand, in the case where the numbers N1 and N2 of operation figures 44 are not equal to each other as in the example, the process takes the branch indicated by "No" at step #331, and it is determined which of the numbers N1 and N2 of operation figures 44 is larger (step #332). In the example, the number N1 of operation figures 44 in the first screen region 48a is the larger. Thus, the determination in step #332 is "Yes" (step #332: Yes), and the process proceeds to step #334. That is, a boundary 19 is set such that the area of the first operation surface region 18a corresponding to the first screen region 48a is set to be preferentially larger than that of the second operation surface region 18b corresponding to the second screen region 48b (step #334). In the case where the number N2 of operation figures 44 in the second screen region 48b is larger, in contrast, a boundary 19 is set such that the area of the second operation surface region 18b corresponding to the second screen region 48b is set to be preferentially larger than that of the first operation surface region 18a corresponding to the first screen region 48a (step #335).
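The three branches of steps #331 to #335 can be expressed compactly. The string return values below are illustrative; they stand for which operation surface region is set preferentially larger.

```python
def compare_figure_counts(n1: int, n2: int) -> str:
    """Branch logic of steps #331-#335 based on the numbers N1 and N2 of
    operation figures 44 in the first and second screen regions."""
    if n1 == n2:        # step #331: Yes
        return "equal"  # step #333: boundary mirrors the screen regions
    if n1 > n2:         # step #332: Yes
        return "first"  # step #334: first region set preferentially larger
    return "second"     # step #335: second region set preferentially larger
```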
When the boundary 19 is set in the operation surface region boundary setting process through operation figure number acquisition (step #30 of
In the first example discussed above, the protrusion control section 52 sets the boundary 19 between the operation surface regions 18 such that the area of each of the operation surface regions 18 corresponds to the number of operation figures 44 in the corresponding screen region 48. However, the manner of setting the boundary 19 is not limited to the manner according to the first example. In a second example described below, a boundary between the operation surface regions 18 is set such that the area of each of the operation surface regions 18 corresponds to the display content of the corresponding screen region 48. The display content indicates the characteristics (screen region characteristics) of a screen depicted in each screen region 48 by the depiction control section 54. The screen region characteristics may be the type of a display image such as a map screen, a television broadcast screen, a screen for displaying a video image from a video disc or the like, an audio setting screen, a schematic view of an expressway, a screen for displaying a schematic view of a route, etc.
The screen region characteristics may also be the type of an operation mode in which the user can operate each screen region 48 or an operation surface region 18 corresponding to each screen region 48. Examples of the type of the operation mode include an operation of bringing the object to be sensed D, which has not been in contact with the operation surface 11a, into contact with the operation surface 11a (touch operation), an operation of temporarily moving the object to be sensed D, which has been in contact with the operation surface 11a, away from the operation surface 11a and thereafter bringing the object to be sensed D into contact with the operation surface 11a again (tap operation), and an operation of performing two tap operations within a predetermined time (double-tap operation). Examples of the type of the operation mode also include an operation of sliding the object to be sensed D with the object to be sensed D in contact with or in proximity to the operation surface 11a (slide operation), and an operation of varying the distance between two objects to be sensed D by causing the objects to be sensed D to move closer to and away from each other with the objects to be sensed in contact with the operation surface 11a (pinch-touch operation). For example, a pinch-touch operation may be performed on a map screen to move the objects to be sensed D away from each other to enlarge a region between the objects to be sensed D, or to move the objects to be sensed D closer to each other to reduce a region between the objects to be sensed D. The type of the display screen and the type of the operation mode may be associated with each other to prescribe the priority of an operation to use the prescribed priority as the screen region characteristics.
As described above, the screen region characteristics may be the type of the operation mode in which the user can operate each screen region 48 or an operation surface region 18 corresponding to each screen region 48. In the case where an image displayed in a screen region 48 is a map image, for example, it may be more convenient if a map can be enlarged/reduced intuitively through a slide operation and a pinch-touch operation, rather than a touch operation performed on an operation figure 44.
From such a viewpoint, in addition, it is not necessary that an operation
In the second example shown in
The first screen region 48a corresponds to a map screen, and can be enlarged/reduced through a pinch-touch operation discussed above, besides an operation performed utilizing the operation figures 44 (“−” and “+” marks). In the navigation apparatus 1, the map display function is given priority over the audio setting function. Thus, it is determined on the basis of the display content (screen region characteristics) that the first screen region 48a is given priority over the second screen region 48b in assigning an area on the touch pad 10 for exclusive use. In the second example in which a boundary between the operation surface regions 18 is set such that the area of each of the operation surface regions 18 corresponds to the display content of the corresponding screen region 48, the area of the first operation surface region 18a corresponding to the first screen region 48a is set to be preferentially larger.
What display content of the screen regions 48 is to be given priority may be determined in advance by determining the order (order of priority) in accordance with the type of the display image, characteristics information obtained by combining the type of the display image and the type of the operation mode, or the like, and storing such order in a table or the like. The type of the display image and the type of the operation mode may be converted into numerals to calculate priority through computation to decide the order in accordance with the calculated priority. The ratio in area between the operation surface regions may be decided quantitatively in accordance with the order of priority or the priority.
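One way to convert the types into numerals and compute a priority, as suggested above, is sketched below. The table entries and the additive scoring rule are purely illustrative assumptions; the specification leaves the concrete table and computation open.

```python
# Hypothetical priority tables for the display-image type and the operation-mode
# type (screen region characteristics). All values are assumptions.
IMAGE_PRIORITY = {"map": 3, "tv": 2, "audio_setting": 1}
MODE_PRIORITY = {"pinch": 2, "slide": 2, "tap": 1, "touch": 1}

def region_priority(image_type: str, modes: list) -> int:
    """Priority of a screen region 48: base priority of its display image plus
    the highest priority among the operation modes it accepts. A higher value
    means the corresponding operation surface region 18 is set larger."""
    mode_score = max((MODE_PRIORITY.get(m, 0) for m in modes), default=0)
    return IMAGE_PRIORITY.get(image_type, 0) + mode_score
```

Under these assumed values, a map screen accepting pinch-touch operations outranks an audio setting screen accepting only touch operations, matching the second example.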
That is, the protrusion control section 52 sets the boundary 19 between the operation surface regions 18 such that the area of each of the operation surface regions 18 corresponds to the display content (screen region characteristics such as the type of the display image and the type of the operation mode) of the corresponding screen region 48, and causes the protrusion members 20 to be protruded along the boundary 19. As a preferred aspect, as in the display input device 40 according to the embodiment, in the case where the display screen 41 is a touch panel having a function of sensing an object to be sensed D in contact with or in proximity to the display screen 41 to receive input corresponding to the position of the sensed object D, the protrusion control section 52 sets the area of each of the operation surface regions 18 in accordance with an operation method (type of the operation mode) acceptable by each of the screen regions 48 on the display screen 41. The protrusion control section 52 may also set the area of each of the operation surface regions 18 in accordance with an operation method corresponding to each of the screen regions 48 and acceptable by the operation surface region 18 corresponding to each screen region 48, irrespective of whether or not the display screen 41 has a function of sensing an object to be sensed in contact with or in proximity to the display screen 41 to receive input corresponding to the position of the sensed object.
In particular, the protrusion control section 52 preferably sets the area of each of the operation surface regions 18 in accordance with whether or not the operation method acceptable by the screen region 48 or the operation surface region 18 includes a touch operation, or whether or not the operation method includes both a slide operation (in particular, a pinch-touch operation) and a touch operation. A slide operation including a pinch-touch operation, for example, involves operation performed along the operation surface 11a, and thus requires a large area compared to a touch operation and a tap operation which involve an operation performed vertically to the operation surface 11a. Thus, setting the area of an operation surface region 18 corresponding to a screen region 48 that receives a slide operation to be larger improves convenience to the user.
In the case where the display screen 41 has a function of sensing an object to be sensed D in contact with or in proximity to the display screen 41 to receive input corresponding to the position of the sensed object D as in the display input device 40, the operation method acceptable by the screen region 48 is the operation method that each of the screen regions 48 on the display screen 41 can accept. The operation method acceptable by the operation surface region 18, on the other hand, is set for each of the operation surface regions 18 in accordance with the corresponding screen region 48, and can be accepted by that operation surface region 18 irrespective of whether or not the display screen 41 itself can accept such operation input.
The procedures of setting the operation surface regions 18 and the boundary 19 will be described below with additional reference to the flowcharts of
In the operation surface region boundary setting process through screen region characteristics acquisition, as shown in
The operation surface region boundary setting process through screen region characteristics acquisition will be described using a specific example with reference to
Next, it is determined whether or not the priority based on the screen region characteristics of the first screen region 48a and the priority based on the screen region characteristics of the second screen region 48b are equal to each other (step #431). In the case where the two priorities are equal to each other, it is not necessary to set any of the respective areas of the operation surface regions 18 corresponding to the first screen region 48a and the second screen region 48b to be preferentially larger. Thus, a boundary 19 that makes the areas of the operation surface regions 18 match those of the screen regions 48 is set. That is, a boundary 19 is set at a position on the operation surface 11a corresponding to the middle between the two screen regions 48 (step #433).
In the example, the priorities of the screen regions 48a and 48b are not equal to each other with the first screen region 48a given higher priority as discussed above. Thus, the process takes the branch indicated by “No” at step #431, and it is determined which of the screen regions 48a and 48b is given higher priority (step #432). In the case where the priority of the first screen region 48a is higher as in the example, a boundary 19 is set such that the area of the first operation surface region 18a corresponding to the first screen region 48a is set to be preferentially larger than that of the second operation surface region 18b corresponding to the second screen region 48b. (step #434). In the case where the priority of the second screen region 48b is higher, in contrast, a boundary 19 is set such that the area of the second operation surface region 18b corresponding to the second screen region 48b is set to be preferentially larger than that of the first operation surface region 18a corresponding to the first screen region 48a (step #435).
When the boundary 19 is set in the operation surface region boundary setting process through screen region characteristics acquisition (step #40 of
In the first and second examples, the operation surface regions 18 are set irrespective of the ratio in area between the screen regions 48. Therefore, in the case where operation figures 44 are contained in the screen regions 48, the operation figure assignment regions I may not be set at sufficient intervals, particularly in an operation surface region 18 whose area ratio is set to be low compared to that of the corresponding screen region 48.
If the direction of arrangement of the operation figures 44 displayed in the second screen region 48b and the direction of arrangement of the operation figure assignment regions I set in the second operation surface region 18b are different from each other, it may be difficult for the user to perform accurate operation input to the touch pad 10. Thus, the depiction control section 54 disposes (rearranges) the plurality of operation figures 44 in the screen region 48 corresponding to the narrow operation surface region 18N such that the arrangement of the operation figures 44 corresponds to the arrangement of the protrusion members 20 established by the protrusion control section 52 as indicated by the solid line in
That is, in the case where the boundary 19 is set within a predetermined distance (M) from the outer periphery of the operation surface 11a and a plurality of operation figures 44 are provided in a screen region 48 corresponding to the narrow operation surface region 18N which is an operation surface region 18 set between the boundary 19 and the outer periphery, the protrusion control section 52 disposes the protrusion members 20 protruded from the operation surface 11a in correspondence with the plurality of operation figures 44 in the narrow operation surface region 18N so as to be in parallel with the boundary 19. Thus, the depiction control section 54 disposes (rearranges) the plurality of operation figures 44 in the screen region 48 corresponding to the narrow operation surface region 18N such that the arrangement of the operation figures 44 corresponds to the arrangement of the protrusion members 20 (operation figure assignment regions I) established by the protrusion control section 52.
As shown in
In one aspect, the predetermined distance (M) is preferably set on the basis of the number of operation figures 44 contained in the screen region 48 (or the number of operation figure assignment regions I contained in the operation surface region 18). For example, the predetermined distance (M) may be defined as a function f of N by the following formula:
M = f(N)
Then, it is preferably determined whether or not the operation surface region 18 is a narrow operation surface region 18N on the basis of the calculated predetermined distance (M) and the actual distance L between the outer periphery of the operation surface 11a and the boundary 19 (or the actual distance L between adjacent boundaries 19 illustrated in
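The narrow-region determination above can be sketched as follows. The linear form assumed for f(N) and its coefficients are illustrative; the specification only states that the predetermined distance M is a function of the number N of operation figures.

```python
def is_narrow_region(distance_l: float, n_figures: int,
                     base: float = 5.0, per_figure: float = 3.0) -> bool:
    """True when an operation surface region is a narrow operation surface
    region 18N: the actual distance L between the boundary 19 and the outer
    periphery of the operation surface 11a (or an adjacent boundary) is less
    than the predetermined distance M = f(N). f is assumed linear in N here."""
    m = base + per_figure * n_figures  # assumed form of M = f(N)
    return distance_l < m
```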
If it is determined in step #73 that there is any narrow operation surface region 18N, the protrusion control section 52 disposes the operation figure assignment regions I (operation figures 44) in the narrow operation surface region 18N in parallel with the boundary 19 (step #74). Then, the protrusion control section 52 informs the depiction control section 54 that the operation figure assignment regions I in the narrow operation surface region 18N have been set in an arrangement different from the arrangement of the operation figures 44 in the screen region 48 corresponding to the narrow operation surface region 18N (step #75). The depiction control section 54 rearranges the operation figures 44 in the screen region 48 corresponding to the narrow operation surface region 18N in accordance with the arrangement of the operation figure assignment regions I in the narrow operation surface region 18N (step #76).
The arrangement of the operation figures 44 in the screen region 48 is changed in accordance with the arrangement of the operation figure assignment regions I in the operation surface region 18 so that the two arrangements are common. As a result, the user can easily correlate the operation figures 44 in the screen region 48 with the corresponding regions in the operation surface region 18, which enables the user to perform more reliable operation input.
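The rearrangement of steps #74 to #76 amounts to placing the assignment regions on a line parallel to the boundary 19. A sketch under the assumption that the boundary runs along one axis of a rectangular narrow operation surface region 18N:

```python
def arrange_parallel_to_boundary(n_figures: int, boundary_axis: str,
                                 region_length: float):
    """Evenly place the centers of the operation figure assignment regions I
    along the axis the boundary 19 runs along, so their arrangement is parallel
    to the boundary. Returns a list of (x, y) center offsets."""
    step = region_length / (n_figures + 1)
    offsets = [step * (i + 1) for i in range(n_figures)]
    if boundary_axis == "y":                 # boundary runs along Y
        return [(0.0, o) for o in offsets]   # arrange figures along Y
    return [(o, 0.0) for o in offsets]       # boundary runs along X
```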
In the embodiment, two hole portions 12 are arranged along the Y direction of the operation surface 11a as the hole portions 12 through which the protrusion members 20 provided in an operation figure assignment region I corresponding to one operation figure 44 are advanced and retracted.
In the first to third examples discussed above, the display screen 41 is configured to have two screen regions 48. However, the display screen 41 may be configured to have three or more screen regions 48 as shown in
In the third example discussed above, the boundary 19 is set within a predetermined distance (M) from the outer periphery of the operation surface 11a, and a plurality of operation figures 44 are provided in a screen region 48 corresponding to the narrow operation surface region 18N which is an operation surface region 18 set between the boundary 19 and the outer periphery. That is, the narrow operation surface region 18N faces the outer periphery of the operation surface 11a. However, the narrow operation surface region 18N does not necessarily face the outer periphery of the operation surface 11a. For example, as shown in
Lastly, operation input systems according to other embodiments of the present invention will be described. A configuration disclosed in each of the following embodiments may be applied in combination with a configuration disclosed in any other embodiment.
(1) In the embodiment described above, the protrusion members 20 are protruded in the operation figure assignment regions I. However, in the case where an image displayed in a screen region 48 is a map image, for example, it may be more convenient if a map can be enlarged/reduced through a slide operation and a pinch-touch operation, not a touch operation performed on an operation figure 44.
(2) In the embodiment described above, the drive mechanism 30 brings the protrusion member 20 into one of the protruded state and the retracted state. However, embodiments of the present invention are not limited thereto. That is, the drive mechanism 30 may be configured to bring the protrusion member 20 into an intermediate state between the protruded state and the retracted state. In this case, the protrusion control section 52 may be configured to control stepwise the position of the protrusion member 20 with respect to the operation surface 11a in the protrusion direction (advancing/retracting operation direction Z) so that the protrusion member 20 can be protruded stepwise.
(3) In the embodiment described above, the drive mechanism 30 includes the piezoelectric element 31, the slide mechanism 32, and the protrusion control section 52. However, embodiments of the present invention are not limited thereto. That is, the drive mechanism 30 may have any specific configuration as long as the drive mechanism 30 can cause advancing/retracting operation of the protrusion member 20 along the advancing/retracting operation direction Z to move the protrusion member 20 between the protruded state and the retracted state. For example, the drive mechanism 30 may utilize a fluid pressure such as a liquid pressure or a gas pressure, or may utilize an electromagnetic force of an electromagnet or the like.
(4) In the embodiment described above, the protrusion member 20 is driven so as to be advanced and retracted along the advancing/retracting operation direction Z set to a direction orthogonally intersecting the operation surface 11a. However, embodiments of the present invention are not limited thereto. That is, the advancing/retracting operation direction Z may be set to a direction inclined with respect to, rather than orthogonally intersecting, the operation surface 11a. In this case, in the case where the touch pad 10 is disposed generally horizontally at the center console portion as in the embodiment described above, for example, the advancing/retracting operation direction Z is preferably set to be inclined toward a driver's seat.
(5) In the embodiment described above, the touch pad 10 of the capacitance type which can sense the object to be sensed D in contact with or in proximity to the operation surface 11a is used. However, embodiments of the present invention are not limited thereto. That is, the touch pad 10 of the resistance film type may also be utilized in place of the touch pad 10 of the capacitance type. Alternatively, the touch pad 10 of a pressure sensitive type which can sense the object to be sensed D in contact with the operation surface 11a may also be utilized.
(6) In the embodiment described above, the operation
(7) In the embodiment described above, the protrusion state sensing section 56 is configured to sense the actual protrusion status of each protrusion member 20 on the basis of information acquired from a position sensor. However, embodiments of the present invention are not limited thereto. For example, the protrusion state sensing section 56 may be formed using the piezoelectric element 31 provided in the drive mechanism 30 as a sensor element, by utilizing the characteristics of the piezoelectric element 31. As discussed above, when the protrusion control section 52 drives the protrusion member 20 so as to be advanced and retracted, application of the voltage is stopped after a predetermined time elapses. Therefore, a configuration that senses, as an electric signal, an external force (a depressing force applied by the user) transmitted to the piezoelectric element 31 via the protrusion member 20 and the coupling member 33 after the voltage application has stopped makes it possible to sense an operation (a depression operation) performed by the user on the protrusion member 20. Then, the protrusion state sensing section 56 may sense the actual protrusion status of each protrusion member 20 on the basis of the sensed depression operation and the protrusion status of each protrusion member 20 determined by the status determination section 51. That is, when an electric signal is sensed from the piezoelectric element 31 corresponding to a protrusion member 20 in the protruded state, the protrusion state sensing section 56 determines that the protrusion member 20 has been brought into the retracted state.
Meanwhile, when a lapse of the predetermined time is detected by a timer or the like after the piezoelectric element 31 corresponding to a protrusion member 20 in the retracted state is vibrated, the protrusion state sensing section 56 determines that the protrusion member 20 has been brought into the protruded state.
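The sensorless state-determination logic of variation (7) can be sketched as a simple state machine: a signal from the piezoelectric element while the member is protruded indicates a depression (the member is now retracted), and a timer elapsing after a drive vibration indicates that the member has reached the protruded state. All names below are hypothetical illustrations, not terms from the embodiment.

```python
class ProtrusionStateSensor:
    """Illustrative sketch: infers the actual state of a protrusion member
    using the piezoelectric element itself as the sensor element.
    - An electric signal from the element while the member is protruded is
      interpreted as a user depression (member now retracted).
    - After the element is vibrated to protrude the member, the member is
      deemed protruded once a fixed settle time has elapsed."""

    def __init__(self, settle_time_s: float = 0.05):
        self.settle_time_s = settle_time_s
        self.state = "retracted"      # assumed initial state
        self._vibrated_at = None      # time of the last drive vibration

    def on_vibrate(self, now: float) -> None:
        """Drive pulse sent to protrude the member; start the timer."""
        self._vibrated_at = now

    def on_piezo_signal(self) -> None:
        """Electric signal sensed after voltage application stopped:
        the user has depressed a protruded member."""
        if self.state == "protruded":
            self.state = "retracted"

    def poll(self, now: float) -> str:
        """Update and return the inferred state at time `now` (seconds)."""
        if (self._vibrated_at is not None
                and now - self._vibrated_at >= self.settle_time_s):
            self.state = "protruded"
            self._vibrated_at = None
        return self.state
```

In this sketch the timestamps are passed in explicitly so the logic stays testable; an embedded implementation would read a hardware timer instead.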
(8) In the embodiment described above, the operation input computation section 50 includes the functional sections 51 to 57. However, embodiments of the present invention are not limited thereto. That is, the division of functions among the functional sections described in relation to the embodiment above is merely illustrative; a plurality of the functional sections may be combined with each other, or a single functional section may be further divided into sub-sections.
(9) In the embodiment described above, the operation input device 4 is used to perform operation input to the in-vehicle navigation apparatus 1. However, embodiments of the present invention are not limited thereto. That is, the operation input device according to the present invention may be used to perform operation input to a navigation system in which the components of the navigation apparatus 1 described in the embodiment above are distributed between a server device and an in-vehicle terminal device, or to a laptop personal computer, a gaming device, or other systems and devices such as control devices for various machines, for example.
(10) Also regarding other configurations, the embodiment disclosed herein should be considered illustrative in all respects, and the present invention is not limited thereto. That is, any configuration not described in the claims of the present invention may be altered as appropriate without departing from the object of the present invention.
The present invention may be suitably applied to an operation input system including a touch pad serving as a pointing device.
Number | Date | Country | Kind
---|---|---|---
2012-010316 | Jan 2012 | JP | national