The present disclosure is based on Japanese Patent Application No. 2012-220661 filed on Oct. 2, 2012, the disclosure of which is incorporated herein by reference.
The present disclosure relates to a manipulating apparatus that manipulates an image portion displayed on a display screen by an input to a manipulation portion.
Patent Literature 1 discloses the art of moving image portions on a display screen, such as a pointer of a navigation display and a radio main screen, in association with manipulations performed to a remote touch pad portion. A user interface apparatus disclosed in Patent Literature 1 includes a remote touch pad portion that detects the manipulation to move a finger of a manipulator and a control portion that associates the manipulation of the finger detected by the remote touch pad portion with movement of a map and a pointer.
In addition, the control portion acquires the distance from the remote touch pad portion to the finger. When the acquired distance to the finger is less than a predefined height, for example, three centimeters (cm), the control portion associates the manipulation by the finger detected by the remote touch pad portion with the manipulation that moves the pointer on the display screen. In contrast, when the distance to the finger acquired by the control portion is within a predefined height range, for example, from 5 cm to 7 cm, the control portion associates the manipulation by the finger detected by the remote touch pad portion with the switching manipulation that switches from a radio main screen to a manipulation wait screen.
Patent Literature 1: JP 2011-118857 A
Now, the manipulating apparatus of Patent Literature 1 does not provide any configuration that assists the input by the finger of a manipulator for switching the radio main screen to the manipulation wait screen. Therefore, it is difficult for the manipulator to grasp the range in which the switching control is performable. When this range is unclear, it may be difficult for the manipulator to perform the input for the switching manipulation easily.
It is an object of the present disclosure to provide a manipulating apparatus that facilitates the input by a manipulator in a manipulation space spaced apart from a manipulation portion.
To achieve the above object, a manipulating apparatus according to a first aspect of the present disclosure is provided as follows. The manipulating apparatus manipulates an image displayed on a display screen by an input with a manipulation body performed to a manipulation surface. The manipulating apparatus includes a detection device, an acquisition device, an association device, and an input auxiliary portion. The detection device detects movement of the manipulation body. The acquisition device acquires a manipulation body distance from the manipulation surface to the manipulation body. The association device distinguishes a first movement of the manipulation body detected in a first manipulation space in which the manipulation body distance is less than a predefined threshold distance from a second movement of the manipulation body detected in a second manipulation space in which the manipulation body distance is greater than the predefined threshold distance. The association device associates the first movement of the manipulation body and the second movement of the manipulation body, respectively, with a first image portion displayed on the display screen and a second image portion displayed on the display screen; the second image portion is different from the first image portion. The association device changes a display mode of at least one of the first image portion and the second image portion. The input auxiliary portion is formed to a location spaced apart from the manipulation surface by a distance equal to or greater than the threshold distance.
According to this configuration, when an input is made in a second manipulation space in which movement of a manipulation body is associated with a second image portion, a manipulator can receive input assistance by contacting an input auxiliary portion formed apart from a manipulation surface by a threshold distance. Because of such assistance, the manipulator can easily grasp the location of the second manipulation space defined apart from the manipulation surface. Therefore, the manipulator can easily perform the input in the second manipulation space.
A manipulating apparatus according to a second aspect of the present disclosure is provided to include a plane-shaped capacitive touch panel and a guide portion that covers part of the capacitive touch panel and guides input manipulation to the capacitive touch panel.
A manipulating apparatus according to a third aspect of the present disclosure is provided as follows. The manipulating apparatus manipulates an image displayed on a display screen by an input with a manipulation body performed to a manipulation surface. The manipulating apparatus includes: means for detecting movement of the manipulation body; means for acquiring a manipulation body distance from the manipulation surface to the manipulation body; means for distinguishing a first movement of the manipulation body detected in a first manipulation space in which the manipulation body distance is less than a predefined threshold distance from a second movement of the manipulation body detected in a second manipulation space in which the manipulation body distance is greater than the predefined threshold distance, associating the first movement of the manipulation body and the second movement of the manipulation body, respectively, with a first image portion displayed on the display screen and a second image portion displayed on the display screen, the second image portion being different from the first image portion, and changing a display mode of at least one of the first image portion and the second image portion; and means for assisting the second movement of the manipulation body in the second manipulation space.
The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description made with reference to the accompanying drawings. In the drawings:
Hereafter, multiple embodiments of the present disclosure are described based on the figures. The same reference numerals designate corresponding components in each embodiment, and repetitive explanation thereof may be omitted. When only part of a configuration is explained in each embodiment, configurations of other embodiments already explained can be applied to the other part of the configuration. Not only the specified combinations of configurations explained in each embodiment, but also sub-combinations of configurations of multiple embodiments, though not specified, are possible as long as the combinations present no difficulty. Non-specified combinations of configurations described in multiple embodiments and modifications are also disclosed in the following explanation.
A remote manipulating apparatus 100 of a first embodiment of the present disclosure is mounted to a vehicle, and configures a display system 10 together with a navigation apparatus 50 as shown in
The display image 60 shown in
As shown in
Next, configurations of the remote manipulating apparatus 100 and navigation apparatus 50 shown in
The remote manipulating apparatus 100 is connected to a Controller Area Network (CAN) bus 90 and an external battery 95. The CAN bus 90 is a transmission path used for data transmission between each vehicle-mounted apparatus in an in-vehicle communication network that connects multiple vehicle-mounted apparatuses to each other. The remote manipulating apparatus 100 is capable of communicating with the navigation apparatus 50 separate therefrom by CAN communication via the CAN bus 90.
The remote manipulating apparatus 100 includes power interfaces 21 and 22, a communication control portion 23, a communication interface 24, a touch sensor 31, and a manipulation control portion 33. The power interfaces 21 and 22 stabilize electric power supplied from the battery 95 to supply the power to the manipulation control portion 33. Power is always supplied to the power interface 21 from the battery 95. When a switch 93 is electrically conductive upon turn-on of an accessory (ACC) power of the vehicle, power is supplied from the battery 95 to the power interface 22.
The communication control portion 23 and communication interface 24 output the information processed by the manipulation control portion 33 to the CAN bus 90 and acquire the information outputted from other vehicle-mounted apparatuses to the CAN bus 90. The communication control portion 23 and communication interface 24 are connected to each other by a signal line TX for transmission and a signal line RX for reception.
As shown in
The manipulation control portion 33 is also called a control circuit 33, and includes a processor for various arithmetic processing, a RAM that functions as an area for the arithmetic processing, and a flash memory that stores programs for the arithmetic processing. In addition, the manipulation control portion 33 is connected to the power interfaces 21 and 22, communication control portion 23, and touch sensor 31.
Charge is stored between the finger F and the electrodes of the touch sensor 31 that are proximate to each other as shown in
The navigation apparatus 50 shown in
The display control portion 53 includes a processor that performs various arithmetic processing, a RAM that functions as a work area for the arithmetic processing, a graphic processor that performs rendering of images, and a graphic RAM that functions as a work area for rendering of images. In addition, the display control portion 53 has a flash memory to store data used for arithmetic processing and rendering, a communication interface connected to the CAN bus 90, and an image output interface to output data of rendered images to the liquid crystal display 51. The display control portion 53 renders the display images 60 to be displayed on the display screen 52 based on the information acquired from the CAN bus 90. The display control portion 53 outputs the image data of the rendered display images 60 to the liquid crystal display 51 one after another via the image output interface.
The liquid crystal display 51 is a dot matrix type display that realizes color display by controlling multiple pixels arranged on the display screen 52. The liquid crystal display 51 displays images by sequentially forming the image data acquired successively from the display control portion 53 on the display screen 52.
Next, the configuration of the manipulation surface 70 is explained in more detail based on
The manipulation surface 70 is formed to a recessed portion 80 provided to the remote manipulating apparatus 100. The recessed portion 80 is recessed in a rectangular shape from a periphery surface 85 surrounding the periphery of the recessed portion 80. The recessed portion 80 includes: a first bottom surface 81 and a second bottom surface 83 between which a level difference is provided; and multiple side wall surfaces 82, 84, and 86. The first bottom surface 81 and second bottom surface 83 are formed in a plane shape along the x-y plane. The first bottom surface 81 is formed to a deeper position than the second bottom surface 83 relative to the periphery surface 85. The second bottom surface 83 is formed to the intermediate position between the first bottom surface 81 and periphery surface 85 in the z-axis direction. The second bottom surface 83 is formed toward the front in the travel direction of the vehicle in the y-axis direction.
An intermediate side wall surface 82 is formed between the first bottom surface 81 and second bottom surface 83. The intermediate side wall surface 82 slopes toward the outer periphery of the first bottom surface 81 as it extends away from the first bottom surface 81 along the z-axis direction. A front side wall surface 84 is formed between the second bottom surface 83 and periphery surface 85. The front side wall surface 84 slopes toward the outer periphery of the second bottom surface 83 as it extends away from the second bottom surface 83 along the z-axis direction. A back side wall surface 86 is formed between the first bottom surface 81 and periphery surface 85. The back side wall surface 86 slopes toward the outer periphery of the first bottom surface 81 as it extends away from the first bottom surface 81 along the z-axis direction.
The manipulation surface 70 formed to the above recessed portion 80 includes a first manipulation surface 72 and a second manipulation surface 74. The first manipulation surface 72 is formed to the first bottom surface 81 in a rectangular shape. The second manipulation surface 74 is formed to the second bottom surface 83 in a rectangular shape, and separate from the first manipulation surface 72 by a distance equal to or greater than the first threshold distance Dth1 (see
The touch sensor 31 opposes the overall areas of the first manipulation surface 72 and the second manipulation surface 74 in the z-axis direction. Thus, the touch sensor 31 is capable of detecting not only the moving manipulation of moving the finger F in the area opposing the first manipulation surface 72 but also the moving manipulation of moving the finger F in the area opposing the second manipulation surface 74. Therefore, the second manipulation surface 74 covers a partial area of the capacitive touch sensor 31, and also functions as a guide portion that guides input manipulation to the capacitive touch sensor 31.
In the remote manipulating apparatus 100 explained above, the manipulation mode is changed in response to: a relative location of the finger F that inputs the moving manipulation; and the manipulation body distance d. This changes image portions 61 associated with the moving manipulation of the finger F in the display images 60, as shown in
(1) Deep Manipulation Mode
In a deep manipulation mode, as shown in
The moving manipulation to move the finger F along the x-y plane in the above first manipulation space Sp1 is defined as a “deep manipulation,” and is capable of moving the position of the focus 62 displayed on the display screen 52.
(2) Shallow Manipulation Mode
In a shallow manipulation mode, as shown in
When the finger F is located in the above second manipulation space Sp2, the display screen 52 is changed such that the multiple submenu images 164 including the air conditioning menu image are selectable. Under such a state, the moving manipulation to move the finger F along the x-y plane is defined as a “shallow manipulation” to be capable of scrolling the multiple submenu images 164 displayed on the display screen 52.
Additionally, by setting the above first threshold distance Dth1 and second threshold distance Dth2, the second manipulation surface 74 is located closer to the second manipulation space Sp2 than the first manipulation surface 72, and located in the second manipulation space Sp2. Therefore, a manipulator can perform the shallow manipulation with the finger F contacting the second manipulation surface 74. Thus, the second manipulation surface 74 is capable of assisting the input by the finger F in the second manipulation space Sp2.
(3) Non-adjacent Mode
The moving manipulation by the finger F is not associated with any of the image portions 61 of the display screen 52 in the non-adjacent mode. In such a non-adjacent mode, the finger F is not located in the first manipulation space Sp1 (see
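The division of the area opposing the manipulation surface into the deep, shallow, and non-adjacent modes by the manipulation body distance d can be summarized in the following sketch. This is purely illustrative and not part of the disclosure; the function name and the numeric values chosen for the threshold distances Dth1 and Dth2 are hypothetical assumptions.

```python
# Illustrative sketch (not part of the disclosure): classifying the
# manipulation body distance d into the three manipulation modes.
# The numeric threshold values are hypothetical examples.

DTH1 = 0.5  # first threshold distance Dth1 (cm), hypothetical value
DTH2 = 3.0  # second threshold distance Dth2 (cm), hypothetical value

def classify_mode(d):
    """Return the manipulation mode for a manipulation body distance d."""
    if d < DTH1:
        return "deep"          # finger in the first manipulation space Sp1
    elif d < DTH2:
        return "shallow"       # finger in the second manipulation space Sp2
    else:
        return "non-adjacent"  # finger outside both manipulation spaces
```

The boundary cases follow the definitions above: a distance exactly equal to Dth1 falls in the second manipulation space Sp2, and a distance equal to or greater than Dth2 falls outside both spaces.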
As explained above, in the display system 10 (see
In
Under the above state, the manipulator can superimpose the focus 62 on an arbitrary icon 63 by inputting the deep manipulation of dragging the finger F on the first manipulation surface 72. The manipulator can select the arbitrary icon 63 by inputting a tap to the first manipulation surface 72 with the focus 62 superimposed on the arbitrary icon 63.
As shown in
To realize the above icon selection, the manipulation mode selection performed by the manipulation control portion 33 is explained in detail based on
It is noted that a flowchart or the processing of the flowchart in the present application includes sections (also referred to as steps), each of which is represented, for instance, as S101. Further, each section can be divided into several sub-sections while several sections can be combined into a single section. Furthermore, each of thus configured sections can be also referred to as a device, module, or means. Each or any combination of sections explained in the above can be achieved as (i) a software section in combination with a hardware unit (e.g., computer) or (ii) a hardware section (e.g., integrated circuit, hard-wired logic circuit), including or not including a function of a related apparatus; furthermore, the hardware section may be constructed inside of a microcomputer.
In S101, the presence of a tap on any one of the first manipulation surface 72 and second manipulation surface 74 is determined based on variation in the output acquired from the touch sensor 31. When it is determined that there is no tap in S101, the wait state of the remote manipulating apparatus 100 is maintained by repeating the determination of S101. On the other hand, when it is determined that there is a tap in S101, the flow proceeds to S102.
In S102, the sensitivity value detected in each electrode of the touch sensor 31 is acquired, and the flow proceeds to S103. In S103, by use of the sensitivity values acquired in S102, the x-coordinate, y-coordinate, and z-coordinate (hereinafter, the "input position coordinates") that show a three-dimensional location of the finger F relative to the manipulation surface 70 are calculated, and the flow proceeds to S104.
The calculation executed in S103 is explained in detail based on
As shown in
In S104, it is determined whether the finger F is in the first manipulation space Sp1. That is, it is determined whether the relative position of the finger F is on the first manipulation surface 72 and the sensitivity value is equal to or greater than the first sensitivity threshold Hth1. When a positive determination is made in S104, the flow proceeds to S105. In S105, the manipulation mode is set to the deep manipulation mode, and the flow returns to S102.
When a negative determination is made in S104, the flow proceeds to S106. In S106, it is determined whether the relative position of the finger F is on the first manipulation surface 72 and the sensitivity value is less than the first sensitivity threshold Hth1. When a positive determination is made in S106, the flow proceeds to S107. In S107, the manipulation mode is set to the non-adjacent mode, and the flow proceeds to S108. When the manipulation mode is changed in S107 from either the deep manipulation mode or the shallow manipulation mode to the non-adjacent mode, the count of an elapsed time t is started upon the transition to the non-adjacent mode.
In S108, it is determined whether the elapsed time t, whose count was started in the latest or a previous S107, is equal to or greater than the predefined threshold time Tth. When a positive determination is made in S108, the flow returns to S101. Thereby, the remote manipulating apparatus 100 transitions to the wait state. On the other hand, when a negative determination is made in S108, the flow returns to S102.
When a negative determination is made in S106, the flow proceeds to S109. In S109, it is determined whether the finger F is in the second manipulation space Sp2. That is, it is determined whether the relative position of the finger F is on the second manipulation surface 74 and the sensitivity value is less than the first sensitivity threshold Hth1 and equal to or greater than the second sensitivity threshold Hth2. When a negative determination is made in S109, the finger F is estimated to be in the non-adjacent space, and the flow proceeds to the above S107. On the other hand, when a positive determination is made in S109, the flow proceeds to S110. In S110, the manipulation mode is set to the shallow manipulation mode, and the flow returns to S102.
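The sequence of determinations in S104, S106, and S109 can be sketched as follows. This is an illustrative reconstruction of the flowchart, not the disclosed implementation; the function name and the numeric sensitivity threshold values are hypothetical, and the actual apparatus derives the finger position and sensitivity value from the touch sensor 31 in S102 and S103.

```python
# Illustrative sketch of the determinations S104, S106, and S109.
# Names and numeric threshold values are hypothetical assumptions.

HTH1 = 200  # first sensitivity threshold Hth1, hypothetical value
HTH2 = 100  # second sensitivity threshold Hth2, hypothetical value

def select_mode(on_first_surface, on_second_surface, sensitivity):
    """Select the manipulation mode from the finger's x-y position
    (which manipulation surface it opposes) and its sensitivity value."""
    if on_first_surface and sensitivity >= HTH1:
        return "deep"           # S104 positive -> S105
    if on_first_surface and sensitivity < HTH1:
        return "non-adjacent"   # S106 positive -> S107
    if on_second_surface and HTH2 <= sensitivity < HTH1:
        return "shallow"        # S109 positive -> S110
    return "non-adjacent"       # S109 negative -> S107
```

In this sketch, a higher sensitivity value corresponds to a smaller manipulation body distance d, so the first condition corresponds to the finger being in the first manipulation space Sp1 and the third to the finger being in the second manipulation space Sp2.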
The manipulation control portion 33 that has changed the manipulation mode in the above S105, S107, and S110 outputs a command signal to notify the CAN bus 90 of the change of the manipulation mode via the communication control portion 23 and communication interface 24. The display control portion 53 that has acquired this command signal activates the rendering layer corresponding to each manipulation mode based on the signal.
Specifically, when the manipulation control portion 33 sets the manipulation mode to the deep manipulation mode, the display control portion 53 selects the focus layer L1 as an active rendering layer. Thus, the manipulation control portion 33 is capable of changing or manipulating the display of the focus 62 while associating the deep manipulation by the finger F with the focus 62 (focus control). In other words, the display mode (including display contents, display position, display color, display shape, display type, display specification, and display size) of the focus 62 is changed.
In contrast, when the manipulation control portion 33 sets the manipulation mode to the shallow manipulation mode, the display control portion 53 selects submenu layers (not shown) that renders the multiple submenu images 164 as active rendering layers. Thus, the manipulation control portion 33 is capable of changing or manipulating the display of the submenu images 164 while associating the shallow manipulation by the finger F with the multiple submenu images 164 (scroll control). In other words, the display mode (including display contents, display position, display color, display shape, display type, display specification, and display size) of the submenu image 164 is changed.
Further, when the manipulation control portion 33 sets the manipulation mode to the non-adjacent mode, the display control portion 53 sets the active rendering layers to “absence.” Thereby, the moving manipulation of the finger F is not associated with any of the image portions 61.
According to the first embodiment described above, when the shallow manipulation is input in the second manipulation space Sp2, the manipulator can receive input assistance by making the finger F contact the second manipulation surface 74. Because of such assistance, the manipulator easily grasps the location of the second manipulation space Sp2 defined apart from the first manipulation surface 72. Therefore, the manipulator can easily perform the input in the second manipulation space Sp2.
Additionally, in the first embodiment, the manipulator can change the image portion 61 targeted for manipulation between the focus 62 and the submenu images 164 by moving the finger F between the first manipulation surface 72 and second manipulation surface 74. Since the first manipulation space Sp1 and second manipulation space Sp2 are divided front to back along the z-x plane, the manipulator can more easily grasp each of the manipulation spaces Sp1 and Sp2. Therefore, the manipulator can select a series of icons by blind manipulation without looking at the hand.
Additionally, according to the first embodiment, by forming each manipulation surface 72, 74 on each bottom surface 81, 83 of the recessed portion 80 provided to the remote manipulating apparatus 100, the manipulator can perform the input to each manipulation surface 72, 74 after stabilizing the hand by using the periphery surface 85 and palm rest 39. Therefore, the input by the manipulator in each manipulation space Sp1, Sp2 becomes easier.
Further, the capacitive touch sensor 31 used for the first embodiment is capable of detecting not only the movement of the finger F in the x-y-axis direction along the first manipulation surface 72 but also the movement of the finger F in the z-axis direction that substantially intersects perpendicularly with the first manipulation surface 72. Therefore, the above touch sensor 31 is preferred as the configuration that detects both (i) the relative position of the finger F in the x-y plane and (ii) the manipulation body distance d.
Further, in the first embodiment, by selecting from the submenu images 164, the icons 63 in the selected submenu image 164 become selectable. Thus, the display images 60 have a hierarchical configuration in which one selection enters a state where more detailed functions are selectable. In such a configuration, the physical depth in the recessed portion 80 of each manipulation surface 72, 74 manipulated by the finger F is set to correspond to the hierarchical depth of each image portion 61 associated with the manipulation by the finger F. Thereby, it becomes easy for the manipulator to intuitively grasp the change of the display images 60 in association with the input of the finger F.
In the first embodiment, the touch sensor 31 and manipulation control portion 33 are also called the “detection device/means” and the “acquisition device/means” in association with each other. The manipulation control portion 33 is also called the “association device/means.” The first manipulation surface 72 is also called the “manipulation surface.” The second manipulation surface 74 is also called the “input auxiliary portion” or “guide portion.” The first bottom surface 81 is also called the “bottom surface.” The finger F of the manipulator is also called the “manipulation body.” The remote manipulating apparatus 100 is called the “manipulating apparatus.” The focus 62 is also called the “first image portion,” and the submenu images 164 are also called the “second image portion.” The “first image portion” and the “second image portion” may be changed suitably.
In this application, the Japanese word "Syudan" corresponds to the English "means" or "device."
A second embodiment of the present disclosure shown in
The first bottom surface 281 and the second bottom surface 283 are formed in a plane shape along the x-y plane. The first bottom surface 281 is formed in a substantially circular shape at a position deeper than the second bottom surface 283 relative to the periphery surface 85. The second bottom surface 283 is formed to the outer circumference side of the first bottom surface 281, and is circumferentially formed coaxially with the first bottom surface 281. The second bottom surface 283 is located midway between the first bottom surface 281 and periphery surface 85 in the z-axis direction.
The two internal peripheral wall surfaces 282 and 284 are both formed in cylindrical shapes. Of the internal peripheral wall surfaces 282 and 284, the deep-side internal peripheral wall surface 282, closer to the first bottom surface 281 than to the periphery surface 85, connects the outer edge of the first bottom surface 281 to the inner edge of the second bottom surface 283. In contrast, the shallow-side internal peripheral wall surface 284, closer to the periphery surface 85 than to the first bottom surface 281, connects the outer edge of the second bottom surface 283 to the periphery surface 85. Each internal peripheral wall surface 282, 284 may be formed along the z-axis direction or may be formed in a tapered shape whose diameter is reduced from the periphery surface 85 toward the first bottom surface 281.
The first manipulation surface 272 and the second manipulation surface 274 are formed to the recessed portion 280. The first manipulation surface 272 is formed to the first bottom surface 281 in a circular shape. The second manipulation surface 274 is formed annularly throughout the second bottom surface 283. The second manipulation surface 274 extends circumferentially in a belt shape along the outer edge of the first bottom surface 281.
The touch sensor 231 is formed in a shape of a disc opposing the entire areas of the first manipulation surface 272 and second manipulation surface 274 in the z-axis direction. Thereby, the touch sensor 231 is capable of detecting not only the moving manipulation of the finger F in the area opposing the first manipulation surface 272 but also the moving manipulation of the finger F in the area opposing the second manipulation surface 274.
As shown in
Next, the icon selection inputted to the remote manipulating apparatus 200 is explained based on
The manipulator in
In
Even when the above audio menu image is displayed, the association of the moving manipulation with the image portions 262 and 263 is started by inputting a tap to either of the manipulation surfaces 272 and 274, as with the submenu images 164 (see
Also in the second embodiment described above, the manipulator can perform the shallow manipulation while contacting the second manipulation surface 274. Since the location of the second manipulation space Sp2 is obtainable by the assistance of the second manipulation surface 274, the input in the manipulation space Sp2 becomes easy.
In addition, according to the second embodiment, the second manipulation surface 274 extends in a belt shape while curving along the outer edge of the first manipulation surface 272. Therefore, the manipulator can describe a trajectory of a substantially arc with the finger F while fixing the hand by use of either of the palm rest 39 (see
In the second embodiment, the first manipulation surface 272 is also called the “manipulation surface.” The second manipulation surface 274 is also called the “input auxiliary portion” or “guide portion.” The first bottom surface 281 is also called the “bottom surface.” The pointer 262 is also called the “first image portion.” The track icons 263 are also called the “second image portion.” The “first image portion” and the “second image portion” may be changed suitably. The remote manipulating apparatus 200 is also called the “manipulating apparatus.”
A third embodiment of the present disclosure shown in
In the above remote manipulating apparatus 300, the first manipulation space Sp1 is defined in the range in which the manipulation body distance d is less than the first threshold distance Dth1 in the area opposing the manipulation surface 370. In contrast, the second manipulation space Sp2 is defined in the range in which the manipulation body distance d is equal to or greater than the first threshold distance Dth1 and less than the second threshold distance Dth2 in the area opposing the manipulation surface 370. That is, the second manipulation space Sp2 is not formed in the area opposing the second bottom surface 283.
The remote manipulating apparatus 300 defines a plurality of manipulation modes including a contact manipulation mode and an in-air manipulation mode together with the non-adjacent mode. The contact manipulation mode is the manipulation mode when the finger F is located in the first manipulation space Sp1, and corresponds to the deep manipulation mode in the first embodiment. In contrast, the in-air manipulation mode is the manipulation mode when the finger F is located in the second manipulation space Sp2, and corresponds to the shallow manipulation mode in the first embodiment.
The icon selection manipulation inputted to the above remote manipulating apparatus 300 is explained based on
The manipulator in
In
The manipulation mode selection performed by the manipulation control portion 33 (see
In S301, the presence of a tap on the manipulation surface 370 is determined based on variation of the output acquired from the touch sensor 331. When the tap is determined to be absent in S301, the wait state of the remote manipulating apparatus 300 is maintained by repeating the determination of S301. On the other hand, when the tap is determined to be present in S301, the processes of S302 and S303, which are substantially the same as those of S102 and S103 (see
In S304, it is determined whether the finger F is in the first manipulation space Sp1. That is, it is determined whether the sensitivity value is equal to or greater than the first sensitivity threshold Hth1. When a positive determination is made in S304, the flow proceeds to S305. In S305, the manipulation mode is set to the contact manipulation mode, and the flow returns to S302.
When a negative determination is made in S304, the flow proceeds to S306. In S306, it is determined whether the finger F is in the second manipulation space Sp2, that is, whether the sensitivity value is less than the first sensitivity threshold Hth1 and equal to or greater than the second sensitivity threshold Hth2. When a positive determination is made in S306, the flow proceeds to S307. In S307, the manipulation mode is set to the in-air manipulation mode, and the flow returns to S302.
When a negative determination is made in S306, the flow proceeds to S308. In S308, the manipulation mode is set to the non-adjacent mode, and the flow proceeds to S309. In S308, when the manipulation mode is changed from the contact manipulation mode or the in-air manipulation mode to the non-adjacent mode, the count of the elapsed time t after the transition to the non-adjacent mode is started.
In S309, it is determined whether the elapsed time t, whose count was started in the most recent S308, has exceeded the threshold time Tth. When a positive determination is made in S309, the flow returns to S301, whereby the remote manipulating apparatus 300 transitions to the wait state. On the other hand, when a negative determination is made in S309, the flow returns to S302.
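The loop of S302 through S309 can be sketched as follows. This is a hypothetical Python sketch under stated assumptions: the sensitivity readings stand in for the touch-sensor output acquired in S302/S303, one reading is taken per unit of elapsed time, and all names and values are illustrative rather than taken from the disclosure.

```python
def run_modes(readings, hth1, hth2, tth):
    """Sketch of S302-S309: classify each sensitivity reading into a
    manipulation mode, and exit the loop (returning to the wait state
    of S301) once the finger has stayed non-adjacent longer than tth.

    hth1 and hth2 stand in for the first and second sensitivity
    thresholds Hth1 and Hth2; tth stands in for the threshold time Tth.
    """
    modes = []
    t = 0  # elapsed time in the non-adjacent mode (S308 starts this count)
    for h in readings:
        if h >= hth1:        # S304/S305: finger in Sp1
            mode, t = "contact", 0
        elif h >= hth2:      # S306/S307: finger in Sp2
            mode, t = "in-air", 0
        else:                # S308: non-adjacent mode
            mode, t = "non-adjacent", t + 1
        modes.append(mode)
        if t > tth:          # S309: transition back to the wait state
            break
    return modes
```

With illustrative thresholds, a reading sequence in which the finger touches, hovers, and then leaves the sensing range would produce the contact, in-air, and non-adjacent modes in turn before the loop times out.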
As in the third embodiment described above, even when the second manipulation space Sp2 is not formed in the area opposing the second bottom surface 283, the shallow manipulation guide surface 375 formed on the second bottom surface 283 may show the location of the border plane BP. Therefore, the manipulator can easily locate the border plane BP even in blind manipulation. Thus, the remote manipulating apparatus 300 is capable of providing high manipulation capability for the in-air manipulation.
In addition, the manipulator can perform the input with the finger F in the second manipulation space Sp2 contiguous to the shallow manipulation guide surface 375 while resting the thumb and little finger on the shallow manipulation guide surface 375 to stabilize the hand. Such assistance by the shallow manipulation guide surface 375 makes the input in the second manipulation space Sp2 easier, even in a configuration in which cost reduction is intended by reducing the detection range of the touch sensor 331.
In the third embodiment, the shallow manipulation guide surface 375 is also called the “input auxiliary portion.” The map 364 is also called the “second image portion.” The “second image portion” may be changed suitably. The remote manipulating apparatus 300 is also called the “manipulating apparatus.”
A fourth embodiment of the present disclosure shown in
In the remote manipulating apparatus 400, the first threshold distance Dth1 is set to, for example, about 0.3 cm, so as to be equal to or less than half of the depth from the second bottom surface 83 to the first bottom surface 81. Therefore, the first manipulation space Sp1 is defined in the range adjacent to the first bottom surface 81 in the area opposing the manipulation surface 470. In contrast, the second manipulation space Sp2 is expanded in the z-axis direction from the second bottom surface 83 toward the first bottom surface 81, so that its lower limit is closer to the first bottom surface 81 than to the second bottom surface 83.
Because of the setting of the above first threshold distance Dth1, a shallow manipulation guide surface 475 formed on the second bottom surface 83 is capable of showing the location of the second manipulation space Sp2 to the manipulator who performs blind manipulation. Therefore, the manipulator can reliably input the moving manipulation in the second manipulation space Sp2 by moving the finger F to the location of the shallow manipulation guide surface 475 in the z-axis direction. Thus, the remote manipulating apparatus 400 is capable of ensuring the manipulation capability in the in-air manipulation.
In addition, in the fourth embodiment, since the second manipulation space Sp2 is expanded, the input to the second manipulation space Sp2 by blind manipulation becomes easier. The input auxiliary action synergistically employing (i) the setting value of the first threshold distance Dth1 and (ii) the provision of the shallow manipulation guide surface 475 further improves the manipulation capability of the remote manipulating apparatus 400.
In the fourth embodiment, the shallow manipulation guide surface 475 is called the “input auxiliary portion.” The remote manipulating apparatus 400 is called the “manipulating apparatus.”
A fifth embodiment of the present disclosure shown in
In contrast, a second manipulation surface 574 is formed of a movable member 576 in the area opposing the first manipulation surface 572. The movable member 576 supports the second manipulation surface 574, and has an L-shaped cross section in the z-y plane. The movable member 576 has an input surface portion 578, a strut portion 579, and a hinge portion 577.
The input surface portion 578 is formed in a rectangular plate shape along the x-y plane, and located remote from the first manipulation surface 572. The upper surface of both surfaces of the input surface portion 578, the upper surface being remote from the first manipulation surface 572, forms the second manipulation surface 574. The strut portion 579 extends from the front end of both ends of the input surface portion 578 in the y-axis direction toward the periphery surface 85, the front end being remote from the palm rest 39. The strut portion 579 is formed in a rectangular plate shape along the z-x plane. The hinge portion 577 is formed at the tip of the strut portion 579 in the extending direction. The hinge portion 577 makes the movable member 576 pivotable about the x-axis. Therefore, the movable member 576 is capable of tilting forward in the y-axis direction to be remote from the first manipulation surface 572 together with the second manipulation surface 574 (see the two-dot chain line of
In the above remote manipulating apparatus 500, the first threshold distance Dth1 is defined to correspond to the height from the first manipulation surface 572 to the input surface portion 578. Therefore, the first manipulation space Sp1 is defined in the range lower than the input surface portion 578 in the z-axis direction in the area opposing the first manipulation surface 572. In contrast, the second manipulation space Sp2 is defined in the range in which the manipulation body distance d is less than the second threshold distance Dth2 in the area opposing the second manipulation surface 574.
Also in the fifth embodiment described above, the same advantageous effect as the first embodiment is obtained, so that the manipulator can receive the input assistance using the second manipulation surface 574 when performing the input into the second manipulation space Sp2. Therefore, the remote manipulating apparatus 500 is capable of obtaining the high manipulation capability in the input to the second manipulation surface 574.
In addition, in the fifth embodiment, the second manipulation surface 574 is formed in the area opposing the first manipulation surface 572. Therefore, the enlargement of the remote manipulating apparatus 500 due to the addition of the second manipulation surface 574 may be avoided. In addition, since the second manipulation surface 574 is spaced apart from the first manipulation surface 572, the area of the first manipulation surface 572 may be prevented from decreasing. Therefore, the remote manipulating apparatus 500 can ensure not only the manipulation capability in the input in the second manipulation space Sp2 but also the manipulation capability in the input in the first manipulation space Sp1.
Additionally, according to the fifth embodiment, when the change of the image portions 61 (see
In the fifth embodiment, the first manipulation surface 572 is called the “manipulation surface.” The second manipulation surface 574 is called the “input auxiliary portion” or “guide portion.” The movable member 576 is called the “movable support portion.” The remote manipulating apparatus 500 is called the “manipulating apparatus.”
A sixth embodiment of the present disclosure shown in
The lid member 676 supports the second manipulation surface 674, and is formed in a plate shape along the x-y plane. The lid member 676 is formed movably in the y-axis direction, and can be received in the peripheral portion of the recessed portion 680 by moving forward away from the palm rest 39 (see the two-dot chain line of
In the above remote manipulating apparatus 600, the first threshold distance Dth1 is defined to correspond to the height from the first manipulation surface 672 to the lid member 676. Therefore, the first manipulation space Sp1 is defined in the range lower than the lid member 676 in the z-axis direction in the area opposing the first manipulation surface 672. In contrast, the second manipulation space Sp2 is defined in the range in which the manipulation body distance d is less than the second threshold distance Dth2 in the area opposing the second manipulation surface 674.
Also in the sixth embodiment described above, the same advantageous effect as in the first embodiment is obtained, so that the manipulator can receive input assistance from the second manipulation surface 674 when performing the input into the second manipulation space Sp2. Therefore, the remote manipulating apparatus 600 is capable of obtaining the high manipulation capability in the input to the second manipulation surface 674.
In addition, as in the sixth embodiment, by providing the lid member 676 slidable in the y-axis direction, the second manipulation surface 674 can be retracted from the area opposing the first manipulation surface 672. Therefore, when the second manipulation surface 674 is unnecessary, the obstruction to the input to the first manipulation surface 672 is avoidable.
In the sixth embodiment, the first manipulation surface 672 is also called the “manipulation surface.” The second manipulation surface 674 is also called the “input auxiliary portion” or “guide portion.” The lid member 676 is also called the “movable portion.” The remote manipulating apparatus 600 is also called the “manipulating apparatus.”
The multiple embodiments of the present disclosure have been described above. The present disclosure is not limited to the above embodiments, and is applicable to various embodiments and the combinations thereof without departing from the contents of the present disclosure.
In a first modification of the above fifth embodiment shown in
In a modification of the above embodiments, a push button for optionally selecting the icon 63 is provided adjacent to the recessed portion 80. While superimposing the pointer 262 and the focus 62 on an arbitrary icon 63, the manipulator can select the function of the icon 63 by pushing the push button.
In a modification of the above embodiments, a pressure-sensitive touch sensor that detects the manipulation by the finger F based on the pressure applied by the finger F is used as the "detection device/means." In contrast, an infrared sensor that measures the manipulation body distance d by infrared light is used as the "acquisition device/means." In another modification, a camera that captures the neighborhood of each manipulation surface and an image analysis circuit that acquires the manipulation body distance d by analyzing images captured by the camera are used as the "acquisition device/means."
In a modification of the above embodiments, a display using a plasma display panel or a display using organic electroluminescence forms the "display screen." Further, in another modification, a windshield and a combiner that is provided to the upper surface of an instrument panel are set as the "display screen." A display apparatus that projects images on the windshield and the combiner by use of a projection device/means such as a projector is contained in the navigation apparatus 50 as a component forming the display screen 52.
In a modification of the above embodiments, hysteresis is provided to the first threshold distance Dth1 and second threshold distance Dth2. Specifically, based on the fact that the finger F is located in the first manipulation space Sp1, the first threshold distance Dth1 is increased. The second threshold distance Dth2 is increased based on the fact that the finger F is located in the second manipulation space Sp2. In such a configuration, the manipulation by the finger F in each space Sp1 and Sp2 becomes easier.
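The hysteresis described in this modification can be sketched as follows. This is a hypothetical Python sketch, not part of the disclosure; the function name and the margin value are illustrative assumptions.

```python
def classify_with_hysteresis(d, prev_mode, dth1, dth2, margin=0.5):
    """Classify the manipulation body distance d (cm) with hysteresis:
    the threshold bounding the space the finger currently occupies is
    enlarged by `margin`, making it harder to leave that space.

    dth1 and dth2 stand in for the first and second threshold distances
    Dth1 and Dth2; `margin` is an assumed value for illustration.
    """
    # Increase Dth1 while the finger is in the first manipulation space Sp1
    eff_dth1 = dth1 + margin if prev_mode == "contact" else dth1
    # Increase Dth2 while the finger is in the second manipulation space Sp2
    eff_dth2 = dth2 + margin if prev_mode == "in-air" else dth2
    if d < eff_dth1:
        return "contact"
    elif d < eff_dth2:
        return "in-air"
    return "non-adjacent"
```

In this sketch, a finger that has slightly crossed a threshold while in a manipulation space remains classified in that space until it moves past the enlarged threshold, which is the easing effect the modification describes.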
In the above embodiments, the “first image portion” and the “second image portion” exemplified multiple times may be changed suitably. As mentioned above, in the above embodiments, the multiple functions (also called the section, the means, or the device) provided by the manipulation control portion 33 that executes programs may be provided by hardware and software different from the above control apparatus or a combination thereof. For example, functions such as the “association section” and “acquisition section” may be provided by a circuit of the hardware that achieves predetermined functions without programs.
The above-mentioned embodiments have explained the examples in which the present disclosure is applied to the remote manipulating apparatuses used for the display system mounted to the vehicle. However, the present disclosure is applicable also to the so-called touch-panel manipulating apparatus configured integrally with the display screen. Further, the manipulating apparatus to which the present disclosure is applied is employable generally in the display systems used for not only vehicles but also various transport apparatuses and various information terminals.
As mentioned above, in this application, the Japanese term "Shudan" corresponds to the English "means" or "device."
While the present disclosure has been described with reference to preferred embodiments thereof, it is to be understood that the disclosure is not limited to the preferred embodiments and constructions. The present disclosure is intended to cover various modifications and equivalent arrangements. In addition, while the various combinations and configurations described are preferred, other combinations and configurations, including more, less, or only a single element, are also within the spirit and scope of the present disclosure.
Number | Date | Country | Kind |
---|---|---|---|
2012-220661 | Oct 2012 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2013/004701 | 8/2/2013 | WO | 00 |