The present invention relates to an information processing device.
JP-A-2011-170598 (Patent Literature 1) describes a touch panel input device that is expected to allow the user to easily perform an operation on a touch panel by switching the touch panel layout between the layout for an operation with the left hand fingers and the layout for an operation with the right hand fingers.
PATENT LITERATURE 1: JP-A-2011-170598
However, the technology described above and other conventional technologies require the user to extend his or her hand to touch a button displayed on the screen. In addition, the user must keep his or her eyes on the operation target, because the touch operation requires carefully watching the buttons displayed on the screen. Furthermore, the display of a hierarchical menu requires the user to touch the panel many times, increasing the number of operations and the operation time. When an operation is performed using a gesture, the user must memorize, and then perform, a defined operation.
It is an object of the present invention to provide an information processing device that allows the user to perform a desired operation with a minimum number of operations without having to largely extend his or her hand and without having to keep his or her eyes on the screen for a long time.
A vehicle-mounted device, which is an example of an information processing device of the present invention, is a vehicle-mounted device that reduces the driver's distraction (a state of being distracted from driving by an operation other than the driving operation) when performing a desired operation. The vehicle-mounted device includes a vehicle-mounted device control unit that controls the operation of the vehicle-mounted device in its entirety, a sensing unit that can measure the distance to an object and detect a gesture, a touch input unit through which touch input is possible, a display unit that displays a video/image, and a speaker that outputs sound.
The sensing unit monitors the distance to an object before the display unit. When it is detected that the driver's hand enters region 1 that is a specific region before the display unit, the vehicle-mounted device control unit moves a particular button and icon, displayed on the display unit, to the driver's side and, at the same time, performs control to output a sound effect from the speaker. When the sensing unit detects that the driver's hand enters region 2 that is nearer to the display unit than region 1, the vehicle-mounted device control unit expands and displays the lower-level menu of the icon and performs control to output a sound effect from the speaker. After that, the vehicle-mounted device control unit performs control to keep the menu displayed: the menu remains displayed until a predetermined time elapses after this state is reached, until the driver performs a gesture such as a hand movement, or until the displayed menu is touched.
According to the present invention, the user can perform a desired operation with a minimum number of operations without having to largely extend his or her hand and without having to keep his or her eyes on the screen for a long time.
Other objects, features and advantages of the present invention will become apparent from the following detailed description of the present invention taken together with the accompanying drawings.
Embodiments of the present invention are described in detail below with reference to the drawings.
A vehicle-mounted device control unit 102, which is a part configured by a CPU and the software executed by the CPU for controlling the whole operation of the vehicle-mounted device 101, includes a distance detection unit 104, a position detection unit 105, and a gesture detection unit 106. More specifically, the vehicle-mounted device control unit 102 controls the basic operation of a car navigation system and, based on the various types of input information, controls the output content.
The distance detection unit 104 calculates the distance from a sensing unit 103, which will be described later, to a user's hand based on the voltage output from the sensing unit 103. The position detection unit 105 identifies where the user's hand is positioned based on the voltage output from the sensing unit 103. In addition, the gesture detection unit 106 determines whether the user performs a predetermined operation (hereinafter called a “gesture”), based on the voltage output from the sensing unit 103.
The sensing unit 103 is configured by an infrared-light distance sensor that includes a projector that emits an infrared light and an optical receiver that receives an infrared light reflected by an object at a short distance (for example, within 5 cm). The sensing unit 103 outputs the voltage, corresponding to the quantity of light received by the optical receiver, to the vehicle-mounted device control unit 102.
At this time, by identifying which of the infrared light sensors 103A-103C detects the user's hand, the vehicle-mounted device control unit 102 can detect in which part of the display unit 112 (upper part, middle part, or lower part) the user's hand is present. The vehicle-mounted device control unit 102 can also know the distance, for example, between the user's finger and the sensing unit 103, according to the level of the voltage output by the sensing unit 103.
In this embodiment, the space before the display unit 112, from the sensing unit 103 to a first distance (for example, 5 cm), is defined as region 1; the space before the display unit 112, from the sensing unit 103 to a second distance (for example, 2.5 cm), that corresponds to the upper half of the display unit 112 is defined as region 2; and the space before the display unit 112, from the sensing unit 103 to the second distance (for example, 2.5 cm), that corresponds to the lower half of the display unit 112 is defined as region 3, as shown in
The vehicle-mounted device control unit 102 stores a data table that defines the relation among each of these distances, the voltage value output from the sensing unit 103, and the type of the infrared light distance sensor that detects the user's hand. Based on this data table and the voltage actually output from the sensing unit 103, the vehicle-mounted device control unit 102 identifies in which region, region 1 to region 3, the user's hand is present.
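The data table lookup described above can be illustrated with a minimal sketch. In the following Python sketch, the threshold voltages, the assignment of sensors 103A to 103C to the display halves, and all function names are illustrative assumptions, not values defined by this embodiment.

```python
# Illustrative sketch of the region lookup; threshold voltages and the
# sensor-to-half assignment are assumptions, not specified values.
SENSORS = ("103A", "103B", "103C")  # upper, middle, lower part of the display

# The output voltage rises as the hand approaches the sensor.
V_FIRST_DISTANCE = 1.0   # assumed voltage at the first distance (about 5 cm)
V_SECOND_DISTANCE = 2.0  # assumed voltage at the second distance (about 2.5 cm)

def identify_region(voltages):
    """Return 1, 2, or 3 for the region containing the user's hand, or None.

    `voltages` maps each sensor name to its current output voltage.
    """
    sensor = max(voltages, key=voltages.get)  # sensor that sees the hand best
    v = voltages[sensor]
    if v < V_FIRST_DISTANCE:
        return None  # hand farther away than the first distance
    if v < V_SECOND_DISTANCE:
        return 1     # between the first and the second distance
    # Within the second distance: split by upper/lower half of the display.
    return 2 if sensor in ("103A", "103B") else 3

# Example: a strong reading on the lowest sensor falls in region 3.
print(identify_region({"103A": 0.2, "103B": 0.4, "103C": 2.4}))  # -> 3
```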
The number of infrared light distance sensors configuring the sensing unit 103 and their mounting positions are not limited to those in this embodiment. In the example shown in
The component configuring the sensing unit 103 is not limited to an infrared light distance sensor. For example, any of sensors, such as a laser distance sensor, an ultrasonic distance sensor, a distance image sensor, an electric field sensor, or an image sensor, as well as a microcomputer that performs data processing or software that operates on a microcomputer, may also be used to configure the sensing unit 103.
Returning to
A switch input unit 109 sends the information, which indicates whether a switch provided on the vehicle-mounted device 101 is pressed, to the vehicle-mounted device control unit 102.
A touch input unit 110 sends the information on a touched coordinate to the vehicle-mounted device control unit 102.
A traveling state input unit 111, a part through which the information about the state of a vehicle on which the vehicle-mounted device 101 is mounted is input, sends the information about the vehicle speed, the state of the accelerator, and the state of various brakes to the vehicle-mounted device control unit 102.
The display unit 112, a device that presents video information to the user, includes a display unit such as an LCD (Liquid Crystal Display), an arithmetic processing unit necessary for the display processing for video content or the GUI (Graphical User Interface), and a memory. A touch panel, integrated with the touch input unit 110, is applied to the display unit 112 in this embodiment. A speaker 113 is a means for externally outputting sound.
A tactile interface unit 114 is mounted on a device the user touches, for example, on a steering wheel or a vehicular seat. When an instruction is received from the vehicle-mounted device control unit 102, the tactile interface unit 114 sends the information to the user through the sense of touch by transmitting a vibration or by applying a weak electric current.
The operation of the vehicle-mounted device control unit 102 is described below with reference to the flowchart in
The vehicle-mounted device control unit 102 starts the sensing unit 103 (S301: “Start sensing unit”). The sensing unit 103 monitors whether the user's hand is detected in region 1 such as the one shown in
The vehicle-mounted device control unit 102 performs control for the display unit 112 to move a predetermined icon, displayed by the display unit 112, to the right side, that is, to the driver's side in such a way that the NAVI button shown in
The sound effect used in this case is the sound “pop” indicating that the hand leaves the region or the sound “whiz” indicating that an object moves. After that, the vehicle-mounted device control unit 102 performs control for the display unit 112 to return the moved icon to the initial display position shown in
If the sensing unit 103 detects that the user's hand is present in region 1 (S305: “Is user's hand present in region 1?” Yes) and that the user's hand is present in region 2 in
Similarly, if the sensing unit 103 detects that the user's hand is present in region 3 in
The vehicle-mounted device control unit 102 performs control for the speaker 113 to output a third sound effect or a voice according to the motion on the screen (S311: “Output sound effect from speaker”). As the sound effect, the “splashing sound” that sounds like the splashing of an object is output. The sound effect “tick” may also be used to let the user know the state in which the menu is displayed in an expanded manner.
After that, the vehicle-mounted device control unit 102 keeps displaying the menu in the fan-like, expanded manner for a predetermined length of time (S312: “Keep menu expanded”). If the gesture detection unit 106 detects a gesture, such as a user's bye-bye motion, before the sensing unit 103 (S313: “Is predetermined gesture detected?” Yes), the vehicle-mounted device control unit 102 stops the display of the fan-like, expanded menu (S314: “Close expanded menu”) and the processing proceeds to steps S306 and S307.
If the gesture detection unit 106 does not detect a user's gesture (S313: “Is predetermined gesture detected?” No) and a predetermined time, for example, ten seconds, has elapsed after the menu is displayed (S315: “Is predetermined time elapsed?” Yes), the vehicle-mounted device control unit 102 performs the processing in step S314. When a displayed menu is touched and the user input operation is accepted, the menu selected through the touch is displayed at the position, where the icon has been displayed, as an icon and the lower-level menu of the selected menu is displayed in a fan-like, expanded manner. For example, if “Destination” is selected through the touch, “Destination” is displayed at the position, where the NAVI icon has been displayed, as an icon and the lower-level menu of “Destination” is displayed in a fan-like, expanded manner. When the menu selection reaches the lowest layer and a desired item is selected (S316: “Is user's input operation terminated?” Yes), the vehicle-mounted device control unit 102 sets the icon to the highest level of the menu, returns the icon to the initial display position (S317: “Return icon to initial position”), and performs the processing in step S302.
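The flow of steps S302 through S317 can be summarized as a loop. The following is a condensed Python sketch of that loop; the facade object `dev`, its method names, and the sound-effect labels are assumptions introduced for illustration, not interfaces defined by this embodiment, and the mapping to step numbers is approximate.

```python
import time

MENU_TIMEOUT_S = 10.0  # the predetermined time checked in S315

def operation_loop(dev):
    """Condensed sketch of steps S302 to S317.

    `dev` is a hypothetical facade over the sensing unit 103,
    the display unit 112, and the speaker 113.
    """
    while True:
        if not dev.hand_in_region(1):            # S302/S305: No
            if dev.icon_moved():
                dev.play_sound("leave")          # S306: e.g., the "pop" sound
                dev.reset_icon()                 # S307
            continue
        dev.move_icon_to_driver_side()           # S303
        dev.play_sound("move")                   # S304: e.g., the "whiz" sound
        if dev.hand_in_region(2):                # S308: Yes
            dev.expand_menu("NAVI")              # S309
        elif dev.hand_in_region(3):
            dev.expand_menu("AV")                # S310
        else:
            continue
        dev.play_sound("expand")                 # S311
        opened_at = time.monotonic()             # S312: keep menu expanded
        while True:                              # repeat S313/S315/S316
            if dev.gesture_detected():           # S313
                dev.close_menu()                 # S314
                break
            if time.monotonic() - opened_at > MENU_TIMEOUT_S:
                dev.close_menu()                 # S315 Yes -> S314
                break
            if dev.input_finished():             # S316
                dev.reset_icon()                 # S317
                break
```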
When a menu is displayed, the condition determination in S313, S315, and S316 is performed repeatedly. In the operation flow in
Although it is determined in S313 whether a predetermined gesture is detected, another configuration is also possible in which the voice recognition unit 108 determines whether a predetermined speech is detected. The word “cancel”, “home”, or “return” may be used as the predetermined speech. This configuration allows the user to stop displaying the menu, displayed in the expanded, fan-like manner, without having to bring the hand before the sensing unit 103, reducing the possibility that the user is distracted from driving the vehicle. It is also possible to stop displaying the menu, displayed in an expanded manner, and to return the icon to the initial display position by pressing a button such as a hard switch button or a steering controller button.
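A small sketch of how the three dismissal paths described above (a predetermined gesture, a recognized cancel word, or a hard button) might be combined into a single check; the function and parameter names are hypothetical.

```python
# Hypothetical combination of the dismissal paths described above.
CANCEL_WORDS = {"cancel", "home", "return"}

def should_close_menu(gesture_seen, recognized_word, hard_switch_pressed):
    """True when any dismissal path fires: gesture, speech, or button."""
    word_matched = (recognized_word or "").lower() in CANCEL_WORDS
    return gesture_seen or word_matched or hard_switch_pressed

print(should_close_menu(False, "Return", False))  # -> True
```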
The configuration in which a sound effect or a voice is output from the speaker 113 may be changed to a configuration in which the tactile interface unit 114 is started either instead of outputting a sound effect from the speaker 113 or at the same time the sound effect is output from the speaker 113. This configuration allows the information to be transmitted through the user's sense of touch even when the surrounding noise is so loud that the user cannot hear the sound from the speaker 113, making it possible to suitably send the status of the operation to the user.
According to this embodiment, a predetermined icon is moved to the driver's side and is displayed on the display unit 112 simply by the driver bringing his or her hand before the display unit 112 as described above. Therefore, the driver can perform the touch operation for the lower-level menu of the icon without largely changing the driving posture.
In addition, the lower-level menu of a desired icon is displayed, not by touching the icon, but simply by bringing his or her hand near to the icon. Therefore, the effort, the number of times, or the length of time required for the touch operation can be reduced. This reduces the possibility that the touch operation distracts the driver from driving.
In addition, because the menu, once displayed, remains displayed for a predetermined time, the driver can return his or her hand to the steering wheel and later resume the operation with the menu still displayed, with the result that the time for redisplaying the menu is eliminated. The display of a menu can be stopped when a predetermined time elapses or when the user performs a simple operation such as a gesture or a voice command and, therefore, the possibility that the user is distracted from driving is reduced.
When displaying a menu in S309 or S310, a configuration is possible in which the operable menus are limited based on the information received from the traveling state input unit 111. More specifically, the vehicle-mounted device control unit 102 determines the traveling state received from the traveling state input unit 111 and allows the driver to perform an operation on all menus when the vehicle is not in the traveling state and limits an operation on a part of the menus when the vehicle is in the traveling state.
In this embodiment, the menus “Destination” and “Surrounding area search” are grayed out and unavailable for the touch operation during traveling as shown in
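A brief sketch, under assumed names, of the traveling-state gating described above: menus on a restricted list are flagged unavailable (and would be drawn grayed out) while the vehicle is traveling, and all menus are operable otherwise.

```python
# Menus unavailable for touch operation while the vehicle is traveling,
# per the example in this embodiment; names are assumptions.
RESTRICTED_WHILE_TRAVELING = {"Destination", "Surrounding area search"}

def menu_states(menu_items, is_traveling):
    """Pair each menu item with its availability flag."""
    return [
        (item, not (is_traveling and item in RESTRICTED_WHILE_TRAVELING))
        for item in menu_items
    ]

items = ["Destination", "Surrounding area search", "Position registration", "Home"]
print(menu_states(items, is_traveling=True))
# "Destination" and "Surrounding area search" come back flagged unavailable
# (shown grayed out); the other items remain operable.
```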
When fewer sensors are used in the sensor element arrangement in
The configuration of a device in this embodiment is the same as that of the vehicle-mounted device 101 shown in
If the vehicle-mounted device control unit 102 detects that the user's hand is present in region 1 based on the information received from the sensing unit 103, it performs control for the display unit 112 to move a predetermined icon (NAVI button) displayed on the display unit 112 (S302 to S304) and performs control to expand and display the lower-level menu of the moved icon on the display unit 112 as shown in
After that, if it is detected that the user's hand is present in region 2, such as the one shown in
On the other hand, if it is detected that the user's hand is present in region 3 such as the one shown in
After the processing of S306 is performed, the vehicle-mounted device control unit 102 additionally performs control for the display unit 112 to stop the display of the lower-level menu of the NAVI icon or the AV icon (S1005: “Close expanded menu”).
The lower-level menu displayed in S1001 may be not only that of the NAVI icon but also that of the AV icon. In addition, a configuration is also possible in which the user determines this display setting in advance. This configuration allows a user-tailored menu to be displayed, reducing the effort and the number of operations required to perform a desired operation.
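The region handling of this embodiment, including a user-configurable default icon as suggested above, might look like the following sketch; the `dev` facade, its method names, and the `preferred_icon` parameter are illustrative assumptions.

```python
# Sketch of this embodiment's region handling; the icon expanded in
# region 1 is taken from a hypothetical user setting.
def on_hand_position(dev, region, preferred_icon="NAVI"):
    if region == 1:
        dev.move_icon_to_driver_side()      # S302 to S304
        dev.expand_menu(preferred_icon)     # S1001: expanded already in region 1
    elif region == 2:
        dev.expand_menu("NAVI")             # hand near the upper half
    elif region == 3:
        dev.expand_menu("AV")               # hand near the lower half
    else:
        dev.close_menu()                    # S1005: hand has left region 1
        dev.reset_icon()                    # S307
```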
The vehicle-mounted device 101 that performs the above operation enables the driver, simply by extending a hand, to move an icon displayed on the display unit 112 to the driver's side and to display the lower-level menu buttons of that icon. Therefore, this vehicle-mounted device 101 allows the driver to operate the vehicle-mounted device without largely changing the driving posture and reduces the effort, the number of operations, and the time required for the touch operation, thus reducing the possibility that the user is distracted from driving the vehicle.
The configuration of a device in this embodiment is the same as that of the vehicle-mounted device 101 shown in
The operation of the vehicle-mounted device 101 in this embodiment is described in detail below with reference to the operation flow in
First, when the engine of the vehicle is started, the operation of the vehicle-mounted device 101 is started. As shown in
In this embodiment, the NAVI icon and the AV icon are displayed on the display unit 112. The sensing unit 103 monitors whether the hand of a user in the assistant driver's seat is detected in region 4 (left-half region before the display unit 112 at the first distance from the sensing unit 103) such as the one shown in
The vehicle-mounted device control unit 102 performs control to move the icon (NAVI icon in
The sound effect is the sound “whiz” indicating that an object moves. The vehicle-mounted device control unit 102 performs control for the display unit 112 to return the icon, which has been moved to the assistant driver's seat side, to the initial display position (S1307: “Return icon to initial position”).
If the sensing unit 103 detects the user's hand in region 4 (S1305: “Is user's hand present in region 4?” Yes) and detects the user's hand also in region 5 (left-half region before the display unit 112 at the second distance from the sensing unit 103) in
In this embodiment, “Destination”, “Surrounding area search”, “Position registration”, and “Home”, which are lower-level menus of the NAVI icon, are displayed. Similarly, if the sensing unit 103 detects the user's hand in region 6 (right-half region before the display unit 112 at the second distance from the sensing unit 103) in
In this embodiment, “FM/AM”, “List”, “Forward”, and “Reverse”, which are lower-level menus of the AV icon, are displayed.
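A minimal sketch of the passenger-side region mapping described above (regions 4 to 6); the distances follow the examples given earlier in this description, while the function and parameter names are assumptions.

```python
# Passenger-side region mapping of this embodiment; example distances
# are those given earlier, names are assumptions.
FIRST_DISTANCE_CM = 5.0
SECOND_DISTANCE_CM = 2.5

def passenger_region(distance_cm, on_left_half):
    """Map a hand position before the display to region 4, 5, or 6."""
    if distance_cm <= SECOND_DISTANCE_CM:
        return 5 if on_left_half else 6  # region 5 -> NAVI menu, 6 -> AV menu
    if distance_cm <= FIRST_DISTANCE_CM and on_left_half:
        return 4                         # left half at the first distance
    return None                          # no region defined here

print(passenger_region(2.0, on_left_half=True))   # -> 5
print(passenger_region(4.0, on_left_half=True))   # -> 4
```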
When the processing in S1309 or S1310 is performed, the vehicle-mounted device control unit 102 performs control to output a sound effect or a voice, which is adjusted to the processing on the display unit 112, from the speaker 113 (S1311: “Output sound effect from speaker”). For example, the “splashing sound” that sounds like the splashing of an object is output.
After that, the menu is kept displayed (S1312: “Keep menu expanded”). If the sensing unit 103 detects that the user performs a gesture (for example, the user performs the bye-bye motion before the sensing unit 103) (S1313: “Is predetermined gesture detected?” Yes), the display of the displayed menu is stopped (S1314: “Close expanded menu”) and the processing in S1306 and S1307 is performed. If a gesture is not detected (S1313: “Is predetermined gesture detected?” No) and a predetermined time, for example, ten seconds, has elapsed after the menu is displayed (S1315: “Is predetermined time elapsed?” Yes), the processing proceeds to S1314 and the display of the menu displayed in the expanded manner is stopped.
When a displayed menu is touched and the user input operation is accepted, the menu selected through the touch is displayed at the position, where the icon has been displayed, as an icon and the lower-level menu of the selected menu is displayed in a fan-like, expanded manner. For example, if “Destination” is selected through the touch, “Destination” is displayed at the position, where the NAVI icon has been displayed, as an icon and the lower-level menu of “Destination” is displayed in a fan-like, expanded manner. When the menu selection reaches the lowest layer and a desired item is selected (S1316: “Is user's input operation terminated?” Yes), the vehicle-mounted device control unit 102 performs control for the display unit 112 to set the icon to the highest level of the menu, returns the icon, which has been moved to the driver's side, to the initial display position (S1317: “Return icon to initial position”), and performs the processing in S1302.
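The hierarchical descent described above (a touched menu becomes the icon and its lower-level menu expands, down to the lowest layer) can be sketched as a walk over a nested structure. The menu tree below is a hypothetical stand-in, using items named in this description.

```python
# Hypothetical menu tree; a leaf (None) marks the lowest layer.
MENU_TREE = {
    "NAVI": {
        "Destination": {"Address": None, "Phone number": None},
        "Surrounding area search": None,
        "Position registration": None,
        "Home": None,
    },
}

def select(path, tree=MENU_TREE):
    """Walk the tree along `path`: each touched item becomes the icon
    and its children are expanded, until a leaf is reached."""
    node = tree
    for name in path:
        node = node[name]
        print(f"icon: {name}; expanded: {list(node) if node else 'done'}")
    return node is None  # True when the lowest layer was selected

select(["NAVI", "Destination", "Address"])  # descends to a leaf item
```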
After the lower-level menu of the icon is displayed, the processing in S1313, S1315, and S1316 is performed repeatedly. In the operation flow in
This configuration allows the user to switch the display of menus smoothly, making it easier to search for a desired menu. Although it is determined in S1313 whether a predetermined gesture is detected, another configuration is possible in which the voice recognition unit 108 determines whether a predetermined speech is detected. The word “cancel”, “home”, or “return” may be used as the predetermined speech. This configuration allows the user to stop displaying the menu and to return the icon to the initial display position without having to bring the hand before the sensing unit 103. It is also possible to stop displaying the menu, displayed in an expanded manner, and to return the icon to the initial display position by pressing a button such as a hard switch button or a steering controller button.
Operating the vehicle-mounted device based on the operation flow described above allows not only the driver but also a person in the assistant driver's seat to display a menu on the assistant driver's seat side simply by bringing the hand before the panel. In addition, when performing a desired operation, the lower level menu of a desired icon is displayed, not by touching the icon, but by simply bringing the hand near to the icon. Therefore, the effort or the number of times required for the touch operation can be reduced. In addition, when it is necessary to stop the display of a menu, the display can be released when a predetermined time elapses or when a gesture or a voice is recognized and, therefore, there is little or no distraction for the person in the assistant driver's seat.
Although the vehicle-mounted device is used in all embodiments, the present invention is not limited thereto. The present invention is applicable to a device, such as a personal computer or a digital signage, that has a display unit and input means.
Note that, the present invention is not limited to the above-described embodiments, but includes various modifications. For example, though the above embodiments have been described in detail in order to clearly describe the present invention, the present invention is not necessarily limited to the embodiments including all the described configurations. Moreover, it is possible to replace a part of the configuration of a certain embodiment with a configuration of another embodiment, and it is also possible to add a configuration of another embodiment to the configuration of a certain embodiment. For a part of the configuration of each embodiment, addition, deletion, or replacement of another configuration is possible.
The above description includes the control lines and the information lines considered necessary for the explanation; not all of the control lines and information lines of an actual product are necessarily included. In practice, it may be considered that almost all configurations are interconnected.
Foreign Application Priority Data: 2013-141304, filed Jul. 2013, Japan (national).
The present application is a continuation of U.S. application Ser. No. 14/771,304, filed Aug. 28, 2015, which is a National Phase of International Application No. PCT/JP2014/064099, filed May 28, 2014, which claims priority from Japanese patent application JP 2013-141304, filed on Jul. 5, 2013, the contents of which are hereby incorporated by reference into this application in their entirety.
Related U.S. Application Data: parent, U.S. application Ser. No. 14/771,304, filed Aug. 2015; child, U.S. application Ser. No. 15/947,519.