The present disclosure relates to a vehicular display apparatus and a displaying method.
For instance, a character input apparatus is provided to permit a user to input a character with a pointing device such as a mouse or a trackball and to display the inputted character. Further, a character input apparatus is provided to permit a user to input a character by operating a touch panel and to display the inputted character. This type of character input apparatus displays a rectangular array listing the characters that can be inputted, such as Japanese syllabary characters, alphabet letters, symbols, and numbers. When the user selects a character, the selected character is inputted. In contrast, a circle-shaped display may be used for a display apparatus.
According to an example of the present disclosure, in a vehicular display apparatus, in response to detection of a rotation operation, a view of a plurality of items displayed on a circular ring-shaped second display area is rotated along the outer periphery of a circle-shaped first display area. An item displayed at a predetermined position among the plurality of items on the second display area is determined as a processing target item. Processing for inputting a character on a character input screen on the first display area is executed according to the determined processing target item.
The objects, features and advantages of the present disclosure will become more apparent from the following detailed description made with reference to the accompanying drawings. In the drawings:
Hereinafter, an embodiment will be described with reference to the drawings. A vehicular display apparatus 1 is an apparatus that is mounted at a position where a driver can operate the apparatus while sitting in a driver's seat in a vehicle cabin. As shown in
The controller 2, which may also be referred to as a processor, may be configured to be a microcomputer (i.e., computer). As one example of the present embodiment, such a microcomputer will be described as including a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and an I/O (Input/Output). The controller 2 (i.e., the microcomputer) executes processing corresponding to a computer program by executing the computer program stored in the non-transitory tangible storage medium, and controls the overall operation of the vehicular display apparatus 1.
Further, the functions executed by the controller 2 may be achieved by one or more controllers 2 (i.e., one or more processors), or one or more computers.
The computer program executed by the controller 2 includes a displaying program executing a displaying method.
The display 3 includes a first display unit 10 and a second display unit 11, as shown in
The second display unit 11, which is provided concentrically with the first display unit 10, has a circular ring-shaped second display area 13 as a display screen that surrounds the outer periphery of the first display unit 10. The second display unit 11 receives a second display command signal from the controller 2 and then displays the second display information specified by the received second display command signal on the second display area 13. For example, when the mail application is activated, the second display unit 11 displays, on the second display area 13, several icons indicating operations that can be performed by the user on the mail screen view as the second display information. The icons include a reply icon 14a (REP), a transfer icon 14b (TRA), a save icon 14c (SAV), an address storage icon 14d (AD STO), and an edit icon 14e (EDI), as shown in
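The arrangement of icons on the circular ring-shaped second display area 13 can be illustrated with a short sketch. This is not part of the disclosure; the function name `ring_positions` and the coordinate convention (ring center at the origin, the uppermost position at 90 degrees, placement proceeding clockwise) are assumptions made only for illustration.

```python
import math

def ring_positions(labels, radius, offset_deg=0.0):
    """Place labels at equal angular intervals on a ring, starting from
    the top (12 o'clock) and proceeding clockwise; offset_deg rotates
    the whole arrangement. Returns (label, x, y) tuples."""
    step = 360.0 / len(labels)
    positions = []
    for i, label in enumerate(labels):
        # 90 degrees is the top of the circle in standard math
        # coordinates; subtracting i * step makes placement clockwise.
        angle = math.radians(90.0 - i * step + offset_deg)
        positions.append((label, radius * math.cos(angle),
                          radius * math.sin(angle)))
    return positions

# The five mail icons from the description, starting at the top.
icons = ["REP", "TRA", "SAV", "AD STO", "EDI"]
layout = ring_positions(icons, radius=100.0)
```

A rotation operation would then simply change `offset_deg` and redraw.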
The operation switch 4 is a touch panel provided in the first display area 12 of the first display unit 10 and the second display area 13 of the second display unit 11. For example, when the user touches the first display area 12 to perform a screen change operation, the operation switch 4 outputs a screen operation detection signal that can specify the screen change operation to the controller 2. Upon receiving the screen operation detection signal from the operation switch 4, the controller 2 specifies a screen change operation on the first display area 12 by the received screen operation detection signal. The first display command signal corresponding to the specified screen change operation is then outputted to the first display unit 10.
In this case, upon receiving the first display command signal from the controller 2, the first display unit 10 switches or changes the display screen view according to the received first display command signal. That is, when the user performs a drag operation as a screen change operation while the mail screen view illustrated in
Further, for example, when the user touches the second display area 13 to perform a rotation operation, the operation switch 4 outputs a rotation operation detection signal capable of specifying the rotation operation to the controller 2. When the rotation operation detection signal is received from the operation switch 4, the controller 2 specifies a rotation operation on the second display area 13 based on the received rotation operation detection signal. The second display command signal corresponding to the specified rotation operation is outputted to the second display unit 11.
In this case, when the second display command signal is received from the controller 2, the second display unit 11 rotates the view (i.e., display) of the icons on the second display area 13 according to the received second display command signal. That is, when the user performs a rotation operation while the mail screen view is displayed as illustrated in
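The detection of a rotation operation on the ring-shaped touch panel can be sketched as converting a touch drag into an angular change about the ring center. The function `drag_angle_deg` and its coordinate convention (standard math coordinates, counterclockwise positive) are hypothetical; the disclosure does not specify how the operation switch 4 derives the rotation amount.

```python
import math

def drag_angle_deg(cx, cy, x0, y0, x1, y1):
    """Angular change (in degrees, counterclockwise positive) of a touch
    drag from (x0, y0) to (x1, y1) around the ring center (cx, cy)."""
    a0 = math.atan2(y0 - cy, x0 - cx)
    a1 = math.atan2(y1 - cy, x1 - cx)
    delta = math.degrees(a1 - a0)
    # Normalize to (-180, 180] so a drag across the +/-180 degree
    # boundary is reported as the short way around.
    return (delta + 180.0) % 360.0 - 180.0
```

The controller could accumulate these deltas and rotate the icon view by the total angle.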
The remote control terminal 16 is provided separately from the vehicular display apparatus 1. When the user operates the remote control terminal 16, an operation detection signal capable of specifying the operation content is transmitted to the remote control sensor 5 by wireless communication such as WiFi (registered trademark) or Bluetooth (registered trademark). When receiving the operation detection signal from the remote control terminal 16, the remote control sensor 5 outputs the received operation detection signal to the controller 2. The remote control terminal 16 is configured to be capable of performing a plurality of operations such as a long press, a short press, and movements in eight directions (up, down, left, right, and the four diagonals).
The internal memory 6 and the external storage 7 are configured to be able to store various databases and the like. The controller 2 outputs a read signal to the internal memory 6 or the external storage 7. The storage information specified by the read signal is thereby read out from the storage information stored in the internal memory 6 or the external storage 7. The read storage information is displayed on the display 3. That is, when activating the mail application, the controller 2 outputs a mail information read signal to the internal memory 6 or the external storage 7. The mail information stored in the internal memory 6 or the external storage 7 is read out. The above mail screen view is thereby displayed on the display 3 according to the read mail information. Further, when the navigation application is activated, the controller 2 outputs a navigation information read signal to the internal memory 6 or the external storage 7. The navigation information stored in the internal memory 6 or the external storage 7 is thereby read out. The map screen view described above is then displayed on the display 3 according to the read navigation information.
The speaker 8 is arranged at a position where the user can hear a speech in the vehicle cabin. When a speech command signal is received from the controller 2, a speech specified by the received speech command signal is outputted.
The microphone 9 is arranged at a position in the vehicle cabin at which a speech uttered by the user can be captured. When a speech uttered by the user is captured, a speech capture signal that can specify the captured speech is outputted to the controller 2. When the speech capture signal is received from the microphone 9, the controller 2 recognizes the speech specified by the received speech capture signal, and specifies the speech uttered by the user. For example, the controller 2 receives a speech capture signal from the microphone 9 in response to the user uttering the speech of “KE”, “TU”, “TE”, and “I”, which signifies “determine” in English. The speech specified by the received speech capture signal is thereby recognized, and the speech uttered by the user is specified as “KE”, “TU”, “TE”, and “I”.
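Once the syllable sequence has been recognized, mapping it to a command can be a simple lookup. The sketch below is illustrative only: the names `COMMANDS` and `match_command` are hypothetical, and the two syllable sequences are the ones given in the description (“KE-TU-TE-I” for “determine” and “A-PU-RI-KU-DO-U” for “activate an application”).

```python
# Recognized syllable sequences mapped to commands; sequences taken
# from the description, command identifiers are assumptions.
COMMANDS = {
    ("KE", "TU", "TE", "I"): "determine",
    ("A", "PU", "RI", "KU", "DO", "U"): "activate_app",
}

def match_command(syllables, commands=COMMANDS):
    """Return the command whose syllable sequence exactly matches the
    captured speech, or None if no command matches."""
    return commands.get(tuple(syllables))
```

For example, `match_command(["KE", "TU", "TE", "I"])` yields `"determine"`, and an unrecognized utterance yields `None`.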
The controller 2 includes a first display control unit 2a, a second display control unit 2b, a rotation operation detection unit 2c, a determination operation detection unit 2d, a screen change detection unit 2e, and a processing execution unit 2f. The first display control unit 2a outputs the first display command signal described above to the first display unit 10 and controls the view (i.e., display) on the first display area 12. The second display control unit 2b outputs the above-described second display command signal to the second display unit 11, and controls the view on the second display area 13.
The rotation operation detection unit 2c receives the rotation operation detection signal described above from the operation switch 4 and detects a rotation operation performed on the second display area 13 by the user. The determination operation detection unit 2d receives the above-described speech capture signal from the microphone 9, and detects a determination operation performed by the user. The screen change detection unit 2e receives the screen change detection signal from the operation switch 4 and detects a screen change operation performed on the first display area 12 by the user.
The processing execution unit 2f determines an item displayed at a predetermined position among a plurality of items displayed on the second display area 13 as a processing target item. The processing is executed according to the determined processing target item. More specifically,
Suppose that the navigation application is activated and the destination icon 15a is displayed at the uppermost position of the second display area 13 (see “P” in
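The selection rule described above, under which the item displayed at the predetermined (here, uppermost) position becomes the processing target, can be sketched as follows. The function `item_at_top` and the representation of the rotation as a whole number of icon positions are assumptions made for illustration.

```python
def item_at_top(items, rotation_steps):
    """Item currently at the uppermost position P after the view has
    been rotated rotation_steps positions clockwise (negative values
    mean counterclockwise). items lists the icons clockwise from the
    top, so items[0] is at P when rotation_steps == 0."""
    # A clockwise rotation by one step brings the counterclockwise
    # neighbor of the top slot (the last list element) to position P.
    return items[-rotation_steps % len(items)]
```

With the navigation icons listed clockwise from the top, `item_at_top(icons, 0)` is the destination icon, and each clockwise step brings the previous neighbor to position P.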
Next, the operation of the above configuration will be described with reference to
(1) Application Monitoring Process
When activating the application monitoring process, the controller 2 determines whether an activation operation for activating the application, a stop operation for stopping the application, or a change operation for changing the application has been performed (S1 to S3). In this case, the user operates an application activation icon on a menu screen view (not shown) or utters a speech of “A”, “PU”, “RI”, “KU”, “DO”, “U”, which signifies “activate an application” in English, to thereby activate the application. The user can similarly perform a stop operation or a change operation for the application.
When the controller 2 determines that an activation operation for activating the application has been performed (S1: YES), the controller 2 activates the application specified by the activation operation (S4). That is, the controller 2 activates the mail application when an activation operation for activating the mail application is performed. As shown in
When the controller 2 determines that a stop operation for stopping the application has been performed (S2: YES), the controller 2 stops the application being activated at that time (S5). That is, if a stop operation is performed while the mail application is activated, the controller 2 stops the activated mail application. If a stop operation is performed while the navigation application is activated, the controller 2 stops the activated navigation application.
When the controller 2 determines that the change operation for changing the application has been performed (S3: YES), the controller 2 stops the activated application at that time. The application specified by the change operation is activated to change the applications (S6). That is, for example, when the change operation to the navigation application is performed while the mail application is activated, the controller 2 stops the activated mail application. The navigation application is then activated and the mail application is changed to the navigation application. The controller 2 determines whether the accessory signal is switched from ON to OFF (S7). As long as the accessory signal is ON, the controller 2 repeats the above steps S1 to S6. When the controller 2 determines that the accessory signal has been switched from ON to OFF, the controller 2 ends the application monitoring process.
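The monitoring loop of steps S1 to S7 can be summarized in a short sketch. It is illustrative only: the function name `application_monitor`, the event tuples, and the string identifiers are assumptions, and the real controller would of course run continuously rather than over a prepared list.

```python
def application_monitor(events):
    """Apply activation, stop, and change events (S1-S6) to the active
    application until the accessory signal turns off (S7). Each event
    is a (kind, app) tuple; returns the app active at the end."""
    active = None
    for kind, app in events:
        if kind == "activate":      # S1 -> S4: activate the named app
            active = app
        elif kind == "stop":        # S2 -> S5: stop the active app
            active = None
        elif kind == "change":      # S3 -> S6: stop current, start new
            active = app
        elif kind == "acc_off":     # S7: accessory signal OFF ends loop
            break
    return active
```

For example, activating the mail application and then performing a change operation to the navigation application leaves the navigation application active.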
(2) Process Under Application Activation
When activating the process under application activation, the controller 2 determines whether the user's screen change operation on the first display area 12 or the user's rotation operation on the second display area 13 has been performed (S11, S12). The controller 2 receives a screen operation detection signal from the operation switch 4 and determines that the user has performed a screen change operation (S11: YES). Thereby, the controller 2 outputs a first display command signal corresponding to the screen change operation to the first display unit 10, and switches the view on the first display area 12 according to the screen change operation (S13). That is, the controller 2 switches the mail screen view according to the screen change operation when the mail application is activated, and switches the map screen view according to the screen change operation when the navigation application is activated.
The controller 2 receives a rotation operation detection signal from the operation switch 4 and determines that the user has performed the rotation operation on the second display area 13 (S12: YES). Thereby, the controller 2 outputs a second display command signal corresponding to the rotation operation to the second display unit 11, and rotates the view of the item on the second display area 13 according to the rotation operation (S14, corresponding to a display control step). The controller 2 then determines whether the operation for determining the processing target item has been performed (S15). That is, the controller 2 rotates the view of the various icons 14a to 14e when the mail application is activated, while rotating the view of the various icons 15a to 15e when the navigation application is activated. It is then determined whether the operation of determining the processing target item has been performed.
The controller 2 detects, for example, that the user has uttered the speech “KE”, “TU”, “TE”, and “I”, which signifies “determine” in English, and determines that the operation to determine the processing target item has been performed (S15: YES). At that time, the item corresponding to the icon displayed at the uppermost position of the second display area 13 is determined as the processing target item (S16). Then, the controller 2 executes the processing according to the determined processing target item, and outputs a first display command signal to the first display unit 10. A screen view corresponding to the item determined as the processing target item is thereby displayed on the first display area 12 (S17, corresponding to a processing execution step).
That is, suppose that the user utters the speech of “KE”, “TU”, “TE”, and “I” while the mail application is activated and the reply icon 14a is displayed at the uppermost position of the second display area 13. Responsive to the detection thereof, as shown in
The controller 2 performs the process described above, so that the user can perform a rotation operation on the second display area 13 while keeping the line of sight fixed to the uppermost position of the second display area 13. By uttering the speech of “KE”, “TU”, “TE”, and “I”, which signifies “determine” in English, while the desired icon is selected, the user can execute the processing corresponding to the selected icon. In other words, even while driving, the user can select a desired icon with as little eye movement as possible and, after ensuring safety, execute the processing corresponding to the selected icon.
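The flow of steps S11 to S17 can be sketched as a small dispatcher over user operations. Everything here is an illustrative assumption (the function `handle_operation`, the `state` dictionary, and the returned action strings); the disclosure describes the flow only in terms of the controller 2 and its command signals.

```python
def handle_operation(op, state):
    """Dispatch one user operation per the S11-S17 flow. state holds
    the clockwise icon list and the cumulative rotation in whole icon
    positions; returns a short description of the resulting action."""
    if op["type"] == "screen_change":        # S11 -> S13: switch view
        return "switch_view"
    if op["type"] == "rotate":               # S12 -> S14: rotate icons
        state["steps"] += op["steps"]
        return "rotate_view"
    if op["type"] == "determine":            # S15 -> S16/S17: pick the
        icons = state["icons"]               # icon at the top position
        target = icons[-state["steps"] % len(icons)]
        return "execute:" + target
    return "ignore"
```

A rotation operation followed by the “determine” utterance thus executes the processing for whichever icon the rotation left at the top.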
The above has described the case where a desired function is selected by selecting the icons 14a to 14e corresponding to the mail screen view or the icons 15a to 15e corresponding to the map screen view. The same applies to input of desired characters. That is, as shown in
Also, the above has described the case where the user selects a desired icon and utters the speech of “KE”, “TU”, “TE”, and “I”, which signifies “determine” in English, to execute the processing corresponding to the selected desired icon. Instead of uttering the speech of “KE”, “TU”, “TE”, and “I”, the user may operate an operation switch in the steering wheel to execute the processing corresponding to the selected desired icon. Also, the position where the user is holding the steering wheel may be detected by a camera or the like.
The processing corresponding to the selected desired icon may be executed by the user holding a predetermined position of the steering wheel.
The embodiment described above may provide effects as below. The user performs a rotation operation on the second display area 13 in the display apparatus 1 for a vehicle. In the second display area 13, a plurality of icons or characters thereby rotate along the outer periphery of the first display unit 10. The processing is executed according to the icon or character displayed at the uppermost position of the second display area 13 among the plurality of icons or characters. The user performs a rotation operation on the second display area 13 while keeping the line of sight fixed to the uppermost position of the second display area 13. A desired icon or character can thus be selected, and processing can be executed according to the selected desired icon or character. That is, even if the user is driving, it is possible to select a desired icon or character by minimizing the line of sight movement. In the configuration including the circle-shaped display 3, it is possible to enhance the usability when inputting characters or selecting functions.
In addition, the user utters the speech of “KE”, “TU”, “TE”, and “I”, which signifies “determine” in English, under the state where a desired icon or character is selected. Responsive thereto, the processing is executed according to the icon or character displayed at the uppermost position of the second display area 13 among the plurality of icons or characters. The user performs a rotation operation on the second display area 13 while keeping the line of sight fixed to the uppermost position of the second display area 13. After that, the speech of “KE”, “TU”, “TE” and “I” is uttered. With only this, the processing can be executed according to the selected desired icon or character.
In the first display unit 10, the density of displayed content is relatively low on the center side and relatively high on the peripheral side. That is, in the case of displaying characters, the character size is relatively large on the center side so that the density of characters is relatively low, whereas the character size is relatively small on the peripheral side so that the density of characters is relatively high. Further, in the case of displaying a map, the scale of the map is relatively large on the center side so that the density of the map is relatively low, whereas the scale is relatively small on the peripheral side so that the density of the map is relatively high. This makes it easier for the user to recognize the information displayed on the center side of the first display unit 10 (i.e., the first display area 12) than the information displayed on the peripheral side.
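The radius-dependent character size described above can be sketched as a simple interpolation. The function `char_size`, the linear interpolation, and the concrete sizes are assumptions for illustration; the disclosure only states that size decreases (and density increases) toward the periphery.

```python
def char_size(r, r_max, size_center=32, size_edge=16):
    """Character size decreasing linearly from the center (r = 0) to
    the periphery (r = r_max), so content is larger and sparser at the
    center and smaller and denser toward the edge."""
    # Clamp the normalized radius to [0, 1] so points outside the
    # display area do not extrapolate beyond the two end sizes.
    t = min(max(r / r_max, 0.0), 1.0)
    return size_center + (size_edge - size_center) * t
```

The same interpolation could drive the map scale, with a larger scale at the center and a smaller scale at the periphery.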
Although the present disclosure has been described in accordance with the embodiment, it is understood that the present disclosure is not limited to such an embodiment or structure. The present disclosure encompasses various modifications and variations within the scope of equivalents. In addition, various combinations and configurations, as well as other combinations and configurations that include only one element, more, or less, fall within the scope and spirit of the present disclosure.
The configuration to display the mail screen view corresponding to the mail application and the map screen view corresponding to the navigation application has been exemplified. The present disclosure can be applied to a case where another screen view corresponding to another application is displayed. For example, when activating an audio application, a play icon, a fast forward icon, a rewind icon, a pause icon, a volume up icon, a volume down icon, or the like may be selectable.
The configuration in which the uppermost position of the second display area 13 is the predetermined position has been exemplified. If the position is easy for the user to see, a position different from the uppermost position of the second display area 13 may be set as a predetermined position.
For reference to further explain features of the present disclosure, the description is added as follows.
Further, a character input apparatus is provided to permit a user to input a character by operating a touch panel and to display the inputted character. This type of character input apparatus displays a rectangular array listing the characters that can be inputted, such as Japanese syllabary characters, alphabet letters, symbols, and numbers. When the user selects a character, the selected character is inputted.
In contrast, a circle-shaped display may be used for a display apparatus. If the above-mentioned rectangular array is applied to a circle-shaped display, a useless area arises. In contrast, a display apparatus having a circle-shaped display may provide a circular ring-shaped area surrounding the outer periphery of the circle-shaped display; the circular ring-shaped area displays rows of Japanese syllabary (“A” row, “KA” row, “SA” row, . . . ). In this case, after any one of the “A” row, “KA” row, “SA” row, . . . is selected, a character corresponding to the “A” column, “I” column, “U” column, “E” column, or “O” column of the selected row may be selected. That is, if the “KA” row is selected, “KA”, “KI”, “KU”, “KE”, and “KO” are displayed as the “A”, “I”, “U”, “E”, and “O” columns of the “KA” row. If “KI” is then selected, “KI” is inputted, for instance. The above-described configuration forces a line-of-sight movement to select a desired row or column. This may not be suitable for use in a vehicle. In addition, two stepwise operations of selecting a row and then selecting a column are required, which makes operations troublesome. Such an issue is not limited to inputting characters, but may also arise when selecting a mail function or a navigation function.
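The two-step row-then-column selection described above can be made concrete with a small sketch. The table `GOJUON` (shown here with only three romanized rows, using the same romanization as the description, e.g. “TU”) and the function `two_step_input` are illustrative assumptions, not part of the disclosure.

```python
# Partial 50-sound table: row label -> the five characters of that row,
# romanized consistently with the description ("TU" rather than "TSU").
GOJUON = {
    "A":  ["A", "I", "U", "E", "O"],
    "KA": ["KA", "KI", "KU", "KE", "KO"],
    "SA": ["SA", "SI", "SU", "SE", "SO"],
}

def two_step_input(row, column_index):
    """The two-step selection the passage criticizes: first pick a row,
    then pick a column (0 = "A" column ... 4 = "O" column) within it."""
    return GOJUON[row][column_index]
```

Selecting the “KA” row and then the “I” column yields “KI”, requiring two separate operations and two shifts of the line of sight, which is the drawback the disclosure addresses.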
Under such circumstances, it is desired to improve usability when performing operations of character input or function selection.
An aspect of the present disclosure described herein is set forth in the following clauses.
According to an aspect of the present disclosure, a vehicular display apparatus is provided as including a first display unit, a second display unit, a second display control unit, a rotation operation detection unit, and a processing execution unit. The first display unit is configured to have a circle-shaped first display area to display a character input screen view. The second display unit is configured to have a circular ring-shaped second display area provided concentrically with the first display unit to surround an outer periphery of the first display unit. The second display control unit is configured to control a view on the second display area. The rotation operation detection unit is configured to detect a rotation operation by a user on the second display area. The processing execution unit is configured to perform processing according to a processing target item. Herein, in response to detection of the rotation operation, the second display control unit is configured to rotate a view of a plurality of items displayed on the second display area along the outer periphery of the first display unit. The processing execution unit is configured to determine, as the processing target item, an item displayed at a predetermined position among the plurality of items displayed on the second display area, and to execute processing of inputting a character on the character input screen view according to the determined processing target item.
When a user performs a rotation operation on the circular ring-shaped second display area, the view of the plurality of items on the second display area rotates along the outer periphery of the circle-shaped first display area. The process is executed according to the item displayed at a predetermined position among the plurality of items. The user can select a desired item by performing a rotation operation on the second display area while keeping the line of sight fixed at the predetermined position. The processing can thus be executed according to the selected desired item. That is, even if the user is driving, it is possible to select a desired item by minimizing the line of sight movement. Such a configuration having a circle-shaped display area can improve the usability when performing operations of character input or function selection.
Priority application: Japanese Patent Application No. 2017-253499, filed December 2017 (JP, national).
The present application is a continuation application of International Patent Application No. PCT/JP2018/037406 filed on Oct. 5, 2018, which designated the U.S. and claims the benefit of priority from Japanese Patent Application No. 2017-253499 filed on Dec. 28, 2017. The entire disclosures of all of the above applications are incorporated herein by reference.
Related U.S. application data: the present application (U.S. application No. 16908238) is a child of parent International Application No. PCT/JP2018/037406, filed October 2018.