The present invention relates to an information processing device and an information processing method.
PTL 1 describes an information display device in which a screen displayed on a display section includes a main screen, a menu screen, and a subscreen, and the menu screen is displayed as a pop-up on the main screen. PTL 2 describes an in-vehicle electronic apparatus which is capable of editing a shortcut button displayed on a menu screen superimposingly displayed in a navigation screen.
When new information is displayed on a display device, pop-up display can be carried out such that, while the previous displayed content is maintained in a partial region of the display region of the display device, new information is displayed in a region excluding the partial region, obtaining a visual effect that a new screen appears in the display region of the display device. A control element for user's operation, such as a button icon, is arranged in a pop-up image displayed through pop-up display, making it possible to provide a new interface to the user.
According to the pop-up display, it is possible to provide new information to the user while partially maintaining the previous displayed content. However, if a user's operation is received in a newly displayed pop-up image and display in the pop-up image is sequentially updated in accordance with the user's operation, it may be difficult to understand the relation between the updated displayed content and the operation as the starting point where the pop-up image is displayed or continuity from the operation as the starting point to the updated displayed content. For example, in a car navigation device, when an operation is carried out which includes a plurality of steps of displaying a pop-up image for setting a destination on a display screen, designating conditions in the displayed pop-up image, searching for destination candidates on the basis of the designated conditions, and finally setting a destination from among the searched destination candidates, it may be difficult for the user to understand situations, such as which operation was first carried out to display the pop-up image, what kind of path was subsequently followed to reach the current displayed content, and which operation should be carried out next.
In consideration of the above-described problem, an object of the invention is to allow a user to easily understand the relation between a current operation and a starting point, and continuity from the starting point to the operation.
In order to achieve the above-described object, according to the invention, a superimposed display image which is generated when a user's operation of a predetermined display element is detected includes a relation display image which allows the user to visually understand the relation between the predetermined display element and the superimposed display image. Even when the displayed content of the superimposed display image is updated, the relation display image is generated so as to be continuously displayed, allowing the user to easily understand the relation between a current operation and a starting point, and continuity from the starting point to the operation.
That is, the invention provides an information processing device capable of processing navigation information. The information processing device includes image processing means for generating a signal for displaying a basic screen including a predetermined display element, to which a predetermined function is assigned, on a display device and outputting the signal to the display device, detection means for detecting a user's operation of the predetermined display element displayed on the display device, and image generation means for generating a superimposed display image for providing the predetermined function assigned to the predetermined display element when the operation is detected by the detection means. When the superimposed display image is generated by the image generation means, the image processing means generates a signal for displaying a superimposed screen, in which the superimposed display image is superimposed on the basic screen, on the display device and outputs the signal to the display device. The superimposed display image generated by the image generation means includes a relation display image which allows the user to visually understand a relation between the predetermined display element and the superimposed display image. The relation display image is generated so as to be continuously displayed on the display device even when a displayed content of the superimposed display image is updated.
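The three "means" recited above can be sketched in code. This is an illustrative model only, not from the specification; all class and field names (`InformationProcessingDevice`, `relation_display`, and so on) are hypothetical:

```python
# Hypothetical sketch of the claimed structure: image processing means
# (render), detection means (on_touch), and image generation means
# (generate_superimposed_image). The relation display image is kept with
# the pop-up so it persists when the pop-up's content is updated.

class InformationProcessingDevice:
    def __init__(self):
        self.superimposed_image = None  # current pop-up, if any

    def render(self):
        """Image processing means: compose what is sent to the display."""
        screen = ["basic screen"]
        if self.superimposed_image is not None:
            screen.append(self.superimposed_image)  # superimposed in front
        return screen

    def on_touch(self, element_id):
        """Detection means: a user's operation of a display element."""
        self.superimposed_image = self.generate_superimposed_image(element_id)

    def generate_superimposed_image(self, element_id):
        """Image generation means: the pop-up carries a relation display
        image (e.g. a balloon tail) tied to the operated element."""
        return {"content": f"function assigned to {element_id}",
                "relation_display": f"balloon tail -> {element_id}"}

device = InformationProcessingDevice()
device.on_touch("destination")
print(device.render())
```

Because the relation display image is stored with the pop-up rather than regenerated per content update, it remains on screen however the pop-up's displayed content changes, which is the continuity property the claim describes.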
In the invention, a basic screen refers to a screen on which a superimposed display image is superimposingly displayed in front, that is, a screen which becomes a background when viewed from the superimposed display image, and is a desktop screen, a window screen, a navigation screen, an AV (Audio Visual) screen, or the like in a system, such as an in-vehicle device including a GUI (Graphical User Interface). However, the basic screen is not limited to the above-described example and may be a screen which can become the background of the superimposed display image.
The basic screen includes a predetermined display element to which a predetermined function is assigned. Here, the predetermined function is one of various functions which are provided by the information processing device. For example, when the information processing device is an in-vehicle navigation device or is mounted in an in-vehicle navigation device, a destination search/setting function which is activated from a navigation screen is exemplified. The predetermined function is activated when a user's operation of a predetermined display element for calling the function is received. To this end, it is preferable that the predetermined display element is an image which allows the user to intuitively understand the function which can be called. In many cases, the predetermined display element is expressed by a graphic or characters and is an image (icon) which functions as a button.
If the user operates a display element, a function assigned to the operated display element is activated, the operation is detected by the detection means, and a superimposed display image is generated by superimposed display image generation means. In the invention, the superimposed display image is an image which is superimposingly displayed in front of the basic screen so as to provide a predetermined function. In a system including a GUI, a pop-up image is an example of the superimposed display image. For example, in an in-vehicle navigation device, if a navigation screen (basic screen) is displayed and a user's touch operation of a destination button (predetermined display element) for calling a destination search/setting function (predetermined function) is detected, a pop-up image (superimposed display image) for operating the destination search/setting function is generated and superimposingly displayed in front of the navigation screen as a pop-up.
The superimposed display image includes a relation display image which represents the relation between the superimposed display image and the predetermined display element, and the relation display image is generated so as to be continuously displayed regardless of the update of the superimposed display image. The relation display image is an image which allows the user to visually understand the relation between the display element and the superimposed display image which is displayed when a user's operation of the display element is received. As the relation display image, for example, a balloon-like image may be used, or various images, such as a connector-like image connecting a superimposed display image to a display element, may be used.
According to the invention, a relation display image is used to display the relation between a display element and a superimposed display image which is superimposingly displayed when a user's operation of the display element is received. Thus, it is possible for the user to easily understand the relation between a current operation and a starting point, and continuity from the starting point to the operation.
In displaying the superimposed display image generated by the image generation means, the image processing means may generate a signal for displaying an animation, in which the superimposed display image appears with a position where the predetermined display element is displayed as a starting point, on the display device and may output the signal to the display device. In displaying a superimposed display image, a visual effect is added with a display element whose operation was carried out to cause display as a starting point, making it possible for the user to more intuitively understand the relation and continuity.
The superimposed display image may be an image in which scroll display of information is possible. Since scroll display is possible within the superimposed display image, a large amount of information can be conveyed to the user within a limited display range. Moreover, because information is scroll-displayed inside the superimposed display image, the relation display image remains continuously displayed even when the displayed content of the superimposed display image is updated.
A user's operation may be received through an input device, such as a keyboard or a mouse connected to the information processing device, or a button provided as hardware, the display device may be a touch panel display, and the detection means may detect the user's touch operation on the touch panel display. With this configuration, the relation between a place that the user actually touched in order to carry out a touch operation and a superimposed display image is displayed, making it possible for the user to more intuitively understand the relation and continuity.
The invention may also be embodied as a method or as a program which causes a computer to function as the above-described means. The invention may also be embodied as a recording medium, readable by a computer or another device or machine, in which the program is recorded. Here, a computer-readable recording medium refers to a recording medium which accumulates information, such as data or a program, by electrical, magnetic, optical, mechanical, or chemical action, and from which the information can be read by a computer or the like.
According to the invention, it becomes possible for the user to easily understand the relation between a current operation and a starting point, and continuity from the starting point to the operation.
Hereinafter, the best mode for carrying out the invention will be illustratively described. The following embodiment is just for illustration, and the invention is not limited thereto.
<Configuration>
Hereinafter, the configuration of the main unit 2 will be described. The brake detection section 4 detects whether or not the parking brake of the vehicle is applied, and notifies the detection result to the control section 20. The brake detection section 4 detects the state of the brake from the conduction state of a switch which is switched on/off in interlocking with the motion of the parking brake lever (or pedal). The brake detection section 4 electrically detects the conduction state of the switch through a terminal 26A.
The reverse detection section 5 detects whether or not the gearshift of the vehicle is at the reverse position (backward movement) and notifies the detection result to the control section 20. The reverse detection section 5 detects the state of the gearshift from the on/off of a switch which moves in interlocking with the gearshift. The reverse detection section 5 electrically detects the conduction state of the switch through a terminal 26B.
The portable player interface 6 is an interface for bidirectional communication with a portable player (for example, iPOD (Registered Trademark)) which reproduces music or the like. If a portable player is externally connected, the portable player interface 6 starts bidirectional communication to send an audio signal from the player to the control section 20 and to send a control signal, such as reproduction start or music selection, from the control section 20 to the player. The portable player interface 6 performs communication with the player through a cord connected to a terminal 26C.
The broadcast wave receiving section 7 is a circuit which includes a One Seg tuner (the application for trademark registration for “One Seg” is pending), an AM (Amplitude Modulation) tuner, and an FM (Frequency Modulation) tuner. The broadcast wave receiving section 7 controls the reception state of the tuner in accordance with the control signal from the control section 20 and sends signals of electric waves received by an antenna connected to a terminal 26D to the control section 20.
The external sound/image input section 8 is a circuit which receives a composite image signal or sound signal from a video/audio equipment connected to a terminal 26E and sends the composite image signal or sound signal to the control section 20.
The GPS (Global Positioning System) information receiving section 9 receives signals of electric waves from GPS satellites through a GPS antenna connected to a terminal 26F and sends the received signals to the control section 20. As is well known in the art, the GPS is a system which measures the position of the vehicle on the basis of electric waves from at least three satellites from among the many GPS satellites orbiting the earth. The signals from the GPS satellites received by the GPS information receiving section 9 are used in car navigation.
The vehicle speed detection section 10 is a circuit which detects a vehicle speed pulse signal generated in accordance with the rotation angle of the axle and sends the vehicle speed pulse signal to the control section 20. The vehicle speed pulse signal detected by the vehicle speed detection section 10 is a step-like vehicle speed pulse signal which is output from a vehicle speed sensor or an electronic control unit controlling the engine or brake of the vehicle, and is used in determining the vehicle speed from the number of pulses per unit time. If the number of pulses per unit time increases, the vehicle is accelerating, and if the number of pulses per unit time decreases, the vehicle is decelerating. The correlation between the speed of the vehicle and the vehicle speed pulses changes depending on the manufacturer who manufactures the vehicle, the vehicle type, the size of each wheel to be mounted, air pressure, or the like. For this reason, in the control section 20, the correlation between the speed of the vehicle and the vehicle speed pulses is appropriately updated from the correlation between the traveling distance of the vehicle calculated on the basis of the positioning result by the GPS and the number of pulses detected during traveling. The vehicle speed detection section 10 electrically detects the vehicle speed pulse signal output from the electronic control unit through a terminal 26G.
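The calibration described above can be sketched as simple arithmetic. This is an illustrative sketch, not the device's actual implementation; the pulse counts and distances are made-up figures:

```python
# Hedged sketch: the pulses-per-meter correlation is re-estimated from the
# GPS-derived travelling distance, then used to convert a pulse rate
# (pulses per unit time) into a vehicle speed.

def calibrate_pulses_per_meter(pulse_count, gps_distance_m):
    """Update the correlation between vehicle speed pulses and distance,
    using the traveling distance calculated from the GPS positioning result."""
    return pulse_count / gps_distance_m

def speed_mps(pulses_per_second, pulses_per_meter):
    """Vehicle speed follows from the number of pulses per unit time."""
    return pulses_per_second / pulses_per_meter

# 2500 pulses observed over a GPS-measured 1000 m -> 2.5 pulses per meter
ppm = calibrate_pulses_per_meter(pulse_count=2500, gps_distance_m=1000.0)
print(speed_mps(50.0, ppm))  # 50 pulses/s at 2.5 pulses/m -> 20.0 m/s
```

Because the factor depends on wheel size, air pressure, and vehicle type, re-running the calibration whenever a reliable GPS distance is available keeps the speed estimate consistent, which is why the control section 20 updates the correlation during traveling.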
The camera image input section 11 is a circuit which receives an image signal from a rear-view camera which is a video camera photographing the rear side of the vehicle and sends the image signal to the control section 20. That is, when the reverse detection section 5 detects the reverse of the vehicle, the camera image input section 11 sends an image signal from the video camera connected to a terminal 26H to the control section 20.
The amplifier 12 is a circuit which amplifies a sound signal sent from the control section 20 to a speaker connected to a terminal 26I in the vehicle interior. The amplifier 12 can arbitrarily change the amplification factor in accordance with the control signal from the control section 20.
The opening/closing control section 13A is a circuit which carries out an opening/closing operation of the display unit 3. The opening/closing control section 13A controls the motor 15 in accordance with the control signal from the control section 20 or processes the signal from the angle sensor 14 to open/close the display unit 3.
The angle control section 13B is a circuit which adjusts the angle of the display unit 3. Similarly to the opening/closing control section 13A, the angle control section 13B controls the motor 15 in accordance with the control signal from the control section 20 or processes the signal from the angle sensor 14 to adjust the angle of the display unit 3. The angle of the display unit 3 refers to the relative angle between the front side of the main unit 2 and the front side of the display unit 3 (that is, the front side of the touch panel 21) centering on the axis extending in the left-right direction of the navigation device 1.
The angle sensor 14 is a sensor which detects the angle of the display unit 3, and notifies the detected angle as an electrical signal to the opening/closing control section 13A and the angle control section 13B. The motor 15 is a motor which adjusts the angle of the display unit 3, and moves up or down the upper end of the display unit 3 or moves the lower end of the display unit 3 forward and backward. Upon receiving the control signal from the control section 20, the opening/closing control section 13A and the angle control section 13B determine the difference between the angle of the display unit 3 detected by the angle sensor 14 and the target value of the angle determined on the basis of the control signal, and perform feedback control of the motor 15 such that the angle of the display unit 3 detected by the angle sensor 14 coincides with the target value.
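The feedback control described above can be sketched as a minimal proportional loop: drive the motor by an amount proportional to the difference between the sensed angle and the target until they coincide. The gain, tolerance, and sensor model below are hypothetical, not taken from the specification:

```python
# Illustrative proportional feedback loop: the motor command is proportional
# to the error between the target angle and the angle reported by the sensor.

def adjust_display_angle(read_angle, drive_motor, target, gain=0.5, tol=0.1):
    for _ in range(100):                 # bounded number of control steps
        error = target - read_angle()    # difference from the target value
        if abs(error) <= tol:            # angles coincide within tolerance
            break
        drive_motor(gain * error)        # command proportional to the error

# Toy plant: the "sensor" reads a stored angle, the "motor" adds to it.
angle = [0.0]
adjust_display_angle(lambda: angle[0],
                     lambda cmd: angle.__setitem__(0, angle[0] + cmd),
                     target=30.0)
print(angle[0])  # close to the 30-degree target
```

A real controller would also bound the motor command and handle sensor noise, but the structure — measure, compare with the target, drive, repeat — is the feedback loop the passage describes.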
The CD drive 16 is an optical disk reading device which reads a CD having recorded therein audio contents, such as music, and reproduces audio contents, and includes an optical pickup lens or light-emitting element, a disk driving motor, and the like.
The card memory interface 17 is a memory card reader/writer which reads and writes a nonvolatile semiconductor memory card, that is, a memory card which retains its contents without a power supply. A memory card inserted into the card memory interface 17 has a storage capacity of about 4 GB and has recorded therein map data including road information, such as highways and local roads, spot information (hereinafter, also referred to as POI (Point Of Interest) data) regarding various facilities, such as theme parks and gas stations, and data, such as telephone numbers and facility names. The control section 20 accesses the map data recorded in the memory card to realize various functions of car navigation, such as route search.
The gyro sensor 19 is a biaxial gyro sensor which is embedded in the main unit 2. The gyro sensor 19 enables vehicle positioning even when the GPS information receiving section 9 cannot receive the electric waves from the GPS satellites. When it is impossible to receive the electric waves from the GPS satellites, the control section 20 calculates the position of the vehicle on the basis of the vehicle speed detected by the vehicle speed detection section 10 and the traveling direction of the vehicle detected by the gyro sensor 19.
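The fallback positioning described above is classic dead reckoning: integrate the detected speed along the heading from the gyro. The sketch below uses a flat x/y plane instead of latitude/longitude for simplicity; it is an assumption-laden illustration, not the device's actual algorithm:

```python
# Illustrative dead reckoning step: when GPS is unavailable, advance the
# position estimate using the vehicle speed and the gyro-derived heading.

import math

def dead_reckon(x, y, heading_deg, speed_mps, dt_s):
    """Advance the position estimate by one time step of dt_s seconds."""
    x += speed_mps * dt_s * math.cos(math.radians(heading_deg))
    y += speed_mps * dt_s * math.sin(math.radians(heading_deg))
    return x, y

pos = dead_reckon(0.0, 0.0, heading_deg=90.0, speed_mps=10.0, dt_s=1.0)
print(pos)  # moved about 10 m along the y axis
```

Dead reckoning drifts over time because speed and heading errors accumulate, which is why the estimate is re-anchored to the GPS position as soon as satellite reception resumes.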
The control section 20 includes a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), an input/output interface, and the like. If the accessory power supply of the vehicle is turned on, the control section 20 executes a computer program recorded in the ROM to realize various functions by using data of the memory card inserted into the card memory interface 17, data stored in the RAM, or the like. The details of various functions which are realized by the control section 20 will be described below.
Next, the constituent elements constituting the display unit 3 will be described. The touch panel 21 is a GUI (Graphical User Interface) in which a color liquid crystal display and a touch sensor are combined. In the touch panel 21, the screen is displayed with a 7.0-inch EGA (Enhanced Graphics Adapter) liquid crystal display, and if an icon or the like displayed on the screen is depressed, the touch sensor detects the depression.
The display processing section 22 is a circuit which draws a screen to be displayed on the liquid crystal display of the touch panel 21. The display processing section 22 drives thin-film transistors arranged in the liquid crystal display in a lattice at uniform intervals on the basis of an image signal sent from the control section 20, and draws the screen of the touch panel 21.
If the touch sensor detects a touch operation on the touch panel 21, the operation receiving section 23 specifies the touched position on the screen and sends information of the specified position to the control section 20.
The operation button 24 is a mechanical button instead of a button (button image) which is displayed on the touch panel 21 in the form of an icon, and as shown in
The infrared ray receiving/emitting unit 25 is an interface for bidirectional communication between the navigation device 1 and a mobile phone using infrared rays, and is constituted by a light-emitting element which electrically emits infrared rays and a light-receiving element which converts the received infrared rays to electricity. The infrared ray receiving/emitting unit 25 sends the control signal or data from the control section 20 to the mobile phone and also sends the control signal or data from the mobile phone to the control section 20. As shown in
Next, various functions which are realized by the control section of the main unit 2 will be described in detail.
The operation processing functional section 51 displays an operation screen for controlling the operations of various functional sections on the touch panel 21 through the image processing functional section 57, or processes an operation signal from the operation receiving section 23, the operation button 24, or the reset button 18 and controls the operations of various functional sections.
The operation processing functional section 51 includes a user operation detection functional section 51a, which detects a user's touch operation of buttons included in a basic screen that becomes the background of a pop-up image (such as a multi screen including a navigation region and an AV region, an entire navigation screen, or an entire AV screen), and a superimposed display image generation functional section 51b, which generates the pop-up image to be superimposingly displayed on the basic screen when the user's touch operation is detected.
If the accessory power supply of the vehicle is powered on, the positioning functional section 52 measures the position (latitude and longitude) of the vehicle on the basis of information of electric waves from the satellites sent from the GPS information receiving section 9, information of the vehicle speed notified from the vehicle speed detection section 10, and information of the angular speed sent from the gyro sensor 19.
The route guidance functional section 53 is a functional section which finds out the route from the current location of the vehicle to the destination set by the user and carries out route guidance. The route guidance functional section 53 finds out the traveling route from the position of the vehicle measured by the positioning functional section 52 to the destination from map data of the memory card inserted into the card memory interface 17. The route of the vehicle is guided by sound and images from the relationship between the found traveling route and the position of the vehicle.
The map data processing functional section 54 generates graphic data of the map displayed on the touch panel 21 on the basis of map data of the memory card inserted into the card memory interface 17 or data of the traveling route found by the route guidance functional section 53, data of VICS (Registered Trademark) road traffic information acquired from FM broadcast waves through the broadcast wave receiving section 7, positional data of the vehicle measured by the positioning functional section 52, and the like.
The user data processing functional section 55 writes spot information (for example, positional information of the home) to be registered by the user or history information of route search and setting information, such as display/non-display of icons, into the RAM or reads the information from the RAM.
The sound processing functional section 56 is a functional section which processes the signal of sound output from the speaker through the amplifier 12. That is, the sound processing functional section 56 sends an audio broadcast received by the broadcast wave receiving section 7, an audio signal acquired from the player by the portable player interface 6, or an audio signal to be reproduced by the CD drive 16 to the amplifier 12, or superimposes a sound signal of route guidance from the route guidance functional section 53 on the audio signal and sends the resultant signal to the amplifier 12.
The image processing functional section 57 is a functional section which generates image data to be displayed on the touch panel 21. That is, the image processing functional section 57 synthesizes data of an operation screen generated by the operation processing functional section 51 and data of the screen of a map for display generated by the map data processing functional section 54 and sends the resultant signal to the display processing section 22, sends image data of television broadcast received by the broadcast wave receiving section 7 to the display processing section 22, or sends an image signal from the camera image input section 11 to the display processing section 22 in interlocking with the detection of the backward movement of the vehicle by the reverse detection section 5. The image processing functional section 57 stops the notification of image data if the brake detection section 4 detects the release of the parking brake in sending image data of television broadcast to the display processing section 22.
<Operation>
Hereinafter, the operation of the navigation device 1 will be described.
(D101) An opening screen (D101) will be described. If the accessory power supply of the vehicle is powered on and power is supplied to the navigation device 1, the control section 20 executes the computer program stored in the ROM to initialize the navigation device 1 and to realize various functional sections shown in
(D102) Next, a multi screen (D102) will be described. If four seconds have elapsed after the opening screen has been displayed, the image processing functional section 57 generates the multi screen (D102), in which the operation screen for AV and the operation screen for navigation are combined, on the basis of image data of operation buttons stored in the ROM or map data read by the map data processing functional section 54, and displays the multi screen (D102) on the touch panel 21.
As shown in
In this state, if the operation processing functional section 51 detects that the “AV” button is depressed, the image processing functional section 57 carries out transition to the screen display state of the entire AV screen (D104). A case where other buttons are depressed will be described in detail after the description of the entire navigation screen (D103) and the entire AV screen (D104).
(D103) Next, the entire navigation screen (D103) will be described. If the operation processing functional section 51 detects that the “navigation” button displayed on the multi screen (D102) is depressed, the image processing functional section 57 gradually hides the AV region to display the navigation region in the entire screen.
As shown in
(D104) Next, the entire AV screen (D104) will be described. If the operation processing functional section 51 detects that the “AV” button displayed on the multi screen (D102) is depressed, the image processing functional section 57 gradually hides the navigation region to display the AV region in the entire screen.
As shown in
The screen transition (
Next, description will be provided as to a flow of screen transition shown in
In Steps S101 to S103, an operation of a display element, such as a button displayed on the basic screen, is detected, and a pop-up image (superimposed display image) is displayed. If the user operation detection functional section 51a detects the operation of each button displayed in the navigation region of the basic screen displayed on the touch panel 21 (Step S101), the superimposed display image generation functional section 51b specifies a function according to a button whose operation is detected and generates a pop-up image for providing the function (Step S102). A software module which is executed for function provision and data of the generated pop-up image are recorded in the ROM in advance in association with identification information of buttons (display elements) displayed on the basic screen and are specified by using the identification information of a display element operated by the user. The image processing functional section 57 generates an image signal for displaying the generated pop-up image to be superimposed on the basic screen and outputs the image signal to the touch panel 21 (Step S103). Then, a screen in which the pop-up image is superimposingly displayed in front of the basic screen, such as the menu screen (D201), the destination setting screen (D202), or the peripheral facility search screen (D203), is displayed on the touch panel 21. The pop-up image refers to an image which, when a button displayed on the screen is depressed, is displayed in a standing state in front of the basic screen so as to provide a function associated with the button and is, for example, an image in which menu items are displayed.
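Steps S101 to S103 amount to a table lookup keyed by the operated button's identification information. The sketch below is illustrative; the identifiers and pop-up labels are hypothetical stand-ins for the data recorded in the ROM:

```python
# Illustrative dispatch for S101-S103: pop-up data is associated in advance
# with each button's identification information, and the operated button's
# identifier selects the pop-up image to superimpose on the basic screen.

POPUP_TABLE = {
    "menu": "pop-up image 61a (menu selection)",
    "destination": "pop-up image 61b (destination setting)",
    "peripheral": "pop-up image 61c (peripheral facility search)",
}

def on_button_operation(button_id, screen):
    popup = POPUP_TABLE[button_id]   # S102: specify by identification info
    screen.append(popup)             # S103: superimpose in front of the basic screen
    return screen

print(on_button_operation("destination", ["basic screen"]))
```

Keying the table by display-element identifiers means new buttons and pop-ups can be added by recording new entries, without changing the detection or display logic.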
The menu screen (D201) is a screen which includes a pop-up image 61a for providing a menu selection function of carrying out the setting of the navigation device 1 or the like, and is displayed when the user operation detection functional section 51a detects that the “menu” button displayed in the navigation region of the multi screen (D102) or the entire navigation screen (D103) is depressed.
The destination setting screen (D202) is a screen which includes a pop-up image 61b for providing a destination setting function, and is displayed when the user operation detection functional section 51a detects that the “destination” button displayed in the navigation region of the multi screen (D102) or the entire navigation screen (D103) is depressed.
The destination setting screen (D202) includes buttons for destination search, such as a “search by Japanese syllabary” button, a “search by address” button, a “search by mobile connection” button, a “search by history” button, a “search by favorite” button, a “search by telephone number” button, a “search by facility/genre” button, a “search by previous map” button, a “search by map code” button, and a “search by additional data” button. The buttons for destination search are displayed in the pop-up image 61b of the destination setting screen (D202) such that all buttons are listed by a scroll operation.
The peripheral facility search screen (D203) is a screen which includes a pop-up image 61c for providing a peripheral facility search function, and is displayed when the user operation detection functional section 51a detects that a “peripheral” button displayed in the navigation region of the multi screen (D102) or the entire navigation screen (D103) is depressed.
The intersection enlargement screen (D204) is displayed when a destination is set on the above-described destination setting screen (D202) or the peripheral facility search screen (D203) or when a “home” button displayed on the multi screen (D102) or the entire navigation screen (D103) is depressed to set a destination and route guidance by the route guidance functional section 53 starts. The route guidance functional section 53 guides a route on the basis of the host vehicle position measured by the positioning functional section 52 and map data which is read from the memory card by the map data processing functional section 54. The route guidance functional section 53 displays the intersection enlargement screen (D204) when the vehicle draws near an intersection where the vehicle turns right or left and also passes sound data for route guidance to the sound processing functional section 56.
In Steps S104 to S107, an operation of a display element displayed in a pop-up image is detected, and the displayed content of the pop-up image is updated. If the user operation detection functional section 51a detects a touch operation of a display element in a pop-up image, such as a button for destination search displayed on the destination setting screen (D202) (Step S104), the type of the user's operation is determined (Step S105). When the detected user's operation is an operation to close a pop-up image, such as a touch operation of a “return” button displayed in the pop-up image, the pop-up image is closed and the basic screen (in this case, the multi screen (D102) or the entire navigation screen (D103)) is again displayed in front. That is, if the user operation detection functional section 51a detects the touch operation of the “return” button, the superimposed display image generation functional section 51b ends generation of a pop-up image, and the image processing functional section 57 ends superimposed display of a pop-up image on the basic screen. Thereafter, the processing of this flowchart ends.
When the detected user's operation is not an operation to close a pop-up image, the superimposed display image generation functional section 51b generates a pop-up image updated in accordance with the operation, thereby updating the displayed content in the pop-up image (Step S106). Updated data of the pop-up image is recorded in advance in the ROM in association with the identification information of the buttons (display elements) displayed in the pop-up image, and is specified by using the identification information of the display element operated by the user. The image processing functional section 57 generates an image signal for superimposingly displaying the generated pop-up image on the basic screen and outputs the image signal to the touch panel 21 (Step S107). Thus, the display of the pop-up image superimposingly displayed in front of the basic screen is updated.
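Steps S104 to S107 can be sketched as a small handler that classifies the touched display element and either closes the pop-up (bringing the basic screen back to the front) or swaps in updated content looked up by element identifier, analogous to the table the text describes as held in ROM. All identifiers, table entries, and screen names below are illustrative assumptions.

```python
# Update data keyed by the identification information of each display
# element, analogous to the table the text describes as recorded in ROM.
POPUP_UPDATE_TABLE = {
    "search_by_address": "address-input popup",
    "search_by_phone":   "phone-number-input popup",
}

def handle_popup_touch(state, element_id):
    """Steps S104-S107 in miniature: classify the touched element, then
    either close the pop-up or swap in updated content by element id.
    Returns the screen now displayed in front."""
    if element_id == "return":            # operation to close the pop-up
        state["popup"] = None
        return state["basic"]             # basic screen is again in front
    state["popup"] = POPUP_UPDATE_TABLE[element_id]   # Step S106: update
    return state["popup"]                 # Step S107: redisplay in front

state = {"basic": "multi screen (D102)", "popup": "destination setting (D202)"}
```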
If the user operation detection functional section 51a detects that any button for destination setting is depressed, the image processing functional section 57 displays a corresponding screen. For example, the image processing functional section 57 displays a screen for character input if the “search by Japanese syllabary” button is depressed, and displays a screen for selecting a prefecture or the like if the “search by address” button is depressed. It displays a screen for requesting the user to bring a mobile phone close to the infrared ray receiving/emitting section 25 if the “search by mobile connection” button is depressed, displays destinations which have previously been searched for if the “search by history” button is depressed, and displays a screen including a list of favorite destinations registered by a user's operation if the “search by favorite” button is depressed. It further displays a screen for telephone number input if the “search by telephone number” button is depressed, displays a screen for selecting a genre if the “search by facility/genre” button is depressed, displays a screen including the map last displayed if the “search by previous map” button is depressed, displays a screen for map code input if the “search by map code” button is depressed, and displays a screen for selecting a destination from additional data if the “search by additional data” button is depressed. Communication data which is provided from the mobile phone via the infrared ray receiving/emitting section 25 includes positional information, such as the latitude and longitude of a destination, an address, or a telephone number.
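The button-to-screen dispatch described above amounts to a lookup table. The following sketch uses hypothetical identifiers and screen names to illustrate the mapping; none of these strings come from the actual device.

```python
# Hypothetical mapping from each destination-search button to the screen
# it opens, mirroring the dispatch described in the text.
SEARCH_BUTTON_SCREENS = {
    "search_by_japanese_syllabary": "character-input screen",
    "search_by_address":            "prefecture-selection screen",
    "search_by_mobile_connection":  "bring-phone-near-infrared screen",
    "search_by_history":            "previous-destinations screen",
    "search_by_favorite":           "favorites-list screen",
    "search_by_telephone_number":   "telephone-number-input screen",
    "search_by_facility_genre":     "genre-selection screen",
    "search_by_previous_map":       "last-displayed-map screen",
    "search_by_map_code":           "map-code-input screen",
    "search_by_additional_data":    "additional-data-selection screen",
}

def screen_for_button(button_id):
    """Return the screen to display for a depressed search button."""
    return SEARCH_BUTTON_SCREENS[button_id]
```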
If a destination is set on a screen which is displayed when the “search by Japanese syllabary”, “search by address”, “search by mobile connection”, “search by favorite”, “search by telephone number”, “search by facility/genre”, “search by previous map”, “search by map code”, “search by additional data”, or “search by history” button is depressed, the route guidance functional section 53 computes the shortest route from the host vehicle position measured by the positioning functional section 52 to the destination and starts route guidance.
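The text does not state which algorithm the route guidance functional section 53 uses to find the shortest route; a standard choice for shortest routes over a road network is Dijkstra's algorithm, sketched minimally below over an assumed adjacency-list graph representation.

```python
import heapq

def shortest_route(graph, start, goal):
    """Minimal Dijkstra sketch: graph maps node -> [(neighbor, cost)].
    Returns the node sequence of a least-cost route from start to goal,
    or None if the goal is unreachable."""
    queue = [(0.0, start, [start])]     # (accumulated cost, node, path)
    settled = {}
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in settled and settled[node] <= cost:
            continue                    # a cheaper route already reached node
        settled[node] = cost
        for neighbor, edge_cost in graph.get(node, []):
            heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return None
```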
Similarly to the destination setting screen (D202), the pop-up image 61d on the Japanese syllabary search screen (D205) includes the balloon-like relation display image 62, which represents the relation between the “destination” button and the pop-up image 61d. The relation display image 62 continues to indicate the relation with the “destination” button after searching is subsequently carried out and the search result is displayed in the updated pop-up image (not shown). As described above, the relation with the button that called the pop-up image remains displayed even when the displayed content of the pop-up image is updated, making it easy for the user to understand which operation initially caused the currently displayed screen to appear. For example, in the above-described example, even when the destination setting screen (D202) progresses to the Japanese syllabary search screen (D205) and then to a search result screen (not shown), the display of the balloon-like relation display image 62 is maintained, making it possible for the user to intuitively understand that the current operation was started by an operation of the “destination” button. This allows the user to feel comfortable with the interface of the navigation device 1.
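The persistence of the relation display image 62 across content updates can be sketched as pop-up state that carries the identity of its originating button unchanged through every update. Class, field, and method names below are illustrative assumptions, not taken from the source.

```python
class Popup:
    """Pop-up state that keeps the balloon-like relation display tied to
    the button that originally opened it, across content updates."""

    def __init__(self, origin_button, content):
        self.origin_button = origin_button   # e.g. the "destination" button
        self.content = content

    def relation_label(self):
        # The relation display image 62 keeps pointing at the origin button.
        return f"opened from: {self.origin_button}"

    def update(self, new_content):
        # Content changes; the relation to the origin button is preserved.
        return Popup(self.origin_button, new_content)

p = Popup("destination", "destination setting (D202)")
p = p.update("Japanese syllabary search (D205)").update("search results")
```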
Next, the display mode of the AV screen of the navigation device 1 will be described. As shown in
According to the navigation device 1 of this embodiment, each of the pop-up images 61a to 61d, which are displayed when a touch operation of a display element such as a button on the basic screen is detected, includes the relation display image 62, which allows the user to visually grasp the relation between the operated button and the pop-up image and which remains displayed even when the displayed content of the pop-up image is updated. Thus, it is possible for the user to easily understand the relation between a current operation and its starting point, and the continuity from the starting point to the operation. Furthermore, a function called from the basic screen is provided by using a pop-up image while the basic screen remains displayed in the background, which also makes it easy for the user to understand the relation between a current operation and its starting point. Although in this embodiment a pop-up image has been described which is displayed when an operation of a button or the like arranged in the navigation region is received, the invention may also be applied to a pop-up image which is displayed when an operation of a button or the like arranged in the AV region of the multi screen (D102) or the entire AV screen (D104) is received.
Although in this embodiment the content of the initially displayed pop-up image is updated when a user's operation of a display element in the pop-up image is carried out, instead of updating the displayed content, a new pop-up image may be displayed in front of the already displayed pop-up image.
In this way, it becomes possible for the user to understand the operation progress leading to the currently displayed screen while easily understanding the relation between the current operation and its starting point, and the continuity from the starting point to the operation. According to this display method, it is also possible to return from the Japanese syllabary search screen (D206) to the destination setting screen (D202) without using the “return” button. That is, if a touch operation of a tab portion (in
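Both variations just described, namely displaying a new pop-up image in front of the existing one, and returning to an earlier pop-up image by touching its tab portion instead of the “return” button, can be sketched as a stack of pop-up images. Class and method names are illustrative assumptions.

```python
class PopupStack:
    """Sketch of the alternative display method: instead of updating the
    displayed pop-up image, a new pop-up image is pushed in front of it,
    and touching an earlier pop-up's tab portion pops back to it."""

    def __init__(self):
        self._stack = []

    def push(self, popup):
        self._stack.append(popup)        # new pop-up appears in front

    def front(self):
        return self._stack[-1] if self._stack else None

    def pop_to(self, popup):
        """Return to an earlier pop-up (e.g. when its tab portion is
        touched), discarding everything displayed in front of it."""
        while self._stack and self._stack[-1] != popup:
            self._stack.pop()
        return self.front()
```

Popping back to an earlier pop-up this way reproduces the behavior of returning from the Japanese syllabary search screen to the destination setting screen without the “return” button.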
Although in this embodiment, the invention is applied to an in-vehicle navigation device, the invention is not limited thereto. For example, the invention may be applied to a portable navigation device or a portable electronic apparatus having a navigation function, such as a mobile phone.
Although in this embodiment, the invention is applied to a navigation device serving as an information processing device capable of processing navigation information, the invention is not limited thereto. For example, the invention may be applied to a navigation device having a touch panel or an image processing device which is externally connected to or embedded in a portable electronic apparatus and generates a navigation image.
This application is based on Japanese Patent Application No. 2008-234446 filed on Sep. 12, 2008, the content of which is incorporated herein by reference.
Number | Date | Country | Kind |
---|---|---|---|
2008-234446 | Sep 2008 | JP | national |
Filing Document | Filing Date | Country | Kind | 371c Date |
---|---|---|---|---|
PCT/JP2009/065959 | 9/11/2009 | WO | 00 | 3/11/2011 |