This application is based on Japanese Patent Application No. 2011-237156 filed on Oct. 28, 2011, the disclosure of which is incorporated herein by reference.
The present disclosure relates to an in-vehicle display apparatus that displays operation icons on a display panel.
An in-vehicle display apparatus, which is applied to a navigation apparatus, displays one or more icons on a display panel. The icons displayed on the display panel respectively correspond to predetermined operations. The predetermined operations may include setting a destination, registering information of a position, adjusting audio settings, and the like. A touch panel switch is disposed on a surface of the display panel. When an icon displayed on the display panel is touched by a user, a point on the touch panel switch corresponding to a coordinate of the touched icon is activated. Thus, a manipulation of the icon is detected by the touch panel switch.
As disclosed in JP-A-2010-085207, especially in FIG. 18 to FIG. 22 of JP-A-2010-085207, the display panel of the in-vehicle display apparatus is equipped to, for example, an instrument panel of a vehicle, and the icons are arranged evenly on the display panel.
As described above, the display panel of the in-vehicle display apparatus is equipped to the instrument panel of the vehicle. Thus, in a case where a user is seated on a driver seat or on a front passenger seat, when (i) the user intends to manipulate a predetermined icon, and (ii) the predetermined icon is arranged at a distance from the seat where the user is seated, the user needs to extend his or her arm to touch the predetermined icon. Further, when the user changes his or her mind while reaching toward the predetermined icon and intends to manipulate another icon, the user needs to move his or her hand across the front of the display panel.
In view of the foregoing difficulties, it is an object of the present disclosure to provide an in-vehicle display apparatus, which displays icons so that a user can selectively manipulate the icons from a position near the seat where the user is seated.
According to an aspect of the present disclosure, an in-vehicle display apparatus, which controls a display panel equipped to a vehicle to display a plurality of operation icons to be manipulated by a user, includes a finger position detector, a display controller, and a touch panel switch. The finger position detector detects an approach of a finger and a finger approach position. The finger approach position is defined as a point on the display panel and corresponds to a fingertip of the finger. The display controller controls the display panel to display an icon display window, which includes the operation icons. The touch panel switch is disposed on a surface of the display panel. The touch panel switch generates an input signal corresponding to one of the operation icons when detecting that the one of the operation icons is touched by the user. The touch panel switch further transmits the input signal to the display controller. The icon display window has two display modes including a rearrange target display mode, in which the operation icons displayed in the icon display window are to be rearranged, and an adjacence display mode, in which the operation icons are displayed adjacent to the finger approach position in the icon display window. In a case where the icon display window is displayed in the rearrange target display mode and a display control start condition is satisfied, the display controller controls the display panel to display the icon display window in the adjacence display mode when the finger position detector detects the approach of the finger to the display panel. When the display controller receives the input signal from the touch panel switch, the display controller executes a predetermined operation corresponding to the one of the operation icons.
In the above apparatus, in a case where the icon display window is displayed in the rearrange target display mode and the display control start condition is satisfied, the display controller controls the display panel to display the icon display window in the adjacence display mode when the finger position detector detects the approach of the finger to the display panel. Thus, the user can selectively manipulate the operation icons from a position near the seat where the user is seated.
The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description made with reference to the accompanying drawings. In the drawings:
An in-vehicle display apparatus 23 according to a first embodiment of the present disclosure will be described with reference to
As shown in
The controller 2 includes a central processing unit (CPU), a random access memory (RAM), a read only memory (ROM), and an input/output (I/O) bus. The controller 2 executes a control process in order to control the in-vehicle navigation apparatus 1. The position detector 3 includes an acceleration sensor (G-sensor) 3a, a gyroscope 3b, a distance sensor 3c, and a global positioning system (GPS) receiver 3d. The G-sensor 3a, the gyroscope 3b, the distance sensor 3c, and the GPS receiver 3d have detection errors of different characteristics from one another. The controller 2 detects and specifies a present position of a vehicle based on signals detected by the G-sensor 3a, the gyroscope 3b, the distance sensor 3c, and the GPS receiver 3d. The position detector 3 does not necessarily include all of the G-sensor 3a, the gyroscope 3b, the distance sensor 3c, and the GPS receiver 3d. That is, the position detector 3 may selectively include some of the G-sensor 3a, the gyroscope 3b, the distance sensor 3c, and the GPS receiver 3d under a condition that the position detector 3 is able to detect the present position of the vehicle with a predetermined accuracy. Further, the position detector 3 may include a steering sensor, which detects a steering angle of a steering wheel, and a wheel sensor, which detects a rotation number of a wheel.
The map data reader 4 may be equipped with a storage medium such as a CD-ROM, a DVD-ROM, a memory card, or an HDD. The map data reader 4 reads map data and map matching data stored in the storage medium, and transmits the map data and the map matching data to the controller 2. The switch group 5 includes one or more mechanical keys. Some of the mechanical keys are arranged around the display section 9, and some of the mechanical keys are equipped to a steering wheel. When detecting that a user performs an operation by manipulating the switch group 5, the switch group 5 transmits an operation detection signal to the controller 2. The operations performed by the user may include, for example, displaying a menu, setting a destination, searching for a route, starting a route guidance, switching from a present display window to another display window, and performing an audio volume control.
As shown in
The display windows displayed on the display panel 14 further include an icon display window (IDW), which includes one or more icons for performing different operations. The icon display window may be displayed in an entire region of the display panel 14 or in a partial region of the display panel 14. Hereinafter, the icons for performing different operations are referred to as operation icons. The icon display window has two display modes including a rearrange target display mode (RTDM) and an adjacence display mode (ADM). The rearrange target display mode is defined as a display mode in which the operation icons are displayed in a static state and are to be rearranged. In the rearrange target display mode, the operation icons may be arranged evenly in the icon display window, or may be arranged in another manner. The adjacence display mode is defined as a display mode in which the operation icons are displayed adjacent to a finger approach position, which will be described later. Specifically, the adjacence display mode includes a moving state, in which the operation icons are moving from initial display positions toward predetermined display positions, and a static state, in which the operation icons are arranged at the predetermined display positions. Thus, compared with the rearrange target display mode, the operation icons are closer to the finger approach position in the adjacence display mode. In the present disclosure, a thumb is also referred to as a finger for convenience of description, and the finger approach position is defined as a point on the display panel 14 to which a user approaches with a finger. Specifically, the finger approach position corresponds to a fingertip of the finger of the user.
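By way of a non-limiting illustration, the two display modes and the sub-states of the adjacence display mode described above may be modeled as in the following sketch; the identifiers (DisplayMode, AdjacenceState, IconDisplayWindow) are hypothetical names introduced here for explanation and are not part of the disclosure.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class DisplayMode(Enum):
    # Static arrangement whose operation icons are candidates for rearrangement.
    REARRANGE_TARGET = auto()
    # Operation icons move toward, and settle adjacent to, the finger approach position.
    ADJACENCE = auto()


class AdjacenceState(Enum):
    MOVING = auto()   # icons travel from initial positions toward predetermined positions
    STATIC = auto()   # icons have settled adjacent to the finger approach position


@dataclass
class IconDisplayWindow:
    mode: DisplayMode = DisplayMode.REARRANGE_TARGET
    adjacence_state: AdjacenceState = AdjacenceState.MOVING
    # (x, y) coordinates of each operation icon, keyed by icon identifier.
    icon_positions: dict = field(default_factory=dict)
```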
When the user manipulates an operation icon, the touch panel switch 15 generates an input signal corresponding to the manipulated operation icon, and transmits the input signal to the controller 2. The touch panel switch 15 is sensitive to touch, force, or pressure. Thus, the user may manipulate the touch panel switch 15 by touching or pressing the touch panel switch 15 with a finger. As shown in
The first sensor 7 and the second sensor 8 shoot images of a front region of the display panel 14. In an image shot by the first sensor 7 and the second sensor 8, a predetermined imaginary frame is set to define a determination region. Thus, the determination region is included in the front region of the display panel 14. The predetermined imaginary frame is defined by the controller 2 in such a manner that the determination region substantially corresponds to the display panel 14. That is, the front region is broader than the determination region. The finger position detector 16 determines whether a finger approaches the display panel 14 based on the portion of the image defined by the determination region.
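As one possible reading of the determination region, the following sketch checks whether a detected fingertip coordinate falls within an imaginary frame set in the shot image; the frame coordinates and image size are placeholder values, not values taken from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class DeterminationRegion:
    # Imaginary frame (in image coordinates) set so that it substantially
    # corresponds to the outline of the display panel.
    left: int
    top: int
    right: int
    bottom: int

    def contains(self, x: int, y: int) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


# Example: a frame inside a 640 x 480 camera image (placeholder values).
region = DeterminationRegion(left=120, top=80, right=520, bottom=400)
print(region.contains(300, 200))  # True: fingertip inside the determination region
print(region.contains(600, 450))  # False: fingertip outside the determination region
```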
Further, the finger behavior detector stores one or more pieces of image data in order to detect a finger in the image, which is shot by the first sensor 7 and the second sensor 8. The image data include a finger image data and a hand image data. Specifically, the finger image data is data of a finger image that shows one or more fingers, and the hand image data is data of a hand image that shows a hand with one or more fingers pointed. The finger image and the hand image, whose data are used to detect a finger in the image shot by the first sensor 7 and the second sensor 8, each include at least a fingertip. Thus, the finger image data and the hand image data at least include a fingertip data. Hereinafter, the image data, which is used to detect a finger in the image shot by the first sensor 7 and the second sensor 8, is also referred to as finger detect image data. For example, a finger detect image data for detecting a finger shown in
When the finger position detector 16 detects that the image shot of the determination region includes a finger and the data of the image shot is similar to one of the finger detect image data, the controller 2 determines that a finger approaches the display panel 14. When the finger position detector 16 detects that the finger moves in the image shot of the determination region, the controller 2 determines that the finger moves in front of the display panel 14 within the determination region. When the finger position detector 16 detects that the finger stays still in the image shot of the determination region for a predetermined time, the controller 2 determines that the finger stops moving in front of the display panel 14. When the controller 2 detects that a size of the finger in the image shot becomes smaller than a predetermined image size, or the finger disappears from the image shot, the controller 2 determines that the finger has moved away from the display panel 14.
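The determinations described above may be summarized, purely as an assumption-laden sketch, by a small classification routine; the thresholds (match_threshold, min_size, still_frames) are illustrative placeholders.

```python
from enum import Enum, auto


class FingerState(Enum):
    ABSENT = auto()    # no finger detected, or the finger has moved away
    MOVING = auto()    # a finger is present and its position keeps changing
    STOPPED = auto()   # the finger stayed still for the predetermined time


def classify_finger(match_score, finger_size, frames_without_motion,
                    match_threshold=0.8, min_size=40, still_frames=10):
    """Classify finger behavior from one shot of the determination region.

    match_score: similarity between the shot and the finger detect image data.
    finger_size: apparent size of the finger in the shot, in pixels.
    frames_without_motion: consecutive shots in which the fingertip stayed still.
    """
    if match_score < match_threshold or finger_size < min_size:
        # No similar finger image, the finger became too small, or it disappeared.
        return FingerState.ABSENT
    if frames_without_motion >= still_frames:
        return FingerState.STOPPED
    return FingerState.MOVING


# A transition from ABSENT to MOVING corresponds to "a finger approaches the
# display panel"; a transition back to ABSENT corresponds to "moved away".
print(classify_finger(match_score=0.9, finger_size=80, frames_without_motion=0))
```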
The in-vehicle navigation apparatus 1 may further include a speaker 17, a microphone 18, and a remote controller 19. The audio controller 10 outputs an audio guidance, such as a warning alarm and an audio route guidance, from the speaker 17. The speech recognition section 11 is controlled by the controller 2. When the speech recognition section 11 is activated, the speech recognition section 11 recognizes an audio signal transmitted from the microphone 18 based on a speech recognition algorithm executed by the controller 2. When receiving an operation signal from the remote controller 19, the remote control sensor 12 transmits the operation signal to the controller 2, and the controller 2 performs an operation corresponding to the operation signal. The in-vehicle LAN connection section 13 provides an interface to an in-vehicle LAN 20. The in-vehicle LAN connection section 13 receives a speed signal, an accessory (ACC) signal, and a parking brake signal via the in-vehicle LAN 20, and transmits the speed signal, the ACC signal, and the parking brake signal to the controller 2. The speed signal is generated based on a pulse signal, which is output from a speed sensor (not shown) equipped to the vehicle. The ACC signal indicates a state of an ACC switch, and includes an ON state and an OFF state. The parking brake signal indicates whether the vehicle is in a parked state or not. When the vehicle is parked, the parking brake signal is in an ON state; when the vehicle is not parked, that is, traveling, the parking brake signal is in an OFF state.
From a functional point of view, the controller 2 includes a map data acquire section, a map specify section, a route search section, a route guide section, and a drawing section. The map data acquire section acquires map data indicating an area. The map specify section specifies a road including the present position of the vehicle based on the present position of the vehicle and road data included in the map data acquired by the map data acquire section. The route search section searches for a guidance route from the present position of the vehicle to a destination set by the user. The route guide section performs a route guidance by calculating one or more necessary positions to go through based on the guidance route, the road data included in the map data, and one or more position data of one or more intersections included in the map data. The drawing section generates a guidance map around the present position of the vehicle. The guidance map includes a simplified view of a highway, an enlarged view of an intersection, and the like.
As described above, the in-vehicle display apparatus 23 includes the controller 2, the display section 9 having the display panel 14 and the touch panel switch 15, and the finger position detector 16 having the first sensor 7, the second sensor 8, and the finger behavior detector.
The controller 2 further provides a display controller and a learning section, which calculates a manipulation frequency. The following will describe a control process executed by the controller 2 in order to function as the display controller and the learning section with reference to
Each of the operation icons Ia to Ih schematically indicates a corresponding operation. A mark of each operation icon may be a character icon (not shown) instead of a symbol icon, which is shown in
At step S1, when the controller 2 determines that the icon display window is displayed in the rearrange target display mode, the process proceeds to step S2. At step S2, the controller 2 determines whether the vehicle is in the parked state based on the parking brake signal. For example, the controller 2 may determine that the vehicle is in the parked state when the parking brake signal is in the ON state. Further, for example, the controller 2 may determine that the vehicle is in the parked state when a vehicle speed calculated from the speed signal is zero.
When the controller 2 determines that the vehicle is in the parked state, the process proceeds to step S3. At step S3, the controller 2 determines whether a finger approaches the display panel 14 based on a detection result of the finger position detector 16. When the controller 2 determines that the finger approaches the display panel 14, the process proceeds to step S4. At step S4, as shown in
At step S6, the display controller controls the display panel 14 to display the icon display window in such a manner that the operation icons Ia to Ih are displayed at the corresponding display positions calculated at step S5. That is, the operation icons Ia to Ih are moved toward the finger approach position P1 so that the operation icons are displayed at the calculated display positions.
The display position of each of the operation icons Ia to Ih is calculated in such a manner that a display position of one operation icon does not overlap with a display position of another operation icon. The following will describe an exemplary method of calculating the display positions of the operation icons Ia to Ih. A display position of a first operation icon, which is defined as an operation icon to be moved first, is calculated in such a manner that the display position of the first operation icon overlaps with the finger approach position P1. Then, a display position of a second operation icon, which is defined as an operation icon to be moved second, is calculated based on the display position of the first operation icon and a size of the first operation icon. Display positions of the other operation icons are calculated in a similar way. Alternatively, the display positions of the operation icons Ia to Ih may be calculated by another method. For example, when the icon display window displayed on the display panel 14 is a cell-based icon display window, that is, one cell corresponds to one operation icon, the cells in which the operation icons Ia to Ih are to be arranged may be determined by the finger approach position P1. The display positions of the operation icons Ia to Ih may also be calculated by a method other than the above-described methods.
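A minimal sketch of the exemplary placement, assuming square operation icons of equal size and a finger approach position P1 far enough from the panel edges that all cells fit on the panel, is given below; the function name and numeric values are illustrative only.

```python
def arrange_icons(icon_ids, p1, icon_size=64):
    """Place icons near point p1 = (x, y) so that no two icons overlap.

    The first icon is centered on p1; the remaining icons occupy grid cells
    around it, ordered by distance from p1, which keeps them adjacent to p1.
    """
    offsets = sorted(
        ((dx, dy) for dx in range(-3, 4) for dy in range(-3, 4)),
        key=lambda o: o[0] ** 2 + o[1] ** 2,
    )
    return {
        icon_id: (p1[0] + dx * icon_size, p1[1] + dy * icon_size)
        for icon_id, (dx, dy) in zip(icon_ids, offsets)
    }


# Icons Ia to Ih gather around an assumed finger approach position of (400, 240).
print(arrange_icons(["Ia", "Ib", "Ic", "Id", "Ie", "If", "Ig", "Ih"], (400, 240)))
```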
During the moving of the operation icons Ia to Ih, the operation icons Ia to Ih may be displayed in the icon display window with original sizes and original color strengths. Alternatively, during the moving of the operation icons Ia to Ih, the operation icons Ia to Ih may be displayed in a fade-out and fade-in manner. Specifically, the operation icons Ia to Ih gradually fade out at original display positions when the moving starts. After the moving is finished, the operation icons Ia to Ih gradually fade in to have the original sizes and the original color strengths at the calculated new display positions. Further, during the moving of the operation icons Ia to Ih, when one operation icon touches another operation icon, the operation icons Ia to Ih may be displayed in such a manner that the one operation icon bumps against the other operation icon. The operation icons Ia to Ih may also be displayed in a manner other than the above-described manners during the moving. Then, the process proceeds to step S7.
At step S7, the controller 2 determines whether the finger stops moving based on the detection result of the finger position detector 16. That is, the controller 2 determines whether the finger approach position P1 changes on the display panel 14. When the controller 2 determines that the finger continues moving without stopping, the process returns to step S3, and the controller 2 determines again whether the finger approaches the display panel 14 based on the detection result of the finger position detector 16. At step S7, when the controller 2 determines that the finger stops moving, the process proceeds to step S8. At step S8, the display controller locks the icon display window, which is displayed in the adjacence display mode. Specifically, when the finger stops moving at a first moment during the approach of the finger to the display panel 14 and the icon display window is displayed in the moving state of the adjacence display mode at the first moment, the icon display window is locked in the moving state, which is displayed at the first moment. That is, when the display controller locks the icon display window at the first moment, the operation icons stop moving at certain positions between the initial display positions and the predetermined display positions adjacent to the finger approach position. Further, when the finger stops moving at a second moment during the approach of the finger to the display panel 14 and the icon display window is displayed in the static state of the adjacence display mode at the second moment, the icon display window is locked in the static state of the adjacence display mode, which is displayed at the second moment. Specifically, the operation icons are displayed at the predetermined display positions adjacent to the finger approach position. Then, the process proceeds to step S9. At step S9, the controller 2 stands by for a predetermined time, which is defined as a stand-by time. During the stand-by time, the controller 2 detects whether the user manipulates one of the operation icons Ia to Ih.
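The locking at step S8, the stand-by at step S9, and the later unlocking at step S14 may be sketched as follows; the window object, the poll_touch callback, and the stand-by time value are assumptions introduced for illustration.

```python
import time


def lock_and_wait(window, poll_touch, stand_by_time=3.0, poll_interval=0.05):
    """Freeze the adjacence display and stand by for a touch (steps S8, S9, S14).

    window is any object with a boolean `locked` attribute; poll_touch() is
    assumed to return the identifier of a touched icon, or None. The
    stand_by_time value is a placeholder for the predetermined stand-by time.
    """
    window.locked = True                      # step S8: lock the current state
    deadline = time.monotonic() + stand_by_time
    while time.monotonic() < deadline:        # step S9: stand by for a touch
        touched = poll_touch()
        if touched is not None:
            return touched                    # hand over to the customize step
        time.sleep(poll_interval)
    window.locked = False                     # step S14: unlock after the stand-by time
    return None
```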
When the predetermined stand-by time elapses, at step S14, the display controller unlocks the icon display window, which is displayed in the adjacence display mode. Thus, the elapse of the predetermined stand-by time is also referred to as an unlock condition. Specifically, when the predetermined stand-by time elapses, the icon display window displayed in the adjacence display mode is unlocked. Then, the process proceeds to step S15. At step S15, the controller 2 detects whether the finger has moved away from the front region of the display panel 14 based on the detection result of the finger position detector 16. At step S15, when the controller 2 determines that the finger has moved away from the front region of the display panel 14, the process proceeds to step S16. At step S16, the display controller resets the operation icons Ia to Ih so that the operation icons Ia to Ih are displayed at the original display positions. That is, the display controller controls the display panel 14 to display a pre-moving icon display window (PRE-MOVE IDW), which is defined as the icon display window before the moving of the operation icons Ia to Ih. At step S15, when the controller 2 determines that the finger remains in the front region of the display panel 14, the process returns to step S3. That is, a reset trigger of the icon display window includes a moving away of the finger from the front region of the display panel 14. Further, the reset trigger of the icon display window may include a double touch or a long touch on a predetermined portion of the display panel 14. The predetermined portion of the display panel 14 is a portion where the operation icons Ia to Ih are not arranged after the moving. The reset trigger is defined as a condition under which the operation icons Ia to Ih are reset so that the operation icons Ia to Ih are displayed at the original display positions.
At step S10, when the one of the operation icons Ia to Ih is manipulated, the process proceeds to step S11. Hereinafter, the one of the operation icons manipulated by the user is referred to as the manipulated operation icon. At step S11, the controller 2 performs a customize process. The customize process is a process for setting a display position of the manipulated operation icon. For example, the customize process may lock the manipulated operation icon at the present display position corresponding to the finger approach position P1, or may move the manipulated operation icon to a new display position different from the present display position. The following will describe an example of the customize process. In a case where the manipulated operation icon is pressed for a predetermined time at the present display position, the manipulated operation icon is locked at the present display position. Then, the present display position of the manipulated operation icon is recorded as a customized display position. In a case where the finger moves on the display panel 14 while pressing the manipulated operation icon, the manipulated operation icon moves according to the movement of the finger. After the manipulated operation icon is moved to a new display position on the display panel 14, the new display position is recorded as the customized display position. Further, in this case, the manipulated operation icon is also referred to as a customized operation icon after the customized display position is recorded. When the customized operation icon is displayed in a next rearrange target display mode, the customized operation icon is displayed at the customized display position on the display panel 14. Further, when the manipulated operation icon is pressed for a short time less than the predetermined time, the customize process is not executed, and the process proceeds to step S12.
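A hedged sketch of the customize process described above is shown below; the long-press threshold, the argument names, and the storage of customized positions are illustrative assumptions.

```python
LONG_PRESS_TIME = 1.0  # placeholder for the "predetermined time" of a long press

customized_positions = {}  # icon id -> display position used in later windows


def customize(icon_id, press_duration, drag_end=None, present_pos=(0, 0)):
    """Record a customized display position for a manipulated operation icon."""
    if drag_end is not None:
        # The finger moved while pressing the icon: the icon follows the
        # finger and its new position becomes the customized position.
        customized_positions[icon_id] = drag_end
        return True
    if press_duration >= LONG_PRESS_TIME:
        # Long press: lock the icon at its present display position.
        customized_positions[icon_id] = present_pos
        return True
    # Short press: no customization; continue with the normal operation.
    return False
```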
At step S12, the learning section records a type of the manipulated operation icon and the number of times the manipulated operation icon has been manipulated, and calculates a manipulation frequency of the manipulated operation icon. Specifically, the manipulation frequency is defined as a ratio of the total number of manipulations of the manipulated operation icon to the total number of manipulations of all of the operation icons Ia to Ih. Further, when an operation icon has a large total number of manipulations, the learning section may determine that the operation icon has a high manipulation frequency, and when an operation icon has a small total number of manipulations, the learning section may determine that the operation icon has a low manipulation frequency.
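The learning section's frequency calculation may be sketched as follows, where the counter and helper names are illustrative; the frequency is simply the ratio of one icon's manipulation count to the count of all icons.

```python
from collections import Counter

manipulation_counts = Counter()


def record_manipulation(icon_id):
    manipulation_counts[icon_id] += 1


def manipulation_frequency(icon_id):
    """Ratio of this icon's manipulations to the manipulations of all icons."""
    total = sum(manipulation_counts.values())
    return manipulation_counts[icon_id] / total if total else 0.0


record_manipulation("Ia")
record_manipulation("Ia")
record_manipulation("Ic")
print(manipulation_frequency("Ia"))  # 2 / 3, approximately 0.67
```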
At step S13, the controller 2 displays a display window corresponding to the manipulated operation icon on the display panel 14, and the controller 2 executes a predetermined operation corresponding to the manipulated operation icon. In this case, the predetermined operation corresponding to the manipulated operation icon may be a switchover to a subordinate display window. For example, when the manipulated operation icon is an air conditioner icon, a display window for setting the air conditioner is displayed on the display panel 14.
When the manipulation frequency of an operation icon has been calculated at least one time, at step S5, the controller 2 calculates the display position of the operation icon with respect to the finger approach position P1 based on the manipulation frequency. That is, an operation icon having a higher manipulation frequency is arranged nearer to the finger approach position P1. When two operation icons have the same manipulation frequency, the one whose original display position is closer to the finger approach position P1 is arranged nearer to the finger approach position P1. Thus, at step S6, the operation icon having the highest manipulation frequency is arranged nearest to the finger approach position P1.
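The ordering used at step S5 may be sketched as a sort with a tie-break on distance, as below; all names and coordinates are illustrative, and the icon whose key sorts first would be assigned the display position nearest to P1.

```python
import math


def order_icons_for_adjacence(icons, p1, frequency):
    """Return icon ids sorted so the first entry gets the position nearest P1.

    `icons` maps icon id -> original (x, y) position; `frequency` maps
    icon id -> manipulation frequency. A higher frequency wins; equal
    frequencies are broken by the original distance to P1.
    """
    def key(icon_id):
        x, y = icons[icon_id]
        distance = math.hypot(x - p1[0], y - p1[1])
        return (-frequency.get(icon_id, 0.0), distance)

    return sorted(icons, key=key)


icons = {"Ia": (40, 40), "Ib": (200, 40), "Ic": (360, 40)}
freq = {"Ia": 0.5, "Ib": 0.25, "Ic": 0.25}
print(order_icons_for_adjacence(icons, (380, 200), freq))  # ['Ia', 'Ic', 'Ib']
```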
In the present embodiment, when the predetermined stand-by time elapses (step S9: “YES”), the operation icons Ia to Ih are unlocked at step S14. Then, at step S15, the controller 2 detects whether the finger has moved away from the front region of the display panel 14. When detecting that the finger has moved away from the front region of the display panel 14 (step S15: “YES”), the process proceeds to step S16 to reset the operation icons Ia to Ih at the original display positions. That is, the pre-moving icon display window is displayed on the display panel 14 at step S16. When detecting that the finger remains in the front region of the display panel 14 (step S15: “NO”), the process returns to step S3.
With this configuration, under conditions that (i) the icon display window is displayed in the rearrange target display mode, and (ii) the vehicle is in the parked state, when the user approaches the display panel 14 with the finger, the finger position detector 16 detects the approach of the finger and the finger approach position P1 on the display panel 14. Then, the display controller controls the display panel 14 to display the icon display window in the adjacence display mode in which the operation icons Ia to Ih are arranged adjacent to the finger approach position P1. Here, the display control start condition is defined as the vehicle being in the parked state.
With the above-described configuration, the operation icons Ia to Ih move adjacent to the finger approach position P1. Thus, the user can selectively manipulate one of the operation icons Ia to Ih from a position near the seat. Further, since the display control start condition is defined as the parked state of the vehicle, the display control is performed only while the vehicle is in the parked state. Thus, the user can concentrate on a manipulation of the operation icons Ia to Ih.
Further, in the present embodiment, the operation icons Ia to Ih are displayed adjacent to the finger approach position P1, and an operation icon having a higher manipulation frequency is arranged nearer to the finger approach position P1. With this configuration, the operation icon having the highest manipulation frequency is arranged nearest to the finger approach position P1. Thus, the user can manipulate the operation icons Ia to Ih with less movement. Further, the learning section calculates the manipulation frequencies of the operation icons Ia to Ih. Thus, a list of the operation icons Ia to Ih ordered by manipulation frequency from high to low is generated and set by the learning section.
Further, in the present embodiment, when the finger position detector 16 detects that the finger stops moving before manipulating an operation icon on the display panel 14, the controller 2 locks the icon display window, which is displayed in the adjacence display mode. Specifically, when the finger stops moving while the icon display window is displayed in the moving state of the adjacence display mode, the icon display window is locked in the moving state of the adjacence display mode. Further, when the finger stops moving while the icon display window is displayed in the static state of the adjacence display mode, the icon display window is locked in the static state of the adjacence display mode. The operation icons Ia to Ih remain locked until the unlock condition is satisfied. With this configuration, when the user stops moving the finger, the icon display window is locked in the adjacence display mode. Thus, the user can easily find an operation icon and manipulate the operation icon.
A second embodiment of the present disclosure will be described with reference to
A third embodiment of the present disclosure will be described with reference to
A fourth embodiment of the present disclosure will be described with reference to
A process executed to perform the icon separation display control is shown in
At step T4, the operation icons Ia to Ih are displayed at the corresponding display positions, which are spaced apart from the finger approach position P1. That is, the icon display window is displayed in the separation display mode. An example of the separation display mode is shown in
At step T5, when determining that the finger has moved away from the display panel 14, the process proceeds to step T6. At step T6, the controller 2 resets the operation icons Ia to Ih so that the operation icons Ia to Ih are displayed at the original display positions before the moving. According to the fourth embodiment, when the finger position detector 16 detects that the finger approaches the display panel 14, the controller 2 displays the icon display window in the separation display mode so that the operation icons Ia to Ih are displayed apart from the finger approach position P1. Thus, the operation icons Ia to Ih are difficult to manipulate during the traveling of the vehicle.
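A rough sketch of the separation display mode, under the assumption that the icons are simply gathered along the panel edge farthest from P1, is shown below; the panel and icon dimensions are placeholders.

```python
def separate_icons(icons, p1, panel_w=800, panel_h=480, icon_size=64):
    """Move each icon toward the panel corner farthest from the finger approach
    position P1, laying them out in a row from that corner (illustrative)."""
    far_x = 0 if p1[0] > panel_w / 2 else panel_w - icon_size
    far_y = 0 if p1[1] > panel_h / 2 else panel_h - icon_size
    positions = {}
    for i, icon_id in enumerate(icons):
        # Step away from the far corner so the icons do not overlap.
        step = icon_size if far_x == 0 else -icon_size
        positions[icon_id] = (far_x + i * step, far_y)
    return positions


# With the finger approaching near the right edge, icons gather at the left.
print(separate_icons(["Ia", "Ib", "Ic"], p1=(700, 240)))
```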
A fifth embodiment of the present disclosure will be described with reference to
According to the fifth embodiment, when displaying the limited operation icons and the unlimited operation icons, the limited operation icons are displayed apart from the finger approach position P1 so that the limited operation icons are difficult to manipulate, and the unlimited operation icons are displayed adjacent to the finger approach position P1 so that the unlimited operation icons are easy to manipulate.
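Combining the two placements may be sketched as follows, reusing the illustrative arrange_icons() and separate_icons() helpers defined in the earlier sketches; the grouping of icons into limited and unlimited sets is purely an example.

```python
def arrange_mixed_icons(limited, unlimited, p1):
    """Limited icons are pushed away from P1; unlimited icons gather around it.

    Assumes the arrange_icons() and separate_icons() sketches above are
    available in the same module.
    """
    positions = separate_icons(limited, p1)
    positions.update(arrange_icons(unlimited, p1))
    return positions


# During traveling, some icons might be limited while others remain
# unlimited (purely illustrative grouping).
print(arrange_mixed_icons(limited=["Ia", "Ib"], unlimited=["Ig", "Ih"], p1=(700, 240)))
```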
A sixth embodiment of the present disclosure will be described with reference to
As shown in
When the finger position detector 16 detects that the finger stops moving at step S7, the controller 2 locks the icon display window displayed in the adjacence display mode at step S8. Then, at step S10, the controller 2 determines whether one of the operation icons Ia to Ih is manipulated by touching.
At step S10, when the controller 2 determines that one of the operation icons Ia to Ih is manipulated, the process proceeds to step Sc. At step Sc, the controller 2 determines whether the child mode is activated. When determining that the child mode is activated, the process proceeds to step Sd without execution of step S11, step S12, and step S13. At step Sd, the controller 2 switches the display mode of the icon display window from the adjacence display mode to the child display mode. That is, the controller 2 executes a child mode display control. Then, step S14 is executed. In the child display mode, when one of the operation icons Ia to Ih is manipulated, a color of the manipulated operation icon is changed. Alternatively, the manipulated operation icon may be displayed in a blinking manner, or a size of the manipulated operation icon may be increased. The manipulated operation icon may also be displayed in a manner other than the above-described manners.
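The branch at step Sc and the child mode display control at step Sd may be sketched as below; the callback names are assumptions introduced for illustration only.

```python
def on_icon_touched(icon, child_mode_active, execute_operation, highlight_icon):
    """Step Sc / Sd sketch: in child mode, only the icon's appearance changes."""
    if child_mode_active:
        # Child display mode: change color, blink, or enlarge the icon,
        # but do not perform the operation associated with it.
        highlight_icon(icon)
        return None
    # Normal behavior (first embodiment): run the icon's operation.
    return execute_operation(icon)
```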
When determining that the child mode is deactivated at step Sc, the process proceeds to step S11, similarly to the first embodiment.
According to the sixth embodiment, the in-vehicle display apparatus 23 includes the child mode setting section 22, which activates and deactivates the child mode. In a case where the child mode is activated, when the finger position detector 16 detects the approach of the finger, the controller 2 controls the display panel 14 to display the icon display window in the adjacence display mode by execution of step S1 to step S10. Further, when the controller 2 receives the input signal from the touch panel switch 15, the controller 2 displays the icon display window in the child display mode at step Sd without execution of step S11 to step S13.
With this configuration, when a child approaches the display panel 14 with a finger and moves the finger in the front region of the display panel 14, the operation icons Ia to Ih are moved according to the movement of the finger. Thus, the child can play with the display panel 14. Further, even when an operation icon is manipulated by touching the touch panel switch 15, the operation corresponding to the operation icon is not performed. Instead, the operation icon is displayed in the child display mode. That is, when one of the operation icons Ia to Ih is manipulated, the icon display window for the child is displayed. Thus, the child can play with the display panel 14 without performing an actual operation corresponding to the manipulated operation icon.
In the present embodiment, the child mode switch, which provides the child mode setting section 22, is equipped to the instrument panel 21 in order to activate and deactivate the child mode. Alternatively, a predetermined switch equipped to the switch group 5 or the remote controller 19 may provide the child mode setting section 22. In this case, the predetermined switch activates and deactivates the child mode by a predetermined manipulation such as a long press. Further, predetermined plural switches equipped to the switch group 5 or the remote controller 19 may provide the child mode setting section 22. In this case, the plural switches activate and deactivate the child mode by a predetermined manipulation such as pressing the plural switches at one time.

Further, when displaying the icon display window in the adjacence display mode, one or more imaginary concentric circles may be defined around the finger approach position P1. As described above, the manipulation frequency may be set to have three levels including the high level, the medium level, and the low level. In this case, an operation icon included in the high level may be arranged on the concentric circle closest to the finger approach position P1, an operation icon included in the medium level may be arranged on the second closest concentric circle, and an operation icon included in the low level may be arranged on the third closest concentric circle. In the present disclosure, the manipulation frequency is set to have three levels. Alternatively, the manipulation frequency may be set to have two, four, or more than four levels.
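The concentric circle arrangement described above may be sketched as follows; the radii assigned to the three levels and the even angular spacing are illustrative assumptions.

```python
import math

LEVEL_RADIUS = {"high": 70, "medium": 140, "low": 210}  # placeholder radii in pixels


def place_on_circles(icon_levels, p1):
    """Spread the icons of each level evenly on the circle assigned to that level."""
    positions = {}
    for level, radius in LEVEL_RADIUS.items():
        ids = [i for i, lv in icon_levels.items() if lv == level]
        for k, icon_id in enumerate(ids):
            angle = 2 * math.pi * k / max(len(ids), 1)
            positions[icon_id] = (p1[0] + radius * math.cos(angle),
                                  p1[1] + radius * math.sin(angle))
    return positions


levels = {"Ia": "high", "Ib": "high", "Ic": "medium", "Id": "low"}
print(place_on_circles(levels, p1=(400, 240)))
```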
Further, the first sensor 7 and the second sensor 8 may be arranged on an upper side and a lower side of the display panel 14, respectively. Further, in the rearrange target display mode, the operation icons Ia to Ih may be displayed in a predetermined manner, which is set by the user, other than the equally arranged manner.
While only the selected exemplary embodiments have been chosen to illustrate the present disclosure, it will be apparent to those skilled in the art from this disclosure that various changes and modifications can be made therein without departing from the scope of the disclosure as defined in the appended claims. Furthermore, the foregoing description of the exemplary embodiments according to the present disclosure is provided for illustration only, and not for the purpose of limiting the disclosure as defined by the appended claims and their equivalents.
Number | Date | Country | Kind
---|---|---|---
2011-237156 | Oct. 28, 2011 | JP | national