This application claims priority to Japanese Patent Applications No. 2018-121297 and No. 2018-121298 filed on Jun. 26, 2018, the entire contents of which are incorporated herein by reference.
The present disclosure relates to an electronic apparatus, a mobile body, a program, and a control method.
Apparatuses that assist a user in driving a vehicle are conventionally known. For example, PTL 1 set forth below discloses a car navigation system installed in a vehicle. The car navigation system disclosed in PTL 1 assists the driving of a vehicle by displaying, on a display, information regarding, for example, a travel route to a destination.
An electronic apparatus according to an embodiment includes a sensor and a controller. The sensor is configured to detect a gesture made without contacting the electronic apparatus. The controller is configured to, when the gesture made without contacting the electronic apparatus is detected while a first screen and a second screen are displayed on a display screen of a display, locate a position of a driver based on a direction of a first detected gesture and change the display screen to a display screen corresponding to the position of the driver.
An electronic apparatus according to an embodiment includes a sensor and a controller. The sensor is configured to detect a gesture made without contacting the electronic apparatus. The controller is configured to, when the gesture made without contacting the electronic apparatus is detected while an icon is displayed on a display screen of a display, locate a position of a driver based on a direction of a first detected gesture and shift the icon to a position near the driver.
A mobile body according to an embodiment includes the electronic apparatus described above.
A mobile body according to an embodiment is communicatively connected to the electronic apparatus described above.
A program according to an embodiment is a program for controlling an electronic apparatus that includes a sensor configured to detect a gesture made without contacting the electronic apparatus and also includes a controller. The program causes the controller to perform a step of, when the gesture made without contacting the electronic apparatus is detected while a first screen and a second screen are displayed on a display screen of a display, locating a position of a driver based on a direction of a first detected gesture and changing the display screen to a display screen corresponding to the position of the driver.
A program according to an embodiment is a program for controlling an electronic apparatus that includes a sensor configured to detect a gesture made without contacting the electronic apparatus and also includes a controller. The program causes the controller to perform a step of, when the gesture made without contacting the electronic apparatus is detected while an icon is displayed on a display screen of a display, locating a position of a driver based on a direction of a first detected gesture and shifting the icon to a position near the driver.
A control method according to an embodiment is a control method of an electronic apparatus that includes a sensor configured to detect a gesture made without contacting the electronic apparatus and also includes a controller. The control method includes a step of, when the gesture made without contacting the electronic apparatus is detected while a first screen and a second screen are displayed on a display screen of a display, locating a position of a driver based on a direction of a first detected gesture and changing the display screen to a display screen corresponding to the position of the driver.
A control method according to an embodiment is a control method of an electronic apparatus that includes a sensor configured to detect a gesture made without contacting the electronic apparatus and also includes a controller. The control method includes a step of, when the gesture made without contacting the electronic apparatus is detected while an icon is displayed on a display screen of a display, locating a position of a driver based on a direction of a first detected gesture and shifting the icon to a position near the driver.
A user of the car navigation system disclosed in PTL 1 performs a touch input on a display to perform an input operation. However, from the viewpoint of safe driving of a vehicle, it is preferable that a driver avoid performing touch inputs while driving. In view of such circumstances, an object of the present disclosure is to provide an electronic apparatus, a mobile body, a program, and a control method that can improve the driving safety of a mobile body. According to an embodiment, an electronic apparatus, a mobile body, a program, and a control method that can improve the driving safety of a mobile body can be provided.
(Configuration of Electronic Apparatus)
As illustrated in the figure, the electronic apparatus 1 includes a controller 11, a timer 12, a camera 13, a display 14, a microphone 15, a storage 16, a communication interface 17, a proximity sensor 18, a UV sensor 19, an illuminance sensor 20, an acceleration sensor 21, a geomagnetic sensor 22, an atmospheric pressure sensor 23, a gyro sensor 24, and a speaker 25.
The electronic apparatus 1 may be realized by various apparatuses used for driving or steering a mobile body. The mobile body may be any movable apparatus and may be one that a user can board. The mobile body as used herein encompasses vehicles, ships, and aircraft. Vehicles may include, for example, electric vehicles, hybrid electric vehicles, gasoline vehicles, motorcycles, bicycles, welfare vehicles, or the like. Vehicles may also include, for example, railway vehicles. The mobile body may be driven or steered by a user. At least a part of a user operation associated with driving or steering the mobile body may be automated. The mobile body may also be able to move autonomously without a user operation. In the following description, the mobile body is assumed to be an automobile driven by a user.
In a case in which the mobile body is an automobile, the electronic apparatus 1 may be realized by an in-vehicle apparatus such as a car navigation system installed in the automobile. The electronic apparatus 1 may be realized by, for example, a mobile phone terminal, a phablet, a tablet PC (Personal Computer), a smartphone, a feature phone, or the like. In this case, the electronic apparatus 1 may be communicatively connected in a wired or wireless manner with a system installed in the automobile to be driven by a user. For example, the electronic apparatus 1 may be realized by a smartphone and communicatively connected to the system installed in the vehicle via Bluetooth® (Bluetooth is a registered trademark in Japan, other countries, or both). The electronic apparatus 1 is not limited to the above examples and may be realized by any apparatus used in driving or steering a mobile body. The electronic apparatus 1 may be realized by, for example, a PDA (Personal Digital Assistant), a remote control terminal, a portable music player, a game machine, an electronic book reader, a home electric appliance, an industrial device (FA device), or the like. In the following description, the electronic apparatus 1 is assumed to be realized by a car navigation system installed in an automobile.
The timer 12 receives a timer operation instruction from the controller 11 and, when a predetermined time has elapsed, outputs a signal to that effect to the controller 11. The timer 12 may be provided independently of the controller 11, as illustrated in the figure.
The camera 13 images a subject around the electronic apparatus 1. The camera 13 is provided on, for example, a surface of the electronic apparatus 1 on which the display 14 is provided.
The display 14 displays a screen. The screen includes at least one of, for example, a character, an image, a symbol, a figure, and the like. The display 14 may be a liquid crystal display, an organic EL (Electroluminescence) panel, an inorganic EL panel, or the like. In the present embodiment, the display 14 is a touch panel display (a touch screen display). The touch panel display detects a contact made by a finger or a stylus pen and locates the contact position. The display 14 can simultaneously detect a plurality of positions contacted by fingers, stylus pens, or the like.
The microphone 15 detects a sound around the electronic apparatus 1, including a person's voice.
The storage 16 serves as a memory and stores a program and data. The storage 16 temporarily stores a processing result by the controller 11. The storage 16 may include any storage device such as a semiconductor storage device or a magnetic storage device. The storage 16 may include multiple types of storage devices. The storage 16 may include a combination of a portable storage medium such as a memory card and a reading device for the storage medium.
The program stored in the storage 16 includes an application to be executed in the foreground or background and a control program that supports an operation of the application. The application causes the controller 11 to execute, for example, an operation corresponding to a gesture. The control program is, for example, an OS (Operating System). The application and the control program may be installed in the storage 16 via communication performed by the communication interface 17 or a storage medium.
The communication interface 17 is an interface for wired or wireless communication. The communication method employed by the communication interface 17 according to the embodiment conforms to a wireless communication standard. For example, wireless communication standards include cellular phone communication standards such as 2G, 3G, and 4G. Cellular phone communication standards include, for example, LTE (Long Term Evolution), W-CDMA (Wideband Code Division Multiple Access), CDMA2000, PDC (Personal Digital Cellular), GSM® (Global System for Mobile communications: GSM is a registered trademark in Japan, other countries, or both), and PHS (Personal Handy-phone System). Wireless communication standards also include, for example, WiMAX (Worldwide Interoperability for Microwave Access), IEEE 802.11, Bluetooth® (Bluetooth is a registered trademark in Japan, other countries, or both), IrDA (Infrared Data Association), and NFC (Near Field Communication). The communication interface 17 can support one or more of the above communication standards.
The speaker 25 outputs a sound. The speaker 25 outputs, for example, a voice that guides a route to an input destination of the automobile. In a case in which the electronic apparatus 1 is realized by an apparatus capable of making a phone call, the speaker 25 outputs, for example, the voice of the other party during a phone call. Further, for example, when news or a weather forecast is read aloud, the speaker 25 outputs the contents as a voice.
The proximity sensor 18 detects a relative distance to an object around the electronic apparatus 1 and a moving direction of the object, in a non-contact manner. In the present embodiment, the proximity sensor 18 includes one light-source infrared LED (Light Emitting Diode) and four infrared photodiodes. The proximity sensor 18 emits infrared light toward the object from the light-source infrared LED and receives the light reflected from the object as incident light on the infrared photodiodes. The proximity sensor 18 can then measure the relative distance to the object based on the output currents of the infrared photodiodes. Further, the proximity sensor 18 detects the moving direction of the object based on the time difference with which the light reflected from the object reaches each infrared photodiode. Thus, the proximity sensor 18 can detect an operation using an air gesture (hereinafter simply referred to as a "gesture") performed by the user of the electronic apparatus 1 without contacting the electronic apparatus 1. Here, the proximity sensor 18 may include a visible light photodiode.
The controller 11 is a processor such as, for example, a CPU (Central Processing Unit). The controller 11 may be an integrated circuit such as a SoC (System-on-a-Chip) in which other elements are integrated. The controller 11 may be configured by combining a plurality of integrated circuits. The controller 11 centrally controls an operation of the electronic apparatus 1 to realize various functions.
In a case in which the electronic apparatus 1 is realized by a car navigation system installed in an automobile, the controller 11 included in the electronic apparatus 1 may be, for example, an ECU (Electronic Control Unit or Engine Control Unit) provided to the automobile.
The controller 11 refers to the data stored in the storage 16, as necessary. The controller 11 realizes various functions by executing commands included in a program stored in the storage 16 and controlling other functional units such as the display 14. For example, the controller 11 acquires information regarding a user's gesture detected by the proximity sensor 18. For example, when the automobile is stationary and the electronic apparatus 1 can be operated via the touch panel, the controller 11 acquires user contact data from the touch panel. Further, for example, the controller 11 acquires information detected by a sensor other than the proximity sensor 18.
In the present embodiment, the controller 11 further has a function as a display driver that controls display on the display 14. That is, in the present embodiment, the controller 11 can directly control the display 14 to display an image.
In another embodiment, a display driver may be provided independently of the controller 11. In this case, the controller 11 may cause the display 14 to display an image via the display driver.
The UV sensor 19 can measure the amount of ultraviolet light contained in sunlight or the like.
The illuminance sensor 20 detects an illuminance of ambient light incident on the illuminance sensor 20.
The acceleration sensor 21 detects a direction and a magnitude of acceleration acting on the electronic apparatus 1. For example, the acceleration sensor 21 is of a three-axis (a three-dimensional) type that detects acceleration in an x-axis direction, a y-axis direction, and a z-axis direction. The acceleration sensor 21 may be of, for example, a piezoresistive type or an electrostatic capacitance type.
The geomagnetic sensor 22 detects the orientation of the geomagnetic field, which enables a measurement of the orientation of the electronic apparatus 1.
The atmospheric pressure sensor 23 detects an atmospheric pressure outside the electronic apparatus 1.
The gyro sensor 24 detects an angular velocity of the electronic apparatus 1. The controller 11 can measure a change in the orientation of the electronic apparatus 1 by performing time-integration of the angular velocity acquired by the gyro sensor 24.
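As an illustration of this time-integration, the following is a minimal Python sketch (not part of the embodiment; the sampling interval and sample values are illustrative assumptions) that accumulates angular velocity samples into a change in orientation.

```python
# Minimal sketch: time-integrating angular velocity from the gyro sensor 24
# to estimate the change in orientation. The sampling interval and values
# are illustrative assumptions, not taken from the embodiment.

def integrate_angular_velocity(samples, dt):
    """Trapezoidal integration of (x, y, z) angular velocities [deg/s]
    sampled every dt seconds; returns the accumulated change [deg]."""
    orientation = [0.0, 0.0, 0.0]
    for prev, curr in zip(samples, samples[1:]):
        for axis in range(3):
            orientation[axis] += 0.5 * (prev[axis] + curr[axis]) * dt
    return orientation

# Example: a steady 10 deg/s rotation about the z-axis sampled for 0.5 s
samples = [(0.0, 0.0, 10.0)] * 51  # 51 samples at 10 ms intervals
print(integrate_angular_velocity(samples, 0.01))  # ~[0.0, 0.0, 5.0]
```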
(Gesture Operation of Electronic Apparatus)
The operation corresponding to the gesture is not limited to the volume adjustment. The operation corresponding to the gesture may be another operation that can be executed based on the detected gesture. For example, the operation corresponding to the gesture may include zooming in or out of information displayed on the display 14, adjusting a brightness of the display of the display 14, starting reading aloud predetermined information by a voice, stopping reading aloud by a voice, or the like.
(Gesture Detection Method)
Here, a method used by the controller 11 to detect a user's gesture based on the output of the proximity sensor 18 will be described with reference to the figures.
The controller 11 acquires the detection values of the photodiodes SU, SR, SD, and SL from the proximity sensor 18. Then, to grasp the movement of the detection object in the direction of the virtual line D1, the controller 11 may integrate, over a predetermined time, the value obtained by subtracting the detection value of the photodiode SU from the detection value of the photodiode SD. From a change in this integrated value (e.g., a change of a positive value, zero, or a negative value), the controller 11 can grasp the movement of the detection object in the direction of the virtual line D1.
Further, the controller 11 may integrate, over a predetermined time, the value obtained by subtracting the detection value of the photodiode SR from the detection value of the photodiode SL. From a change in this integrated value (e.g., a change of a positive value, zero, or a negative value), the controller 11 can grasp the movement of the detection object in the direction orthogonal to the virtual line D1.
Alternatively, the controller 11 may perform a calculation using the detection values of all of the photodiodes SU, SR, SD, and SL. That is, the controller 11 may grasp the moving direction of the detection object without separating the movement into a component in the direction of the virtual line D1 and a component in the direction orthogonal thereto.
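The subtraction-and-integration approach described above can be summarized in the following minimal Python sketch (an illustrative assumption, not the actual processing of the controller 11; the axis labels depend on the sensor geometry and are hypothetical).

```python
# Minimal sketch of the subtraction-and-integration approach described
# above. su, sd, sl, sr are time series of detection values of the
# photodiodes SU, SD, SL, and SR sampled at the same instants.

def classify_direction(su, sd, sl, sr):
    """Returns (axis, sign): axis is 'D1' (along the virtual line D1) or
    'orthogonal'; the sign of the integrated difference distinguishes the
    two movement directions along that axis."""
    vertical = sum(d - u for d, u in zip(sd, su))    # along the virtual line D1
    horizontal = sum(l - r for l, r in zip(sl, sr))  # orthogonal to D1
    if vertical == 0 and horizontal == 0:
        return None                                   # no movement detected
    if abs(vertical) >= abs(horizontal):
        return ("D1", 1 if vertical > 0 else -1)
    return ("orthogonal", 1 if horizontal > 0 else -1)

# Example: SD's readings lead SU's, so the object moves along D1
print(classify_direction(su=[0, 1, 3], sd=[0, 2, 5], sl=[0, 0, 0], sr=[0, 0, 0]))  # ('D1', 1)
```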
A gesture to be detected includes, for example, a left-right gesture, an up-down gesture, an oblique gesture, a clockwise circular gesture, a counterclockwise circular gesture, or the like. For example, the left-right gesture is a gesture performed in a direction substantially parallel to the longitudinal direction of the electronic apparatus 1. The up-down gesture is a gesture performed in a direction substantially parallel to the transverse direction of the electronic apparatus 1. The oblique gesture is a gesture performed, on a plane substantially parallel to the electronic apparatus 1, in a direction parallel to neither the longitudinal direction nor the transverse direction of the electronic apparatus 1.
Here, the photodiodes SU, SR, SD, and SL receive the reflected light of the infrared light emitted from the light-source infrared LED 180 toward the detection object and output detection values corresponding to the amount of received light. At this time, the controller 11 can also determine whether the detection object is approaching or moving away from the proximity sensor 18. First, when at least one of the detection values of the photodiodes SU, SR, SD, and SL is equal to or larger than a predetermined threshold (e.g., a value that is not zero), the controller 11 can determine that the detection object is present. After determining that the detection object is present, when at least one of the detection values of the photodiodes SU, SR, SD, and SL relatively increases, the controller 11 can determine that the detection object is approaching the electronic apparatus 1. Further, after determining that the detection object is present, when at least one of the detection values of the photodiodes SU, SR, SD, and SL relatively decreases, the controller 11 can determine that the detection object is moving away from the electronic apparatus 1. At this time, the controller 11 can determine a gesture in which the user's hand approaches or moves away from the electronic apparatus 1, or a gesture that combines such a gesture with another gesture described above (e.g., the left-right gesture).
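The presence and approach/retreat determination may be sketched as follows in Python (an illustrative simplification; the threshold value and the comparison of summed readings are assumptions, not values from the embodiment).

```python
# Minimal sketch of the presence and approach/retreat determination
# described above. PRESENCE_THRESHOLD and the use of summed readings
# are simplifying assumptions.

PRESENCE_THRESHOLD = 10  # assumed non-zero threshold (raw detection value)

def detect_proximity_motion(prev_values, curr_values):
    """prev_values, curr_values: (SU, SR, SD, SL) readings at two instants."""
    if max(curr_values) < PRESENCE_THRESHOLD:
        return "no object"
    if max(prev_values) < PRESENCE_THRESHOLD:
        return "object detected"
    if sum(curr_values) > sum(prev_values):
        return "approaching"   # readings relatively increase
    if sum(curr_values) < sum(prev_values):
        return "moving away"   # readings relatively decrease
    return "stationary"

print(detect_proximity_motion((0, 0, 0, 0), (5, 20, 8, 3)))    # object detected
print(detect_proximity_motion((5, 20, 8, 3), (9, 35, 14, 6)))  # approaching
```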
(Car Mode)
For example, the controller 11 can perform an operation to adjust the volume of the sound output from the electronic apparatus 1, based on a specific gesture (e.g., a gesture in which the user moves a hand up and down). The electronic apparatus 1 can receive a touch input from the user on the touch screen display. However, when a user performs a touch input during driving, the user may shift his/her gaze to the display 14 for a while to confirm the distance to the touch screen display and the contact position. Even if at least a part of the user operation associated with driving is automated, it is desirable, from the viewpoint of safe driving, that the user continue to pay attention to the situation around the vehicle during driving. That is, it is preferable that the user not perform a touch input while driving. Because the electronic apparatus 1 according to the present embodiment can accept an input operation by a gesture, the user can perform the input operation without contacting the electronic apparatus 1. This facilitates ensuring driving safety even when the user performs an input operation during driving.
Here, the electronic apparatus 1 may have a plurality of modes. The modes are operation modes (operation states or operation situations) that restrict the overall operation of the electronic apparatus 1. Only one mode can be selected at a time. In the present embodiment, the modes of the electronic apparatus 1 include a first mode and a second mode. The first mode is a normal operation mode (a normal mode) suitable for, for example, use in situations other than driving. Situations other than driving include, for example, any of a situation in which the engine of the automobile is not running, a situation in which the shift lever is in a predetermined range (e.g., the parking range), a situation in which the brake is depressed, and a situation in which the route to the destination is not displayed. The second mode is an operation mode (a car mode) of the electronic apparatus 1 suitable for driving the automobile while displaying the route to the destination on the display 14 of the electronic apparatus 1. As described above, a gesture input is preferably enabled in the second mode. That is, when the mode of the electronic apparatus 1 is switched to the second mode, the proximity sensor 18 is preferably operated in conjunction with the switching so as to be able to detect a gesture. The electronic apparatus 1 may switch its mode based on, for example, a predetermined input operation on the electronic apparatus 1 or on the vehicle by the user.
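For illustration, the mode selection could look like the following Python sketch (hypothetical; the argument names merely mirror the situations listed above).

```python
# Minimal sketch of the first/second mode selection described above.
# The argument names mirror the listed situations and are hypothetical.

def select_mode(engine_running, shift_in_parking, brake_depressed, route_displayed):
    non_driving = (not engine_running) or shift_in_parking \
        or brake_depressed or (not route_displayed)
    return "first mode (normal)" if non_driving else "second mode (car)"

# In the second mode, the proximity sensor 18 would be operated in
# conjunction with the switch so that it can detect gestures.
print(select_mode(engine_running=True, shift_in_parking=False,
                  brake_depressed=False, route_displayed=True))  # second mode (car)
```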
Here, an example of the display screen of the display 14 will be described. In the electronic apparatus 1 serving as an in-vehicle apparatus, in particular, the configuration of the screen (hereinafter referred to as a display screen) displayed in the entire display area of the display 14 is generally fixed. For example, regardless of the type of the vehicle, the first screen 140 (e.g., a map screen) is displayed in the left half of the display screen and the second screen 150 (e.g., a road information screen) is displayed in the right half of the display screen, as illustrated in the figure.
(Method to Determine Position of Driver's Seat)
Here, gesture directions to be detected by the electronic apparatus 1 may be predetermined. For example, the gesture directions to be detected may be set to the up-down direction, the left-right direction, and the front-rear direction, as illustrated in the figures.
The controller 11 determines the position of the driver's seat in an automobile equipped with the electronic apparatus 1. For example, as illustrated in the figure, the automobile 30 includes a seat 31 on the right side of the front row, a seat 32 on the left side of the front row, a seat 33 on the right side of the rear row, and a seat 34 on the left side of the rear row.
Here, the driver's seat refers to the seat in which the user who drives the automobile 30 sits. In a case in which the automobile 30 includes a steering device and the user operates the steering device to drive, the driver's seat is the front seat at the position where the steering device is arranged. The steering device can be, for example, a steering wheel, a lever, a bar, or the like. Generally, the steering device of the automobile 30 is arranged in front of one of the seats in the front row. Thus, the controller 11 determines the seat 31 positioned on the right side of the front row or the seat 32 positioned on the left side of the front row to be the position of the driver's seat.
In the present embodiment, the controller 11 determines the position of the driver's seat based on the direction in which a first gesture is detected. The first detected gesture may be a gesture first detected after electric power is supplied to the electronic apparatus 1. In another example, the first detected gesture may be the first gesture detected after the electronic apparatus 1 presents at least one of a predetermined character, image, and sound to the user. For example, the predetermined character or image may be a message such as "Please reach out to the passenger side from the driver's seat" displayed on the display 14, or an image illustrating the contents of the message. Further, the predetermined sound may be, for example, a voice output from the speaker 25, such as "Please reach out from the driver's seat to the passenger seat."
After electric power is supplied to the electronic apparatus 1, the user brings a hand closer to the electronic apparatus 1 to use it. In this case, the user's hand extends from the direction of the seat in which the user is sitting. That is, in a case in which the user is sitting on the right side with respect to the traveling direction, the user's hand extends from the right side of the proximity sensor 18 with respect to the traveling direction. On the other hand, in a case in which the user is sitting on the left side with respect to the traveling direction, the user's hand extends from the left side of the proximity sensor 18 with respect to the traveling direction. Thus, the electronic apparatus 1 can locate the position of the user based on the direction of the first gesture detected by the proximity sensor 18. Generally, a user who intends to operate the electronic apparatus 1 while driving the automobile 30 can be assumed to be the user who drives the automobile 30. Accordingly, the controller 11 can locate the position of the driver's seat based on the direction in which the first gesture is detected.
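This determination may be sketched as follows (illustrative; the direction labels returned by the gesture classifier are assumptions).

```python
# Minimal sketch: locating the driver's seat from the direction of the
# first detected gesture. The direction labels are assumptions about
# what the gesture classifier reports.

def locate_driver_seat(first_gesture_direction):
    if first_gesture_direction == "from-right":
        return "right"   # e.g., the seat 31 on the right side of the front row
    if first_gesture_direction == "from-left":
        return "left"    # e.g., the seat 32 on the left side of the front row
    return None          # ambiguous; wait for a clearer gesture

print(locate_driver_seat("from-right"))  # right
```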
(Display Setting)
After the controller 11 locates the position of the driver's seat (i.e., the position of the driver), the controller 11 changes the display screen of the display 14 to a display screen corresponding to the position of the driver. The controller 11 changes the original display screen to the display screen corresponding to the position of the driver in accordance with the display settings of the electronic apparatus 1, which will be described later. Here, a "change" of the display screen includes a case in which the original display screen is not changed in accordance with the display settings of the electronic apparatus 1, that is, a case in which there is, as a result, no change in the contents of the display screen.
The user can specify the display settings in a setting screen of the electronic apparatus 1.
(Display Position According to Priority)
When the controller 11 determines that the position of the driver's seat is the right side (the seat 31 in the example described above), the controller 11 displays the screen given priority in the setting screen (e.g., the map screen) in the right half of the display screen, near the driver.
(Change of Display Screen Area)
The display screen described here includes, as in the example described above, the first screen 140 (the map screen) and the second screen 150 (the road information screen).
When it is determined that the driver's seat is positioned on the right side, the controller 11 increases the display screen area (the size) of the map screen located near the driver. The controller 11 increases the size of the map screen near the driver to be larger than at least the size of the road information screen positioned farther from the driver than the map screen. Here, a specific size (e.g., 70% of the display area of the display 14) of the screen near the driver can be set in the setting screen. Further, for example, the ratio of the size of the screen near the driver to the size of another screen positioned farther from the driver can be set in the setting screen. Here, the controller 11 continues to display the screen positioned farther from the driver. As illustrated in the figure, for example, the map screen near the driver is enlarged while the road information screen farther from the driver remains displayed in a smaller area.
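The area adjustment may be expressed as a simple layout computation (a hypothetical helper; the display width in pixels and the default 70% share are illustrative values).

```python
# Minimal sketch of the screen-area adjustment: the screen near the
# driver receives a configurable share of the display width. The pixel
# width and the default share are illustrative assumptions.

def panel_widths(display_width_px, driver_side, driver_share=0.7):
    """Returns (left_width, right_width) in pixels."""
    near = int(display_width_px * driver_share)
    far = display_width_px - near
    return (far, near) if driver_side == "right" else (near, far)

# Example: driver's seat on the right of a 1280-px-wide display
print(panel_widths(1280, "right"))  # (384, 896): the screen near the driver is enlarged
```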
(Icon Display Position)
The display screens illustrated in the figures include an icon group 160 and an icon 170.
Here, the icon group 160 and the icon 170 are preferably displayed near the driver to facilitate an operation such as a selection by gesture. The "Icon Position" item in the setting screen described above specifies whether the icon group 160 and the icon 170 are displayed on the driver's side.
When the controller 11 determines that the driver's seat is positioned on the right side, the controller 11 displays the icon group 160 and the icon 170 on the right side, near the driver, as illustrated in the lower diagram in the figure.
When the proximity sensor 18 detects the first gesture (e.g., the left-right gesture), the controller 11 may display the icons 160A, 160B, 160C, and 160D included in the icon group 160 in a rotated manner. Then, when the proximity sensor 18 detects the second gesture (e.g., the up-down gesture), the controller 11 may execute the function of the icon displayed closest to the driver's seat.
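This two-gesture operation may be sketched as follows (illustrative; the icon names and the rotation direction are assumptions).

```python
# Minimal sketch of the two-gesture icon operation: a first gesture
# rotates the icon group 160 and a second gesture executes the icon
# closest to the driver's seat. The rotation direction is an assumption.

from collections import deque

icons = deque(["160A", "160B", "160C", "160D"])  # nearest to the driver first

def execute(icon):
    print(f"executing the function of icon {icon}")  # placeholder action

def on_gesture(kind):
    if kind == "left-right":   # first gesture: rotate the icon group
        icons.rotate(1)
    elif kind == "up-down":    # second gesture: run the nearest icon
        execute(icons[0])

on_gesture("left-right")  # icon group becomes 160D, 160A, 160B, 160C
on_gesture("up-down")     # executing the function of icon 160D
```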
Although the display setting items have been described individually above, the controller 11 can change the display screen according to a plurality of display settings. For example, in a case in which the position of the motorway map screen is set to "driver's side", the size of multiple images is set to "change", and the position of the icon is set to "driver's side", the controller 11 changes the display screen in the manner illustrated in the figure.
(Display Adjustment for Single Screen)
The display screen illustrated in the figure displays a single screen (e.g., the map screen alone) in the entire display area of the display 14.
When the controller 11 determines that the driver's seat is positioned on the right side, the controller 11 shifts the center position of the single screen toward the driver's seat side and displays the single screen. Here, a shift amount (an amount of movement) may be set in the setting screen. As illustrated in the figure, for example, the single screen is displayed with its center shifted to the right when the driver's seat is positioned on the right side.
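The shift can be expressed as follows (a hypothetical helper; the default shift value stands in for the amount set in the setting screen).

```python
# Minimal sketch of the single-screen adjustment: the rendering center
# is shifted toward the driver's seat by the amount set in the setting
# screen. The default shift value is an illustrative assumption.

def shifted_center_x(display_width_px, driver_side, shift_px=120):
    center = display_width_px // 2
    return center + shift_px if driver_side == "right" else center - shift_px

print(shifted_center_x(1280, "right"))  # 760: center moved toward the driver
```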
(Operation Flow Executed by Controller)
The controller 11 determines the position of the driver's seat in the automobile by the method described above (step S1).
When there is a setting for the screen display position according to the priority (Yes in step S2), the controller 11 adjusts the screen position and displays the screen accordingly (step S3). For example, when the position of the motorway map screen is set to the driver's seat side in the setting screen described above, the controller 11 displays the map screen near the position of the driver's seat.
After step S3, or when there is no setting of the screen display position according to the priority (No in step S2), the controller 11 proceeds to step S4.
When there is a setting for screen area adjustment (Yes in step S4), the controller 11 adjusts the screen area and displays the screen accordingly (step S5). For example, when the size of multiple images is set to change in the setting screen described above, the controller 11 enlarges the screen positioned near the driver's seat relative to the screen positioned farther away.
After step S5, or when there is no setting for screen area adjustment (No in step S4), the controller 11 proceeds to step S6.
When there is a setting for the icon display position (Yes in step S6), the controller 11 adjusts the icon position and displays the icon accordingly (step S7). For example, when the position of the icon is set to the driver's seat side in the setting screen described above, the controller 11 displays the icon group 160 and the icon 170 near the driver's seat.
After step S7, or when there is no setting for the icon display position (No in step S6), the controller 11 ends the series of processes.
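The series of steps S1 to S7 may be summarized in the following Python sketch (illustrative; the setting keys mirror the setting screen items described above, and the seat argument is the result of step S1).

```python
# Minimal sketch of steps S1 to S7. The setting keys mirror the setting
# screen items and are hypothetical names.

def apply_display_settings(settings, seat):
    actions = []
    if settings.get("motorway_map_position") == "driver's side":     # step S2
        actions.append(f"display the map screen on the {seat} side")  # step S3
    if settings.get("multiple_image_size") == "change":              # step S4
        actions.append(f"enlarge the screen on the {seat} side")      # step S5
    if settings.get("icon_position") == "driver's side":             # step S6
        actions.append(f"display the icons on the {seat} side")       # step S7
    return actions

print(apply_display_settings(
    {"motorway_map_position": "driver's side",
     "multiple_image_size": "change",
     "icon_position": "driver's side"},
    seat="right"))
```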
As described above, the electronic apparatus 1 according to the present embodiment locates the position of the driver's seat based on the direction of the first detected gesture and changes the display screen of the display 14 to the display screen corresponding to the position of the driver. Thus, the electronic apparatus 1 realizes a screen layout that facilitates operation from the driver's seat. Because the electronic apparatus 1 enables a gesture input operation, the electronic apparatus 1 can be operated without the driver needing to shift the line of sight to the display 14, unlike apparatuses that employ touch operations. Accordingly, the user can continue to pay attention to the surroundings of the automobile during driving, and thus driving safety is improved.
Although the present disclosure has been described based on the drawings and the embodiments, it should be appreciated that those skilled in the art can easily make various variations and modifications based on the disclosure herein. Accordingly, these variations and modifications are included in the scope of the present disclosure. For example, the functions and the like included in each means or each step can be rearranged without logical contradiction, such that a plurality of means or steps are combined, or a means or step is subdivided.
Although in the above embodiment a gesture is described as being detected by the proximity sensor 18, the gesture does not necessarily need to be detected by the proximity sensor 18. The gesture may be detected by any sensor that can detect a user's gesture made without contacting the electronic apparatus. Examples of such a sensor include the camera 13.
The sensor capable of detecting a user's gesture made without contacting the electronic apparatus may include, for example, a distance measurement sensor. For example, the electronic apparatus 1 may include a distance measurement sensor instead of, or in addition to, the proximity sensor 18 and detect a gesture using the distance measurement sensor.
The distance measurement sensor is a sensor capable of measuring a distance to an object. The distance measurement sensor may be, for example, a ToF (Time of Flight) sensor. The distance measurement sensor configured by a ToF sensor includes a light emitting unit configured to emit sine wave modulated light (infrared laser light) to an object and a light receiving unit configured to receive reflected light of emitted infrared laser light from the object. The light receiving unit includes, for example, an image sensor in which a plurality of light receiving elements are arranged. The ToF sensor measures time (flight time) from emitting of infrared laser light to receiving of reflected light by each light receiving element. The ToF sensor can measure the flight time, based on a phase difference between emitted infrared laser light and received reflected light. The ToF sensor can measure the distance to the object that reflected the emitted infrared laser light, based on the measured flight time. The ToF sensor can detect a moving direction of the object, based on a time difference between reflected light from the object entering each of the plurality of light receiving elements. Thus, the ToF sensor can detect a user's gesture, based on the same principle as the proximity sensor 18. The distance measurement sensor may be arranged on the same surface of the electronic apparatus 1 as the surface on which, for example, the proximity sensor 18 is arranged.
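The relation between the phase difference and the distance is the standard ToF relation and may be sketched as follows (not specific to this embodiment; the modulation frequency is an illustrative value).

```python
# Minimal sketch of the standard ToF relation between the phase
# difference of sine-wave-modulated light and distance. The modulation
# frequency is an illustrative value, not one from the embodiment.

import math

C = 299_792_458.0  # speed of light [m/s]

def tof_distance(phase_diff_rad, modulation_hz):
    flight_time = phase_diff_rad / (2 * math.pi * modulation_hz)  # round trip
    return C * flight_time / 2                                    # one way

print(tof_distance(math.pi / 2, 20e6))  # ~1.87 m for a 90-degree shift at 20 MHz
```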
Here, a method used by the controller 11 to detect a user's gesture based on an output of the distance measurement sensor will be described with reference to the figures.
The light receiving unit 26b may include a plurality of light receiving elements. For example, the light receiving unit 26b may include nine light receiving elements arranged in 3 rows and 3 columns, as illustrated in the figure.
The distance measurement sensor 26 can detect the distance to the object from each of the nine light receiving elements, based on the phase difference between the infrared laser light emitted by the light emitting unit 26a and the reflected light received by each of the nine light receiving elements of the light receiving unit 26b. The distance measurement sensor 26 can detect a gesture based on the distance to the object from each of the nine light receiving elements and the change in the distance with time.
For example, it is assumed that the user performs a gesture to move a hand from left to right. In this case, the distances to the object detected by the light receiving elements Ch21, Ch22, and Ch23 in the middle row are denoted as, for example, D21, D22, and D23, respectively.
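For a hand passing from left to right, the distance reported by each middle-row element dips in left-to-right order, so comparing the times of the minima of D21, D22, and D23 recovers the movement direction. The following Python sketch illustrates this (an assumption about the processing, with illustrative distance values).

```python
# Minimal sketch of gesture detection with the 3x3 light receiving grid:
# the times of the distance minima of the middle-row elements Ch21, Ch22,
# and Ch23 (left, center, right) give the horizontal movement direction.

def horizontal_direction(d21, d22, d23):
    """d21, d22, d23: distance time series [mm] from Ch21, Ch22, Ch23."""
    t_left = d21.index(min(d21))
    t_mid = d22.index(min(d22))
    t_right = d23.index(min(d23))
    if t_left < t_mid < t_right:
        return "left-to-right"
    if t_left > t_mid > t_right:
        return "right-to-left"
    return "unknown"

# Example: the dip arrives later the farther right the element is
d21 = [300, 120, 300, 300, 300]
d22 = [300, 300, 120, 300, 300]
d23 = [300, 300, 300, 120, 300]
print(horizontal_direction(d21, d22, d23))  # left-to-right
```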
Although the light receiving unit 26b is described above as including nine light receiving elements, the number of light receiving elements included in the light receiving unit 26b is not limited thereto. Also, the arrangement of the plurality of light receiving elements is not limited to the arrangement illustrated in the figure.
The light emitting unit 26a of the distance measurement sensor 26 may include a plurality of light emitting elements. In this case, the distance to the object from each light emitting element can be measured based on the phase difference between the infrared laser light emitted from that light emitting element and the reflected light received by the light receiving unit 26b. In this case also, the distance measurement sensor 26 can detect a gesture by applying the above principle, based on the distance to the object from each light emitting element and the change in the distance with time.
In the above embodiment, further, although the controller 11 has been described to determine the position of the driver's seat according to the direction in which the first gesture is detected, this is not restrictive. The controller 11 may determine the position of the driver's seat using at least one of the methods described below instead of, or in combination with, the direction in which the first gesture is detected.
For example, the controller 11 may determine the position of the driver's seat, based on information preliminarily stored in the storage 16. For example, in a case in which the electronic apparatus 1 is preliminarily installed in the automobile 30, the storage 16 may store information regarding the position of the driver's seat. Alternatively, in a case in which the user inputs information regarding the position of the driver's seat by performing an input operation in respect to the electronic apparatus 1, the storage 16 may store the information regarding the position of the driver's seat. In this case, the controller 11 can determine the position of the driver's seat, based on the information regarding the position of the driver's seat stored in the storage 16.
For example, the controller 11 may determine the position of the driver's seat based on an image captured by the camera 13. In particular, the controller 11 activates the camera 13 when executing control based on a gesture (e.g., when the electronic apparatus 1 is in the second mode (the car mode) described above). The camera 13 captures an image in front of the display 14, i.e., of the interior of the automobile 30. The controller 11 may analyze the image captured by the camera 13 and determine the position of the seat in front of the steering device to be the position of the driver's seat. Alternatively, the controller 11 may analyze the image captured by the camera 13 and, when the image includes a user in the seat in front of the steering device, determine the position of that seat to be the position of the driver's seat. When the controller 11 has determined the position of the driver's seat, the controller 11 may stop the operation of the camera 13. As a result, the controller 11 can reduce the power consumption of the camera 13.
For example, in a case in which each of the seats 31 to 34 is provided with a pressure sensor, the controller 11 may determine the position of the driver's seat, based on an output of the pressure sensor. The pressure sensor may be provided, for example, under a seating surface of each of the seats 31 to 34 to which a load is applied when a user sits down. The pressure sensor detects a pressure applied to the seating surface of each of the seats 31 to 34. When a user gets in an automobile and sits down in a seat, the controller 11 can identify the seat in which the user sits, based on an output from the pressure sensor arranged in the seat. The controller 11 may determine the position of the seat in which the user sits to be the position of the driver's seat. This method can be used, for example, when one user gets in an automobile.
For example, in a case in which a motion sensor is arranged in front of each of the seats 31 to 34, the controller 11 may determine the position of the driver's seat, based on an output of the motion sensor. The motion sensor may detect whether a user is sitting in one of the seats 31 to 34, by sensing a change in ambient temperature using, for example, infrared rays. When the user gets in an automobile and sits down in a seat, the controller 11 can specify the seat in which the user sits, based on an output from the motion sensor arranged in front of the seat. The controller 11 may determine the position of the seat in which the user is sitting to be the position of the driver's seat. This method can be used, for example, when one user gets in an automobile.
For example, the controller 11 may determine the position of the driver's seat based on the opening and closing of a door of the automobile 30. For example, the automobile 30 is assumed to have one door near each of the seats 31 to 34. In particular, the automobile 30 includes one door on the right side of the seat 31 located on the right side of the front row, one on the left side of the seat 32 located on the left side of the front row, one on the right side of the seat 33 located on the right side of the rear row, and one on the left side of the seat 34 located on the left side of the rear row. Also, each door is assumed to be provided with a sensor configured to detect opening and closing. The controller 11 can determine that a user sits in the seat closest to a door that has been opened and closed. This is because it is generally assumed that a user gets in the automobile 30 through the door closest to the seat in which the user intends to sit. The controller 11 may determine the position of the seat in which the user is determined to sit to be the position of the driver's seat. This method can be used, for example, when one user gets in an automobile.
For example, the controller 11 may determine the position of the driver's seat, based on a position of a door of the automobile 30 which is unlocked. When the automobile 30 has a plurality of doors as described above, the controller 11 can determine that the user sits in the seat closest to the door on which an unlocking operation is performed. This is because it is generally assumed that the user unlocks the door closest to the seat in which the user intends to sit and gets in the automobile 30 through the door. The controller 11 may determine the position of the seat in which the user is determined to sit to be the position of the driver's seat. This method can be used, for example, when one user gets in an automobile.
For example, the controller 11 may determine the position of the driver's seat based on the hand operating the touch screen display. For example, the user preliminarily stores fingerprint data of the user's left and right fingers in the storage 16 of the electronic apparatus 1. The user can store the fingerprint data in the storage 16 of the electronic apparatus 1 by performing, for example, an input operation for registering the fingerprint data. Then, when power is supplied to the electronic apparatus 1 to drive the automobile 30 and a user's finger contacts the touch screen display, the controller 11 reads the fingerprint of the finger contacting the touch screen display and determines whether the finger belongs to the user's right hand or left hand. The controller 11 determines that the seat on the side opposite to the determined hand (i.e., the right hand or the left hand) is the seat in which the user is sitting. For example, when the user is sitting in the seat on the right side, the user is assumed to perform a touch input with the left hand on the touch screen display arranged in the center. Thus, when the controller 11 determines that the finger contacting the touch screen display belongs to the user's left hand, the controller 11 determines that the user is sitting in the seat on the right side. On the other hand, when the user is sitting in the seat on the left side, the user is assumed to perform a touch input with the right hand on the touch screen display arranged in the center. Thus, when the controller 11 determines that the finger contacting the touch screen display belongs to the user's right hand, the controller 11 determines that the user is sitting in the seat on the left side. The controller 11 may determine the position of the seat in which the user is determined to be sitting to be the position of the driver's seat.
For example, the controller 11 may determine the position of the driver's seat, based on a sound detected by the microphone 15. For example, based on a sound detected by the microphone 15, the controller 11 determines a direction in which the sound is generated. The controller 11 can determine that the direction in which the sound is determined to have been generated is the direction in which the user is present. Thus, the controller 11 may determine the position of the seat in the direction in which the sound is generated to be the position of the driver's seat.
In the above embodiment, further, the controller 11 may change the gesture detection range of the proximity sensor 18 according to the determined position of the driver's seat. The gesture detection range may include the directions that can be detected by the proximity sensor 18. For example, a case in which the proximity sensor 18 is movably provided in the left-right direction on the console panel is assumed. The controller 11 may control the proximity sensor 18 to face the determined driver's seat. That is, for example, when the controller 11 determines that the driver's seat is positioned on the right side with respect to the traveling direction, the controller 11 may turn the proximity sensor 18 to the right side with respect to the traveling direction. Similarly, when the controller 11 determines that the driver's seat is positioned on the left side with respect to the traveling direction, the controller 11 may turn the proximity sensor 18 to the left side with respect to the traveling direction. The proximity sensor 18 has a limited viewing angle within which it can detect a gesture. Thus, a user's gesture made outside the gesture detection range of the proximity sensor 18 will not be detected. However, by changing the detection range of the proximity sensor 18 and directing it toward, for example, the driver's seat in which the user is sitting, the proximity sensor 18 can detect a user's gesture more easily. Because the proximity sensor 18 can easily detect a user's gesture, a gesture input by the user is less likely to be overlooked, enabling the user to focus on driving. Accordingly, driving safety is improved.
In the above embodiment, the electronic apparatus 1 includes the proximity sensor 18. Here, the electronic apparatus 1 may include a plurality of proximity sensors 18. In the above embodiment, the proximity sensor 18 is arranged on the housing of the electronic apparatus 1 at the center of the display 14 in the left-right direction (the x-axis direction) and below the display 14 in the up-down direction (the y-axis direction). Here, the electronic apparatus 1 may include a proximity sensor 18 arranged at a position different from that in the above embodiment. That is, the number and the positions of the proximity sensors 18 included in the electronic apparatus 1 are not limited.
In the above embodiment, when the proximity sensor 18 detects the first gesture (e.g., a left-right gesture), the controller 11 rotates and displays the icon group 160. Here, the controller 11 may display a high priority icon near the driver, instead of rotating the icon group 160. In this case, the driver can specify the priority of the icons included in the icon group 160 in the setting screen.
Further, in the above embodiment, the icon group 160 is arranged along one longitudinal side of the display 14 (the lower side extending in the left-right direction). Here, an icon group 185 may instead be displayed along a transverse direction (the vertical direction) of the display 14, as illustrated in the figure.
Here, on the left and right sides of the display 14, the icon group 185 and an icon group 190 may be displayed along the transverse directions (the vertical directions). The icon group 190 is a set of a plurality of icons 190A and 190B, as illustrated in the figure.
Many aspects of the disclosure herein may be represented by a series of operations executed by a computer system or other hardware that are capable of executing program instructions. The computer system or the other hardware includes, for example, a general-purpose computer, a PC (personal computer), a special-purpose computer, a workstation, a PCS (Personal Communications System; a personal mobile communication system), a mobile (cellular) phone, a mobile phone having a data processing function, an RFID receiver, a game machine, an electronic notepad, a laptop computer, a GPS (Global Positioning System) receiver, and other programmable data processing apparatuses. Note that in the embodiments the various operations and control methods are executed by a dedicated circuit implemented with program instructions (software) (e.g., discrete logic gates interconnected to perform a specific function), or by logical blocks, program modules, and the like executed by at least one processor. The at least one processor for executing the logical blocks, the program modules, and the like includes, for example, at least one microprocessor, CPU (Central Processing Unit), ASIC (Application Specific Integrated Circuit), DSP (Digital Signal Processor), PLD (Programmable Logic Device), FPGA (Field Programmable Gate Array), processor, controller, microcontroller, electronic apparatus, or other apparatus designed to be capable of executing the functions described herein, and/or a combination thereof. The embodiments presented herein are implemented by, for example, hardware, software, firmware, middleware, microcode, or any combination thereof. The instructions may be program code or code segments for executing necessary tasks. The instructions may be stored in a machine-readable non-transitory storage medium or in another medium. A code segment may represent any combination of procedures, functions, subprograms, programs, routines, subroutines, modules, software packages, classes or instructions, and data structures or program statements. A code segment may be connected to another code segment or a hardware circuit by transmitting and/or receiving information, data arguments, variables, or memory contents to or from the other code segment or hardware circuit.
The storage 16 used herein may be a computer-readable tangible carrier (medium) encompassing solid-state memories, magnetic disks, and optical disks. Such media store an appropriate set of computer instructions, such as program modules for causing a processor to execute the techniques disclosed herein, or data structures. Computer-readable media include: an electrical connection with one or more wires; a magnetic disk storage; a magnetic cassette; a magnetic tape; other types of magnetic or optical storage devices such as a CD (Compact Disc), an LD® (Laser Disc; LD is a registered trademark in Japan, other countries, or both), a DVD® (Digital Versatile Disc; DVD is a registered trademark in Japan, other countries, or both), a Floppy® disk (Floppy is a registered trademark in Japan, other countries, or both), and a Blu-ray® disc (Blu-ray is a registered trademark in Japan, other countries, or both); a portable computer disk; a RAM (Random Access Memory); a ROM (Read-Only Memory); rewritable and programmable ROMs such as an EPROM (Erasable Programmable Read-Only Memory), an EEPROM (Electrically Erasable Programmable Read-Only Memory), and a flash memory; other tangible storage media capable of storing information; and any combination of the above. The memory may be provided inside and/or outside a processor or a processing unit. As used herein, the term "memory" refers to any type of long-term memory, short-term memory, volatile memory, nonvolatile memory, or other memories; it is not limited to a particular type of memory, a particular number of memories, or a particular type of medium to store information.
Number | Date | Country | Kind |
---|---|---|---|
2018-121297 | Jun 2018 | JP | national |
2018-121298 | Jun 2018 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2019/022027 | 6/3/2019 | WO | 00 |