This application claims priority to and the benefit of Japanese Patent Application No. 2017-082324 (filed on Apr. 18, 2017), the entire contents of which are incorporated herein by reference.
The present disclosure relates to an electronic device.
For example, electronic devices such as smartphones and tablet PCs are generally provided with a touch panel. Typically, users control the electronic devices by touching the touch panels. Recently, electronic devices are known that use a proximity sensor such as an infrared sensor to detect a gesture performed by a user positioned away from the device and perform an input operation corresponding to the gesture.
PTL 1: JP-A-2015-225493
In the mobile terminal according to PTL 1, a proximity sensor is provided on the front surface (i.e., the surface on which the display is provided). Thus, in order to perform a gesture input operation, a user needs to perform a gesture in front of the mobile terminal.
Here, a mobile terminal may be used while lying on a table with the front surface facing up (i.e., the front surface facing the ceiling). In order to perform a gesture input operation in this state, the user needs to extend their hand to the vicinity of the front surface of the mobile terminal. Also, a mobile terminal may be carried in, for example, a pocket. In order to perform a gesture input operation in this state, the user needs to take the mobile terminal out of the pocket and direct the front surface toward themselves before performing a gesture. As described above, in order to perform a gesture input operation using the mobile terminal described in PTL 1, the user may need to move the mobile terminal in accordance with the orientation thereof or to adjust the orientation of the mobile terminal. Thus, there is a risk that the user experiences poor operability in performing an input operation based on a gesture.
Thus, it could be helpful to provide an electronic device that improves user operability associated with a gesture.
An electronic device according to an embodiment of the present disclosure includes a first proximity sensor, a second proximity sensor, and a controller configured to select between first gesture detection based on a value output from the first proximity sensor and second gesture detection based on a value output from the second proximity sensor.
An electronic device according to an embodiment of the present disclosure includes a first proximity sensor, a second proximity sensor, and a controller configured to switch between the first proximity sensor and the second proximity sensor, based on a state of the electronic device.
According to an embodiment of the present disclosure, an electronic device capable of improving user operability associated with a gesture can be provided.
In the accompanying drawings:
Electronic Device Configuration
An electronic device 1 according to an embodiment includes a proximity sensor 18 (a gesture sensor) and a controller 11 as illustrated in
When a predetermined period of time has elapsed after receiving a timer operating instruction from the controller 11, the timer 12 outputs a signal to that effect to the controller 11. The timer 12 may be provided independently of the controller 11 as illustrated in
The camera 13 captures an image of an object located in the vicinity of the electronic device 1. The camera 13 is, for example, a front-facing camera 13a (see
The display 14 displays a screen. The screen includes at least one of, for example, characters, images, symbols, and graphics. The display 14 may be an LCD (Liquid Crystal Display). The display 14 may be an organic EL (Electroluminescence) panel or an inorganic EL panel. In the present embodiment, the display 14 is a touchpanel display (a touchscreen display). The touchpanel display detects a contact made by a finger or a stylus pen and locates a contact position. The display 14 can simultaneously detect a plurality of positions which are contacted by fingers or stylus pens.
The microphone 15 detects a sound around the electronic device 1 including a human voice.
The storage 16 serves as a memory and stores programs and data. The storage 16 temporarily stores the results of operations by the controller 11. The storage 16 may include any storage device such as a semiconductor storage device or a magnetic storage device. The storage 16 may include a plurality of types of storage devices. The storage 16 may include a combination of a portable storage medium such as a memory card and a storage medium reader.
The programs stored in the storage 16 include an application for running in the foreground or the background and a control program for assisting the running of the application. For example, the application causes the controller 11 to perform an operation corresponding to a gesture. The control program is, for example, an OS (Operating System). The application and the control program may be installed in the storage 16 via communication performed by the communication interface 17 or the storage medium.
The communication interface 17 is an interface that enables wired or wireless communication. A communication method used by the communication interface 17 according to an embodiment conforms to a wireless communication standard. For example, wireless communication standards include communication standards for cellular phones such as 2G, 3G, and 4G. Communication standards for cellular phones include, for example, LTE (Long Term Evolution) and W-CDMA (Wideband Code Division Multiple Access). Communication standards for cellular phones also include, for example, CDMA2000 and PDC (Personal Digital Cellular). Communication standards for cellular phones further include, for example, GSM® (Global System for Mobile communications; GSM is a registered trademark in Japan, other countries, or both) and PHS (Personal Handy-phone System). For example, wireless communication standards include WiMAX (Worldwide Interoperability for Microwave Access), IEEE 802.11, and Bluetooth® (Bluetooth is a registered trademark in Japan, other countries, or both). Wireless communication standards include, for example, IrDA (Infrared Data Association) or NFC (Near Field Communication). The communication interface 17 may support one or more of the communication standards mentioned above.
The speaker 25 outputs sound. For example, during a telephone call, the other party's voice is output from the speaker 25. For example, when news or a weather forecast is read aloud, the content is output as a sound from the speaker 25.
The proximity sensor 18 detects a relative distance between an object in the vicinity of the electronic device 1 and the electronic device 1 together with a moving direction of the object, in a non-contact manner. In the present embodiment, the proximity sensor 18 includes one light source infrared LED (Light Emitting Diode) and four infrared photodiodes. The proximity sensor 18 causes the light source infrared LED to irradiate the object. The proximity sensor 18 receives reflected light from the object as incident light at the infrared photodiodes. Then, the proximity sensor 18 can measure a relative distance to the object based on an output current of the infrared photodiodes. The proximity sensor 18 also detects a moving direction of the object based on temporal differences between the reflected light from the object incident on the infrared photodiodes. Thus, the proximity sensor 18 can detect an operation using an air gesture (hereinafter, referred to simply as a “gesture”) performed by a user of the electronic device 1 without touching the electronic device 1. Here, the proximity sensor 18 may include a visible light photodiode. In the present embodiment, the proximity sensor 18 is configured as a plurality of proximity sensors: a first proximity sensor 18a and a second proximity sensor 18b provided on different surfaces of the electronic device 1 (e.g., see
The controller 11 is configured as, for example, a processor such as a CPU (Central Processing Unit). The controller 11 may be configured as an integrated circuit such as a SoC (System-on-a-Chip) that includes integrated components. The controller 11 may be configured as a combination of a plurality of integrated circuits. The controller 11 realizes various functions by integrally controlling operation of the electronic device 1.
In particular, the controller 11 refers to the data stored in the storage 16 as necessary. The controller 11 realizes various functions by executing instructions included in the programs stored in the storage 16 and controlling other functional units including the display 14. For example, the controller 11 acquires data regarding a contact made by a user. For example, the controller 11 acquires information regarding a gesture performed by a user detected by the proximity sensor 18. For example, the controller 11 acquires information regarding the remaining time in a countdown (a timer time) from the timer 12. Also, for example, the controller 11 recognizes a running status of an application. Further, for example, the controller 11 determines an orientation of the electronic device 1 based on information regarding an acceleration detected by the acceleration sensor 21.
The UV sensor 19 can measure the level of ultraviolet light contained in sunlight.
The illuminance sensor 20 detects the intensity of ambient light incident on the illuminance sensor 20. The illuminance sensor 20 may be configured as, for example, a photodiode or a photo-transistor.
The acceleration sensor 21 detects a direction and magnitude of an acceleration applied to the electronic device 1. A value output from the acceleration sensor 21 is information regarding a detected acceleration. The acceleration sensor 21 is, for example, a three-axis (a three-dimensional) type configured to detect acceleration in an x-axis direction, a y-axis direction, and a z-axis direction. The acceleration sensor 21 may be of, for example, a piezo-resistive type or a capacitance type.
The geomagnetic sensor 22 detects the direction of geomagnetism and allows measurement of the orientation of the electronic device 1.
The atmospheric pressure sensor 23 detects air pressure (atmospheric pressure) external to the electronic device 1.
The gyro sensor 24 detects an angular velocity of the electronic device 1. The controller 11 can measure a change in the orientation of the electronic device 1 by performing time integration of the angular velocity acquired by the gyro sensor 24.
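As a rough illustration of this time integration, the following Python sketch accumulates angular-velocity samples into an estimated orientation change. The function name, sample rate, and units are assumptions for illustration, not values from the present disclosure.

```python
def integrate_orientation_change(angular_velocities, dt):
    """Estimate the change in orientation (degrees) by time-integrating
    angular-velocity samples (degrees per second), as the controller 11
    might do with the output of the gyro sensor 24."""
    angle = 0.0
    for omega in angular_velocities:
        angle += omega * dt  # simple rectangular (Euler) integration
    return angle

# Ten samples of 90 deg/s taken every 0.1 s amount to roughly a 90-degree turn.
change = integrate_orientation_change([90.0] * 10, 0.1)
```

In practice such integration drifts over time, which is one reason a controller may combine the gyro output with the acceleration sensor 21 and the geomagnetic sensor 22.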
Gesture Operation of the Electronic Device
The electronic device 1 illustrated in
Proximity Sensor Gesture Detection Method
Here, a method employed by the controller 11 for detecting a user's gesture based on an output of the proximity sensor 18 will be described with reference to
The controller 11 acquires the values detected by the photodiodes SU, SR, SD, and SL from the proximity sensor 18. Then, in order to recognize motion of the detection object in the direction of the virtual line D1, the controller 11 may perform time integration of a value acquired by subtracting the value detected by the photodiode SU from the value detected by the photodiode SD over a predetermined time period. In the example of
Further, the controller 11 may perform time integration of a value acquired by subtracting the value detected by the photodiode SR from the value detected by the photodiode SL over a predetermined time period. From the change in the integrated value (e.g. a positive, zero, or negative change), the controller 11 can recognize a motion of the detection object in a direction orthogonal to the virtual line D1 (i.e., a direction approximately parallel to the transverse direction of the electronic device 1).
Alternatively, the controller 11 may perform the calculation using all of the values detected by the photodiodes SU, SR, SD, and SL. That is, the controller 11 may recognize the moving direction of the detection object without separating components of the longitudinal direction of the electronic device 1 and components of the transverse direction from each other for the calculation.
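The integration scheme described above can be sketched in Python as follows. The tuple layout, sampling interval, and the mapping from the sign of each integral to a direction are illustrative assumptions that depend on the sensor geometry, and reducing the "change in the integrated value" to the sign of the net integral is a simplification.

```python
def classify_gesture(samples, dt=0.01):
    """Classify a gesture axis and sign from time-series readings of the
    four photodiodes. `samples` is a list of (su, sr, sd, sl) tuples."""
    # Time integration of (SD - SU) captures motion along the virtual line D1
    # (the longitudinal direction); (SL - SR) captures the transverse motion.
    longitudinal = sum((sd - su) * dt for su, sr, sd, sl in samples)
    transverse = sum((sl - sr) * dt for su, sr, sd, sl in samples)
    if abs(longitudinal) >= abs(transverse):
        return ("longitudinal", "positive" if longitudinal >= 0 else "negative")
    return ("transverse", "positive" if transverse >= 0 else "negative")

# A hand passing from the SU side toward the SD side: SU reflects first,
# then SD reflects more strongly, so the (SD - SU) integral comes out positive.
axis, sign = classify_gesture([(2, 0, 0, 0), (0, 0, 3, 0)])
```

The alternative mentioned in the text, using all four values without separating longitudinal and transverse components, would instead feed the raw tuples into a combined two-dimensional estimate of the moving direction.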
The gestures detected by the proximity sensor 18 include, for example, a left-right direction gesture, an up-down direction gesture, an oblique gesture, a gesture which draws a circle in a clockwise direction, and a gesture which draws a circle in a counter-clockwise direction. For example, the left-right direction gesture is a gesture performed in the direction approximately parallel to the transverse direction of the electronic device 1. The up-down direction gesture is a gesture performed in the direction approximately parallel to the longitudinal direction of the electronic device 1. The oblique gesture is a gesture performed in a direction that is not parallel to the longitudinal direction or the transverse direction of the electronic device 1 in a plane approximately parallel to the electronic device 1.
Kitchen Mode
Here, the electronic device 1 has a plurality of modes. The modes refer to operation modes (operating states or operating conditions) that limit the overall operations of the electronic device 1. Only one of the modes can be selected at a time. In the present embodiment, the modes of the electronic device 1 include at least a first mode and a second mode. The first mode is a normal operation mode (a normal mode) suitable for use in rooms other than the kitchen and outdoors. The second mode is an operation mode (a kitchen mode) of the electronic device 1 suitable for use when a user is cooking in the kitchen while viewing a recipe. As described above, the second mode enables input operations made by gestures.
In the second mode (the kitchen mode), the electronic device 1 detects a gesture using the proximity sensor 18. Here, it is assumed that the proximity sensor 18 is provided only on the front surface of the electronic device 1. In this case, it is preferable that the electronic device 1 is supported by a stand or the like such that the front surface is directed to the user and the user's gesture can be detected (see
The electronic device 1 according to the present embodiment includes the first proximity sensor 18a and the second proximity sensor 18b. One of the first proximity sensor 18a and the second proximity sensor 18b is selected based on a state of the device (the electronic device 1). Here, the second proximity sensor 18b is provided on a surface different from a surface having the first proximity sensor 18a provided thereon. The electronic device 1 according to the present embodiment can improve the user operability associated with a gesture by appropriately selecting the first proximity sensor 18a or the second proximity sensor 18b based on a state of the electronic device 1 as described later.
Surface Provided with Proximity Sensor
As illustrated in
As illustrated in
Although in the present embodiment the second proximity sensor 18b is located slightly leftward from the center in the transverse direction of the front surface of the electronic device 1, this is not restrictive. Also, the second proximity sensor 18b may be provided on any other surface, other than the top surface (the plane surface), different from the surface having the first proximity sensor 18a provided thereon. For example, the second proximity sensor 18b may be provided on the bottom surface (as illustrated in
Here, when the timer 12 finishes counting down and an alarm goes off while the user is moving, the user may wish to stop the alarm by performing a gesture (e.g., by moving a hand or a portion of the face such as the jaw). Also, when the electronic device 1 configured as a mobile phone receives a phone call while the user is moving, the user may wish to answer the phone call by performing a gesture (e.g., by moving a hand in the transverse direction of the front surface of the electronic device 1 or shaking the head). However, the first proximity sensor 18a provided on the front surface of the electronic device 1 is enclosed in the pocket and cannot detect a user's gesture.
The controller 11 selects between first gesture detection based on a value output from the first proximity sensor 18a and second gesture detection based on a value output from the second proximity sensor 18b, based on whether the electronic device 1 is in the vertical orientation (as an example of the state of the device). In a specific example, the controller 11 selects the second gesture detection when the electronic device 1 is in the predetermined orientation (i.e., the vertical orientation), or selects the first gesture detection when the electronic device 1 is not in the predetermined orientation. In the example of
By performing a gesture near, for example, the top surface of the electronic device 1 (i.e., within a detection range of the second proximity sensor 18b), the user can stop an alarm or answer a phone call while the user is moving with the electronic device 1 in a pocket. Thus, the user operability associated with a gesture is improved. Especially in a case in which the electronic device 1 is in a chest pocket, the user can stop an alarm or answer a phone call by performing a gesture such as moving the chin or shaking the head near the second proximity sensor 18b, even when neither of the user's hands is free.
The controller 11 determines a state of the device itself using, for example, a value output from the acceleration sensor 21. In the present embodiment, the controller 11 acquires information regarding an acceleration detected by the acceleration sensor 21. Then, the controller 11 determines the orientation of the electronic device 1 based on a direction of the gravitational acceleration obtained from the acceleration information and selects the first gesture detection or the second gesture detection.
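One way to implement this determination is to compare the direction of gravity against the device's longitudinal axis. In the following sketch, the axis convention (y as the longitudinal axis), the 30-degree tolerance, and the function names are assumptions for illustration.

```python
import math

def is_vertical(ax, ay, az, tolerance_deg=30.0):
    """Return True when gravity lies close to the device's longitudinal
    (y) axis, i.e., the device stands roughly upright."""
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    if norm == 0.0:
        return False  # free fall or no data: treat as not vertical
    angle = math.degrees(math.acos(min(1.0, abs(ay) / norm)))
    return angle < tolerance_deg

def select_gesture_detection(ax, ay, az):
    """Second gesture detection in the vertical orientation,
    first gesture detection otherwise."""
    return "second" if is_vertical(ax, ay, az) else "first"
```

For example, a device upright in a chest pocket reports gravity almost entirely along its longitudinal axis, so `select_gesture_detection(0.0, 9.8, 0.0)` yields the second gesture detection, while a device lying flat on a table, `(0.0, 0.0, 9.8)`, yields the first.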
Although the case in which the electronic device 1 is in a pocket is described as an example, the controller 11 can also improve the operability by performing the same operation in a case in which, for example, the electronic device 1 is in a bag.
Another Example of Providing Second Proximity Sensor
Here, in the electronic device 1 according to the present embodiment, the second proximity sensor 18b is provided on the top surface. However, the second proximity sensor 18b may be provided on the bottom surface in another example (
For example, a user may cook while viewing a recipe displayed by the electronic device 1 that operates in the second mode (the kitchen mode) and is placed on the table as illustrated in
In addition to the operation for selecting the first gesture detection or the second gesture detection based on the orientation of the electronic device, when one of the first proximity sensor 18a and the second proximity sensor 18b satisfies a predetermined condition, the controller 11 performs an operation to select the other one of the proximity sensors. In the present embodiment, the predetermined condition is the elapse of a predetermined time period (e.g., 30 seconds) without a gesture input operation. Here, the predetermined condition is not limited thereto. For example, the predetermined condition may be the elapse of a predetermined time period regardless of the presence or absence of a gesture input operation. Alternatively, the predetermined condition may be detection of a user's particular gesture (e.g., a gesture of drawing a circle).
In the example of
After the predetermined time period, gesture detection is performed based on an output from the second proximity sensor 18b. The user can then perform an input operation by performing a gesture near the bottom surface of the electronic device 1 (within the detection range of the second proximity sensor 18b). That is, the user does not need to extend their hand to the vicinity of the front surface of the electronic device 1, and thus the user operability associated with a gesture is improved.
Here, in the example of
Also, in a case in which the electronic device 1 is on a table with the front surface facing down in
Still Another Example of Providing Second Proximity Sensor
In still another example, the second proximity sensor 18b may be provided on the rear surface (
In the examples illustrated in
In the example of
In the example of
Flowchart
The controller 11 acquires information regarding the acceleration detected by the acceleration sensor 21 (step S1). Then, the controller 11 determines the orientation of the electronic device 1 based on the information regarding the acceleration.
When the electronic device 1 is not in the vertical orientation (in a case of No in step S2), the controller 11 detects a gesture based on a value output from the first proximity sensor 18a (step S3).
When the predetermined time period has elapsed without a gesture input operation based on a value output from the first proximity sensor 18a (in a case of Yes in step S4), the controller 11 proceeds to step S6. That is, the controller 11 selects the second proximity sensor 18b over the first proximity sensor 18a.
When there is a gesture input operation based on a value output from the first proximity sensor 18a within the predetermined time period, or before the predetermined time period has elapsed without a gesture input operation (in a case of No in step S4), the controller 11 proceeds to step S5.
When there is an instruction to end a gesture input operation by, for example, cancellation of the second mode made by the user (in a case of Yes in step S5), the controller 11 ends the sequence of switching operations.
When there is no instruction to end the gesture input operation (in a case of No in step S5), the controller 11 returns to step S3.
When the electronic device 1 is in the vertical orientation (in a case of Yes in step S2), the controller 11 detects a gesture based on a value output from the second proximity sensor 18b (step S6).
When the predetermined time period has elapsed without a gesture input operation based on a value output from the second proximity sensor 18b (in a case of Yes in step S7), the controller 11 proceeds to step S3. That is, the controller 11 selects the first proximity sensor 18a over the second proximity sensor 18b.
When there is a gesture input operation based on a value output from the second proximity sensor 18b within the predetermined time period, or before the predetermined time period has elapsed without a gesture input operation (in a case of No in step S7), the controller 11 proceeds to step S8.
When there is an instruction to end a gesture input operation by, for example, cancellation of the second mode made by the user (in a case of Yes in step S8), the controller 11 ends the sequence of switching operations.
When there is no instruction to end the gesture input operation (in a case of No in step S8), the controller 11 returns to step S6.
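The switching loop of steps S1 through S8 can be summarized as a small state machine. The 30-second timeout is the example value from the text; the class and method names, and the injected clock value `now`, are assumptions made so the sketch stays self-contained and testable.

```python
class GestureSensorSelector:
    """Tracks which proximity sensor's output drives gesture detection and
    falls back to the other sensor after a no-input timeout."""
    TIMEOUT = 30.0  # seconds without a gesture input (example value)

    def __init__(self, vertical, now=0.0):
        # Step S2: the vertical orientation selects the second proximity sensor.
        self.active = "second" if vertical else "first"
        self.last_input = now

    def on_gesture_input(self, now):
        # No in S4/S7: a detected gesture resets the no-input timer.
        self.last_input = now

    def poll(self, now):
        # Yes in S4/S7: the timeout elapsed, so switch to the other sensor.
        if now - self.last_input >= self.TIMEOUT:
            self.active = "second" if self.active == "first" else "first"
            self.last_input = now
        return self.active
```

Starting in a non-vertical orientation, the first sensor is active; after 30 quiet seconds the selector hands detection over to the second sensor, and a later gesture input keeps the then-active sensor selected until the next timeout.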
As described above, the electronic device 1 according to the present embodiment includes the first proximity sensor 18a and the second proximity sensor 18b. The electronic device 1 further includes the controller 11 configured to select between the first gesture detection based on a value output from the first proximity sensor 18a and the second gesture detection based on a value output from the second proximity sensor 18b based on a state of the device (the electronic device 1). The state of the device may include, for example, whether the device is in the predetermined orientation (e.g., the vertical orientation). The state of the device may include whether the predetermined time period has elapsed without a gesture input operation. The first proximity sensor 18a or the second proximity sensor 18b is appropriately selected by the controller 11 based on the state of the device. This eliminates the necessity for the user to operate the electronic device 1 in accordance with the orientation thereof or to adjust the orientation of the electronic device 1. Thus, the electronic device 1 according to the present embodiment can improve the user operability in relation to a gesture.
Although the present disclosure has been described based on the figures and the embodiments, it should be appreciated that those skilled in the art may easily make variations or alterations based on the present disclosure. Accordingly, such variations and alterations are to be included in the scope of the present disclosure. For example, the functions included in each of the means or steps may be rearranged, avoiding any logical inconsistency, such that a plurality of means or steps are combined, or one means or step is subdivided.
In the above embodiment, the controller 11 detects a gesture by selecting one of an output from the first proximity sensor 18a and an output from the second proximity sensor 18b. That is, even when the output from one of the proximity sensors 18 is selected, the other proximity sensor 18 may remain in operation. Here, in order to reduce power consumption, the controller 11 may switch between the first proximity sensor 18a and the second proximity sensor 18b based on the state of the device. That is, the controller 11 may turn off (stop) the proximity sensor 18 that is not being used to detect a gesture. For example, when the first proximity sensor 18a is selected in step S3 of
In the electronic device 1 of the above embodiment, the first proximity sensor 18a is provided on the front surface and the second proximity sensor 18b is provided on the top surface. In another example, the second proximity sensor 18b is provided on the bottom surface or the rear surface, rather than the top surface. Here, the first proximity sensor 18a and the second proximity sensor 18b may be provided on any surfaces different from each other. For example, the electronic device 1 may include the first proximity sensor 18a provided on the rear surface and the second proximity sensor 18b provided on a side surface. In this case, when the predetermined time period has elapsed without a gesture input operation based on an output from one of the proximity sensors 18, the controller 11 may select an output from the other one of the proximity sensors 18. That is, when the predetermined time period has elapsed without a gesture input operation, the controller 11 may select the other one of the proximity sensors 18, regardless of the orientation of the electronic device 1.
In the above embodiment, the electronic device 1 includes two proximity sensors 18. Here, the electronic device 1 may include three or more proximity sensors 18. In this case, the electronic device 1 includes at least two surfaces having the proximity sensors 18 provided thereon. Each of the three or more proximity sensors 18 may be provided on a different surface. Alternatively, for example, two proximity sensors 18 may be provided on the front surface, and one proximity sensor 18 may be provided on the rear surface. Respective operations of the three or more proximity sensors 18 may be switched therebetween in a predetermined order when, for example, the predetermined time period has elapsed without a gesture input operation.
At least some outputs from the plurality of proximity sensors 18 provided to the electronic device 1 may be used by the controller 11 to determine the orientation of the electronic device 1. In
The controller 11 may use an output from the gyro sensor 24 to determine the orientation of the electronic device 1. In this case, the controller 11 can also recognize a change in the orientation of the electronic device 1. The controller 11 may use an output from the illuminance sensor 20 to determine a state of the device itself. For example, when the controller 11 recognizes that the illuminance of ambient light is lower than a threshold (e.g., 50 lux) based on an output from the illuminance sensor 20, the controller 11 may determine that the electronic device 1 is in a pocket or a bag.
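A minimal sketch of that illuminance heuristic follows, using the 50-lux example threshold from the text; the constant and function names are illustrative.

```python
POCKET_LUX_THRESHOLD = 50.0  # example threshold from the text

def likely_enclosed(illuminance_lux, threshold=POCKET_LUX_THRESHOLD):
    """Ambient light below the threshold suggests the device is enclosed,
    e.g., in a pocket or a bag, so a sensor on a covered surface may be
    deselected in favor of one that remains exposed."""
    return illuminance_lux < threshold
```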
Many aspects of the disclosure herein may be represented by a series of operations executed by a computer system or other hardware capable of executing a program instruction. The computer system or the other hardware includes, for example, a general-purpose computer, a PC (personal computer), a specialized computer, a workstation, a PCS (Personal Communications System, a personal mobile communication system), a mobile (cellular) phone, a mobile phone having a data processing function, an RFID receiver, a game machine, an electronic notepad, a laptop computer, a GPS (Global Positioning System) receiver, and other programmable data processing apparatuses. Note that in each embodiment the various operations or control methods are executed by a dedicated circuit implemented by a program instruction (software) (e.g., discrete logic gates interconnected to perform a specific function), a logical block, a program module and/or the like executed by at least one processor. The at least one processor for executing the logical block, the program module and/or the like includes, for example, at least one microprocessor, CPU (Central Processing Unit), ASIC (Application Specific Integrated Circuit), DSP (Digital Signal Processor), PLD (Programmable Logic Device), FPGA (Field Programmable Gate Array), a controller, a microcontroller, a microprocessor, an electronic device, and other apparatuses designed to be capable of executing the functions described herein, and/or a combination thereof. The embodiments presented herein are implemented by, for example, hardware, software, firmware, middleware, a microcode, or any combination thereof. The instruction may be a program code or a code segment for executing a necessary task. The instruction may be stored in a machine-readable non-transitory storage medium or in another medium. 
The code segment may represent any combination of a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a class or an instruction, and a data structure or a program statement. The code segment transmits/receives information, a data argument, a variable, and storage contents to/from another code segment or a hardware circuit. Thus, the code segment is connected to the other code segment or the hardware circuit.
The storage 16 used herein may be a computer-readable tangible carrier (medium) including a range of a solid-state memory, a magnetic disk, or an optical disk. Such media store an appropriate set of computer instructions, such as program modules for causing the processor to execute the techniques disclosed herein, or data structures. The computer-readable media includes: electrical connection with one or more wires; a magnetic disk storage; a magnetic cassette; a magnetic tape; another type of magnetic storage device; an optical storage device such as CD (Compact Disk), LD® (Laser Disk, LD is a registered trademark in Japan, other countries, or both), DVD® (Digital Versatile Disc, DVD is a registered trademark in Japan, other countries, or both), a Floppy® disk (Floppy is a registered trademark in Japan, other countries, or both), and a Blu-ray® disc (Blu-ray is a registered trademark in Japan, other countries, or both); a portable computer disk; RAM (Random Access Memory); ROM (Read-Only Memory); EPROM (Erasable Programmable Read-Only Memory); EEPROM (Electrically Erasable Programmable Read-Only Memory); a flash memory; other tangible storage media capable of storing information; and any combination of the above. The memory may be provided inside and/or outside a processor/processing unit. As used herein, the term "memory" refers to any type of a long-term memory, a short-term memory, a volatile memory, a nonvolatile memory, or other memories, and is not limited to a particular type of memory, a particular number of memories, or a particular medium to store information.
1 electronic device
11 controller
12 timer
13 camera
13a front-facing camera
13b rear-facing camera
14 display
15 microphone
16 storage
17 communication interface
18 proximity sensor
18a first proximity sensor
18b second proximity sensor
18c third proximity sensor
19 UV sensor
20 illuminance sensor
21 acceleration sensor
22 geomagnetic sensor
23 atmospheric pressure sensor
24 gyro sensor
25 speaker
180 light source infrared LED
181 lens
SU, SR, SD, SL photodiode
Number | Date | Country | Kind
---|---|---|---
2017-082324 | Apr 2017 | JP | national
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2018/014728 | 4/6/2018 | WO | 00