This relates generally to electronic devices, and, more particularly, to electronic devices with sensors.
Electronic devices such as laptop computers often include input devices such as keyboards, touch pads, and touch sensitive displays. Using these input devices, users can control the operation of the electronic devices.
It can be challenging to operate electronic devices using certain input devices. For example, some input devices are only configured to detect touch input on a two-dimensional surface. This may overly limit the types of input that a user can provide to an electronic device.
An electronic device may include air input sensors that gather air input from a user. The air input may be input from a user's fingers, a stylus, or other object in a volume of air near the electronic device. The air input sensors may include ultrasonic transducers that emit ultrasonic signals towards the volume of air and that detect the ultrasonic signals after the signals reflect from the external object. Using time-of-flight measurement techniques, control circuitry may track the movement of the external object in the volume of air near the electronic device.
A display may display visual feedback of the air input, such as shadows that preview where the input will be directed on the display. The volume of air in which input is detected may be divided into multiple input zones that trigger different actions from the electronic device. The ultrasonic transducers may include acoustic lenses to focus sound onto the transducers and/or to diverge an emitted sound signal across a given range of angles.
To enhance the ability of a user to operate an electronic device, the electronic device may be provided with air input sensors. The air input sensors may detect the presence of external objects such as a user's fingers, a stylus, or other object, without direct contact between the objects and the air input sensors. For example, air gestures with the user's hands and/or movements of a stylus or pen in the air above and/or adjacent to the electronic device may be used to control the electronic device.
An illustrative electronic device of the type that may be provided with air input sensors is shown in
As shown in
Input-output circuitry in device 10 such as input-output devices 12 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Input-output devices 12 may include buttons, joysticks, scrolling wheels, touch pads, key pads, keyboards, microphones, speakers, tone generators, vibrators and other haptic output devices, cameras and other sensors with digital image sensors (e.g., visible light cameras), light-emitting diodes and other status indicators, data ports, etc. A user can control the operation of device 10 by supplying commands through input-output devices 12 and may receive status information and other output from device 10 using the output resources of input-output devices 12.
Input-output devices 12 may include one or more displays such as display 14. Display 14 may be an organic light-emitting diode display, a liquid crystal display, or other display. Display 14 may be a touch screen display that includes a touch sensor for gathering touch input from a user, or display 14 may be insensitive to touch. A touch sensor for display 14 may be based on an array of capacitive touch sensor electrodes, acoustic touch sensor structures, resistive touch components, force-based touch sensor structures, a light-based touch sensor, or other suitable touch sensor arrangements.
Input-output devices 12 may also include sensors 18. Sensors 18 may include microphones, force sensors, touch sensors, temperature sensors, air pressure sensors, moisture sensors, ambient light sensors and other light-based sensors, magnetic sensors, sensors for measuring movement of device 10 along a surface (e.g., a light source such as a light-emitting diode and/or laser diode and a corresponding visible light or infrared camera that captures images of a portion of a work surface under device 10 as device 10 is moved across the work surface, device position (movement) sensors based on rotating wheels that track surface movements, etc.), image sensors such as visible-light and infrared cameras (e.g., digital image sensors and lenses for measuring three-dimensional hand gestures and other user gestures, etc.), grip sensors (e.g., capacitance-based grip sensors, optical grip sensors, etc.), and/or other sensors. If desired, sensors 18 may include sensors for measuring the orientation, movement, and/or position of device 10 such as inertial measurement units that include accelerometers, compasses, and/or gyroscopes. An accelerometer may be used to measure vibrations that pass to device 10 through a tabletop or other surface from a user's fingers.
As shown in
During operation, control circuitry 16 may use air input sensors 20 to gather user input in the air above, in front of, behind, or otherwise adjacent to electronic device 10. The user input may include three-dimensional gestures (e.g., hand gestures made in the air without necessarily contacting device 10), stylus input (e.g., stylus input in the air without necessarily contacting device 10), and/or other user input. Control circuitry 16 may, for example, use sensors 20 to gather information such as information on the position of a user's finger(s), stylus, or other object in a three-dimensional volume of air near device 10. This additional user input may help extend the input capabilities of device 10 and may thereby supplement the information gathered using buttons, touch sensors, trackpads, keyboards, pointing device movement sensors, and/or other input devices in device 10. User input may be used in manipulating visual objects on display 14 (e.g., display icons, etc.), may be used in providing drawing input to display 14, may be used in supplying device 10 with text, may be used in making menu selections, and/or may otherwise be used in operating device 10.
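As a concrete illustration of how position information from air input sensors 20 might be mapped onto display 14 for cursor control or object manipulation, the following sketch converts an air-input coordinate into a display pixel coordinate. The region dimensions, display resolution, and function name are illustrative assumptions rather than details taken from the description above.

```python
# Illustrative sketch (not from the description above): mapping a tracked finger
# position in a volume of air near device 10 to a position on display 14.
REGION_WIDTH_M = 0.30    # assumed x extent of the tracked volume of air
REGION_DEPTH_M = 0.20    # assumed y extent of the tracked volume of air
DISPLAY_W_PX = 2560      # assumed display resolution
DISPLAY_H_PX = 1600


def map_to_display(x_m, y_m):
    """Convert air-input coordinates (meters) to display 14 pixel coordinates."""
    u = min(max(x_m / REGION_WIDTH_M, 0.0), 1.0)   # normalize and clamp to [0, 1]
    v = min(max(y_m / REGION_DEPTH_M, 0.0), 1.0)
    return round(u * (DISPLAY_W_PX - 1)), round(v * (DISPLAY_H_PX - 1))


print(map_to_display(0.15, 0.10))  # center of the air input region -> near display center
```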
A perspective view of an illustrative electronic device that includes air input sensors 20 is shown in
As shown in
Air input sensors 20 may be placed in any suitable location on device 10 depending on the desired input region. In the example of
Air input sensors 20 may be acoustic sensors (e.g., ultrasonic sensors), capacitive sensors, radio-frequency sensors, optical sensors such as visible light image sensors, infrared image sensors, proximity sensors formed from light-emitting diodes and photodetectors, three-dimensional camera systems such as depth sensors (e.g., structured light sensors and/or depth sensors based on stereo imaging devices that capture three-dimensional images), self-mixing sensors, light detection and ranging (lidar) sensors that gather time-of-flight measurements (e.g., time-of-flight cameras), a combination of two or more of these sensors, and/or other sensors for measuring finger position and/or the position of other objects 40 in input region 32.
As an example, air input sensors 20 may be optical sensor elements that each include a light-emitting diode, laser, or other light emitter (e.g., an infrared light-emitting device) and that include a light detector (e.g., an infrared photodetector). The amount of emitted infrared light that is detected by an infrared photodetector after reflecting from an external object may be used to measure the location of the external object (e.g., a finger) and thereby detect air gestures and/or other air input. Another illustrative arrangement involves using ultrasonic sound emitters to emit ultrasonic signals and using ultrasonic sound detectors (e.g., microphones) to detect reflected acoustic signals and thereby gather information on air gestures and/or other air input. If desired, location measurements may be gathered using radio-frequency sensor elements (radio-frequency emitters and corresponding radio-frequency receivers). Other air input monitoring sensors and/or combinations of these sensors may also be used in forming air input sensors 20. Configurations for air input sensors 20 that use ultrasonic sound emitters and detectors to detect air gestures and/or other air input are sometimes described herein as an illustrative example.
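For the optical variant described above, the relationship between reflected infrared intensity and object distance might be approximated as follows. This is a minimal sketch that assumes a simple inverse-square falloff calibrated at a reference distance; real reflectivity and geometry effects are ignored, and the function and parameter names are illustrative.

```python
# Illustrative sketch: coarse distance estimate from reflected infrared intensity,
# assuming inverse-square falloff calibrated at a known reference distance.
import math


def estimate_distance(detected_intensity, reference_intensity, reference_distance_m=0.10):
    """Distance estimate (meters) relative to a calibration measurement."""
    if detected_intensity <= 0:
        return float("inf")   # nothing detected: treat the object as out of range
    return reference_distance_m * math.sqrt(reference_intensity / detected_intensity)


# Example: half the calibrated intensity implies roughly sqrt(2) times the distance.
print(estimate_distance(0.5, 1.0))  # ~0.141 m
```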
During operation, one or more air input sensors 20 may emit signals such as signal 46 (e.g., an acoustic signal, an optical signal, a radio-frequency signal, and/or other suitable signal) that interacts with object 40 in input region 32. One or more air input sensors 20 may detect reflected signal 46R after it reflects off of object 40 in input region 32. Based on the emitted and detected signals, control circuitry 16 may determine the position (and may therefore track the movement) of object 40 relative to sensors 20. With this arrangement, a user may use input region 32 as a standalone input region and/or as a supplemental input region that supplements other input devices in device 10 such as touch-sensitive display 14. This allows a user to effectively provide input that might be difficult to provide directly to the body of device 10 (e.g., directly to buttons or a touch sensor on housing 22).
As an example, a user may move object 40 in input region 32 to move a cursor on a display such as display 14, to select an item in a list, to highlight an item, to drag and drop items, to launch an item, to provide drawing input (e.g., to draw a line or provide other drawing input), and/or to otherwise interact with device 10. User input in the air (e.g., air input) may include finger taps in the air (single taps, double taps, triple taps, etc.), gestures in the air formed from lingering finger positions (hovers, persistent finger presence in a particular location in region 32), single-finger swipes in the air, multi-finger swipes in the air, pinch-to-zoom gestures in the air and other multi-touch finger gestures in the air, hand gestures in the air, other two-dimensional and three-dimensional gestures in the air (e.g., waving a user's hand and fingers in the air in region 32, etc.), stylus movements in the air, pen movements in the air, and/or any other suitable user input from a user body part, a stylus controlled by a user, and/or other external objects.
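To make the gesture list above concrete, the following sketch distinguishes two of the simpler air gestures (an air tap versus a single-finger swipe) from a short history of tracked positions. The thresholds and the classification logic are assumptions for illustration only.

```python
# Illustrative air-gesture classifier over tracked (x, y, z) samples of object 40.
import numpy as np

TAP_Z_DIP_M = 0.02        # assumed: a ~2 cm downward-and-back motion counts as a tap
SWIPE_DISTANCE_M = 0.05   # assumed: >= 5 cm of lateral travel counts as a swipe


def classify_gesture(track):
    """Classify an (N, 3) array of tracked x, y, z samples as 'tap', 'swipe', or 'none'."""
    track = np.asarray(track, dtype=float)
    lateral_travel = np.linalg.norm(track[-1, :2] - track[0, :2])
    z_dip = track[:, 2].max() - track[:, 2].min()
    returned_to_start = abs(track[-1, 2] - track[0, 2]) < TAP_Z_DIP_M / 2
    if lateral_travel >= SWIPE_DISTANCE_M:
        return "swipe"
    if z_dip >= TAP_Z_DIP_M and returned_to_start:
        return "tap"
    return "none"


# Example: a finger dips ~3 cm along z and comes back with little lateral motion.
samples = [(0.10, 0.10, 0.12), (0.10, 0.10, 0.09), (0.10, 0.11, 0.12)]
print(classify_gesture(samples))  # "tap"
```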
Air input sensors 20 may be formed in one or more locations on upper housing 22A and/or one or more locations on lower housing 22B. In the example of
As shown in
Visual elements 34 may be any suitable display elements (e.g., may have any suitable shape, color, shading, etc.). If desired, visual elements 34 may change based on the movement of object 40. For example, movements along the x-y plane of
As shown in
Visual element 34 may be any suitable display element (e.g., may have any suitable shape, color, shading, etc.). If desired, visual element 34 may change based on the movement of object 40. For example, movements along the x-y plane of
Air input of the type shown in
If desired, different air gestures may be used in interacting with options of interest. For example, a finger hovering over a particular location on display 14 may be used to highlight a desired option on display 14, whereas a continued presence (dwell) over that location may be used to activate the highlighted option. This is merely illustrative, however. If desired, an option may be selected by a hover followed by a swipe, an air gesture with two (or more) fingers can be used to select an option, or an air gesture such as an air gesture swipe may be used to move an on-screen object. If desired, a hovering gesture may be used to highlight a desired option followed by a tap or other touch event to select the highlighted option.
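The hover-then-dwell interaction described above might be implemented along the following lines. The radius and timing thresholds are assumed values, and the detector interface is illustrative rather than an implementation taken from the description.

```python
# Illustrative hover/dwell detector: hovering highlights an option, a longer dwell activates it.
import math
import time

HOVER_RADIUS_M = 0.01    # assumed: finger must stay within 1 cm to count as hovering
HIGHLIGHT_AFTER_S = 0.3  # assumed: hover time before the option is highlighted
ACTIVATE_AFTER_S = 1.0   # assumed: continued dwell time before the option is activated


class HoverDwellDetector:
    def __init__(self):
        self.anchor = None       # (x, y) where the current hover started
        self.start_time = None

    def update(self, x, y, now=None):
        """Feed the latest finger position; returns 'highlight', 'activate', or None."""
        now = time.monotonic() if now is None else now
        if self.anchor is None or math.dist((x, y), self.anchor) > HOVER_RADIUS_M:
            self.anchor = (x, y)     # finger moved: restart the dwell timer
            self.start_time = now
            return None
        held = now - self.start_time
        if held >= ACTIVATE_AFTER_S:
            return "activate"
        if held >= HIGHLIGHT_AFTER_S:
            return "highlight"
        return None
```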
As an example, when object 40 is in input zone 32C, control circuitry 16 may perform a first set of actions in response to movements of object 40 such as providing visual feedback on display 14 indicating where the air input is mapping to on display 14 but without actually selecting or manipulating any objects on display 14. When object 40 is in input zone 32B, control circuitry 16 may perform a second set of actions in response to movements of object 40 such as providing visual feedback on display 14 indicating that an object on display 14 has been selected, that a line is being drawn, and/or that other objects on display 14 are being manipulated with the air input being detected in zone 32B. When object 40 is in input zone 32A, control circuitry 16 may perform a third set of actions in response to movements of object 40 such as providing visual feedback on display 14 that includes additional options to interact with items on display 14. This is merely illustrative, however. In general, control circuitry 16 may take any suitable action in response to air input in zones 32A, 32B, and 32C detected by air input sensors 20. If desired, input region 32 may instead or additionally be divided into different zones along the x-axis and/or y-axis of
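One possible realization of the zone-based behavior described above is sketched below. The zone boundaries along the z-axis and the returned action names are assumptions for illustration.

```python
# Assumed zone boundaries along the z-axis of input region 32 (meters above sensors 20);
# the specific distances and the actions below are illustrative, not from the description.
ZONES = (
    ("32A", 0.00, 0.05),   # assumed closest zone: present additional options
    ("32B", 0.05, 0.15),   # assumed middle zone: select / draw / manipulate
    ("32C", 0.15, 0.30),   # assumed farthest zone: preview-only visual feedback
)


def zone_for_height(z_m):
    """Return the name of the input zone containing the object, or None."""
    for name, lower, upper in ZONES:
        if lower <= z_m < upper:
            return name
    return None


def handle_air_input(x_m, y_m, z_m):
    """Choose an action for control circuitry 16 based on the object's zone."""
    zone = zone_for_height(z_m)
    if zone == "32C":
        return ("preview", x_m, y_m)       # show where the input maps on display 14
    if zone == "32B":
        return ("manipulate", x_m, y_m)    # select, draw, or drag on-screen objects
    if zone == "32A":
        return ("show_options", x_m, y_m)  # present additional interaction options
    return ("ignore", x_m, y_m)


print(handle_air_input(0.12, 0.08, 0.20))  # object in assumed zone 32C -> preview feedback
```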
During operation, one or more of ultrasonic transducers 42 may be used to emit ultrasonic signals 46. The ultrasonic signals 46 may reflect off of object 40A (e.g., the user's hand, one or more fingers, a stylus, a pen, a paintbrush, and/or any other suitable object) in input region 32. One or more of ultrasonic transducers 42 may be used to detect ultrasonic signals 46R after the signals reflect off of object 40A in input region 32. Using time-of-flight measurement techniques, control circuitry 16 may determine the time that it takes for the emitted signal 46 to reflect back from object 40A, which may in turn be used to determine the position of object 40A in three-dimensional space (e.g., control circuitry 16 may determine the x, y, and z coordinates of object 40A at a given time based on emitted signals 46, based on reflected signals 46R, and based on the known positions of transducers 42 relative to one another). As object 40A moves within region 32, control circuitry 16 may continue to monitor changes in the position of object 40A using transducers 42.
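As an illustration of the time-of-flight approach described above, the following sketch converts round-trip echo times into one-way ranges and then estimates the position of object 40A by least-squares multilateration over the known transducer positions. The transducer layout, the co-located emitter/receiver assumption, and the choice of a least-squares solver are assumptions; the description above does not prescribe a particular solver.

```python
# Illustrative sketch: estimating the position of object 40A from round-trip
# ultrasonic echo times at transducers 42 with known positions.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, nominal value for air


def ranges_from_time_of_flight(round_trip_times_s):
    """Convert round-trip echo times (seconds) to one-way ranges (meters)."""
    return 0.5 * SPEED_OF_SOUND * np.asarray(round_trip_times_s, dtype=float)


def locate_object(transducer_positions_m, ranges_m):
    """Least-squares multilateration using four or more non-coplanar transducers.

    Subtracting the first range equation from the others linearizes the problem:
    2*(p_i - p_0) . x = (r_0^2 - r_i^2) + (|p_i|^2 - |p_0|^2).
    """
    p = np.asarray(transducer_positions_m, dtype=float)
    r = np.asarray(ranges_m, dtype=float)
    A = 2.0 * (p[1:] - p[0])
    b = (r[0] ** 2 - r[1:] ** 2) + (np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x  # estimated (x, y, z) of object 40A


# Self-consistent example: echo times synthesized for an object at (0.03, 0.04, 0.10).
true_position = np.array([0.03, 0.04, 0.10])
transducers = np.array([(0.0, 0.0, 0.0), (0.06, 0.0, 0.0),
                        (0.0, 0.06, 0.0), (0.03, 0.03, 0.02)])
echo_times = 2.0 * np.linalg.norm(transducers - true_position, axis=1) / SPEED_OF_SOUND
print(locate_object(transducers, ranges_from_time_of_flight(echo_times)))
# -> approximately [0.03 0.04 0.10]
```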
If desired, the same transducer 42 may be used to emit and detect signals 46 (e.g., one or more of transducers 42 may be operable in a transmitting mode and a receiving mode). For example, transducers 42 may emit ultrasonic signal pulses during an emitting period and may subsequently be reconfigured as microphones during a listening period. Control circuitry 16 may cycle transducers 42 back and forth between emitting mode and listening mode (i.e., receiving mode). This is merely illustrative, however. If desired, some transducers 42 may be designated emitting transducers while other transducers 42 may be designated receiving transducers. During emitting mode, transducers 42 may emit ultrasonic signals at one or more different frequencies (e.g., ranging from 20 kHz to 340 kHz or other suitable frequencies). Transducers 42 may be piezoelectric micromachined ultrasonic transducers, capacitive micromachined ultrasonic transducers, and/or other suitable ultrasonic transducers. Transducers 42 may be fixed or may be steerable.
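The emit/listen cycling described above might be organized as follows. The transducer driver interface (set_mode, emit_pulse, read_echo) is hypothetical, and the listening window is sized from an assumed maximum depth of input region 32.

```python
# Illustrative emit/listen cycle for a single transducer 42; the driver calls are hypothetical.
import time

SPEED_OF_SOUND = 343.0  # m/s
MAX_RANGE_M = 0.5       # assumed maximum depth of input region 32


def ping_and_listen(transducer):
    """Run one emit/listen cycle; return the round-trip echo time or None."""
    transducer.set_mode("emit")                    # hypothetical driver call
    transducer.emit_pulse(frequency_hz=80_000)     # within the 20-340 kHz range above
    transducer.set_mode("listen")                  # reconfigure as a microphone
    timeout = 2.0 * MAX_RANGE_M / SPEED_OF_SOUND   # longest expected round trip
    start = time.monotonic()
    while time.monotonic() - start < timeout:
        echo = transducer.read_echo()              # hypothetical driver call
        if echo is not None:
            return echo.round_trip_time
    return None
```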
If desired, control circuitry 16 may use multiple transducers 42 to detect air input from multiple objects 40 in input region 32 (e.g., simultaneously). As shown in
Distance A2 may be determined as a function of angle Y or angle X (e.g., A2=A1 sin(X) or A2=A1 cos(Y)). Distance A2 may also be determined as a function of the phase difference between the signal received by transducer 42-1 and the signal received by transducer 42-2 (e.g., A2=(Δϕλ)/(2π), where Δϕ is the phase difference between the signal received by transducer 42-1 and the signal received by transducer 42-2 and λ is the wavelength of the received signal 46R). Control circuitry 16 may include phase measurement circuitry coupled to each transducer 42 to measure the phase of the received signals and identify a difference in the phases (Δϕ). The two expressions for A2 may be set equal to each other (e.g., A1 sin(X)=(Δϕλ)/(2π)) and rearranged to solve for angle X (e.g., X=arcsin((Δϕλ)/(2πA1))) or to solve for angle Y. As such, the angle of arrival may be determined (e.g., by control circuitry 16) based on the known (predetermined) distance A1 between transducer 42-1 and transducer 42-2, the detected (measured) phase difference between the signal received by transducer 42-1 and the signal received by transducer 42-2, and the known wavelength or frequency of the received signals 46R. The wavelength λ of signal 46R may be equal to the speed of sound in air divided by the frequency of signal 46R.
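The angle-of-arrival relationship above, X=arcsin((Δϕλ)/(2πA1)) with λ equal to the speed of sound divided by the signal frequency, can be evaluated directly. The example frequency, transducer spacing, and phase difference below are illustrative values.

```python
# Illustrative evaluation of the angle-of-arrival expression X = arcsin((dphi * lambda) / (2*pi*A1)).
import math

SPEED_OF_SOUND = 343.0  # m/s


def angle_of_arrival(phase_difference_rad, frequency_hz, transducer_spacing_m):
    """Angle X (radians) of reflected signal 46R relative to the transducer pair."""
    wavelength = SPEED_OF_SOUND / frequency_hz
    path_difference = (phase_difference_rad * wavelength) / (2.0 * math.pi)  # distance A2
    ratio = path_difference / transducer_spacing_m
    if abs(ratio) > 1.0:
        raise ValueError("inconsistent measurement: |A2| cannot exceed A1")
    return math.asin(ratio)


# Example: 80 kHz signal, transducers 42-1 and 42-2 spaced 4 mm apart (A1),
# and a measured phase difference of 1.0 radian.
print(math.degrees(angle_of_arrival(1.0, 80_000.0, 0.004)))  # ~9.8 degrees
```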
The speed of sound is dependent upon characteristics of the medium that the sound is traveling through. If desired, control circuitry 16 may take into account atmospheric conditions such as altitude (e.g., relative to sea level) and/or air pressure when determining the speed of sound. For example, control circuitry 16 may obtain air pressure information and/or altitude information from one or more sensors in device 10, from one or more sensors in an external electronic device, and/or from an online database. Control circuitry 16 may determine the speed of sound for time-of-flight measurement purposes based on the air pressure information and/or altitude information. This is merely illustrative, however. If desired, control circuitry 16 may use a predetermined value for the speed of sound (e.g., the speed of sound in air).
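As one example of adjusting the speed of sound for altitude, the sketch below uses the International Standard Atmosphere temperature lapse rate. The use of this particular model is an assumption; the description above only states that altitude and/or air pressure information may be used.

```python
# Illustrative sketch: deriving a speed-of-sound value for time-of-flight measurements
# from altitude, using the International Standard Atmosphere lapse rate (an assumption).
import math

GAMMA = 1.4           # adiabatic index of air
R_AIR = 287.05        # specific gas constant of dry air, J/(kg*K)
SEA_LEVEL_T = 288.15  # K (15 C)
LAPSE_RATE = 0.0065   # K per meter, valid in the troposphere (below ~11 km)


def speed_of_sound_at_altitude(altitude_m):
    """Approximate speed of sound (m/s) at the given altitude above sea level."""
    temperature = SEA_LEVEL_T - LAPSE_RATE * altitude_m
    return math.sqrt(GAMMA * R_AIR * temperature)


print(speed_of_sound_at_altitude(0.0))     # ~340 m/s at sea level
print(speed_of_sound_at_altitude(1600.0))  # ~334 m/s at roughly 1.6 km altitude
```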
If desired, transducers 42 may use an acoustic lens to diverge acoustic waves 46 emitted by transducer 42 over a range of angles and/or to focus acoustic waves 46R received over a range of angles onto transducer 42. This type of arrangement is illustrated in
In the example of
Transducer 42T may emit signals 46 initially directed vertically parallel to the z-axis of
In the example of
Device 10 may gather and use personally identifiable information. It is well understood that the use of personally identifiable information should follow privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining the privacy of users. In particular, personally identifiable information data should be managed and handled so as to minimize risks of unintentional or unauthorized access or use, and the nature of authorized use should be clearly indicated to users.
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.