Electronic devices such as smartphones, tablet computers, and wearables typically include a metal and/or plastic housing to provide protection and structure to the devices. The housing often includes openings to accommodate physical buttons that are utilized to interface with the device. However, there is a limit to the number and types of physical buttons that are able to be included in some devices due to physical, structural, and usability constraints. For example, physical buttons may consume too much valuable internal device space and provide pathways where water and dirt may enter a device to cause damage. Consequently, other mechanisms for allowing a user to interact with electronic devices are desired.
Various embodiments of the invention are disclosed in the following detailed description and the accompanying drawings.
The invention can be implemented in numerous ways, including as a process; an apparatus; a system; a composition of matter; a computer program product embodied on a computer readable storage medium; and/or a processor, such as a processor configured to execute instructions stored on and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless stated otherwise, a component such as a processor or a memory described as being configured to perform a task may be implemented as a general component that is temporarily configured to perform the task at a given time or a specific component that is manufactured to perform the task. As used herein, the term ‘processor’ refers to one or more devices, circuits, and/or processing cores configured to process data, such as computer program instructions.
A detailed description of one or more embodiments of the invention is provided below along with accompanying figures that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the purpose of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
The housing for electronic devices provides structure and protection to the components therein and typically includes openings to accommodate physical buttons used to control the device. However, such physical buttons consume valuable device space, provide pathways for contaminants to enter the device, and have fixed locations. Consequently, other mechanisms for interfacing with an electronic device such as a mobile phone (e.g. a smartphone), a tablet, and/or a wearable are desired.
Touch surfaces are increasingly utilized in displays of computer devices. Such touch surfaces can be used to interact with the device. For example, the touch surface may be part of a display for a cell phone or smart phone, a wearable, a tablet, a laptop, a television, etc. Various technologies have traditionally been used to detect a touch input on such a display. For example, capacitive and resistive touch detection technology may be used. Using resistive touch technology, often a glass panel is coated with multiple conductive layers that register touches when physical pressure is applied to the layers to force the layers to make physical contact. Using capacitive touch technology, often a glass panel is coated with material that can hold an electrical charge sensitive to a human finger. By detecting the change in the electrical charge due to a touch, a touch location can be detected. However, with resistive and capacitive touch detection technologies, the glass screen is required to be coated with a material that reduces the clarity of the glass screen. Additionally, because the entire glass screen is required to be coated with a material, manufacturing and component costs can become prohibitively expensive as larger screens are desired. Capacitive touch surface technologies may also face significant issues in use with metal (i.e. conductive) and/or curved surfaces. This limitation may restrict capacitive touch surfaces to smaller, flat displays. Thus, traditional touch surfaces may be limited in utility.
Electrical components can be used to detect a physical disturbance (e.g., strain, force, pressure, vibration, etc.). Such a component may detect expansion of or pressure on a particular region on a device and provide an output signal in response. Such components may be utilized in devices to detect a touch. For example, a component mounted on a portion of the smartphone may detect an expansion or flexing of the portion to which the component is mounted and provide an output signal. The output signal from the component can be considered to indicate a purposeful touch (a touch input) of the smartphone by the user. Such electrical components may not be limited to the display of the electronic device.
However, a smartphone or other device may undergo flexing and/or localized pressure increases for reasons not related to a user's touch. Thus, purposeful touches by a user (touch inputs) are desired to be distinguished from other physical input, such as bending of the device, and from environmental factors that can affect the characteristics of the device, such as temperature. In some embodiments, therefore, a touch input includes touches by the user, but excludes bending and/or temperature effects. For example, a swipe or press of a particular region of a mobile phone is desired to be detected as a touch input, while a user sitting on the phone or a rapid change in temperature of the mobile phone should not be determined to be a touch input.
A system that may provide user interface elements based on touch inputs is described. The system includes sensors and at least one processor. The sensors are configured to sense force. In some embodiments, the sensors include touch sensor(s) and/or force sensor(s). The processor receives force measurements from the sensors and identifies touch locations based on the force measurements. In some embodiments, the touch locations include a device edge (e.g. a housing) and/or a device back opposite to a display. The processor is further configured to provide at least one user interface element based on the touch locations. To provide the user interface element(s), the processor may be configured to determine a location and an orientation on a display for each of the user interface element(s). In some such embodiments, a context is also determined. For example, the context might be that the electronic device is being held in portrait or landscape mode, that the electronic device is in the user's left hand or right hand, that the user is gaming, or another context. The processor may also generate haptic feedback based on the touch locations. For example, the appropriate haptics actuators may be driven to provide a haptic response at one or more of the touch locations.
In some embodiments, a touch location corresponds to a force measurement from at least one of the sensors. The force measurement has a first magnitude. In such embodiments, the processor is further configured to update the user interface element(s) and/or generate haptic feedback based upon an additional force measurement corresponding to the touch location. The additional force measurement has a second magnitude greater than the first magnitude. In some embodiments, the second magnitude exceeds an absolute threshold and/or a relative threshold. For example, the relative threshold may be equal to the first magnitude added to a first threshold.
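The following is a minimal sketch (in Python) of how the absolute and relative thresholds described above might be combined; the constant names and values are illustrative assumptions and are not taken from this description.

```python
# Minimal sketch, assuming illustrative threshold values, of deciding whether a later,
# larger force at an existing touch location should trigger a user interface update
# and/or haptic feedback.

ABSOLUTE_PRESS_THRESHOLD = 2.0   # assumed units (e.g. newtons)
RELATIVE_MARGIN = 0.5            # the "first threshold" added to the first magnitude


def should_update(first_magnitude: float, second_magnitude: float) -> bool:
    """Return True if a second force measurement at the same touch location exceeds
    the absolute threshold and/or the relative threshold (the first magnitude added
    to a first threshold)."""
    exceeds_absolute = second_magnitude > ABSOLUTE_PRESS_THRESHOLD
    exceeds_relative = second_magnitude > first_magnitude + RELATIVE_MARGIN
    return second_magnitude > first_magnitude and (exceeds_absolute or exceeds_relative)


# A light touch (0.8) followed by a firmer press (1.5) exceeds the relative threshold.
assert should_update(0.8, 1.5)
```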
In some embodiments, the system also includes an orientation sensor, such as one or more accelerometers, for sensing a rotation of a display. In such embodiments, the processor may be configured to update the user interface element(s) for the rotation of a display only if the orientation sensor senses the rotation and the touch locations change.
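A minimal sketch of this gating condition (hypothetical function and argument names chosen for illustration only):

```python
def should_rotate_ui(rotation_sensed: bool,
                     previous_touch_locations: frozenset,
                     current_touch_locations: frozenset) -> bool:
    """Rotate the displayed user interface elements only when the orientation sensor
    reports a rotation AND the touch locations have changed; a reported rotation while
    the user's grip is unchanged is ignored."""
    return rotation_sensed and (previous_touch_locations != current_touch_locations)
```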
In some embodiments, rather than individually attaching separately manufactured piezoresistive elements to a backing material to produce the piezoresistive bridge structure, the piezoresistive bridge structure is manufactured together as a single integrated circuit component and included in an application-specific integrated circuit (ASIC) chip. For example, the four piezoresistive elements and appropriate connections between them are fabricated on the same silicon wafer/substrate using a photolithography microfabrication process. In an alternative embodiment, the piezoresistive bridge structure is built using a microelectromechanical systems (MEMS) process. The piezoresistive elements may be any mobility sensitive/dependent element (e.g., a resistor, a transistor, etc.).
Each strain sensor 202, 204, 212, 214, 222, 224, 232, 234, 242 and 244 is labeled with a + sign indicating the directions of strain sensed. Thus, strain sensors 202, 204, 212, 214, 222, 224, 232, 234 and 244 sense strains (expansion or contraction) in the x and y directions. However, strain sensors at the edges of integrated sensor 200 may be considered to sense strains in a single direction. This is because there is no expansion or contraction beyond the edge of integrated sensor 200. Thus, strain sensors 202 and 204 and strain sensors 222 and 224 measure strains parallel to the y-axis, while strain sensors 212 and 214 and strain sensors 232 and 234 sense strains parallel to the x-axis. Strain sensor 242 has been configured in a different direction. Strain sensor 242 measures strains in the xy direction (parallel to the lines x=y or x=−y). For example, strain sensor 242 may be used to sense twists of integrated sensor 200. In some embodiments, the output of strain sensor 242 is small or negligible in the absence of a twist to integrated sensor 200 or the surface to which integrated sensor 200 is mounted.
Thus, integrated sensor 200 obtains ten measurements of strain: four measurements of strain in the y direction from strain sensors 202, 204, 222 and 224; four measurements of strain in the x direction from sensors 212, 214, 232 and 234; one measurement of strain in the xy direction from sensor 242; and one measurement of strain from sensor 244. Although ten strain measurements are received from strain sensors 202, 204, 212, 214, 222, 224, 232, 234, 242 and 244, six measurements may be considered independent. Strain sensors 202, 204, 212, 214, 222, 224, 232, and 234 on the edges may be considered to provide four independent measurements of strain. In other embodiments, a different number of strain sensors and/or different locations for strain sensors may be used in integrated sensor 200.
Integrated sensor 200 also includes temperature sensor 250 in some embodiments. Temperature sensor 250 provides an onboard measurement of the temperatures to which strain sensors 202, 204, 212, 214, 222, 224, 232, 234, 242 and 244 are exposed. Thus, temperature sensor 250 may be used to account for drift and other temperature artifacts that may be present in strain data. Integrated sensor 200 may be used in a device for detecting touch inputs.
System 300 is connected with application system 302 and touch surface 320, which may be considered part of the device with which system 300 is used. System 300 includes touch detector/processor(s) 310, force sensors 312 and 314, transmitter 330 and touch sensors 332 and 334. Also shown are optional haptics generator 350, haptics actuators 352 and 354, and additional sensor(s) 360. Although indicated as part of touch surface 320, haptics actuators 352 and 354 may be located elsewhere on the device incorporating system 300. Additional sensor(s) 360 may include orientation sensors such as accelerometer(s), gyroscope(s) and/or other sensors generally included in a device, such as a smartphone. Although shown as not located on touch surface 320, additional sensor(s) 360 may be at or near touch surface 320. Although shown as coupled with touch detector 310, in some embodiments, sensor(s) 360 and/or haptics generator 350 are simply coupled with application system 302. Haptics generator 350 receives signals from touch detector/processor(s) 310 and/or application system 302 and drives haptics actuator(s) 352 and/or 354 to provide haptic feedback for a user. For simplicity, only some portions of system 300 are shown. For example, only two haptics actuators 352 and 354 are shown, but more may be present.
Touch surface 320 is a surface on which touch inputs are desired to be detected. For example, touch surface 320 may include the display of a mobile phone, the touch screen of a laptop, a side or an edge of a smartphone, a back of a smartphone (i.e. opposite from the display), a portion of the frame of the device or other surface. Thus, touch surface 320 is not limited to a display. Force sensors 312 and 314 may be integrated sensors including multiple strain sensors, such as integrated sensor 200. In other embodiments, force sensors 312 and 314 may be individual strain sensors. Other force sensors may also be utilized. Although two force sensors 312 and 314 are shown, another number is typically present. Touch sensors 332 and 334 may be piezoelectric sensors. Transmitter 330 may also be a piezoelectric device. In some embodiments, touch sensors 332 and 334 and transmitter 330 are interchangeable. Touch sensors 332 and 334 may be considered receivers of an ultrasonic wave transmitted by transmitter 330. In other cases, touch sensor 332 may function as a transmitter, while transmitter 330 and touch sensor 334 function as receivers. Thus, a transmitter-receiver pair may be viewed as a touch sensor in some embodiments. Multiple receivers share a transmitter in some embodiments. Although only one transmitter 330 is shown for simplicity, multiple transmitters may be used. Similarly, although two touch sensors 332 and 334 are shown, another number may be used. Application system 302 may include the operating system for the device in which system 300 is used.
In some embodiments, touch detector/processor(s) 310 is integrated in an integrated circuit chip. Touch detector 310 includes one or more microprocessors that process instructions and/or calculations that can be used to program software/firmware and/or process data for touch detector 310. In some embodiments, touch detector 310 includes a memory coupled to the microprocessor and configured to provide the microprocessor with instructions. Other components such as digital signal processors may also be used.
Touch detector 310 receives input from force sensors 312 and 314, touch sensors 332 and 334 and, in some embodiments, transmitter 330. For example, touch detector 310 receives force (e.g. strain) measurements from force sensors 312 and 314 and touch (e.g. piezoelectric voltage) measurements from touch sensors 332 and 334. Although termed “touch” measurements, such measurements may also be considered a measure of force. Touch detector 310 may also receive temperature measurements from onboard temperature sensors for force sensors 312 and/or 314, such as temperature sensor 250. Touch detector 310 may also obtain temperature data from one or more separate, dedicated temperature sensor(s). Touch detector 310 may provide signals and/or power to force sensors 312 and 314, touch sensors 332 and 334 and transmitter 330. For example, touch detector 310 may provide the input voltage(s) to force sensors 312 and 314, voltage or current to touch sensor(s) 332 and 334 and a signal to transmitter 330. Touch detector 310 utilizes the force (strain) measurements and/or touch (piezoelectric) measurements to determine whether a user has provided a touch input to touch surface 320. If a touch input is detected, touch detector 310 provides this information to application system 302 and/or haptics generator 350 for use.
Signals provided from force sensors 312 and 314 are received by touch detector 310 and may be conditioned for further processing. For example, touch detector 310 receives the strain measurements output by force sensors 312 and 314 and may utilize the signals to track the baseline signals (e.g. voltage, strain, or force) for force sensors 312 and 314. Strains due to temperature may also be accounted for by touch detector 310 using signals from a temperature sensor, such as temperature sensor 250. Thus, touch detector 310 may obtain absolute forces (the actual force on touch surface 320) from force sensors 312 and 314 by accounting for temperature. In some embodiments, a model of strain versus temperature for force sensors 312 and 314 is used. In some embodiments, a model of voltage or absolute force versus temperature may be utilized to correct force measurements from force sensors 312 and 314 for temperature.
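As a sketch of how such a temperature model might be applied (Python; the linear model form, coefficients, and function names are assumptions chosen only for illustration, not the specific model used):

```python
def temperature_compensated_strain(raw_strain: float,
                                   temperature_c: float,
                                   drift_per_degree: float = 1.2e-6,
                                   reference_temperature_c: float = 25.0) -> float:
    """Subtract a modeled (here, linear) thermal drift from a raw strain reading."""
    thermal_drift = drift_per_degree * (temperature_c - reference_temperature_c)
    return raw_strain - thermal_drift


def absolute_force(raw_strain: float, temperature_c: float,
                   baseline_strain: float,
                   newtons_per_unit_strain: float = 5.0e4) -> float:
    """Convert a temperature-corrected strain, measured relative to the tracked
    baseline, into an absolute force on the touch surface. The conversion factor
    is an assumed calibration constant."""
    corrected = temperature_compensated_strain(raw_strain, temperature_c)
    return (corrected - baseline_strain) * newtons_per_unit_strain
```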
In some embodiments, touch sensors 332 and 334 sense touch via a wave propagated through touch surface 320, such as an ultrasonic wave. For example, transmitter 330 outputs such an ultrasonic wave. Touch sensors 332 and 334 function as receivers of the ultrasonic wave. In the case of a touch by a user, the ultrasonic wave is attenuated by the presence of the user's finger (or other portion of the user contacting touch surface 320). This attenuation is sensed by one or more of touch sensors 332 and 334, which provide the signal to touch detector 310. The attenuated signal can be compared to a reference signal. A sufficient difference between the attenuated signal and the reference signal results in a touch being detected. The attenuated signal corresponds to a force measurement. Because the attenuation may also depend upon other factors, such as whether the user is wearing a glove, such force measurements from touch sensors may be termed imputed force measurements. In some embodiments, absolute forces may be obtained from the imputed force measurements. As used herein in the context of touch sensors, imputed force and force may be used interchangeably.
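One way the comparison against a reference signal could look is sketched below; the use of signal energy as the comparison metric and the attenuation threshold are illustrative assumptions.

```python
import numpy as np


def touch_detected(received: np.ndarray, reference: np.ndarray,
                   attenuation_threshold: float = 0.2) -> bool:
    """Report a touch when the received ultrasonic signal is sufficiently attenuated
    relative to the reference (untouched) signal; the threshold is illustrative."""
    received_energy = float(np.sum(received ** 2))
    reference_energy = float(np.sum(reference ** 2))
    attenuation = 1.0 - received_energy / reference_energy
    return attenuation >= attenuation_threshold
```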
Encoded signals may be used in system 300. In some embodiments, transmitter 330 provides an encoded signal. For example, transmitter 330 may use a first pseudo-random binary sequence (PRBS) to transmit a signal. If multiple transmitters are used, the encoded signals may differ to be able to discriminate between signals. For example, the first transmitter may use a first PRBS and the second transmitter may use a second, different PRBS which creates orthogonality between the transmitters and/or transmitted signals. Such orthogonality permits a processor or sensor coupled to the receiver to filter for or otherwise isolate a desired signal from a desired transmitter. In some embodiments, the different transmitters use time-shifted versions of the same PRBS. In some embodiments, the transmitters use orthogonal codes to create orthogonality between the transmitted signals (e.g., in addition to or as an alternative to creating orthogonality using a PRBS). In various embodiments, any appropriate technique to create orthogonality may be used. In some embodiments, encoded signals may also be used for force sensors 312 and 314. For example, an input voltage for the force sensors 312 and 314 may be provided. Such an input signal may be encoded using PRBS or another mechanism.
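A small sketch of PRBS-coded transmission and correlation at a receiver follows; the LFSR polynomial, sequence length, seeds, and amplitudes are illustrative assumptions rather than the specific codes used.

```python
import numpy as np


def prbs(length: int, state: int = 0b10101) -> np.ndarray:
    """Generate a +/-1 pseudo-random binary sequence from a 5-bit Fibonacci LFSR
    (polynomial x^5 + x^3 + 1, period 31); different non-zero seeds yield
    time-shifted versions of the same sequence."""
    bits = []
    for _ in range(length):
        bits.append(1.0 if state & 1 else -1.0)
        feedback = (state ^ (state >> 2)) & 1
        state = (state >> 1) | (feedback << 4)
    return np.array(bits)


# Two transmitters use time-shifted versions of the same PRBS; a receiver isolates
# each transmitter's contribution by correlating against the corresponding code.
code_tx1 = prbs(31, state=0b10101)
code_tx2 = prbs(31, state=0b00110)
received = 0.7 * code_tx1 + 0.4 * code_tx2

amplitude_tx1 = float(np.dot(received, code_tx1)) / len(code_tx1)  # approximately 0.7
amplitude_tx2 = float(np.dot(received, code_tx2)) / len(code_tx2)  # approximately 0.4
```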
In some embodiments, only force sensors 312 and 314 may be used to detect touch inputs. In some such embodiments, drifts and other temperature effects may be accounted for using temperature sensor 250. Bending or other flexing may be accounted for using strain sensor 242. In other embodiments, only touch sensors 332 and 334 may be used to detect touch inputs. In such embodiments, touch inputs are detected based upon an attenuation in a signal from transmitter 330. However, in other embodiments, a combination of force sensors 312 and 314 and touch sensors 332 and 334 are used to detect touch inputs.
Based upon which sensor(s) 312, 314, 332 and/or 334 detects the touch and/or characteristics of the measurement (e.g. the magnitude of the force detected), the location of the touch input in addition to the presence of a touch input may be identified. For example, given an array of force and/or touch sensors, a location of a touch input may be triangulated based on the detected force and/or imputed force measurement magnitudes and the relative locations of the sensors that detected the various magnitudes (e.g., using a matched filter). Further, data from force sensors 312 and 314 can be utilized in combination with data from touch sensors 332 and 334 to detect touches. Utilization of a combination of force and touch sensors allows for the detection of touch inputs while accounting for variations in temperature, bending, user conditions (e.g. the presence of a glove) and/or other factors. Thus, detection of touches using system 300 may be improved.
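A sketch of one simple form of such a location estimate is shown below, using a force-weighted centroid; the sensor layout and the use of a centroid rather than a full matched filter are assumptions made only for illustration.

```python
import numpy as np

# Hypothetical (x, y) positions, in millimeters, of four force/touch sensors.
SENSOR_POSITIONS = np.array([[0.0, 0.0], [60.0, 0.0], [0.0, 120.0], [60.0, 120.0]])


def estimate_touch_location(force_magnitudes: np.ndarray) -> np.ndarray:
    """Estimate the touch location as the force-weighted centroid of the sensor
    positions; a matched filter against modeled per-location responses could be
    used instead, as noted above."""
    weights = np.clip(force_magnitudes, 0.0, None)
    return (weights[:, None] * SENSOR_POSITIONS).sum(axis=0) / weights.sum()


# A touch near the upper-right sensor yields the largest reading there.
print(estimate_touch_location(np.array([0.1, 0.3, 0.2, 1.4])))  # roughly (51, 96)
```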
For example, touch detector 310 receives force measurements from force sensors 312 and 314. Touch detector 310 receives imputed force measurements from touch sensors 332 and 334. Touch detector 310 identifies touch inputs based upon at least the imputed force measurements. In such embodiments, force measurements are utilized to calibrate one or more touch input criterion for touch sensors 332 and 334. For example, if a user is wearing a glove, the attenuation in the ultrasonic signal(s) sensed by touch sensors 332 and 334 may be reduced. Consequently, the corresponding imputed force measurements may not result in a detection of a touch input. However, force measurements from force sensors 312 and/or 314 correlated with and corresponding to the touch input of a user wearing a glove indicate a larger force than the imputed force measurements. In some embodiments, the measured forces corresponding to the output of touch sensors 332 and 334 are recalibrated (e.g. raised in this example) so that a reduced attenuation in the ultrasonic signal(s) is identified as a touch input. In some embodiments, a touch input is detected if the force meets or exceeds a threshold. In such embodiments, the threshold for detecting a touch input using the signals from touch sensors 332 and 334 is recalibrated (e.g. decreased in this example) so that a reduced attenuation in the ultrasonic signal(s) is identified as a touch input. Thus, the user's condition can be accounted for. Further, touch sensors 332 and 334 may be piezoelectric sensors and thus insensitive to bends and temperature. Consequently, such effects may not adversely affect identification of touch inputs. In embodiments in which both force and imputed force measurements are used in identifying a touch input, a touch input is identified only if the force measurements from the force sensors (e.g. strains indicating an input force at a particular time and location) and the imputed force measurements (e.g. piezoelectric signals indicating an input force at a corresponding time and location) are sufficiently correlated. In such embodiments, there may be a reduced likelihood of bends or temperature effects resulting in a touch input being detected. The touch input criterion/criteria may then be calibrated as described above.
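As a sketch of how a touch-input threshold could be recalibrated from correlated force measurements (the proportional update rule, learning rate, and names are assumptions, not the specific calibration used):

```python
def recalibrate_touch_threshold(imputed_force: float,
                                correlated_force: float,
                                current_threshold: float,
                                learning_rate: float = 0.25) -> float:
    """When a touch confirmed by the force sensors produced an imputed force (from
    ultrasonic attenuation) well below the correlated force measurement, e.g. because
    the user is wearing a glove, gradually lower the touch-input threshold so that
    similarly attenuated signals are still identified as touch inputs."""
    if correlated_force > imputed_force > 0.0:
        target = current_threshold * (imputed_force / correlated_force)
        current_threshold += learning_rate * (target - current_threshold)
    return current_threshold
```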
Thus, using system 300, touch inputs may be detected. If both force and imputed force measurements (e.g. strain and piezoelectric measurements) are used, issues such as changes in temperature and bending of the touch surface may not adversely affect identification of touch inputs. Similarly, changes in the user, such as the user wearing a glove, may also be accounted for in detecting touch inputs. Further, the dynamic ranges of force sensors and touch sensors may differ. In some embodiments, piezoelectric touch sensors may be capable of sensing lighter touches than strain gauges used in force sensors. A wider variety of touch inputs may, therefore, be detected. Moreover, force and/or touch sensors may be utilized to detect touch inputs in regions that are not part of a display. For example, the sides, frame, back cover or other portions of a device may be used to detect touch inputs. Consequently, detection of touch inputs may be improved.
For example, in the embodiment shown in
Utilizing force measurements from force and/or touch sensors, a user interface may be better controlled.
Force measurements are received from force sensors, at 702. Force sensors, such as strain sensors and/or touch sensors, provide an output signal that is received at 702. In some embodiments, the signal corresponding to the force measurements provided is a voltage signal that may be conditioned, converted to a digital signal for processing, or otherwise processed. In some embodiments, 702 includes transmitting ultrasonic signal(s), receiving the ultrasonic signal(s) at touch sensors, and the touch sensors providing the received ultrasonic signal for further processing. The ultrasonic signal(s) provided may be encoded. The received signals output by the touch sensors and corresponding to the imputed force measurements may also be encoded.
Touch locations are identified, at 704. Identification of touch locations includes detecting touch inputs, for example as described above. Thus, in some embodiments, a touch may be detected if a measured force exceeds a particular threshold. In addition, the locations of the touch inputs are determined based upon the characteristics of the force measurements and the location(s) of the corresponding sensors. Thus, at 702 and 704, the magnitude of the force and the location of the force due to the user's touch input can be determined.
User interface elements are provided based on the touch locations, at 706. User interface elements may include elements such as virtual buttons (e.g. volume increase, volume decrease, power, menu buttons), slide bars (e.g. for controlling magnification of images presented on a display), menus (e.g. menu bars, the direction a menu drops down), information (e.g. a battery power bar), icons and/or other elements that allow a user to provide input to and receive output from the device. Although the user interface elements are presented to the user on the display, at least some user interface elements may correspond to portions of the device distinct from the display. For example, virtual buttons may be depicted on the display but may be activated by user touches at the sides of the device as depicted in
At 708, user interface elements may be updated based upon changes in the force and/or touch locations. Also at 708, haptic feedback may be generated based upon changes in the force applied by the user and/or changes in the touch locations. For example, a sufficient increase or decrease in the force may be used to update the user interface and/or generate haptic feedback. In some embodiments, if the force applied to a touch location exceeds a threshold larger than the touch threshold used in determining the touch location, a user is considered to have pressed a (virtual) button at the touch location or squeezed the frame to activate a function. For example, a force having a magnitude that meets or, in some embodiments exceeds, a first level results in a touch being detected and a touch location identified at 704. A force having a magnitude that meets or, in some embodiments exceeds, a second level greater than the first level results in the detection of a button push or other selection at 708. In some embodiments, the thresholds are absolute. For example, the second level may be a particular force determined by the manufacturer or based upon prior training by the user. In some embodiments, the thresholds may be relative. For example, the second level may exceed the first level by a fraction multiplied by the actual force the user applied for touch detection. In another example, the second level may exceed the first level by a fraction multiplied by the first level. In some cases, the user may move their finger, for example to activate a slide bar. In such instances, the change in the touch location (e.g. the removal of one location and the addition of a new touch location) may result in an update. In some embodiments, changes in force and/or touch location may also trigger the generation of haptic feedback. In the examples above, a button press or touch location change may result in actuators being activated to provide vibrations (e.g. at the new touch location or by the device generally) or simulate the click of a button. Thus, feedback and/or updates to the user interface may be provided.
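The two-level scheme described above might look like the following sketch; the constants and the particular choice of a relative second level (the first level plus a fraction of the force originally applied for touch detection) are illustrative assumptions.

```python
TOUCH_LEVEL = 0.5       # first level: force at which a touch location is identified (assumed units)
PRESS_FRACTION = 0.4    # fraction of the original touch force added to form the second level


def classify_force(current_force: float, touch_force: float) -> str:
    """Classify a force at an existing touch location: 'press' when the force meets the
    second level, 'touch' when it meets only the first level, 'none' otherwise."""
    press_level = TOUCH_LEVEL + PRESS_FRACTION * touch_force
    if current_force >= press_level:
        return "press"   # e.g. update the user interface and trigger a haptic click
    if current_force >= TOUCH_LEVEL:
        return "touch"
    return "none"


# A touch registered at 0.6 and later pressed at 0.9 exceeds 0.5 + 0.4 * 0.6 = 0.74.
assert classify_force(0.9, touch_force=0.6) == "press"
```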
For example, force measurements may be received by touch detector/processor(s) 310 at 702. These force measurements are received from force sensors 312 and 314 and/or touch sensors 332 and/or 334. Thus, the force measurements may include imputed force measurements. However, as discussed above, in some embodiments, absolute forces may be determined from imputed force measurements. Further, temperature, bending of the device and other artifacts may be accounted for. Touch detector/processor(s) 310 determines the touch locations for the touch inputs detected, at 704. Based on the touch locations and, in some embodiments, the force measurements, touch detector/processor(s) 310 provides user interface elements. For example, the location and orientation of the virtual power and volume buttons depicted as dotted lines in
At 708, force sensor 312 and/or touch sensor 332 may sense an increase in force. This may occur without the user removing their finger from the corresponding device. In the example above, the user may have a finger already located at the volume increase button and may depress the side of device 600. If a force that meets or exceeds the second level is sensed, then at 708 the volume is increased. Further, a volume indicator (not shown in
Thus, using method 700 user interface elements may be provided and updated based upon the forces sensed and touch locations. Consequently, physical buttons may be replaced by or complemented with virtual buttons. As a result, the corresponding device may be less subject to contaminants entering through physical buttons. Further, on touch surfaces such as the sides of the device, higher resolution interactivity may be provided. User interface elements may also be generated and updated based upon touches to parts of the device other than the display. This may increase the fraction of the display usable for viewing and improve ease of use of the device.
The context for a device is determined based on the touch locations, at 802. The context is so termed because the context is determined while the user utilizes the device, typically without requiring active input by the user specifically to identify the context. For example, the context might be that the electronic device is being held in portrait or landscape orientation, that the electronic device is in the user's left hand or right hand, that the user is gaming, that the user is taking a photograph and/or another context.
Features of the touch locations identified at 706 may be used in determining the context. For example, if one touch location is on one side of the device and four touch locations are on an opposing side, the device may be considered to be in portrait mode. In some embodiments, the size of the touch location and the forces applied at the touch locations may also be used in determining context. For example, a larger, single touch location at which a larger force is applied to a side of the frame of a device may correspond to a thumb, while a very large touch location on the back cover of the device may correspond to a palm. Smaller touch locations on the opposing side of the device may correspond to fingers. Thus, whether the user holds the device in the left or right hand may also be determined. External inputs may also be used in determining context at 802. For contexts such as gaming, the touch locations may be used in conjunction with the application running on the device in order to determine context. In some embodiments, additional inputs such as from sensors including orientation sensors such as accelerometers and/or gyroscopes may be used to determine context. For example, whether a smartphone is in portrait or landscape mode with respect to the earth's surface/gravity may be determined via accelerometer and/or gyroscope input.
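A rough sketch of such a heuristic follows; the touch counts, force comparisons, and return labels are all illustrative assumptions rather than the specific rules used.

```python
def infer_grip_context(left_edge_touches: int, right_edge_touches: int,
                       left_edge_max_force: float, right_edge_max_force: float) -> str:
    """Guess how the device is held from edge touch counts and forces: a single,
    stronger touch on one edge (a thumb) opposite several lighter touches (fingers)
    suggests a one-handed portrait grip, with the thumb's side indicating the hand."""
    if (left_edge_touches == 1 and right_edge_touches >= 3
            and left_edge_max_force > right_edge_max_force):
        return "portrait, held in left hand"
    if (right_edge_touches == 1 and left_edge_touches >= 3
            and right_edge_max_force > left_edge_max_force):
        return "portrait, held in right hand"
    return "unknown"
```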
The locations, orientations, and/or other characteristics of user interface elements are determined based on the touch locations and context, at 804. For example, buttons may be located in proximity to touch locations for ease of access, while menu bars may be located distal from touch locations to improve visibility. The orientation of buttons, menus, slide bars and/or other user interface elements may be determined at 804. For example, menus are oriented so that a user may be better able to read the information on the menu. Slide bars may be oriented such that the direction a user slides the bar is easier to reach. Using the locations, orientations and/or other characteristics determined at 804, the user interface elements are rendered on the display at 806.
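One possible placement rule is sketched below with hypothetical names and constants; real placement would also weigh occlusion of displayed content and the determined context.

```python
def place_slide_bar(thumb_x: float, thumb_y: float,
                    display_width: float, display_height: float,
                    margin: float = 8.0) -> dict:
    """Place a slide bar / jog wheel near the thumb's touch location for easy reach,
    clamped inside the display, and orient it vertically when the thumb rests near a
    side edge of the display."""
    x = min(max(thumb_x, margin), display_width - margin)
    y = min(max(thumb_y, margin), display_height - margin)
    near_side_edge = x <= 2 * margin or x >= display_width - 2 * margin
    return {"x": x, "y": y, "orientation": "vertical" if near_side_edge else "horizontal"}


# Example: a thumb resting near the right edge of a 70 x 150 mm display.
print(place_slide_bar(69.0, 80.0, display_width=70.0, display_height=150.0))
```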
Thus, using method 800 user interface elements may be provided and updated based upon the context as determined by the forces sensed and touch locations. Consequently, physical buttons may be removed. As a result, the corresponding device may be less subject to contaminants entering through physical buttons. The configuration of the user interface may also be more readily customizable. User interface elements may also be generated and updated based upon touches to part of the device other than the display. This may enhance usability of the device.
In some embodiments, touch detector/processor(s) 310 may determine the context at 802 and configure the user interface elements at 804. These user interface elements may be provided to the display at 806. For example,
User interface elements 931, 932, 933 and 934 (collectively user interface elements 930) have been provided based on the context. User interface elements 931 and 933 may be virtual buttons, user interface element 932 may be a slide bar or jog wheel, and user interface element 934 may be a portion of a menu or a text item. For example, the location and orientation of slide bar/jog wheel 932 has been selected for ease of access by the user's thumb corresponding to touch location 951. Similarly, button 933 has been placed for ease of access by the user's finger(s) corresponding to touch location(s) 952 and/or 953. In some embodiments, a visual guide may be provided to indicate to the user the location(s) of the software-defined jog wheel or other user interface elements 931, 932, 933 and 934. For example, a symbol identifying the buttons and/or a light may be used to indicate the software-defined button. A user may use a single hand to select or operate the slide bar/jog wheel 932 or other buttons, for example to scroll through web pages or apps depicted on the display of mobile device 400. Thus, the user interface elements 930 may not only be located but also controlled via the user's touch.
User interface elements 931B, 932B, 933B and 934B have been provided based on the context. Thus, the locations of user interface elements 931B, 932B, 933B and 934B have been switched from that shown in
User interface elements 931C, 932C, 933C and 934C have been provided based on the context. Thus, both the locations and the orientations (with respect to display 920) of user interface elements 931C, 932C, 933C and 934C have been switched from that shown in
As discussed with respect to
Thus, user interface elements may be provided and updated based upon the context as indicated by touch locations and/or forces. Consequently, the ease of operation, flexibility, and reliability (e.g. being water tight) of device 900 using method 700 and/or 800 may be improved.
Additional force measurements are received from force sensors, at 1002. Thus, 1002 is analogous to 702 of method 700. Touch locations are identified, at 1004. In some embodiments, 1004 is analogous to 704 of method 700. At 1006 it is determined whether the touch locations have changed. For example, touch locations 951B, 952B, 953B, 954B and 955B may be compared to touch locations 951, 952, 953, 954 and 955. If the touch locations have not changed, then method 1000 continues. If, however, the touch locations have changed, then the user interface elements may be updated and/or haptic feedback generated, at 1008. To generate haptic feedback, a signal may be provided from touch detector/processor(s) 310 to haptics generator 350. Alternatively, a signal may be provided from touch detector/processor(s) 310 to application system 302. In response, application system 302 provides a signal to haptics generator 350. Haptics generator 350 provides the appropriate signal(s) to one or more haptics actuator(s) 352 and/or 354 to provide the haptic feedback at the desired location(s).
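A sketch of this update/haptics dispatch follows; the object interfaces stand in for touch detector/processor(s) 310, application system 302, and haptics generator 350 and are hypothetical.

```python
def on_touch_locations_changed(previous_locations: set, current_locations: set,
                               application_system, haptics_generator) -> None:
    """If the set of touch locations changed, update the user interface elements and
    drive haptics at the newly added locations; otherwise do nothing."""
    if current_locations == previous_locations:
        return
    application_system.update_user_interface(sorted(current_locations))
    for location in current_locations - previous_locations:
        haptics_generator.actuate(location)   # drives the nearest haptics actuator(s)
```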
It is determined whether the force(s) have changed, at 1010. If not, then method 1000 is completed. If the force(s) measured by force and/or touch sensors have changed, then it is determined whether the change(s) exceed the appropriate thresholds. An increase in the force, particularly followed by a decrease in force, may indicate a virtual button push. A decrease in force may indicate a movement by the user's finger(s) or other action. In response to the change in force(s), the user interface elements may be updated and/or haptic feedback generated.
Thus, using method 1000 a user is able to control the user interface and receive feedback without requiring that the user lift their finger from the display. Instead, a change in force is sufficient for at least some updates. This may facilitate use of the corresponding device.
For example,
Similarly,
In the embodiment shown in
In addition, haptics may be incorporated to mimic the response of a physical button and/or provide other feedback, at 1014. For example, at 1014, the touch sensing system of mobile device 1200 may provide a signal for an actuator or other motion generator that can vibrate or otherwise cause motions in some or all of mobile device 1200. Thus, using such a haptic system, software-defined buttons may function and feel similar to a physical button. In some embodiments, mobile device 1200 may be configured such that the point of view may be changed via a software-defined slide (not shown). Such a slide may be analogous to slide bar 932. Thus, such a slide may be provided on the display, edge, or back of mobile device 1200. In some embodiments, the locations and operation of the controls may be customized by the user. Thus, mobile device 1200 may provide an intuitive, customizable, and rapid user interface.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.
This application claims priority to U.S. Provisional Patent Application No. 62/905,997 entitled HAPTICS USING TOUCH INPUT SENSORS filed Sep. 25, 2019 which is incorporated herein by reference for all purposes.