The present disclosure relates to control methods for a force sensor system, and in particular to systems and methods for compensating for the effects of temperature on force sensing systems.
Force sensors are known as possible input devices for electronic systems, and can be used as an alternative to traditional mechanical switches.
Many traditional mobile devices (e.g., mobile phones, personal digital assistants, video game controllers, etc.) include mechanical buttons to allow for interaction between a user of a mobile device and the mobile device itself. However, such mechanical buttons are susceptible to aging, wear, and tear that may reduce the useful life of a mobile device and/or may require significant repair if malfunction occurs. Also, the presence of mechanical buttons may make it difficult to manufacture waterproof mobile devices.
Accordingly, mobile device manufacturers are increasingly looking to equip mobile devices with virtual buttons that act as a human-machine interface allowing for interaction between a user of a mobile device and the mobile device itself. Similarly, mobile device manufacturers are increasingly looking to equip mobile devices with other virtual interface areas (e.g., a virtual slider, interface areas of a body of the mobile device other than a touch screen, etc.). Ideally, for best user experience, such virtual interface areas should look and feel to a user as if a mechanical button or other mechanical interface were present instead of a virtual button or virtual interface area.
Presently, linear resonant actuators (LRAs) and other vibrational actuators (e.g., rotational actuators, vibrating motors, etc.) are increasingly being used in mobile devices to generate vibrational feedback in response to user interaction with human-machine interfaces of such devices. Typically, a sensor (traditionally a force or pressure sensor) detects user interaction with the device (e.g., a finger press on a virtual button of the device) and in response thereto, the linear resonant actuator may vibrate to provide feedback to the user. For example, a linear resonant actuator may vibrate in response to user interaction with the human-machine interface to mimic to the user the feel of a mechanical button click.
Force sensors thus detect forces on the device to determine user interaction, e.g. touches, presses, or squeezes of the device. There is a need to provide systems to process the output of such sensors which balances low power consumption with responsive performance.
There is a need in the industry for sensors to detect user interaction with a human-machine interface, wherein such sensors and related sensor systems provide acceptable levels of sensor sensitivity, power consumption, and size. There is also a need in the industry to provide force sensor systems with improved operation over a range of operating environments.
According to a first aspect of the present disclosure, there is provided a control method for a force sensor system used to define at least one virtual button, the control method comprising the steps of: receiving a force sensor input; determining a gradient of the force sensor input; and comparing the determined gradient to a gradient threshold to determine a user press event of a virtual button.
According to a second aspect of the present disclosure, there is provided a control method for a force sensor system used to define at least one virtual button, the control method comprising the steps of: receiving a force sensor input; determining a gradient of the force sensor input; and comparing the determined gradient to a first re-calibration threshold to determine a re-calibration requirement of the force sensor system.
According to a third aspect of the present disclosure, there is provided a control method for a force sensor system used to define at least one virtual button, the control method comprising the steps of: receiving a force sensor input; determining first and second derivatives of the force sensor input to provide velocity and acceleration values; mapping the velocity and acceleration values to a 2D representation; and performing quadrant tracking or state space tracking of the 2D representation of the velocity and acceleration values to determine if a user press event has occurred.
According to a fourth aspect of the present disclosure, there is provided a method of controlling a force sensor system to define at least one button implemented by at least one force sensor, the method comprising: receiving a force sensor input; determining a gradient of the force sensor input; and controlling the force sensor system based on the determined gradient.
Any of the aforementioned aspects may be employed in combination.
According to a fifth aspect of the present disclosure, there is provided a force sensor system, comprising: at least one force sensor; and a controller connected to the at least one force sensor and configured to carry out the method of any of the aforementioned aspects.
According to a sixth aspect of the present disclosure, there is provided a host device comprising the force sensor system according to the aforementioned fifth aspect.
Computer program aspects corresponding to the method aspects are envisaged, as are (non-transitory) storage medium aspects storing computer programs of the computer program aspects.
Reference will now be made, by way of example only, to the accompanying drawings, of which:
The description below sets forth example embodiments according to this disclosure. Further example embodiments and implementations will be apparent to those having ordinary skill in the art. Further, those having ordinary skill in the art will recognize that various equivalent techniques may be applied in lieu of, or in conjunction with, the embodiments discussed below, and all such equivalents should be deemed as being encompassed by the present disclosure.
As shown in
Enclosure 101 may comprise any suitable housing, casing, or other enclosure for housing the various components of mobile device 102. Enclosure 101 may be constructed from plastic, metal, and/or any other suitable materials. In addition, enclosure 101 may be adapted (e.g., sized and shaped) such that mobile device 102 is readily transported on a person of a user of mobile device 102. Accordingly, mobile device 102 may include but is not limited to a smart phone, a tablet computing device, a handheld computing device, a personal digital assistant, a notebook computer, a video game controller, a headphone or earphone or any other device that may be readily transported on a person of a user of mobile device 102. While
Controller 103 may be housed within enclosure 101 and may include any system, device, or apparatus configured to interpret and/or execute program instructions and/or process data, and may include, without limitation a microprocessor, microcontroller, digital signal processor (DSP), application specific integrated circuit (ASIC), or any other digital or analog circuitry configured to interpret and/or execute program instructions and/or process data. In some embodiments, controller 103 interprets and/or executes program instructions and/or processes data stored in memory 104 and/or other computer-readable media accessible to controller 103.
Memory 104 may be housed within enclosure 101, may be communicatively coupled to controller 103, and may include any system, device, or apparatus configured to retain program instructions and/or data for a period of time (e.g., computer-readable media). Memory 104 may include random access memory (RAM), electrically erasable programmable read-only memory (EEPROM), a Personal Computer Memory Card International Association (PCMCIA) card, flash memory, magnetic storage, opto-magnetic storage, or any suitable selection and/or array of volatile or non-volatile memory that retains data after power to mobile device 102 is turned off.
Microphone 106 may be housed at least partially within enclosure 101, may be communicatively coupled to controller 103, and may comprise any system, device, or apparatus configured to convert sound incident at microphone 106 to an electrical signal that may be processed by controller 103, wherein such sound is converted to an electrical signal using a diaphragm or membrane having an electrical capacitance that varies based on sonic vibrations received at the diaphragm or membrane. Microphone 106 may include an electrostatic microphone, a condenser microphone, an electret microphone, a microelectromechanical systems (MEMS) microphone, or any other suitable capacitive microphone.
Radio transmitter/receiver 108 may be housed within enclosure 101, may be communicatively coupled to controller 103, and may include any system, device, or apparatus configured to, with the aid of an antenna, generate and transmit radio-frequency signals as well as receive radio-frequency signals and convert the information carried by such received signals into a form usable by controller 103. Radio transmitter/receiver 108 may be configured to transmit and/or receive various types of radio-frequency signals, including without limitation, cellular communications (e.g., 2G, 3G, 4G, 5G, LTE, etc.), short-range wireless communications (e.g., BLUETOOTH), commercial radio signals, television signals, satellite radio signals (e.g., GPS), Wireless Fidelity, etc.
A speaker 110 may be housed at least partially within enclosure 101 or may be external to enclosure 101, may be communicatively coupled to controller 103, and may comprise any system, device, or apparatus configured to produce sound in response to electrical audio signal input. In some embodiments, a speaker may comprise a dynamic loudspeaker, which employs a lightweight diaphragm mechanically coupled to a rigid frame via a flexible suspension that constrains a voice coil to move axially through a cylindrical magnetic gap. When an electrical signal is applied to the voice coil, a magnetic field is created by the electric current in the voice coil, making it a variable electromagnet. The coil and the driver's magnetic system interact, generating a mechanical force that causes the coil (and thus, the attached cone) to move back and forth, thereby reproducing sound under the control of the applied electrical signal coming from the amplifier.
The force sensor 105 may be housed within, be located on or form part of the enclosure 101, and may be communicatively coupled to the controller 103. Each force sensor of a device 102 may include any suitable system, device, or apparatus for sensing a force, a pressure, or a touch (e.g., an interaction with a human finger) and for generating an electrical or electronic signal in response to such force, pressure, or touch. In some embodiments, such electrical or electronic signal may be a function of a magnitude of the force, pressure, or touch applied to the force sensor. In these and other embodiments, such electronic or electrical signal may comprise a general-purpose input/output (GPIO) signal associated with an input signal to which haptic feedback is given.
Example force sensors 105 may include or comprise: capacitive displacement sensors, inductive force sensors, strain gauges, piezoelectric force sensors, force sensing resistors, piezoresistive force sensors, thin film force sensors, and quantum tunneling composite-based force sensors.
In some arrangements, other types of sensor may be employed. For purposes of clarity and exposition in this disclosure, the term “force” as used herein may refer not only to force, but to physical quantities indicative of force or analogous to force, such as, but not limited to, pressure and touch.
Linear resonant actuator 107 may be housed within enclosure 101, and may include any suitable system, device, or apparatus for producing an oscillating mechanical force across a single axis. For example, in some embodiments, linear resonant actuator 107 may rely on an alternating current voltage to drive a voice coil pressed against a moving mass connected to a spring. When the voice coil is driven at the resonant frequency of the spring, linear resonant actuator 107 may vibrate with a perceptible force. Thus, linear resonant actuator 107 may be useful in haptic applications within a specific frequency range.
While, for the purposes of clarity and exposition, this disclosure is described in relation to the use of linear resonant actuator 107, it is understood that any other type or types of vibrational actuators (e.g., eccentric rotating mass actuators) may be used in lieu of or in addition to linear resonant actuator 107. In addition, it is also understood that actuators arranged to produce an oscillating mechanical force across multiple axes may be used in lieu of or in addition to linear resonant actuator 107. A linear resonant actuator 107, based on a signal received from integrated haptic system 112, may render haptic feedback to a user of mobile device 102 for at least one of mechanical button replacement and capacitive sensor feedback.
Integrated haptic system 112 may be housed within enclosure 101, may be communicatively coupled to force sensor 105 and linear resonant actuator 107, and may include any system, device, or apparatus configured to receive a signal from force sensor 105 indicative of a force applied to mobile device 102 (e.g., a force applied by a human finger to a virtual button of mobile device 102) and generate an electronic signal for driving linear resonant actuator 107 in response to the force applied to mobile device 102.
Although specific example components are depicted above as being integral to mobile device 102 (e.g., controller 103, memory 104, force sensor 105, microphone 106, radio transmitter/receiver 108, speakers(s) 110), a mobile device 102 in accordance with this disclosure may comprise one or more components not specifically enumerated above. For example, although
In addition, it will be understood that the device may be provided with additional input sensor devices or transducers, for example accelerometers, gyroscopes, cameras, or other sensor devices.
Some force sensor systems are sensitive to variations in temperature (or other properties of the operating environment such as pressure). For example, for resistive force sensor systems where a bias voltage is applied to the sensors, changes in temperature can create changes in the bias of the force signal, resulting in changes to the baseline of operation of the sensor system.
The upper line illustrates the output signal (Force signal), with the signal peaks indicating the occurrence of touch events detected by the sensor system. The lower line (Baseline tracking) illustrates the output of a baseline tracking system, to monitor for changes in the bias applied to the resistive force sensor system. As can be seen, the increasing temperature of the system results in a steadily increasing bias (baseline) of the system.
It is noted that depending on the force sensor system setup (i.e. an actual implementation), and for example on the type/structure of the mobile device 102 comprising the system, it may be that an increasing temperature of the system results in a steadily increasing bias (baseline) of the system (as above) or conversely a decreasing bias (baseline) of the system. The present disclosure will be understood accordingly. In either case, the bias (baseline) of the system may change with temperature (or another environmental factor).
Additionally or alternatively, the sensitivity of the sensors (such as force sensor 105) themselves can change with changes in temperature, with an increase (or decrease) in temperature resulting in a reduction in sensitivity.
An issue with force sensor systems can arise when a portion of a device such as device 102, e.g. a frame or casing of a mobile device such as enclosure 101, is at a relatively hot temperature (e.g. 60 degrees Celsius), and where a user touches the device with a finger at normal body temperature (e.g. 37 degrees Celsius). This can result in a temperature difference between locations on the device, which may translate into temperature variations in a force sensor system. For sensor systems using an arrangement of multiple force sensors (such as force sensor 105), e.g. resistive force sensors arranged in a Wheatstone bridge configuration or similar, the temperature variation can occur between different portions of the sensor, creating a temporary baseline drift.
The uppermost graph shows the force input signal over time. The force input signal is shown as an analogue signal, expressed in N (Newtons). It will be appreciated that analogue voltage signals received from a force sensor such as force sensor 105 (such as the output signal in
The input is compared against fixed thresholds (shown in the upper graph as TH rise and TH fall) to determine a rising edge (TH rise) and a falling edge (TH fall) of a user touch or press event.
The middle graph shows the result of a comparison over time of the force input signal against the thresholds TH rise and TH fall, the result being a binary output of logic 1 for a rise flag when the force input signal rises above (or crosses in an upward direction) the threshold TH rise and a binary output of logic 1 for a fall flag when the force input signal falls below (or crosses in a downward direction) the threshold TH fall. The rise flag and fall flag signals are otherwise at logic 0.
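The rise/fall flag generation described above can be sketched as follows. This is a minimal illustrative sketch, not the actual implementation; the function and variable names are hypothetical, and the input is assumed to be a list of sampled force values.

```python
def threshold_flags(force, th_rise, th_fall):
    """Generate per-sample rise/fall flags from a sampled force signal.

    A flag is logic 1 only on the sample where the signal crosses the
    corresponding threshold (upward for TH rise, downward for TH fall),
    and logic 0 otherwise, as described in the text.
    """
    rise_flags, fall_flags = [], []
    prev = force[0]
    for x in force:
        # Rising edge: previous sample at or below TH rise, current above it.
        rise_flags.append(1 if prev <= th_rise < x else 0)
        # Falling edge: previous sample at or above TH fall, current below it.
        fall_flags.append(1 if prev >= th_fall > x else 0)
        prev = x
    return rise_flags, fall_flags
```

State detection logic (single tap, long push, etc.) would then operate on these flag streams, e.g. by measuring the time between a rise flag and the next fall flag.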
The lowermost graph shows the result of state detection logic performed over time based on the flag signals. The available states indicated by the state signal shown in this example are “single tap”, “long push”, “double push”, “single push” and “idle detection”, e.g. corresponding to common user inputs (or lack of inputs) in relation to a mobile device 102 such as a smartphone. The state signal indicates the “idle detection” state in this example unless one of the other states is detected, in which case that state is indicated.
In this case, even though five individual presses are performed by a user (as indicated by the five peaks of the force input signal in the uppermost graph), the temperature-induced increase in the baseline means that only a long press is detected (as indicated in the lowermost graph) instead of five individual presses.
A gradient release threshold method intended to address the above issue will now be considered. In overview, it is proposed to analyze the gradient of a force input signal, i.e. its rate of change with respect to time or its first derivative with respect to time (first time derivative), and to use the gradient to detect the release of a button implemented by one or more force sensors, which may be considered a virtual button, and/or to trigger a recalibration of the force sensor system.
Here, the release of the button may be considered the process of the user retracting a finger (or stylus or other body part or similar) from the surface in respect of which the force sensor or sensors are measuring an applied force, so that the measured applied force reduces from a relatively high level (corresponding to a press or push of the button) to a relatively low level (corresponding to the button no longer being pressed or pushed, i.e. the end of the press or push). Recalibration may be understood as the process of setting a baseline value to, or based on, the current force input value.
The gradient of the user press event when the button is pushed (indicated by the rising line superimposed on the first rising edge of the input signal) is approximately the same (in magnitude) as the gradient when the button is released (indicated by the falling line superimposed on the first falling edge of the input signal).
It can be seen that the gradient for the push event (i.e. when the applied force is increased from a non-pushed state) is positive and the gradient for the release event (i.e. when the applied force is decreased back to the non-pushed state) is negative. Accordingly, once the press (or push) is detected (e.g. based on comparison of the input with a rise threshold such as TH rise in
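The release detection just described can be illustrated with a minimal sketch, assuming a uniformly sampled force signal and the example release threshold of −5 N/s given later in the text; the names and the sample period `dt` are hypothetical.

```python
def detect_release(force, dt, grad_release_th=-5.0):
    """Flag a release event whenever the signal gradient (N/s) falls
    below a negative gradient threshold.

    force: list of force samples in N; dt: sample period in seconds.
    Returns one boolean per gradient sample (len(force) - 1 values).
    """
    released = []
    for n in range(1, len(force)):
        # First time derivative of the force input (its gradient in N/s).
        gradient = (force[n] - force[n - 1]) / dt
        released.append(gradient < grad_release_th)
    return released
```

Because the comparison uses the gradient rather than the absolute force level, the detection is insensitive to a slowly drifting baseline. As noted below, a suitable threshold may depend on the stiffness of the device chassis or frame.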
The upper graph shows the force sensor inputs (force input signals) over time in respect of four (virtual) buttons for a series of user press events which occur in respect of two of those buttons. The force input signals are shown as analogue signals, expressed in N in line with
The lower graph shows the instantaneous gradient of the input signals shown in the upper graph. Thus, the gradient signals are shown as analogue signals, expressed in N/s. The gradient threshold (for detecting a release event) is selected such that when the instantaneous gradient falls below the gradient threshold, a release event of the virtual button is detected. In the illustrated example, a suitable release threshold of e.g. −5 N/s would detect the release of the virtual button for each of the user press events. It will be understood that the threshold may be dependent on the stiffness of the device chassis or frame. Accordingly, for different device chassis or frame configurations, the release threshold may be set to a different level.
Additionally or alternatively, the analysis of the falling gradient may be used as part of a system to trigger a re-calibration of a force sensor system.
In a preferred aspect, the re-calibration may be triggered if the gradient is lower than the negative threshold. However, the re-calibration is not performed at the point where the gradient falls below the negative threshold; rather, it is performed when the gradient subsequently rises above a second threshold, to ensure that the re-calibration will occur at the correct point.
For example, it may be desirable to re-calibrate once the gradient has become more negative than the negative threshold (part of a release event) and then risen up to become 0 (indicating the end of the release event). Baseline re-calibration may have the effect of re-calibrating the force input signal back to indicating a zero value (0 N) at that point, i.e. for that input from the force sensor(s), in this case when the gradient is also zero (0 N/s).
Initially a flag gradHit is set to 0, and a series of samples are received from the force sensor system. The samples (“Force”) are compared to a noise threshold TH_noise (Force>TH_noise?). The noise threshold TH_noise could be the same as, or replaced with, the threshold TH rise in
If gradHit is at 0, then the gradient of the force signal (“gradient”) is compared to the falling gradient threshold gradLowTH (gradient&lt;gradLowTH?). If the gradient of the falling edge of the force signal is below the falling gradient threshold, then this indicates that a re-calibration is required, and the flag gradHit is set to 1. In one example, the falling gradient threshold is set at a level of −5 N/s. The falling gradient threshold here could be set at a level lower than (i.e. more negative than, and thus having a larger absolute value than) that used to detect a release event, i.e. to detect a “very” negative gradient which may indicate the need for re-calibration.
The gradient of the falling edge of the force signal is compared to a second gradient threshold gradHighTh (gradient>gradHighTh?), to determine if the re-calibration can be performed. If the gradient is above (or at) the second threshold, e.g. 0 N/s, then the re-calibration of the force sensor system can be triggered. If the gradient is below the second threshold, then a further set of samples are received, and if the samples are above the noise threshold (TH_noise) and the gradHit flag is set, then if the gradient is above the second threshold the re-calibration can be triggered.
The comparison of the gradient with two separate thresholds allows for the detection of a required re-calibration, and ensures that the re-calibration is performed at the correct point of the force sensor system operation (e.g. when the gradient is zero after a touch event).
A further step could be interjected between the successful comparison of the gradient to the second gradient threshold (gradient>gradHighTh?=yes) and the triggering of the re-calibration. For example, a comparison could be made between the samples (“Force”) and the fall threshold TH fall of
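The two-threshold re-calibration flow described above (using the gradHit flag, TH_noise, gradLowTH and gradHighTh) can be sketched as follows. This is an illustrative sketch only; it assumes pre-computed per-sample gradients and omits the optional further comparison against TH fall.

```python
def recalibration_triggers(force_samples, gradients, th_noise,
                           grad_low_th=-5.0, grad_high_th=0.0):
    """Two-threshold re-calibration logic.

    A pending re-calibration is flagged (gradHit = 1) when the gradient
    falls below gradLowTH; the re-calibration itself is triggered only
    once the gradient has risen back above gradHighTh, so that it
    occurs at the correct point (e.g. gradient near zero after a touch).
    Returns one boolean per sample: True where re-calibration triggers.
    """
    grad_hit = 0
    triggers = []
    for force, gradient in zip(force_samples, gradients):
        trigger = False
        if force > th_noise:              # ignore samples in the noise floor
            if grad_hit == 0:
                if gradient < grad_low_th:   # steep fall: re-calibration needed
                    grad_hit = 1
            elif gradient > grad_high_th:    # gradient recovered: safe to act
                trigger = True
                grad_hit = 0
        triggers.append(trigger)
    return triggers
```

On a trigger, the baseline would be set to (or based on) the current force input value, re-calibrating the force input signal back towards 0 N.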
The upper graph corresponds to the upper graph of
Thus, the force sensor inputs (force input signals) over time are shown in the upper graph in respect of four (virtual) buttons for a series of user press events which occur in respect of two of those buttons. The lower graph shows the instantaneous gradient of the input signals shown in the upper graph, smoothed with a smoothing filter.
Unlike the graphs of
As indicated in the upper graph of
This re-calibration could be applied only to the force input signal concerned (the one which has triggered the re-calibration) or could be applied to a plurality or all of the force input signals together.
Note that each “force input signal” represented in e.g.
The re-calibration can be seen to occur three times in
A press dynamics method intended to address the issue mentioned earlier will now be considered. This method may be considered part of, an addition to, or an alternative to the gradient release threshold method.
In overview, it is proposed to analyze the dynamics of the force sensor signal, in particular by quadrant tracking of the first and second derivatives (i.e. velocity and acceleration), to detect a user press event. In this approach, presses and taps can be traced regardless of the bias of the signal (i.e. baseline tracking may not be needed, or at least not to the same degree of accuracy).
Here, the terms “velocity” and “acceleration” will be used for convenience (i.e. by analogy to position-based measures) to indicate the first and second derivatives of the force sensor signal (with respect to time), respectively.
The uppermost plot shows the force input signal over time. The force input signal is shown as an analogue signal expressed in N in line with
Thus, a series of samples of a force signal from a force sensor are received, which provide the force profile of a user press as indicated in the top plot of
The press signal is plotted to indicate the 4 different stages happening in a press:
1. The first section (first stage) of the press is indicated as Stage 1 (Quadrant I). It represents the signal going from minimum absolute velocity to maximum velocity, with the acceleration moving from its maximum to zero.
2. The second section (second stage) of the press is indicated as Stage 2 (Quadrant IV). This section displays the signal going from maximum velocity to zero, with the acceleration consequently moving to its minimum value.
3. The third section (third stage) of the press is indicated as Stage 3 (Quadrant III). Here, the velocity goes negative as a consequence of the reducing force level, and the acceleration approaches zero.
4. The fourth section (fourth stage) of the press is indicated as Stage 4 (Quadrant II). This is the last section, where the minimum velocity approaches zero and the acceleration is positive.
It will be understood that analyzing the velocity and acceleration of the force sensor signal is independent of the initial bias level, and accordingly such analysis will not be negatively impacted by a changed bias level, e.g. due to temperature changes.
In particular, the proposed method tracks the velocity and acceleration to confirm that the sensor signal actually moves through the four different states (corresponding to the four stages mentioned above) consecutively and in the correct order. For a valid user press event to have occurred, the system is configured to confirm that the combined velocity and acceleration points go from Quadrant I (Stage 1) to Quadrant IV (Stage 2); and then from Quadrant IV (Stage 2) to Quadrant III (Stage 3); and finally from Quadrant III (Stage 3) to Quadrant II (Stage 4). If such a sequence is followed, accordingly a user press event is triggered.
In some instances, it may be acceptable to confirm that the combined velocity and acceleration points go through fewer (i.e. a subset of the) stages. For example, from Quadrant IV (Stage 2) to Quadrant III (Stage 3), or from Quadrant I (Stage 1) to Quadrant II (Stage 4), or from Quadrant I (Stage 1) to Quadrant IV (Stage 2) to Quadrant II (Stage 4). It will be understood that the present disclosure extends to confirming progression through any subset of the sequence depicted in
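The quadrant tracking just described can be sketched as follows, assuming velocity on the horizontal axis and acceleration on the vertical axis of the 2D representation, and a rectangular noise box centred at (0, 0). The names and noise-box parameters are hypothetical; this illustrates the full four-stage sequence rather than a subset.

```python
def quadrant(vel, acc, vel_noise, acc_noise):
    """Map a (velocity, acceleration) point to a quadrant number,
    or None if the point lies inside the noise box around (0, 0)."""
    if abs(vel) <= vel_noise and abs(acc) <= acc_noise:
        return None                     # inside the noise box: ignore
    if vel >= 0:
        return 1 if acc >= 0 else 4     # Quadrant I (stage 1) / IV (stage 2)
    return 3 if acc <= 0 else 2         # Quadrant III (stage 3) / II (stage 4)

def press_detected(points, vel_noise, acc_noise):
    """Confirm a user press event when the trajectory of (velocity,
    acceleration) points visits Quadrants I, IV, III, II consecutively
    and in that order (stages 1 to 4)."""
    expected = [1, 4, 3, 2]
    stage = 0
    for vel, acc in points:
        if quadrant(vel, acc, vel_noise, acc_noise) == expected[stage]:
            stage += 1
            if stage == len(expected):
                return True
    return False
```

Note that no explicit 2D plot is generated; the quadrant of each point is determined directly from the signs of the velocity and acceleration values.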
Of course, it is not necessary to actually generate such a 2D representation in order to perform quadrant tracking; the graph of
While the noise box is illustrated as a rectangle centered at (0,0) in
Further, it will be understood that although in the example of
In addition, it will be understood that the boundaries of the “noise box” can be dynamically calculated or estimated using a recursive averaging technique as an alternative to using fixed thresholds. In such a situation, the definition of the noise box may be unsupervised in the sense that it may be applied on the voltage domain (without using any sensitivity value knowledge) and without setting any threshold.
With reference to
This method allows for the unsupervised detection of force events, and in addition is configured to be independent of incorrect sensitivity estimates.
Each of
Thus, starting with
Moving on to
Moving on to
Finally, moving on to
With the progression through Quadrants I, IV, III and II (stages 1, 2, 3 and 4) in that order detected, it has been determined that a user press has been detected, and this is indicated above the upper plot in
Merely as an example, one possible mechanism or algorithm for adaptively updating the size of the noise box is to define the dimensions of the rectangle (e.g. as depicted in
vel[n]=x[n]−x[n−1]
acc[n]=vel[n]−vel[n−1]
vel_th[n]=vel_th[n−1]λ+(1−λ)(abs(vel[n]))
acc_th[n]=acc_th[n−1]λ+(1−λ)(abs(acc[n]))
where n is the sample number, x[n] is the value of the force input signal x at sample n, vel[n] and acc[n] are the first and second derivatives of the signal x at sample n respectively used to determine the size of the rectangle, abs indicates the absolute value, and λ is a forgetting factor. Of course, other adaptive, recursive methods will be known to the skilled person.
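The recursive update above can be sketched directly in code. This is an illustrative sketch of the stated equations only; it assumes zero initial conditions for x, vel and the thresholds, and the forgetting factor value is hypothetical.

```python
def update_noise_box(x, lam=0.95):
    """Recursively estimate the velocity and acceleration noise
    thresholds (the noise box half-dimensions) from a force signal x,
    using the forgetting-factor updates:
        vel_th[n] = vel_th[n-1]*lam + (1 - lam)*abs(vel[n])
        acc_th[n] = acc_th[n-1]*lam + (1 - lam)*abs(acc[n])
    Returns the final (vel_th, acc_th) pair."""
    vel_th = acc_th = 0.0
    prev_x = prev_vel = 0.0
    for xn in x:
        vel = xn - prev_x        # vel[n] = x[n] - x[n-1]
        acc = vel - prev_vel     # acc[n] = vel[n] - vel[n-1]
        vel_th = vel_th * lam + (1 - lam) * abs(vel)
        acc_th = acc_th * lam + (1 - lam) * abs(acc)
        prev_x, prev_vel = xn, vel
    return vel_th, acc_th
```

A larger λ (closer to 1) makes the noise box adapt more slowly, giving more weight to the accumulated history than to the current sample.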
It will be understood that the above-described gradient release threshold method and the press dynamics method may be independently used in a force sensor system or may be used in combination to determine the occurrence of user press events in the system. It will be further understood that either or both of the methods may be used in combination with other force sensor methods, e.g. based on fixed or adaptive thresholds.
In a preferred aspect, the above-described methods are used when the temperature of the force sensor system exceeds a temperature threshold, e.g. above 50 degrees Celsius. As a result, the force sensor system may be arranged to receive a temperature input, e.g. from a temperature sensor provided in the device 102, to control the operation of the various methods. For temperatures below such a threshold, standard force sensor methods may be used, but for temperatures above such a threshold the use of one or both of the above methods can prevent the processing of incorrect inputs. It will be understood that standard force sensor methods, e.g. using fixed or adaptive thresholds, may continue to be used for temperatures above the temperature threshold.
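The temperature-gated selection of detection methods described above can be sketched simply. The threshold value follows the 50 degrees Celsius example in the text; the method names are hypothetical labels, not actual API identifiers.

```python
def select_detection_method(temperature_c, temp_threshold=50.0):
    """Select the force-event detection strategy from a temperature
    input (e.g. from a temperature sensor in the device): the gradient
    and/or press dynamics methods above the threshold, standard
    (fixed or adaptive threshold) methods below it."""
    if temperature_c > temp_threshold:
        return "gradient_and_press_dynamics"
    return "standard_thresholds"
```

As noted, the standard methods may also continue to run above the threshold, with the gradient and press dynamics methods acting to reject incorrect inputs.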
It will be understood that the above-described methods may be implemented in a dedicated control module, for example a processing module or DSP. The control module may be provided as an integral part of the force sensor system or may be provided as part of a centralized controller such as a central processing unit (CPU) or applications processor (AP). It will be understood that the control module may be provided with a suitable memory storage module for storing measured and calculated data for use in the described processes.
The skilled person will recognise that some aspects of the above described apparatus (circuitry) and methods may be embodied as processor control code (e.g. a computer program), for example on a non-volatile carrier medium such as a disk, CD- or DVD-ROM, programmed memory such as read only memory (Firmware), or on a data carrier such as an optical or electrical signal carrier.
For some applications, such aspects will be implemented on a DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array). Thus the code may comprise conventional program code or microcode or, for example, code for setting up or controlling an ASIC or FPGA. The code may also comprise code for dynamically configuring re-configurable apparatus such as re-programmable logic gate arrays. Similarly, the code may comprise code for a hardware description language such as Verilog™ or VHDL. As the skilled person will appreciate, the code may be distributed between a plurality of coupled components in communication with one another. Where appropriate, such aspects may also be implemented using code running on a field-(re)programmable analogue array or similar device in order to configure analogue hardware.
Embodiments may be implemented in a host device, especially a portable and/or battery powered host device such as a mobile computing device for example a laptop or tablet computer, a games console, a remote control device, a home automation controller or a domestic appliance including a domestic temperature or lighting control system, a toy, a machine such as a robot, an audio player, a video player, a headphone or earphone, or a mobile telephone for example a smartphone. It will be understood that embodiments may be implemented as part of any suitable human-machine interface system, for example on a home appliance or in a vehicle or interactive display. There is further provided a host device incorporating the above-described system.
There is further provided a control method for a sensor system as described above.
It should be understood—especially by those having ordinary skill in the art with the benefit of this disclosure—that the various operations described herein, particularly in connection with the figures, may be implemented by other circuitry or other hardware components. The order in which each operation of a given method is performed may be changed, and various elements of the systems illustrated herein may be added, reordered, combined, omitted, modified, etc. It is intended that this disclosure embrace all such modifications and changes and, accordingly, the above description should be regarded in an illustrative rather than a restrictive sense.
Similarly, although this disclosure makes reference to specific embodiments, certain modifications and changes can be made to those embodiments without departing from the scope and coverage of this disclosure. Moreover, any benefits, advantages, or solutions to problems that are described herein with regard to specific embodiments are not intended to be construed as a critical, required, or essential feature or element. It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. The word “comprising” does not exclude the presence of elements or steps other than those listed in the claim, “a” or “an” does not exclude a plurality, and a single feature or other unit may fulfil the functions of several units recited in the claims. Any reference numerals or labels in the claims shall not be construed so as to limit their scope.
Further embodiments likewise, with the benefit of this disclosure, will be apparent to those having ordinary skill in the art, and such embodiments should be deemed as being encompassed herein.
As used herein, when two or more elements are referred to as “coupled” to one another, such term indicates that such two or more elements are in electronic communication or mechanical communication, as applicable, whether connected indirectly or directly, with or without intervening elements.
This disclosure encompasses all changes, substitutions, variations, alterations, and modifications to the example embodiments herein that a person having ordinary skill in the art would comprehend. Similarly, where appropriate, the appended claims encompass all changes, substitutions, variations, alterations, and modifications to the example embodiments herein that a person having ordinary skill in the art would comprehend. Moreover, reference in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompasses that apparatus, system, or component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative. Accordingly, modifications, additions, or omissions may be made to the systems, apparatuses, and methods described herein without departing from the scope of the disclosure. For example, the components of the systems and apparatuses may be integrated or separated. Moreover, the operations of the systems and apparatuses disclosed herein may be performed by more, fewer, or other components and the methods described may include more, fewer, or other steps. Additionally, steps may be performed in any suitable order. As used in this document, “each” refers to each member of a set or each member of a subset of a set.
Although exemplary embodiments are illustrated in the figures and described below, the principles of the present disclosure may be implemented using any number of techniques, whether currently known or not. The present disclosure should in no way be limited to the exemplary implementations and techniques illustrated in the drawings and described above.
Unless otherwise specifically noted, articles depicted in the drawings are not necessarily drawn to scale.
All examples and conditional language recited herein are intended for pedagogical objects to aid the reader in understanding the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions. Although embodiments of the present disclosure have been described in detail, it should be understood that various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the disclosure.
Although specific advantages have been enumerated above, various embodiments may include some, none, or all of the enumerated advantages. Additionally, other technical advantages may become readily apparent to one of ordinary skill in the art after review of the foregoing figures and description.
To aid the Patent Office and any readers of any patent issued on this application in interpreting the claims appended hereto, applicants wish to note that they do not intend any of the appended claims or claim elements to invoke 35 U.S.C. § 112(f) unless the words “means for” or “step for” are explicitly used in the particular claim.
Aspects of the system may be defined by the following numbered statements:
1. There is provided a control method for a force sensor system used to define at least one virtual button, the control method comprising the steps of:
receiving a force sensor input;
determining a gradient of the force sensor input; and
comparing the determined gradient to a gradient threshold to determine a user press event of a virtual button.
2. Preferably, the method comprises the step of comparing the determined gradient to a falling gradient threshold to determine a release event of a virtual button.
3. In one aspect, the falling gradient threshold is selected based on characteristic properties of a device having the force sensor system, e.g. the stiffness of a chassis or frame of such a device. In a preferred aspect, the falling gradient threshold is set at approximately −5 N/s.
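Statements 1 to 3 can be sketched as follows, assuming a finite-difference gradient and a hypothetical rising (press) threshold alongside the approximately −5 N/s falling (release) threshold of statement 3.

```python
RISING_THRESHOLD = 5.0    # N/s, hypothetical press threshold
FALLING_THRESHOLD = -5.0  # N/s, release threshold per statement 3

def gradient_events(force_samples, dt):
    """Classify press/release events from the force-signal gradient.

    A finite-difference gradient at or above the rising threshold marks
    a press of the virtual button; a gradient at or below the falling
    threshold marks a release.
    """
    events = []
    for i in range(1, len(force_samples)):
        grad = (force_samples[i] - force_samples[i - 1]) / dt
        if grad >= RISING_THRESHOLD:
            events.append((i, "press"))
        elif grad <= FALLING_THRESHOLD:
            events.append((i, "release"))
    return events
```

In practice the gradient would typically be low-pass filtered before comparison, as described in statement 8 below.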
4. Additionally or alternatively, there is provided a first control method for a force sensor system used to define at least one virtual button, the control method comprising the steps of:
receiving a force sensor input;
determining a gradient of the force sensor input; and
comparing the determined gradient to a first re-calibration threshold to determine a re-calibration requirement of the force sensor system.
5. Preferably, the first re-calibration threshold is a negative value. In a preferred aspect, the first re-calibration threshold is selected as the falling gradient threshold as described above. In one aspect, the first re-calibration threshold is set at approximately −5 N/s.
6. Preferably, responsive to a determination that a re-calibration is required, the method comprises the step of comparing the determined gradient to a second re-calibration threshold to trigger a re-calibration of the force sensor system.
7. Preferably, the second re-calibration threshold is set at a level higher than the first re-calibration threshold. Preferably, the second re-calibration threshold is approximately a zero value, e.g. 0 N/s. Alternatively, the second re-calibration threshold may be a positive value, or a negative value close to zero, e.g. −1 N/s.
As the second threshold is defined at a positive or zero value, the re-calibration of the system will be performed at the correct point for accurate system operation.
8. Preferably, the method comprises determining a smoothed gradient of the force sensor input, preferably by performing a low pass filtering of the gradient signal, wherein the steps of comparing are performed using the smoothed gradient.
Providing a smoothed gradient removes high-frequency noise from the gradient signal, to allow for more accurate comparison of the gradient to the relevant thresholds.
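The two-threshold re-calibration logic of statements 4 to 8 can be sketched as a small state machine. The one-pole smoother, the class and method names, and the numeric defaults are illustrative assumptions; only the −5 N/s and approximately-zero threshold values come from the statements above.

```python
class GradientRecalibrator:
    """Two-threshold re-calibration trigger (cf. statements 4 to 8).

    Crossing the first (negative) threshold flags that a re-calibration
    is required; the re-calibration itself fires once the smoothed
    gradient recovers to the second threshold near zero, so that the
    new baseline is captured after the transient has settled.
    """

    def __init__(self, first=-5.0, second=0.0, alpha=0.2):
        self.first, self.second, self.alpha = first, second, alpha
        self.smoothed = 0.0
        self.armed = False

    def step(self, gradient):
        # One-pole low-pass filter of the raw gradient (statement 8).
        self.smoothed += self.alpha * (gradient - self.smoothed)
        if not self.armed:
            self.armed = self.smoothed < self.first
            return False
        if self.smoothed >= self.second:
            self.armed = False
            return True  # trigger re-calibration now
        return False
```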
9. Additionally or alternatively, there is provided a second control method for a force sensor system used to define at least one virtual button, the control method comprising the steps of:
receiving a force sensor input;
determining first and second derivatives of the force sensor input to provide velocity and acceleration values;
mapping the velocity and acceleration values to a 2D representation; and
performing quadrant tracking or state space tracking of the 2D representation of the velocity and acceleration values to determine if a user press event has occurred.
10. Preferably, the method comprises the step of determining a user press event has occurred if the quadrant tracking follows a defined sequence.
11. Preferably, the method comprises the step of determining a user press event has occurred if the velocity and acceleration values sequentially progress through first, second, third and fourth stages, wherein the stages are defined as follows:
First stage: positive velocity, positive acceleration;
Second stage: positive velocity, negative acceleration;
Third stage: negative velocity, negative acceleration; and
Fourth stage: negative velocity, positive acceleration.
12. Preferably, the method comprises the step of defining a noise box as part of a 2D representation, wherein a force sensor input having velocity and acceleration values falling outside of the noise box is assumed to be part of a user press event.
13. Preferably, the noise box is defined as a space at the center of the 2D representation.
14. Preferably, the noise box is centered at (0,0) of the 2D representation.
15. In one aspect, the noise box is defined as a rectangle having a velocity value of between +/−0.4 N/s and an acceleration value of between +/−0.25 N/s². Alternatively, the noise box is dynamically calculated.
16. In an alternative aspect, the noise box is defined as an alternative shape, e.g. an ellipse.
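Statements 9 to 16 together can be sketched as below. The finite-difference derivative scheme and the sequence matching are illustrative assumptions; only the quadrant ordering of statement 11 and the rectangular noise box with ±0.4 N/s and ±0.25 N/s² half-widths of statements 12 to 15 come from the statements above.

```python
VBOX, ABOX = 0.4, 0.25          # noise-box half-widths (statement 15)
PRESS_SEQUENCE = [1, 2, 3, 4]   # stage ordering of statement 11

def quadrant(vel, acc):
    """Map a (velocity, acceleration) pair to a quadrant, or 0 if it
    falls inside the noise box and is ignored (statement 12)."""
    if abs(vel) <= VBOX and abs(acc) <= ABOX:
        return 0
    if vel >= 0:
        return 1 if acc >= 0 else 2
    return 3 if acc <= 0 else 4

def press_detected(force, dt):
    """Quadrant tracking over the first and second derivatives of the
    force input: a press is reported when the trajectory visits
    quadrants 1, 2, 3 and 4 in that order."""
    seen = []
    for i in range(2, len(force)):
        vel = (force[i] - force[i - 1]) / dt
        acc = (force[i] - 2 * force[i - 1] + force[i - 2]) / dt ** 2
        q = quadrant(vel, acc)
        if q and (not seen or seen[-1] != q):
            seen.append(q)  # collapse consecutive duplicates
    # Look for the 1-2-3-4 sequence among the visited quadrants.
    for s in range(len(seen) - 3):
        if seen[s:s + 4] == PRESS_SEQUENCE:
            return True
    return False
```

Applied to a smooth press-and-release bump, the trajectory sweeps through the four stages in order; a flat or noise-only input never leaves the noise box.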
17. In one aspect, the above-described control methods are performed together in a force sensor system.
18. Preferably, at least one of the above-described control methods comprises:
receiving a temperature level of a force sensor system; and
performing the control method when the temperature level is above a temperature threshold.
19. Preferably, the temperature threshold is approximately 50 degrees Celsius.
20. The temperature level may be received from a temperature sensor provided at the force sensor system, or may be received from a temperature sensor of a device comprising such a force sensor system.
21. Preferably, the above-described control methods are performed in combination with an alternative force sensor control method to determine user touch events, e.g. a force sensor control method using absolute or adaptive thresholds.
22. There is further provided a force sensor system comprising at least one force sensor and a controller arranged to implement at least one of the above-described control methods.
23. Preferably, the at least one force sensor comprises one or more of the following:
a capacitive displacement sensor,
an inductive force sensor,
a strain gauge,
a piezoelectric force sensor,
a force sensing resistor,
a piezoresistive force sensor,
a thin film force sensor, and
a quantum tunneling composite-based force sensor.
24. There is provided a host device comprising a force sensor system as described above.
Aspects of the present disclosure may also be defined by the following numbered statements:
S1. A method of controlling a force sensor system to define at least one button implemented by at least one force sensor, the method comprising:
S2. The method according to statement S1, wherein:
S3. The method according to statement S1 or S2, wherein said gradient is:
S4. The method according to any of the preceding statements, wherein the control comprises comparing the determined gradient to:
S5. The method according to statement S4, wherein at least one of the press-event gradient threshold, the falling gradient threshold and the first re-calibration threshold is a negative gradient threshold or has a negative value, and is optionally set at a value corresponding to a detected rate of change of force with respect to time of approximately −5 N/s.
S6. The method according to statement S4 or S5, wherein at least one of the press-event gradient threshold, the falling gradient threshold and the first re-calibration threshold is based on characteristic properties of a device having the force sensor system, such as the stiffness of a chassis or frame of the device.
S7. The method according to any of statements S4 to S6, wherein the press-event gradient threshold, the falling gradient threshold and the first re-calibration threshold, or at least two of those thresholds, are the same as one another.
S8. The method according to any of statements S4 to S7, wherein the method comprises comparing the determined gradient to a second re-calibration threshold to trigger a re-calibration of the force sensor system.
S9. The method according to statement S8, wherein a value of the second re-calibration threshold is set at a level higher than, or more positive than, that of the first re-calibration threshold.
S10. The method according to statement S8 or S9, wherein:
S11. The method according to any of statements S8 to S10, wherein the method comprises triggering the re-calibration if the determined gradient crosses the first re-calibration threshold and the second re-calibration threshold, optionally in that order.
S12. The method according to any of the preceding statements, wherein:
S13. The method according to any of the preceding statements, comprising:
S14. The method according to statement S13, wherein:
S15. The method according to statement S13 or S14, comprising:
S16. The method according to any of statements S13 to S15, comprising:
S17. The method according to any of statements S13 to S16, wherein the first-derivative values and the second-derivative values are determined as successive pairs of (instantaneous) derivative values, each pair comprising a first-derivative value and a second-derivative value corresponding to a given value of the force sensor input or to a given point in time.
S18. The method according to statement S17, comprising:
S19. The method according to statement S18, wherein the 2D representation comprises a plot of first-derivative values against second-derivative values, or vice versa.
S20. The method according to statement S18 or S19, wherein the method comprises determining a user press event has occurred if the quadrant tracking or state space tracking follows a defined sequence.
S21. The method according to any of statements S18 to S20, wherein the method comprises defining a noise box as part of the 2D representation,
S22. The method according to statement S21, wherein the noise box is defined as a space at the centre of the 2D representation.
S23. The method according to statement S21 or S22, wherein the noise box is centred at (0,0) or an origin of the 2D representation.
S24. The method according to any of statements S21 to S23, wherein the noise box is defined as a rectangle, optionally having a first-derivative value of between +/−0.4 N/s and a second-derivative value of between +/−0.25 N/s².
S25. The method according to any of statements S21 to S24, wherein the noise box is dynamically calculated.
S26. The method according to any of statements S21 to S25, wherein the noise box is defined as a noise shape, such as an ellipse or cross.
S27. The method according to any of statements S17 to S26, wherein the method comprises determining that a user press event has occurred if the successive pairs of first-derivative and second-derivative values sequentially progress through first, second, third and fourth stages (in that order), wherein the stages are defined as follows:
S28. The method according to any of statements S17 to S27, wherein the method comprises defining a first-derivative noise range as a range of first-derivative values attributable to noise, and excluding a pair of first-derivative and second-derivative values from determining a user press event has occurred if the first-derivative value of that pair is within the first-derivative noise range.
S29. The method according to statement S28, wherein the method comprises updating the first-derivative noise range dynamically, optionally based on preceding first-derivative values.
S30. The method according to any of statements S17 to S29, wherein the method comprises defining a second-derivative noise range as a range of second-derivative values attributable to noise, and excluding a pair of first-derivative and second-derivative values from determining a user press event has occurred if the second-derivative value of that pair is within the second-derivative noise range.
S31. The method according to statement S30, wherein the method comprises updating the second-derivative noise range dynamically, optionally based on preceding second-derivative values.
S32. The method according to any of statements S17 to S31, wherein the method comprises defining a noise space as a space defined by pairs of first-derivative and second-derivative values attributable to noise, and excluding a pair of first-derivative and second-derivative values from determining a user press event has occurred if that pair of first-derivative and second-derivative values falls within the noise space.
S33. The method according to statement S32, wherein the method comprises updating the noise space dynamically, optionally based on preceding first-derivative values, second-derivative values and/or pairs of first-derivative and second-derivative values.
S34. The method according to any of the preceding statements, comprising:
S35. The method according to statement S34, wherein the temperature threshold is approximately 50 degrees Celsius.
S36. The method according to statement S34 or S35, wherein the temperature level is received from a temperature sensor of the force sensor system, or from a temperature sensor of a device comprising the force sensor system.
S37. The method according to any of statements S34 to S36, comprising:
S38. A force sensor system, comprising:
S39. The force sensor system according to statement S38, wherein the at least one force sensor comprises one or more of the following:
S40. A host device comprising the force sensor system according to statement S38 or S39.
The present disclosure is a continuation of U.S. patent application Ser. No. 16/850,117, filed Apr. 16, 2020, which claims benefit of U.S. Provisional Patent Application Ser. No. 62/915,245, filed Oct. 15, 2019, each of which is incorporated by reference herein in its entirety; and relates to U.S. Provisional Patent Application Ser. No. 62/842,821, filed May 3, 2019 and U.S. patent application Ser. No. 16/422,543, filed May 24, 2019, all of which are incorporated by reference herein in their entireties.
| 20190020760 | DeBates et al. | Jan 2019 | A1 |
| 20190035235 | Da Costa et al. | Jan 2019 | A1 |
| 20190227628 | Rand et al. | Jan 2019 | A1 |
| 20190044651 | Nakada | Feb 2019 | A1 |
| 20190051229 | Ozguner et al. | Feb 2019 | A1 |
| 20190064925 | Kim et al. | Feb 2019 | A1 |
| 20190069088 | Seiler | Feb 2019 | A1 |
| 20190073078 | Sheng et al. | Mar 2019 | A1 |
| 20190102031 | Shutzberg et al. | Apr 2019 | A1 |
| 20190103829 | Vasudevan et al. | Apr 2019 | A1 |
| 20190138098 | Shah | May 2019 | A1 |
| 20190163234 | Kim et al. | May 2019 | A1 |
| 20190196596 | Yokoyama et al. | Jun 2019 | A1 |
| 20190206396 | Chen | Jul 2019 | A1 |
| 20190215349 | Adams et al. | Jul 2019 | A1 |
| 20190220095 | Ogita et al. | Jul 2019 | A1 |
| 20190228619 | Yokoyama et al. | Jul 2019 | A1 |
| 20190114496 | Lesso | Aug 2019 | A1 |
| 20190235629 | Hu et al. | Aug 2019 | A1 |
| 20190253031 | Vellanki et al. | Aug 2019 | A1 |
| 20190294247 | Hu et al. | Sep 2019 | A1 |
| 20190295755 | Konradi et al. | Sep 2019 | A1 |
| 20190296674 | Janko et al. | Sep 2019 | A1 |
| 20190297418 | Stahl | Sep 2019 | A1 |
| 20190305851 | Vegas-Olmos et al. | Oct 2019 | A1 |
| 20190311590 | Doy et al. | Oct 2019 | A1 |
| 20190341903 | Kim | Nov 2019 | A1 |
| 20190384393 | Cruz-Hernandez et al. | Dec 2019 | A1 |
| 20190384898 | Chen et al. | Dec 2019 | A1 |
| 20200117506 | Chan | Apr 2020 | A1 |
| 20200139403 | Palit | May 2020 | A1 |
| 20200150767 | Karimi Eskandary et al. | May 2020 | A1 |
| 20200218352 | Macours et al. | Jul 2020 | A1 |
| 20200231085 | Kunii et al. | Jul 2020 | A1 |
| 20200306796 | Lindemann et al. | Oct 2020 | A1 |
| 20200313529 | Lindemann | Oct 2020 | A1 |
| 20200313654 | Marchais et al. | Oct 2020 | A1 |
| 20200314969 | Marchais et al. | Oct 2020 | A1 |
| 20200348249 | Marchais et al. | Nov 2020 | A1 |
| 20200403546 | Janko et al. | Dec 2020 | A1 |
| 20210108975 | Peso Parada et al. | Apr 2021 | A1 |
| 20210125469 | Alderson | Apr 2021 | A1 |
| 20210153562 | Fishwick et al. | May 2021 | A1 |
| 20210157436 | Peso Parada et al. | May 2021 | A1 |
| 20210174777 | Marchais et al. | Jun 2021 | A1 |
| 20210175869 | Taipale | Jun 2021 | A1 |
| 20210200316 | Das et al. | Jul 2021 | A1 |
| 20210325967 | Khenkin et al. | Oct 2021 | A1 |
| 20210328535 | Khenkin et al. | Oct 2021 | A1 |
| 20210365118 | Rajapurkar et al. | Nov 2021 | A1 |
| 20220026989 | Rao et al. | Jan 2022 | A1 |
| 20220328752 | Lesso | Oct 2022 | A1 |
| 20220404398 | Reynaga et al. | Dec 2022 | A1 |

| Number | Date | Country |
|---|---|---|
| 2002347829 | Apr 2003 | AU |
| 103165328 | Jun 2013 | CN |
| 104811838 | Jul 2015 | CN |
| 204903757 | Dec 2015 | CN |
| 105264551 | Jan 2016 | CN |
| 106438890 | Feb 2017 | CN |
| 103403796 | Jul 2017 | CN |
| 106950832 | Jul 2017 | CN |
| 107665051 | Feb 2018 | CN |
| 107835968 | Mar 2018 | CN |
| 210628147 | May 2020 | CN |
| 114237414 | Mar 2022 | CN |
| 0784844 | Jun 2005 | EP |
| 2306269 | Apr 2011 | EP |
| 2363785 | Sep 2011 | EP |
| 2487780 | Aug 2012 | EP |
| 2600225 | Jun 2013 | EP |
| 2846218 | Mar 2015 | EP |
| 2846229 | Mar 2015 | EP |
| 2846329 | Mar 2015 | EP |
| 2988528 | Feb 2016 | EP |
| 3125508 | Feb 2017 | EP |
| 3379382 | Sep 2018 | EP |
| 201620746 | Jan 2017 | GB |
| 201747044027 | Aug 2018 | IN |
| H02130433 | May 1990 | JP |
| 08149006 | Jun 1996 | JP |
| H10184782 | Jul 1998 | JP |
| 6026751 | Nov 2016 | JP |
| 6250985 | Dec 2017 | JP |
| 6321351 | May 2018 | JP |
| 20120126446 | Nov 2012 | KR |
| 2013104919 | Jul 2013 | WO |
| 2013186845 | Dec 2013 | WO |
| 2014018086 | Jan 2014 | WO |
| 2014094283 | Jun 2014 | WO |
| 2016105496 | Jun 2016 | WO |
| 2016164193 | Oct 2016 | WO |
| 2017034973 | Mar 2017 | WO |
| 2017113651 | Jul 2017 | WO |
| 2018053159 | Mar 2018 | WO |
| 2018067613 | Apr 2018 | WO |
| 2018125347 | Jul 2018 | WO |
| 2020004840 | Jan 2020 | WO |
| 2020055405 | Mar 2020 | WO |

| Entry |
|---|
| Invitation to Pay Additional Fees, Partial International Search Report and Provisional Opinion of the International Searching Authority, International Application No. PCT/US2020/052537, dated Jan. 14, 2021. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/052537, dated Mar. 9, 2021. |
| Office Action of the Intellectual Property Office, ROC (Taiwan) Patent Application No. 107115475, dated Apr. 30, 2021. |
| First Office Action, China National Intellectual Property Administration, Patent Application No. 2019800208570, dated Jun. 3, 2021. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2021/021908, dated Jun. 9, 2021. |
| Notice of Preliminary Rejection, Korean Intellectual Property Office, Application No. 10-2019-7036236, dated Jun. 29, 2021. |
| Combined Search and Examination Report, United Kingdom Intellectual Property Office, Application No. GB2018051.9, dated Jun. 30, 2021. |
| Communication pursuant to Rule 164(2)(b) and Article 94(3) EPC, European Patent Office, Application No. 18727512.8, dated Jul. 8, 2021. |
| Gottfried Behler: “Measuring the Loudspeaker's Impedance during Operation for the Derivation of the Voice Coil Temperature”, AES Convention Preprint, Feb. 25, 1995 (Feb. 25, 1995), Paris. |
| First Office Action, China National Intellectual Property Administration, Patent Application No. 2019800211287, dated Jul. 5, 2021. |
| Steinbach et al., Haptic Data Compression and Communication, IEEE Signal Processing Magazine, Jan. 2011. |
| Pezent et al., Syntacts Open-Source Software and Hardware for Audio-Controlled Haptics, IEEE Transactions on Haptics, vol. 14, No. 1, Jan.-Mar. 2021. |
| Examination Report under Section 18(3), United Kingdom Intellectual Property Office, Application No. GB2018051.9, dated Nov. 5, 2021. |
| Jaijongrak et al., A Haptic and Auditory Assistive User Interface: Helping the Blinds on their Computer Operations, 2011 IEEE International Conference on Rehabilitation Robotics, Rehab Week Zurich, ETH Zurich Science City, Switzerland, Jun. 29-Jul. 1, 2011. |
| Lim et al., An Audio-Haptic Feedbacks for Enhancing User Experience in Mobile Devices, 2013 IEEE International Conference on Consumer Electronics (ICCE). |
| Weddle et al., How Does Audio-Haptic Enhancement Influence Emotional Response to Mobile Media, 2013 Fifth International Workshop on Quality of Multimedia Experience (QoMEX), QMEX 2013. |
| Danieau et al., Enhancing Audiovisual Experience with Haptic Feedback: A Survey on HAV, IEEE Transactions on Haptics, vol. 6, No. 2, Apr.-Jun. 2013. |
| Danieau et al., Toward Haptic Cinematography: Enhancing Movie Experiences with Camera-Based Haptic Effects, IEEE Computer Society, IEEE MultiMedia, Apr.-Jun. 2014. |
| Final Notice of Preliminary Rejection, Korean Patent Office, Application No. 10-2019-7036236, dated Nov. 29, 2021. |
| Examination Report under Section 18(3), United Kingdom Intellectual Property Office, Application No. GB2018050.1, dated Dec. 22, 2021. |
| Second Office Action, National Intellectual Property Administration, PRC, Application No. 2019800208570, dated Jan. 19, 2022. |
| Examination Report under Section 18(3), UKIPO, Application No. GB2106247.6, dated Mar. 31, 2022. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2019/050964, dated Sep. 3, 2019. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2019/050770, dated Jul. 5, 2019. |
| Communication Relating to the Results of the Partial International Search, and Provisional Opinion Accompanying the Partial Search Result, of the International Searching Authority, International Application No. PCT/US2018/031329, dated Jul. 20, 2018. |
| Combined Search and Examination Report, UKIPO, Application No. GB1720424.9, dated Jun. 5, 2018. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2019/052991, dated Mar. 17, 2020. |
| Communication Relating to the Results of the Partial International Search, and Provisional Opinion Accompanying the Partial Search Result, of the International Searching Authority, International Application No. PCT/GB2020/050822, dated Jul. 9, 2020. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2020/024864, dated Jul. 6, 2020. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/051035, dated Jul. 10, 2020. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/050823, dated Jun. 30, 2020. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/051037, dated Jul. 9, 2020. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/050822, dated Aug. 31, 2020. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/051438, dated Sep. 28, 2020. |
| First Examination Opinion Notice, State Intellectual Property Office of the People's Republic of China, Application No. 201880037435.X, dated Dec. 31, 2020. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2020/056610, dated Jan. 21, 2021. |
| Combined Search and Examination Report under Sections 17 and 18(3), UKIPO, Application No. GB2210174.5, dated Aug. 1, 2022. |
| Examination Report under Section 18(3), UKIPO, Application No. GB2112207.2, dated Aug. 18, 2022. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2022/030541, dated Sep. 1, 2022. |
| Vanderborght, B. et al., Variable impedance actuators: A review; Robotics and Autonomous Systems 61, Aug. 6, 2013, pp. 1601-1614. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2022/033190, dated Sep. 8, 2022. |
| International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2022/033230, dated Sep. 15, 2022. |
| Examination Report under Section 18(3), UKIPO, Application No. GB2115048.7 dated Aug. 24, 2022. |
| Communication Pursuant to Article 94(3) EPC, European Patent Office, Application No. 18727512.8, dated Sep. 26, 2022. |
| Examination Report under Section 18(3), UKIPO, Application No. GB2112207.2, dated Nov. 7, 2022. |
| Examination Report, Intellectual Property India, Application No. 202117019138, dated Jan. 4, 2023. |
| Examination Report under Section 18(3), UKIPO, Application No. GB2113228.7, dated Feb. 10, 2023. |
| Examination Report under Section 18(3), UKIPO, Application No. GB2113154.5, dated Feb. 17, 2023. |
| First Office Action, China National Intellectual Property Administration, Application No. 2019107179621, dated Jan. 19, 2023. |

| Number | Date | Country |
|---|---|---|
| 20220260439 A1 | Aug 2022 | US |

| Number | Date | Country |
|---|---|---|
| 62915245 | Oct 2019 | US |

| | Number | Date | Country |
|---|---|---|---|
| Parent | 16850117 | Apr 2020 | US |
| Child | 17735582 | US |