Device comprising force sensors

Information

  • Patent Grant
  • Patent Number
    11,515,875
  • Date Filed
    Thursday, March 4, 2021
  • Date Issued
    Tuesday, November 29, 2022
Abstract
A device, comprising: a pair of force sensors located for detecting a user squeeze input; and a controller operable in a squeeze detection operation to detect the user squeeze input based on a cross-correlation between respective sensor signals originating from the pair of force sensors.
Description
FIELD OF DISCLOSURE

The present disclosure relates in general to a device comprising force sensors. Such a device may be a portable electrical or electronic device.


The present disclosure extends to a controller of the device and to corresponding methods and computer programs.


BACKGROUND

Force sensors are known as possible input transducers for devices such as portable electrical or electronic devices, and can be used as alternatives to traditional mechanical switches. Such sensors detect forces on the device to determine user interaction, e.g. touches or presses of the device (user force inputs).


It is desirable to process the sensor signals originating from such force sensors in a convenient and useful manner.


SUMMARY

According to a first aspect of the present disclosure, there is provided a device, comprising: a pair of force sensors located for detecting a user squeeze input; and a controller operable in a squeeze detection operation to detect the user squeeze input based on a cross-correlation between respective sensor signals originating from the pair of force sensors.


The device may be a portable electrical or electronic device such as a portable telephone or computer. Other example devices are mentioned later herein. Using cross-correlation as disclosed herein provides a robust way of detecting a user squeeze input.


The user squeeze input may comprise a user applying forces: with one or both of their hands; and/or which together compress the device; and/or at at least two different locations on the device at the same time; and/or on at least two different sides or edges of the device at the same time; and/or on at least two opposite or opposing sides or edges of the device at the same time.


The pair of force sensors may be provided: at different locations on the device; and/or on the same side or edge of the device, or on different sides or edges of the device, or on opposite or opposing sides or edges of the device; and/or on the device at locations according to anthropometric measurements of a human hand.


The squeeze detection operation may comprise determining a cross-correlation value based on the sensor signals and detecting the user squeeze input based on the cross-correlation value. Determining the cross-correlation value may comprise determining a sliding dot product, a cross-product, a product, a sum or a combination of the sensor signals originating from the pair of force sensors.


The respective sensor signals originating from the pair of force sensors may be digital signals. The squeeze detection operation may comprise determining the cross-correlation value on a sample-by-sample basis.


For a given sample, the cross-correlation value may be generated as an updated cross-correlation value by updating an existing cross-correlation value (which was the updated cross-correlation value for the previous sample) based on a new cross-correlation value determined based on the sensor signals for that sample. The updated cross-correlation value may be based on the existing cross-correlation value to an extent defined by a smoothing parameter. The updated cross-correlation value may be based on a combination or sum of a proportion (e.g. 90%) of the existing cross-correlation value and a proportion (e.g. 10%) of the new cross-correlation value, those proportions defined by the smoothing parameter. The cross-correlation value may be generated as a smoothed cross-product of the respective sensor signals originating from the pair of force sensors.


The squeeze detection operation may comprise: at least one of normalising, filtering and bounding the cross-correlation value; and/or normalising the cross-correlation value to a maximum expected force value (i.e. to a value representative of a maximum expected force applied to a force sensor); and/or converting the cross-correlation value into a percentage or a fraction of a defined maximum value (e.g. 1 or 100); and/or comparing the cross-correlation value with a threshold value (e.g. a squeeze threshold, above which it is determined that a user squeeze input has occurred).


The squeeze detection operation may comprise determining whether the cross-correlation value exceeds the threshold value. The threshold value may be controlled based on one or more of a device configuration, a device setting and a user input.


The squeeze detection operation may comprise determining whether the cross-correlation value exceeds the threshold value for a threshold period of time, or by a threshold percentage of the threshold period of time. Thus, it may be that it is not sufficient for the cross-correlation value to exceed the threshold value only briefly. The threshold period and/or the threshold percentage may be controlled based on one or more of a device configuration, a device setting and a user input.


The device may comprise a plurality of pairs of force sensors, each pair located for detecting a corresponding user squeeze input. The controller may be operable, for each pair of force sensors, to carry out a said squeeze detection operation to detect the corresponding user squeeze input.


The device may comprise at least two said pairs of force sensors located on the device for detecting the same user squeeze input. The controller may be operable to detect the user squeeze input corresponding to those pairs of force sensors based on a combination of the squeeze detection operations carried out for those pairs, optionally by combining cross-correlation values determined in respect of each of those pairs.


At least one said pair of force sensors may be part of a group of force sensors located on the device for detecting a user squeeze input corresponding to that group. The squeeze detection operation, for that group, may comprise comparing respective sensor signals originating from at least three of the force sensors of the group.


The group may comprise force sensors s1, s2, s3 and s4. The sensor signals originating from the group may be digital signals s1(n), s2(n), s3(n) and s4(n) corresponding respectively to the force sensors s1, s2, s3 and s4 and each comprising a series of numbered samples, where n is the sample number. The squeeze detection operation for the group may comprise calculating correlation coefficients ρ1(n) and ρ2(n) based on the equations:

ρ1(n)=λ·ρ1(n−1)+(1−λ)·s1(n)·s2(n)
ρ2(n)=λ·ρ2(n−1)+(1−λ)·s3(n)·s4(n)

where λ is a smoothing parameter.


The squeeze detection operation for that group may comprise normalising the correlation coefficients ρ1(n) and ρ2(n) to produce respective normalised correlation coefficients based on the equations:









{tilde over (ρ)}1(n)=min(max(ρ1(n),0),γ)/γ
{tilde over (ρ)}2(n)=min(max(ρ2(n),0),γ)/γ






where γ is a parameter representing a maximum expected squared force.


The squeeze detection operation for the group may comprise determining a squeeze force level signal y(n) based on the equation:

y(n)=min({tilde over (ρ)}1(n)+{tilde over (ρ)}2(n),1).


The squeeze detection operation for the group may comprise detecting the user squeeze input based on the squeeze force level signal y(n).


The controller may be configured to control operation of the device based on detection of the user squeeze input, optionally by outputting a control signal based on detection of the user squeeze input. The device may comprise one or more input/output components, wherein the controller is configured to control operation of at least one of the input/output components based on detection of the user squeeze input.


Each of the force sensors may comprise one or more of: a capacitive displacement sensor; an inductive force sensor; a strain gauge; a piezoelectric force sensor; a force sensing resistor; a piezoresistive force sensor; a thin film force sensor; and a quantum tunneling composite-based force sensor.


According to a second aspect of the present disclosure, there is provided a controller for use in a device comprising a pair of force sensors located for detecting a user squeeze input, the controller operable in a squeeze detection operation to detect the user squeeze input based on a cross-correlation between respective sensor signals originating from the pair of force sensors.


According to a third aspect of the present disclosure, there is provided a method of detecting a user squeeze input in a device comprising a pair of force sensors located for detecting the user squeeze input, the method comprising detecting the user squeeze input based on a cross-correlation between respective sensor signals originating from the pair of force sensors.


According to a fourth aspect of the present disclosure, there is provided a computer program which, when executed by a controller of a device comprising a pair of force sensors located for detecting a user squeeze input, causes the controller to carry out a squeeze detection operation to detect the user squeeze input based on a cross-correlation between respective sensor signals originating from the pair of force sensors.


According to a fifth aspect of the present disclosure, there is provided a device, comprising: a pair of force sensors located for detecting a user force input; and a controller operable in a detection operation to detect the user force input based on a cross-correlation between respective sensor signals originating from the pair of force sensors.





BRIEF DESCRIPTION OF THE DRAWINGS

Reference will now be made, by way of example only, to the accompanying drawings, of which:



FIG. 1 is a schematic diagram of a device according to an embodiment; and



FIG. 2 presents example graphs of sensor signals which may be received from the force sensors of the FIG. 1 device.





DETAILED DESCRIPTION


FIG. 1 is a schematic diagram of a device 100 according to an embodiment, for example a mobile or portable electrical or electronic device. Example device 100 includes a portable and/or battery powered host device such as a mobile telephone, a smartphone, an audio player, a video player, a PDA, a mobile computing platform such as a laptop computer or tablet and/or a games device.


As shown in FIG. 1, the device 100 may comprise an enclosure 101, a controller 110, a memory 120, a plurality of force sensors 130, and an input and/or output unit (I/O unit) 140.


The enclosure 101 may comprise any suitable housing, casing, or other enclosure for housing the various components of device 100. Enclosure 101 may be constructed from plastic, metal, and/or any other suitable materials. In addition, enclosure 101 may be adapted (e.g., sized and shaped) such that device 100 is readily transported by a user (i.e. a person).


Controller 110 may be housed within enclosure 101 and may include any system, device, or apparatus configured to control functionality of the device 100, including any or all of the memory 120, the force sensors 130, and the I/O unit 140. Controller 110 may be implemented as digital or analogue circuitry, in hardware or in software running on a processor, or in any combination of these.


Thus controller 110 may include any system, device, or apparatus configured to interpret and/or execute program instructions or code and/or process data, and may include, without limitation a processor, microprocessor, microcontroller, digital signal processor (DSP), application specific integrated circuit (ASIC), FPGA (Field Programmable Gate Array) or any other digital or analogue circuitry configured to interpret and/or execute program instructions and/or process data. Thus the code may comprise program code or microcode or, for example, code for setting up or controlling an ASIC or FPGA. The code may also comprise code for dynamically configuring re-configurable apparatus such as re-programmable logic gate arrays. Similarly, the code may comprise code for a hardware description language such as Verilog™ or VHDL. As the skilled person will appreciate, the code may be distributed between a plurality of coupled components in communication with one another. Where appropriate, such aspects may also be implemented using code running on a field-(re)programmable analogue array or similar device in order to configure analogue hardware. Processor control code for execution by the controller 110, may be provided on a non-volatile carrier medium such as a disk, CD- or DVD-ROM, programmed memory such as read only memory (Firmware), or on a data carrier such as an optical or electrical signal carrier. The controller 110 may be referred to as control circuitry and may be provided as, or as part of, an integrated circuit such as an IC chip.


Memory 120 may be housed within enclosure 101, may be communicatively coupled to controller 110, and may include any system, device, or apparatus configured to retain program instructions and/or data for a period of time (e.g., computer-readable media). In some embodiments, controller 110 interprets and/or executes program instructions and/or processes data stored in memory 120 and/or other computer-readable media accessible to controller 110.


The force sensors 130 may be housed within, be located on or form part of the enclosure 101, and may be communicatively coupled to the controller 110. Each force sensor 130 may include any suitable system, device, or apparatus for sensing a force, a pressure, or a touch (e.g., an interaction with a human finger) and for generating an electrical or electronic signal in response to such force, pressure, or touch. Example force sensors 130 include or comprise capacitive displacement sensors, inductive force sensors, strain gauges, piezoelectric force sensors, force sensing resistors, piezoresistive force sensors, thin film force sensors and quantum tunneling composite-based force sensors.


In some arrangements, the electrical or electronic signal generated by a force sensor 130 may be a function of a magnitude of the force, pressure, or touch applied to the force sensor (a user force input). Such electronic or electrical signal may comprise a general purpose input/output (GPIO) signal associated with an input signal in response to which the controller 110 controls some functionality of the device 100. The term “force” as used herein may refer not only to force, but to physical quantities indicative of force or analogous to force such as, but not limited to, pressure and touch.


The I/O unit 140 may be housed within enclosure 101, may be distributed across the device 100 (i.e. it may represent a plurality of units) and may be communicatively coupled to the controller 110. Although not specifically shown in FIG. 1, the I/O unit 140 may comprise any or all of a microphone, an LRA (linear resonant actuator, or other device capable of outputting a force, such as a vibration), a radio (or other electromagnetic) transmitter/receiver, a speaker, a display screen (optionally a touchscreen), an indicator (such as an LED), a sensor (e.g. accelerometer, temperature sensor, tilt sensor, electronic compass, etc.) and one or more buttons or keys.


As a convenient example to keep in mind, the device 100 may be a haptic-enabled device. As is well known, haptic technology recreates the sense of touch by applying forces, vibrations, or motions to a user. The device 100 for example may be considered a haptic-enabled device (a device enabled with haptic technology) where its force sensors 130 (input transducers) measure forces exerted by the user on a user interface (such as a button or touchscreen on a mobile telephone or tablet computer), and an LRA or other output transducer of the I/O unit 140 applies forces directly or indirectly (e.g. via a touchscreen) to the user, e.g. to give haptic feedback. Some aspects of the present disclosure, for example the controller 110 and/or the force sensors 130, may be arranged as part of a haptic circuit, for instance a haptic circuit which may be provided in the device 100. A circuit or circuitry embodying aspects of the present disclosure (such as the controller 110) may be implemented (at least in part) as an integrated circuit (IC), for example on an IC chip. One or more input or output transducers (such as the force sensors 130 or an LRA) may be connected to the integrated circuit in use.


Of course, this application to haptic technology is just one example application of the device 100 comprising the plurality of force sensors 130. The force sensors 130 may simply serve as generic input transducers to provide input signals to control other aspects of the device 100, such as a GUI (graphical user interface) displayed on a touchscreen of the I/O unit 140 or an operational state of the device 100 (such as waking components from a low-power “sleep” state).


The device 100 is shown comprising four force sensors 130, labelled s1, s2, s3 and s4, with their signals labelled S1, S2, S3 and S4, respectively. However, it will be understood that the device 100 generally need only comprise a pair of (i.e. at least two) force sensors 130 in connection with the techniques described herein, for example any pair of the sensors s1 to s4. Example pairs comprise s1 and s2, s1 and s3, s1 and s4, s2 and s4, s2 and s3, and s3 and s4. The four force sensors 130 s1 to s4 are shown for ready understanding of a particular arrangement described later. Of course, the device 100 may comprise more than four force sensors 130, such as additional sensors s5 to s8 arranged in a similar way to sensors s1 to s4 but in another area of the device 100.


Although FIG. 1 is schematic, it will be understood that the sensors s1 to s4 are located so that they can receive force inputs from a user, in particular a user hand, during use of the device 100. A user force input in this context corresponds to a user touching, pushing, pressing, or swiping the device, optionally with one or both of their hands, in the vicinity of one or more of the force sensors 130, so that a force (e.g. a threshold amount of force) may in some cases be applied at multiple force sensors at or substantially at the same time (simultaneously or contemporaneously). Of course, in some cases the user may apply a user force input at a single force sensor 130. For example, a change in the amount of force applied may be detected, rather than an absolute amount of force.


Thus, the force sensors s1 to s4 may be located on the device according to anthropometric measurements of a human hand (e.g. so that a single human hand will likely apply a force to multiple force sensors when squeezing the device 100). For example, where there is only a pair of force sensors 130, they may be provided on the same side (e.g. s1 and s3), or on opposite sides (e.g. s1 and s2), of the device 100. It will be understood that the force sensors 130 are provided at different locations on the device, but may be in close proximity to one another.


In overview, taking a pair of force sensors 130 as a minimum case, the controller 110 is operable to perform a squeeze detection operation to detect a user squeeze input, the squeeze detection operation being a function of sensor signals originating from the respective force sensors 130 of the pair.


In this context, a user squeeze input comprises a user applying forces (e.g. with one or both of their hands) which together compress the device. Such forces may be applied at at least two different locations on the device at the same time, such as on at least two different sides or edges of the device. For example, such forces may be applied on at least two opposite or opposing sides or edges of the device at the same time. With the force sensors at different locations on the device (on the same side or edge of the device, or on different sides or edges of the device as mentioned earlier) such a user squeeze input may be picked up.


The squeeze detection operation involves operating on both of the sensor signals originating from the pair of force sensors 130, each force sensor of the pair having its own sensor signal. The squeeze detection operation may be considered to comprise a comparison of those sensor signals. The controller 110 is thus connected to receive sensor signals, in digital or analogue form, originating from the force sensors 130.


The squeeze detection operation (e.g. the comparison of the sensor signals) may comprise determining a detection value based on the sensor signals, in particular a cross-correlation value as described in more detail later.



FIG. 2 presents example graphs of analogue (time domain) signals s1(t), s2(t), s3(t) and s4(t), which may be received from the force sensors s1, s2, s3 and s4, respectively, based on an example user squeeze applied to the device 100 by a user hand. In each graph, the x-axis represents time (e.g. measured in seconds, or milliseconds), and the y-axis represents force (e.g. measured in Newtons). It will be appreciated that the analogue signals may be voltage signals, in which case the y-axis unit may be volts (e.g. millivolts) but still be representative of detected force.


Also shown in FIG. 2 in schematic form alongside each of the graphs is an analogue-to-digital conversion of each of the analogue (time domain) signals s1(t), s2(t), s3(t) and s4(t) to corresponding digital (digital domain) signals s1(n), s2(n), s3(n) and s4(n), respectively. The analogue-to-digital conversion could be carried out by corresponding analogue-to-digital converters (ADCs, not shown), which could be provided within the force sensors 130, within the controller 110, or between the force sensors 130 and the controller 110. The force sensors 130 could be digital force sensors which output digital signals s1(n), s2(n), s3(n) and s4(n) directly.


It will be apparent from FIG. 2 that by considering the sensor signals from at least a pair of the force sensors 130 it may be possible to detect a user squeeze input, i.e. a user squeezing the device so that a force is applied at multiple force sensors at the same time (simultaneously or contemporaneously).


There are several ways to consider the sensor signals from at least a pair of the force sensors 130. Taking the minimum case of considering the sensor signals from (only) a pair of the force sensors 130, the detection value may comprise or be a correlation value (cross-correlation value), determined by calculating a correlation between the sensor signals.


In some arrangements where the sensor signals are digital signals, the cross-correlation value is calculated as a cross-product of the sensor signals concerned (i.e. of their magnitudes) on a sample-by-sample basis. Smoothing of the cross-correlation values may be carried out. For example, for a given sample, the cross-correlation value may be generated as an updated cross-correlation value by updating an existing cross-correlation value based on a new cross-correlation value determined based on the sensor signals for that sample. In some arrangements, the updated cross-correlation value is based on the existing cross-correlation value to an extent defined by a smoothing parameter, or is based on a combination (e.g. sum) of a proportion of the existing cross-correlation value and a proportion of the new cross-correlation value, those proportions defined by the smoothing parameter. Those proportions may for example sum to 1 (100%), e.g. being 0.9 (90%) and 0.1 (10%), or 0.7 (70%) and 0.3 (30%). In this respect, the cross-correlation value may be considered a smoothed cross-product of the respective sensor signals originating from the pair of force sensors. Other examples of smoothing may include taking a running average (e.g. of a given number of cross-correlation values) such as a sliding window average (with a given or adaptable window size), or low-pass filtering.
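
By way of illustration only, the following Python sketch shows one way such a sample-by-sample smoothed cross-product could be computed; the function name, signature and default smoothing value are assumptions made for the sketch and are not taken from the present disclosure.

    def smoothed_cross_correlation(s1_samples, s2_samples, smoothing=0.9):
        # Per-sample exponentially smoothed cross-product of two digital
        # force-sensor signals. 'smoothing' is the proportion of the existing
        # value retained at each update (e.g. 0.9 keeps 90% of the existing
        # value and adds 10% of the new cross-product).
        rho = 0.0
        history = []
        for x1, x2 in zip(s1_samples, s2_samples):
            new_value = x1 * x2  # cross-product for this sample
            rho = smoothing * rho + (1.0 - smoothing) * new_value
            history.append(rho)
        return history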


The detection value may comprise a summation value, determined by summing the sensor signals. As another example, the detection value may comprise a difference value, determined by calculating a difference between the sensor signals. As another example, the detection value may comprise a multiplication value, determined by multiplying the sensor signals one by the other. As another example, the detection value may comprise a division value, determined by dividing the sensor signals one by the other. As another example, the detection value may comprise a convolution value, determined by convolving the sensor signals one with the other. Of course, combinations of these values may be used in the squeeze detection operation.


It will be appreciated that the sensor signals or the detection values (in particular, cross-correlation values) may be subject to conversion (e.g. analogue-to-digital), normalisation, filtering (e.g. high-pass, low-pass or band-pass frequency filtering), averaging (e.g. finding a running average) or other signal conditioning operations. The detection values may for example be normalised to a maximum expected force value, and then converted to a percentage (or a fraction of a defined maximum value). The detection values may for example be bounded, between given maximum and minimum boundary values such as 0 and 1.
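
As an informal sketch of the normalisation and bounding described above (the function name and the maximum expected force value of 1.0e4 are arbitrary assumptions for illustration), a detection value could be conditioned as follows:

    def normalise_detection_value(value, max_expected_force=1.0e4):
        # Bound the detection value to [0, max_expected_force] and express it
        # as a fraction of that maximum (multiply by 100 for a percentage).
        bounded = min(max(value, 0.0), max_expected_force)
        return bounded / max_expected_force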


In some arrangements, the squeeze detection operation compares the detection value with a threshold value. For example, the squeeze detection operation may determine whether the detection value exceeds the threshold value. The controller 110 may be configured to control the threshold value based on one or more of a device configuration, a device setting and a user input.


The squeeze detection operation may involve determining whether the detection value exceeds the threshold value for a threshold period of time, or exceeds the threshold value over a threshold percentage of a threshold period of time. The controller 110 may be configured to control the threshold period and/or the threshold percentage based on one or more of a device configuration, a device setting and a user input.
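
One possible sketch of such a time-qualified threshold test is given below; the helper name and the choice of expressing the threshold period as a number of samples are assumptions made for illustration, not a definitive implementation.

    def squeeze_detected(detection_values, threshold, window_samples, min_fraction=1.0):
        # True if the detection value exceeds 'threshold' for at least
        # 'min_fraction' of the most recent 'window_samples' samples
        # (min_fraction=1.0 requires the whole threshold period).
        recent = detection_values[-window_samples:]
        if len(recent) < window_samples:
            return False
        above = sum(1 for v in recent if v > threshold)
        return above >= min_fraction * window_samples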


As in FIG. 1, the device 100 may comprise a plurality of pairs of force sensors 130. For example, the device 100 is shown as having two pairs of force sensors 130, e.g. s1 and s2 as one pair, and s3 and s4 as another pair. Each of these pairs may be considered located for detecting a corresponding (different) user squeeze input. The controller 110 may be operable, for each pair of force sensors 130 (i.e. on a pair-by-pair basis), to carry out a squeeze detection operation to detect the corresponding user squeeze input. Those operations may be carried out at least in part in parallel or in series (sequentially).


Where there are at least two pairs of force sensors 130, as in FIG. 1, those pairs may be located on the device 100 for detecting the same user squeeze input. The controller 110 may be operable to detect the user squeeze input corresponding to those pairs of force sensors 130 based on a combination of the squeeze detection operations carried out for those pairs. For example, the controller 110 may combine detection values for the two detection operations in some way (e.g. take an average), for example after normalisation or bounding.


Where there are more than two force sensors 130, as in FIG. 1, those force sensors may be considered to form a group of force sensors 130 (the group comprising a pair of force sensors 130) located on the device 100 for detecting a user squeeze input corresponding to that group. The squeeze detection operation, for that group, may be a function (comprising a comparison) of the respective sensor signals originating from at least three of the force sensors 130 of the group.


As a detailed example based on FIG. 1, the group may be considered to comprise the force sensors 130 s1, s2, s3 and s4. In line with FIG. 2, it may be considered that the sensor signals originating from the group are digital signals s1(n), s2(n), s3(n) and s4(n), corresponding respectively to the force sensors s1, s2, s3 and s4 and each comprising a series of numbered samples, where n is the sample number.


In that case, the squeeze detection operation for the group may comprise calculating correlation (cross-correlation) coefficients ρ1(n) and ρ2(n) as example cross-correlation values based on the equations:

ρ1(n)=λ·ρ1(n−1)+(1−λ)·s1(n)·s2(n)
ρ2(n)=λ·ρ2(n−1)+(1−λ)·s3(n)·s4(n)

where λ is a smoothing (weighting or learning rate) parameter. Here, the correlation coefficients ρ1(n) and ρ2(n) could be considered updated correlation coefficients and are based at least in part on previous or existing correlation coefficients ρ1(n−1) and ρ2(n−1) and newly-calculated coefficients (i.e. based on the current samples) s1(n)·s2(n) and s3(n)·s4(n), to an extent defined by the smoothing parameter. It can readily be seen above that the smoothing parameter λ determines the relative proportions of the existing and new coefficients that make up the updated coefficients. For example, if λ is 0.9 then in the above equations an updated coefficient will be the sum of 90% of the existing coefficient and 10% of the new coefficient concerned.


Thus, the above equations may be considered to calculate smoothed cross-products. Other methods of smoothing include averaging (e.g. calculating a running average or sliding window average or time-based or multiple-sample-based average) and low-pass filtering. Of course, it may be that only one of the correlation coefficients ρ1(n) and ρ2(n) is calculated, e.g. where only two force sensors are employed, however the present example where both are calculated will be continued.


The above equations for the correlation coefficients ρ1(n) and ρ2(n) may be considered a simplification of more general cross-correlation equations which take account of a potentially variable window size w and hardware delay Δ (e.g. a relative delay between the signals provided by the force sensors 130), the above simplification using w=1 and Δ=0.


Such a more general cross-correlation equation is indicated below for the correlation coefficient ρ1(n), where i is the sensor index, si(n) denotes a vector of the w most recent samples of sensor signal si, and the superscript T denotes the transpose:

ρ1(n)=λ·ρ1(n−1)+(1−λ)·s1(n−Δ)·s2T(n)
si(n)=[si(n−w+1), si(n−w+2), . . . , si(n)]


It will be appreciated that there may be a hardware delay between the force sensors 130, and it may be desirable in some applications to use a larger window size than 1, or for example to vary the window size dynamically.
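
A sketch of this more general update, assuming NumPy arrays and illustrative parameter names (w for the window size, delta for the hardware delay Δ, lam for λ), might look as follows; with w=1 and delta=0 it reduces to the simple per-sample update above.

    import numpy as np

    def update_windowed_correlation(rho_prev, s1, s2, n, w=1, delta=0, lam=0.9):
        # Windowed cross-correlation update: dot product of the w most recent
        # samples of s1 (delayed by 'delta' samples) with the w most recent
        # samples of s2, blended with the previous coefficient via lam.
        # (Assumes n is large enough that both windows are fully populated.)
        win1 = np.asarray(s1[n - delta - w + 1 : n - delta + 1], dtype=float)
        win2 = np.asarray(s2[n - w + 1 : n + 1], dtype=float)
        return lam * rho_prev + (1.0 - lam) * float(np.dot(win1, win2))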


The correlation coefficients ρ1(n) and ρ2(n) may be normalised to produce respective normalised correlation coefficients as follows:









{tilde over (ρ)}1(n)=min(max(ρ1(n),0),γ)/γ
{tilde over (ρ)}2(n)=min(max(ρ2(n),0),γ)/γ






where γ is a parameter representing the maximum expected squared force.


These normalised correlation coefficients are bounded between 0 and 1, and may be combined as follows to provide the squeeze force level y(n):

y(n)=min({tilde over (ρ)}1(n)+{tilde over (ρ)}2(n),1)


The squeeze detection operation for said group may comprise detecting the user squeeze input based on the squeeze force level signal y(n), for example by comparing the signal with a threshold. It will be appreciated that only one of the correlation coefficients ρ1(n) and ρ2(n) (e.g. ρ1(n)) may have been normalised to produce a corresponding normalised correlation coefficient as above, and this value used as the squeeze force level y(n).
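
Bringing the pieces together, a minimal end-to-end sketch of the four-sensor squeeze detection described above could be structured as below; the class, its default parameter values and the fixed threshold are illustrative assumptions rather than a definitive implementation of the present disclosure.

    class SqueezeDetector:
        def __init__(self, lam=0.9, gamma=1.0e4, threshold=0.5):
            self.lam = lam              # smoothing parameter (lambda)
            self.gamma = gamma          # maximum expected squared force
            self.threshold = threshold  # squeeze threshold on y(n)
            self.rho1 = 0.0
            self.rho2 = 0.0

        def process_sample(self, s1, s2, s3, s4):
            # Consume one sample from each of sensors s1..s4 and return the
            # squeeze force level y(n) and a detection flag.
            self.rho1 = self.lam * self.rho1 + (1.0 - self.lam) * s1 * s2
            self.rho2 = self.lam * self.rho2 + (1.0 - self.lam) * s3 * s4
            # Normalise and bound each correlation coefficient to [0, 1]
            rho1_n = min(max(self.rho1, 0.0), self.gamma) / self.gamma
            rho2_n = min(max(self.rho2, 0.0), self.gamma) / self.gamma
            y = min(rho1_n + rho2_n, 1.0)  # squeeze force level signal
            return y, y > self.threshold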


Parameter values for the smoothing parameter λ and the maximum expected squared force parameter γ (and window size w and hardware delay Δ if used) may differ from application to application, and may be varied dynamically. Of course, one or more of these parameters may be tunable, for example dynamically based on any of the signals s1(n), s2(n), s3(n), s4(n) and y(n) or based on a tuning input, or set for a given application or based on a user input.


The operations described herein are dependent at least to an extent on the arrangement of the force sensors 130 in the device 100, and relate in particular to how the input sensor signals are handled in the controller 110. The skilled person will accordingly recognise that aspects of the operations disclosed herein (and associated methods) may be embodied within the controller 110 itself based on the input sensor signals it receives. As such, the controller 110 itself and the methods it carries out (and corresponding computer programs) may embody the present invention.


Turning back to FIG. 1, the controller 110 may be configured to control operation of the device 100 based on detection of the user squeeze input. For example, the controller 110 may be configured to control operation of itself or of at least one of the input/output components of the I/O unit 140 based on detection of the user squeeze input. In the context of haptic functionality, the controller 110 may be configured to control an LRA within the I/O unit 140 based on detection of the user squeeze input.


As another example, the user squeeze input may be taken to be a user input in connection with a GUI (graphical user interface) displayed on a touchscreen of the device 100. Of course, numerous other examples will occur to the skilled person, the user squeeze input simply serving as a generic user input which may be taken advantage of in any way.


It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims. The word “comprising” does not exclude the presence of elements or steps other than those listed in the claim, “a” or “an” does not exclude a plurality, and a single feature or other unit may fulfil the functions of several units recited in the claims. Any reference numerals or labels in the claims shall not be construed so as to limit their scope.

Claims
  • 1. A device, comprising: first and second force sensors located on respective different sides or edges of the device from one another, the first and second force sensors being different force sensors from one another, said different sides or edges opposing one another such that when a user applies forces on said different sides or edges of the device at the same time said forces oppose one another and together compress the device, and a user squeeze input comprising the user applying said forces; and a controller configured in a squeeze detection operation to detect the user squeeze input, the squeeze detection operation comprising determining a cross-correlation value of a cross-correlation between a first sensor signal originating from the first force sensor and a second sensor signal originating from the second force sensor and comparing the cross-correlation value with a threshold value.
  • 2. The device as claimed in claim 1, wherein said user squeeze input comprises the user applying said forces with one or both of their hands.
  • 3. The device as claimed in claim 1, wherein said first and second force sensors are provided on the device at locations according to anthropometric measurements of a human hand.
  • 4. The device as claimed in claim 1, wherein determining the cross-correlation value comprises determining a sliding dot product, a cross-product, a product, a sum or a combination of the sensor signals originating from the first and second force sensors.
  • 5. The device as claimed in claim 1, wherein the respective sensor signals originating from the first and second force sensors are digital signals, and the squeeze detection operation comprises determining the cross-correlation value on a sample-by-sample basis.
  • 6. The device as claimed in claim 5, wherein: for a given sample, the cross-correlation value is generated as an updated cross-correlation value by updating an existing cross-correlation value based on a new cross-correlation value determined based on the sensor signals for that given sample, optionally wherein the updated cross-correlation value is based on the existing cross-correlation value to an extent defined by a smoothing parameter, or is based on a combination or sum of a proportion of the existing cross-correlation value and a proportion of the new cross-correlation value, those proportions defined by the smoothing parameter; and/or the cross-correlation value is generated as a smoothed or averaged cross-product of the respective sensor signals originating from the first and second force sensors.
  • 7. The device as claimed in claim 1, wherein the squeeze detection operation comprises: at least one of normalising, filtering and bounding the cross-correlation value; and/or normalising the cross-correlation value to a maximum expected force value; and/or converting the cross-correlation value into a percentage or a fraction of a defined maximum value.
  • 8. The device as claimed in claim 7, wherein the squeeze detection operation comprises determining whether the cross-correlation value exceeds the threshold value, optionally wherein the device is configured to control the threshold value based on one or more of a device configuration, a device setting and a user input.
  • 9. The device as claimed in claim 7, wherein the squeeze detection operation comprises determining whether the cross-correlation value exceeds the threshold value for a threshold period of time, or by a threshold percentage of the threshold period of time.
  • 10. The device as claimed in claim 1, comprising a plurality of pairs of force sensors, one of said pairs comprising said first and second force sensors, each pair located for detecting a corresponding user squeeze input, wherein: the controller is operable, for each said pair of force sensors, to carry out a said squeeze detection operation to detect the corresponding user squeeze input.
  • 11. The device as claimed in claim 1, wherein the first and second force sensors are a pair of force sensors, and wherein at least two said pairs of force sensors are located on the device for detecting the same user squeeze input, wherein: the controller is operable to detect the user squeeze input corresponding to those pairs of force sensors based on a combination of the squeeze detection operations carried out for those pairs, optionally by combining cross-correlation values determined in respect of each of those pairs.
  • 12. The device as claimed in claim 1, the first and second force sensors being part of a group of force sensors located on the device for detecting a user squeeze input corresponding to that group, wherein: the squeeze detection operation, for said group, comprises comparing respective sensor signals originating from at least three of the force sensors of the group.
  • 13. The device as claimed in claim 1, said first and second force sensors being part of a group of force sensors located on the device for detecting a user squeeze input corresponding to that group, wherein: said group comprises force sensors s1, s2, s3 and s4; the sensor signals originating from the group are digital signals s1(n), s2(n), s3(n) and s4(n) corresponding respectively to the force sensors s1, s2, s3 and s4 and each comprising a series of numbered samples, where n is the sample number; and the squeeze detection operation for said group comprises calculating correlation coefficients ρ1(n) and ρ2(n) based on the equations: ρ1(n)=λ·ρ1(n−1)+(1−λ)·s1(n)·s2(n); ρ2(n)=λ·ρ2(n−1)+(1−λ)·s3(n)·s4(n)
  • 14. The device as claimed in claim 13, wherein: the squeeze detection operation for said group comprises normalising the correlation coefficients ρ1(n) and ρ2(n) to produce respective normalised correlation coefficients based on the equations: {tilde over (ρ)}1(n)=min(max(ρ1(n),0),γ)/γ and {tilde over (ρ)}2(n)=min(max(ρ2(n),0),γ)/γ, where γ is a parameter representing a maximum expected squared force.
  • 15. The device as claimed in claim 14, wherein: the squeeze detection operation for said group comprises determining a squeeze force level signal y(n) based on the equation: y(n)=min({tilde over (ρ)}1(n)+{tilde over (ρ)}2(n),1).
  • 16. The device as claimed in claim 15, wherein the squeeze detection operation for said group comprises detecting the user squeeze input based on the squeeze force level signal y(n).
  • 17. The device as claimed in claim 1, wherein the controller is configured to control operation of the device based on detection of the user squeeze input, optionally by outputting a control signal based on detection of the user squeeze input.
  • 18. A controller for use in a device comprising first and second force sensors located on respective different sides or edges of the device from one another, the first and second force sensors being different force sensors from one another, said different sides or edges opposing one another such that when a user applies forces on said different sides or edges of the device at the same time said forces oppose one another and together compress the device, and a user squeeze input comprising the user applying said forces, the controller configured in a squeeze detection operation to detect the user squeeze input, the squeeze detection operation comprising determining a cross-correlation value of a cross-correlation between a first sensor signal originating from the first force sensor and a second sensor signal originating from the second force sensor and comparing the cross-correlation value with a threshold value.
  • 19. A method of controlling a device, the device comprising first and second force sensors located on respective different sides or edges of the device from one another, the first and second force sensors being different force sensors from one another, said different sides or edges opposing one another such that when a user applies forces on said different sides or edges of the device at the same time said forces oppose one another and together compress the device, and a user squeeze input comprising the user applying said forces, the method comprising: detecting the user squeeze input in a squeeze detection operation, the squeeze detection operation comprising determining a cross-correlation value of a cross-correlation between a first sensor signal originating from the first force sensor and a second sensor signal originating from the second force sensor and comparing the cross-correlation value with a threshold value.
Parent Case Info

The present disclosure is a continuation of U.S. Non-Provisional patent application Ser. No. 16/369,645, filed Mar. 29, 2019, which is incorporated by reference herein in its entirety.

US Referenced Citations (298)
Number Name Date Kind
3686927 Scharton Aug 1972 A
4902136 Mueller et al. Feb 1990 A
5684722 Thorner et al. Nov 1997 A
5748578 Schell May 1998 A
5857986 Moriyasu Jan 1999 A
6050393 Murai et al. Apr 2000 A
6278790 Davis et al. Aug 2001 B1
6294891 McConnell et al. Sep 2001 B1
6332029 Azima et al. Dec 2001 B1
6388520 Wada et al. May 2002 B2
6567478 Oishi et al. May 2003 B2
6580796 Kuroki Jun 2003 B1
6683437 Tierling Jan 2004 B2
6703550 Chu Mar 2004 B2
6762745 Braun et al. Jul 2004 B1
6768779 Nielsen Jul 2004 B1
6784740 Tabatabaei Aug 2004 B1
6906697 Rosenberg Jun 2005 B2
6995747 Casebolt Feb 2006 B2
7154470 Tierling Dec 2006 B2
7277678 Rozenblit et al. Oct 2007 B2
7333604 Zernovizky et al. Feb 2008 B2
7392066 Hapamas Jun 2008 B2
7456688 Okazaki et al. Nov 2008 B2
7623114 Rank Nov 2009 B2
7639232 Grant et al. Dec 2009 B2
7791588 Tierling et al. Sep 2010 B2
7979146 Ullrich et al. Jul 2011 B2
8068025 Devenyi et al. Nov 2011 B2
8098234 Lacroix et al. Jan 2012 B2
8102364 Tierling Jan 2012 B2
8325144 Tierling et al. Dec 2012 B1
8427286 Grant et al. Apr 2013 B2
8441444 Moore et al. May 2013 B2
8466778 Hwang et al. Jun 2013 B2
8480240 Kashiyama Jul 2013 B2
8572293 Cruz-Hernandez et al. Oct 2013 B2
8572296 Shasha et al. Oct 2013 B2
8593269 Grant et al. Nov 2013 B2
8648829 Shahoian et al. Feb 2014 B2
8659208 Rose et al. Feb 2014 B1
8754757 Ullrich et al. Jun 2014 B1
8947216 Da Costa et al. Feb 2015 B2
8981915 Birnbaum et al. Mar 2015 B2
8994518 Gregorio et al. Mar 2015 B2
9030428 Fleming May 2015 B2
9063570 Weddle et al. Jun 2015 B2
9070856 Rose et al. Jun 2015 B1
9083821 Hughes Jul 2015 B2
9092059 Bhatia Jul 2015 B2
9117347 Matthews Aug 2015 B2
9128523 Buuck et al. Sep 2015 B2
9164587 Da Costa et al. Oct 2015 B2
9196135 Shah et al. Nov 2015 B2
9248840 Truong Feb 2016 B2
9326066 Kilppel Apr 2016 B2
9329721 Buuck et al. May 2016 B1
9354704 Lacroix et al. May 2016 B2
9368005 Cruz-Hernandez et al. Jun 2016 B2
9489047 Jiang et al. Nov 2016 B2
9495013 Underkoffler et al. Nov 2016 B2
9507423 Gandhi et al. Nov 2016 B2
9513709 Gregorio et al. Dec 2016 B2
9520036 Buuck Dec 2016 B1
9588586 Rihn Mar 2017 B2
9640047 Choi et al. May 2017 B2
9652041 Jiang et al. May 2017 B2
9696859 Heller et al. Jul 2017 B1
9697450 Lee Jul 2017 B1
9715300 Sinclair et al. Jul 2017 B2
9740381 Chaudhri et al. Aug 2017 B1
9842476 Rihn et al. Dec 2017 B2
9864567 Seo Jan 2018 B2
9881467 Levesque Jan 2018 B2
9886829 Levesque Feb 2018 B2
9946348 Ullrich et al. Apr 2018 B2
9947186 Macours Apr 2018 B2
9959744 Koskan et al. May 2018 B2
9965092 Smith May 2018 B2
10032550 Zhang et al. Jul 2018 B1
10055950 Saboune et al. Aug 2018 B2
10074246 Da Costa et al. Sep 2018 B2
10110152 Hajati Oct 2018 B1
10171008 Nishitani et al. Jan 2019 B2
10175763 Shah Jan 2019 B2
10191579 Forlines et al. Jan 2019 B2
10264348 Harris et al. Apr 2019 B1
10275087 Smith Apr 2019 B1
10402031 Vandermeijden et al. Sep 2019 B2
10564727 Billington Feb 2020 B2
10620704 Rand et al. Apr 2020 B2
10667051 Stahl May 2020 B2
10726683 Mondello et al. Jul 2020 B1
10735956 Bae et al. Aug 2020 B2
10782785 Hu et al. Sep 2020 B2
10795443 Hu et al. Oct 2020 B2
10820100 Stahl et al. Oct 2020 B2
10828672 Stahl et al. Nov 2020 B2
10832537 Doy et al. Nov 2020 B2
10848886 Rand Nov 2020 B2
10860202 Sepehr et al. Dec 2020 B2
10969871 Rand et al. Apr 2021 B2
11069206 Rao et al. Jul 2021 B2
11139767 Janko et al. Oct 2021 B2
11150733 Das et al. Oct 2021 B2
11259121 Lindemann Feb 2022 B2
20010043714 Asada et al. Nov 2001 A1
20020018578 Burton Feb 2002 A1
20020085647 Oishi et al. Jul 2002 A1
20020169608 Tamir et al. Nov 2002 A1
20030068053 Chu Apr 2003 A1
20030214485 Roberts Nov 2003 A1
20050031140 Browning Feb 2005 A1
20050134562 Grant et al. Jun 2005 A1
20060028095 Maruyama et al. Feb 2006 A1
20060197753 Hotelling Sep 2006 A1
20070024254 Radecker et al. Feb 2007 A1
20070241816 Okazaki et al. Oct 2007 A1
20080077367 Odajima Mar 2008 A1
20080226109 Yamakata et al. Sep 2008 A1
20080240458 Goldstein et al. Oct 2008 A1
20080293453 Atlas et al. Nov 2008 A1
20080316181 Nurmi Dec 2008 A1
20090020343 Rothkopf et al. Jan 2009 A1
20090079690 Watson et al. Mar 2009 A1
20090088220 Persson Apr 2009 A1
20090096632 Ullrich et al. Apr 2009 A1
20090102805 Meijer et al. Apr 2009 A1
20090128306 Luden et al. May 2009 A1
20090153499 Kim et al. Jun 2009 A1
20090189867 Krah et al. Jul 2009 A1
20090278819 Goldenberg et al. Nov 2009 A1
20090313542 Cruz-Hernandez et al. Dec 2009 A1
20100013761 Birnbaum et al. Jan 2010 A1
20100080331 Garudadr et al. Apr 2010 A1
20100085317 Park et al. Apr 2010 A1
20100141408 Doy et al. Jun 2010 A1
20100141606 Bae et al. Jun 2010 A1
20100260371 Afshar Oct 2010 A1
20100261526 Anderson et al. Oct 2010 A1
20110056763 Tanase et al. Mar 2011 A1
20110075835 Hill Mar 2011 A1
20110077055 Pakula et al. Mar 2011 A1
20110141052 Bernstein et al. Jun 2011 A1
20110161537 Chang Jun 2011 A1
20110163985 Bae et al. Jul 2011 A1
20110167391 Momeyer et al. Jul 2011 A1
20120011436 Jinkinson et al. Jan 2012 A1
20120105358 Momeyer et al. May 2012 A1
20120112894 Yang et al. May 2012 A1
20120206246 Cruz-Hernandez et al. Aug 2012 A1
20120206247 Bhatia et al. Aug 2012 A1
20120229264 Company Bosch et al. Sep 2012 A1
20120253698 Cokonaj Oct 2012 A1
20120306631 Hughes Dec 2012 A1
20130016855 Lee et al. Jan 2013 A1
20130027359 Schevin et al. Jan 2013 A1
20130038792 Quigley et al. Feb 2013 A1
20130096849 Campbell et al. Apr 2013 A1
20130132091 Skerpac May 2013 A1
20130141382 Simmons et al. Jun 2013 A1
20130275058 Awad Oct 2013 A1
20130289994 Newman et al. Oct 2013 A1
20140056461 Afshar Feb 2014 A1
20140064516 Cruz-Hernandez et al. Mar 2014 A1
20140079248 Short et al. Mar 2014 A1
20140085064 Crawley et al. Mar 2014 A1
20140118125 Bhatia May 2014 A1
20140118126 Garg et al. May 2014 A1
20140119244 Steer et al. May 2014 A1
20140139327 Bau May 2014 A1
20140226068 Lacroix et al. Aug 2014 A1
20140292501 Lim et al. Oct 2014 A1
20140340209 Lacroix et al. Nov 2014 A1
20140347176 Modarres et al. Nov 2014 A1
20150061846 Yliaho Mar 2015 A1
20150070149 Cruz-Hernandez et al. Mar 2015 A1
20150070151 Cruz-Hernandez et al. Mar 2015 A1
20150070154 Levesque et al. Mar 2015 A1
20150070260 Saboune et al. Mar 2015 A1
20150084752 Heubel et al. Mar 2015 A1
20150130767 Myers et al. May 2015 A1
20150208189 Tsai Jul 2015 A1
20150216762 Oohashi et al. Aug 2015 A1
20150234464 Yliaho Aug 2015 A1
20150324116 Marsden et al. Nov 2015 A1
20150325116 Umminger, III Nov 2015 A1
20150341714 Ahn et al. Nov 2015 A1
20150356981 Johnson et al. Dec 2015 A1
20160004311 Yliaho Jan 2016 A1
20160007095 Lacroix Jan 2016 A1
20160063826 Morrell et al. Mar 2016 A1
20160070392 Wang Mar 2016 A1
20160074278 Muench et al. Mar 2016 A1
20160097662 Chang et al. Apr 2016 A1
20160132118 Park et al. May 2016 A1
20160162031 Westerman et al. Jun 2016 A1
20160179203 Modarres Jun 2016 A1
20160187987 Ulrich et al. Jun 2016 A1
20160239089 Taninaka et al. Aug 2016 A1
20160246378 Sampanes et al. Aug 2016 A1
20160248768 McLaren et al. Aug 2016 A1
20160277821 Kunimoto Sep 2016 A1
20160291731 Liu et al. Oct 2016 A1
20160328065 Johnson et al. Nov 2016 A1
20160358605 Ganong, III et al. Dec 2016 A1
20170052593 Jiang et al. Feb 2017 A1
20170078804 Guo et al. Mar 2017 A1
20170083096 Rihn Mar 2017 A1
20170090572 Holenarsipur et al. Mar 2017 A1
20170090573 Hajati et al. Mar 2017 A1
20170153760 Chawda et al. Jun 2017 A1
20170168574 Zhang Jun 2017 A1
20170169674 Macours Jun 2017 A1
20170180863 Biggs et al. Jun 2017 A1
20170220197 Matsumoto et al. Aug 2017 A1
20170256145 Macours et al. Sep 2017 A1
20170277350 Wang et al. Sep 2017 A1
20170031495 Tse Dec 2017 A1
20170357440 Tse Dec 2017 A1
20180021811 Kutej et al. Jan 2018 A1
20180059733 Gault et al. Mar 2018 A1
20180059793 Hajati Mar 2018 A1
20180067557 Robert et al. Mar 2018 A1
20180074637 Rosenberg et al. Mar 2018 A1
20180082673 Tzanetos Mar 2018 A1
20180084362 Zhang et al. Mar 2018 A1
20180151036 Cha et al. May 2018 A1
20180158289 Vasilev et al. Jun 2018 A1
20180159452 Eke et al. Jun 2018 A1
20180159457 Eke Jun 2018 A1
20180159545 Eke et al. Jun 2018 A1
20180160227 Lawrence et al. Jun 2018 A1
20180165925 Israr et al. Jun 2018 A1
20180178114 Mizuta et al. Jun 2018 A1
20180182212 Li et al. Jun 2018 A1
20180183372 Li et al. Jun 2018 A1
20180196567 Klein et al. Jul 2018 A1
20180224963 Lee et al. Aug 2018 A1
20180237033 Hakeem et al. Aug 2018 A1
20180206282 Singh Sep 2018 A1
20180253123 Levesque et al. Sep 2018 A1
20180255411 Lin et al. Sep 2018 A1
20180267897 Jeong Sep 2018 A1
20180294757 Feng et al. Oct 2018 A1
20180301060 Israr et al. Oct 2018 A1
20180321748 Rao et al. Nov 2018 A1
20180323725 Cox et al. Nov 2018 A1
20180329172 Tabuchi Nov 2018 A1
20180335848 Moussette et al. Nov 2018 A1
20180367897 Bjork et al. Dec 2018 A1
20190020760 DeBates et al. Jan 2019 A1
20190035235 Da Costa et al. Jan 2019 A1
20190227628 Rand et al. Jan 2019 A1
20190043512 Huang et al. Feb 2019 A1
20190044651 Nakada Feb 2019 A1
20190051229 Ozguner et al. Feb 2019 A1
20190064925 Kim et al. Feb 2019 A1
20190069088 Seiler Feb 2019 A1
20190073078 Sheng Mar 2019 A1
20190102031 Shutzberg et al. Apr 2019 A1
20190103829 Vasudevan et al. Apr 2019 A1
20190138098 Shah May 2019 A1
20190163234 Kim et al. May 2019 A1
20190196596 Yokoyama et al. Jun 2019 A1
20190206396 Chen Jul 2019 A1
20190215349 Adams et al. Jul 2019 A1
20190220095 Ogita et al. Jul 2019 A1
20190228619 Yokoyama et al. Jul 2019 A1
20190114496 Lesso Aug 2019 A1
20190235629 Hu et al. Aug 2019 A1
20190287536 Sharifi et al. Sep 2019 A1
20190294247 Hu et al. Sep 2019 A1
20190296674 Janko et al. Sep 2019 A1
20190297418 Stahl Sep 2019 A1
20190305851 Vegas-Olmos et al. Oct 2019 A1
20190311590 Doy et al. Oct 2019 A1
20190341903 Kim Nov 2019 A1
20190384393 Cruz-Hernandez et al. Dec 2019 A1
20200117506 Chan Apr 2020 A1
20200139403 Palit May 2020 A1
20200150767 Karimi Eskandary et al. May 2020 A1
20200218352 Macours et al. Jul 2020 A1
20200313529 Lindemann Oct 2020 A1
20200313654 Marchais et al. Oct 2020 A1
20200314969 Marchais et al. Oct 2020 A1
20200403546 Janko et al. Dec 2020 A1
20210108975 Peso Parada et al. Apr 2021 A1
20210125469 Alderson Apr 2021 A1
20210153562 Fishwick et al. May 2021 A1
20210157436 Peso Parada et al. May 2021 A1
20210174777 Marchais et al. Jun 2021 A1
20210175869 Taipale Jun 2021 A1
20210200316 Das et al. Jul 2021 A1
20210325967 Khenkin et al. Oct 2021 A1
20210328535 Khenkin et al. Oct 2021 A1
20210365118 Rajapurkar et al. Nov 2021 A1
20220026989 Rao et al. Jan 2022 A1
Foreign Referenced Citations (43)
Number Date Country
2002347829 Apr 2003 AU
103165328 Jun 2013 CN
204903757 Dec 2015 CN
105264551 Jan 2016 CN
106438890 Feb 2017 CN
103403796 Jul 2017 CN
106950832 Jul 2017 CN
107665051 Feb 2018 CN
210628147 May 2020 CN
114237414 Mar 2022 CN
0784844 Jun 2005 EP
2306269 Apr 2011 EP
2363785 Sep 2011 EP
2487780 Aug 2012 EP
2600225 Jun 2013 EP
2846218 Mar 2015 EP
2846229 Mar 2015 EP
2846329 Mar 2015 EP
2988528 Feb 2016 EP
3125508 Feb 2017 EP
3379382 Sep 2018 EP
201620746 Jan 2017 GB
201747044027 Aug 2018 IN
H02130433 May 1990 JP
08149006 Jun 1996 JP
H10184782 Jul 1998 JP
6026751 Nov 2016 JP
6250985 Dec 2017 JP
6321351 May 2018 JP
20120126446 Nov 2012 KR
0208147 Oct 2002 WO
2013104919 Jul 2013 WO
2013186845 Dec 2013 WO
2014018086 Jan 2014 WO
2014094283 Jun 2014 WO
2016105496 Jun 2016 WO
2016164193 Oct 2016 WO
2017113651 Jul 2017 WO
2018053159 Mar 2018 WO
2018067613 Apr 2018 WO
2018125347 Jul 2018 WO
2020004840 Jan 2020 WO
2020055405 Mar 2020 WO
Non-Patent Literature Citations (44)
Entry
Invitation to Pay Additional Fees, Partial International Search Report and Provisional Opinion of the International Searching Authority, International Application No. PCT/US2020/052537, dated Jan. 14, 2021.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/052537, dated Mar. 9, 2021.
Office Action of the Intellectual Property Office, ROC (Taiwan) Patent Application No. 107115475, dated Apr. 30, 2021.
First Office Action, China National Intellectual Property Administration, Patent Application No. 2019800208570, dated Jun. 3, 2021.
Steinbach et al., Haptic Data Compression and Communication, IEEE Signal Processing Magazine, Jan. 2011.
Pezent et al., Syntacts Open-Source Software and Hardware for Audio-Controlled Haptics, IEEE Transactions on Haptics, vol. 14, No. 1, Jan.-Mar. 2021.
Danieau et al., Enhancing Audiovisual Experience with Haptic Feedback: A Survey on HAV, IEEE Transactions on Haptics, vol. 6, No. 2, Apr.-Jun. 2013.
Danieau et al., Toward Haptic Cinematography: Enhancing Movie Experiences with Camera-Based Haptic Effects, IEEE Computer Society, IEEE MultiMedia, Apr.-Jun. 2014.
Jaijongrak et al., A Haptic and Auditory Assistive User Interface: Helping the Blinds on their Computer Operations, 2011 IEEE International Conference on Rehabilitation Robotics, Rehab Week Zurich, ETH Zurich Science City, Switzerland, Jun. 29-Jul. 1, 2011.
Lim et al., An Audio-Haptic Feedbacks for Enhancing User Experience in Mobile Devices, 2013 IEEE International Conference on Consumer Electronics (ICCE).
Weddle et al., How Does Audio-Haptic Enhancement Influence Emotional Response to Mobile Media, 2013 Fifth International Workshop on Quality of Multimedia Experience (QoMEX), QMEX 2013.
Final Notice of Preliminary Rejection, Korean Patent Office, Application No. 10-2019-7036236, dated Nov. 29, 2021.
Examination Report, United Kingdom Intellectual Property Office, Application No. GB2018051.9, dated Nov. 5, 2021.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2021/021908, dated Jun. 9, 2021.
Notice of Preliminary Rejection, Korean Intellectual Property Office, Application No. 10-2019-7036236, dated Jun. 29, 2021.
Combined Search and Examination Report, United Kingdom Intellectual Property Office, Application No. GB2018051.9, dated Jun. 30, 2021.
Communication pursuant to Rule 164(2)(b) and Article 94(3) EPC, European Patent Office, Application No. 18727512.8, dated Jul. 8, 2021.
Gottfried Behler: “Measuring the Loudspeaker's Impedance during Operation for the Derivation of the Voice Coil Temperature”, AES Convention Preprint, Feb. 25, 1995 (Feb. 25, 1995), Paris.
First Office Action, China National Intellectual Property Administration, Patent Application No. 2019800211287, dated Jul. 5, 2021.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2019/050964, dated Sep. 3, 2019.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2019/050770, dated Jul. 5, 2019.
Communication Relating to the Results of the Partial International Search, and Provisional Opinion Accompanying the Partial Search Result, of the International Searching Authority, International Application No. PCT/US2018/031329, dated Jul. 20, 2018.
Combined Search and Examination Report, UKIPO, Application No. GB1720424.9, dated Jun. 5, 2018.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2019/052991, dated Mar. 17, 2020.
Communication Relating to the Results of the Partial International Search, and Provisional Opinion Accompanying the Partial Search Result, of the International Searching Authority, International Application No. PCT/GB2020/050822, dated Jul. 9, 2020.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2020/024864, dated Jul. 6, 2020.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/051035, dated Jul. 10, 2020.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/050823, dated Jun. 30, 2020.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/051037, dated Jul. 9, 2020.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/050822, dated Aug. 31, 2020.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/GB2020/051438, dated Sep. 28, 2020.
First Examination Opinion Notice, State Intellectual Property Office of the People's Republic of China, Application No. 201880037435.X, dated Dec. 31, 2020.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2020/056610, dated Jan. 21, 2021.
Examination Report under Section 18(3), United Kingdom Intellectual Property Office, Application No. GB2018050.1, dated Dec. 22, 2021.
Second Office Action, National Intellectual Property Administration, PRC, Application No. 2019800208570, dated Jan. 19, 2022.
Examination Report under Section 18(3), United Kingdom Intellectual Property Office, Application No. GB2106247.6, dated Mar. 31, 2022.
Combined Search and Examination Report under Sections 17 and 18(3), UKIPO, Application No. GB2112228.8, dated May 17, 2022.
Search Report under Section 17, UKIPO, Application No. GB2202521.7, dated Jun. 21, 2022.
Combined Search and Examination Report under Sections 17 and 18(3), UKIPO, Application No. GB2210174.5, dated Aug. 1, 2022.
Examination Report under Sections 17 and 18(3), UKIPO, Application No. GB2112207.2, dated Aug. 18, 2022.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2022/030541, dated Sep. 1, 2022.
Vanderborght, B. et al., Variable impedance actuators: A review; Robotics and Autonomous Systems 61, Aug. 6, 2013, pp. 1601-1614.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2022/033190, dated Sep. 8, 2022.
International Search Report and Written Opinion of the International Searching Authority, International Application No. PCT/US2022/033230, dated Sep. 15, 2022.
Related Publications (1)
Number Date Country
20210194484 A1 Jun 2021 US
Continuations (1)
Number Date Country
Parent 16369645 Mar 2019 US
Child 17192632 US