Calibration of haptic devices

Abstract
Disclosed herein are methods and systems for providing haptic output and audio output on a computing device using a single haptic device, as well as methods for calibrating the haptic device. To produce the haptic and audio output, the computing device receives a profile of a desired output waveform that is to be provided by the haptic device. Using the desired output waveform, an input waveform is generated. Once the input waveform that will produce the desired output waveform is generated, the input waveform may be calibrated to account for various structural components of the haptic device and may also be combined with an audio waveform. The input waveform is then provided to the haptic device.
Description
FIELD

The present disclosure relates generally to haptic output in electronic devices. More specifically, the present disclosure is directed to providing audio and haptic output from a haptic device and a method and system for calibrating the haptic device.


BACKGROUND

Electronic devices may employ haptic output to provide a user with a tactile sensation in various circumstances. For example, haptic output may be provided in response to a particular input by the user, a system state, or an application instruction. As a specific example, some electronic devices, such as a laptop computer, include a trackpad or button that may move or vibrate to provide haptic output to a user. However, the feel of the haptic output may vary from device to device and may also vary over time as the device is continually used.


It is with respect to these and other general considerations that embodiments have been made. Although relatively specific problems have been discussed, it should be understood that the embodiments should not be limited to solving the specific problems identified in this background.


SUMMARY

This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description section. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.


Disclosed herein are methods and systems for calibrating a haptic device. According to this exemplary method, one or more characteristics of an input waveform are obtained or otherwise received. A current profile associated with the input waveform is then determined. As part of determining the current profile, a scaling factor associated with the input waveform is also determined. The scaling factor may be combined with the input waveform to cause an output waveform to be substantially similar to the input waveform.


Also disclosed is a haptic output device capable of providing both audio output and vibratory output. The haptic device includes a feedback surface, an actuator, one or more biasing supports and a controller operatively coupled to the actuator. The controller is configured to receive parameters of a desired output waveform that is to be provided by the haptic output device. Using these parameters, an input waveform is generated that is based on the desired output waveform. The input waveform is then provided to the actuator to generate an actual output waveform. The actual output waveform should have parameters that match or otherwise correspond to the parameters of the desired output waveform.


A method for providing tactile output and audio output on a haptic device for an electronic device is also disclosed. According to this method, the electronic device receives a profile of a desired output waveform that is to be provided by the haptic device. An input waveform based on the desired output waveform is then generated. Once the input waveform has been generated, an audio waveform is superimposed on or otherwise added to the input waveform. The input waveform having the audio waveform is then provided to the haptic device which generates the tactile output and the audio output.





BRIEF DESCRIPTION OF THE DRAWINGS

Embodiments of the present disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings. The elements of the drawings are not necessarily to scale relative to each other. Identical reference numerals have been used, where possible, to designate identical features that are common to the figures.



FIG. 1A-FIG. 1C illustrate exemplary electronic devices incorporating a haptic device according to one or more embodiments of the present disclosure;



FIG. 2 is an enlarged top plan view of a sample haptic device that may be incorporated into or provided with one or more of the devices shown in FIG. 1A-FIG. 1C according to one or more embodiments of the present disclosure;



FIG. 3 is an exploded isometric view of a haptic device according to one or more embodiments of the present disclosure;



FIG. 4A-FIG. 4D illustrate exemplary input and output waveforms that may be used by and/or output from, a haptic device according to one or more embodiments of the present disclosure;



FIG. 5 illustrates a sample method for providing audio and haptic output using a haptic device according to one or more embodiments of the present disclosure;



FIG. 6 illustrates a sample method for determining an input current that may be multiplied by a scaling factor to produce a desired output waveform according to one or more embodiments of the present disclosure;



FIG. 7 illustrates a sample method for using a model to calibrate a haptic device according to one or more embodiments of the present disclosure;



FIG. 8 is a graph that illustrates upper and lower bounds of a stiffness variable of biasing supports of a haptic device according to one or more embodiments of the present disclosure;



FIG. 9A-FIG. 9B illustrate quadratic curve fit graphs for a spring constant and amplitude factor according to one or more embodiments of the present disclosure;



FIG. 10 illustrates a method for verifying calibration parameters according to one or more embodiments of the present disclosure; and



FIG. 11 is a block diagram illustrating exemplary components of a computing device according to one or more embodiments of the present disclosure.





DETAILED DESCRIPTION

Various embodiments are described more fully below with reference to the accompanying drawings, which form a part hereof, and which show specific exemplary embodiments. However, embodiments may be implemented in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the embodiments to those skilled in the art. The following detailed description is, therefore, not to be taken in a limiting sense.


Trackpads, touch screens and other such haptic devices of the various electronic devices disclosed herein may be programmable to provide various types of outputs to a user of the electronic device. The outputs may be provided by a single haptic device or multiple haptic devices that provide audio output, tactile or haptic output and a combination of haptic and audio output. The audio and haptic output, and more specifically the types, combinations, durations and so on of the output, may be based on user preferences, user interface elements, input dynamics from a user (e.g., how hard the user presses on the haptic device, the length of the press etc.) and habits of the user. For example, if a user takes a first action in a first application, the haptic device may provide both audio output and tactile output. In response to a second action taken by a user, the haptic device may provide a second audio output and a second tactile output. In some embodiments, the electronic device may adaptively learn the habits of the user and alter the haptic output accordingly.


As will be explained below, in order for the haptic device to provide the haptic output and the audio output, a voltage or current, represented as a current waveform, is combined with an audio waveform. The input waveform may be in the form of a half-sine wave (or a Gaussian wave), a sine wave, a half elliptical wave, a saw-tooth wave, a pulse, a ramp down or ramp up wave, a square wave, and various combinations of such waveforms. Further, each current waveform may be associated with a particular amplitude, displacement, momentum and/or velocity.
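Purely for illustration, the sketch below shows how a few of the waveform shapes listed above might be generated as discrete samples; the helper name, sample rate, and parameter values are assumptions for the example and are not part of the disclosure.

import numpy as np

def drive_waveform(shape, amplitude=1.0, duration_s=0.01, fs=48_000):
    """Return one candidate haptic drive waveform (hypothetical helper)."""
    t = np.linspace(0.0, duration_s, int(fs * duration_s), endpoint=False)
    if shape == "half_sine":
        return amplitude * np.sin(np.pi * t / duration_s)
    if shape == "gaussian":
        sigma = duration_s / 6.0
        return amplitude * np.exp(-0.5 * ((t - duration_s / 2.0) / sigma) ** 2)
    if shape == "square":
        return np.full_like(t, amplitude)
    if shape == "ramp_up":
        return amplitude * t / duration_s
    raise ValueError(f"unknown shape: {shape}")

pulse = drive_waveform("gaussian", amplitude=2.0)   # e.g., a Gaussian current pulse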


More specifically, once the desired haptic output is determined, including the feel and the duration of the haptic output, the type of audio output that is to be provided may also be determined. As will be appreciated, each type of audio output that is provided may be associated with a particular input audio waveform. As such, each of the various types of audio waveforms may be combined with the various types of input waveforms described above. It should be appreciated that although a combination of audio output and haptic output is disclosed, the methods, systems and devices described herein may be used to provide haptic output or audio output.


Once the audio output (represented as an audio input waveform) and the haptic output (represented as a haptic input waveform) have been determined, the waveforms are combined into an input waveform and provided to a haptic actuator. As the haptic actuator receives the input waveform, mechanical movement output by the actuator may vary, such that one type of waveform may have a different acoustic and haptic output compared to another waveform. In other words, the displacement direction or directions and/or speed of a feedback surface of the haptic device may be varied by changing the shape, frequency, amplitude, phase, and/or duration of the input waveform or signal. In addition, the tone, sound, and duration of the acoustic or audio output may be altered by changing the shape, frequency, amplitude and so on of the audio waveform. Thus, by changing the input waveform the haptic and acoustic output experienced by a user may be changed.


In addition to the above, it may be useful that all audio output and/or haptic output be the same, similar or substantially similar across various devices, as the consistency of user experiences enhances a user's ability to discern and understand the haptic output. For example, the haptic output and audio output provided by a first haptic device on a first electronic device should be similar to the haptic output and audio output provided on a second electronic device in order to enhance the user experience. Further, these different types of output should not vary as the device continues to be used. That is, over the lifespan of the electronic device, the haptic device should generally provide the same or similar haptic output and audio output. Accordingly, embodiments of the present disclosure are directed to performing a calibration technique on a haptic device such as, for example, a haptic trackpad.


Additionally, in many implementations, a haptic device itself can affect the quality of waveforms output therefrom. For example, an input waveform can be distorted, attenuated, or otherwise affected as a result of the materials selected for a particular haptic device. In other examples, the structure of a haptic device can affect the output waveform.


Accordingly, many embodiments described herein model the haptic device as a linear time-invariant (“LTI”) system having a single input and a single output. These embodiments can include a filter (e.g., inverse of the transfer function) designed to account for the effects of the LTI system. As a result of the filter, the waveform output from the haptic device may more accurately reproduce the waveform input to the filter, effectively mitigating any distortions, attenuations, or other effects introduced by the haptic device itself.


In these embodiments, the filter can correspond to the inverse of a transfer function that models the LTI effects of the haptic device. In some embodiments, the transfer function (and/or its inverse) can be analytically derived. In other embodiments, the transfer function can be experimentally derived. However, as may be appreciated, analytical derivation of a transfer function of an LTI system may be computationally impractical to perform on demand for certain haptic devices. Similarly, it may be prohibitively time consuming to experimentally derive the same. Accordingly, many embodiments described herein relate to methods for efficiently determining a transfer function (and/or inverse thereof) of a particular haptic device given a particular input waveform. Thereafter, the transfer function (or parameters that define the transfer function) can be saved as calibration parameters and can be used as an effective approximation of the transfer function for other waveforms. For example, a transfer function derived to filter a Gaussian waveform of variable peak amplitude through a particular haptic device can be saved, and can thereafter be used as a filter for arbitrarily-shaped signals passed through that same haptic device.
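As a rough, self-contained illustration of the inverse-filter idea (not the specific transfer function of any particular haptic device), the sketch below models the device as a second-order LTI system and pre-filters a desired displacement waveform by dividing by the model's frequency response; all parameter values are assumptions.

import numpy as np

# Assumed second-order LTI model: M*x'' + C*x' + K*x = F(t).
M, K, ZETA = 0.05, 2.0e4, 0.1                 # kg, N/m, damping ratio (illustrative)
C = 2.0 * ZETA * np.sqrt(M * K)

fs = 48_000                                    # sample rate of the drive signal
t = np.arange(1024) / fs
desired = 50e-6 * np.exp(-0.5 * ((t - 0.01) / 0.002) ** 2)   # Gaussian displacement target

# Frequency response from force to displacement: H(jw) = 1 / (M*(jw)^2 + C*(jw) + K).
jw = 2j * np.pi * np.fft.rfftfreq(desired.size, d=1.0 / fs)
H = 1.0 / (M * jw**2 + C * jw + K)

# Inverse filter: the force spectrum (and time-domain force) that would reproduce
# the desired displacement through the modeled device.
force = np.fft.irfft(np.fft.rfft(desired) / H, n=desired.size)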


For example, many embodiments described herein can include haptic devices generating output waveforms that are affected by the stiffness of a gel used to soften the feel of the haptic trackpad to a user. Such a system can be modeled by a second-order differential equation dependent upon the mass of moving portions of the haptic device and the stiffness of the gel. To obtain an approximation of the stiffness of the gel, a series of Gaussian pulses with known peak amplitude can be applied to the haptic device, and the input waveforms can be compared to the output waveforms. Based upon the difference in magnitude between the input Gaussian pulse and the measured output waveform, an approximation of the stiffness of the gel can be obtained by solving the second-order differential equation. Thereafter, the peak amplitude of the input Gaussian pulse can be changed, and a new stiffness can be determined. Repeating in this manner, a functional relationship between peak amplitude of a Gaussian pulse and the gel stiffness can be determined. Thereafter, this function can be used (with or without amplitude scaling) to define a filter that effectively mitigates any distortions, attenuations, or other effects introduced by the haptic device when the input waveform is a Gaussian pulse. This function (and/or coefficients that define this functional relationship) can be saved as calibration parameters to be used to filter arbitrary input waveforms.
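The amplitude sweep described above could be organized along the following lines. This is a sketch only: the per-pulse stiffness "measurement" is a hypothetical stand-in for driving the device, comparing input and output, and solving the second-order model, and all numeric values are invented for the example.

import numpy as np

def measured_stiffness(peak_um):
    """Hypothetical stand-in: in practice, drive the device with a Gaussian pulse of
    this peak amplitude, measure the output, and solve the model for the gel stiffness."""
    return 1.8e4 + 40.0 * peak_um - 0.2 * peak_um**2        # illustrative values only

peaks_um = np.array([30.0, 40.0, 50.0, 60.0, 70.0])          # swept peak amplitudes
stiffness = np.array([measured_stiffness(p) for p in peaks_um])

# Functional relationship between peak amplitude and gel stiffness, kept as
# calibration parameters (here a quadratic, matching the model used later).
Ka, Kb, Kc = np.polyfit(peaks_um, stiffness, deg=2)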


The methods and devices described herein may be used with substantially any type of apparatus or device where haptic output and/or audio output may be desired. For example, FIG. 1A-FIG. 1C illustrate exemplary electronic devices 100 that may be used with the various embodiments described herein. As shown in FIG. 1A, the electronic device 100 may be a laptop computer. Alternatively, as depicted in FIG. 1B and FIG. 1C, the electronic device 100 can be a tablet computer or a mobile telephone. It should be noted that the electronic devices 100 illustrated and described are illustrative only and substantially any other type of electronic device, such as but not limited to, a computer, a digital music player, a wearable electronic device, a digital camera, a personal digital assistant, and so on may include one or more haptic devices.


With reference to FIG. 1A-FIG. 1C, the electronic device 100 may include a haptic device 102 such as, for example, a trackpad or other input device, and a display 104. In some embodiments, the haptic device 102 and the display 104 may be part of the same unit. For example, a tablet computer such as shown in FIG. 1B, may have a display 104 that also acts as a haptic device. In some embodiments, the display 104 may be touch sensitive and enable a user to provide one or more commands or other types of input to the electronic device 100.


It should also be noted that FIG. 1A-FIG. 1C are exemplary only. In other examples, the electronic device 100 may include fewer or more components than those shown above or described below. Additionally, the illustrated electronic devices 100 are exemplary devices that can include a haptic device 102. In other embodiments, a haptic device 102 such as described herein may be incorporated into substantially any type of device that provides haptic output and/or audio output to a user. Additionally or alternatively, a haptic device 102 can be included in any type of component within, or connected to an electronic device 100. For example, one or more haptic devices 102 can be included in an enclosure 106 or button 108 of an electronic device 100, or in a component operatively connected to an electronic device 100 including input devices such as a mouse or keyboard, output devices and other accessories.


Referring now to FIG. 2, the figure illustrates an enlarged plan view of an exemplary haptic device 102 according to one or more embodiments of the present disclosure. In some embodiments, the haptic device 102 provides both audio output and tactile output to a user by moving, vibrating, or otherwise actuating a feedback surface 200. In some embodiments, the feedback surface may be made of glass, plastic, sapphire or other such material. As shown in FIG. 2, the feedback surface 200 is substantially co-planar with an exterior surface of the enclosure 106 of the electronic device 100. However, it is contemplated that the feedback surface may be raised or recessed with respect to the exterior surface. Although shown in a rectangular shape, the feedback surface 200 may have any shape and/or dimensions.


The haptic device 102 may include one or more force sensors 202. Although not shown, the haptic device 102 can include other types of sensors, such as position sensors that may be disposed below the feedback surface 200, acceleration sensors that are configured to detect an acceleration of a user input or other movement of the electronic device 100 and so on. The force sensors can be any suitable type of sensor capable of detecting an exerted force. For example, in some embodiments the force sensor may be a strain gauge, capacitive, resistive, optic, piezoelectric or other suitable force sensor.



FIG. 3 illustrates an exploded isometric view of an exemplary haptic device 300. In some embodiments, the haptic device 300 may be similar to the haptic device 102 described above. The haptic device 300 may include multiple layers including a glass layer 302, a touch sensor layer 304 and a ground layer 306, although fewer or additional layers are contemplated. As shown in FIG. 3, the ground layer 306 may include one or more gel pads 308. The gel pads 308 may be used to secure and support the haptic device 102 to the electronic device 100 and/or to support a feedback surface such as, for example, the glass layer 302. In some embodiments, the haptic device 300 may include four gel pads 308 that each may be operably connected to the feedback surface below or at a location substantially adjacent to the location of the sensors 202 (FIG. 2). Although four gel pads 308 are specifically mentioned, any number of gel pads 308 may be used.


The gel pads 308 may also provide a biasing force to the various layers of the haptic device (e.g., the feedback surface 200 (FIG. 2), or the glass layer 302) to return them to a nominal or first position. The gel pads 308 may be substantially any member capable of providing a biasing or return force to the feedback surface. In some embodiments, the gel pads 308 may be a silicone-based gel that may be positioned around the sides of the various layers or the feedback surface. In other embodiments, the gel pads 308 can be one or more springs positioned on or between the various layers. In yet other embodiments, the haptic device 300 may use a magnetic force from one or more magnets to return the feedback surface to its nominal position.


The haptic device 300 may also include a force sensor assembly 310 configured to be coupled to the ground layer 306. The force sensor assembly 310 may include a stiffener 312, an electrostatic discharge component 314, an actuator 316, a circuit board 318 and an attraction plate 320. Although a single actuator 316 is shown, the haptic device 300 may include two or more actuators 316. In some embodiments, the actuator 316 may be configured to receive one or more haptic input signals from a processing device or other controlling element. As will be discussed below, the input signals may include both audio waveforms and current waveforms that may be converted into mechanical movement by the actuator 316.


Any suitable type of actuator 316 can be included in the haptic device 300. For example, an actuator 316 may be a solenoid actuator including a wire wound around a moveable iron core, and as a current passes through the wire coil, the iron core may move correspondingly. Specifically, the electric current through the wire may create a magnetic field. The magnetic field may then apply a force to the core or plunger, to either attract or repel the core. In these embodiments, the actuator 316 may also include a spring or biasing member which may return the core to its original position after the magnetic field is removed. In other embodiments, an actuator 316 may be an electromagnet, or a series of magnets that are selectively energized to attract or repel the feedback surface. As a specific example, the actuator 316 may be a series of bar electromagnets with alternating poles that may be used to mechanically move the feedback surface.


Each actuator 316 in the haptic device 300 may selectively move the feedback surface or one or more layers of the haptic device 300 in a horizontal and/or lateral direction. In other words, the feedback surface may translate horizontally or laterally but may not move substantially vertically with respect to the enclosure 106. In other embodiments, the actuators 316 may move the feedback surface in a vertical direction (e.g., along a Z axis) or in a combination of vertical and lateral directions. In some implementations, the vertical movement may produce the audio output while the horizontal or lateral movement provides the haptic output.


For example, the motion of the feedback surface in one or more directions, such as, for example, the Z-direction, may move the air that surrounds the feedback surface and produce sound. Additionally or alternatively, movement in a horizontal direction may produce a haptic output.



FIG. 4A-FIG. 4D illustrate exemplary waveforms that may be used by and/or produced by a haptic device, such as, for example, haptic device 102 (FIG. 1A-FIG. 1C). More specifically, FIG. 4A illustrates a desired output waveform 400 that is configured to produce a desired haptic output. As shown in FIG. 4A, the desired output waveform 400 is a Gaussian wave although other output waveforms are contemplated.


As discussed above, embodiments of the present disclosure utilize a model that receives parameters or characteristics associated with the desired output waveform. In some embodiments, the characteristics may include a desired amplitude and a desired time. Although amplitude and time are specifically mentioned, other characteristics may be specified and used. These other characteristics include momentum, speed, frequency and so on.


Once the characteristics are received by the model, the model is able to determine the current that is needed to produce the desired output waveform 400. More specifically, and as will be described below, the model receives the desired characteristics and calculates the input waveform (represented as current vs. time) required to produce the desired haptic output. For example, the model may determine that based on the displacement versus time characteristics of the desired output waveform 400, the required input waveform is a square input waveform 402 shown in FIG. 4B. Although a square input waveform 402 is specifically mentioned and shown, the input waveform may have any shape. In some embodiments, the edges of the square input waveform 402 may be rounded.


Once the input waveform 402 has been determined, the input waveform may be provided to a haptic device to produce the desired haptic output (e.g., an output that follows the shape of the desired output waveform 400). However, as discussed above, embodiments of the present disclosure utilize a haptic device to produce both haptic output and audio output. Accordingly, an audio waveform 404 such as shown in FIG. 4C may be superimposed on the input waveform 402.


Although the audio waveform 404 is represented as a sine wave, other waveforms are contemplated. Further, although a specific audio waveform is shown, the audio waveform may have various amplitudes, frequencies and durations. In addition, different types of audio waveforms may be superimposed on the input waveform 402. That is, different audio output may be provided with the same haptic output. Likewise, the same audio output may be provided with different haptic output.


As discussed above, the audio waveform 404 is superimposed on the input waveform 402. An exemplary audio and haptic input waveform 406 is shown in FIG. 4D. Once the audio and haptic input waveform 406 has been created or generated, the audio and haptic input waveform 406 is provided to a haptic device.
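A minimal sketch of the superposition step, assuming both waveforms are discrete-time signals sampled at a common rate; the amplitudes, frequency, and timing are illustrative only.

import numpy as np

fs = 48_000                                    # shared sample rate (assumed)
t = np.arange(0, 0.02, 1.0 / fs)               # 20 ms frame

# Haptic input waveform (a square pulse, as in FIG. 4B), in arbitrary drive units.
haptic = np.where((t > 0.002) & (t < 0.012), 1.0, 0.0)

# Audio waveform (a sine burst, as in FIG. 4C), scaled well below the haptic amplitude.
audio = 0.15 * np.sin(2.0 * np.pi * 1000.0 * t)

# Combined audio-and-haptic input waveform (as in FIG. 4D) handed to the actuator driver.
combined = haptic + audio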


As will be appreciated, the levels of the acoustic and haptic output provided by the haptic device may be adjusted and changed. For example, as one or both of the audio input signal and/or the haptic input signal of the audio and haptic input waveform 406 varies, the output provided by the haptic device will also vary.


In some embodiments, a first sound may be produced when a single actuator moves the feedback surface. In embodiments where multiple actuators are used, different haptic output and/or audio output may be provided by each actuator. Thus, the audio output and haptic output of a haptic device can be adjusted based on the positioning and selective activation of one or more actuators.


In other embodiments, various types of audio and haptic input waveforms may be concatenated or otherwise combined to produce a series of different audio and haptic output. For example, a first type of audio and haptic input waveform may be combined with a second type of audio and haptic waveform. In other embodiments, an audio waveform may be provided to the haptic device followed by a haptic waveform and/or an audio and haptic input waveform and vice versa. As such, a user of an electronic device, such as electronic device 100, may first hear the audio, feel the haptic output and subsequently hear audio output.



FIG. 5 illustrates a method 500 for providing audio output and/or haptic output using a haptic output device. In some embodiments, the method 500 may be performed by the haptic device shown and described above. That is, a single haptic device may be configured to provide both haptic output and audio output simultaneously or substantially simultaneously. In other embodiments, the method 500 may be performed by multiple haptic devices. In such embodiments, each haptic device or haptic actuator of a single haptic device may be configured to produce a first type of haptic output (and optionally audio output) while a second haptic device or haptic actuator of the single haptic device may be configured to produce a second, different type of haptic and/or audio output.


Method 500 begins at operation 502 when characteristics of a desired output waveform are received. In some embodiments, these characteristics include a desired displacement of an element of the haptic device (e.g., an actuator mass, a plate of a trackpad and so on) as well as a time duration for the displacement. For example, the desired displacement may include a peak displacement based on the type of haptic output that is desired and a time frame in which the displacement is to occur. In other embodiments, the characteristics may include a momentum of an element of the haptic device, a velocity of the element of the haptic device and so on. In some implementations, each characteristic or combination of characteristics of the output waveform may be associated with a different type of haptic output. For example, a first displacement characteristic and a first time characteristic may produce a first type of haptic output while a second displacement characteristic and a second time characteristic may produce a second type of haptic output.


Once the characteristics of the desired output waveform have been obtained, flow proceeds to operation 504 and the characteristics are provided to an actuator model or transfer function such as described above. The actuator model is configured to analyze the characteristics of the desired output waveform and determine 506 an input waveform that will cause the haptic device (or various elements of the haptic device) to move in accordance with the desired output waveform. In some embodiments, and as will be described below, the actuator model is also configured to calibrate the haptic device and/or otherwise alter the input waveform based on various factors (e.g., gap, stiffness of the biasing supports, efficiency of the haptic device or of the actuators of the haptic device, and so on). As a result of the calibration, the haptic output (and the audio output when provided) may remain constant or substantially constant across various devices and over the life of the device. As discussed above, the haptic input waveform can be, for example, a sinusoidal wave, a half sinusoidal wave, a half elliptical wave, a saw-tooth wave, a pulse, a ramp down or ramp up wave, a square wave, and various combinations of such waveforms.


If audio is to be provided, flow proceeds to operation 508 and the determined and/or generated input waveform is combined with one or more audio waveforms. Like the desired output waveform, the audio waveforms may be selected from a library of waveforms and may be specific to a particular type of action being taken by a user. In some embodiments, the audio waveform is provided by a synthesizer engine. The synthesizer engine may be part of a processor or may be a separate module or component configured to generate and/or provide an input to a haptic actuator. For example, depending on the type of output that is to be provided by a haptic output device, the synthesizer engine may provide or generate various input waveforms to the haptic output device. This information may be generated by the synthesizer engine and provided to the haptic output device in real time. The synthesizer engine may also provide instructions to other modules which cause additional output to be provided. For example, the synthesizer engine may instruct, or otherwise cause a speaker or other audio component to provide audio output with a given haptic output.


Once the audio waveform and input waveform have been combined, flow proceeds to operation 510 and the combined audio and haptic waveform is provided to a haptic device. As the haptic device receives the input signal, movement of the various components of the haptic device causes the haptic output and the audio output. In some embodiments, the audio output is provided before the haptic output although this may vary. As discussed above, output by the haptic device may vary, such that one type of audio and haptic input waveform may have a different audio and haptic output compared to another waveform. In other words, the displacement and/or speed of movement of the actuator may be varied by changing the shape, frequency, amplitude, phase, and/or duration of the input signal. Thus, by changing the input signal the haptic and audio output experienced by a user may be varied.


As briefly discussed above, haptic devices generating output waveforms may be affected by the stiffness of a gel, a gap between components of the haptic device, actuator efficiency, and the like. To account for these variables, a series of Gaussian pulses with known peak amplitude can be applied to the haptic device. The input waveforms can be compared to the output waveforms.


Based upon the difference in magnitude between the input Gaussian pulse and the measured output waveform, an approximation of the variables can be obtained by solving a second-order differential equation. Thereafter, the peak amplitude of the input Gaussian pulse can be changed, and a new solution to the variables can be determined. As this process is repeated, a functional relationship between peak amplitude of a Gaussian pulse and the variables, such as, for example, a gel stiffness can be determined. Thereafter, this function can be used (with or without amplitude scaling) to define a filter that effectively mitigates any distortions, attenuations, or other effects introduced by the haptic device when the input waveform is a Gaussian pulse or other such waveform. This function (and/or coefficients that define this functional relationship) can be saved as calibration parameters to be used to filter arbitrary input waveforms.



FIG. 6 illustrates a sample method 600 for determining an input current that may be multiplied by a scaling factor to produce a desired output waveform according to one or more embodiments of the present disclosure. The desired output waveform may be associated with haptic output, audio output, and/or a combination of haptic output and audio output. In some embodiments, the method 600 may be performed at various times including, but not limited to, the time the haptic device is manufactured, the time the electronic device is manufactured, or at various times as specified by the user or in response to various external events. Additionally, the method 600 may be used to calibrate the haptic device based on different manufacturing tolerances of the components of the electronic device and/or other variations of components that are used in the haptic device. More specifically, the calibration technique described may be used to account for the stiffness of one or more gel pads or other biasing members of the haptic device, a gap that is present between an actuator and an actuator plate, the efficiency of the actuator, and other such variables, as explained above.


Method 600 begins at operation 602 when characteristics of an input waveform are received or otherwise defined. In some embodiments, the characteristics of the input waveform that are used as input to the transfer function may include a desired peak amplitude of the input waveform. Although the term peak amplitude is used, a haptic actuator of the present disclosure may have many desired peak amplitudes associated with an input waveform. For example, a first type of output that is to be provided by the haptic actuator may be associated with an input waveform having a first peak amplitude while a second type of output that is to be provided by the haptic actuator may be associated with an input waveform having a second peak amplitude. In addition to peak amplitude, the input waveform may be specified by other desired characteristics. These characteristics include but are not limited to a desired duration, frequency, velocity, displacement, momentum and so on.


In some embodiments, the input waveform may be represented as a Gaussian waveform although other waveforms may be used. For example, the input waveform and/or the output waveform may be represented as a sine wave, a sawtooth wave, a square wave, arbitrary waves and the like.


The input waveform may also correspond to a desired output waveform. More specifically, given a desired output waveform, an input waveform may be generated that results in an actual output motion of the haptic actuator, or a component of the haptic actuator (e.g., a plate of the haptic trackpad), that matches the desired output waveform. However, and as discussed above, it may be desirable to provide a user with haptic output and/or audio output that is the same or substantially similar across various devices and/or throughout the life of the device. Accordingly, the input waveform may need to be modified and/or the haptic device may need to be calibrated to account for various manufacturing tolerances, such as gel stiffness, as described above.


Once the characteristics of the input waveform have been obtained, flow proceeds to operation 604 and a current profile associated with the input waveform is determined. As briefly discussed above, the current profile is used to determine a current or voltage that is provided to the haptic actuator. The applied current or voltage may then cause a component of the haptic trackpad (e.g., a plate of the trackpad) and/or a component of the haptic actuator (e.g., a mass of the haptic actuator) to have an actual output motion (or provide an output waveform) that matches the desired output waveform.


In some embodiments, the current profile may be determined using a relationship between a gap ‘G’ present in the haptic device at a given time ‘t’, represented as G(t), and an amount of force ‘F’ provided by the haptic actuator at the given time t, represented herein as F(t). More specifically, the current profile may be determined using a lookup table that relates the gap and the output force provided by the haptic actuator to the current profile. For example, the lookup table may be used to determine a current or an amount of voltage that should be provided to the haptic actuator to produce a given force when a gap of a certain distance is present in the haptic trackpad.


In some embodiments, the current profile may be determined using linear interpolation between points of the gap G(t) and the force F(t). Thus, when a given force and gap are known or desired, the table may be used to determine an amount of current that is to be applied to the haptic actuator.
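One way to realize such a lookup is a table of calibration points indexed by gap and force, with linear interpolation between entries; the table values, units, and sizes below are placeholders, not measured data.

import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical calibration table: drive current (A) indexed by gap (mm) and force (N).
gaps_mm  = np.array([0.10, 0.15, 0.20, 0.25])
forces_n = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
current_a = np.array([[0.00, 0.12, 0.25, 0.40, 0.57],
                      [0.00, 0.15, 0.31, 0.49, 0.70],
                      [0.00, 0.18, 0.38, 0.60, 0.85],
                      [0.00, 0.22, 0.46, 0.73, 1.03]])

lookup = RegularGridInterpolator((gaps_mm, forces_n), current_a)   # linear by default

# Given sampled G(t) and F(t) over the pulse, interpolate the current profile I(t).
G_t = np.array([0.20, 0.19, 0.17, 0.16, 0.18])
F_t = np.array([0.0, 0.6, 1.4, 0.9, 0.2])
I_t = lookup(np.column_stack([G_t, F_t]))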


In some embodiments, the gap is defined as the distance between the plate of the haptic trackpad and the haptic actuator. More specifically, the gap G(t) may be defined as a nominal gap (which may be unknown or may be known or fixed based on manufacturing tolerances and/or based on the actual known distance between the plate and haptic actuator of an assembled haptic trackpad) minus the displacement profile of the input waveform (which is also known, as the input waveform was specified in operation 602 above).


In addition to finding the gap, the force that is output by the haptic actuator may also be needed to determine the current profile. In some embodiments, force, represented as F(t), is modeled using the following differential equation:

F(t) = Mẍ + Cẋ + Kx

where M is the mass of the moving mass, ẍ is its acceleration at time t, C is the damping coefficient of the haptic device (defined as C = 2ζ√(MK), with ζ being the damping ratio), ẋ is the velocity of the moving mass, K is the stiffness of the biasing structures (e.g., the stiffness of the gel pads within the haptic device), and x is the displacement. More specifically, the above differential equation may be used as a model to predict the output force of the haptic actuator based on the input waveform received in operation 602.
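Given a desired displacement profile x(t), the model above yields the force the actuator must supply; a discrete-time sketch using numerical derivatives, with assumed values for M, ζ, and K:

import numpy as np

M, ZETA = 0.05, 0.1                     # moving mass (kg) and damping ratio (assumed)
K = 2.0e4                               # stiffness of the biasing supports (N/m, assumed)
C = 2.0 * ZETA * np.sqrt(M * K)         # damping coefficient per the definition above

fs = 48_000
t = np.arange(0, 0.02, 1.0 / fs)
x = 50e-6 * np.exp(-0.5 * ((t - 0.01) / 0.002) ** 2)   # desired Gaussian displacement (m)

x_dot = np.gradient(x, 1.0 / fs)        # velocity
x_ddot = np.gradient(x_dot, 1.0 / fs)   # acceleration

F = M * x_ddot + C * x_dot + K * x      # required actuator force F(t)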


In certain implementations, some of the variables of the force model described above may be known. For example, the weight of the moving mass may be known during the manufacturing process. Likewise, the damping coefficient may also be known. Accordingly, the only unknown variable in the above model may be the stiffness K of the biasing structures (e.g., the gel pads) of the haptic trackpad.


In some embodiments, the K value may depend on or otherwise be associated with the peak displacement of a given input waveform. More specifically, the peak displacement p of an input waveform may have a quadratic relationship with the stiffness K and may be represented by the quadratic model:

K(p) = K_a p² + K_b p + K_c

where K_a, K_b and K_c are coefficients that define a vector or other such value that is dependent on a peak amplitude of the input waveform and that minimizes ring out (e.g., movement of the actuator mass after current is no longer applied or provided to the haptic actuator) of the haptic trackpad. However, as the stiffness K is unknown, the values for the above quadratic model may be found experimentally.


As such, FIG. 7 illustrates a method 700 for determining a spring constant or stiffness value K as well as the scaling factor that is used to determine the current profile such as described above. As briefly discussed, the stiffness value K is quadratically related to the peak amplitude p of the input waveform. In addition, the scaling factor, represented herein as w, also varies quadratically with respect to peak amplitude and may be represented by the quadratic model:

w(p) = w_a p² + w_b p + w_c

where w_a, w_b and w_c are coefficients that define a vector or other such value that is dependent on a peak amplitude p of the input waveform and that minimize peak displacement errors of the haptic trackpad.


As discussed above, the stiffness K and the scaling factor w are each quadratically related to the peak amplitude of the input waveform. As such, each input waveform will have coefficients in the above-referenced quadratic models that may be used to calibrate the system. More specifically, the values of K and w may be determined experimentally. As such, operation 702 provides that a sample input waveform is selected. Values for K and w are then determined such that an output waveform matches the sample input waveform. In some embodiments, the sample input waveform may be an arbitrarily selected input waveform having a desired peak amplitude. The peak amplitude of this input waveform may then be used to determine a relationship between the stiffness K and the scaling factor w.


For example, and with reference to FIG. 8, multiple sample input waveforms may be selected to determine a relationship between the stiffness K and the scaling factor w. As shown in FIG. 8, input waveforms having desired amplitudes of 30 um, 40 um, 50 um, 60 um and 70 um are selected. Although these values are specifically mentioned, it is contemplated that any amplitude values may be used.


For each desired peak amplitude, operation 704 of method 700 provides that upper and lower bounds are defined. In some embodiments, the upper and lower bounds are defined in order to limit the search area and/or to minimize the error between the output peak amplitude and the desired peak amplitude. In some embodiments, the search bounds are defined by the curves:

K_center(p) = K_a,center p² + K_b,center p + K_c,center
K_UpperBound(p) = K_center(p) + K_search_range/2 (shown by line 802)
K_LowerBound(p) = K_center(p) − K_search_range/2 (shown by line 804)


More specifically, when a desired peak amplitude is provided to the model, a grid-based search (bounded by the lines 802 and 804) may be performed to find the optimal stiffness value K (shown as points along the line 806) for each received input waveform and its associated peak amplitude. Further, the search is used to determine a scaling factor w that minimizes the error between the output peak amplitude and a desired peak amplitude.
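A sketch of the bounded grid search is shown below. The center-curve coefficients, search range, and the ring-out metric are assumptions; in practice the error for each candidate K would come from driving the device (or simulating the model) and measuring residual motion.

import numpy as np

def ring_out_error(K_candidate, peak_amplitude_um):
    """Hypothetical stand-in for measuring residual plate motion after the pulse when
    the model is run with the candidate stiffness K."""
    K_true = 1.8e4 + 40.0 * peak_amplitude_um           # invented ground truth
    return abs(K_candidate - K_true)                    # toy error metric

def search_stiffness(peak_amplitude_um, K_center_coeffs, K_search_range, n_grid=51):
    """Grid search for K between the bounds shown by lines 802 and 804 of FIG. 8."""
    K_center = np.polyval(K_center_coeffs, peak_amplitude_um)
    grid = np.linspace(K_center - K_search_range / 2.0,
                       K_center + K_search_range / 2.0, n_grid)
    errors = [ring_out_error(K, peak_amplitude_um) for K in grid]
    return grid[int(np.argmin(errors))]

K_opt = search_stiffness(50.0, K_center_coeffs=(0.0, 40.0, 1.8e4), K_search_range=4.0e3)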


More specifically, operation 706 provides that a scaling factor w is determined such that, when the current profile is multiplied by the scaling factor w, the scaling factor increases (or decreases) the amount of current applied to the system, which causes the output waveform to match or substantially match the input waveform while accounting for the stiffness K that is present in the haptic trackpad due to the biasing supports.


More specifically, to find the scaling factor w for each desired peak amplitude (e.g., 30 um, 40 um, 50 um, 60 um and 70 um), a search for w is performed using the following algorithm:

Repeat:
  Get output peak displacement p(w)
  If |p(w) − p_goal| < p_max_error, break
  w = w · (p(w)/p_goal)^(1/3)
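Rendered as a loop, the search might look like the sketch below. This is illustrative only: the measurement call is a hypothetical placeholder whose output grows with w, and under that assumption the multiplicative correction is written with the ratio p_goal/p(w) so that the iteration converges toward the goal amplitude; the appropriate ratio direction depends on how a given device's output responds to the scaling factor.

def measure_peak_displacement(w):
    """Hypothetical stand-in for driving the device with the scaled current profile
    and measuring the output peak displacement (um); here it simply grows with w."""
    return 8.0 * w

def find_scaling_factor(p_goal_um, w=1.0, p_max_error_um=0.5, max_iters=50):
    for _ in range(max_iters):
        p = measure_peak_displacement(w)
        if abs(p - p_goal_um) < p_max_error_um:
            break
        w *= (p_goal_um / p) ** (1.0 / 3.0)     # nudge w toward the goal amplitude
    return w

w_50 = find_scaling_factor(p_goal_um=50.0)       # roughly 6.25 with this toy model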


Using the above algorithm, a scaling factor for each desired peak amplitude may be determined and plotted on a graph (such as shown in FIG. 9B). Once the values for K and w have been determined, flow proceeds to operation 708 and quadratic curve fits are performed on the set of points (p, K(p)) and (p, w(p)) to compute calibration curves for each K and w value at the different amplitudes.


More specifically, values for K and w for each received peak displacement value may be plotted on graphs 900 and 910 of FIGS. 9A and 9B, respectively. A quadratic fit is then performed on the values in each graph to determine the coefficients for K and w that are used in the above-referenced quadratic models.


Once these values are determined, flow proceeds to operation 710 and values of K and w are selected depending on, for example, the desired amplitude. For example, if a desired output waveform has an amplitude of 50 um, graph 900 illustrates that a spring constant K of approximately two nanometers should be selected. Likewise, for the amplitude of 50 um, a scaling factor of approximately 6.3 should be selected, as shown on graph 910 of FIG. 9B.
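Selecting the calibration values for a given target amplitude (operation 710) is then a direct evaluation of the fitted quadratics; the coefficients below are hypothetical stand-ins for the fitted curves of FIG. 9A and FIG. 9B.

import numpy as np

# Hypothetical fitted coefficients (highest power first) for K(p) and w(p).
K_coeffs = (-0.21, 38.5, 1.7e4)
w_coeffs = (0.001, 0.006, 3.5)

def calibration_for(amplitude_um):
    """Evaluate the calibration curves for a desired peak amplitude p."""
    return np.polyval(K_coeffs, amplitude_um), np.polyval(w_coeffs, amplitude_um)

K_50, w_50 = calibration_for(50.0)   # stiffness and scaling factor used for a 50 um output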


Returning to method 600 of FIG. 6, once the scaling factor w and the values for K have been determined, operation 606 of method 600 provides that the determined scaling factor is then applied to the input waveform. More specifically, and as discussed above, the scaling factor is a value that, when multiplied with the current profile, causes the output waveform (e.g., the movement of the plate of the haptic trackpad) to match or substantially match the shape of the input waveform (while accounting for the stiffness of one or more biasing structures). Operation 612 provides that the modified current profile may then be provided to the haptic device.


The illustrative methods shown in FIG. 6-FIG. 7 may be performed by a manufacturer at the time the haptic device is fabricated. Additionally or alternatively, the method can be performed by a user when a user wishes to change the haptic and acoustic output of a haptic device or may be performed at any time a user wants to recalibrate the haptic device. For example, the electronic device may include an accelerometer that is configured to detect a fall event. In response to the fall event, the electronic device may determine that a recalibration is warranted.


Other embodiments can perform the methods shown in FIG. 6-FIG. 7 differently. Additionally, each method can be used for a single actuator or for multiple actuators in a haptic device. In embodiments that have multiple actuators, a different input signal can be input into each actuator, or all of the actuators can receive the same input signal.


When the calibration parameters K and w have been determined, these parameters may also be checked or otherwise verified. Accordingly, FIG. 10 illustrates a method 1000 for checking the calibration parameters according to one or more embodiments of the present disclosure.


Method 1000 begins at operation 1002 in which an input waveform is provided to the haptic device and/or the model to cause the haptic device to reach predetermined amplitudes over a predetermined time or number of inputs. For example, in one embodiment, the input waveform is applied to the haptic device 10 times to provide target amplitudes or displacements of 30 microns, 50 microns and 70 microns. In some embodiments, the input waveform causes the haptic output to occur for a predetermined or set amount of time. Although specific amplitudes and time durations are specified, other goal amplitudes and time durations may be used.


Flow then proceeds to operation 1004 and the average peak displacement for each goal amplitude is determined. Flow then proceeds to operation 1006 and the average ring out error for each goal amplitude is determined. The average twist value for each goal amplitude is also determined at operation 1008. Using these values, it may be determined whether the average ring out error and/or the average peak displacement error fall within predetermined thresholds. If so, the calibration is successful. If not, the calibration process described in FIG. 6-FIG. 7 may be repeated.
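The verification bookkeeping could be organized as in the sketch below, assuming the per-trial measurements are already available; the acceptance thresholds and the data layout are assumptions.

import numpy as np

GOAL_AMPLITUDES_UM = (30.0, 50.0, 70.0)
MAX_PEAK_ERROR_UM = 2.0      # hypothetical acceptance thresholds
MAX_RING_OUT_UM = 1.0

def calibration_ok(trials):
    """trials maps each goal amplitude to (peak displacements, ring-out values), each an
    array over the repeated inputs (e.g., 10 per amplitude)."""
    for goal in GOAL_AMPLITUDES_UM:
        peaks, ring_outs = trials[goal]
        if abs(np.mean(peaks) - goal) > MAX_PEAK_ERROR_UM:
            return False                     # average peak displacement error too large
        if np.mean(ring_outs) > MAX_RING_OUT_UM:
            return False                     # average ring out error too large
    return True                              # calibration successful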



FIG. 11 is a block diagram illustrating exemplary components, such as, for example, hardware components, of an electronic device 1100 according to one or more embodiments of the present disclosure. In certain embodiments, the electronic device 1100 may be similar to the various electronic devices 100 described above. Although various components of the electronic device 1100 are shown, connections and communication channels between each of the components are omitted for simplicity.


In a basic configuration, the electronic device 1100 may include at least one processor 1105 and an associated memory 1110. The processor 1105 may be used to determine the various calibration parameters described above. The memory 1110 may comprise, but is not limited to, volatile storage such as random access memory, non-volatile storage such as read-only memory, flash memory, or any combination thereof. The memory 1110 may store an operating system 1115 and one or more program modules 1120 suitable for running software applications 1155. The operating system 1115 may be configured to control the electronic device 1100 and/or one or more software applications 1155 being executed by the operating system 1115. The software applications 1155 may include browser applications, e-mail applications, calendaring applications, contact manager applications, messaging applications, games, media player applications, time keeping applications and the like, some or all of which may provide both haptic output and audio output. More specifically, the software applications 1155 may include instructions that cause a haptic device to output various combinations of haptic output and audio output.


The electronic device 1100 may have additional features or functionality than those expressly described herein. For example, the electronic device 1100 may also include additional data storage devices, removable and non-removable, such as, for example, magnetic disks, optical disks, or tape. Exemplary storage devices are illustrated in FIG. 11 by removable storage device 1125 and a non-removable storage device 1130. In certain embodiments, various program modules and data files may be stored in the system memory 1110.


As also shown in FIG. 11, the electronic device 1100 may include one or more input devices 1135. The input devices 1135 may include a trackpad, a keyboard, a mouse, a pen or stylus, a sound input device, a touch input device, and the like. The electronic device 1100 may also include one or more output devices 1140. The output devices 1140 may include a display, one or more speakers, a printer, and the like. The electronic device 1100 may also include one or more haptic actuators 1150 that are configured to provide both tactile and audio output. As discussed above, the haptic actuators 1150 may be part of the input devices 1135 and/or the output devices 1140.


The electronic device 1100 may also include one or more sensors 1165. The sensors may include, but are not limited to, accelerometers, ambient light sensors, photodiodes, gyroscopes, magnetometers and so on. These sensors 1165 may work in conjunction with the processor 1105 to determine when and/or what type of haptic and/or audio output should be provided. The sensors 1165 may also be able to determine when the electronic device 1100 should be recalibrated.


The electronic device 1100 also includes communication connections 1145 that facilitate communications with additional electronic devices 1160. Such communication connections 1145 may include an RF transmitter, a receiver, and/or transceiver circuitry, universal serial bus (USB) communications, parallel ports and/or serial ports.


As used herein, the term computer-readable media may include computer storage media. Computer storage media may include volatile and nonvolatile media and/or removable and non-removable media implemented in any method or technology for the storage of information. Examples include computer-readable instructions, data structures, or program modules. The memory 1110, the removable storage device 1125, and the non-removable storage device 1130 are all examples of computer storage media. Computer storage media may include RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the electronic device 1100. Any such computer storage media may be part of the electronic device 1100. Computer storage media may store instructions which, when executed by the processor 1105, dynamically adjust a current applied to a haptic actuator.


In certain embodiments, the electronic device 1100 includes a power supply such as a battery, a solar cell, and the like that provides power to each of the components shown. The power supply may also include an external power source, such as an AC adapter or other such connector that supplements or recharges the batteries. The electronic device 1100 may also include a radio that performs the function of transmitting and receiving radio frequency communications. Additionally, communications received by the radio may be disseminated to the application programs. Likewise, communications from the application programs may be disseminated to the radio as needed.


The electronic device 1100 may also include a visual indicator, a keypad and a display. In embodiments, the keypad may be a physical keypad or a virtual keypad generated on a touch screen display. The visual indicator may be used to provide visual notifications to a user of the electronic device. The electronic device 1100 may also include an audio interface for producing audible notifications and alerts.


In certain embodiments, the visual indicator is a light emitting diode (LED) or other such light source and the audio interface is a speaker. In other embodiments, the audio interface may be configured to receive audio input.


The audio interface may also be used to provide and receive audible signals from a user of the electronic device 1100. For example, a microphone may be used to receive audible input. In accordance with embodiments of the present disclosure, the microphone may also serve as an audio sensor to facilitate control of notifications such as described above. The system may further include a video interface that enables an operation of an on-board camera to record still images, video, and the like.


Embodiments of the present disclosure are described above with reference to block diagrams and operational illustrations of methods and the like. The operations described may occur out of the order as shown in any of the figures. Additionally, one or more operations may be removed or executed substantially concurrently. For example, two blocks shown in succession may be executed substantially concurrently. Additionally, the blocks may be executed in the reverse order.


In addition, it will be understood that variations and modifications can be effected within the spirit and scope of the disclosure. And even though specific embodiments have been described herein, it should be noted that the application is not limited to these embodiments. In particular, any features described with respect to one embodiment may also be used in other embodiments, where compatible. Likewise, the features of the different embodiments may be exchanged, where compatible.

Claims
  • 1. A method of calibrating a haptic device, the method comprising: receiving at a controller of the haptic device one or more respective characteristics of each of a plurality of input waveforms, wherein distortions in the haptic device cause the haptic device to generate a haptic output different from a desired haptic output; determining respective current profiles associated with each of the plurality of input waveforms having the respective characteristics; applying, by the controller, the current profiles to a haptic actuator of the haptic device to produce respective output waveforms; measuring respective characteristics of the respective output waveforms; determining a scaling factor and a stiffness associated with one or more biasing supports of the haptic actuator based on the measured respective characteristics of the respective output waveforms; and calibrating the haptic device based on the determined scaling factor and stiffness to account for the distortions.
  • 2. The method of claim 1, wherein the scaling factor accounts for one or more of: a structure of the haptic device; and a gap between a first component of the haptic device and the haptic actuator of the haptic device.
  • 3. The method of claim 1, wherein determining the current profiles associated with the input waveforms comprises determining an amount of force output by the haptic device, wherein the amount of force is modeled as a differential equation.
  • 4. The method of claim 1, wherein the scaling factor is quadratically related to peak amplitudes of the input waveforms.
  • 5. The method of claim 4, wherein determining the scaling factor further comprises determining a quadratic relationship of the stiffness of the one or more biasing supports and the peak amplitudes.
  • 6. The method of claim 5, wherein determining the quadratic relationship of the stiffness of the one or more biasing supports and the peak amplitudes comprises obtaining a quadratic fit of peak amplitudes to stiffness values for the plurality of input waveforms.
  • 7. The method of claim 1, wherein the one or more respective characteristics of the input waveforms include one or more of a peak amplitude, a duration, a frequency, a velocity, a displacement, and a momentum.
  • 8. A haptic output device comprising: a feedback surface; an actuator linked with the feedback surface; one or more biasing supports linked with the feedback surface; and a controller operatively coupled to the actuator, wherein the controller is configured to: receive parameters of a desired haptic output to be produced by the actuator on the feedback surface; determine, using at least the received parameters, an input waveform associated with the desired haptic output; determine a current profile associated with the input waveform, wherein the current profile includes a scaling factor associated with a stiffness of the one or more biasing supports; provide the current profile to the haptic output device to generate an actual output waveform on the feedback surface; measure differences between the actual output waveform and the desired haptic output; and calibrate the actuator using at least the measured differences.
  • 9. The haptic output device of claim 8, wherein the one or more biasing supports include a gel pad.
  • 10. The haptic output device of claim 8, wherein the one or more characteristics of the input waveform include one or more of a desired amplitude and a desired duration.
  • 11. The haptic output device of claim 8, wherein the controller is further configured to superimpose an audio waveform on top of the input waveform.
  • 12. The haptic output device of claim 8, wherein the feedback surface is a touch surface of the haptic output device.
  • 13. The haptic output device of claim 8, wherein the scaling factor is quadratically related to a peak amplitude of the input waveform.
  • 14. The haptic output device of claim 8, wherein the calibration of the actuator comprises determining a quadratic relationship between the stiffness of the one or more biasing supports and peak amplitudes of a plurality of sample input waveforms.
  • 15. A method of providing a tactile output and an audio output on a haptic device for an electronic device, the method comprising: receiving, by a controller of the electronic device, characteristics of a desired output waveform to be provided by the haptic device; generating an input waveform based on the desired output waveform by providing the characteristics to an actuator model having calibration parameters based on a scaling factor and a stiffness of one or more biasing supports of the haptic device; adding an audio waveform to the input waveform; and providing the input waveform having the audio waveform to the haptic device to generate the tactile output and the audio output that corresponds to the audio waveform.
  • 16. The method of claim 15, wherein the audio output is provided before the tactile output.
  • 17. The method of claim 15, wherein the haptic device is a touch sensitive device.
  • 18. The method of claim 15, further comprising calibrating the actuator by adjusting the calibration parameters.
  • 19. The method of claim 18, wherein calibrating of the actuator comprises determining a quadratic relationship between the stiffness of the one or more biasing supports and peak amplitudes of a plurality of sample input waveforms.
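
Claims 1 and 4-6 recite determining a quadratic relationship between the peak amplitudes of a plurality of sample input waveforms and the stiffness of the biasing supports, and calibrating the device with a scaling factor derived from that relationship; claims 11 and 15 recite superimposing an audio waveform on the input waveform. The listing below is a minimal, hypothetical sketch of one way such a procedure could be carried out, not the claimed implementation: the function names, the use of a least-squares polynomial fit, and the derivation of the scaling factor as the ratio of a nominal stiffness to the fitted stiffness are illustrative assumptions.

# Illustrative sketch only: quadratic fit of sample-waveform peak amplitudes to
# measured stiffness values, used to scale later input waveforms. Names and the
# nominal-to-fitted-stiffness ratio are assumptions, not the claimed implementation.
import numpy as np

def fit_stiffness_curve(peak_amplitudes, measured_stiffnesses):
    # Quadratic coefficients (a, b, c) for stiffness ~= a*A**2 + b*A + c,
    # where A is the peak amplitude of a sample input waveform.
    return np.polyfit(peak_amplitudes, measured_stiffnesses, deg=2)

def scaling_factor(peak_amplitude, coeffs, nominal_stiffness):
    # Compensates for the difference between the stiffness predicted by the
    # fit at this amplitude and the nominal (design) stiffness.
    fitted = np.polyval(coeffs, peak_amplitude)
    return nominal_stiffness / fitted

def calibrate_waveform(input_waveform, coeffs, nominal_stiffness):
    peak = np.max(np.abs(input_waveform))
    return scaling_factor(peak, coeffs, nominal_stiffness) * input_waveform

# Sample input waveforms: peak amplitudes and the stiffness measured for each.
peaks = np.array([0.2, 0.5, 1.0])           # normalized peak amplitudes
stiffness = np.array([1.10, 1.22, 1.55])    # measured stiffness (arbitrary units)
coeffs = fit_stiffness_curve(peaks, stiffness)

# Calibrate a new haptic drive waveform and superimpose an audio waveform on it.
t = np.linspace(0.0, 0.01, 500)
haptic = 0.8 * np.sin(2 * np.pi * 200 * t)      # 200 Hz haptic pulse
audio = 0.05 * np.sin(2 * np.pi * 1000 * t)     # illustrative 1 kHz audio tone
drive = calibrate_waveform(haptic, coeffs, nominal_stiffness=1.25) + audio

In this sketch the calibrated haptic waveform and the audio waveform are simply summed before being supplied to the actuator, which is one plausible reading of superimposing an audio waveform on the input waveform; other combinations (for example, weighting or band-limiting the audio component) are equally consistent with the claims.
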
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a nonprovisional patent application of and claims the benefit of U.S. Provisional Patent Application No. 62/129,677, filed Mar. 6, 2015 and titled “Calibration of Haptic Devices,” the disclosure of which is hereby incorporated herein by reference in its entirety.

Related Publications (1)
Number Date Country
20160259480 A1 Sep 2016 US
Provisional Applications (1)
Number Date Country
62129677 Mar 2015 US