One embodiment is directed generally to haptic effects, and in particular to generating haptic effects using an actuator.
Electronic device manufacturers strive to produce a rich interface for users. Conventional devices use visual and auditory cues to provide feedback to a user. In some interface devices, kinesthetic feedback (such as active and resistive force feedback) and/or tactile feedback (such as vibration, texture, and heat) is also provided to the user; these forms of feedback are collectively known as “haptic feedback” or “haptic effects.” Haptic feedback can provide cues that enhance and simplify the user interface. Specifically, vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users of electronic devices to alert the user to specific events, or to provide realistic feedback that creates greater sensory immersion within a simulated or virtual environment.
Haptic feedback has also been increasingly incorporated in portable electronic devices, such as cellular telephones, smartphones, portable gaming devices, and a variety of other portable electronic devices. For example, some portable gaming applications are capable of vibrating in a manner similar to control devices (e.g., joysticks, etc.) used with larger-scale gaming systems that are configured to provide haptic feedback. Further, devices such as smartphones use haptic effects to cause “buttons” on a touchscreen to feel like their mechanical counterparts when selected by a user.
In order to generate vibration effects, many devices utilize some type of actuator/motor or haptic output device. Known actuators used for this purpose include an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) actuator in which an eccentric mass is moved by a motor and rotates around an axis of rotation. However, because of inertia, the mass in an ERM takes time to get up to the desired rotation speed, and time to wind back down. This “spin up” and “spin down” time can cause latency in the generation of vibratory type haptic effects, and can degrade the “feeling” of haptic effects. In particular, multiple haptic effects that are generated within a short span of each other, such as in response to multiple “keypad” presses, can “cascade” or pile up into a continuous “buzz” due to the latency with slow, inexpensive ERMs.
One embodiment is a system that generates haptic effects using an actuator. The system receives a haptic effect definition that defines a haptic effect. The system pre-processes the haptic effect definition by determining if the actuator is capable of playing the haptic effect. The system then post-processes the haptic effect definition by adjusting a force value based on an estimate or measurement of a state of the actuator during a playing of the haptic effect.
One embodiment is a system and drive circuit for an ERM actuator that tunes, pre-processes, and post-processes a haptic effect definition in order to account for actuator properties and minimize cascading of multiple haptic effects. The pre-processing can scale the magnitude or other parameter of the haptic effect. The post-processing can generate a new magnitude based on actuator properties, current actuator state and the desired magnitude.
The haptic feedback system includes a processor or controller 12.
Coupled to processor 12 are a memory 20 and an actuator drive circuit 16, which is coupled to an actuator 18. Actuator 18 can be any type of actuator, including an actuator with a rotating mass, and in one embodiment is an Eccentric Rotating Mass (“ERM”) actuator. In one embodiment, actuator 18 is any type of actuator that has relatively long rise and fall times. Processor 12 may be any type of general purpose processor, or could be a processor specifically designed to provide haptic effects, such as an application-specific integrated circuit (“ASIC”). Processor 12 may be the same processor that operates the entire system 10, or may be a separate processor. Processor 12 can decide what haptic effects are to be played and the order in which the effects are played based on high level parameters. In general, the high level parameters that define a particular haptic effect include magnitude, frequency and duration. Low level parameters such as streaming motor commands could also be used to determine a particular haptic effect. A haptic effect may be considered “dynamic” if it includes some variation of these parameters when the haptic effect is generated or a variation of these parameters based on a user's interaction.
Processor 12 outputs the control signals to actuator drive circuit 16, which includes electronic components and circuitry used to supply actuator 18 with the required electrical current and voltage (i.e., “motor signals”) to cause the desired haptic effects. System 10 may include more than one actuator 18, and each actuator may include a separate drive circuit 16, all coupled to a common processor 12. Memory device 20 can be any type of storage device or computer-readable medium, such as random access memory (“RAM”) or read-only memory (“ROM”). Memory 20 stores instructions executed by processor 12. Among the instructions, memory 20 includes a haptic effects module 22, which is a set of instructions that, when executed by processor 12, generate drive signals for actuator 18 that provide haptic effects while minimizing cascading, as disclosed in more detail below. Memory 20 may also be located internal to processor 12, or any combination of internal and external memory.
Touch surface 11 recognizes touches, and may also recognize the position and magnitude of touches on the surface. The data corresponding to the touches is sent to processor 12, or another processor within system 10, and processor 12 interprets the touches and in response generates haptic effect signals. Touch surface 11 may sense touches using any sensing technology, including capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, etc. Touch surface 11 may sense multi-touch contacts and may be capable of distinguishing multiple touches that occur at the same time. Touch surface 11 may be a touchscreen that generates and displays images for the user to interact with, such as keys, dials, etc., or may be a touchpad with minimal or no images.
System 10 may be a handheld device, such as a cellular telephone, personal digital assistant (“PDA”), smartphone, computer tablet/pad, gaming console, etc., or may be any other type of device that includes a haptic effect system that includes one or more actuators. System 10 may also be a wearable device (e.g., a bracelet, armband, glove, jacket, vest, pair of glasses, shoes, belt, etc.) that includes one or more actuators that generate haptic effects. The user interface may be a touch sensitive surface, or can be any other type of user interface such as a mouse, touchpad, mini-joystick, scroll wheel, trackball, game pads or game controllers, etc. In embodiments with more than one actuator, each actuator may have a different rotational capability in order to create a wide range of haptic effects on the device.
As discussed above, when playing vibratory haptic effects on actuators that have high rise and fall times, such as ERMs or other types of slow motors, there is typically some latency. Therefore, when playing a series of quick haptic effects, such as in response to consecutive presses of virtual keys on a touchscreen keyboard, a continuous buzz may be generated, which is referred to as “cascading”. Cascading is caused by the motor not coming to a stop after playing a first effect before it receives a subsequent effect. Ultimately, the motor spins continuously, and the user cannot feel the individual haptic effects when they are played within a short span of each other. Known solutions to the cascading problem include using relatively expensive ERMs that have quicker rise and fall times, using a relatively expensive drive circuit that supports bidirectional drive signals that include braking signals, or somehow preventing the effects from being played so quickly.
Embodiments include pre-processing and post-processing of the haptic effect signal in order to accommodate slow rise and fall times of the actuator and eliminate or minimize cascading. Before the pre-processing and post-processing, embodiments “tune” the actuator in order to characterize its rise time and fall time and to account for variations in those times. Some embodiments implement the pre-processing or the post-processing alone to minimize or eliminate cascading; other embodiments implement both.
Specifically, slow actuators have non-linear rise and fall characteristics. To be able to model such actuators, embodiments measure the total rise time taken for the actuator to rise from rest to 90% of rated acceleration. This rise time region in one embodiment is then divided into 10 segments at 10%, 20%, . . . , 90% of the rise time, and the corresponding acceleration at each point is noted. These nine values, plus the total rise time, are added to the tuning parameters. In general, any number of points/segments can be selected, and the more points taken along the rise curve, the more accurate the model. Each actuator on a device has its own set of tuning parameters in one embodiment. Changing these parameters results in different haptic effects. Examples of tuning parameters include: (1) a gain value for scaling magnitudes; (2) linearization values which help to get a linear response from a non-linear actuator; (3) MagSweep effect kick and brake parameters; (4) Periodic effect parameters; (5) update rate (e.g., 5 ms); and (6) informational parameters such as actuator type, build number, etc.
Similarly to the rise curve, 10 values are obtained for the fall curve in one embodiment, although any number of values can be used. The fall time is considered to be the time required for the actuator to go from rated acceleration to below perceptible acceleration (approximately 0.04 g in one embodiment). If braking is supported by the actuator, a negative overdrive voltage, or some other braking mechanism such as shunting, is applied. Otherwise, no voltage is applied. By applying this tuning model, embodiments have a more accurate picture of which effects a motor can play and also of the state of the motor at a given time.
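The tuning procedure above can be illustrated with a brief sketch. All names here are illustrative assumptions, not from the source; `sample_curve` takes a hypothetical measurement callback and records the acceleration at 10%, 20%, . . . , 90% of a measured rise (or fall) time:

```python
from dataclasses import dataclass


@dataclass
class ActuatorTuning:
    """Hypothetical container for one actuator's rise/fall tuning parameters."""
    rise_time_ms: float    # total time from rest to 90% of rated acceleration
    rise_accels: list      # acceleration (fraction of rated) at 10%..90% of rise time
    fall_time_ms: float    # time from rated acceleration to below ~0.04 g
    fall_accels: list      # acceleration samples along the fall curve


def sample_curve(measure, total_time_ms):
    """Sample an acceleration curve at 10%, 20%, ..., 90% of its total time.

    `measure` is a hypothetical callback returning the measured acceleration
    (as a fraction of rated acceleration) at a given time in milliseconds.
    """
    return [measure(total_time_ms * k / 10.0) for k in range(1, 10)]
```

An `ActuatorTuning` instance would then hold these nine samples together with the measured total time for each curve, giving the 10 values per curve described above.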
At 202, the haptic effect is received in the form of a parameterized vibration definition that defines the envelope/shape of the haptic effect. The definition includes parameters that define the haptic effect, including duration, frequency, magnitude, etc. In one embodiment, the definition is in accordance with the “TouchSense® 3000 Haptic Design Kit” from Immersion Corp. In this embodiment, the haptic effect is defined by a definition shape formed by an envelope that includes three basis parts/regions or effect magnitudes: “impulse”, “sustain” and “fade” that are formed from a MagSweep or Periodic basis effect. Other effect types such as Timelines and Interpolated effects are composed of basis effects. In other embodiments, the definition is in general of an effect that is relatively maintained throughout the functionality of
At 204, the haptic effect definition is pre-processed. In general, the pre-processing revises the haptic effect to match the characteristics of actuator 18 within system 10 by determining the desired haptic effect and determining the actual effect the actuator is capable of playing based on its characteristics. The “new” effect is either the same as, or a scaled down version of, the original effect. For example, if the haptic effect definition is very strong for a short duration, the actuator may not be able to spin up so quickly. Therefore, the pre-processing modifies the haptic effect definition to accommodate the actuator, such as reducing the magnitude, before the haptic effect is played. In one embodiment, pre-processing includes “clipping” the effect based on the desired effect to be played and based on the actuator rise curve and/or fall curve. The clipping occurs at the basis effect level and ensures that the actuator can achieve the desired magnitude at the end of the effect's duration. Therefore, if the same effect is played repeatedly, the effects will not merge into each other, which prevents cascading. In general, the pre-processing modifies the effect in such a way that cascading is reduced or is impossible, while maintaining as much as possible the effect properties relative to each other. Other methods in addition to or instead of magnitude scaling can be used, including effect duration scaling. Additional details of the pre-processing are disclosed below.
At 206, the pre-processed haptic effect is initiated/played by sending the pre-processed signal to the drive circuit, which causes the actuator to create a vibratory haptic effect.
While the haptic effect is playing, at 208 it is determined if a post-processing interval has been reached. The post-processing interval is pre-determined, is typically equal to one timer/clock tick or cycle (e.g., every 5 ms), and occurs each time before a force value is sent to the drive circuit.
If no at 208, at 212 the haptic effect continues to be played until its full duration has expired and functionality continues at 208.
If yes at 208, at 210 the haptic effect definition is post-processed. In general, the post-processing estimates or measures where the actuator lies on its rise or fall curve at a given point (i.e., the actuator's current state), and the force/voltage value sent to the actuator drive circuit is adjusted accordingly. The force value represents how fast the motor is desired to be spinning, based on the effect definition, at a particular time instant. The post-processing determines the current state of the actuator as well as the desired effect force, and determines a new force/magnitude. The new force may be the same as the desired force, or it may be a force that will cause the actuator to achieve the desired force as soon as possible. The force value can be received from an external entity, such as the kernel from the “TouchSense® 3000 Haptic Design Kit” from Immersion Corp. The post-processing adjusts the force value based on an estimate or measurement of the current state of the actuator (i.e., how fast it is already spinning). Additional details of the post-processing are disclosed below.
At 212, the haptic effect, after being post-processed at 210, is played. Functionality continues at 208.
Specifically, as shown in
The variable “oldMax” equals the maximum magnitude among the three sections. In
At 304, it is determined if the impulse section is a ramp down and intersects with the rise curve of the actuator. In
If yes at 304, at 306 the variable “newMax” equals the impulse intersection level and the variable “Susachieved” equals 1. In
If no at 304 (i.e., the impulse is not a ramp down that intersects the rise curve), at 308 newMax equals the sustain intersection level of the rise curve. If the sustain intersection level is greater than the sustain level, then Susachieved equals 1. At 310, it is determined if the fade section ramps down. If yes at 310, at 312 it is determined if Susachieved equals 1. If Susachieved=1, functionality continues at 316.
If no at 310 or 312, at 314 the intersection level of the fade section and the rise curve is determined. If the rise curve does not intersect the fade section, fadeIntersectLevel is set to the maximum that the rise curve can achieve within the fade duration. If the intersection level is greater than newMax, newMax is set to the intersection level. In
At 316, a scale factor of newMax/oldMax is determined. The original effect is then scaled to a scaled effect based on the scale factor by multiplying all magnitude levels by the scale factor. In
As described above, the pre-processing scales the entire effect to something that the actuator can achieve, while maintaining the effect shape (or maintains the original effect). The functionality determines the maximum magnitude (referred to as “oldMax”) out of the initial ramp, sustain and fade magnitudes. The functionality then finds the three magnitudes that are actually achievable based on where the actuator rise curves intersect the effect curves (i.e., “ImpIntersectLevel”, “SusIntersectLevel” and “FadeIntersectLevel”). The newMax is the maximum of these three effect magnitudes. Finally a scale factor of newMax/oldMax is determined. All three effect magnitudes in the original effect are multiplied by this scale factor to come up with the new magnitudes.
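The scaling step summarized above can be expressed as a minimal sketch. The function name and signature are illustrative assumptions (the source provides no code); the three `*_intersect` arguments correspond to ImpIntersectLevel, SusIntersectLevel and FadeIntersectLevel:

```python
def preprocess_scale(impulse_mag, sustain_mag, fade_mag,
                     imp_intersect, sus_intersect, fade_intersect):
    """Scale an effect's three magnitudes to what the actuator can achieve.

    The *_intersect values are the magnitudes actually reachable where the
    actuator rise curve intersects each region of the effect envelope.
    Returns the scaled (impulse, sustain, fade) magnitudes.
    """
    old_max = max(impulse_mag, sustain_mag, fade_mag)
    new_max = max(imp_intersect, sus_intersect, fade_intersect)
    if old_max == 0:
        return 0.0, 0.0, 0.0          # a null effect stays null
    scale = new_max / old_max         # <= 1.0 when the actuator falls short
    return impulse_mag * scale, sustain_mag * scale, fade_mag * scale
```

Because every magnitude is multiplied by the same factor, the shape of the envelope is preserved while the overall strength is reduced to what the actuator can actually reach.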
In other embodiments, instead of clipping effect magnitudes, effect durations could be clipped. Further, both the effect duration and magnitude can be clipped. For this embodiment, a perpendicular line is drawn from the effect to the rise curve and the magnitude and duration at the point of intersection of the perpendicular line and the rise curve is used.
In another embodiment, the area under the effect curve is determined and that area is used to play effects. This embodiment assumes that, for very slow motors, a user would not be able to tell the difference between different effect shapes. In another embodiment, the duration of the effect is determined, along with the speed the actuator could reach after that duration, starting from rest, if the effect were a maximum-magnitude square pulse; the effect magnitude, impulse level and fade level can then be limited to that value. Further, an embodiment can have a separate clipping process for “Timeline” haptic effects. Periodic, MagSweep and Waveform effects can be placed in a Timeline to create complex effect sequences.
At 502, it is determined if the desired force/output is equal to the estimated output. If yes at 502, the output is linearized at 504. Linearizing the output, in general, accounts for real-world factors such as friction in the output determination. For example, friction in the actuator may mean that a 15% force value is needed to generate a haptic effect corresponding to a 1% force. In one embodiment, a table or other mapping is used for linearizing.
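A table-driven linearization of the kind described could be sketched as follows. The table values are made-up assumptions that loosely follow the 15%-for-1% friction example above, not values from the source:

```python
import bisect

# Hypothetical linearization table: (desired force %, drive level %) pairs.
# The (1, 15) entry reflects friction: ~15% drive for a barely perceptible force.
LINEARIZATION_TABLE = [(0, 0), (1, 15), (25, 35), (50, 60), (100, 100)]


def linearize(force_pct):
    """Map a desired force to a drive level via piecewise-linear interpolation."""
    forces = [f for f, _ in LINEARIZATION_TABLE]
    i = bisect.bisect_left(forces, force_pct)
    if i == 0:
        return LINEARIZATION_TABLE[0][1]          # at or below the table minimum
    if i == len(forces):
        return LINEARIZATION_TABLE[-1][1]         # above the table maximum
    (f0, d0), (f1, d1) = LINEARIZATION_TABLE[i - 1], LINEARIZATION_TABLE[i]
    # Interpolate linearly between the two bracketing table entries.
    return d0 + (d1 - d0) * (force_pct - f0) / (f1 - f0)
```

In a real system the table entries would come from the linearization values gathered during the tuning process described earlier.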
If no at 502, at 506 the variable “Diff” is determined as the desired output minus the estimated output.
At 508, it is determined if Diff is greater than zero.
If yes at 508, at 510 the rise slope (i.e., the maximum force increase) that the actuator can achieve in 1 timer tick is determined. In one embodiment, a lookup table based on the rise time slope of the actuator is used for the determination. The lookup table is generated based on the tuning functionality described above, before the pre-processing. Once the lookup table is generated, it is used for both pre-processing and post-processing.
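One way such a lookup table could be derived from the tuning samples is sketched below, under the assumption (consistent with the tuning description earlier) that the rise curve was sampled at 10% intervals and that the actuator reaches 90% of rated acceleration at the end of its measured rise time:

```python
def max_rise_per_tick(rise_samples, rise_time_ms, tick_ms=5.0):
    """Per-segment maximum force increase achievable in one timer tick.

    `rise_samples` holds the acceleration (as a fraction of rated) at
    10%, 20%, ..., 90% of `rise_time_ms`; the end point is assumed to be
    90% of rated acceleration, giving 10 linear segments in total.
    """
    points = [0.0] + list(rise_samples) + [0.9]   # from rest to 90% of rated
    seg_ms = rise_time_ms / (len(points) - 1)     # duration of each linear segment
    # Slope of each segment, scaled to one timer tick.
    return [(points[i + 1] - points[i]) / seg_ms * tick_ms
            for i in range(len(points) - 1)]
```

A matching table for the fall curve would be built the same way from the fall samples, and both tables are then consulted at 510 and 518 on each tick.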
At 512, it is determined if Diff is greater than the rising slope.
If yes at 512, at 514 Diff equals the rising slope and the output equals the maximum output of the actuator.
If no at 512, at 516 the new output is determined using linear interpolation between two points. Additional details regarding the linear interpolation are disclosed below. In other embodiments, any type of interpolation model can be used.
At 524, the Estimate equals Diff, and the New Force equals the output.
At 526, the output is linearized as in 504.
If Diff is not greater than 0 at 508, at 518 the fall slope that the actuator can achieve in 1 timer tick is determined. In one embodiment, a lookup table based on the fall time slope of the actuator is used for the determination.
At 520, it is determined if Diff is less than the falling slope.
If yes at 520, at 522 Diff equals the falling slope and the output equals 0 or the negative of the maximum power based on braking capabilities. Functionality then continues at 524.
If no at 520, functionality continues at 516.
As described above, for post-processing, the rise and fall curves of the actuator are divided into 10 linear segments based on the tuning process. The maximum increase/decrease in force values in a timer tick is determined based on the slope of these line segments. The slope is used to determine if the actuator can get to the desired force value by the next timer tick. If not, a kick or brake pulse is applied. If the actuator can get to the desired force by the next timer tick, the amount that the force should be increased/decreased is determined so that the actuator is at the desired force value by the end of the timer tick. The estimated force is updated every tick, as shown in
In one embodiment, the post-processing can be implemented by the following pseudo-code in which parameter definitions are as follows:
The derived parameters are as follows:
The input is as follows:
The state variable is as follows:
The desired slope is as follows:
The possible slope is as follows:
The output voltage is as follows:
Finally, the updated state is as follows:
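The pseudo-code listings themselves are not reproduced in this text. A hypothetical Python sketch of the per-tick decision logic of 502–526, under illustrative naming and a simplified state update (the source sets the estimate from Diff; here the estimate is advanced by the applied Diff), might look like:

```python
def postprocess_tick(desired, estimate, rise_per_tick, fall_per_tick,
                     max_output, supports_braking=False):
    """One timer tick of post-processing: choose the drive output and update
    the estimated actuator state. Forces are normalized to [0, 1]."""
    if desired == estimate:                 # 502: already at the desired force
        return desired, estimate            # 504: this output is then linearized
    diff = desired - estimate               # 506
    if diff > 0:                            # 508: the actuator must spin up
        if diff > rise_per_tick:            # 512: unreachable in one tick
            diff = rise_per_tick            # 514: kick at maximum output
            output = max_output
        else:
            output = desired                # 516: reachable (interpolation simplified)
    else:                                   # the actuator must slow down
        if diff < -fall_per_tick:           # 520: cannot slow down enough
            diff = -fall_per_tick           # 522: brake if supported, else coast
            output = -max_output if supports_braking else 0.0
        else:
            output = desired                # 516
    estimate += diff                        # 524: new estimated actuator state
    return output, estimate                 # 526: output is then linearized
```

Called once per timer tick (e.g., every 5 ms), this keeps the drive signal within what the motor can physically follow, which is what prevents successive effects from smearing into a cascade.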
As disclosed, embodiments generate haptic effects by pre-processing and/or post-processing a haptic effect definition. The pre-processing looks at the desired effect as well as the actuator characteristics to determine the actual effect the actuator is capable of playing. The pre-processed effect is either the same or a scaled down version of the original effect. The pre-processing changes the haptic effects to what is achievable even before they are played.
The post-processing looks at the current state of the actuator as well as the desired effect force, to come up with a new force. This new force might be the same as the desired force or it might be a force which will get the actuator to the desired force as soon as possible. As a result of the pre-processing and the post-processing, either implemented together in one embodiment, or implemented individually in other embodiments (i.e., one embodiment implements only pre-processing, and another embodiment implements only post-processing), cascading from a series of quick haptic effects, such as consecutive keypad presses, is minimized or eliminated.
Several embodiments are specifically illustrated and/or described herein. However, it will be appreciated that modifications and variations of the disclosed embodiments are covered by the above teachings and within the purview of the appended claims without departing from the spirit and intended scope of the invention.
This application is a continuation of U.S. patent application Ser. No. 14/943,179, filed on Nov. 17, 2015, which is a continuation of U.S. patent application Ser. No. 14/048,374, filed on Oct. 8, 2013, which issued as U.S. Pat. No. 9,213,408. The disclosure of each of these applications is hereby incorporated by reference.
The present application published as U.S. Patent Application Publication No. 2017/0045945 A1 in February 2017.
Related U.S. application data: the present application, Ser. No. 15/338,978, is a continuation of parent application Ser. No. 14/943,179 (filed November 2015), which is a continuation of parent application Ser. No. 14/048,374 (filed October 2013).