The invention relates to a method and apparatus for simulating surface features on a user interface with haptic effects.
Some electronic user interface devices are able to generate a haptic effect to indicate presence of features represented on the user interface devices. If an electronic user interface device has a touch interface, presence of the haptic effect may indicate the feature has been touched by a user, while absence of the haptic effect may indicate the feature has not been touched. Other details of the feature, such as its texture, may be conveyed to the user visually. A fixed periodic haptic effect has been generally described as a way to convey additional details of a feature to the user. Overall, however, the ability to convey feature details to users through haptic effects is still limited.
According to an aspect of the present invention, there is provided a method for producing a haptic effect. The method may include generating a periodic drive signal based on a touch input at a surface and based on a tactile sensation to be simulated at the surface. The periodic drive signal may be applied to a haptic output device.
In an embodiment, the surface may be a surface of an interface device and the haptic output device may be coupled to the surface. In an embodiment, the haptic output device may be configured to generate electrostatic friction. The generation of the periodic drive signal may include altering an amplitude, frequency, or wave shape of the periodic drive signal to alter a level of friction at the surface of the interface device. The alteration of the signal may be based on a location, velocity, acceleration, pressure, or contact area of the touch input.
In an embodiment, an amplitude, frequency, or wave shape of a periodic drive signal may be altered based on a simulated transition between a first simulated region represented on the surface of the interface device and a second simulated region represented on the surface of the interface device. In an embodiment, the simulated transition may comprise movement over a simulated edge of the first simulated region or of the second simulated region. In some instances, the amplitude, frequency, or wave shape may be altered when the location of the touch input is substantially at the edge.
In an embodiment, the periodic drive signal may be based on a texture to be simulated at the surface of the interface device. In some instances, the texture may comprise a grating or mesh texture at the surface of the interface device. The grating may include, for example, a plurality of edges. In some instances, the generation of the periodic drive signal may comprise altering the frequency of the drive signal based on spacing among the plurality of edges of the grating or mesh and based on a velocity of the touch input at the surface. In some instances, the texture may comprise a stick-slip texture at the surface of the interface device, where generating the periodic drive signal may comprise temporarily suspending the periodic drive signal to simulate a slippery texture at the surface of the interface device.
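The stick-slip texture described above lends itself to a short sketch. The following Python fragment is purely illustrative — the function name, the 100 Hz carrier, and the region list are assumptions, not part of the specification — but it shows how a periodic drive value can be temporarily suspended while the touch location lies in a slippery region:

```python
import math

def stick_slip_drive(t, x, base_amplitude, slip_regions):
    """Illustrative stick-slip drive: the periodic signal is suspended
    (zeroed) while the touch location x falls inside a slippery region,
    and resumes in the sticky regions between them."""
    for (x0, x1) in slip_regions:
        if x0 <= x < x1:
            return 0.0  # drive suspended: low friction, slippery feel
    # sticky region: periodic drive at an assumed 100 Hz carrier
    return base_amplitude * math.sin(2 * math.pi * 100.0 * t)
```

A drive circuit would evaluate such a function once per output sample, zeroing the electrostatic drive wherever low friction is to be simulated.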
In an embodiment, a frequency or amplitude of the periodic drive signal may be altered by a pseudo-random amount.
In an embodiment, the method may include generating two periodic drive signals that have different frequencies.
In an embodiment, recorded contact dynamics of an object that moved across another surface may be received. The periodic drive signal may be generated based on the recorded contact dynamics.
According to an aspect of the present invention, there is provided a haptic effect enabled device that comprises a haptic output device, a drive module, and a drive circuit. The drive module may be configured to generate a periodic drive signal based on a touch input at a surface and based on a tactile sensation to be simulated at the surface. The drive circuit may be operatively coupled to the drive module and the haptic output device and configured to apply the periodic drive signal to the haptic output device.
In an embodiment, the haptic effect enabled device may be an interface device, and the surface may be a surface of the interface device. In an embodiment, the haptic output device may be configured to generate electrostatic friction. In the embodiment, the drive module may be configured to generate the periodic drive signal by altering an amplitude, frequency, or wave shape of the periodic drive signal to alter a level of friction at the surface of the interface device. The alteration may be based on a location, velocity, acceleration, pressure, or contact area of the touch input.
These and other aspects, features, and characteristics of the present invention, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
Device 100 may include a mobile phone, tablet computer, electronic display, touch pad, or any other electronic user interface device.
In an embodiment, device 100 may comprise a haptic drive module (e.g., controller 130), a haptic output device 120 to generate haptic effects, and a drive circuit operatively coupled to the controller 130 and the haptic output device 120 so as to apply a drive signal to the haptic output device. Controller 130 may include one or more processors or any other processing unit. Haptic output device 120 may include an actuator (e.g., a voice coil, ultrasonic vibration device, solenoid, piezoelectric device, or any other actuator), an electrostatic device, or any other haptic output device. The ultrasonic vibration device may, in some instances, reduce a level of friction at surface 110. Controller 130 may be operatively coupled to haptic device 120, which may be operatively coupled to surface 110. Haptic output devices are discussed in more detail in U.S. patent application Ser. No. 13/092,269, titled “Electro-vibrotactile Display,” filed Apr. 22, 2011, the entire content of which is incorporated by reference herein.
In an embodiment, controller 130 and haptic device 120 may simulate surface features at surface 110 by controlling a level of friction. For example, a haptic device 120 that includes an actuator may control friction through generating vibrations at surface 110. A haptic device 120 that includes an electrostatic device may control a level of friction through applying a voltage to or underneath surface 110. An alternating voltage signal, for example, may create a capacitive effect that attracts finger 10, a stylus, or any other object at surface 110. The attractive force at the surface may be perceived as friction as the object moves across the surface. Increasing the attractive force may increase a level of friction at the surface. Controlling friction through a haptic effect is discussed in more detail in U.S. patent application Ser. No. 13/092,269, which was incorporated by reference above.
As described in that application, an electrostatic device may, in an embodiment, be used with a surface 110 that includes a conductive layer having one or more electrodes and that includes an insulating layer. The conducting layer may be any semiconductor or other conductive material. The insulating layer may be glass, plastic (e.g., thermoplastic), polymer, or any other insulating material. The electrostatic device may operate by applying an AC signal that, in an embodiment, capacitively couples the conducting layer with an object near or touching surface 110. The AC signal may be generated by a high-voltage amplifier.
The capacitive coupling may control a level of friction on the surface 110. In an embodiment, a texture may be simulated by controlling the level of friction on the surface 110. Varying the levels of attraction between the object and the conducting layer can vary the friction on an object moving across the surface 110. Varying the friction force may simulate one or more textures.
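As a rough illustration of why the drive voltage controls friction, electrostatic attraction in devices of this kind is commonly modeled as growing with the square of the applied voltage and adding to the normal force in a Coulomb friction model. The quadratic model, the constants, and the names below are illustrative assumptions, not material from the specification:

```python
def friction_force(normal_force, voltage, k, mu):
    """Simplified friction model (assumed): electrostatic attraction
    (~ k * V^2) adds to the normal force, so raising the drive voltage
    raises the friction felt by an object sliding on the surface."""
    attraction = k * voltage ** 2
    return mu * (normal_force + attraction)
```

Under this model, modulating the voltage with a periodic signal modulates the friction force in the same rhythm, which is what the user perceives as texture.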
Further, the capacitive coupling may generate a haptic effect by stimulating parts of the object near or touching the surface 110, such as mechanoreceptors in the skin of a user's finger. In an example, an AC voltage signal can be applied to the conducting layer so that it couples with conductive parts of a user's finger. As the user moves his or her finger on the screen, the user may sense a texture of prickliness, graininess, bumpiness, roughness, stickiness, or some other texture.
In an embodiment, surface 110 does not have an insulating layer, so that an object can directly touch the conducting layer. A haptic effect can be generated by applying a voltage from the conducting layer to the object through an electrically conductive path. This embodiment may alternatively use an insulating layer, but include one or more electrodes in the insulating layer that can create an electrically conductive path from the conducting layer to objects that touch the electrode as they move across the insulating layer.
In an embodiment, a haptic effect is not confined to a surface (e.g., surface 110) of an electronic user interface device. In the embodiment, a user's hand, for example, may touch objects beyond a touch screen or touchpad and still perceive a haptic effect. The haptic effect may be generated by, for example, applying a voltage directly to the user's body from a signal generator or any other voltage-generating device. In some instances, the voltage-generating device may be a standalone device adapted to be mounted at a location that frequently comes into contact with the user's body. The voltage may be applied whenever a sensor detects that the user's body is touching an object on which a texture is to be simulated. The voltage may place charge on the user's body. Capacitive interaction between the charge on the user's body and the object being touched may create an attractive force between the user's body and the object. The force of attraction may control a level of friction at a surface of the object, which may simulate a texture or any other tactile sensation of the object being touched. Varying the voltage being applied to the user's body may vary the haptic effect, and thus vary the tactile sensation being simulated. If the voltage is based on a periodic signal, varying the voltage may include varying the amplitude or frequency of the signal. In some instances, the object may have a conductive layer surrounded by an insulating layer. The capacitive interaction may be between the conductive layer and the charge on the user's body. In some instances, both the object being touched and the voltage generating device may have a common ground. In some instances, the user's body may be grounded. In some instances, the user's body is not grounded.
In an embodiment, a user may perceive a simulated texture on an object both through an electrostatic effect that is generated at a surface of the object and through an augmented reality experience created by an electronic user interface device. For example, the electronic user interface device may create an augmented reality experience by displaying a captured image of an object and overlaying a graphical representation of a texture on the image. In the embodiment, the user may perceive a texture on an object both by touching the object and by seeing the graphical representation of the texture overlaid on the object on the electronic user interface.
In an embodiment, controller 130 may be configured to cause haptic device 120 to generate a periodic haptic effect.
In an embodiment, controller 130 may cause haptic device 120 to alter the haptic effect.
In an embodiment, a haptic effect may be altered in a continuous manner. For example
In an embodiment, the continuous gradient in the periodic haptic effect as an object moves across surface 110 may simulate a gradient in texture or any other surface feature. For example, as the object moves across surface 110, the alteration in the periodic haptic effect may simulate a gradient in smoothness, roughness, stickiness, or any other texture. In an embodiment, the alteration in the periodic haptic effect may simulate a gradually increasing resistance, such as that from a spring or any other elastic force. In some instances, an image of a spring or other elastic object may be displayed on surface 110. The periodic haptic effect may simulate a resistance that corresponds with visually displayed stretching of the elastic object.
In an embodiment, a haptic effect may be altered in a discrete manner. For example, as illustrated in
In an embodiment, altering a haptic effect in a discrete manner may simulate discrete regions of different textures on surface 110. For example,
In an embodiment, a periodic haptic effect may be suspended at one or more regions on surface 110.
In an embodiment, a simulated region may have any shape, and may extend in one or more dimensions.
In an embodiment, a periodic haptic effect may be generated to create a pleasant or unpleasant sensation, or more generally a sensation having a psychological association, for a user touching surface 110. For example, the user may perceive a periodic haptic effect with a low frequency as pleasant and a periodic haptic effect with a high frequency as unpleasant. In an embodiment, the periodic haptic effect may be associated with an event displayed on surface 110. For example, the event may be losing a game displayed on surface 110 or attempting to perform an action on device 100 that is prohibited. When the event occurs, the haptic effect may be generated to create an unpleasant sensation for the user.
In an embodiment, a haptic effect that is localized in time or space (e.g., a brief, abrupt pulse) may be generated to simulate an edge or detent. For example,
In an embodiment, the haptic effect may be based on a direction of movement. For example, a localized haptic effect may be more intense if a touch input is moving in a particular direction. The haptic effect may simulate directional textures such as fish scales, or other directional features, such as detents (e.g., in a ratchet).
In an embodiment, the described haptic effects may be part of an interface metaphor. For example, the different regions being simulated by altering a haptic effect may represent different file folders, workspaces, windows, or any other metaphor used in a computing environment. In the interface metaphor, dragging an element on surface 110 may be guided by friction created from a haptic effect. A level of friction being generated by the haptic effect may, for example, indicate how close the dragged element is to a target location.
In an embodiment, a haptic effect generated at a surface may be based on measurements obtained from another surface. For example, to characterize the other surface, a probe may move across the other surface and measure acceleration or velocity of the probe, sound generated from the movement, any other contact dynamics measurement, light reflection off of the surface, or any other physical quantity. For example,
In an embodiment, a playback rate of signal 227a may be based on speed of movement of an object at surface 110 relative to speed of movement of the probe that measured surface 140. For example, if finger 10 moves more quickly than the probe moved across surface 140, signal 227b may be a compressed version of signal 227a in the time domain. If finger 10 moves more slowly than the probe moved across surface 140, signal 227b may be an expanded version of signal 227a in the time domain. Compressing or expanding signal 227a in the time domain may preserve the spatial distance over which a surface feature is reproduced. For example, if the probe measured 1 cm of surface 140 in 1 second, the measured signal should be played back in 0.5 seconds for a finger moving at 2 cm/sec on surface 110. Compressing the measured signal in the time domain ensures that the reproduced surface feature still occupies 1 cm of space on surface 110.
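The compression or expansion described above can be sketched as a simple resampling step. The following Python fragment is an illustrative sketch — the names and the linear-interpolation choice are assumptions, not from the specification:

```python
def resample_for_speed(samples, probe_speed, finger_speed):
    """Compress or expand a recorded contact-dynamics signal in the
    time domain by the ratio of touch speed to probe speed, using
    linear interpolation between recorded samples."""
    ratio = finger_speed / probe_speed
    n_out = max(1, round(len(samples) / ratio))
    out = []
    for i in range(n_out):
        # map each output index back to a (fractional) input position
        pos = i * (len(samples) - 1) / max(1, n_out - 1)
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out
```

A finger moving twice as fast as the probe yields half as many output samples, so the same recorded feature spans the same physical distance at playback.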
In an embodiment, a haptic effect may be based on a combination of one or more haptic drive signals. Two signals may be combined through superposition, modulation (e.g., amplitude or frequency modulation), convolution, or any other combination. The combination of one or more haptic drive signals may include a discrete signal, a continuous signal, or any combination thereof.
A haptic effect may be created based on a combination of any signals. Signals being combined may have different phases, amplitudes, frequencies, or wave shapes.
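A minimal sketch of two of the combinations named above — superposition and amplitude modulation — might look like the following; the component frequencies are arbitrary illustrative choices, not values from the specification:

```python
import math

def superpose(signals, t):
    """Superposition: sum of the component drive signals at time t."""
    return sum(s(t) for s in signals)

def amplitude_modulate(carrier, envelope, t):
    """Amplitude modulation: the envelope signal scales the carrier."""
    return carrier(t) * envelope(t)

# illustrative components with different frequencies
low = lambda t: math.sin(2 * math.pi * 10 * t)    # 10 Hz component
high = lambda t: math.sin(2 * math.pi * 200 * t)  # 200 Hz component
```

Superposing the two yields a drive signal containing both frequencies, while modulating the high-frequency carrier by the low-frequency envelope yields a slowly pulsing high-frequency effect.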
In an embodiment, a haptic effect may be based on a random or pseudo-random haptic drive signal, such as signal 235, illustrated in
In an embodiment, a haptic effect may be based on a combination of a random or pseudo-random signal and another signal. For example, as illustrated in
As discussed above, signals may be combined through superposition, modulation, convolution, or any other combination.
In an embodiment, a random or pseudo-random signal may be generated from a desired frequency distribution of the random or pseudo-random signal. For example,
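One common way to realize a pseudo-random signal with a desired frequency distribution is to sum sinusoids whose amplitudes follow that distribution and whose phases are drawn pseudo-randomly. The sketch below assumes that construction; it is illustrative, not the method of the specification, and every name in it is hypothetical:

```python
import math
import random

def pseudo_random_signal(freq_weights, duration, rate, seed=0):
    """Generate a pseudo-random signal whose energy follows a desired
    frequency distribution: one sinusoid per (freq_hz, weight) pair,
    each given a pseudo-random phase from a seeded generator."""
    rng = random.Random(seed)  # seeded, hence reproducible
    phases = [rng.uniform(0, 2 * math.pi) for _ in freq_weights]
    n = int(duration * rate)
    return [
        sum(w * math.sin(2 * math.pi * f * (i / rate) + p)
            for (f, w), p in zip(freq_weights, phases))
        for i in range(n)
    ]
```

Because the generator is seeded, the same distribution always produces the same waveform, which is what distinguishes a pseudo-random drive signal from a truly random one.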
As discussed above, a haptic effect may be based on a property of how an object moves across a touch interface, such as surface 110. The property may include location, velocity, acceleration, or any other property of the movement.
At operation 301, a location or position of an object creating a touch input at a touch interface, such as finger 10 at surface 110, may be measured. In the illustration of
At operation 303, a velocity of the object may be estimated or otherwise determined. The velocity may be estimated, for example, by dividing a change in location by a change in time. In the illustration in
At operation 305, a haptic effect may be adjusted based on the estimated velocity. For example, to simulate a grating in which edges have equal spacing, impulse signals on which haptic effects are based may have to be compressed in the time domain if velocity of the object increases. If velocity of the object decreases, the impulse signals may be expanded in the time domain. If the series of impulse signals are treated as a periodic square wave, operation 305 may be treated as updating a frequency of the periodic square wave. In the illustration in
When velocity increases from 2 mm/sec to 3 mm/sec, the frequency may be increased from 4 Hz to 6 Hz to still simulate two edges of the grating per millimeter. In other embodiments where spacing in a grating or any other surface feature is not constant, timing of the haptic effect may still be adjusted in proportion with a change in velocity of a touch input so as to preserve a spatial dimension of the surface feature being simulated.
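Operations 303 and 305 can be summarized in a few lines: estimate velocity by finite differences, then set the square-wave frequency so the grating keeps its spatial period. The Python sketch below is illustrative, with all names assumed:

```python
def estimate_velocity(x_prev, x_curr, dt):
    """Operation 303 (sketch): change in location over change in time."""
    return (x_curr - x_prev) / dt

def square_wave_frequency(velocity_mm_s, edges_per_mm):
    """Operation 305 (sketch): drive frequency that keeps the simulated
    grating's spatial period fixed as the touch speeds up or slows down."""
    return abs(velocity_mm_s) * edges_per_mm
```

With two edges per millimeter, the numbers in the text follow directly: 2 mm/sec gives 4 Hz and 3 mm/sec gives 6 Hz.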
In the embodiment of
At an operation 401, a location of the object on the touch interface may be determined. For example,
At time t2, operation 403 may compute output values for a haptic effect based on current location xi and past location xi-1, i.e., on location x2 and location x1. Computing output values may include computing a haptic drive signal for a haptic effect to be generated at or around t2. The drive signal's waveform may have N discrete values corresponding to N positions between xi and xi-1, as illustrated in
At operation 405, the haptic effect based on the waveform computed at operation 403 may be generated. The generated haptic effect may match a desired haptic effect for the interval between xi and xi-1. If the haptic drive signal computed at operation 403 has N discrete values, the haptic effect may be outputted based on one of the values every Δt/N seconds, where Δt is the time between measurements of an object's location and generating a new haptic effect.
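Operations 403 and 405 amount to sampling a position-dependent profile at N points between the previous and current locations and emitting one value every Δt/N seconds. A hypothetical sketch — the profile function and every name here are assumptions for illustration:

```python
def waveform_between(x_prev, x_curr, n, friction_profile):
    """Sample a position-dependent friction profile at N points between
    the previous and current touch locations, producing the N discrete
    drive values to be output, one every dt/N seconds."""
    return [
        friction_profile(x_prev + (x_curr - x_prev) * k / n)
        for k in range(1, n + 1)
    ]

# hypothetical profile: high friction on the first half of the surface
profile = lambda x: 1.0 if x < 5.0 else 0.2
```

Interpolating positions between the two measurements lets the drive signal track a spatial profile even though the touch location is only sampled once per interval.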
In an embodiment, intensity of a haptic effect may be normalized. Normalization may address, for example, periodic haptic effects that are perceived to have different intensities at different frequencies or at different contact properties (e.g., different touch input velocities, applied pressures, finger moisture levels).
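Normalization of this kind can be as simple as dividing a target intensity by a frequency-dependent sensitivity gain obtained from calibration. The calibration table and names below are hypothetical values chosen for illustration:

```python
def normalized_amplitude(target_intensity, frequency_hz, sensitivity):
    """Scale the drive amplitude by the inverse of a measured perceptual
    sensitivity at this frequency, so effects at different frequencies
    are perceived as equally intense."""
    gain = sensitivity.get(frequency_hz, 1.0)  # default: no correction
    return target_intensity / gain

# hypothetical calibration: users feel 400 Hz twice as strongly as 100 Hz
calibration = {50: 0.5, 100: 1.0, 400: 2.0}
```

A frequency at which users are more sensitive thus receives a smaller drive amplitude, and vice versa; the same scheme extends to other contact properties such as velocity or pressure.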
In an embodiment, a haptic effect generated for an electronic user interface device may depend on a history of interactions with the electronic user interface device. For example, the haptic effect may depend on a combination of a previous touch input and a current touch input. The previous touch input may have caused one haptic effect to be generated, while the current touch input may, for instance, cause a different haptic effect to be generated. In an embodiment, a haptic effect may be suspended for a predetermined period of time after a touch input has been detected.
One or more operations of the one or more methods disclosed herein may be implemented as one or more instructions stored on a computer-readable medium and executed by one or more processors. For example, the one or more operations may be implemented through firmware or software code stored on RAM, ROM, EPROM, flash memory, a hard drive, or any other computer-readable medium.
Although the invention has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred embodiments, it is to be understood that such detail is solely for that purpose and that the invention is not limited to the disclosed embodiments, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present invention contemplates that, to the extent possible, one or more features of any embodiment can be combined with one or more features of any other embodiment.
This application is a continuation application of U.S. patent application Ser. No. 15/641,458, filed Jul. 5, 2017, and issued as U.S. Pat. No. 10,139,912 on Nov. 27, 2018, which is a continuation application of U.S. patent application Ser. No. 14/949,033, filed Nov. 23, 2015, and issued as U.S. Pat. No. 9,727,142 on Aug. 8, 2017, which is a continuation application of U.S. patent application Ser. No. 13/665,526, filed Oct. 31, 2012, and issued as U.S. Pat. No. 9,196,134 on Nov. 24, 2015, the entirety of all of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
6046726 | Keyson | Apr 2000 | A |
6337687 | Lee | Jan 2002 | B1 |
6819312 | Fish | Nov 2004 | B2 |
7088342 | Rekimoto et al. | Aug 2006 | B2 |
7205978 | Poupyrev et al. | Apr 2007 | B2 |
7336260 | Martin et al. | Feb 2008 | B2 |
7369115 | Cruz-Hernandez et al. | May 2008 | B2 |
7446456 | Maruyama et al. | Nov 2008 | B2 |
7456823 | Poupyrev et al. | Nov 2008 | B2 |
7554246 | Maruyama et al. | Jun 2009 | B2 |
7663604 | Maruyama et al. | Feb 2010 | B2 |
7755607 | Poupyrev et al. | Jul 2010 | B2 |
RE42064 | Fish | Jan 2011 | E |
7924144 | Makinen et al. | Apr 2011 | B2 |
7982588 | Makinen et al. | Jul 2011 | B2 |
8026798 | Makinen et al. | Sep 2011 | B2 |
8279193 | Birnbaum et al. | Oct 2012 | B1 |
8917234 | Cruz-Hernandez et al. | Dec 2014 | B2 |
9448713 | Cruz-Hernandez et al. | Sep 2016 | B2 |
20020149561 | Fukumoto et al. | Oct 2002 | A1 |
20020177471 | Kaaresoja | Nov 2002 | A1 |
20080068348 | Rosenberg et al. | Mar 2008 | A1 |
20080088580 | Poupyrev et al. | Apr 2008 | A1 |
20080218488 | Yang et al. | Sep 2008 | A1 |
20090079550 | Makinen et al. | Mar 2009 | A1 |
20100085169 | Poupyrev et al. | Apr 2010 | A1 |
20100141407 | Heubel et al. | Jun 2010 | A1 |
20100152794 | Radivojevic et al. | Jun 2010 | A1 |
20100171715 | Peterson | Jul 2010 | A1 |
20100231367 | Cruz-Hernandez et al. | Sep 2010 | A1 |
20100231550 | Cruz-Hernandez et al. | Sep 2010 | A1 |
20100283731 | Grant et al. | Nov 2010 | A1 |
20100289740 | Kim | Nov 2010 | A1 |
20100325931 | Rosenberg | Dec 2010 | A1 |
20100327006 | Camp | Dec 2010 | A1 |
20110051360 | Dabov et al. | Mar 2011 | A1 |
20110109588 | Makinen | May 2011 | A1 |
20110115717 | Hable | May 2011 | A1 |
20110157088 | Motomura et al. | Jun 2011 | A1 |
20110193824 | Modarres et al. | Aug 2011 | A1 |
20110285666 | Poupyrev | Nov 2011 | A1 |
20110285667 | Poupyrev et al. | Nov 2011 | A1 |
20120026180 | Kuchenbecker et al. | Feb 2012 | A1 |
20120142379 | Park | Jun 2012 | A1 |
20120217978 | Shen | Aug 2012 | A1 |
20120229400 | Birnbaum et al. | Sep 2012 | A1 |
20120229401 | Birnbaum et al. | Sep 2012 | A1 |
20120268412 | Cruz-Hernandez | Oct 2012 | A1 |
20120287068 | Colgate | Nov 2012 | A1 |
20120327006 | Israr | Dec 2012 | A1 |
20130002570 | Ogg | Jan 2013 | A1 |
20130164543 | Shibuya | Jun 2013 | A1 |
20130222280 | Sheynblat et al. | Aug 2013 | A1 |
20130227410 | Sridhara et al. | Aug 2013 | A1 |
20130307789 | Karamath et al. | Nov 2013 | A1 |
20130314303 | Osterhout et al. | Nov 2013 | A1 |
20150123775 | Kerdemelidis | May 2015 | A1 |
20150355710 | Modarres et al. | Dec 2015 | A1 |
Number | Date | Country |
---|---|---|
102227696 | Oct 2011 | CN |
102349041 | Feb 2012 | CN |
06278056 | Oct 1994 | JP |
3085481 | May 2002 | JP |
2003-248540 | Sep 2003 | JP |
2006-012184 | Jan 2006 | JP |
2006163579 | Jun 2006 | JP |
2008027223 | Feb 2008 | JP |
2011-008532 | Jan 2011 | JP |
2011129091 | Jun 2011 | JP |
2011141868 | Jul 2011 | JP |
2011-248884 | Dec 2011 | JP |
2012064095 | Mar 2012 | JP |
2012-520523 | Sep 2012 | JP |
2012520137 | Sep 2012 | JP |
10-1338332 | Dec 2013 | KR |
2010037894 | Apr 2010 | WO |
2012145264 | Oct 2012 | WO |
2010134349 | Nov 2012 | WO |
2013007882 | Jan 2013 | WO |
Entry |
---|
European Patent Application No. 13191210.7, Extended Search Report dated May 16, 2014. |
Bau, et al., “TeslaTouch: Electrovibration for Touch Surfaces,” UIST '10 (Oct. 3-6, 2010), pp. 283-292, New York, NY. |
Bau, et al., “REVEL: Tactile Feedback Technology for Augmented Reality,” SIGGRAPH 2012 (Aug. 5-9, 2012), Los Angeles, CA. |
“Revel: Programming the Sense of Touch,” Disney Research Hub (Jul. 31, 2012), retrieved Oct. 31, 2012 from http://www.youtube.com/watch?v=L7DGq8SddEQ. |
Smith, N., “Touch Screens That Touch Back: Feeling in the Future,” LiveScience (Dec. 29, 2010), retrieved Oct. 31, 2012 http://www.livescience.com/11228-touch-screens-touch-feeling-future.html. |
Greene, K., “A Touch Screen with Texture,” MIT Technology Review (Oct. 13, 2010), retrieved Oct. 31, 2012 from http://www.technologyreview.com/news/421191/a-touch-screen-with-texture. |
Marks, P., “Nokia touchscreen creates texture illusion,” NewScientist (Sep. 28, 2010), retrieved Oct. 31, 2012 from http://newscientist.com/article/dn19510-nokia-touchscreen-creates-texture-illusion. |
“Nokia touchscreen creates texture illusion,” Gotchacode (Oct. 10, 2010), retrieved Oct. 31, 2012 from https://gotchacode.wordpress.com/2010/10/10/nokia-touchscreen-creates-texture-illusion. |
Bonderud, D., “Nokia Files Patent to Make Phones Digitally Stimulating,” InventorSpot, retrieved Oct. 31, 2012 from http://inventorspot.com/articles/nokia_files_patent_make_phones_digitally_stimulating. |
Japanese Patent Application 2013-213693, Office Action dated Jun. 27, 2017. |
U.S. Appl. No. 14/949,033 , “Non-Final Office Action”, dated Nov. 16, 2016, 11 pages. |
CN 201310529209.2, “Office Action”, dated Mar. 16, 2018, 7 pages. |
JP 2013-213693, “Office Action”, dated Apr. 5, 2018, 4 pages. |
State Intellectual Property Office of the Peoples Republic of China Application No. 201310529209.2, Office Action dated Jul. 26, 2017. |
CN 201310529209.2, “Office Action,” dated Sep. 10, 2018, 6 pages. |
JP 2013-213693, “Office Action,” dated Nov. 13, 2018, 24 pages. |
Chinese Application No. CN201310529209.2, “Notice of Decision to Grant”, dated Apr. 8, 2019, 2 pages. |
Japanese Application No. JP2013-213693 , “Notice of Allowance”, dated Jul. 2, 2019, 2 pages. |
Japanese Application No. JP2017-185478 , “Office Action”, dated Jan. 29, 2019, 8 pages. |
Number | Date | Country | |
---|---|---|---|
20190121437 A1 | Apr 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15641458 | Jul 2017 | US |
Child | 16166465 | US | |
Parent | 14949033 | Nov 2015 | US |
Child | 15641458 | US | |
Parent | 13665526 | Oct 2012 | US |
Child | 14949033 | US |