Interactivity model for shared feedback on mobile devices

Information

  • Patent Grant
  • Patent Number
    8,711,118
  • Date Filed
    Wednesday, February 15, 2012
  • Date Issued
    Tuesday, April 29, 2014
Abstract
A system that produces a dynamic haptic effect and generates a drive signal that includes a gesture signal and a real or virtual device sensor signal. The haptic effect is modified dynamically based on both the gesture signal and the real or virtual device sensor signal such as from an accelerometer or gyroscope, or by a signal created from processing data such as still images, video or sound. The haptic effect may optionally be modified dynamically by using the gesture signal and the real or virtual device sensor signal and a physical model, or may optionally be applied concurrently to multiple devices which are connected via a communication link. The haptic effect may optionally be encoded into a data file on a first device. The data file is then communicated to a second device and the haptic effect is read from the data file and applied to the second device.
Description
FIELD OF THE INVENTION

One embodiment is directed generally to a user interface for a device, and in particular to producing a dynamic haptic effect using multiple gesture signals and real or virtual device sensor signals.


BACKGROUND INFORMATION

Electronic device manufacturers strive to produce a rich interface for users. Conventional devices use visual and auditory cues to provide feedback to a user. In some interface devices, kinesthetic feedback (such as active and resistive force feedback) and/or tactile feedback (such as vibration, texture, and heat) is also provided to the user, known collectively as “haptic feedback” or “haptic effects”. Haptic feedback can provide cues that enhance and simplify the user interface. Specifically, vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users of electronic devices to alert the user to specific events, or to provide realistic feedback that creates greater sensory immersion within a simulated or virtual environment.


In order to generate vibration effects, many devices utilize some type of actuator. Known actuators used for this purpose include an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or “smart materials” such as piezoelectrics, electro-active polymers or shape memory alloys.


Traditional architectures provide haptic feedback only when triggered effects are available, and must be carefully designed to make sure the timing of the haptic feedback is correlated to user initiated gestures or system animations. However, because these user gestures and system animations have variable timing, the correlation to haptic feedback may be static and inconsistent and therefore less compelling to the user. Further, device sensor information is typically not used in combination with gestures to produce haptic feedback.


Therefore, there is a need for an improved system of providing a dynamic haptic effect that includes multiple gesture signals and device sensor signals. There is a further need for providing concurrent haptic feedback to multiple devices which are connected via a communication link.


SUMMARY OF THE INVENTION

One embodiment is a system that produces a dynamic haptic effect and generates a drive signal that includes a gesture signal and a real or virtual device sensor signal. The haptic effect is modified dynamically based on both the gesture signal and the real or virtual device sensor signal such as from an accelerometer or gyroscope, or by a signal created from processing data such as still images, video or sound. The haptic effect may optionally be modified dynamically by using the gesture signal and the real or virtual device sensor signal and a physical model. The haptic effect may optionally be applied concurrently to multiple devices which are connected via a communication link. The haptic effect may optionally be encoded into a data file on a first device. The data file is then communicated to a second device and the haptic effect is read from the data file and applied to the second device.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a haptically-enabled system according to one embodiment of the present invention.



FIG. 2 is a cut-away perspective view of an LRA implementation of a haptic actuator according to one embodiment of the present invention.



FIG. 3 is a cut-away perspective view of an ERM implementation of a haptic actuator according to one embodiment of the present invention.



FIGS. 4A-4C are views of a piezoelectric implementation of a haptic actuator according to one embodiment of the present invention.



FIGS. 5A-5C are screen views of a user initiated dynamic haptic effect according to one embodiment of the present invention.



FIGS. 6A-6F are screen views of encoding a haptic effect into a data file according to one embodiment of the present invention.



FIG. 7 is a screen view of a user initiated dynamic haptic effect according to one embodiment of the present invention.



FIGS. 8A-8E are screen views of applying a haptic effect concurrently to multiple devices according to one embodiment of the present invention.



FIG. 9 is a flow diagram for producing a dynamic haptic effect with a gesture signal and a device sensor signal according to one embodiment of the present invention.



FIG. 10 is a flow diagram for concurrently applying a haptic effect to multiple devices according to one embodiment of the present invention.



FIG. 11 is a flow diagram for encoding and applying a haptic effect using a data file according to one embodiment of the present invention.





DETAILED DESCRIPTION

As described below, a dynamic effect refers to a haptic effect that evolves over time as it responds to one or more input parameters. A dynamic effect signal can be any type of signal, but does not necessarily have to be complex. For example, a dynamic effect signal may be a simple sine wave that has some property such as phase, frequency, or amplitude that is changing over time or reacting in real time according to a mapping schema which maps an input parameter onto a changing property of the effect signal. An input parameter may be any type of input capable of being provided by a device, and typically may be any type of signal such as a device sensor signal. A device sensor signal may be generated by any means, and typically may be generated by capturing a user gesture with a device. Dynamic effects may be very useful for gesture interfaces, but the use of gestures or sensors is not necessarily required to create a dynamic signal.
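
As a minimal illustration of the mapping schema described above (this is not code from the patent; the function name, the 175 Hz carrier and the linear amplitude mapping are assumptions), a sine-wave effect signal whose amplitude tracks a normalized input parameter could look like this:

```python
import math

def dynamic_effect_sample(t, input_param, base_freq_hz=175.0):
    """Return one sample of a simple dynamic effect signal at time t (seconds).

    The carrier is a plain sine wave; the normalized input parameter
    (0.0-1.0, e.g. a scaled gesture or sensor value) is mapped onto the
    amplitude, so the effect signal evolves as the input changes over time.
    """
    amplitude = max(0.0, min(1.0, input_param))  # simple linear mapping schema
    return amplitude * math.sin(2.0 * math.pi * base_freq_hz * t)

# The same carrier, scaled by an input parameter that ramps from 0 to 1
# over one second (sampled at 1 kHz).
samples = [dynamic_effect_sample(i / 1000.0, i / 1000.0) for i in range(1000)]
```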


One common scenario that does not involve gestures directly is defining the dynamic haptic behavior of an animated widget. For example, when a user scrolls a list, it is typically not the gesture itself that is haptified; rather, it is the motion of the widget in response to the gesture that feels most intuitive when haptified. In the scroll list example, gently sliding a virtual scroll bar may generate dynamic haptic feedback that changes according to the speed of the scrolling, but flinging the scroll bar may produce dynamic haptics even after the gesture has ended. This creates the illusion that the widget has some physical properties, and it provides the user with information about the state of the widget, such as its speed or whether it is in motion.


A gesture is any movement of the body that conveys meaning or user intent. It will be recognized that simple gestures may be combined to form more complex gestures. For example, bringing a finger into contact with a touch sensitive surface may be referred to as a “finger on” gesture, while removing a finger from a touch sensitive surface may be referred to as a separate “finger off” gesture. If the time between the “finger on” and “finger off” gestures is relatively short, the combined gesture may be referred to as “tapping”; if the time between the “finger on” and “finger off” gestures is relatively long, the combined gesture may be referred to as “long tapping”; if the distance between the two dimensional (x,y) positions of the “finger on” and “finger off” gestures is relatively large, the combined gesture may be referred to as “swiping”; if the distance between the two dimensional (x,y) positions of the “finger on” and “finger off” gestures is relatively small, the combined gesture may be referred to as “smearing”, “smudging” or “flicking”. Any number of two dimensional or three dimensional simple or complex gestures may be combined in any manner to form any number of other gestures, including, but not limited to, multiple finger contacts, palm or fist contact, or proximity to the device.
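
The combinations described above can be sketched as a simple classifier. The numeric thresholds below are illustrative assumptions only; the patent characterizes the distinctions qualitatively (“relatively short/long”, “relatively small/large”):

```python
import math

# Illustrative thresholds only, not values from the patent.
TAP_MAX_DURATION_S = 0.3
SWIPE_MIN_DISTANCE_PX = 50.0
SMEAR_MIN_DISTANCE_PX = 5.0

def classify_gesture(t_on, xy_on, t_off, xy_off):
    """Combine a "finger on" and a "finger off" event into a compound gesture."""
    duration = t_off - t_on
    distance = math.hypot(xy_off[0] - xy_on[0], xy_off[1] - xy_on[1])
    if distance >= SWIPE_MIN_DISTANCE_PX:
        return "swiping"
    if distance >= SMEAR_MIN_DISTANCE_PX:
        return "smearing"                      # or "smudging" / "flicking"
    return "tapping" if duration <= TAP_MAX_DURATION_S else "long tapping"

print(classify_gesture(0.00, (10, 10), 0.12, (12, 11)))    # tapping
print(classify_gesture(0.00, (10, 10), 0.15, (220, 14)))   # swiping
```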



FIG. 1 is a block diagram of a haptically-enabled system 10 according to one embodiment of the present invention. System 10 includes a touch sensitive surface 11 or other type of user interface mounted within a housing 15, and may include mechanical keys/buttons 13. Internal to system 10 is a haptic feedback system that generates vibrations on system 10. In one embodiment, the vibrations are generated on touch surface 11.


The haptic feedback system includes a processor 12. Coupled to processor 12 is a memory 20 and an actuator drive circuit 16, which is coupled to a haptic actuator 18. Processor 12 may be any type of general purpose processor, or could be a processor specifically designed to provide haptic effects, such as an application-specific integrated circuit (“ASIC”). Processor 12 may be the same processor that operates the entire system 10, or may be a separate processor. Processor 12 can decide what haptic effects are to be played and the order in which the effects are played based on high level parameters. In general, the high level parameters that define a particular haptic effect include magnitude, frequency and duration. Low level parameters such as streaming motor commands could also be used to determine a particular haptic effect. A haptic effect may be considered dynamic if it includes some variation of these parameters when the haptic effect is generated or a variation of these parameters based on a user's interaction.


Processor 12 outputs the control signals to drive circuit 16 which includes electronic components and circuitry used to supply actuator 18 with the required electrical current and voltage to cause the desired haptic effects. System 10 may include more than one actuator 18, and each actuator may include a separate drive circuit 16, all coupled to a common processor 12. Memory device 20 can be any type of storage device or computer-readable medium, such as random access memory (RAM) or read-only memory (ROM). Memory 20 stores instructions executed by processor 12. Among the instructions, memory 20 includes an actuator drive module 22, which comprises instructions that, when executed by processor 12, generate drive signals for actuator 18 while also determining feedback from actuator 18 and adjusting the drive signals accordingly. The functionality of module 22 is discussed in more detail below. Memory 20 may also be located internal to processor 12, or any combination of internal and external memory.


Touch surface 11 recognizes touches, and may also recognize the position and magnitude or pressure of touches on the surface. The data corresponding to the touches is sent to processor 12, or another processor within system 10, and processor 12 interprets the touches and in response generates haptic effect signals. Touch surface 11 may sense touches using any sensing technology, including capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, etc. Touch surface 11 may sense multi-touch contacts and may be capable of distinguishing multiple touches that occur at the same time. Touch surface 11 may be a touchscreen that generates and displays images for the user to interact with, such as keys, dials, etc., or may be a touchpad with minimal or no images.


System 10 may be a handheld device, such as a cellular telephone, PDA, computer tablet, gaming console, etc. or may be any other type of device that provides a user interface and includes a haptic effect system that includes one or more ERMs, LRAs, electrostatic or other types of actuators. The user interface may be a touch sensitive surface, or can be any other type of user interface such as a mouse, touchpad, mini-joystick, scroll wheel, trackball, game pads or game controllers, etc. In embodiments with more than one actuator, each actuator may have a different output capability in order to create a wide range of haptic effects on the device. Each actuator may be any type of haptic actuator or a single or multidimensional array of actuators.



FIG. 2 is a cut-away side view of an LRA implementation of actuator 18 according to one embodiment. LRA 18 includes a casing 25, a magnet/mass 27, a linear spring 26, and an electric coil 28. Magnet 27 is mounted to casing 25 by spring 26. Coil 28 is mounted directly on the bottom of casing 25 underneath magnet 27. LRA 18 is typical of any known LRA. In operation, when current flows through coil 28, a magnetic field forms around coil 28 which interacts with the magnetic field of magnet 27 to push or pull on magnet 27. One current flow direction/polarity causes a push action and the other a pull action. Spring 26 controls the up and down movement of magnet 27 and has a deflected up position where it is compressed, a deflected down position where it is expanded, and a neutral or zero-crossing position where it is neither compressed nor expanded and which is equal to its resting state when no current is applied to coil 28 and there is no movement/oscillation of magnet 27.


For LRA 18, a mechanical quality factor or “Q factor” can be measured. In general, the mechanical Q factor is a dimensionless parameter that compares a time constant for decay of an oscillating physical system's amplitude to its oscillation period. The mechanical Q factor is significantly affected by mounting variations. The mechanical Q factor represents the ratio of the energy circulated between the mass and spring over the energy lost at every oscillation cycle. A low Q factor means that a large portion of the energy stored in the mass and spring is lost at every cycle. In general, a minimum Q factor occurs when system 10 is held firmly in a hand due to energy being absorbed by the tissues of the hand. The maximum Q factor generally occurs when system 10 is pressed against a hard and heavy surface that reflects all of the vibration energy back into LRA 18.
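
For reference, the standard textbook expression of this ratio (added here for clarity; the formula is not recited in the patent) is

$$Q \;=\; 2\pi\,\frac{E_{\text{stored}}}{E_{\text{dissipated per cycle}}} \;=\; \frac{f_r}{\Delta f},$$

where $f_r$ is the resonant frequency and $\Delta f$ is the half-power bandwidth. A firm hand grip increases the energy dissipated per cycle and therefore lowers Q, consistent with the minimum and maximum cases described above.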


In direct proportionality to the mechanical Q factor, the forces that occur between magnet/mass 27 and spring 26 at resonance are typically 10-100 times larger than the force that coil 28 must produce to maintain the oscillation. Consequently, the resonant frequency of LRA 18 is mostly defined by the mass of magnet 27 and the compliance of spring 26. However, when an LRA is mounted to a floating device (i.e., system 10 held softly in a hand), the LRA resonant frequency shifts up significantly. Further, significant frequency shifts can occur due to external factors affecting the apparent mounting weight of LRA 18 in system 10, such as a cell phone flipped open/closed or the phone held tightly.
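
The dependence on the mass of magnet 27 and the compliance of spring 26 follows from the standard mass-spring relation (textbook physics, not text from the patent):

$$f_r \;=\; \frac{1}{2\pi}\sqrt{\frac{k}{m}} \;=\; \frac{1}{2\pi\sqrt{m\,C}},$$

where $m$ is the moving mass, $k$ the spring stiffness and $C = 1/k$ the spring compliance. Changes in the apparent mounting mass or effective compliance shift $f_r$, consistent with the frequency shifts noted above.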



FIG. 3 is a cut-away perspective view of an ERM implementation of actuator 18 according to one embodiment of the present invention. ERM 18 includes a rotating mass 301 having an off-center weight 303 that rotates about an axis of rotation 305. In operation, any type of motor may be coupled to ERM 18 to cause rotation in one or both directions around the axis of rotation 305 in response to the amount and polarity of voltage applied to the motor. It will be recognized that an application of voltage in the same direction of rotation will have an acceleration effect and cause the ERM 18 to increase its rotational speed, and that an application of voltage in the opposite direction of rotation will have a braking effect and cause the ERM 18 to decrease or even reverse its rotational speed.


One embodiment of the present invention provides haptic feedback by determining and modifying the angular speed of ERM 18. Angular speed is a scalar measure of rotation rate, and represents the magnitude of the vector quantity angular velocity. Angular speed or frequency ω, in radians per second, correlates to frequency ν in cycles per second (also called hertz, Hz) by a factor of 2π. The drive signal includes a drive period where at least one drive pulse is applied to ERM 18, and a monitoring period where the back electromotive force (“back EMF”) of the rotating mass 301 is received and used to determine the angular speed of ERM 18. In another embodiment, the drive period and the monitoring period are concurrent and the present invention dynamically determines the angular speed of ERM 18 during both the drive and monitoring periods.
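
As a hedged sketch of the monitoring step (the linear motor-constant model and the numeric values are assumptions, not taken from the patent), the angular speed could be estimated from the measured back EMF as follows:

```python
import math

def angular_speed_from_back_emf(back_emf_volts, k_e_volts_per_rad_s):
    """Estimate ERM angular speed from its back EMF.

    Assumes a DC-motor model in which back EMF is proportional to angular
    velocity (V = k_e * omega); k_e is a motor constant that would be
    characterized for the specific actuator.
    """
    omega = back_emf_volts / k_e_volts_per_rad_s   # radians per second
    freq_hz = omega / (2.0 * math.pi)              # cycles per second, since omega = 2*pi*v
    return omega, freq_hz

omega, hz = angular_speed_from_back_emf(back_emf_volts=0.55, k_e_volts_per_rad_s=0.0005)
print(f"angular speed: {omega:.0f} rad/s ({hz:.0f} Hz)")
```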



FIGS. 4A-4C are views of a piezoelectric implementation of a haptic actuator 18 according to one embodiment of the present invention. FIG. 4A shows a disk piezoelectric actuator that includes an electrode 401, a piezoelectric ceramic disk 403 and a metal disk 405. As shown in FIG. 4B, when a voltage is applied to electrode 401, the piezoelectric actuator bends in response, going from a relaxed state 407 to a transformed state 409. It is this bending of the actuator that creates the vibration. Alternatively, FIG. 4C shows a beam piezoelectric actuator that operates similarly to a disk piezoelectric actuator by going from a relaxed state 411 to a transformed state 413.



FIGS. 5A-5C are screen views of a user initiated dynamic haptic effect according to one embodiment of the present invention. Dynamic effects involve changing a haptic effect provided by a haptic enabled device in real time according to an interaction parameter. An interaction parameter can be derived from any two-dimensional or three-dimensional gesture using information such as the position, direction and velocity of a gesture from a two-dimensional on-screen display such as on a mobile phone or tablet computer, or a three-dimensional gesture detection system such as a video motion capture system or an electronic glove worn by the user, or by any other 2D or 3D gesture input means. FIG. 5A shows a screen view of a mobile device having a touch sensitive display which displays one photograph out of a group of photographs. FIG. 5B shows a screen view of a user gesture using a single index finger being swiped across the touch sensitive display from right to left in order to display the next photograph. Multiple inputs from the index finger are received from the single gesture. Each of the multiple inputs may occur at a different time and may indicate a different two dimensional position of the contact point of the index finger with the touch sensitive display.



FIG. 5C shows a screen view of the next photograph being displayed in conjunction with a dynamic haptic effect. Based upon the one or more inputs from the one or more user gestures in FIG. 5B, a dynamic haptic effect is provided during the user gesture and continuously modified as determined by the interaction parameter. The dynamic haptic effect may speed up or slow down, increase or decrease in intensity, or change its pattern or duration, or change in any other way, in real-time according to such elements as the speed, direction, pressure, magnitude, or duration of the user gesture itself, or based on a changing property of a virtual object such as the number of times an image has been viewed. The dynamic haptic effect may further continue and may further be modified by the interaction parameter even after the user gesture has stopped. For example, in one embodiment the dynamic haptic effect may stop immediately at the end of the user gesture, or in another embodiment the dynamic haptic effect may optionally fade slowly after the end of the user gesture according to the interaction parameter. The effect of providing or modifying a dynamic haptic effect in real-time during and even after a user gesture is that no two gestures such as page turns or finger swipes will feel the same to the user. That is, the dynamic haptic effect will always be unique to the user gesture, thereby creating a greater sense of connectedness to the device and a more compelling user interface experience for the user as compared to a simple static haptic effect provided by a trigger event.
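
One way the behavior described above could be expressed in code is sketched below; the function, the speed normalization and the linear fade are illustrative assumptions rather than the patent's implementation:

```python
def dynamic_effect_magnitude(gesture_speed, time_since_gesture_end=None,
                             fade_time_s=0.5, max_speed=2000.0):
    """Map a gesture to a haptic drive magnitude in [0.0, 1.0].

    While the gesture is active, magnitude tracks gesture speed (px/s);
    after the gesture ends, the effect fades linearly over fade_time_s,
    illustrating an effect that continues past the end of the gesture.
    """
    magnitude = min(1.0, gesture_speed / max_speed)
    if time_since_gesture_end is not None:                 # gesture has ended
        remaining = max(0.0, 1.0 - time_since_gesture_end / fade_time_s)
        magnitude *= remaining
    return magnitude

print(dynamic_effect_magnitude(1200.0))                               # during the swipe
print(dynamic_effect_magnitude(1200.0, time_since_gesture_end=0.25))  # fading out afterwards
```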


The interaction parameter may also be derived from device sensor data such as whole device acceleration, gyroscopic information or ambient information. Device sensor signals may be any type of sensor input enabled by a device, such as from an accelerometer or gyroscope, or any type of ambient sensor signal such as from a microphone, photometer, thermometer or altimeter, or any type of bio monitor such as skin or body temperature, blood pressure (BP), heart rate monitor (HRM), electroencephalograph (EEG), or galvanic skin response (GSR), or information or signals received from a remotely coupled device, or any other type of signal or sensor including, but not limited to, the examples listed in TABLE 1 below.









TABLE 1
LIST OF SENSORS

For the purposes of physical interaction design, a sensor is a transducer that converts a form of energy into an electrical signal, or any signal that represents virtual sensor information.

Acceleration: Accelerometer
Biosignals: Electrocardiogram (ECG), Electroencephalogram (EEG), Electromyography (EMG), Electrooculography (EOG), Electropalatography (EPG), Galvanic Skin Response (GSR)
Distance: Capacitive, Hall Effect, Infrared, Ultrasound
Flow: Ultrasound
Force/pressure/strain/bend: Air Pressure, Fibre Optic Sensors, Flexion, Force-sensitive Resistor (FSR), Load Cell, LuSense CPS2 155, Miniature Pressure Transducer, Piezoelectric Ceramic & Film, Strain Gage
Humidity: Hygrometer
Linear position: Hall Effect, Linear Position (Touch), Linear Potentiometer (Slider), Linear Variable Differential Transformer (LVDT), LuSense CPS2 155
Orientation/inclination: Accelerometer, Compass (Magnetoresistive), Inclinometer
Radio Frequency: Radio Frequency Identification (RFID)
Rotary position: Rotary Encoder, Rotary Potentiometer
Rotary velocity: Gyroscope
Switches: On-Off Switch
Temperature: Temperature
Vibration: Piezoelectric Ceramic & Film
Visible light intensity: Fibre Optic Sensors, Light-Dependent Resistor (LDR)










Active or ambient device sensor data may be used to modify the haptic feedback based on any number of factors relating to a user's environment or activity. For example, an accelerometer device sensor signal may indicate that a user is engaging in physical activity such as walking or running, so the pattern and duration of the haptic feedback should be modified to be more noticeable to the user. In another example, a microphone sensor signal may indicate that a user is in a noisy environment, so the amplitude or intensity of the haptic feedback should be increased. Sensor data may also include virtual sensor data which is represented by information or signals that are created from processing data such as still images, video or sound. For example, a video game that has a virtual racing car may dynamically change a haptic effect based on the car velocity, how close the car is to the camera viewing angle, the size of the car, and so on.
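
A hedged sketch of this kind of sensor-driven adjustment is shown below; the thresholds and gain factors are invented for illustration and are not values from the patent:

```python
def adjust_drive(base_magnitude, accel_rms_g=0.0, mic_level_db=40.0):
    """Scale a haptic drive magnitude using active/ambient sensor data.

    Illustrative rules only: boost the effect when accelerometer activity
    suggests walking or running, and when the microphone indicates a noisy
    environment.
    """
    magnitude = base_magnitude
    if accel_rms_g > 0.3:          # user appears to be in motion
        magnitude *= 1.5
    if mic_level_db > 70.0:        # noisy environment
        magnitude *= 1.3
    return min(1.0, magnitude)

print(adjust_drive(0.5, accel_rms_g=0.6, mic_level_db=80.0))  # 0.975
```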


The interaction parameter may optionally incorporate a mathematical model related to a real-world physical effect such as gravity, acceleration, friction or inertia. For example, the motion and interaction that a user has with an object such as a virtual rolling ball may appear to follow the same laws of physics in the virtual environment as an equivalent rolling ball would follow in a non-virtual environment.
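
As an illustrative sketch only (the constants and the constant-friction model are assumptions), a flicked virtual ball whose haptic magnitude decays with its simulated speed might look like this:

```python
def rolling_ball_haptics(initial_speed, friction_decel=2.0, dt=0.1, steps=10):
    """Yield (speed, haptic_magnitude) pairs for a flicked virtual ball.

    The ball decelerates under a simple constant-friction model and the
    haptic magnitude is tied to the simulated speed, so the effect decays
    the way an equivalent physical ball would slow down.
    """
    speed = initial_speed
    for _ in range(steps):
        speed = max(0.0, speed - friction_decel * dt)
        yield speed, min(1.0, speed / initial_speed)

for speed, magnitude in rolling_ball_haptics(initial_speed=1.5):
    print(f"speed={speed:.2f} m/s  magnitude={magnitude:.2f}")
```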


The interaction parameter may optionally incorporate an animation index to correlate the haptic output of a device to an animation or a visual or audio script. For example, an animation or script may play in response to a user or system initiated action such as opening or changing the size of a virtual window, turning a page or scrolling through a list of data entries.


Two or more gesture signals, device sensor signals or physical model inputs may be used alone or in any combination with each other to create an interaction parameter having a difference vector. A difference vector may be created from two or more scalar or vector inputs by comparing the scalar or vector inputs with each other, determining what change or difference exists between the inputs, and then generating a difference vector which incorporates a position, a direction and a magnitude. Gesture signals may be used alone to create a gesture difference vector, or device sensor signals may be used alone to create a device signal difference vector.
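
A minimal sketch of building such a difference vector from two successive two-dimensional inputs is shown below; the dictionary layout and the radian direction convention are assumptions chosen for illustration:

```python
import math

def difference_vector(p1, p2):
    """Build a difference vector from two (x, y) inputs.

    Returns the second input's position, the direction of the change in
    radians, and its magnitude, mirroring the position/direction/magnitude
    components described above.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    return {
        "position": p2,
        "direction": math.atan2(dy, dx),
        "magnitude": math.hypot(dx, dy),
    }

# e.g. two successive gesture samples
print(difference_vector((100, 200), (160, 120)))
```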



FIGS. 6A-6F are screen views of encoding a haptic effect into a data file according to one embodiment of the present invention. In order to facilitate dynamic haptic feedback between two or more users, it is not necessary to have low-latency or pseudo-synchronous communication of the haptic effect. Instead, one embodiment of the present invention enables remote haptic interaction that takes place out of real time by encoding haptic effect data into a shared data file. An example of such a non-real-time interaction is encoding the haptic effect taken from a digital drawing surface. FIG. 6A shows a default screen view of a virtual “frost” application running on a handheld or mobile device having a digital drawing surface and a haptic actuator. FIG. 6B shows the screen view of a “frosted” screen, created from the default screen view in response to user gestures or device sensor signals such as blowing into a microphone on the handheld device. Once the screen is frosted, FIG. 6C shows the creation of a stylized face pattern drawn in the frost according to gestures provided by the first user. The frosted screen and stylized face are stored in a data file in a format that supports either raster or vector depiction of images, and optionally any other data or metadata necessary for subsequent reproduction of the image such as information about stored gestures or device sensor information.


A haptic effect corresponding to the motions used to create the stylized face is stored or encoded into the data file concurrently with the other image information in the data file. The haptic effect information may be stored in any way that permits the reproduction of the haptic effect along with the image. The data file is then communicated to a second device having a haptic actuator via any file transfer mechanism or communication link. FIG. 6D shows the second device reading the stored gesture or device sensor signal from the data file on the second device and displaying the default frosted screen view. FIG. 6E shows how the stylized face is then subsequently displayed on the second device. A drive signal is also applied to the haptic actuator on the second device according to the gesture or device sensor signal stored in the file.


The second user may optionally collaborate with the first user to create a combined data file by providing additional gestures or device sensor signals to add the virtual message “Hi” on the drawing, along with any corresponding haptic effect generated from the virtual message and stored in the data file. FIG. 6F shows the final collaborative screen view which combines gestures and device sensor signals from the first and second users along with the corresponding haptic effect data. Gestures, device sensor signals and haptic effect data generated by both users are stored or encoded into the data file as a combined collaborative document which can subsequently be communicated between the users or to other users for further input, modification or collaboration. Although the above example describes a digital drawing surface, it will be recognized that many other types of user gestures and device sensor data may be stored or encoded with haptic effect signals in any type of data file in virtually any format, without limitation.
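
A hedged sketch of the encode/communicate/replay round trip is shown below. JSON, the field names and the StubActuator interface are assumptions chosen for illustration; the patent only requires a format that can carry the image data together with gesture, sensor and haptic metadata:

```python
import json

class StubActuator:
    """Stand-in for a device haptic actuator driver (hypothetical interface)."""
    def play(self, magnitude, duration_ms):
        print(f"haptic pulse: magnitude={magnitude}, duration={duration_ms} ms")

def encode_frost_document(path, strokes, haptic_events):
    """Store gesture strokes and their haptic effect data in a shared data file."""
    with open(path, "w") as f:
        json.dump({"strokes": strokes, "haptics": haptic_events}, f)

def decode_and_play(path, actuator):
    """Read the shared file on the second device and replay the haptic effect."""
    with open(path) as f:
        doc = json.load(f)
    for event in doc["haptics"]:
        actuator.play(magnitude=event["magnitude"], duration_ms=event["duration_ms"])
    return doc["strokes"]  # the stored image would be redrawn from the stroke data

encode_frost_document("frost_drawing.json",
                      strokes=[[(10, 10), (40, 35)]],
                      haptic_events=[{"magnitude": 0.8, "duration_ms": 30}])
decode_and_play("frost_drawing.json", StubActuator())
```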



FIG. 7 is a screen view of a user initiated dynamic haptic effect according to one embodiment of the present invention. A filmstrip application for displaying or selecting photographs is shown running at the bottom of a handheld or mobile device having a touch sensitive surface and a haptic actuator. By using gestures or device sensor data, a user may scroll the filmstrip from left to right or right to left, and the filmstrip application may then dynamically provide a haptic effect for a first photograph 701 which is different from a haptic effect for a second photograph 703 based upon the gestures or device sensor data. Once the user has initiated the selection of a photograph through a gesture, the system may provide an animation to visually show the filmstrip in motion along with a corresponding haptic animation component. Subsequent user gestures or device sensor information received during the filmstrip animation may cause the haptic effect to change along with any associated change in the animation. For example, if the filmstrip animation is moving too slow or too fast, the user may speed it up or slow it down in real time with a gesture and the corresponding haptic effect component will also change dynamically in real time along with the animation.



FIGS. 8A-8E are screen views of applying a haptic effect concurrently to multiple devices according to one embodiment of the present invention. FIG. 8A shows a screen view of a haptic enabled handheld or mobile device of a first user 801, along with a visual thumbnail view of a second user 803 also having a haptic enabled handheld or mobile device. The first and second devices may be connected in real time via any type of communication link, including but not limited to electronic, cellular, wireless, Wi-Fi, optical, infrared, acoustic, Bluetooth, USB, FireWire, Thunderbolt or Ethernet.



FIG. 8B shows the first user selecting an application to share photographs between the two users. Upon selecting the application, FIG. 8C shows the first photograph in the album, and FIG. 8D shows the first user applying a scrolling gesture to select the second photograph in the album by scrolling the photos from right to left. A corresponding haptic effect is provided to the first user during the scrolling gesture. Because the first and second devices are connected in real time via the communication link, FIG. 8E shows the screen view of the second user which visually shows the same photograph as being displayed concurrently to the first user. Because of the real time link between the two devices, the second user is able to concurrently view the same photos as the first user. The second user also experiences in real time a similar haptic effect for each gesture and photo as provided for the first user. In one embodiment, user gestures and haptic effects generated by the second user may be optionally communicated concurrently to the first user via the communication link, creating a real time bi-directional haptic link between the first and second devices. For example, the first user may scroll to the second photo, the second user may then scroll to the third photo, and so on. It will be recognized that many other types of user gestures, device sensor data and haptic effects may be communicated between two or more devices in real time without limitation.



FIG. 9 is a flow diagram for producing a dynamic haptic effect with a gesture signal and a device sensor signal according to one embodiment of the present invention. In one embodiment, the functionality of the flow diagram of FIG. 9 is implemented by software stored in memory or other computer readable or tangible medium, and executed by a processor. In other embodiments, the functionality may be performed by hardware (e.g., through the use of an application specific integrated circuit (“ASIC”), a programmable gate array (“PGA”), a field programmable gate array (“FPGA”), etc.), or any combination of hardware and software.


At 901, the system receives input of a device sensor signal at time T1, and at 903 the system receives input of a gesture signal at time T2. Time T1 and time T2 may occur simultaneously or non-simultaneously with each other and in any order. Multiple additional gesture inputs or device sensor inputs may be used to give greater precision to the dynamic haptic effect or to provide the dynamic haptic effect over a greater period of time. The gesture signals and the device sensor signals may be received in any order or time sequence, either sequentially with non-overlapping time periods or in parallel with overlapping or concurrent time periods. At 905, the device sensor signal is compared to a haptic effect signal to generate a device sensor difference vector. At 907, the gesture signal is compared to a haptic effect signal to generate a gesture difference vector. At 909, an animation or physical model description may optionally be received. At 911, an interaction parameter is generated using the gesture difference vector, the device sensor difference vector, and optionally the animation or physical model description. It will be recognized that any type of input synthesis method may be used to generate the interaction parameter from one or more gesture signals or device sensor signals including, but not limited to, the method of synthesis examples listed in TABLE 2 below. At 913, a drive signal is applied to a haptic actuator according to the interaction parameter.
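
The flow of FIG. 9 can be sketched as follows under simplifying assumptions (two-dimensional vector inputs, a fixed reference haptic effect signal, and an arbitrary scaling constant); additive synthesis of the two difference magnitudes stands in here for the broader set of methods listed in TABLE 2 below:

```python
import math

def magnitude_2d(v):
    return math.hypot(v[0], v[1])

def produce_dynamic_effect(sensor_signal, gesture_signal, haptic_reference=(0.0, 0.0)):
    """Sketch of the FIG. 9 flow, not the patent's implementation.

    Each input is compared with a reference haptic effect signal to form a
    difference vector (steps 905, 907), the two differences are combined
    into an interaction parameter (911), and the parameter is returned as a
    normalized drive level to be applied to the actuator (913).
    """
    sensor_diff = (sensor_signal[0] - haptic_reference[0],
                   sensor_signal[1] - haptic_reference[1])
    gesture_diff = (gesture_signal[0] - haptic_reference[0],
                    gesture_signal[1] - haptic_reference[1])
    # Additive synthesis of the two difference magnitudes, scaled into [0, 1].
    interaction_parameter = min(1.0, 0.5 * (magnitude_2d(sensor_diff) +
                                            magnitude_2d(gesture_diff)) / 100.0)
    return interaction_parameter  # would scale the actuator drive signal

print(produce_dynamic_effect(sensor_signal=(12.0, -4.0), gesture_signal=(80.0, 35.0)))
```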









TABLE 2
METHODS OF SYNTHESIS

Additive synthesis - combining inputs, typically of varying amplitudes
Subtractive synthesis - filtering of complex signals or multiple signal inputs
Frequency modulation synthesis - modulating a carrier wave signal with one or more operators
Sampling - using recorded inputs as input sources subject to modification
Composite synthesis - using artificial and sampled inputs to establish a resultant “new” input
Phase distortion - altering the speed of waveforms stored in wavetables during playback
Waveshaping - intentional distortion of a signal to produce a modified result
Resynthesis - modification of digitally sampled inputs before playback
Granular synthesis - combining of several small input segments into a new input
Linear predictive coding - similar technique as used for speech synthesis
Direct digital synthesis - computer modification of generated waveforms
Wave sequencing - linear combinations of several small segments to create a new input
Vector synthesis - technique for fading between any number of different input sources
Physical modeling - mathematical equations of the physical characteristics of virtual motion










FIG. 10 is a flow diagram for concurrently applying a haptic effect to multiple devices according to one embodiment of the present invention. At 1001, the system enables a unidirectional or bidirectional communication link between a first device having a first haptic actuator and a second device having a second haptic actuator. At 1003, the system receives input of a first gesture signal or device sensor signal from the first device and communicates it to the second device via the communication link. At 1005, the system optionally receives input of a second gesture signal or device sensor signal from the second device and communicates it to the first device via the communication link. At 1007, an interaction parameter is generated using the first gesture or device sensor signal and the optional second gesture or device sensor signal. At 1009, a drive signal is concurrently applied to the haptic actuator on the first device and the second haptic actuator on the second device according to the interaction parameter. In one embodiment, the interaction parameter is generated independently on each device. In another embodiment, the interaction parameter is generated once on one device and then communicated to the other device via the communication link.
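
A minimal sketch of this flow is given below; the gesture-speed normalization is an assumption, and the communication link is reduced to a direct function call so the example stays self-contained:

```python
def generate_interaction_parameter(first_gesture_speed, second_gesture_speed=None):
    """Combine the local gesture with the optional remote gesture (step 1007)."""
    speeds = [first_gesture_speed]
    if second_gesture_speed is not None:
        speeds.append(second_gesture_speed)
    return min(1.0, max(speeds) / 2000.0)   # illustrative normalization

def apply_concurrently(first_drive, second_drive,
                       first_gesture_speed, second_gesture_speed=None):
    """Steps 1003-1009: exchange gesture data and drive both actuators.

    Here the "communication link" is just a function call; a real system
    would carry the gesture or sensor signal over Bluetooth, Wi-Fi, USB,
    etc., and each device could equally compute the parameter on its own.
    """
    parameter = generate_interaction_parameter(first_gesture_speed,
                                               second_gesture_speed)
    first_drive(parameter)    # drive signal on the first device (step 1009)
    second_drive(parameter)   # concurrently applied on the second device

apply_concurrently(lambda m: print("device 1 drive magnitude:", m),
                   lambda m: print("device 2 drive magnitude:", m),
                   first_gesture_speed=1500.0, second_gesture_speed=400.0)
```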



FIG. 11 is a flow diagram for encoding and applying a haptic effect using a data file according to one embodiment of the present invention. At 1101, the system receives input of a gesture signal or device sensor signal from a first device. At 1103, the gesture or device sensor signal is stored or encoded into a data file on the first device. At 1105, the data file is communicated to a second device having a haptic actuator via any file transfer mechanism or communication link. At 1107, the second device reads the stored gesture or device sensor signal from the data file on the second device. At 1109, a drive signal is applied to the haptic actuator on the second device according to the gesture or device sensor signal.


Several embodiments are specifically illustrated and/or described herein. However, it will be appreciated that modifications and variations of the disclosed embodiments are covered by the above teachings and within the purview of the appended claims without departing from the spirit and intended scope of the invention.

Claims
  • 1. A method of producing a haptic effect comprising: receiving a first gesture signal; receiving a second gesture signal; generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; using a method of synthesis to further generate the dynamic interaction parameter to give greater precision to the haptic effect; and applying a drive signal to a haptic output device according to the dynamic interaction parameter.
  • 2. The method of claim 1 wherein the first or second gesture signal comprises a vector signal.
  • 3. The method of claim 1 wherein the first or second gesture signal comprises an on-screen signal.
  • 4. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.
  • 5. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.
  • 6. The method of claim 1 wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.
  • 7. The method of claim 1 further comprising: receiving a first device sensor signal; receiving a second device sensor signal; and wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.
  • 8. The method of claim 1 wherein the first device sensor signal or the second device sensor signal comprises an accelerometer signal.
  • 9. The method of claim 1 wherein the first device sensor signal or the second device sensor signal comprises a gyroscope signal.
  • 10. The method of claim 1 wherein the first device sensor signal or the second device sensor signal comprises an ambient signal.
  • 11. The method of claim 1 wherein the first device sensor signal or the second device sensor signal comprises a virtual sensor signal.
  • 12. A haptic effect enabled system comprising: a haptic output device; a drive module electronically coupled to the haptic output device for receiving a first gesture signal, receiving a second gesture signal, and generating a dynamic interaction parameter according to the first gesture signal and the second gesture signal using a method of synthesis to further generate the dynamic interaction parameter to give greater precision to the haptic effect; and a drive circuit electronically coupled to the drive module and the haptic output device for applying a drive signal to a haptic output device according to the dynamic interaction parameter.
  • 13. The system of claim 12 wherein the first or second gesture signal comprises a vector signal.
  • 14. The system of claim 12 wherein the first or second gesture signal comprises an on-screen signal.
  • 15. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.
  • 16. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.
  • 17. The system of claim 12 wherein the drive module comprises a drive module for generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.
  • 18. The system of claim 12 wherein the drive module comprises a drive module for receiving a first device sensor signal, receiving a second device sensor signal, and generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.
  • 19. The system of claim 12 wherein the first device sensor signal or the second device sensor signal comprises an accelerometer signal.
  • 20. The system of claim 12 wherein the first device sensor signal or the second device sensor signal comprises a gyroscope signal.
  • 21. The system of claim 12 wherein the first device sensor signal or the second device sensor signal comprises an ambient signal.
  • 22. The system of claim 12 wherein the first device sensor signal or the second device sensor signal comprises a virtual sensor signal.
  • 23. A non-transitory computer readable medium having instructions stored thereon that, when executed by a processor, causes the processor to produce a haptic effect, the instructions comprising: receiving a first gesture signal; receiving a second gesture signal; generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; using a method of synthesis to further generate the dynamic interaction parameter to give greater precision to the haptic effect; and applying a drive signal to a haptic output device according to the dynamic interaction parameter.
  • 24. The non-transitory computer readable medium of claim 23, wherein the first or second gesture signal comprises a vector signal.
  • 25. The non-transitory computer readable medium of claim 23, wherein the first or second gesture signal comprises an on-screen signal.
  • 26. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter from a difference between the first gesture signal and the second gesture signal.
  • 27. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and a physical model.
  • 28. The non-transitory computer readable medium of claim 23, wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and an animation.
  • 29. The non-transitory computer readable medium of claim 23, further comprising: receiving a first device sensor signal; receiving a second device sensor signal; and wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal.
  • 30. The non-transitory computer readable medium of claim 23, wherein the first device sensor signal or the second device sensor signal comprises a signal selected from the list consisting of accelerometer, gyroscope, ambient, or virtual.
US Referenced Citations (137)
Number Name Date Kind
5666499 Baudel et al. Sep 1997 A
5825308 Rosenberg Oct 1998 A
6061004 Rosenberg May 2000 A
6088019 Rosenberg Jul 2000 A
6100874 Schena et al. Aug 2000 A
6166723 Schena et al. Dec 2000 A
6211861 Rosenberg et al. Apr 2001 B1
6252579 Rosenberg et al. Jun 2001 B1
6278439 Rosenberg et al. Aug 2001 B1
6300936 Braun et al. Oct 2001 B1
6337678 Fish Jan 2002 B1
6429846 Rosenberg et al. Aug 2002 B2
6448977 Braun et al. Sep 2002 B1
6717573 Shahoian et al. Apr 2004 B1
6819312 Fish Nov 2004 B2
7024625 Shalit Apr 2006 B2
7084854 Moore et al. Aug 2006 B1
7088342 Rekimoto et al. Aug 2006 B2
7113177 Franzen Sep 2006 B2
7205978 Poupyrev et al. Apr 2007 B2
7336260 Martin et al. Feb 2008 B2
7446456 Maruyama et al. Nov 2008 B2
7456823 Poupyrev et al. Nov 2008 B2
7468573 Dai et al. Dec 2008 B2
7528508 Bruwer May 2009 B2
7554246 Maruyama et al. Jun 2009 B2
7592999 Rosenberg et al. Sep 2009 B2
7663604 Maruyama et al. Feb 2010 B2
7755607 Poupyrev et al. Jul 2010 B2
7808488 Martin et al. Oct 2010 B2
7821498 Kramer et al. Oct 2010 B2
7825903 Anastas et al. Nov 2010 B2
RE42064 Fish Jan 2011 E
7890863 Grant et al. Feb 2011 B2
7920131 Westerman Apr 2011 B2
7924144 Makinen et al. Apr 2011 B2
7969288 Braun et al. Jun 2011 B2
7973769 Olien Jul 2011 B2
7978181 Westerman Jul 2011 B2
7982588 Makinen et al. Jul 2011 B2
7982720 Rosenberg et al. Jul 2011 B2
8004492 Kramer et al. Aug 2011 B2
8031181 Rosenberg et al. Oct 2011 B2
8035623 Bruwer Oct 2011 B2
8059105 Rosenberg et al. Nov 2011 B2
8098235 Heubel et al. Jan 2012 B2
20010035854 Rosenberg et al. Nov 2001 A1
20020015024 Westerman et al. Feb 2002 A1
20020044132 Fish Apr 2002 A1
20020177471 Kaaresoja et al. Nov 2002 A1
20030063128 Salmimaa et al. Apr 2003 A1
20030100969 Jones May 2003 A1
20030162595 Serbanescu Aug 2003 A1
20030206202 Moriya Nov 2003 A1
20040002902 Muehlhaeuser Jan 2004 A1
20050057528 Kleen Mar 2005 A1
20050179617 Matsui et al. Aug 2005 A1
20050212760 Marvit et al. Sep 2005 A1
20050245302 Bathiche et al. Nov 2005 A1
20060022952 Ryynanen Feb 2006 A1
20060026536 Hotelling et al. Feb 2006 A1
20060061545 Hughes et al. Mar 2006 A1
20060097991 Hotelling et al. May 2006 A1
20060119586 Grant et al. Jun 2006 A1
20060181510 Faith Aug 2006 A1
20060192760 Moore et al. Aug 2006 A1
20060197752 Hurst et al. Sep 2006 A1
20060255683 Suzuki et al. Nov 2006 A1
20060256074 Krum et al. Nov 2006 A1
20060279476 Obata Dec 2006 A1
20060279542 Flack et al. Dec 2006 A1
20060284849 Grant et al. Dec 2006 A1
20070066283 Haar et al. Mar 2007 A1
20070139366 Dunko et al. Jun 2007 A1
20070150826 Anzures et al. Jun 2007 A1
20070152984 Ording et al. Jul 2007 A1
20070236450 Colgate et al. Oct 2007 A1
20070247429 Westerman Oct 2007 A1
20070247442 Andre et al. Oct 2007 A1
20070265096 Kouno et al. Nov 2007 A1
20070279392 Rosenberg et al. Dec 2007 A1
20080024459 Pupyrev et al. Jan 2008 A1
20080055277 Takenaka et al. Mar 2008 A1
20080060856 Shahoian et al. Mar 2008 A1
20080068334 Olien et al. Mar 2008 A1
20080088580 Poupyrev et al. Apr 2008 A1
20080111788 Rosenberg et al. May 2008 A1
20080180406 Han et al. Jul 2008 A1
20080216001 Ording et al. Sep 2008 A1
20080287147 Grant et al. Nov 2008 A1
20080300055 Lutnick et al. Dec 2008 A1
20080303782 Grant et al. Dec 2008 A1
20090002328 Ullrich et al. Jan 2009 A1
20090079550 Makinen et al. Mar 2009 A1
20090085878 Heubel et al. Apr 2009 A1
20090109007 Makinen et al. Apr 2009 A1
20090128503 Grant et al. May 2009 A1
20090137269 Chung May 2009 A1
20090166098 Sunder Jul 2009 A1
20090167508 Fadell et al. Jul 2009 A1
20090167509 Fadell et al. Jul 2009 A1
20090167704 Terlizzi et al. Jul 2009 A1
20090250267 Heubel et al. Oct 2009 A1
20090256817 Perlin et al. Oct 2009 A1
20090270046 Lai Oct 2009 A1
20090284485 Colgate et al. Nov 2009 A1
20090315830 Westerman Dec 2009 A1
20100013653 Birnbaum et al. Jan 2010 A1
20100013761 Birnbaum et al. Jan 2010 A1
20100017489 Birnbaum et al. Jan 2010 A1
20100017759 Birnbaum et al. Jan 2010 A1
20100045619 Birnbaum et al. Feb 2010 A1
20100085169 Poupyrev et al. Apr 2010 A1
20100108408 Colgate et al. May 2010 A1
20100127819 Radivojevic et al. May 2010 A1
20100149134 Westerman et al. Jun 2010 A1
20100156818 Burrough et al. Jun 2010 A1
20100214243 Birnbaum et al. Aug 2010 A1
20100231539 Cruz-Hernandez et al. Sep 2010 A1
20100245254 Olien et al. Sep 2010 A1
20100265208 Kim et al. Oct 2010 A1
20100313124 Privault et al. Dec 2010 A1
20100328053 Yeh et al. Dec 2010 A1
20110021272 Grant et al. Jan 2011 A1
20110025609 Modarres et al. Feb 2011 A1
20110043454 Modarres et al. Feb 2011 A1
20110043527 Ording et al. Feb 2011 A1
20110102340 Martin et al. May 2011 A1
20110105103 Ullrich May 2011 A1
20110109588 Makinen et al. May 2011 A1
20110138277 Grant et al. Jun 2011 A1
20110260988 Colgate et al. Oct 2011 A1
20110264491 Birnbaum et al. Oct 2011 A1
20110267181 Kildal Nov 2011 A1
20120068957 Puskarich et al. Mar 2012 A1
20120081276 Ullrich et al. Apr 2012 A1
20120105333 Maschmeyer et al. May 2012 A1
Foreign Referenced Citations (19)
Number Date Country
200 19 074 Feb 2001 DE
0 899 650 Mar 1999 EP
1 691 263 Nov 2003 EP
1 401 185 Mar 2004 EP
1 731 993 Dec 2006 EP
2 910 160 Jun 2008 FR
2 416 962 Feb 2006 GB
WO 9720305 Jun 1997 WO
WO 9938064 Jul 1999 WO
WO 2004044728 May 2004 WO
WO 2004075169 Sep 2004 WO
WO 2004081776 Sep 2004 WO
WO 2005103863 Nov 2005 WO
WO 2008132540 Nov 2008 WO
WO 2009037379 Mar 2009 WO
WO 2009071750 Jun 2009 WO
WO 2009074185 Jun 2009 WO
WO 2009141502 Nov 2009 WO
WO 2010088477 Aug 2010 WO
Non-Patent Literature Citations (18)
Entry
Biet, M. et al., Discrimination of Virtual Square Gratings by Dynamic Touch on Friction Based Tactile Displays, Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2008, Symposium, IEEE, Piscataway, NJ, XP 031339918, pp. 41-48, Mar. 2008. ISBN: 978-1-4244-2005-6.
Biet, M. et al., “New Tactile Devices Using Piezoelectric Actuators”, L2EP, University of Lille, Polytech-Lille, Actuator 2006, 10th International Conference on New Actuators, Jun. 14-16, 2006, Bremen, Germany, pp. 989-992.
Bonderud, Doug, “Nokia Files Patent to Make Phones Digitally Stimulating”, InventorSpot.com [online], [retrieved Apr. 21, 2011]. Retrieved from the Internet <URL: http://inventorspot.com/articles/nokia_files_patent_make_phones_digitally_stimulating>.
Buxton, Bill, “Multi-Touch Systems that I have Known and Loved”, Microsoft Research; Original: Jan. 12, 2007; Version: Jul. 17, 2007; <URL: http://www.billbuxton.com/multitouchOverview.html> pp. -16.
Chang et al., ComTouch: Design of a Vibrotactile Communication Device, DIS2002, London Copyright 2002 ACM 1-58113-2-9-0/00/0008, 10 pages.
Dewitt, A., Designing Sonification of User data in Affective Interaction, Master of Science Thesis Stockholm, Sweden, XP 002551466, at URL: http://w3.nada.kth.se/utbildning/grukth/exjobb/rapportlistor/rapporter07/de_witt_anna_07142.pdf, as available via the Internet and printed on Oct. 20, 2009.
Greene, Kate, “A Touch Screen with Texture”, Technology Review [online], [retrieved Apr. 21, 2011]. Retrieved from the Internet <URL: http://www.technologyreview.com/printer_friendly_article.aspx?id=26506>.
Hsin-Un Yao et al.; “An Experiment on Length Perception with a Virtual Rolling Stone”; Proc. Eurohaptics 2006; pp. 325-330.
http://en.wikipedia.org/w/index.php?title=Gesture_recognition&printable=yes; Jul. 23, 2010.
Iwamoto, T. et al., “Airborne Ultrasound Tactile Display”, The University of Tokyo, SIGGRAPH 2008, Los Angeles, California, Aug. 11-15, 2008, ISBN 978-1-60558-466-9/08/0008.
Iwamoto, T. et al., “Non-contact Method for Producing Tactile Sensation Using Airborned Ultrasound”, Department of Information Physics and Computing Graduate School of Information Science and Technology the University of Tokyo, EuroHaptics 2008, LNCS 5024, pp. 504-513, Springer-Verlag Berlin Heidelberg 2008.
Kaaresoja, T. et al., Snap-Crackle-Pop: Tactile Feedback for Mobile Touch Screens, Proceedings of the Eurohaptics 2006, XP 002551465, at http://lsc.univ-evry.fr/~eurohaptics/upload/cd/papers/f80, as available via the Internet and printed Oct. 20, 2009.
Marks, Paul, “Nokia touchscreen creates texture illusion”, New Scientist [online], [retrieved Apr. 21, 2011]. Retrieved from the Internet <URL: http://www.newscientist.com/article/dn19510-nokia-touchscreen-cretes-texture-illusion.html>.
Oakley, I. et al., Contact IM: Exploring Asynchronous Touch over Distance, Palpable Machines Research Group, Media Lab Europe, XP 007910188, at http://people.cs.vt.edu/~wangr06/touch%20review%20origization/Oak002 as available via the Internet and printed Oct. 20, 2009.
Rovers, A. et al., HIM: A Framework for Haptic Instant Messaging, CHI 2004 (CHI Conference Proceedings. Human Factors in Computing Systems), XP 002464573, Vienna Austria, Apr. 2004, p. 1313-1316.
Sekiguchi, Y. et al., Haptic Interface using Estimation of Box Contents Metaphor, Proceedings of ICAT 2003, Tokyo, Japan, XP 002551467, at http://www.vrsj.org/ic-at/papers/2003/00947-00000, as available via the Internet and printed Oct. 20, 2009.
Smith, Ned, “Feel the future: Touch screens that touch back”, MSNBC [online], [retrieved Apr. 21, 2011]. Retrieved from the Internet <URL: http://www.msnbc.msn.com/id/40845743/ns/technology_and_science-tech_and_gadgets/t/feel-future-touch-screens-touch-back/#.T5clU7ONfEY>.
Williamson, J. et al., Shoogle: Excitatory Multimodal Interaction on Mobile Devices, CHI 2007 Proceedings—Shake, Rattle and Roll: New Forms of Input and Output, 2007, pp. 121-124, XP 002549378.
Related Publications (1)
Number Date Country
20130207904 A1 Aug 2013 US