This application claims the benefit of priority under 35 USC § 120 to copending application Ser. No. 15/046,701, filed Feb. 18, 2016, which claims the benefit of priority to application Ser. No. 14/518,115, U.S. Pat. No. 9,268,403, filed Oct. 20, 2014, which claims the benefit of priority to application Ser. No. 14/444,416, U.S. Pat. No. 8,866,788, filed Jul. 28, 2014, which claims the benefit of priority to application Ser. No. 14/176,421, U.S. Pat. No. 8,823,674, filed Feb. 10, 2014, which claims the benefit of priority to application Ser. No. 13/773,191, U.S. Pat. No. 8,659,571, filed Feb. 21, 2013, which claims the benefit of priority to application Ser. No. 13/592,685, U.S. Pat. No. 8,493,354, filed Aug. 23, 2012, which claims the benefit of priority to application Ser. No. 13/472,709, U.S. Pat. No. 8,279,193, filed May 16, 2012, which claims the benefit of priority to application Ser. No. 13/397,142, U.S. Pat. No. 8,711,118, filed Feb. 15, 2012.
One embodiment is directed generally to a user interface for a device, and in particular to producing a dynamic haptic effect using multiple gesture signals and real or virtual device sensor signals.
Electronic device manufacturers strive to produce a rich interface for users. Conventional devices use visual and auditory cues to provide feedback to a user. In some interface devices, kinesthetic feedback (such as active and resistive force feedback) and/or tactile feedback (such as vibration, texture, and heat) is also provided to the user; such feedback is known collectively as “haptic feedback” or “haptic effects”. Haptic feedback can provide cues that enhance and simplify the user interface. Specifically, vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users of electronic devices to alert the user to specific events, or to provide realistic feedback to create greater sensory immersion within a simulated or virtual environment.
In order to generate vibration effects, many devices utilize some type of actuator or haptic output device. Known haptic output devices used for this purpose include an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or a “smart material” such as piezoelectric, electro-active polymers or shape memory alloys. Haptic output devices also broadly include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on.
Traditional architectures that provide haptic feedback only with triggered effects are available, and must be carefully designed to make sure the timing of the haptic feedback is correlated to user initiated gestures or system animations. However, because these user gestures and system animations have variable timing, the correlation to haptic feedback may be static and inconsistent and therefore less compelling to the user. Further, device sensor information is typically not used in combination with gestures to produce haptic feedback.
Therefore, there is a need for an improved system of providing a dynamic haptic effect that includes multiple gesture signals and device sensor signals. There is a further need for providing concurrent haptic feedback to multiple devices which are connected via a communication link.
One embodiment is a system that produces a dynamic haptic effect and generates a drive signal that includes a gesture signal and a real or virtual device sensor signal. The haptic effect is modified dynamically based on both the gesture signal and the real or virtual device sensor signal such as from an accelerometer or gyroscope, or by a signal created from processing data such as still images, video or sound. The haptic effect may optionally be modified dynamically by using the gesture signal and the real or virtual device sensor signal and a physical model. The haptic effect may optionally be applied concurrently to multiple devices which are connected via a communication link. The haptic effect may optionally be encoded into a data file on a first device. The data file is then communicated to a second device and the haptic effect is read from the data file and applied to the second device.
As described below, a dynamic haptic effect refers to a haptic effect that evolves over time as it responds to one or more input parameters. Dynamic haptic effects are haptic or vibrotactile effects displayed on haptic devices to represent a change in state of a given input signal. The input signal can be a signal captured by sensors on the device with haptic feedback, such as position, acceleration, pressure, orientation, or proximity, or signals captured by other devices and sent to the haptic device to influence the generation of the haptic effect.
A dynamic effect signal can be any type of signal, but does not necessarily have to be complex. For example, a dynamic effect signal may be a simple sine wave that has some property such as phase, frequency, or amplitude that is changing over time or reacting in real time according to a mapping schema which maps an input parameter onto a changing property of the effect signal. An input parameter may be any type of input capable of being provided by a device, and typically may be any type of signal such as a device sensor signal. A device sensor signal may be generated by any means, and typically may be generated by capturing a user gesture with a device. Dynamic effects may be very useful for gesture interfaces, but the use of gestures or sensors are not necessarily required to create a dynamic signal.
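By way of illustration only, the mapping schema described above can be sketched as follows; the function name, the normalized sensor range, and the base frequency are assumptions made for the sketch and are not part of any embodiment.

```python
import math

def dynamic_effect_sample(t, sensor_value, base_freq_hz=175.0, max_amp=1.0):
    """Return one sample of an illustrative dynamic effect signal at time t (seconds).

    The input parameter (sensor_value, assumed normalized to 0.0-1.0) is mapped
    onto two changing properties of a simple sine effect signal: its amplitude
    and its frequency. As the input changes over time, the effect evolves with it.
    """
    amplitude = max_amp * sensor_value                      # input -> amplitude
    frequency = base_freq_hz * (1.0 + 0.25 * sensor_value)  # input -> frequency
    return amplitude * math.sin(2.0 * math.pi * frequency * t)
```

A more elaborate schema could map different inputs onto phase, envelope, or duration in the same manner.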
One common scenario that does not involve gestures directly is defining the dynamic haptic behavior of an animated widget. For example, when a user scrolls a list, it is not typically the haptification of the gesture that will feel most intuitive, but instead the motion of the widget in response to the gesture. In the scroll list example, gently sliding the list may generate a dynamic haptic feedback that changes according to the speed of the scrolling, but flinging the scroll bar may produce dynamic haptics even after the gesture has ended. This creates the illusion that the widget has some physical properties and it provides the user with information about the state of the widget such as its velocity or whether it is in motion.
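The scroll-list behavior can be illustrated with a minimal sketch; the friction constant and the speed-to-magnitude scale below are assumptions, not values taken from any embodiment.

```python
def scroll_haptic_step(velocity_px_s, touching, friction=0.95):
    """Illustrative per-frame update of a scrolling widget's haptic magnitude.

    While the finger is down, the haptic magnitude simply tracks the scroll
    speed. After a fling the widget keeps coasting, so the velocity (and the
    haptic output derived from it) decays under a toy friction model, which is
    why dynamic haptics can continue after the gesture has ended.
    """
    if not touching:
        velocity_px_s *= friction                      # widget coasts after the fling
    magnitude = min(1.0, abs(velocity_px_s) / 2000.0)  # assumed speed-to-magnitude scale
    return velocity_px_s, magnitude
```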
A gesture is any movement of the body that conveys meaning or user intent. It will be recognized that simple gestures may be combined to form more complex gestures. For example, bringing a finger into contact with a touch sensitive surface may be referred to as a “finger on” gesture, while removing a finger from a touch sensitive surface may be referred to as a separate “finger off” gesture. If the time between the “finger on” and “finger off” gestures is relatively short, the combined gesture may be referred to as “tapping”; if the time between the “finger on” and “finger off” gestures is relatively long, the combined gesture may be referred to as “long tapping”; if the distance between the two dimensional (x,y) positions of the “finger on” and “finger off” gestures is relatively large, the combined gesture may be referred to as “swiping”; if the distance between the two dimensional (x,y) positions of the “finger on” and “finger off” gestures is relatively small, the combined gesture may be referred to as “smearing”, “smudging” or “flicking”. Any number of two dimensional or three dimensional simple or complex gestures may be combined in any manner to form any number of other gestures, including, but not limited to, multiple finger contacts, palm or fist contact, or proximity to the device. A gesture can also be any form of hand movement recognized by a device having an accelerometer, gyroscope, or other motion sensor, and converted to electronic signals. Such electronic signals can activate a dynamic effect, such as shaking virtual dice, where the sensor captures the user intent that generates a dynamic effect.
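The combination of “finger on” and “finger off” events into a single gesture can be sketched as below; the thresholds, the tie-breaking order, and the function name are illustrative assumptions only.

```python
import math

TAP_MAX_S = 0.3      # assumed boundary between "tapping" and "long tapping"
SWIPE_MIN_PX = 50.0  # assumed boundary between "swiping" and "smearing"/"flicking"

def combine_finger_events(on_xy, on_t, off_xy, off_t):
    """Classify a "finger on"/"finger off" pair as one combined gesture."""
    dt = off_t - on_t
    dist = math.hypot(off_xy[0] - on_xy[0], off_xy[1] - on_xy[1])
    if dist >= SWIPE_MIN_PX:
        return "swiping"
    if dist > 0.0 and dt > TAP_MAX_S:
        return "smearing"                # small movement over a longer contact
    return "tapping" if dt <= TAP_MAX_S else "long tapping"
```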
The haptic feedback system includes a processor 12. Coupled to processor 12 is a memory 20 and an actuator drive circuit 16, which is coupled to a haptic actuator 18. Processor 12 may be any type of general purpose processor, or could be a processor specifically designed to provide haptic effects, such as an application-specific integrated circuit (“ASIC”). Processor 12 may be the same processor that operates the entire system 10, or may be a separate processor. Processor 12 can decide what haptic effects are to be played and the order in which the effects are played based on high level parameters. In general, the high level parameters that define a particular haptic effect include magnitude, frequency and duration. Low level parameters such as streaming motor commands could also be used to determine a particular haptic effect. A haptic effect may be considered dynamic if it includes some variation of these parameters when the haptic effect is generated or a variation of these parameters based on a user's interaction.
Processor 12 outputs the control signals to drive circuit 16, which includes electronic components and circuitry used to supply actuator 18 with the required electrical current and voltage to cause the desired haptic effects. System 10 may include more than one actuator 18, and each actuator may include a separate drive circuit 16, all coupled to a common processor 12. Memory device 20 can be any type of storage device or computer-readable medium, such as random access memory (RAM) or read-only memory (ROM). Memory 20 stores instructions executed by processor 12. Among the instructions, memory 20 includes an actuator drive module 22, which comprises instructions that, when executed by processor 12, generate drive signals for actuator 18 while also determining feedback from actuator 18 and adjusting the drive signals accordingly. The functionality of module 22 is discussed in more detail below. Memory 20 may also be located internal to processor 12, or any combination of internal and external memory.
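A minimal software sketch of the role of actuator drive module 22, assuming a simple proportional correction from measured feedback (the gain, limits, and method names are assumptions), might look like this:

```python
class ActuatorDriveModule:
    """Illustrative stand-in for actuator drive module 22.

    It produces a drive level for the actuator and then adjusts that level
    using feedback from the actuator (for example, a measured vibration
    amplitude), mirroring the generate-then-adjust behavior described above.
    """
    def __init__(self, gain=0.1):
        self.gain = gain
        self.drive = 0.0

    def update(self, target_amplitude, measured_amplitude):
        error = target_amplitude - measured_amplitude      # feedback from actuator 18
        self.drive = max(0.0, min(1.0, self.drive + self.gain * error))
        return self.drive                                  # value passed to drive circuit 16
```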
Touch surface 11 recognizes touches, and may also recognize the position and magnitude or pressure of touches on the surface. The data corresponding to the touches is sent to processor 12, or another processor within system 10, and processor 12 interprets the touches and in response generates haptic effect signals. Touch surface 11 may sense touches using any sensing technology, including capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, etc. Touch surface 11 may sense multi-touch contacts and may be capable of distinguishing multiple touches that occur at the same time. Touch surface 11 may be a touchscreen that generates and displays images for the user to interact with, such as keys, dials, etc., or may be a touchpad with minimal or no images.
System 10 may be a handheld device, such as a cellular telephone, PDA, computer tablet, gaming console, etc. or may be any other type of device that provides a user interface and includes a haptic effect system that includes one or more ERMs, LRAs, electrostatic or other types of actuators. The user interface may be a touch sensitive surface, or can be any other type of user interface such as a mouse, touchpad, mini-joystick, scroll wheel, trackball, game pads or game controllers, etc. In embodiments with more than one actuator, each actuator may have a different output capability in order to create a wide range of haptic effects on the device. Each actuator may be any type of haptic actuator or a single or multidimensional array of actuators.
For LRA 18, a mechanical quality factor or “Q factor” can be measured. In general, the mechanical Q factor is a dimensionless parameter that compares a time constant for decay of an oscillating physical system's amplitude to its oscillation period. The mechanical Q factor is significantly affected by mounting variations. The mechanical Q factor represents the ratio of the energy circulated between the mass and spring over the energy lost at every oscillation cycle. A low Q factor means that a large portion of the energy stored in the mass and spring is lost at every cycle. In general, a minimum Q factor occurs when system 10 is held firmly in a hand due to energy being absorbed by the tissues of the hand. The maximum Q factor generally occurs when system 10 is pressed against a hard and heavy surface that reflects all of the vibration energy back into LRA 18.
In direct proportionality to the mechanical Q factor, the forces that occur between magnet/mass 27 and spring 26 at resonance are typically 10-100 times larger than the force that coil 28 must produce to maintain the oscillation. Consequently, the resonant frequency of LRA 18 is mostly defined by the mass of magnet 27 and the compliance of spring 26. However, when an LRA is mounted to a floating device (i.e., system 10 held softly in a hand), the LRA resonant frequency shifts up significantly. Further, significant frequency shifts can occur due to external factors affecting the apparent mounting weight of LRA 18 in system 10, such as a cell phone flipped open/closed or the phone held tightly.
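Restated in conventional notation (these are standard mass-spring relations, not equations taken from this disclosure), the Q factor and the resonant frequency of a mass-spring LRA are approximately:

```latex
Q = 2\pi \, \frac{E_{\text{stored}}}{E_{\text{lost per cycle}}},
\qquad
f_0 \approx \frac{1}{2\pi}\sqrt{\frac{k}{m}} = \frac{1}{2\pi}\sqrt{\frac{1}{C\,m}},
```

where m is the mass of magnet/mass 27, k is the spring constant of spring 26, and C = 1/k is its compliance. External mounting conditions alter the effective values of these quantities, which is consistent with the observation that both Q and the resonant frequency shift with how the device is held or mounted.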
One embodiment of the present invention provides haptic feedback by determining and modifying the angular speed of ERM 18. Angular speed is a scalar measure of rotation rate, and represents the magnitude of the vector quantity angular velocity. Angular speed or frequency ω, in radians per second, correlates to frequency ν in cycles per second, also called Hz, by a factor of 2π. The drive signal includes a drive period where at least one drive pulse is applied to ERM 18, and a monitoring period where the back electromotive force (“back EMF”) of the rotating mass 301 is received and used to determine the angular speed of ERM 18. In another embodiment, the drive period and the monitoring period are concurrent and the present invention dynamically determines the angular speed of ERM 18 during both the drive and monitoring periods.
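For illustration, assuming the standard DC-motor relation between back EMF and rotation speed (the constant k_e and the function names are assumptions for this sketch), the angular speed could be estimated during the monitoring period as:

```python
import math

def angular_speed_from_back_emf(back_emf_volts, ke_volts_per_rad_s):
    """Estimate ERM angular speed (rad/s) from a back-EMF reading.

    Uses the relation back_EMF = k_e * omega, so omega = back_EMF / k_e.
    Any filtering or averaging over the monitoring period is omitted here.
    """
    return back_emf_volts / ke_volts_per_rad_s

def rotation_frequency_hz(omega_rad_s):
    """Convert angular speed to rotation frequency using omega = 2 * pi * nu."""
    return omega_rad_s / (2.0 * math.pi)
```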
It is likely that the invention is based on a controlled formation of an electric field between an active surface of the apparatus and the body member, such as a finger, approaching or touching it. The electric field tends to give rise to an opposite charge on the proximate finger. A local electric field and a capacitive coupling can be formed between the charges. The electric field directs a force on the charge of the finger tissue. By appropriately altering the electric field, a force capable of moving the tissue may arise, whereby the sensory receptors sense such movement as vibration.
The airborne ultrasound can be applied directly onto the skin without the risk of penetration. When the airborne ultrasound is applied to the surface of the skin, due to the large difference between the characteristic acoustic impedance of the air and that of the skin, about 99.9% of the incident acoustic energy is reflected at the surface of the skin. Hence, this tactile feedback system does not require users to wear any clumsy gloves or mechanical attachments.
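The roughly 99.9% figure follows from the standard plane-wave reflection coefficient for an air-skin interface; the impedance values below are typical literature values, used here only to illustrate the point:

```latex
R = \left(\frac{Z_{\text{skin}} - Z_{\text{air}}}{Z_{\text{skin}} + Z_{\text{air}}}\right)^{2}
  \approx \left(\frac{1.6\times 10^{6} - 4\times 10^{2}}{1.6\times 10^{6} + 4\times 10^{2}}\right)^{2}
  \approx 0.999,
```

where Z_air ≈ 4×10² rayl and Z_skin ≈ 1.6×10⁶ rayl, so nearly all of the incident acoustic energy is reflected rather than transmitted into the tissue.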
Flexible surface layer 703, in one instance, is made of soft and/or elastic materials such as silicone rubber, which is also known as polysiloxane. A function of the flexible surface layer 703 is to change its surface shape or texture upon contact with the physical pattern of haptic substrate 705. The physical pattern of haptic substrate 705 is variable, as one or more of the local features 110-124 can be raised or lowered to present features that affect the surface of the flexible surface layer 703 upon contact. Once the physical pattern of haptic substrate 705 is determined, the texture of flexible surface layer 703 can change to conform its surface texture to the physical pattern of haptic substrate 705. It should be noted that the deformation of flexible surface layer 703 from one texture to another can be controlled by deforming mechanism 711. For example, when deforming mechanism 711 is not activated, flexible surface layer 703 maintains its smooth configuration floating or sitting over haptic substrate 705. The surface configuration of flexible surface layer 703, however, deforms or changes from a smooth configuration to a coarse configuration when deforming mechanism 711 is activated and the haptic substrate 705 is in contact with the flexible surface layer 703 so as to generate a similar pattern on the top surface of the flexible surface layer 703.
Alternatively, flexible surface layer 703 is a flexible touch sensitive surface, which is capable of accepting user inputs. The flexible touch sensitive surface can be divided into multiple regions wherein each region of the flexible touch sensitive surface can accept an input when the region is being touched or depressed by a finger. In one embodiment, the flexible touch sensitive surface includes a sensor, which is capable of detecting a nearby finger and waking up or turning on the device. Flexible surface layer 703 may also include a flexible display, which is capable of deforming together with flexible surface layer 703. It should be noted that various flexible display technologies can be used to manufacture flexible displays, such as organic light-emitting diode (OLED), organic, or polymer TFT (Thin Film Transistor).
Haptic substrate 705 is a surface reconfigurable haptic device capable of changing its surface pattern in response to one or more pattern activating signals. Haptic substrate 705 can also be referred to as a haptic mechanism, a haptic layer, a tactile element, and the like. Haptic substrate 705, in one embodiment, includes multiple tactile or haptic regions 707, 709, wherein each region can be independently controlled and activated. Since each tactile region can be independently activated, a unique surface pattern of haptic substrate 705 can be composed in response to the pattern activating signals. In another embodiment, every tactile region is further divided into multiple haptic bits wherein each bit can be independently excited or activated or deactivated.
Haptic substrate 705, or a haptic mechanism, in one embodiment, is operable to provide haptic feedback in response to an activating command or signal. Haptic substrate 705 provides multiple tactile or haptic feedbacks wherein one tactile feedback is used for surface deformation, while another tactile feedback is used for input confirmation. Input confirmation is a haptic feedback to inform a user about a selected input. Haptic mechanism 705, for example, can be implemented by various techniques including vibration, vertical displacement, lateral displacement, push/pull technique, air/fluid pockets, local deformation of materials, resonant mechanical elements, piezoelectric materials, micro-electro-mechanical systems (“MEMS”) elements, thermal fluid pockets, MEMS pumps, variable porosity membranes, laminar flow modulation, or the like.
Haptic substrate 705, in one embodiment, is constructed from semi-flexible or semi-rigid materials. In one embodiment, haptic substrate 705 should be more rigid than flexible surface 703 so that the surface texture of flexible surface 703 can conform to the surface pattern of haptic substrate 705. Haptic substrate 705, for example, includes one or more actuators, which can be constructed from fibers (or nanotubes) of electroactive polymers (“EAP”), piezoelectric elements, fibers of shape memory alloys (“SMAs”) or the like. EAP, also known as biological muscles or artificial muscles, is capable of changing its shape in response to an application of voltage. The physical shape of an EAP may be deformed when it sustains a large force. EAP may be constructed from Electrostrictive Polymers, Dielectric elastomers, Conducting Polymers, Ionic Polymer Metal Composites, Responsive Gels, Bucky gel actuators, or a combination of the above-mentioned EAP materials.
SMA (Shape Memory Alloy), also known as memory metal, is another type of material which can be used to construct haptic substrate 705. SMA may be made of copper-zinc-aluminum, copper-aluminum-nickel, nickel-titanium alloys, or a combination of copper-zinc-aluminum, copper-aluminum-nickel, and/or nickel-titanium alloys. A characteristic of SMA is that when its original shape is deformed, it regains its original shape in accordance with the ambient temperature and/or surrounding environment. It should be noted that the present embodiment may combine the EAP, piezoelectric elements, and/or SMA to achieve a specific haptic sensation.
Deforming mechanism 711 provides a pulling and/or pushing force to translate elements in the haptic substrate 705, causing flexible surface 703 to deform. For example, when deforming mechanism 711 creates a vacuum between flexible surface 703 and haptic substrate 705, flexible surface 703 is pushed against haptic substrate 705, causing flexible surface 703 to take on a texture in accordance with the surface pattern of haptic substrate 705. In other words, once a surface pattern of haptic substrate 705 is generated, flexible surface 703 is pulled or pushed against haptic substrate 705 to reveal the pattern of haptic substrate 705 through the deformed surface of flexible surface 703. In one embodiment, haptic substrate 705 and deforming mechanism 711 are constructed in the same or substantially the same layer.
Upon receipt of a first activating signal, haptic substrate 705 generates a first surface pattern. After formation of the surface pattern of haptic substrate 705, deforming mechanism 711 is subsequently activated to change surface texture of flexible surface 703 in response to the surface pattern of haptic substrate 705. Alternatively, if haptic substrate 705 receives a second activating signal, it generates a second pattern.
Haptic substrate 705 further includes multiple tactile regions wherein each region can be independently activated to form a surface pattern of the substrate. Haptic substrate 705 is also capable of generating a confirmation feedback to confirm an input selection entered by a user. Deforming mechanism 711 is configured to deform the surface texture of flexible surface 703 from a first surface characteristic to a second surface characteristic. It should be noted that the haptic device further includes a sensor, which is capable of activating the device when the sensor detects a touch on flexible surface 703. Deforming mechanism 711 may be a vacuum generator, which is capable of causing flexible surface 703 to collapse against the first surface pattern to transform its surface configuration in accordance with the configuration of the first pattern of haptic substrate 705.
Haptic substrate 705 illustrates the state when tactile regions 707 and 709 are activated. Tactile regions 707 and 709 are raised in a z-axis direction. Upon receipt of one or more activating signals, haptic substrate 705 identifies a surface pattern in accordance with the activating signals. Haptic substrate 705 provides the identified pattern by activating various tactile regions such as regions 707 and 709 to generate the pattern. It should be noted that tactile regions 707 and 709 imitate two buttons or keys. In another embodiment, tactile region 707 or 709 includes multiple haptic bits wherein each bit can be independently activated or deactivated.
Because the vibrations 805 occur on surface 803 in the ultrasound range, typically 20 kHz or greater, the wavelength content is usually smaller than the finger size, thus allowing for a consistent experience. It will be noted that the normal displacement of surface 803 is on the order of less than 5 micrometers, and that a smaller displacement results in lower friction reduction.
The interaction parameter may also be derived from device sensor data such as whole device acceleration, gyroscopic information or ambient information. Device sensor signals may be any type of sensor input enabled by a device, such as from an accelerometer or gyroscope, or any type of ambient sensor signal such as from a microphone, photometer, thermometer or altimeter, or any type of bio monitor such as skin or body temperature, blood pressure (BP), heart rate monitor (HRM), electroencephalograph (EEG), or galvanic skin response (GSR), or information or signals received from a remotely coupled device, or any other type of signal or sensor including, but not limited to, the examples listed in TABLE 1 below.
Active or ambient device sensor data may be used to modify the haptic feedback based on any number of factors relating to a user's environment or activity. For example, an accelerometer device sensor signal may indicate that a user is engaging in physical activity such as walking or running, so the pattern and duration of the haptic feedback should be modified to be more noticeable to the user. In another example, a microphone sensor signal may indicate that a user is in a noisy environment, so the amplitude or intensity of the haptic feedback should be increased. Sensor data may also include virtual sensor data which is represented by information or signals that are created from processing data such as still images, video or sound. For example, a video game that has a virtual racing car may dynamically change a haptic effect based on the car velocity, how close the car is to the camera viewing angle, the size of the car, and so on.
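A hedged sketch of such sensor-based modification (all thresholds and scale factors are assumptions chosen only for illustration) could be:

```python
def adjust_haptic_effect(base_amplitude, accel_rms_g, mic_level_db):
    """Scale an effect's amplitude and duration from active/ambient sensor data.

    A large accelerometer RMS suggests the user is walking or running, so the
    effect is made longer and stronger; a high microphone level suggests a
    noisy environment, so the amplitude is raised further.
    """
    amplitude = base_amplitude
    duration_scale = 1.0
    if accel_rms_g > 0.3:        # assumed "user is moving" threshold
        amplitude *= 1.5
        duration_scale = 1.5
    if mic_level_db > 70.0:      # assumed "noisy environment" threshold
        amplitude *= 1.3
    return min(1.0, amplitude), duration_scale
```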
The interaction parameter may optionally incorporate a mathematical model related to a real-world physical effect such as gravity, acceleration, friction or inertia. For example, the motion and interaction that a user has with an object such as a virtual rolling ball may appear to follow the same laws of physics in the virtual environment as an equivalent rolling ball would follow in a non-virtual environment.
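For example, a toy physical model of the rolling ball (the constants and the mapping to haptic magnitude are assumptions) could drive the interaction parameter as follows:

```python
def rolling_ball_step(velocity, tilt_accel, dt=0.016, friction=0.8):
    """One integration step of an illustrative physical model for a virtual ball.

    Device tilt accelerates the ball, a viscous friction term decelerates it,
    and the resulting speed is mapped to a haptic magnitude so that the output
    follows familiar physical behavior (inertia, friction, gradual decay).
    """
    velocity += tilt_accel * dt            # acceleration from gravity/tilt
    velocity -= friction * velocity * dt   # simple friction/drag
    haptic_magnitude = min(1.0, abs(velocity) / 10.0)  # assumed speed scale
    return velocity, haptic_magnitude
```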
The interaction parameter may optionally incorporate an animation index to correlate the haptic output of a device to an animation or a visual or audio script. For example, an animation or script may play in response to a user or system initiated action such as opening or changing the size of a virtual window, turning a page or scrolling through a list of data entries.
Two or more gesture signals, device sensor signals or physical model inputs may be used alone or in any combination with each other to create an interaction parameter having a difference vector. A difference vector may be created from two or more scalar or vector inputs by comparing the scalar or vector inputs with each other, determining what change or difference exists between the inputs, and then generating a difference vector which incorporates a position location, direction and magnitude. Gesture signals may be used alone to create a gesture difference vector, or device sensor signals may be used alone to create a device signal difference vector.
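A difference vector of the kind described above can be sketched for two-dimensional inputs as follows (the dictionary layout and field names are assumptions for illustration):

```python
import math

def difference_vector(p1, p2):
    """Compare two 2-D input samples and return position, direction, and magnitude."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    magnitude = math.hypot(dx, dy)
    direction = (dx / magnitude, dy / magnitude) if magnitude else (0.0, 0.0)
    return {"position": p1, "direction": direction, "magnitude": magnitude}
```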
A haptic effect corresponding to the motions used to create the stylized face is stored or encoded into the data file concurrently with the other image information in the data file. The haptic effect information may be stored in any way that permits the reproduction of the haptic effect along with the image. The data file is then communicated to a second device having a haptic actuator via any file transfer mechanism or communication link.
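As a sketch only, one possible container layout (a small JSON header holding the haptic effect description, followed by the raw image bytes) is shown below; the layout, field names, and functions are assumptions, and any format that lets the haptic effect be reproduced along with the image would serve.

```python
import json

def encode_with_haptics(image_bytes, haptic_effect, path):
    """Write image data and a haptic effect description into one data file."""
    header = json.dumps({"haptic": haptic_effect}).encode("utf-8")
    with open(path, "wb") as f:
        f.write(len(header).to_bytes(4, "big"))  # header length prefix
        f.write(header)                          # haptic effect description
        f.write(image_bytes)                     # image information

def decode_haptics(path):
    """Read back the haptic effect so a second device can apply it."""
    with open(path, "rb") as f:
        n = int.from_bytes(f.read(4), "big")
        return json.loads(f.read(n).decode("utf-8"))["haptic"]
```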
The second user may optionally collaborate with the first user to create a combined data file by providing additional gestures or device sensor signals to add the virtual message “Hi” on the drawing, along with any corresponding haptic effect generated from the virtual message and stored in the data file.
At 1301, the system receives input of a device sensor signal at time T1, and at 1303 the system receives input of a gesture signal at time T2. Time T1 and time T2 may occur simultaneously or non-simultaneously with each other and in any order. Multiple additional gesture inputs or device sensor inputs may be used to give greater precision to the dynamic haptic effect or to provide the dynamic haptic effect over a greater period of time. The gesture signals and the device sensor signals may be received in any order or time sequence, either sequentially with non-overlapping time periods or in parallel with overlapping or concurrent time periods. At 1305, the device sensor signal is compared to a haptic effect signal to generate a device sensor difference vector. At 1307, the gesture signal is compared to a haptic effect signal to generate a gesture difference vector. At 1309, an animation or physical model description may optionally be received. At 1311, an interaction parameter is generated using the gesture difference vector, the device sensor difference vector, and optionally the animation or physical model description. It will be recognized that any type of input synthesis method may be used to generate the interaction parameter from one or more gesture signals or device sensor signals including, but not limited to, the method of synthesis examples listed in TABLE 2 below. At 1313, a drive signal is applied to a haptic actuator according to the interaction parameter.
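An end-to-end sketch of blocks 1301-1313, using a plain weighted sum as the synthesis method (the weights, the scalar reduction, and the drive mapping are assumptions; TABLE 2 lists other synthesis methods), might read:

```python
def apply_dynamic_haptic(sensor_signal, gesture_signal, haptic_ref, model_term=0.0):
    """Illustrative walk through blocks 1301-1313 for scalar-valued signals."""
    sensor_diff = sensor_signal - haptic_ref     # 1305: device sensor difference vector
    gesture_diff = gesture_signal - haptic_ref   # 1307: gesture difference vector
    interaction = 0.5 * sensor_diff + 0.5 * gesture_diff + model_term  # 1311: interaction parameter
    drive = max(-1.0, min(1.0, interaction))     # 1313: drive signal to the actuator
    return drive
```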
Several embodiments are specifically illustrated and/or described herein. However, it will be appreciated that modifications and variations of the disclosed embodiments are covered by the above teachings and within the purview of the appended claims without departing from the spirit and intended scope of the invention.
Number | Name | Date | Kind |
---|---|---|---|
5666499 | Baudel et al. | Sep 1997 | A |
5825308 | Rosenberg | Oct 1998 | A |
6061004 | Rosenberg | May 2000 | A |
6088019 | Rosenberg | Jul 2000 | A |
6100874 | Schena et al. | Aug 2000 | A |
6166723 | Schena et al. | Dec 2000 | A |
6211861 | Rosenberg et al. | Apr 2001 | B1 |
6252579 | Rosenberg et al. | Jun 2001 | B1 |
6278439 | Rosenberg et al. | Aug 2001 | B1 |
6300936 | Braun et al. | Oct 2001 | B1 |
6337678 | Fish | Jan 2002 | B1 |
6429846 | Rosenberg et al. | Aug 2002 | B2 |
6448977 | Braun et al. | Sep 2002 | B1 |
6647359 | Verplank et al. | Nov 2003 | B1 |
6717573 | Shahoian et al. | Apr 2004 | B1 |
6819312 | Fish | Nov 2004 | B2 |
7024625 | Shalit | Apr 2006 | B2 |
7084854 | Moore et al. | Aug 2006 | B1 |
7088342 | Rekimoto et al. | Aug 2006 | B2 |
7113177 | Franzen | Sep 2006 | B2 |
7133177 | Tamaru | Nov 2006 | B2 |
7199790 | Rosenberg et al. | Apr 2007 | B2 |
7205978 | Poupyrev et al. | Apr 2007 | B2 |
7336260 | Martin et al. | Feb 2008 | B2 |
7446456 | Maruyama et al. | Nov 2008 | B2 |
7456823 | Poupyrev et al. | Nov 2008 | B2 |
7468573 | Dai et al. | Dec 2008 | B2 |
7528508 | Bruwer | May 2009 | B2 |
7554246 | Maruyama et al. | Jun 2009 | B2 |
7592999 | Rosenberg et al. | Sep 2009 | B2 |
7663604 | Maruyama et al. | Feb 2010 | B2 |
7755607 | Poupyrev et al. | Jul 2010 | B2 |
7765333 | Cruz-Hernandez et al. | Jul 2010 | B2 |
7808488 | Martin et al. | Oct 2010 | B2 |
7821498 | Kramer et al. | Oct 2010 | B2 |
7825903 | Anastas et al. | Nov 2010 | B2 |
RE42064 | Fish | Jan 2011 | E |
7890863 | Grant et al. | Feb 2011 | B2 |
7911328 | Luden et al. | Mar 2011 | B2 |
7920131 | Westerman | Apr 2011 | B2 |
7924144 | Makinen et al. | Apr 2011 | B2 |
7969288 | Braun et al. | Jun 2011 | B2 |
7973769 | Olien | Jul 2011 | B2 |
7978181 | Westerman | Jul 2011 | B2 |
7979146 | Ullrich et al. | Jul 2011 | B2 |
7982588 | Makinen et al. | Jul 2011 | B2 |
7982720 | Rosenberg et al. | Jul 2011 | B2 |
8004492 | Kramer et al. | Aug 2011 | B2 |
8031181 | Rosenberg et al. | Oct 2011 | B2 |
8035623 | Bruwer | Oct 2011 | B2 |
8059105 | Rosenberg et al. | Nov 2011 | B2 |
8098235 | Heubel et al. | Jan 2012 | B2 |
8141947 | Nathan et al. | Mar 2012 | B2 |
8207832 | Yun et al. | Jun 2012 | B2 |
8260972 | Cruz-Hernandez et al. | Sep 2012 | B2 |
8279193 | Birnbaum et al. | Oct 2012 | B1 |
8280448 | Bang et al. | Oct 2012 | B2 |
8570296 | Birnbaum et al. | Oct 2013 | B2 |
8659571 | Birnbaum et al. | Feb 2014 | B2 |
8711118 | Short et al. | Apr 2014 | B2 |
8847741 | Birnbaum et al. | Sep 2014 | B2 |
20010035854 | Rosenberg et al. | Nov 2001 | A1 |
20020015024 | Westerman et al. | Feb 2002 | A1 |
20020044132 | Fish | Apr 2002 | A1 |
20020177471 | Kaaresoja et al. | Nov 2002 | A1 |
20030063128 | Salmimaa et al. | Apr 2003 | A1 |
20030100969 | Jones | May 2003 | A1 |
20030162595 | Serbanescu | Aug 2003 | A1 |
20030206202 | Moriya | Nov 2003 | A1 |
20040002902 | Muehlhaeuser | Jan 2004 | A1 |
20050057528 | Kleen | Mar 2005 | A1 |
20050060070 | Kapolka et al. | Mar 2005 | A1 |
20050179617 | Matsui et al. | Aug 2005 | A1 |
20050212749 | Marvit et al. | Sep 2005 | A1 |
20050212760 | Marvit et al. | Sep 2005 | A1 |
20050245302 | Bathiche et al. | Nov 2005 | A1 |
20060022952 | Ryynanen | Feb 2006 | A1 |
20060026536 | Hotelling et al. | Feb 2006 | A1 |
20060061545 | Hughes et al. | Mar 2006 | A1 |
20060097991 | Hotelling et al. | May 2006 | A1 |
20060119586 | Grant et al. | Jun 2006 | A1 |
20060129719 | Cruz-Hernandez et al. | Jun 2006 | A1 |
20060181510 | Faith | Aug 2006 | A1 |
20060192760 | Moore et al. | Aug 2006 | A1 |
20060197752 | Hurst et al. | Sep 2006 | A1 |
20060255683 | Suzuki et al. | Nov 2006 | A1 |
20060256074 | Krum et al. | Nov 2006 | A1 |
20060279476 | Obata | Dec 2006 | A1 |
20060279542 | Flack et al. | Dec 2006 | A1 |
20060284849 | Grant et al. | Dec 2006 | A1 |
20070066283 | Haar et al. | Mar 2007 | A1 |
20070139366 | Dunko et al. | Jun 2007 | A1 |
20070146162 | Tengler et al. | Jun 2007 | A1 |
20070150826 | Anzures et al. | Jun 2007 | A1 |
20070152984 | Ording et al. | Jul 2007 | A1 |
20070236450 | Colgate et al. | Oct 2007 | A1 |
20070242040 | Ullrich et al. | Oct 2007 | A1 |
20070247429 | Westerman | Oct 2007 | A1 |
20070247442 | Andre et al. | Oct 2007 | A1 |
20070265096 | Kouno et al. | Nov 2007 | A1 |
20070279392 | Rosenberg et al. | Dec 2007 | A1 |
20080024459 | Poupyrev et al. | Jan 2008 | A1 |
20080055277 | Takenaka et al. | Mar 2008 | A1 |
20080060856 | Shahoian et al. | Mar 2008 | A1 |
20080068334 | Olien et al. | Mar 2008 | A1 |
20080088580 | Poupyrev et al. | Apr 2008 | A1 |
20080111788 | Rosenberg et al. | May 2008 | A1 |
20080180406 | Han et al. | Jul 2008 | A1 |
20080216001 | Ording et al. | Sep 2008 | A1 |
20080287147 | Grant et al. | Nov 2008 | A1 |
20080300055 | Lutnick et al. | Dec 2008 | A1 |
20080303782 | Grant et al. | Dec 2008 | A1 |
20090002328 | Ullrich et al. | Jan 2009 | A1 |
20090015045 | Nathan et al. | Jan 2009 | A1 |
20090079550 | Makinen et al. | Mar 2009 | A1 |
20090085878 | Heubel et al. | Apr 2009 | A1 |
20090106655 | Grant et al. | Apr 2009 | A1 |
20090109007 | Makinen et al. | Apr 2009 | A1 |
20090128503 | Grant et al. | May 2009 | A1 |
20090137269 | Chung | May 2009 | A1 |
20090146845 | Hedley | Jun 2009 | A1 |
20090166098 | Sunder | Jul 2009 | A1 |
20090167508 | Fadell et al. | Jul 2009 | A1 |
20090167509 | Fadell et al. | Jul 2009 | A1 |
20090167704 | Terlizzi et al. | Jul 2009 | A1 |
20090231276 | Ullrich et al. | Sep 2009 | A1 |
20090250267 | Heubel et al. | Oct 2009 | A1 |
20090256817 | Perlin et al. | Oct 2009 | A1 |
20090270046 | Lai | Oct 2009 | A1 |
20090284485 | Colgate et al. | Nov 2009 | A1 |
20090315830 | Westerman | Dec 2009 | A1 |
20090322498 | Yun et al. | Dec 2009 | A1 |
20090325645 | Bang et al. | Dec 2009 | A1 |
20100013653 | Birnbaum et al. | Jan 2010 | A1 |
20100013761 | Birnbaum et al. | Jan 2010 | A1 |
20100017489 | Birnbaum et al. | Jan 2010 | A1 |
20100017759 | Birnbaum et al. | Jan 2010 | A1 |
20100045619 | Birnbaum et al. | Feb 2010 | A1 |
20100073304 | Grant et al. | Mar 2010 | A1 |
20100085169 | Poupyrev et al. | Apr 2010 | A1 |
20100108408 | Colgate et al. | May 2010 | A1 |
20100127819 | Radivojevic et al. | May 2010 | A1 |
20100149134 | Westerman et al. | Jun 2010 | A1 |
20100152620 | Ramsay et al. | Jun 2010 | A1 |
20100156818 | Burrough | Jun 2010 | A1 |
20100214243 | Birnbaum et al. | Aug 2010 | A1 |
20100231539 | Cruz-Hernandez et al. | Sep 2010 | A1 |
20100231550 | Cruz-Hernandez et al. | Sep 2010 | A1 |
20100245254 | Olien et al. | Sep 2010 | A1 |
20100253491 | Grossman | Oct 2010 | A1 |
20100265208 | Kim et al. | Oct 2010 | A1 |
20100313124 | Privault et al. | Dec 2010 | A1 |
20100328053 | Yeh et al. | Dec 2010 | A1 |
20110021272 | Grant et al. | Jan 2011 | A1 |
20110025609 | Modarres et al. | Feb 2011 | A1 |
20110043454 | Modarres et al. | Feb 2011 | A1 |
20110043527 | Ording et al. | Feb 2011 | A1 |
20110102340 | Martin et al. | May 2011 | A1 |
20110105103 | Ullrich | May 2011 | A1 |
20110109588 | Makinen et al. | May 2011 | A1 |
20110138277 | Grant et al. | Jun 2011 | A1 |
20110260988 | Colgate et al. | Oct 2011 | A1 |
20110264491 | Birnbaum et al. | Oct 2011 | A1 |
20110267181 | Kildal | Nov 2011 | A1 |
20120068957 | Puskarich et al. | Mar 2012 | A1 |
20120081276 | Ullrich et al. | Apr 2012 | A1 |
20120105333 | Maschmeyer et al. | May 2012 | A1 |
20120223880 | Birnbaum et al. | Sep 2012 | A1 |
20120223882 | Galor | Sep 2012 | A1 |
20130227410 | Sridhara et al. | Aug 2013 | A1 |
Number | Date | Country |
---|---|---|
101118469 | Feb 2008 | CN |
101681200 | Mar 2010 | CN |
200 19 074 | Feb 2001 | DE |
0 899 650 | Mar 1999 | EP |
1 401 185 | Mar 2004 | EP |
1 691 263 | Aug 2006 | EP |
1 731 993 | Dec 2006 | EP |
2 910 160 | Jun 2008 | FR |
2 416 962 | Feb 2006 | GB |
2005332063 | Dec 2005 | JP |
2007-257088 | Oct 2007 | JP |
2010-134955 | Jun 2010 | JP |
2010-522380 | Jul 2010 | JP |
100844487 | Jul 2008 | KR |
20100124324 | Nov 2010 | KR |
20100126277 | Dec 2010 | KR |
WO 9720305 | Jun 1997 | WO |
WO 9806024 | Feb 1998 | WO |
WO 9938064 | Jul 1999 | WO |
WO 2004044728 | May 2004 | WO |
WO 2004075169 | Sep 2004 | WO |
WO 2004081776 | Sep 2004 | WO |
WO 2005103863 | Nov 2005 | WO |
WO 2008132540 | Nov 2008 | WO |
WO 2009037379 | Mar 2009 | WO |
WO 2009071750 | Jun 2009 | WO |
WO 2009074185 | Jun 2009 | WO |
WO 2009141502 | Nov 2009 | WO |
WO 2010068574 | Jun 2010 | WO |
WO 2010088477 | Aug 2010 | WO |
WO 2011011552 | Jan 2011 | WO |
Entry |
---|
Abdulmotaleb, “Haptics Technologies: Bringing Touch to Multimedia,” Springer Series on Touch and Haptic Systems, 2011. |
Berkelman, P., “Tool-Based Haptic Interaction with Dynamic Physical Simulations Using Lorentz Magnetic Levitation” (1999) (990-RESP-ITC0010654-990-RESP-ITC0010844). |
Biet, M. et al., “Discrimination of Virtual Square Gratings by Dynamic Touch on Friction Based Tactile Displays,” Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2008 Symposium, IEEE, Piscataway, NJ, XP 031339918, pp. 41-48, Mar. 2008. |
Biet, M. et al., “New Tactile Devices Using Piezoelectric Actuators,” L2EP, University of Lille, Polytech-Lille, Actuator 2006, 10th International Conference on New Actuators, Bremen, Germany, pp. 989-992, Jun. 2006. |
Bonderud, D., “Nokia Files Patent to Make Phones Digitally Stimulating,” InventorSpot.com [online]. Retrieved on Apr. 21, 2011 from <URL: http://inventorspot.com/articles/nokia_files_patent_make_phones_digitally_stimulating>. |
Buxton, B., “Multi-Touch Systems that I Have Known and Loved,” Microsoft Research; Original: Jan. 12, 2007; Version: Jul. 17, 2007. <URL: http://www.billbuxton.com/multitouchOverview.html>. |
Chang et al., “ComTouch: Design of a Vibrotactile Communication Device,” DIS2002, London Copyright 2002 ACM 1-58113-2-9-0/00/0008, 10 pages. |
Companies Redouble Efforts to Deliver Consistent Support, Compatibility Across Wide Range of Products (1998) (990-RESP-ITC0009693-990-RESP-ITC0009694). |
Dewitt, A., “Designing Sonification of User Data in Affective Interaction,” Master of Science Thesis, Stockholm, Sweden, XP 002551466. Retrieved on Oct. 20, 2009 from <URL: http://w3.nada.kth.se/utbildn.ing/grukth/exjobb/rapportlistor/2007/rapporter07/de_witt_anna_07142.pdf>. |
Flaherty, N., Stanford University EE 402A “Can You Feel Me Calling?” Oct. 27, 2005 (990-RESP-ITC0008980-990-RESP-ITC0009022). |
“Gesture Recognition,” Retrieved on Jul. 23, 2010 from <URL: http://en.wikipedia.com/w/index.php?title=Gesture_recognition&printable=yes>. |
Getting Started with Your EO Personal Communicator (1992, 1993) (990-RESP-ITC0008555-990-RESP-ITC0008631). |
Greene, K., “A Touch Screen with Texture,” Technology Review [online]. Retrieved on Apr. 21, 2011 from <URL: http://www.technologyreview.com/printer_friend_article.aspx?id=26506>. |
Hall, “T-Bars: Towards Tactile User Interfaces for Mobile Touchscreens” (2008) (990-RESP-ITC0012020-990-RESP-ITC0012023). |
Hinckley, K., “Sensing Techniques for Mobile Interaction” (2000) (990-RESP-ITC0009775-990-RESP-ITC0009784). |
Hsin-Un Yao et al., “An Experiment on Length Perception with a Virtual Rolling Stone,” Proc. EuroHaptics 2006, pp. 325-330. |
IBM Simon User Manual, Part No. 82G2557 (1994) (990-RESP-ITC0011979-990-RESP-ITC0012019). |
Immersion Announces Integration Kit and Program for Implementing Touch Feedback in Touchscreens, Feb. 27, 2006 (990-RESP-ITC0011813-990-RESP-ITC0011814). |
Immersion Corporation, “Runtime System Architecture, 2004, v2” (990-RESP-ITC0009553-990-RESP-ITC0009554). |
Immersion Tactile Feedback Technology for Touchscreens System Integration (2005) (990-RESP-ITC00011082-990-RESP-ITC0011083). |
Immersion Tactile Touchscreen Demonstrator TouchSense Enabled 8.4-inch LCD Touch Monitor, 0905.v3 (990-RESP-ITC0010208-990-RESP-ITC0010209). |
Immersion TouchSense Integration Kit for Touchscreens, 0206.v1 (2006) (990-RESP-ITC0008848-990-RESP-ITC0008849). |
Immersion TouchSense Programmable Rotary Modules 1004.1000.v3 (2004) (990-RESP-ITC0010234-990-RESP-ITC0010241). |
Immersion TouchSense Technology for Touchscreens, 0405, v1 (2005) (990-RESP-ITC0009723-990-RESP-ITC0009726). |
Iwamoto, T. et al., “Airborne Ultrasound Tactile Display,” The University of Tokyo, SIGGRAPH 2008, Los Angeles, CA, Aug. 11-15, 2008. |
Iwamoto, T. et al., “Non-Contact Method for Producing Tactile Sensation Using Airborne Ultrasound,” Department of Information Physics and Computing Graduate School of Information Science and Technology, the University of Tokyo, EuroHaptics 2008, LNCS 5024, Springer-Verlag Berlin Heidelberg 2008, pp. 504-551. |
Kaaresoja, T. et al., “Snap-Crackle-Pop: Tactile Feedback for Mobile Touch Screens,” Proc. EuroHaptics 2006, XP 002551465. Retrieved on Oct. 20, 2009 from <URL: http://lsc.univ-evry.fr/eurohaptics/upload/cd/papers/f80>. |
Koskinen, “Optimizing Tactile Feedback for Virtual Buttons in Mobile Devices” (2008) (990-RESP-ITC0014504-990-RESP-ITC0014597). |
Kyung, “Precise Manipulation of GUI on a Touch Screen with Haptic Cues” (2009) (990-RESP-ITC0012167-990-RESP-ITC0012172). |
Laitinen, “Enabling Mobile Haptic Design: Piezoelectric Actuator Technology Properties in Hand Held Devices” (2006) (990-RESP-ITC0013663-990-RESP-ITC0013666). |
MacKenzie, “A Comparison of Three Selection Techniques for Touchpads,” Apr. 18-23, 1998 (990-RESP-ITC0013667-990-RESP-ITC0013674). |
Marks, P., “Nokia touchscreen creates texture illusion,” New Scientist [online]. Retrieved on Apr. 21, 2011 from <URL: http://www.newscientist.com/article/dn19510-nokia-touchscreen-creates-texture-illusion.html>. |
Minsky, “Feeling and Seeing: Issues in Force Display,” 1990 (990-RESP-ITC0009288-990-RESP-ITC0009297). |
Mora, “Real-Time 3D Fluid Interaction with a Haptic User Interface,” IEEE Symposium, Mar. 8-9, 2008 (990-RESP-ITC0011416-990-RESP-ITC0011422). |
Nashel, “Tactile Virtual Buttons for Mobile Devices” (2003) (990-RESP-ITC0012505-990-RESP-ITC0012506). |
Newton Apple MessagePad Handbook (1995) (990-RESP-ITC0015130-990-RESP-ITC0015325). |
Oakley, I. et al., “Contact IM: Exploring Asynchronous Touch over Distance,” Palpable Machines Research Group, Media Lab Europe, XP 007910188. Retrieved on Oct. 20, 2009 from <URL: http://people.cs.vt.edu/wangr06/touch%20review%20organization/Oak002>. |
Poupyrev, “Ambient Touch: Designing Tactile Interfaces for Handheld Devices,” UIST '02, Oct. 27-30, 2002 (990-RESP-ITC0009278-990-RESP-ITC0009297). |
PR-1000 Developer Kit Overview, May 3, 2005. |
Rovers, A. et al., “HIM: A Framework for Haptic Instant Messaging,” CHI 2004 (CHI Conference Proceedings, Human Factors in Computing Systems), XP 002464573, Vienna, Austria, Apr. 2004, pp. 1313-1316. |
Ruffaldi, E., “A Haptic Toolkit for the Development of Immersive and Web-Enabled Games,” Nov. 1-3, 2006 (990-RESP-ITC0010477-990-RESP-ITC0010480). |
Sekiguchi, Y. et al., “Haptic Interface Using Estimation of Box Contents Metaphor,” Proceedings of ICAT 2003, Tokyo, Japan, XP 002551467. Retrieved Oct. 20, 2009 from <URL: http://www.vrsj.org/ic-at/papers/2003/00947-00000>. |
Smith, N. “Feel the future: Touch screens that touch back,” MSNBC [online]. Retrieved on Apr. 21, 2011 from <URL: http://www.msnbc.msn.com/id/40845743/ns/technology_and_science-tech_and_gadgets/t/feel-future-touch-screens-touch-back/#.T5clU7ONfEY>. |
Sony Magic Link User's Guide, PIC-1000 (1994) (990-RESP-ITC0010845-990-RESP-ITC0011048). |
Spence, R., “Data Base Navigation: An Office Environment for the Professional” (1982) (990-RESP-ITC0011780-990-RESP-ITC0011793). |
White, T., “Introducing Liquid Haptics in High Bandwidth Human-Computer Interfaces,” May 8, 1998 (990-RESP-ITC0008683-990-RESP-ITC0008773). |
Wiker, “Teletouch Teletouch Display Development: Phase 1 Report,” Technical Report 1230, Naval Ocean Systems Center, San Diego, Apr. 17, 1989 (990-RESP-ITC0002137-990-RESP-ITC0002202). |
Williamson, J. et al., “Shoogle: Excitatory Multimodal Interaction on Mobile Devices,” CHI 2007 Proceedings: Shake, Rattle and Roll: New Forms of Input and Output, 2007, XP 002549378, pp. 121-124. |
Yatani, “SemFeel: A User Interface with Semantic Tactile Feedback for Mobile Touch-screen Devices” (2009) (990-RESP-ITC0011458-990-RESP-ITC0011467). |
Number | Date | Country | |
---|---|---|
20170220115 A1 | Aug 2017 | US |
Relation | Number | Date | Country |
---|---|---|---|
Parent | 15046701 | Feb 2016 | US |
Child | 15480725 | | US |
Parent | 14518115 | Oct 2014 | US |
Child | 15046701 | | US |
Parent | 14444416 | Jul 2014 | US |
Child | 14518115 | | US |
Parent | 14176421 | Feb 2014 | US |
Child | 14444416 | | US |
Parent | 13773191 | Feb 2013 | US |
Child | 14176421 | | US |
Parent | 13592685 | Aug 2012 | US |
Child | 13773191 | | US |
Parent | 13472709 | May 2012 | US |
Child | 13592685 | | US |
Parent | 13397142 | Feb 2012 | US |
Child | 13472709 | | US |