Haptic input device

Information

  • Patent Grant
  • Patent Number
    10,120,446
  • Date Filed
    Friday, November 19, 2010
  • Date Issued
    Tuesday, November 6, 2018
Abstract
One embodiment of a haptic input device may include a receiver configured to receive a signal from a touch-based user interface device. The signal may include a control signal or a look-up value. The haptic input device may also include a decoder coupled to the receiver and configured to decode the signal from the touch-based user interface device, at least one sensor configured to determine at least one characteristic of the haptic input device, a controller coupled to the at least one sensor and configured to transmit a control signal, a haptic actuator coupled to the controller, and a transmitter coupled to the at least one sensor.
Description
BACKGROUND

I. Technical Field


Embodiments described herein relate generally to input devices, and more particularly, to an input device capable of providing haptic and visual feedback to a user.


II. Background Discussion


Existing touch-based user interface devices typically have a touch panel and a visual display component. The touch panel may include a touch sensitive surface that, in response to detecting a touch event, generates a signal that can be processed and utilized by other components of an electronic device. The touch sensitive surface may be separate from the display component, such as in the case of a trackpad, or may be integrated into or positioned in front of a display screen, such as in the case of a display touch screen.


Display touch screens may show textual and/or graphical display elements representing selectable virtual buttons or icons, and the touch sensitive surface may allow a user to navigate the content displayed on the display screen. Typically, a user may move one or more objects, such as a finger or a stylus, across the touch sensitive surface in a pattern that the device translates into an input command. As an example, some electronic devices allow the user to select a virtual button by tapping a portion of the touch sensitive surface corresponding to the virtual button. Some electronic devices may even detect more than one simultaneous touch event in different locations on the touch screen.


Generally, input devices do not provide haptic feedback to a user in response to interactions with the input device. The user can typically only feel the rigid surface of the touch screen, making it difficult to find icons, hyperlinks, text boxes, or other user-selectable input elements on the display. An input device capable of generating haptic feedback may help a user navigate content displayed on the display screen, and may further serve to enhance the content of various applications by creating a more appealing and realistic user interface. “Haptic feedback” may be any tactile feedback. Examples include forces, vibrations, and/or motions that may be sensed by the user.


SUMMARY

Embodiments described herein generally relate to haptic input devices that can receive an input from a user and provide haptic feedback based on the input from the user. In some embodiments, the haptic input device may be configured to interface with a touch-based user interface device, such as a touch screen. The touch-based user interface device may further include one or more input sensors, such as force sensors or position sensors, that are configured to sense one or more characteristics of a haptic input device as it engages the touch screen. For example, the one or more characteristics may include a position of the device relative to the touch screen, a pressure being applied on the touch screen surface by the haptic input device, an angle of the input device relative to the touch screen, and the like. The touch-based user interface device may determine a haptic response based on the one or more characteristics and transmit the haptic response to the haptic input device. The haptic input device may include a haptic actuator that generates haptic feedback based on the received haptic response. The haptic response may take the form of a control signal that drives a haptic actuator or a look-up value that corresponds to a control signal stored in a look-up table. In some embodiments, the haptic input device may also include additional sensors configured to sense one or more characteristics of the haptic input device, such as the orientation of the haptic input device, the acceleration of the device relative to the touch screen surface, and so on.


One embodiment may take the form of a haptic input device that includes: a receiver configured to receive a first signal from a touch-based user interface device; a decoder coupled to the receiver and configured to extract an input signal from the first signal; a controller coupled to the decoder and configured to receive the input signal from the decoder, and further configured to generate a control signal based on the input signal; a haptic actuator coupled to the controller and configured to actuate in response to the control signal; at least one sensor configured to determine at least one characteristic of the haptic input device; and a transmitter coupled to the at least one sensor.


Another embodiment may take the form of a touch-based user interface device. The touch-based user interface device may include: at least one transmitter configured to transmit at least one first signal to a haptic input device; at least one receiver configured to receive at least one second signal from the haptic input device; at least one input sensor configured to sense an input resulting from an object engaging a touch screen surface; at least one storage device storing one or more executable instructions; and at least one processor coupled to the at least one receiver, the at least one transmitter, the at least one input sensor, and the at least one storage device. The at least one processor may be configured to access the at least one storage device in order to execute the one or more executable instructions.


Another embodiment may take the form of a method for providing haptic feedback. The method may include receiving an input gesture from one or more input sensors, deriving a characteristic of an object engaging a touch screen surface, determining a haptic response based on the characteristic of the object engaging the touch screen, and transmitting a signal to a haptic input device. The signal may comprise a control signal or a look-up value.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 illustrates one embodiment of a system incorporating a haptic input device.



FIG. 2 is a block diagram of one embodiment of a touch-based user input device that can be used in conjunction with the system illustrated in FIG. 1.



FIG. 3 is a block diagram of one embodiment of a haptic input device that can be used in conjunction with the system illustrated in FIG. 1.



FIG. 4 is a block diagram of the transmitter and one or more input sensors of the touch-based user interface device shown in FIG. 1, as well as the receiver, decoder, controller, optional sensors, and haptic actuator of the haptic input device shown in FIG. 3.



FIG. 5 is a block diagram of one embodiment of a controller that may be used in conjunction with the haptic input device shown in FIG. 1.



FIG. 6 is a block diagram of another embodiment of a controller that may be used in conjunction with the haptic input device shown in FIG. 1.



FIG. 7 illustrates a schematic diagram of the operation of the system illustrated in FIG. 1, according to one embodiment.



FIG. 8 is a flowchart illustrating one embodiment of a method for providing haptic feedback.



FIG. 9 is a flowchart illustrating another embodiment of a method for providing haptic feedback.





DETAILED DESCRIPTION

Embodiments described herein generally relate to haptic input devices that can receive an input from a user and provide haptic feedback based on the input from the user. In some embodiments, the haptic input device may be configured to interface with a touch-based user interface device, such as a touch screen. The touch-based user interface device may further include one or more input sensors, such as force sensors or position sensors, that are configured to sense one or more characteristics of a haptic input device as it engages the touch screen. For example, the one or more characteristics may include a position of the device relative to the touch screen, a pressure being applied on the touch screen surface by the haptic input device, an angle of the input device relative to the touch screen, and the like. The touch-based user interface device may determine a haptic response based on the one or more characteristics and transmit the haptic response to the haptic input device. The haptic input device may include a haptic actuator that generates haptic feedback based on the received haptic response. The haptic response may take the form of a control signal that drives a haptic actuator or a look-up value that corresponds to a control signal stored in a look-up table. In some embodiments, the haptic input device may also include additional sensors configured to sense one or more characteristics of the haptic input device, such as the orientation of the haptic input device, the acceleration of the device relative to the touch screen surface, and so on.



FIG. 1 illustrates one embodiment of a system 100 incorporating a haptic input device 101. As shown in FIG. 1, the system 100 may include the haptic input device 101 and a touch-based user interface device 103 that serves as a user input/output (I/O) device. The touch-based user interface device 103 may include a touch screen surface 105 and one or more transmitters 107 configured to transmit signals to a receiver of the haptic input device 101. The transmitters 107 may be wired or wireless transmitters, or a combination of both wired and wireless transmitters.


In one embodiment, the haptic input device 101 may be a stylus that is configured to resemble a writing utensil. For example, the stylus may include a tapered or pointed tip that is configured to contact the touch screen surface 105. In some embodiments, the tip may be capacitive in order to permit registration of the contact on the touch screen surface 105. In other embodiments, the haptic input device 101 may have other configurations. For example, the haptic input device 101 may have a blunt, as opposed to a pointed, tip, or may take the form of a ball.


The haptic input device 101 may be configured to provide haptic feedback to a user. This haptic feedback may be any type of tactile feedback that takes advantage of a user's sense of touch, for example, by creating forces, vibrations, and/or motions that may be perceived by the user. As alluded to above, the haptic input device 101 may be configured to provide haptic feedback based on input gestures from the user. Haptic feedback may be used to enhance the user's interaction with the touch-based user interface device 103 by providing mechanical stimulation to the user when the user is engaging the device 103. For example, haptic feedback may confirm the user's selection of a particular item, such as a virtual icon or a button, or may be provided when the user's input device is positioned over a selectable item. The haptic input device 101 may also provide a haptic output when the device is over, near or passes the boundary of a window or application shown on a display, or when the device is over, near or passes a graphic item having a particular texture. It should be appreciated that haptic feedback may be provided when a cursor controlled by the haptic input device meets these or other conditions set forth in this document. Indeed, certain embodiments may employ a haptic input device to move a cursor on a display that is not touch-sensitive. Accordingly, the description, functionality and operations set forth herein generally apply to a haptic input device operating a cursor on a display screen lacking capacitive, pressure-sensing or other touch-sensing capabilities, as well.


The touch-based user interface device 103 can function as, for example, a media device, a communications device, a digital camera, a video camera, a storage device, or any other electronic device. Some examples of touch-based user interface devices 103 incorporating touch screen surfaces 105 include Apple Inc.'s iPhone™, iPod Nano™, iPod Touch™, and iPad™ devices. Other examples may include tablet personal computers, laptops, and so on. The touch screen surface 105 may include one or more input sensors that allow a user to interact with the touch-based user interface device 103 by sensing various touch-based input gestures, such as swiping, tapping, scrolling, and so on, applied across the touch screen surface 105. The input sensors may include one or more capacitive sensors, optical sensors, acoustic sensors, force sensors, and so on.


In some embodiments, the touch-based input gestures may be applied through moving an object other than a finger, such as the input device 101, or moving multiple objects simultaneously (e.g., multi-touch inputs). As will be described further below, the input sensors may obtain information regarding the sensed gestures. The input sensors may detect changes in pressure and/or capacitance from an object impacting the touch screen; such changes may be the basis for interpreting gestural input. The touch-based user interface 103 may further include one or more transmitters 107 configured to transmit the information regarding the sensed gestures to a processing device provided in the touch-based user interface device 103, which may translate the received information into a particular input command. As an example, the input sensors may derive position information, such as distance traveled and/or direction of motion, regarding a sensed gesture, and the processing device may execute certain functionality based on the received distance and/or direction information. For example, the device may interpret sensed motion as a request to move a cursor on the screen. As another example, the input sensors may be configured to sense a particular gesture or pattern of gestures and, in response, execute a particular command. For example, a tap or an application of pressure onto the touch screen surface 105 may be associated with a selection, while sliding the object along the touch screen surface 105 in a particular manner may be associated with scrolling, enlarging, shrinking, and so on. In some embodiments, a combination of gestures by a finger or other object and the haptic input device may be interpreted together to provide a particular haptic feedback to a user through the haptic input device. The processing device may be any known processing device, including, but not limited to, a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a microcontroller, a graphics processing unit (GPU), and so on.
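
By way of a concrete illustration, the following Python sketch shows how a processing device might translate sensed gestures into input commands as described above. It is a minimal sketch under assumed event fields and thresholds; the patent does not specify any particular data format or decision rule.

```python
# Hypothetical sketch of gesture interpretation; the event fields and
# thresholds are illustrative assumptions, not taken from the patent.
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float          # position on the touch screen surface
    y: float
    pressure: float   # normalized 0..1
    dx: float         # displacement since the previous sample
    dy: float

def interpret_gesture(event: TouchEvent) -> str:
    """Map a sensed touch event to a high-level input command."""
    distance = (event.dx ** 2 + event.dy ** 2) ** 0.5
    if distance < 0.01 and event.pressure > 0.2:
        return "select"          # a tap or press is treated as a selection
    if abs(event.dy) > abs(event.dx):
        return "scroll"          # predominantly vertical motion scrolls
    return "move_cursor"         # other motion moves the cursor
```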


In some embodiments, the haptic input device 101 may provide haptic and/or visual feedback based on the position of the haptic input device 101 with respect to the touch screen surface 105. Position information may be sensed by the user interface device 103 and provided to the haptic device, for example across a communications link. In other embodiments, the haptic input device may determine its location relative to the touch screen surface. As one example of position-based feedback, the haptic input device 101 may provide haptic, audible and/or visual feedback when the device passes over a selectable button or icon being displayed by the touch screen surface 105. For example, in one embodiment, the processing device of the touch-based user interface device 103 may run a graphics editing program that allows the user to create an image by moving the haptic input device 101 across the touch screen surface 105 to manipulate a cursor to draw or otherwise interact with graphical elements. In this embodiment, the haptic input element 101 may be configured to provide haptic/visual/audible feedback when the user selects the drawing cursor using the input device 101, as the user moves the input device 101 across the touch screen surface 105, or as the user interacts with, nears, passes or is over a user-selectable element. The graphics editing program may be similar to various commercial off-the-shelf programs, such as Autodesk, Inc.'s SketchBook™, KikiPixel's Inspire Pro, Microsoft Corporation's MS Paint™, and so on.


In other embodiments, the haptic input device 101 may be configured to provide haptic, audible and/or visual feedback based on the amount of pressure applied by the haptic input device 101 to the touch screen surface 105. In such embodiments, the haptic input device 101 and/or the touch screen surface 105 may include one or more pressure sensors configured to sense pressure being applied by the device 101 to the surface. In one embodiment, the haptic input device 101 may provide haptic, audible and/or visual feedback when pressure applied to the user input device by the haptic input device exceeds a predetermined threshold. In other embodiments, the haptic input device 101 may be configured to provide haptic, audible and/or visual feedback if the input device 101 and/or touch-based user input device 103 detects any pressure being applied onto the surface 105, regardless of the amount of pressure being applied. With respect to one embodiment in which the touch-based user interface device 103 is running a graphics editing program, the haptic input element 101 may allow the user to “draw” an image only if the touch-based user input device 103 and/or the haptic input device 101 determine that the user is applying sufficient pressure onto the touch screen surface 105 via the haptic input device 101. It should be appreciated that the user input device may thus determine both a location of the haptic input device on the touch screen (through capacitive sensing, for example) and a force exerted on the screen by the haptic input device (through pressure sensing, for example).


In further embodiments, the haptic input device 101 may be configured to provide haptic and/or visual feedback based on a combination of the position of the haptic input device 101 with respect to the touch screen surface 105, and the amount of pressure applied by the haptic input device 101 onto the touch screen surface. In such embodiments, the touch-based user input device 103 may allow the user to select buttons or icons only if the touch-based user input device 103 and/or the haptic input device 101 determine that the haptic input device 101 is positioned over a selectable button or icon, and that the user is applying pressure onto the touch screen surface 105 via the haptic input device 101. Similarly, with respect to embodiments in which the touch-based user interface device 103 is running a graphics editing program, the haptic input element 101 may allow the user to “draw” an image only if the touch-based user input device 103 and/or the haptic input device 101 determine that the haptic input device 101 is positioned over a “paintable” portion of the touch screen surface, and that the user is applying pressure onto the touch screen surface 105.


Certain embodiments may also provide haptic feedback that varies with a distance to a user interface element, such as a selectable icon and the like. As the haptic input device 101 approaches the user interface element, the haptic device may provide enhanced or increased feedback. For example, a frequency and/or intensity of haptic feedback may increase as the haptic input device 101 comes closer to the user interface element. Likewise, as the haptic input device (or its tip/selection portion) moves further away, the frequency and/or intensity of the feedback may diminish. In this manner, the haptic feedback may indicate if a user is approaching or receding from certain elements or portions of a display screen.
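
A minimal sketch of this distance-dependent feedback is shown below, assuming the user interface device can report the distance from the stylus tip to the nearest user interface element; the ranges and the linear ramp are illustrative choices, not values from the patent.

```python
# Sketch: feedback frequency/intensity increase as the element gets closer.
# max_distance, base_freq_hz, and max_freq_hz are assumed parameters.
def feedback_for_distance(distance: float,
                          max_distance: float = 100.0,
                          base_freq_hz: float = 40.0,
                          max_freq_hz: float = 200.0) -> dict:
    """Return haptic drive parameters for a given distance to a UI element."""
    if distance >= max_distance:
        return {"frequency_hz": 0.0, "intensity": 0.0}   # out of range: no feedback
    proximity = 1.0 - distance / max_distance            # 0 far .. 1 touching
    return {
        "frequency_hz": base_freq_hz + proximity * (max_freq_hz - base_freq_hz),
        "intensity": proximity,
    }
```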



FIG. 2 illustrates one embodiment of a touch-based user input device 103 that can be used in conjunction with the system 100 illustrated in FIG. 1. As shown in FIG. 2, the touch-based user input device 103 may include a processing device 160. The processing device 160 may be any known processing device, including, but not limited to, a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a microcontroller, a graphics processing unit (GPU), software or firmware configured to execute instructions in memory to perform various processing functions, and so on and so forth.


The processing device 160 may be communicatively coupled to a storage device 162. In one embodiment, the storage device 162 may be a memory device, such as non-volatile or volatile memory, a hard disk drive, a flash memory drive, and so on and so forth. The storage device 162 may store software or firmware for running the touch-based user interface device 103. For example, in one embodiment, the storage device 162 may store operating system software that includes a set of instructions that are executable on the processing device 160 to enable the setup, operation and control of the touch-based user interface device 103. The operating system software may also provide a menu-based operating system that can be navigated by the user through a graphical user interface displayed or presented to the user on the touch screen 105.


The processing device 160 may also be communicatively coupled to one or more input sensors 164. As alluded to above, the input sensors 164 may be configured to sense various touch-based input gestures, such as swiping, tapping, scrolling, and so on, applied across the touch screen surface 105. The input sensors 164 may be any type of sensor, including capacitive sensors, resistive sensors, acoustic sensors, infrared sensors, and so on. The touch-based input gestures may be applied by an object, such as the input device 101, a finger, and so on, and the input sensors 164 may obtain gesture information regarding the sensed gestures. For example, as discussed above, the input sensors may derive position information, such as distance and/or direction information, regarding a sensed gesture. As another example, the input sensors may be force sensors configured to measure the amount of pressure being applied to the touch screen surface 105.


The touch-based user interface 103 may further include a transmitter 107 that is communicatively coupled to the processing device 160. The transmitter 107 may be configured to transmit signals to the haptic input device 101 over a wired or a wireless connection. In one embodiment, the signals transmitted by the transmitter 107 may be radio-frequency (RF) or infrared (IR) signals. However, in other embodiments, the command signals may be other types of electromagnetic signals. For example, the command signals may be microwave signals, radio signals, and so on and so forth. In one embodiment, the transmitted signals may be generated in response to the gesture information received from the input sensors 164. As will be further described below, the transmitted signals may be encoded with a control signal for driving a haptic actuator or a look-up value that corresponds to a control signal in a look-up table stored in a storage device of the haptic input device 101.
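
As a rough illustration, the encoding step might look like the following sketch, which frames either a full control signal or a compact look-up value behind a one-byte type tag. The frame layout is an assumption made for illustration (and is mirrored by the decoder sketch later in this section); the patent does not define a wire format.

```python
# Assumed frame layout: [type tag][payload]; not specified by the patent.
FRAME_CONTROL = 0x01
FRAME_LOOKUP = 0x02

def encode_control(control_signal: bytes) -> bytes:
    """Frame a raw control signal for transmission to the stylus."""
    return bytes([FRAME_CONTROL]) + control_signal

def encode_lookup(value: int) -> bytes:
    """Frame a look-up value (two bytes here) for transmission."""
    return bytes([FRAME_LOOKUP]) + value.to_bytes(2, "big")
```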


In some embodiments, the touch-based user interface 103 may also include an optional receiver 166 communicatively coupled to the processing device 160. The receiver 166 may be configured to receive signals from the haptic input device 101 over a wireless or a wired connection. In one embodiment, the received signals may include information obtained by one or more input sensors of the haptic input device 101. As will be further discussed below, the information may include acceleration information, orientation information, pressure information, tilt information and so on.



FIG. 3 illustrates one embodiment of a haptic input device 101 that can be used in conjunction with the system 100 illustrated in FIG. 1. As shown in FIG. 3, in one embodiment, the haptic input device 101 may be a stylus. The haptic input device 101 may include one or more engagement portions or tips 111 configured to contact (and to register contact on) the touch screen surface 105, a receiver 119, a decoder 112, a controller 113, one or more haptic actuators 114, one or more optional sensors 116 configured to sense various characteristics of the user's manipulation of the haptic input device 101, and an optional transmitter 118. Some examples of sensors 116 that may be used in conjunction with various embodiments of the haptic input device 101 are described below. As shown in FIG. 3, the haptic input device 101 may also include a power source 115 configured to supply power to the controller 113 and/or the haptic actuator 114. The power source 115 may be a battery or some other type of power supply, such as a power adapter, an electromechanical system such as a generator or an alternator, a solar power cell, and so on and so forth.


In one embodiment, the tip 111 may be formed from a conductive material, such as metal, or from a non-metallic conductive material, such as graphite, various salts, plasmas, and so on. In other embodiments, the tip 111 may be formed from a nonconductive material. The tip 111 may include a portion configured to contact the touch screen surface 105. This portion may be pointed, as shown in FIG. 3, or may be blunt. In another embodiment, the tip may be configured as a ball that can roll along the touch screen surface 105 so that different portions of the tip 111 may contact the touch screen surface 105.


The tip 111 may be communicatively coupled to a receiver 119. The receiver 119 may be any type of wireless or wired receiver that is configured to receive signals from the one or more transmitters 107 of the touch-based user interface device 103. As alluded to above, the signals may include a haptic response based on the touch-based input gestures received by the input sensors of the touch screen surface 105. In one embodiment, the receiver 119 may be configured to receive wireless signals that are wirelessly transmitted by the one or more transmitters 107. The wireless signals may be transmitted using any type of wireless transmission medium, including, but not limited to, Wi-Fi, Bluetooth, IR, RF, and so on and so forth. In other embodiments, the stylus 101 may be coupled to the touch-based user interface device 103 via a wired connection, and the receiver 119 may receive the signals from the transmitters 107 over the wired connection.


As shown in FIG. 3, the receiver 119 may be communicatively coupled to a decoder 112 that is configured to decode the signals received from the touch-based user interface device 103. In one embodiment, the touch-based user interface device 103 may modulate the signal to include a control signal for driving the haptic actuator 114 or a look-up value corresponding to a control signal for driving the haptic actuator 114, and the decoder 112 may include or take the form of a demodulator that is configured to demodulate the signal to derive the control signal and/or look-up value. If the signal is modulated with a control signal, then the decoder 112 may demodulate the signal to obtain the control signal, and transmit the control signal to a controller communicatively coupled to the decoder. In contrast, if the signal is encoded with a look-up value, the decoder 112 may process the signal to obtain the look-up value, and access a waveform memory to obtain the control signal corresponding to the look-up value. This process will be further explained below with respect to FIG. 5.
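
A hedged sketch of this decoding step appears below, mirroring the hypothetical one-byte frame tag introduced earlier: a received frame either carries a control signal directly or carries a look-up value that is resolved against local waveform memory.

```python
# Sketch of the decoder dispatch; the frame layout is an assumption.
FRAME_CONTROL = 0x01
FRAME_LOOKUP = 0x02

def decode_frame(frame: bytes, waveform_memory: dict):
    """Return the control signal carried by, or referenced from, the frame."""
    frame_type, payload = frame[0], frame[1:]
    if frame_type == FRAME_CONTROL:
        return payload                      # payload is the control signal itself
    if frame_type == FRAME_LOOKUP:
        key = int.from_bytes(payload, "big")
        return waveform_memory.get(key)     # None if the look-up value is unknown
    raise ValueError(f"unrecognized frame type {frame_type:#x}")
```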


The controller 113 may receive the control signal from the decoder 112 and transmit the control signal to drive the one or more haptic actuators 114. The controller 113 may be hardware, such as a chip or an extension card configured to interface with the actuator 114, or may be software or firmware configured to manage the operation of the actuator 114.


The haptic actuator 114 may be configured to generate various types of haptic feedback based on the commands received from the controller 113. Some examples of haptic actuators 114 that may be used in conjunction with embodiments of the haptic input device 101 include electromagnetic actuators, solenoid actuators, piezoelectric actuators, electroactive polymers, vibration motors, contactless actuators configured to provide electrovibratory, electrostatic and/or electrocutaneous output, and so on and so forth. For example, in one embodiment, the haptic actuator 114 may be a weight that can move axially along the shaft of the input device 101 in response to a control signal to generate a click, two clicks, vibration, and so on. In some embodiments, the haptic input device 101 may include multiple actuators 114 that are each configured to emit a different type of feedback or feedback to a different portion of the haptic input device. Other embodiments may only include a single haptic actuator 114 configured to provide a single type of feedback, or a single haptic actuator 114 configured to provide multiple types of feedback.


The haptic actuator 114 may further be configured to generate different types of haptic feedback based on touch-based input gesture information received by the touch-based user interface device 103. For example, the haptic actuator 114 may be configured to vibrate to represent an alarm, or may simulate a single or a double click to confirm the selection of a button or an icon. In another embodiment, the actuator 114 may be configured to simulate resistances or motions of the input device 101 on the touch screen surface 105. For example, the actuators 114 may be configured to simulate the feeling of moving a pen or a paintbrush across a piece of paper or a canvas. This may be accomplished by a single actuator 114 that is configured to generate different types of forces to create different types of feedback, or by multiple actuators 114 that are each communicatively coupled to the controller 113. For example, the frequency, intensity and/or duration of a haptic output waveform may be shaped to provide a particular feel to the user. A high frequency, continuous output may emulate the sensation of moving a pen across a smooth surface while a lower frequency signal may emulate the feel of moving a pen across a rougher surface. The output signal may be discontinuous to emulate the feel of a rough or bumpy surface.
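
The waveform shaping described here might be sketched as follows, with a high-frequency continuous waveform for a smooth-paper feel, a lower frequency for a rougher surface, and a gated envelope for a bumpy one; the frequencies and gating window are illustrative assumptions.

```python
# Sketch of texture-dependent drive waveforms; parameter values assumed.
import math

def haptic_waveform(texture: str, duration_s: float = 0.1,
                    sample_rate: int = 1000) -> list[float]:
    """Generate actuator drive samples emulating a surface texture."""
    freq = {"smooth": 250.0, "rough": 60.0, "bumpy": 60.0}[texture]
    samples = []
    for n in range(int(duration_s * sample_rate)):
        t = n / sample_rate
        s = math.sin(2 * math.pi * freq * t)
        if texture == "bumpy" and int(t * 40) % 2:
            s = 0.0          # 25 ms off-windows make the output feel discontinuous
        samples.append(s)
    return samples
```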


In some embodiments, the haptic actuator 114 may further generate haptic feedback that can be felt by the nerves of a user's fingers without physically moving the body of the haptic input device 101. For example, the haptic actuators 114 may emit electrostatic signals that penetrate the housing of the input device 101 to stimulate the user's fingers. The electrostatic signals may stimulate various nerves in the user's fingertips, thereby allowing the user to feel a tactile sensation when holding the haptic input device 101.


In one embodiment, the controller 113 may further be communicatively coupled to one or more optional local sensors 116. The optional sensors 116 may be configured to sense various parameters based on the user's manipulation of the haptic input device 101. For example, in one embodiment, the sensors 116 may be configured to sense motion of the haptic input device 101. Continuing the example, one sensor 116 may be an accelerometer configured to detect the magnitude and/or direction of acceleration of the tip 111 of the haptic input device 101. In addition to or instead of the use of an accelerometer, the sensor 116 may be a gyroscope (or other suitable sensor) configured to measure the angle of the haptic input device 101. The sensors 116 may be communicatively coupled to the controller 113, and may transmit information to the controller 113 regarding the sensed characteristics, which may include acceleration and/or orientation information. The controller 113 may be configured to transmit corresponding control commands to the haptic actuator 114 based on the information received from the sensors 116.


The haptic device's angular measurement may be coupled with a measurement of the angle of the touch screen surface 105. The angle of the touch screen surface may be derived, for example, from a gyroscope, multi-axis accelerometer or other suitable sensor within the user interface device 103. The two angular measurements may be used together to determine a relative angle of the haptic input device with respect to the touch screen surface or user interface device. Given each angle, either the haptic device or the user interface device (or a computing device associated with either) may relatively easily and quickly determine the relative angle. “Computing devices” may include a laptop computer, desktop computer, server, tablet computer, smart phone, personal digital assistant, and so on.
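
A minimal sketch of combining the two angular measurements might look like the following; it assumes each device reports a single tilt angle from vertical in degrees, whereas real devices would more likely exchange full three-dimensional orientations.

```python
# Assumes one tilt angle per device; real systems would use 3-D orientations.
def relative_angle(stylus_tilt_deg: float, screen_tilt_deg: float) -> float:
    """Angle of the haptic input device relative to the touch screen surface."""
    return stylus_tilt_deg - screen_tilt_deg

# Example: a stylus tilted 50 degrees over a tablet propped up at 20 degrees
# is at a 30-degree angle relative to the screen.
assert relative_angle(50.0, 20.0) == 30.0
```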


The optional sensors 116 may include one or more force or pressure sensors. The force sensors may be configured to sense various forces being applied on the haptic input device 101 by the user. As alluded to above, in one embodiment, the force sensors may be configured to detect the amount of pressure being applied on the touch screen 105 by the tip 111 of the haptic input device 101. In another embodiment, the force sensors may be configured to detect the amount of pressure being applied to the haptic input device 101, for example, by a user gripping the device. The sensors 116 may transmit the pressure information to the controller 113, which may transmit corresponding control commands to the haptic actuator 114 based on the received pressure information. Accordingly, in one embodiment, the haptic feedback may be varied according to whether pressure is being applied to the haptic input device 101 and/or the amount of pressure applied to the haptic input device 101. Similarly, the entire surface of the haptic input device may sense pressure and/or capacitive changes resulting from a user gripping the instrument. In such embodiments, the pressure/capacitive sensing may be used to selectively apply haptic feedback only to those portions of the device being gripped. This may be accomplished, for example, by incorporating multiple haptic actuators into the haptic input device such that each actuator provides haptic feedback for a specific portion of the device.
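
As a sketch of this selective feedback, assuming one pressure sensor paired with one actuator per barrel segment (a layout the patent leaves open):

```python
# Assumed layout: sensor i and actuator i share barrel segment i.
GRIP_THRESHOLD = 0.15   # illustrative normalized pressure threshold

def actuators_to_drive(grip_pressures: list[float]) -> list[int]:
    """Return indices of barrel segments whose sensors detect a grip."""
    return [i for i, p in enumerate(grip_pressures) if p >= GRIP_THRESHOLD]

# Example: only segments 1 and 2 are being gripped.
assert actuators_to_drive([0.0, 0.4, 0.3, 0.05]) == [1, 2]
```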


In some embodiments, the haptic input device 101 may further include an optional transmitter 118 that is communicatively coupled to the one or more sensors 116. The transmitter 118 may be configured to receive information regarding the sensed parameters and transmit the information to the touch-based user input device 103 through either a wired or a wireless connection. Accordingly, the touch-based user input device 103 may adjust an output of the touch screen 105 based on the information obtained by the one or more sensors 116. In one embodiment, acceleration information from an accelerometer in the haptic input device 101 may be used to change the output of a graphics creation or editing program. For example, if the acceleration information indicates that the user is moving the haptic input device 101 at a high rate of acceleration, the line created by the graphics editing program may lighten and/or thin out from its starting point to its terminus. In contrast, if the acceleration information indicates that the user is moving the haptic input device 101 at a low rate of acceleration, the line created by the graphics editing program may become darker and/or thicker.


As another example, the output shown on the touch screen can be modified according to the orientation and/or angular information of the haptic device. This information may be captured by a gyroscope in the haptic input device 101, or other suitable sensor. Likewise, the output may depend not only on the absolute angle of the haptic device, but also on the relative angle of the haptic device to the touch screen and/or user input device 103. As mentioned above, this relative angle may be determined from the absolute angle of the haptic device and the absolute angle of the user input device. As one example of how an output may be adjusted, the width of the line created by the graphics editing program may be adjusted according to the tilt of the haptic input device 101 relative to the touch screen 105 to simulate writing with a calligraphy pen or painting with a paint brush. Additionally, the angle and/or thickness of the line may be adjusted according to the tilt of the haptic input device 101 relative to the touch screen 105, with a higher tilt corresponding to the creation of a more slanted, thicker or angled line, for example. (Alternative embodiments may vary the effect of the haptic input device's tilt angle on an output generated by the user input device.) Thus, a single haptic device may be used to create a line of varying thickness or depth of color in a single stroke, or another output that varies physically or temporally in response to changes in pressure, capacitance, angle and the like during a continuous input.


The haptic device 101 may have one or more orientation sensors, such as a multi-axis accelerometer, that may determine the axial orientation of the haptic device. Thus, the orientation sensor may detect when the haptic device 101 rotates. This rotational information may also be used to vary an input from the haptic device or an output shown on a display in response to the haptic device's input. Rotating the haptic device may, for example, be substituted for certain conventional input gestures such as clicking or tapping. Likewise, an output may be varied as the haptic device is rotated. A line may be made thicker or thinner in a graphics program, for example. As another example, rotating the haptic device in one direction may increase an audio volume from an associated device, while rotating the haptic device in another direction may decrease the audio volume.


As another example, the line created by the graphics editing program can be modified according to the pressure information captured by a force sensor in the haptic input device 101. For example, the width and/or darkness of the line created by the graphics editing program may be adjusted according to the amount of pressure being applied onto the haptic input device either by a user's grip or by forcing a tip of the input device onto the touch screen or other surface. In one embodiment, more pressure may correspond to a darker and/or a thicker line, while less pressure may correspond to a lighter and/or thinner line. Likewise, changes in grip pressure may be used to signal different inputs to the touch screen 105/user input device 103.
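
Taken together, the acceleration, tilt, and pressure mappings discussed in the last few paragraphs might be combined into a single stroke-rendering rule along the following lines; the scaling constants are illustrative, not from the patent.

```python
# Hypothetical stroke-rendering rule: acceleration lightens and thins the
# line, tilt widens it (calligraphy-style), pressure darkens and thickens it.
def stroke_style(pressure: float, tilt_deg: float, accel: float) -> dict:
    """Derive line width and darkness from stylus readings (0..1, tilt in degrees)."""
    width = 1.0 + 4.0 * pressure + 0.05 * tilt_deg - 2.0 * accel
    darkness = min(1.0, max(0.0, 0.2 + 0.8 * pressure - 0.5 * accel))
    return {"width_px": max(0.5, width), "darkness": darkness}
```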


It should be appreciated that the haptic feedback provided by the haptic input device 101 may vary with the output of the display device, as described above. Returning to the example of adjusting a line thickness, as the line appears thicker on the screen, a haptic output from the haptic device 101 may likewise increase in frequency, intensity, duration and the like. This haptic feedback may be more readily perceived by the user than the change in line thickness, for example if the haptic input device is moving relatively slowly across the touch screen or other display device and so obscuring recent portions of the graphical line. Thus, the user may perceive the haptic feedback and use it to adjust the input or output generating the feedback (in this case, the line thickness). Accordingly, the haptic feedback may be part of a closed loop with the user input and may provide data to the user that is useful in modifying his input, such as grip strength, pressure on a touch screen and the like.


In another embodiment, the haptic input device 101 may further include an optional audio transmitter, such as a speaker, that is communicatively coupled to the controller 113. The controller 113 may transmit control commands to the speaker based on information received from the sensors 116 and/or the one or more transmitters 107 on the touch-based user input device 103. The output of the speaker may vary based on the activity being simulated, as well as the user's manipulation of the haptic input device 101. For example, in one embodiment, the speaker may simulate the sound of moving a pen or a paintbrush across a piece of paper or a canvas, with the speaker emitting different sounds for emulating a pen or a paintbrush. In another embodiment, the volume of the speaker may be adjusted based on the amount of pressure being applied to the touch screen surface 105. For example, the volume may be gradually increased as the input device 101 applies more pressure to the touch screen surface 105. In other embodiments, the volume and/or sound may be adjusted according to the position of the input device 101 relative to the touch screen surface 105.


It should be appreciated that some embodiments may employ only a haptic input device 101 and no user input device and/or touch screen. For example, the haptic input device 101 may be used with a sheet of paper, flat surface and the like to provide input to an appropriately-configured computing device. In such embodiments, a visual output on a display associated with the computing device, or an audible output from a speaker associated with the computing device, may be varied in the foregoing manners and/or according to the foregoing description.



FIG. 4 is a block diagram showing the transmitter 107 and one or more input sensors 108 of the touch-based user interface device 103 of FIG. 2, as well as the receiver 119, decoder 112, controller 113, optional sensors 116, and haptic actuator 114 of the haptic input device 101 shown in FIG. 3. As shown in FIG. 4, in one embodiment, the transmitter 107 of the touch-based user interface device 103 may be configured to transmit a signal to the receiver 119 of the haptic input device 101. As discussed above, the signal may carry a control signal for driving the haptic actuator 114 or a look-up value corresponding to a control signal. The transmitted signal may be based on information obtained from various input sensors 108 integrated into the touch screen surface 105 of the touch-based user interface device 103, such as the position of the haptic input device 101 on the touch screen surface 105, the amount of pressure being applied by the input device 101 onto the touch screen surface 105, and so on.


The receiver 119 of the haptic input device 101 may transmit the signal received from the transmitter 107 to the decoder 112, which may be configured to decode the signal to obtain either the control signal or the look-up value, and to transmit the control signal or the look-up value to the controller 113. If the decoder 112 transmits a control signal to the controller 113, the controller 113 may transmit the control signal through a driver to the haptic actuator 114, which may generate haptic feedback consistent with the control signal. In contrast, if the decoder 112 transmits a look-up value to the controller 113, the controller 113 may process the look-up value to obtain a control signal corresponding to the look-up value, and transmit the control signal to the haptic actuator 114.


Alternatively, the controller 113 may receive a signal from one or more optional sensors 116. As discussed above, the one or more optional sensors 116 may be provided in the haptic input device 101, and may sense various parameters of the haptic input device 101. The parameters may include the orientation of the input device 101, the pressure being applied to the input device 101 by the user's fingers, the pressure being applied by the tip of the input device 101 onto the touch screen surface 105, the acceleration of the input device 101 across the touch screen surface 105, and so on and so forth. Upon receiving the signal from the sensors 116, the controller 113 may generate a control signal waveform from the sensor signal and transmit the waveform to the haptic actuator 114, which may generate haptic feedback consistent with the waveform.



FIG. 5 illustrates one embodiment of a controller 113 that may be used in conjunction with the haptic input device 101 shown in FIG. 1. The controller 113 may include a waveform memory 120, a local control device 122, and a local driver device 124 communicatively coupled to the local control device 122 and the waveform memory 120. The waveform memory 120 may be configured to store one or more preprogrammed waveforms, which may produce different haptic feedback responses when processed by a haptic actuator 114.


As alluded to above, the decoder 112 may transmit a look-up value to the controller 113. The look-up value may correspond to one or more control signals stored in the waveform memory 120. In one embodiment, the look-up value may be a series of binary 1's and 0's that can be transmitted over a low data rate transmission link. The look-up value may correspond to an entry in a look-up table storing a corresponding control signal for driving the haptic actuator or a set of corresponding control signals for driving the haptic actuator. Upon receiving the look-up value, the local control device 122 may access the waveform memory 120 to determine whether any of the control signals stored in the waveform memory 120 correspond to the received look-up value. If the local control device 122 determines that the received look-up value corresponds to at least one of the waveforms stored in the waveform memory 120, the control device 122 may access the waveform memory 120 and transmit the corresponding control signal or control signals to the driver device 124. The local control device 122 may further transmit a control command to the driver device 124 to transmit the control signal or control signals from the waveform memory 120 to the haptic actuator 114. In another embodiment, the optional sensors 116 of the haptic input device 101 may transmit the look-up value to the local control device 122.



FIG. 6 illustrates another embodiment of a controller 133 that may be used in conjunction with the haptic input device 101 shown in FIGS. 3 and 4. The controller 133 may include a local control device 142 and a local driver device 144 communicatively coupled to one another. In this embodiment, the touch-based user interface device 103 may transmit a carrier signal modulated with a control signal for driving the haptic actuator 114 (shown in FIG. 3). Further, the decoder 112 may demodulate the control signal from the carrier signal. The control signal may then be transmitted to the local driver 144, which in turn, may transmit the control signal to the haptic actuator 114 upon receiving a corresponding control command from the local control device 142.


Some embodiments may include a controller that is configured to receive both control signals and look-up values. In such embodiments, the controller may further be configured to update the control signals stored in the look-up table and/or the stored look-up values based on signals received from the touch-based user interface device 103. For example, to change or update the haptic feedback associated with a particular look-up value, the controller may receive a look-up value and an updated control signal associated with the look-up value from the touch-based user interface device 103 and replace the control signal stored in the database with the updated control signal.
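
A controller-side waveform memory supporting both look-up and in-place updates might be sketched as follows; keys are the binary look-up values and the stored entries are treated as opaque control signals.

```python
# Sketch of a waveform memory with lookup and update; the byte-string
# representation of control signals is an assumption for illustration.
class WaveformMemory:
    def __init__(self, table: dict[int, bytes]):
        self._table = dict(table)

    def lookup(self, value: int) -> bytes | None:
        """Return the control signal for a look-up value, or None if absent."""
        return self._table.get(value)

    def update(self, value: int, control_signal: bytes) -> None:
        """Replace (or add) the control signal associated with a look-up value."""
        self._table[value] = control_signal
```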



FIG. 7 is a schematic diagram of one possible operation of the system 100 illustrated in FIG. 1. As shown in FIG. 7, the touch-based user interface device 103 may run an operating system 152 supporting one or more software or firmware applications. In one embodiment, the operating system may include a graphical user interface allowing a user to select and run such applications or otherwise interact with the device 103. For example, an active application may be a game, a graphics editing program, a word processing program, and so on. As alluded to above with respect to FIG. 2, the operating system 152 may be stored in the storage device 162 of the touch-based user interface device 103, and accessed and implemented by the processor 160.


The touch-based user interface device 103 may include a touch driver 150 to determine a position and/or pressure exerted by the haptic input device 101 on the touch screen 105. As alluded to above, the position and/or pressure information may be obtained through one or more input sensors 164 of the touch-based user interface device 103. The touch driver 150 may be software stored on the storage device 162 of the touch-based user interface device 103 or may be implemented as a hardware component of the touch-based user interface device 103.


The operating system 152 may then transmit the position and/or pressure information to a haptic driver 156, which may process the position and/or pressure information to determine an appropriate haptic response. For example, in one embodiment, the haptic driver 156 may access a local storage device storing a relational database containing one or more position and/or pressure parameters and one or more corresponding control signals or look-up values. The haptic driver 156 may be software stored in the storage device 162 of the touch-based user interface device 103 or may be a separate hardware component. Upon matching the position and/or pressure information with the appropriate control signal or look-up value, the haptic driver 156 may transmit the control signal or look-up value to the haptic input device 101 via a transmitter 107 of the touch-based user interface device 103.
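
The haptic driver's matching step might resemble the following sketch, in which position and/or pressure parameters are matched against a small relational table whose entries are either full control signals or look-up values; the table contents and field names are assumptions.

```python
# Hypothetical relational table mapping screen regions and minimum
# pressures to responses; not actual contents from the patent.
HAPTIC_TABLE = [
    {"region": "button", "min_pressure": 0.3, "response": ("lookup", 0x02)},
    {"region": "canvas", "min_pressure": 0.1, "response": ("control", b"\x10\x20")},
]

def select_response(region: str, pressure: float):
    """Return the first table entry matching the sensed position and pressure."""
    for entry in HAPTIC_TABLE:
        if entry["region"] == region and pressure >= entry["min_pressure"]:
            return entry["response"]
    return None   # no haptic response for this input
```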



FIG. 8 is a flowchart illustrating one embodiment of a method 800 for providing haptic feedback. For example, the illustrated method 800 may be performed by the processing device 160 of the touch-based user interface device 103 shown in FIGS. 1 and 2. The method 800 may begin by receiving input gestures from one or more input sensors, as indicated in block 802. As discussed above, the input gestures may include swiping, tapping, scrolling, and so on, applied across the touch screen surface 105. The input sensors may include position and/or pressure sensors. The input gestures may be applied by a haptic input device 101. In the operation of block 804, position and/or pressure information may be derived from the input gestures received from the one or more input sensors. The position information may be in x-y coordinate form, while the pressure information may be measured as the force per unit area in a direction perpendicular to the touch screen surface 105.


In the operation of block 806, the processing device may determine whether or not to generate a haptic response based on the position and/or pressure information. In one embodiment, this may involve determining whether the haptic input device 101 is positioned over a selectable item, which can be a button, an icon, a cursor, and so on, displayed on the touch screen surface 105. In another embodiment, the processing device may determine whether the haptic input device 101 is applying a sufficient amount of pressure to the touch screen surface 105. In a further embodiment, the processing device may determine both whether the haptic input device 101 is positioned over a selectable button or icon, as well as whether the device 101 is applying a sufficient amount of pressure to the touch screen surface 105. If, in the operation of block 806, the processing device determines that a haptic response is appropriate, then, in the operation of block 810, the processing device may determine the appropriate haptic response associated with the input gestures. This may be accomplished by referencing a relational database, such as a look-up table, storing one or more input gestures and one or more haptic responses associated with the one or more input gestures. The haptic responses may be in the form of control signals for driving the haptic actuator in the haptic input device 101 or look-up values corresponding to one or more control signals for driving the haptic actuator. In the operation of block 812, the processing device may transmit the one or more control signals (or look-up values) to the haptic input device 101. The method 800 may then proceed back to the operation of block 802, in which the processing device may receive another input gesture.


If, in the operation of block 806, the processing device determines that a haptic response is not appropriate, then, in the operation of block 808, the processing device may not determine a haptic response associated with the received input gesture. In such situations, no haptic response may be emitted from the haptic input device. For example, in one embodiment, the processing device may determine that a haptic response is not appropriate if the pressure applied by the haptic input device 101 onto the touch screen surface 105 is insufficient. In another embodiment, the processing device may determine that a haptic response is not appropriate if the haptic input device 101 is not located in a position that enables selection of an item being displayed on the touch screen. The method 800 may then proceed back to the operation of block 802, in which the processing device may receive another input gesture.
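
A compact, self-contained sketch of method 800 follows; the gesture record, pressure threshold, and response table are illustrative assumptions, since the patent leaves these details open.

```python
# Sketch of method 800 (blocks 802-812) under assumed data structures.
PRESSURE_THRESHOLD = 0.2

RESPONSE_TABLE = {                       # block 810's relational database, simplified
    "tap_on_button": ("lookup", 0x01),
    "slide_on_canvas": ("control", b"\x05\x0a"),
}

def method_800(gesture: dict, transmit):
    """Decide on and transmit a haptic response for one received gesture."""
    pressure = gesture["pressure"]                       # block 804
    over_selectable = gesture["over_selectable"]         # derived position info
    if not over_selectable or pressure < PRESSURE_THRESHOLD:
        return None                                      # block 808: no response
    response = RESPONSE_TABLE.get(gesture["name"])       # block 810
    if response is not None:
        transmit(response)                               # block 812
    return response

# Example usage:
# method_800({"name": "tap_on_button", "pressure": 0.5, "over_selectable": True}, print)
```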



FIG. 9 is a flowchart illustrating another embodiment of a method 900 for providing haptic feedback. For example, the illustrated method 900 may be performed by the controller 113 of the haptic input device 101 shown in FIGS. 1 and 3. The method 900 may begin by receiving a control signal or a look-up value, as indicated in block 902. The control signal or the look-up value may be received from the transmitter 107 of the touch-based user interface device or from one or more input sensors 116 of the haptic input device 101. As discussed above, a control signal may be used to drive the haptic actuator in the haptic input device 101, while a look-up value may correspond to one or more control signals stored in a database. In the operation of block 904, the controller 113 may determine whether a control signal or a look-up value is received. The operation of block 904 is optional. If, in the operation of block 904, the controller 113 determines that a look-up value is received, then, in the operation of block 906, the haptic input device 101 may access the waveform memory, which may include a relational database storing one or more look-up values and one or more corresponding control signals associated with the look-up values. In the operation of block 908, the controller 113 may determine whether the received look-up value matches at least one of the stored look-up values. If, in the operation of block 908, the controller 113 determines that there is a match, then, in the operation of block 910, the controller 113 may transmit the control signal matching the look-up value to the haptic actuator. If, in the operation of block 908, the controller 113 determines that there is no match, then, in the operation of block 914, the controller 113 may not transmit a control signal to the haptic actuator. The method may then return to the operation of block 902, in which the controller 113 may receive a control signal or a look-up value.


If, in the operation of block 904, the controller 113 determines that a control signal is received, then, in the operation of block 912, the controller 113 may transmit the control signal to the haptic actuator. The method may then return to the operation of block 902, in which the controller 113 may receive a control signal or a look-up value.
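
Similarly, method 900 might be sketched on the controller side as follows; a received message is either a direct control signal or a look-up value resolved against the waveform memory, and an unmatched look-up value produces no output (block 914). The message encoding is an assumption.

```python
# Sketch of method 900 (blocks 902-914); memory contents are illustrative.
WAVEFORM_MEMORY = {0x01: b"\x01\x02\x03", 0x02: b"\x04\x05\x06"}

def method_900(kind: str, payload, drive_actuator) -> bool:
    """Route a control signal or look-up value to the actuator."""
    if kind == "control":                      # block 904: control signal received
        drive_actuator(payload)                # block 912
        return True
    signal = WAVEFORM_MEMORY.get(payload)      # blocks 906-908: consult memory
    if signal is None:
        return False                           # block 914: no match, no output
    drive_actuator(signal)                     # block 910
    return True
```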


The order of execution or performance of the methods illustrated and described herein is not essential, unless otherwise specified. That is, elements of the methods may be performed in any order, unless otherwise specified, and the methods may include more or fewer elements than those disclosed herein. For example, it is contemplated that executing or performing a particular element before, contemporaneously with, or after another element are all possible sequences of execution.


Although the haptic device 101 is generally described above as operating in conjunction with a touch screen, it should be appreciated that certain embodiments of the haptic device may be configured to operate with any surface or even while held in the air (e.g., without the tip touching anything). For example, location sensors (accelerometers, gyroscopes, and the like) may be used to determine a position and motion of the haptic device. A pressure sensor in the tip may initiate input to an associated computing device when the haptic device is pressed against a surface. Alternatively, one or more pressure sensors may be located along the barrel of the haptic device and measure pressure exerted thereon by a user's grip. Input may be provided from the haptic device to the computing device when the pressure on the barrel exceeds a certain threshold. Still other embodiments may omit pressure sensors entirely and function to provide input only when the haptic device is in an on state. Essentially, the haptic device may be configured to provide any functionality described herein without requiring a touch screen to accept that input.


In addition, it should be appreciated that the haptic input device 101 may generate haptic feedback based on the touch screen sensing a touch by an object other than the haptic input device. For example, if a finger touches the touch screen, the haptic device may output haptic feedback to indicate the touch. Further, the gesture or other input provided by the object may vary the nature of the haptic feedback. Continuing the example, a finger moving up and down on a touch screen to scroll a webpage or other document may provide a sustained “swoosh” of haptic feedback from the haptic input device. The same finger tapping the touch screen may cause the haptic input device to output a single thump.
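
One way to picture this gesture-dependent feedback is as a simple mapping from recognized gestures to stored haptic responses; the gesture names and waveform identifiers below are illustrative assumptions, not terms from the disclosure.

```python
# Hypothetical mapping from sensed touch gestures to haptic responses.
GESTURE_TO_HAPTIC = {
    "scroll": "sustained_swoosh",   # continuous feedback while scrolling
    "tap":    "single_thump",       # one short pulse per tap
}

def haptic_for_gesture(gesture):
    # Unrecognized gestures produce no feedback in this sketch.
    return GESTURE_TO_HAPTIC.get(gesture)
```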


Certain embodiments may support the operation and detection of multiple haptic input devices with a single user input device. For example, a single touch screen may detect impacts from multiple haptic input devices. Further, in such embodiments an input from a haptic input device may produce a haptic output in a different haptic input device. That is, a first haptic device interacting with the touch screen (or the like) may cause a second haptic input device to output haptic feedback. This may be useful in various multi-person endeavors, such as collaborative editing, game playing and the like.
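
A host-side routing sketch for this multi-device case might look as follows; the HapticHost class and its callbacks are hypothetical stand-ins for whatever registration and transport mechanism an embodiment actually uses.

```python
# Hypothetical routing for multiple styluses on one touch screen: an
# input from one device triggers haptic output on the others.
class HapticHost:
    def __init__(self):
        self.devices = {}            # device_id -> transmit callback

    def register(self, device_id, send):
        self.devices[device_id] = send

    def on_input(self, source_id, lookup_value):
        # Notify every stylus except the one that produced the input.
        for device_id, send in self.devices.items():
            if device_id != source_id:
                send(lookup_value)

host = HapticHost()
host.register("stylus_a", lambda v: print("A buzzes:", v))
host.register("stylus_b", lambda v: print("B buzzes:", v))
host.on_input("stylus_a", 0b0010)    # stylus B receives the haptic cue
```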


Still another embodiment may permit communication between multiple haptic input devices and/or multiple user input devices. For example, four people may each have their own haptic input device and touch screen/user input device. Each user action with his or her user input device may cause one or more of the other persons' haptic input devices to produce a haptic output. Such embodiments may also be useful in multi-person activities such as gaming.


The foregoing discussion provides example embodiments, methods, and operations of a haptic device. Accordingly, these should be understood as examples only and not as limitations. The proper scope of protection is defined by the following claims.

Claims
  • 1. A haptic input device held by a user, comprising: a housing defining a tapered input end to interact with a screen of a touch-based user interface device, the housing enclosing: a receiver within the tapered input end and configured to wirelessly receive a signal from the touch-based user interface device, the received signal corresponding to a touch input gesture provided to the screen by the user and detected by the touch-based user interface device; a decoder to extract an input signal from the received signal; a controller to generate a control signal based on the input signal; a haptic actuator to provide a first haptic feedback response corresponding to the control signal; a sensor to determine an orientation of the haptic input device relative to the touch-based user interface device; and a transmitter to transmit the orientation to the touch-based user interface device; wherein the haptic actuator is configured to provide a second haptic feedback response based on the orientation.
  • 2. The haptic input device of claim 1, wherein the sensor comprises an accelerometer.
  • 3. The haptic input device of claim 1, wherein the sensor comprises a pressure sensor.
  • 4. The haptic input device of claim 1, wherein the sensor comprises a gyroscope.
  • 5. The haptic input device of claim 1, wherein the received signal comprises an analog signal to drive the haptic actuator.
  • 6. The haptic input device of claim 1, wherein the received signal comprises a look-up value corresponding to a control signal configured to drive the haptic actuator.
  • 7. The haptic input device of claim 6, wherein the look-up value comprises at least one binary value.
  • 8. The haptic input device of claim 6, wherein the controller comprises a waveform memory storing one or more look-up values and one or more control signals, each corresponding to one of the one or more look-up values.
  • 9. The haptic input device of claim 1, further comprising an engagement portion configured to engage an exterior surface of the screen.
  • 10. A method for providing haptic feedback, comprising: detecting an input gesture by at least one input sensor of a touch-sensitive input device, the input gesture performed by a stylus in contact with a touch screen surface of the touch-sensitive input device; deriving, by a processor within the touch-sensitive input device, an orientation characteristic of the stylus; receiving, by the processor, a signal from the stylus comprising an acceleration characteristic of the stylus; determining, by the processor, a haptic response based on the orientation characteristic, the acceleration characteristic, and the input gesture; and transmitting a haptic output signal to the stylus, the haptic output signal comprising a look-up value corresponding to the haptic response, the haptic output signal instructing the stylus to provide the haptic response.
  • 11. The method of claim 10, wherein the look-up value comprises a series of binary values.
  • 12. The method of claim 10, further comprising: receiving an external signal from an external user input device; and determining the haptic response based at least partially on the external signal.
  • 13. The method of claim 12, wherein the haptic response provides an output to a user signifying an interaction between a second haptic input device and the external user input device.
  • 14. The method of claim 10, wherein: the input gesture is a first input gesture; and the method further comprises: detecting a second input gesture provided by an object in contact with the touch screen surface; and determining the haptic response based at least partially on the second input gesture.
  • 15. The method of claim 10, further comprising: determining a pressure exerted on the stylus by a user; and using the pressure to modify the haptic response.
  • 16. A haptic input device, comprising: a housing comprising an input end and a sensor end, the input end configured to contact a touch-sensitive surface of a user interface device; a receiver within the input end and configured to receive a modulated signal from the user interface device, the modulated signal corresponding to a gesture input provided by the input end to the touch-sensitive surface; a demodulator in communication with the receiver and configured to extract an input signal from the modulated signal; a sensor within the sensor end and configured to provide sensor data based on an orientation characteristic of the haptic input device relative to the user interface device; a controller in communication with the demodulator and the sensor and configured to generate a control signal based in part on the input signal and in part on the sensor data; a haptic actuator coupled to the controller and configured to provide a haptic feedback response based on the control signal; and a transmitter configured to transmit the sensor data to the user interface device such that an output of the user interface device is modified based on the sensor data; wherein the sensor end is opposite the input end.