Touch interface devices can include computing devices having touch sensitive surfaces used to receive input from operators of the devices. For example, many smart phones, tablet computers, and other devices have touch sensitive screens that identify touches from operators as input to the devices. Haptic or tactile feedback from such screens has emerged as a highly sought feature.
Effective mechanisms for producing such a physical sensation have been lacking. Some known mechanisms include vibrating the entire device, while in others the screen is tapped or “popped”, or the screen is shimmied laterally. Interesting haptic effects can be produced, but the effects fall short of the kind of tactile sensations that one encounters in touching an actual textured surface, or a device that has physical buttons or ridges or other physical haptic features.
Buttons in particular are a high priority. In touching a real button, a user's fingers are sensitive to the edges of the button, so that the location of the button is evident and the user has confidence, without looking, of being properly registered or aligned to the button.
Touch is an “active sense,” as it is fundamentally an interplay of the user's motion with the sensations received. Touch is seldom employed without motion. The sensation of touching a button or another feature—such as a ridge, bump, or curve—may involve several modes of touch, which are generally used in combination.
A first mode is due to the pattern of force indenting the surface of the fingertip. This can be thought of as a static phenomenon, as, in principle, one could perceive a pattern just by pressing a fingertip into contact with a surface. In practice the perception of a pattern is enhanced by sliding the fingertip across it, much as a reader of Braille slides a finger across a Braille character, rather than pressing a finger onto it.
An additional mode is the guiding of fingertip motion that an edge or pattern presents. This mode seems to require (not just be enhanced by) motion of the fingertip. A sensation of letting the surface guide the finger's motion is experienced. An example is following a ridge line, display edge, or the edge of a button that is large compared to the fingertip. Arrays of controls (buttons and switches) in vehicles present many such haptic features, to reduce reliance on vision. Other devices with which one wishes to become haptically familiar also tend to have strong haptic features, e.g. musical instruments.
Additionally, lateral forces may be perceived even when there is no ongoing finger motion at a given moment. For instance, a user may have pushed a finger up against a button edge or haptic feature, and left it in contact there, so that a lateral force continues to push back.
In accordance with one embodiment, a method for applying force from a surface to an object (such as a user's finger) is provided. The method includes moving the surface in one or more lateral directions along the surface, wherein the moving in the one or more lateral directions is performed periodically at a frequency of at least about 1 kiloHertz. The method also includes periodically moving the surface in at least one angled direction that is at least one of obliquely or perpendicularly angled to the surface. The surface at least one of articulates into and out of contact with the object or varies in degree of engagement with the object. The method further includes controlling the moving in the one or more lateral directions and the moving in the at least one angled direction to impart a force that is oriented along the surface, wherein the force is configured to provide a haptic output to an operator of a device that includes the surface.
In another embodiment, a touch interface device is provided. The touch interface device includes a touch surface configured to be engaged by an object. The touch interface also includes a first actuator assembly operably connected to the touch surface. The first actuator assembly is configured to displace the touch surface in one or more lateral directions along the touch surface at a first frequency that is at least about 1 kiloHertz. Further, the touch interface includes a second actuator assembly operably connected to the touch surface. The second actuator assembly is configured to displace the touch surface in an angled direction that is at least one of obliquely or perpendicularly angled to the touch surface at a second frequency, which may be close to or the same as the first frequency, and may vary in phase with respect to the first frequency. The touch interface device also includes a controller operably connected with the first and second actuator assemblies. The controller is configured to operate the first and second actuator assemblies so that the touch surface varies in engagement with the object to impart a force on the object that is along the touch surface.
In another embodiment, a tangible and non-transitory computer readable storage medium for a system that includes a processor is provided. The computer readable storage medium includes one or more sets of instructions configured to direct the processor to control a first actuator assembly to move a touch surface in one or more lateral directions along the touch surface, wherein the first actuator assembly moves the touch surface in the one or more lateral directions periodically at a frequency of at least about 1 kiloHertz. The processor is also directed to control a second actuator assembly to move at least a portion of the touch surface in one or more angled directions that are at least one of obliquely or substantially perpendicularly angled to the touch surface. The second actuator assembly moves the touch surface periodically. The processor is further directed to control motion in the one or more lateral directions and motion in the one or more angled directions to impart a force on the object along the touch surface, wherein the force is configured to provide haptic output to an operator of a device that includes the touch surface.
The subject matter described herein will be better understood from reading the following description of non-limiting embodiments, with reference to the attached drawings, wherein below:
Embodiments of the present inventive subject matter provide for improved performance in haptic or tactile sensations provided by, for example, a surface such as a touch screen. In embodiments, motion in at least one direction that is substantially co-planar with the screen is combined with motion in at least one of an oblique direction or a direction that is substantially perpendicular to the surface. The motions are synchronized or controlled to provide a sensation of lateral movement of the surface against an object, such as a finger or other appendage, positioned proximate to the screen.
For example, a vertical motion (substantially perpendicular to the screen, or, as another example, substantially perpendicular to a touch pad) may bring the screen into and out of contact with a finger, while the lateral motion of the surface is controlled so that the lateral motion is experienced in a chosen direction when the surface is at or near a vertical peak (with the surface contacting the finger), with movement in other lateral directions occurring when the surface is not in contact with the finger, and thus not experienced or sensed.
In other embodiments, the vertical or oblique movement may be such that the degree of engagement of the surface with an object, such as a finger, is varied. For example, a lateral force in a desired direction may be imparted by controlling the motions such that the surface is moving in the desired lateral direction at or near a point of maximum engagement, while moving in another direction at a point of minimum engagement, whereby any motion may be imperceptible to a human at or near the point of minimum engagement.
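The engage-and-push principle described above can be illustrated with a minimal numerical sketch, assuming a sinusoidal lateral motion and a vertical motion that modulates engagement (all names and parameter values here are illustrative assumptions, not taken from any disclosed embodiment):

```python
import math

# Illustrative sketch (parameters not from the disclosure): a point on the
# surface moves laterally as x(t) = sin(w*t) and vertically as
# z(t) = sin(w*t + phi). Engagement with the finger is approximated as
# proportional to max(z, 0), so lateral velocity is only "felt" while the
# surface is raised against the finger.

def net_lateral_push(phi, steps=10000):
    """Average lateral velocity weighted by engagement over one cycle."""
    total = 0.0
    for i in range(steps):
        t = 2 * math.pi * i / steps
        vx = math.cos(t)                         # lateral velocity (unit amplitude)
        engagement = max(math.sin(t + phi), 0.0) # contact only when raised
        total += vx * engagement
    return total / steps

# When the vertical peak coincides with peak +x velocity (phi = pi/2, a
# circular orbit), the cycle-averaged push is positive; reversing the phase
# reverses the push; with the peaks misaligned (phi = 0) the push cancels.
print(round(net_lateral_push(math.pi / 2), 3))   # 0.25
print(round(net_lateral_push(-math.pi / 2), 3))  # -0.25
```

Only the portion of the lateral cycle that coincides with engagement contributes, so the cycle average is a one-sided push whose sign and magnitude are set by the phase relationship.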
Further, the motions may be periodic at a frequency substantially high enough so that the periodicity is substantially imperceptible to human detection. The frequency, for example, may be ultrasonic, so that the vibration is also not heard.
Thus, embodiments provide a net force in a selected direction or directions. Further, embodiments provide for the perception of a force that can be applied to a finger (or other object) that is stationary, or even moving in a similar direction as the force. In some embodiments, vibrations of sufficiently high frequency are employed that the vibrations are not tactilely perceptible and/or not audible, so that the imparted force is experienced as a constant force for the duration of the movement.
In the illustrated embodiment, the interface device 100 includes an outer housing 104 disposed around the touch surface 102. The interface device 100 uses motion of the touch surface 102 along two or more axes to generate a net force on a human fingertip that is perceived by a person utilizing the interface device 100. The interface device 100 can include a processor 106 that operates based on one or more sets of instructions stored on a tangible and non-transitory computer readable storage medium (e.g., software stored on computer memory or hard wired logic) to move the touch surface 102. In one embodiment, the motions of the touch surface 102 are provided along one or more axes that lie substantially in the plane of the touch surface 102, and also along one or more axes that are not in the plane of the touch surface 102, such as along an axis that is perpendicular to the plane of the touch surface 102 and/or an axis that is obliquely angled to the plane of the touch surface 102. The motion of the touch surface 102 along one or more axes within the plane of the touch surface 102 (or, for example, along one or more directions generally along a curved touch surface) may be referred to herein as lateral motions or lateral vibrations, or planar motions or planar vibrations, of the touch surface 102. The motion of the touch surface 102 along one or more axes that are not in the plane (or along the surface) of the touch surface 102 may be referred to as oblique motions, oblique vibrations, vertical motions, vertical vibrations, perpendicular motions, or perpendicular vibrations. Also, while the terms “vibrate” and “vibratory” may be used herein to describe the motion of the touch surface 102, the touch surface 102 may be moved in other ways that do not involve vibration of the touch surface 102.
As described in more detail below, the lateral (or planar) motion and vertical (or non-planar) motion of the touch surface 102 can be used in conjunction with each other to move one or more points of the touch surface 102 in an orbit. The term “orbit” refers to the two-dimensional or three-dimensional path taken by one or more points of the touch surface 102. Based on a variety of factors, including the amplitude, frequency, and phase relationships of the lateral motions and the vertical motions, the touch surface 102 can impart a net force on one or more fingers that engage the touch surface 102. This net force can be a generally lateral force on the fingers and may be used to generate one or more haptic effects of the touch surface 102.
The net force is referred to herein as being in a lateral (or planar) direction or being generally lateral in that the force may have a vertical or non-planar component, but is experienced as a lateral force by the object engaging or contacting the touch surface 102. For example, the vertical motion may be used to change the engagement of the object with the surface, so that only during a portion of the orbit of a point on the screen is it applying a force to the object. The engagement may be changed by bringing the surface into and out of contact with the object, or the level or degree of engagement may be changed. For example, at or near a maximum level of engagement, the surface may be sufficiently urged into the object so that the corresponding lateral movement at that portion of the orbit is applied to the object as a net force.
In the illustrated embodiment, the touch surface 102 is depicted as a single continuous surface. In other embodiments, the touch surface 102 may comprise a series of separate surfaces arranged as, for example, columns or rows, that are separately articulable with respect to each other.
The actuators 200 may include, for example, piezoelectric elements, electromagnetic elements, or electrostatic elements that induce motion of the touch surface 102. Alternatively, one or more of the actuators 200 may be another type of actuator that moves the touch surface 102. The reaction masses 202 provide bodies against which the actuators 200 may push to move the touch surface 102. For example, piezoelectric actuators 200 may be energized and expand to push against the reaction masses 202 and move the touch surface 102 in an opposite direction from a reaction mass being pushed against. As another example, electrostatic actuators 200 may be energized to generate an electric field that pushes the actuators 200 away from or toward the corresponding reaction masses 202 to move the touch surface 102. In the illustrated embodiment, the actuators are depicted as being substantially co-planar with the touch surface and exerting forces that are substantially co-planar with the touch surface. In alternate embodiments, other arrangements may be employed. For example, linkages or other mechanisms may be employed to allow the actuators to be located beneath the touch surface. In such embodiments, the actuators may exert forces on the linkages or other mechanisms that are substantially parallel to the touch screen, or at a different angle, such as substantially perpendicular to the touch screen.
The reaction masses 202 may be mounted, for example, directly or indirectly to a housing, such as the housing 104 (shown in
The actuators 200 are controlled to move the touch surface 102 in a variety of different directions, or along different paths. For example, the actuators 200a and 200c may become energized to move the touch surface 102 in a downward direction 210 and an upward direction 212, respectively, as seen from the perspective of
As shown in the embodiment of
The resulting path of motion along the plane of the touch surface 102 that is produced by one or more of the actuators 200 may be linear along a single or a varying axis, or the motion may be circular or elliptical. In the example shown in
For example, by energizing the actuator 200a to move the touch surface 102 downward (along direction 210) at the same time as energizing the actuator 200b to move the touch surface 102 leftward (along direction 214), the overall resulting motion will be down and to the left. By varying the selected actuator or actuators as well as the level of energization of the selected actuator or actuators, paths such as lines, circles, ellipses, or other paths may be traversed by the point 206.
The actuators 200 may laterally move the touch surface 102 in a rapid manner to laterally vibrate the touch surface 102 in various directions. The frequency at which the actuators 200 laterally vibrate the touch surface 102 may be relatively large such that movement of the touch surface 102 is not audible to a human operator of the interface device 100. Further, the frequency in embodiments is selected so that the vibration, or oscillation, of the touch screen is substantially imperceptible to human detection, resulting in a perceived sensation of a constant force or urging in a given direction or directions. In embodiments, the frequency of lateral vibrations is at least about 1 kiloHertz (kHz). In other embodiments, the frequency of lateral vibrations of the touch surface 102 may be at least 20 kHz. In other embodiments, the frequency of the lateral vibrations may be at least 30 kHz.
To reduce or minimize power consumption of a power source that energizes the actuators 200 (e.g., an internal battery or external power source), the use of resonance in vibrating the touch surface 102 may be used so that vibrational energy is not excessively dissipated. For example, a compliant mounting can be used to mount the touch surface 102 in the interface device 100 that, in combination with the mass of the touch surface 102, causes the touch surface 102 to resonate at a desired frequency. Further still, in embodiments, the reaction masses 202 may be denser and/or smaller than the touch surface 102 and the oscillations of the touch surface 102 may be symmetrized in the manner of a tuning fork, so that vibrations do not pass beyond the mounting structure, into for instance the outer housing 104 (shown in
The actuators 200 may also be coordinated to achieve a “focusing” of vibrational energy at selected locations on the touch surface 102, for example by a technique known as “time reversal.” Vibrational energy may, for instance, be focused on the locations where the fingers are touching the surface 102. The locations of focus may track the locations of the fingertips. In this way, there would be greater vibrational energy at the fingertip locations, and less elsewhere.
As also discussed above, the actuators 200 may also be disposed underneath the touch surface 102 instead of being located at the edges. Such positioning may, for instance, reduce the size of a bezel around the perimeter of the touch surface 102. In embodiments, the actuators 200 may be distributed across a large fraction of the area of the touch surface 102, or even, in embodiments, across substantially the entire area of the touch surface 102. Such positioning, for example, may help ensure that each portion of the touch surface 102 moves in a desired manner.
In alternate embodiments, lateral vibration may be produced from perpendicular vibration (an example of perpendicular vibration is discussed in connection with
Lateral and vertical vibrations may also be combined by bending of the surface. Bending motions are naturally involved in certain perpendicular motions (for example, as discussed below in connection with
In another embodiment, lateral vibration of the touch surface 102 may be achieved by transmitting acoustic waves across the touch surface 102. For example, the actuators 200 may be acoustic transmitters oriented to generate surface acoustic waves (SAW) across the plane of the touch surface 102. The surface acoustic waves may induce lateral motion of the touch surface 102.
As shown in
In addition to the lateral motion, the touch surface 102 may be moved along an axis that is out of the plane of the touch surface 102, such as by being vertically moved or vibrated along the opposite vertical arrows 304, 306. The vertical direction, as used in connection with
Further, in embodiments, the resonances of the lateral vibrations and the perpendicular vibrations are near enough in value so that a minimum of power is dissipated. Because the vertical and lateral resonances may have different inertial and compliant elements associated therewith, and also due to manufacturing tolerances and inconsistencies, the lateral and vertical resonances may not be identical in frequency. However, due to non-zero resonant bandwidths, the resonances do not need to be identical to be driven efficiently at the same frequency. In other embodiments, one of the lateral and vertical resonances may be a harmonic of the other resonance. In embodiments, the resonances have a high quality factor (Q) so that a minimum of power is dissipated.
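As a rough numerical check of this argument, a resonance with center frequency f0 and quality factor Q has a half-power bandwidth of roughly f0/Q, so two slightly mismatched resonances can still share one drive frequency if both fall within that bandwidth (all values below are illustrative assumptions):

```python
# Illustrative sketch: how closely must the lateral and vertical resonances
# match to be driven efficiently at a single frequency? A resonance with
# center f0 and quality factor Q has half-power bandwidth roughly f0 / Q.

def half_power_bandwidth(f0_hz, q):
    return f0_hz / q

f_lateral = 25_000.0   # assumed lateral resonance (Hz)
f_vertical = 25_100.0  # assumed vertical resonance (Hz), offset by tolerance
Q = 100.0              # assumed quality factor

bw = half_power_bandwidth(f_lateral, Q)   # 250 Hz
drive = (f_lateral + f_vertical) / 2      # drive midway between the two

# Both resonances lie within half a bandwidth of the drive frequency, so a
# single oscillator can excite both efficiently despite the mismatch.
print(bw)                                 # 250.0
print(abs(drive - f_lateral) <= bw / 2)   # True
print(abs(drive - f_vertical) <= bw / 2)  # True
```

A higher Q dissipates less power but narrows the bandwidth, tightening the required match between the two resonances, which is the trade-off the paragraph above alludes to.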
Similarly, the embodiment shown in
Returning to the discussion of
As shown in
The out-of-plane motion of the touch surface 102 along the orbit 322 (also corresponding to vertical arrow 304) can cause the touch surface 102 to move up toward a finger 308 and contact or engage the finger 308 at or near the upper peak 324 of the orbit 322. Alternatively or additionally, the out-of-plane motion of the touch surface 102 may further compress the touch screen 102 against a finger 308 that already is in an engaged relationship (e.g., physically contacting) with the touch surface 102, thus increasing a level or amount of engagement. When the touch surface 102 moves upward to engage or further compress against the finger 308, the concurrent lateral motion of the touch screen 102 imparts a laterally directed force on the finger 308. For example, if the vertical motion of the touch surface 102 along the vertical arrow 304 causes the touch surface 102 to engage the finger 308 when the touch surface 102 also is laterally moving along the lateral arrow 302, then the touch surface 102 may impart a net force on the finger 308 that pushes the finger 308 generally along the lateral direction 302. As another example, if the vertical motion of the touch surface 102 along the vertical arrow 304 causes the touch surface 102 to engage the finger 308 when the touch surface 102 also is laterally moving along the opposite lateral arrow 300, then the touch surface 102 may impart a net force on the finger 308 that pushes the finger 308 generally along the lateral direction 300. The net force that is imparted on the finger 308 can be referred to as a net lateral force or lateral force. A force imparted along a surface as discussed herein may be, for example, generally planar with a generally planar touch surface, generally coincident with a curved touch surface, or at a relatively small angle (e.g. a few degrees) to a touch surface.
When the touch surface 102 moves downward, toward the lower peak 326 (also corresponding to vertical arrow 306) to disengage or reduce a level of engagement with the finger 308, the lateral motion is not conveyed strongly to the finger 308 (because, for example, the finger does not contact the surface, or as another example, because the level of engagement is low, or as another example, because the level of engagement is reduced so that the movement is sensed much less strongly than movement at or near the upper peak 324 of the orbit 322). Thus, by an “engage and push” phenomenon the object is affected strongly by only a portion of the lateral path traversed by the touch surface.
Human sensitivity to vibration diminishes at higher frequencies. Thus, by selecting appropriately high frequencies, the engage-and-push phenomenon is experienced by a human user as a continuous push. For example, in embodiments, frequencies of about 20 kHz or higher are employed. In other embodiments, for example, frequencies of about 30 kHz or higher are employed. Further still, embodiments described herein may provide an experienced lateral force to a non-moving object, such as a finger, in contrast to methods that rely on frictional modulation to apply a force to a moving finger. (It should be noted that friction modulation may be used to accentuate the experienced movement in certain embodiments, as discussed below.)
In one embodiment, lateral forces may be imposed on the finger 308 by the combination of lateral movement and vertical movement of the touch surface 102 at the same time as a friction coefficient of the touch surface 102 is changed. Friction may be changed, for example, by varying the amplitude of the vertical movement. For example, larger vertical movements may result in increased friction coefficients of the touch surface 102. Conversely, smaller vertical movements may result in reduced friction coefficients of the touch surface 102. As another example, friction may be varied by use of a force resulting from electrostatic attraction. The sensations of controllable lateral drive (e.g., imparting a net lateral force on the finger 308) and of “slipperiness” (e.g., changing the friction coefficient of the touch surface 102) may be distinguishable to the user and independent selection and control of these sensations can confer greater design freedom in creating a touch user interface with the touch surface 102.
The direction of the lateral force imparted on the finger 308 can be selected or controlled by varying the axes of the lateral vibrations and/or vertical vibrations of the touch surface 102. For example, changing a direction of the lateral vibrations can cause the finger 308 to be driven in another direction along the lateral vibrations when the touch surface 102 moves upward and engages the finger 308. Utilizing lateral motions traversing shapes such as circles or ellipses in certain embodiments allows for the chosen direction to be changed by varying the phase relationship of the lateral and vertical movements without necessarily requiring alteration of the lateral movement.
For example,
In some embodiments, the lateral and vertical vibrations occur at substantially the same frequency. By altering one or both frequencies slightly, the phase relationship of the vibrations may be changed. This change in phase relationship may be used to alter the point along the elliptical path 208 at which the upper peak of the orbit occurs. For example, by altering the phase relationship so that the upper peak of the orbit occurs at about point 228, the direction of the net lateral force is shown by direction 230 (tangential to the elliptical path 208 at point 228). Thus, by using a lateral path such as an ellipse, different directions of imparted net lateral force may be selected by varying the phase relationship of the vertical and lateral oscillations, without necessarily altering the path of the lateral oscillation. In other embodiments, shapes other than ellipses may be employed, such as circles or lines. In other embodiments, the direction of the net lateral force imparted is altered by varying the axis of the lateral vibration, either additionally or alternatively to adjusting the phase relationship between the lateral and vertical vibrations.
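The phase-steering idea described above can be sketched numerically, assuming an elliptical lateral path x = Ax·sin(ωt), y = Ay·cos(ωt) and vertical motion z = B·sin(ωt + φ), and taking the net push to lie along the lateral velocity at the instant of the vertical peak (an illustrative model, not the disclosed implementation):

```python
import math

# Sketch: lateral motion traces an ellipse x = Ax*sin(w*t), y = Ay*cos(w*t),
# while vertical motion is z = B*sin(w*t + phi). The net push is modeled as
# lying along the lateral velocity at the instant z peaks, i.e. at
# w*t = pi/2 - phi.

def push_direction_deg(phi, ax=1.0, ay=0.5):
    t_peak = math.pi / 2 - phi       # phase angle at the vertical peak
    vx = ax * math.cos(t_peak)       # lateral velocity components there
    vy = -ay * math.sin(t_peak)
    return math.degrees(math.atan2(vy, vx))

# Varying only the lateral/vertical phase offset steers the push direction
# around the ellipse, without changing the lateral path itself:
# phi of 0/90/180/270 degrees yields directions of about -90/0/90/180 degrees.
for phi_deg in (0, 90, 180, 270):
    print(phi_deg, round(push_direction_deg(math.radians(phi_deg)), 1))
```

This mirrors the statement above: the direction of the imparted net lateral force is selected purely through the phase relationship, with the lateral oscillation itself left unchanged.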
The magnitude and direction of the lateral force on the finger 308 may be selected or controlled by varying amplitudes of the lateral vibrations and vertical vibrations. For example, larger lateral vibrations of the touch surface 102 may impart a greater net force on the finger 308 when the touch surface 102 moves upward to engage or compress the finger 308. Conversely, smaller lateral vibrations can impart a smaller net force on the finger 308. Larger vertical vibrations of the touch surface 102 may impart a larger net force on the finger 308, as the touch surface 102 may compress the finger 308 to a greater degree during the upward movement of the touch surface 102.
As also discussed above, the magnitude and direction of the lateral force on the finger 308 may be selected or controlled by varying the relative phases, or phase relationship, of the lateral vibrations and vertical vibrations. For example, the difference in phases of the periodic lateral vibrations and of the periodic vertical vibrations may change the direction and/or magnitude of the lateral movement of the touch surface 102 when the touch surface 102 moves upward to engage or compress the finger 308. As described above, the direction and/or magnitude of the lateral movement of the touch surface 102 can impart a lateral force on the finger 308 in a same or similar direction when the touch surface 102 engages the finger 308.
In some embodiments, the frequency of perpendicular (vertical) vibrations (referred to as fperp) is equal to or substantially the same as the frequency of the lateral vibrations (referred to as flat), while in other embodiments, the frequency of the perpendicular vibrations may differ from the frequency of the lateral vibrations. If flat=fperp (or harmonic multiples), the phase and/or amplitude of the two motions may be utilized to produce a desired path of movement of a portion of the touch surface. In one embodiment, for example, the vertical and lateral motions are ninety degrees out of phase with respect to each other and the amplitude of the lateral motion is varied. The out of phase motions can combine to produce an elliptical motion of the touch surface 102, as described above. Other phase angles may be of interest in generating linear, elliptical, or circular motions.
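When the lateral and vertical frequencies are nominally equal, their phase relationship can be adjusted by briefly detuning one of them; the elementary relation Δφ = 2πΔfΔt quantifies this (the numbers below are illustrative assumptions):

```python
import math

# Sketch: shifting one oscillation's frequency by df for a time dt changes
# the lateral/vertical phase relationship by dphi = 2*pi*df*dt (radians).

def detune_time_s(dphi_rad, df_hz):
    """Time needed to accumulate a phase change dphi at frequency offset df."""
    return dphi_rad / (2 * math.pi * df_hz)

# Example: rotate the phase relationship by 90 degrees using a 25 Hz offset
# on a ~25 kHz vibration -- a 0.1% detune held for 10 milliseconds.
dt = detune_time_s(math.pi / 2, 25.0)
print(round(dt * 1000, 6))  # 10.0 (milliseconds)
```

Because the detune is tiny relative to the resonant bandwidth, the phase can be slewed in this way without driving the system far off resonance or losing significant energy.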
As also discussed above, the interface device 100 may be configured to provide resonances that allow the efficient conservation of power in the interface device 100, for example, to reduce the sizes of the actuators 200 (shown in
The interface device 500 includes lateral actuators 504 and vertical actuators 506. The actuators 504, 506 may be piezoelectric elements. Alternatively, one or more of the actuators 504, 506 may be another type of actuator that moves the touch surface 502 laterally and vertically, such as electrostatic actuators. The lateral actuators 504 are coupled with reaction masses 508 and coupler bodies 510. The coupler bodies 510 are joined with the touch surface 502. The lateral actuators 504 are disposed on opposite sides of the touch surface 502. The embodiment depicted in
The lateral actuators 504 are energized to move the touch surface 502 in one or more lateral directions 512, 514. The lateral actuators 504 push against the reaction masses 508 to move the touch surface 502 in the lateral directions 512, 514. The longitudinal compliance of the touch surface 502, the reaction masses 508, and the lateral actuators 504 can form a resonant system. Perpendicular motion of the touch surface 502 may be created by the vertical actuators 506. The vertical actuators 506 may be energized to bend the touch surface 502 and thereby vertically move portions of the touch surface 502 (e.g., in and out of the page of
In embodiments, the interface device 500 is configured (for example, by selection of mounting components, reaction masses, and the like) such that the resonance for the lateral vibrations, and that for the vertical vibrations, may be near or equivalent to each other in frequency. Thus, for example, a bending mode resonant frequency of the touch surface 502 may be substantially similar to a longitudinal resonant frequency of the resonant system formed by the longitudinal compliance of the touch surface 502, the reaction masses 508, and the lateral actuators 504, with conventional oscillators and amplifiers used to drive both the lateral actuators 504 and the vertical actuators 506. Alternatively, the frequency of the lateral vibrations or vertical vibrations may be a harmonic of the other. The frequency of the lateral vibrations and/or the vertical vibrations may be shifted or changed slightly from time to time, for a brief interval, in order to change the phase relationship of the lateral and vertical vibrations without losing significant energy in so doing. Also, the amplitude of either or both oscillations may be adjusted if desired. As described above, changing the direction or magnitude of the lateral force exerted on the finger 308 (see
In one embodiment, the interface device 100 (shown in
In one embodiment, the length across the objects 400, 402 or the surface area of interaction between the objects 400, 402 is relatively large compared to the separation distance (d). The electrostatic normal force (F) between the objects 400, 402 may be modeled as in a parallel plate capacitor and based on the following relationship:

F = ∈∈0AV²/(2d²)
where F represents the electrostatic normal force, ∈ represents the relative permittivity (also known as the dielectric constant) of the touch surface, ∈0 represents the permittivity of free space (8.85×10⁻¹² Farads per meter), A represents the surface area of interface between the objects 400, 402, V represents the potential difference across the objects 400, 402, and d represents the separation distance between the objects 400, 402.
The electrostatic normal force (F) may be estimated by assuming that the dielectric constant (∈) is 5, the surface area (A) is 1×10⁻⁴ square meters (m²), and the separation distance (d) is 1×10⁻⁵ meters (m). For a potential difference (V) of 150 volts, the electrostatic normal force is approximately 0.5 Newtons. This normal force would add on to the normal force arising from vertical vibration of the touch surface and the associated compression of the fingertip. An increased normal force gives rise to increased lateral force. A rough estimate of lateral force is the normal force times the coefficient of friction. The coefficient of friction of skin on glass may be approximately unity, although it may be more or less depending on factors such as surface finish. As a result, average lateral forces of about 0.25 Newtons or greater may be applied to the finger that touches the surface. The electric field associated with the above parameters is E=V/d=1.5×10⁷ Volts per meter (V/m), which may be less than the breakdown strength of many insulators, such as parylene (2.8×10⁸ V/m). Thus, even higher electric field strengths than 1.5×10⁷ V/m may be feasible without exceeding the breakdown strength of the touch surface.
At 802, a touch surface is coupled with lateral actuators. For example, a touch surface, such as one of the touch surfaces 102, 502 discussed above may be coupled with lateral actuators, such as the actuators 200, 504 discussed above, that laterally move the touch surfaces 102, 502 in one or more directions in the planes of the touch surfaces 102, 502.
At 804, the touch surface is coupled with vertical actuators. For example, the touch surfaces 102, 502 may be coupled with vertical actuators, such as the actuators 506 discussed above, that vertically move the touch surfaces 102, 502 in one or more directions that are oriented perpendicular or obliquely to the planes of the touch surfaces 102, 502.
At 806, the touch surface is laterally moved. At 808, the touch surface is vertically moved. The movements associated with 806 and 808 may occur simultaneously or concurrently. For example, the touch surfaces 102, 502 may vibrate in two or more directions in the planes of the touch surfaces 102, 502 at the same time that the touch surfaces 102, 502 bend or otherwise move vertically in two or more directions. The combined lateral and vertical movements of the touch surfaces 102, 502 can cause one or more points on the touch surfaces 102, 502 to move in a two or three dimensional orbit, such as the circumnavigation of a circle, ellipse, line, quadrilateral, sphere, ellipsoid, or the like.
At 810, the touch surface engages an appendage to impart a lateral force on the appendage. For example, the touch surfaces 102, 502 may engage one or more fingers 308 to impart a lateral force on the fingers 308. As described above, the vertical movement of the touch surfaces 102, 502 may cause the touch surfaces 102, 502 to engage and/or press against the fingers 308 and the lateral movement of the touch surfaces 102, 502 may impart the lateral force on the fingers 308.
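The orbits described at 806 and 808 can be sketched parametrically. The helper below is illustrative only (its name, amplitudes, and frequency are assumptions, not part of the embodiments): with equal amplitudes, a 90 degree phase offset between the lateral and vertical components traces a circle, while a zero offset collapses the orbit to a line.

```python
import math

def surface_point_orbit(t, freq, amp_lateral, amp_vertical, phase):
    # Position of one surface point vibrating laterally (x) and
    # vertically (z) at the same frequency; the phase offset between
    # the two motions sets the orbit shape (line, ellipse, circle).
    w = 2.0 * math.pi * freq
    x = amp_lateral * math.sin(w * t)
    z = amp_vertical * math.sin(w * t + phase)
    return x, z
```

Sampling the orbit over one period at phase = π/2 shows x² + z² held constant, i.e. a circular orbit of the surface point.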
In accordance with one or more embodiments described herein, haptic effects can be created in a touch device by modulating the shear forces applied to a fingertip as a function of finger location, finger velocity, and/or finger acceleration. The shear force also can depend on events occurring in a computer program, such as a “virtual” collision occurring in an electronic game that is played on the touch device.
It should be appreciated that the ability to modulate force on one or more appendages is part of what makes haptic feedback via a touch surface possible. To create haptic experiences that are useful and/or interesting, it is generally important to generate forces that closely correspond to specific actions of the fingertips and/or to specific events occurring under software control. By way of illustration, consider a game in which the fingertips are used both to bat a ball, and to capture the ball. In this illustration, the ball is a simulated ball that appears on a computer display disposed underneath the touch surface. Consider the act of batting the ball with one finger. In this case, the lateral force generated by certain methods and systems described herein would depend on both the position and velocity of the finger as well as the position and velocity of the simulated ball. Even higher derivatives of position, such as acceleration, might also be involved. In one embodiment, the force exerted on the finger might increase when the position of the finger intersects that of the surface of the ball, indicating a collision. The force might also depend on the relative velocity of the finger and the ball, increasing for higher velocities. Thus, unlike many existing technologies, the force is not a simple vibration, but is an active force that varies as a function of state variables such as positions, velocities and accelerations. Now consider the act of capturing the ball and holding it between two fingers. In this case, the reaction forces at the two fingers, which are again functions of state variables such as positions and velocities, should point in approximately opposite directions. As the ball is held, the forces should persist. Unlike many existing technologies, the force provided by certain embodiments described herein is neither a simple vibration nor even a transient.
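The dependence of force on state variables can be made concrete with a much simplified one-dimensional sketch. The function below, its spring-damper contact law, and all parameter names are hypothetical, chosen only to show a force that is zero until the finger intersects the ball surface, grows with penetration, and grows with relative velocity:

```python
def batting_force(finger_pos, finger_vel, ball_pos, ball_vel,
                  ball_radius=0.02, k=50.0, b=0.5):
    # Force on the finger (1-D): zero until the finger position intersects
    # the ball surface; then a spring term on penetration depth plus a
    # damper term on the finger-ball relative velocity.
    penetration = ball_radius - abs(finger_pos - ball_pos)
    if penetration <= 0.0:
        return 0.0                                   # no contact, no force
    away = 1.0 if finger_pos >= ball_pos else -1.0   # push finger off the ball
    rel_vel = finger_vel - ball_vel
    return away * k * penetration - b * rel_vel
```

For the two-finger capture case, the same law evaluated at fingers on opposite sides of the ball yields reaction forces in approximately opposite directions, and the forces persist as long as the penetrations do.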
The abilities to generate persistent forces, and to generate different forces at different fingers, are advantages of the technology described here. In the above discussion, it should be apparent that the technology described here may be integrated with means of measuring the position of one or more fingertips, and with means of displaying graphic images (and also audio, since events like batting a ball are often accompanied by sound).
There are many techniques for measuring fingertip positions that may be used here. These include, without limitation, resistive, surface capacitive, projected capacitive, infrared, acoustic pulse recognition, and in-cell optical sensing. There are also many techniques for displaying graphic images and audio. Most of these may combine easily with the lateral drive techniques described here, but surface capacitive and projected capacitive sensing might seem to interfere with the rapidly varying electric fields used in the electrostatic embodiments. However, capacitive sensing may be done at a much higher frequency, in the megahertz range, with filtering to separate the signals related to capacitive sensing from those resulting from actuation. It may be desirable to use the same electrodes for both purposes.
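The frequency-separation idea can be sketched with a first-order low-pass filter that splits a megahertz-range sensing carrier from a kilohertz-range actuation signal sharing the same electrodes. This is an illustrative sketch, not an implementation from the embodiments; the 100 kHz cutoff, 10 MHz sample rate, and signal frequencies are assumptions:

```python
import math

def one_pole_lowpass(samples, dt, cutoff_hz):
    # First-order IIR low-pass: passes the kHz actuation component,
    # attenuates the MHz sensing carrier.
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = dt / (dt + rc)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

dt = 1e-7  # 10 MHz sampling
actuation = [math.sin(2 * math.pi * 30e3 * i * dt) for i in range(10000)]
carrier = [math.sin(2 * math.pi * 1e6 * i * dt) for i in range(10000)]
mixed = [a + c for a, c in zip(actuation, carrier)]
low = one_pole_lowpass(mixed, dt, 100e3)   # recovers the actuation band
```

At these frequencies a single pole already attenuates the 1 MHz carrier roughly tenfold while passing the 30 kHz actuation nearly unchanged; a practical device would use sharper filtering.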
In accordance with one embodiment, a method for applying force from a surface to an object is provided. The method includes moving the surface in one or more lateral directions of the surface, wherein the moving in one or more lateral directions is performed periodically at a frequency of at least about 1 kiloHertz. The method also includes periodically moving the surface in at least one angled direction that is at least one of obliquely or perpendicularly angled to the surface. The generally planar surface articulates into and out of contact with the object or varies in degree of engagement with the object. The method further includes controlling the moving in one or more lateral directions and moving in at least one angled direction to impart a force that is oriented along the surface, wherein the force is configured to provide a haptic output to an operator of a device that includes the surface.
In another aspect, the moving in one or more lateral directions and moving in at least one angled direction are performed at substantially the same frequency. Further, in embodiments, a direction of the imparted force is varied by varying a phase relationship between the moving in one or more lateral directions and moving in at least one angled direction.
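One way to see why the phase relationship sets the direction of the imparted force: the angled motion gates when the surface engages the finger, and the lateral velocity during the engaged intervals sets the friction direction. The sketch below averages the sign of the friction force over one cycle; the on/off contact model and all names are simplifying assumptions, not the embodiments themselves:

```python
import math

def mean_lateral_force(phase, n=10000):
    # Cycle-averaged direction of the friction force on a finger.
    # Contact is modeled as on/off: engaged while the vertical displacement
    # is positive; while engaged, kinetic friction drags the finger in the
    # direction of the surface's lateral velocity.
    total = 0.0
    for i in range(n):
        wt = 2.0 * math.pi * i / n
        engaged = math.sin(wt + phase) > 0.0           # vertical gating
        if engaged:
            total += math.copysign(1.0, math.cos(wt))  # lateral velocity sign
    return total / n
```

With a +90 degree phase the averaged force is maximal in one direction, reversing the phase reverses the force, and a zero phase yields no net force over the cycle.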
In another aspect, one of the moving in one or more lateral directions and moving in at least one angled direction is performed at a harmonic frequency of the other of the moving in one or more lateral directions and moving in at least one angled direction.
In another aspect, the moving in one or more lateral directions is performed periodically at a frequency substantially above a frequency that vibrations are tactilely perceived by humans. The moving in one or more lateral directions may be performed periodically at a frequency of at least about 20 kiloHertz. Further, in some embodiments, the moving in one or more lateral directions is performed periodically at a frequency of at least about 30 kiloHertz.
In another aspect, the method further includes modulating a frictional force experienced by the object concurrently with the moving in at least one angled direction. For example, in some embodiments, the frictional force is modulated by varying an electrostatic attraction between the object and the surface. Optionally, the electrostatic attraction has a different amplitude or phase at a plurality of points distributed about the surface, whereby a plurality of objects contacting the surface experience different imparted forces.
In another aspect, the surface is generally planar and the one or more lateral directions of the surface are substantially co-planar with the surface.
In another embodiment, a touch interface device is provided. The touch interface device includes a touch surface configured to be engaged by an object. The touch interface also includes a first actuator assembly operably connected to the touch surface. The first actuator assembly is configured to displace the touch surface in one or more lateral directions along the touch surface at a first frequency that is at least about 1 kiloHertz. Further, the touch interface includes a second actuator assembly operably connected to the touch surface. The second actuator assembly is configured to displace the touch surface in an angled direction that is at least one of obliquely or perpendicularly angled to the touch surface at a second frequency. The touch interface device also includes a controller operably connected with the first and second actuator assemblies. The controller is configured to operate the first and second actuator assemblies so that the touch surface varies in engagement with the object to impart a force on the object that is along the touch surface.
In another aspect, the first actuator assembly is configured to displace the touch surface at a first frequency that is at least about 20 kiloHertz.
In another aspect, the first actuator assembly is configured to displace the touch surface at a first frequency that is at least about 30 kiloHertz.
In another aspect, the first frequency and the second frequency are substantially the same.
In another aspect, the controller is further configured to vary a direction of the imparted force by varying a phase relationship between a first oscillation in the one or more lateral directions and a second oscillation in the angled direction.
In another aspect, the touch interface device includes a first massive system and a second massive system. The first massive system includes at least one of a first mounting or a first reactive mass. The second massive system includes at least one of a second mounting or a second reactive mass. The resonances of the first massive system and the second massive system are substantially the same.
In another embodiment, a tangible and non-transitory computer readable storage medium for a system that includes a processor is provided. The computer readable storage medium includes one or more sets of instructions configured to direct the processor to control a first actuator assembly to move a touch surface in one or more lateral directions along the touch surface, wherein the first actuator assembly moves the generally planar surface in the one or more lateral directions periodically at a frequency of at least about 1 kiloHertz. The processor is also directed to control a second actuator assembly to move at least a portion of the generally planar surface in one or more angled directions that are at least one of obliquely or substantially perpendicularly angled to the touch surface. The second actuator assembly moves the touch surface periodically. The processor is further directed to control motion in the one or more lateral directions and motion in the one or more angled directions to impart a force on the object along the touch surface, wherein the force is configured to provide haptic output to an operator of a device that includes the touch surface.
In another aspect, the motion in one or more lateral directions and motion in one or more angled directions are performed at substantially the same frequency. Further, in embodiments, a direction of the imparted force is varied by varying a phase relationship between the motion in one or more lateral directions and motion in one or more angled directions. In another aspect, the processor is further configured to modulate a frictional force experienced by the object concurrently with the motion in one or more angled directions. For example, in embodiments the frictional force is modulated by varying an electrostatic attraction between the object and the touch surface. Further, in additional embodiments, the electrostatic attraction has a different amplitude at a plurality of points distributed about the touch surface, whereby a plurality of objects contacting the touch surface experience different imparted forces.
It is to be understood that the above description is intended to be illustrative, and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the inventive subject matter without departing from its scope. While the dimensions and types of materials described herein are intended to define the parameters of the inventive subject matter, they are by no means limiting and are merely example embodiments. Many other embodiments will be apparent to one of ordinary skill in the art upon reviewing the above description. The scope of the one or more embodiments of the subject matter described herein should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein.” Moreover, in the following claims, the terms “first,” “second,” and “third,” etc. are used merely as labels, and are not intended to impose numerical requirements on their objects. Further, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112, sixth paragraph, unless and until such claim limitations expressly use the phrase “means for” followed by a statement of function void of further structure.
This written description uses examples to disclose several embodiments of the inventive subject matter, and also to enable a person of ordinary skill in the art to practice the embodiments disclosed herein, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the subject matter may be defined by the claims, and may include other examples that occur to one of ordinary skill in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.
The foregoing description of certain embodiments of the disclosed subject matter will be better understood when read in conjunction with the appended drawings. To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. Thus, for example, one or more of the functional blocks (for example, processors or memories) may be implemented in a single piece of hardware (for example, a general purpose signal processor, microcontroller, random access memory, hard disk, and the like). Similarly, the programs may be stand alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. In embodiments, one or more of the functional blocks are implemented via a non-transitory computer storage medium that does not include signals. The various embodiments are not limited to the arrangements and instrumentality shown in the drawings.
As used herein, an element or step recited in the singular and preceded by the word “a” or “an” should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to “one embodiment” of the presently described inventive subject matter are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments “comprising,” “including,” or “having” an element or a plurality of elements having a particular property may include additional such elements not having that property.
Since certain changes may be made in the above-described systems and methods, without departing from the spirit and scope of the subject matter herein involved, it is intended that all of the subject matter of the above description or shown in the accompanying drawings shall be interpreted merely as examples illustrating the inventive concepts herein and shall not be construed as limiting the disclosed subject matter.
This application claims priority benefit to U.S. Provisional Application No. 61/499,221, entitled “Touch Interface Device And Method For Applying Lateral Forces On A Human Appendage,” which was filed on Jun. 21, 2011 (“the '221 Application”). The entire subject matter of the '221 Application is incorporated by reference. This application incorporates in its entirety the subject matter of U.S. patent application Ser. No. 13/468,695, entitled “A Touch Interface Device And Method For Applying Controllable Shear Forces To A Human Appendage,” which was filed on May 10, 2012 (“the '695 Application”). This application incorporates in its entirety the subject matter of U.S. patent application Ser. No. 13/468,818, entitled “A Touch Interface Device Having An Electrostatic Multitouch Surface And Method For Controlling The Device,” which was filed on May 10, 2012 (“the '818 Application”).
This invention was made with government support under IIS0964075 awarded by the National Science Foundation. The government has certain rights in the invention.
Prior Publication: US 20120326999 A1, Dec. 2012, US.
Provisional Application: 61/499,221, Jun. 2011, US.