USING TACTILE FEEDBACK TO PROVIDE SPATIAL AWARENESS

Information

  • Patent Application
  • Publication Number
    20120327006
  • Date Filed
    September 05, 2012
  • Date Published
    December 27, 2012
Abstract
An image capturing device may be combined with a touch screen to generate a tactile map of an environment. The image capturing device captures an image of the environment, which is then processed and used to correlate a point of user contact on a touch screen to a particular tactile sensation. The touch screen may then generate an electrical signal (i.e., tactile feedback) corresponding to the tactile sensation, which is felt by a user contacting the touch screen. By using the electrical signal as tactile feedback (e.g., electrovibration), the user may determine the relative spatial locations of the objects in the environment, the objects' physical characteristics, the distance from the objects to the image capturing device, and the like.
Description
BACKGROUND

1. Field of the Invention


Embodiments of the invention relate to touch surfaces, and, in particular, to electrovibration for touch surfaces based on a captured image.


2. Description of the Related Art


Touch provides humans with a wide variety of sensations that allow us to feel the world. We can enjoy the feel of textures, objects, and materials. Beyond experience, tactile sensations also guide us through everyday tasks and help us explore object properties that we are normally not able to see.


Interest in designing and investigating haptic interfaces for touch-based interactive systems has grown rapidly in recent years. Haptics refers to the sense of touch. This interest in haptic interfaces is fueled by the popularity of touch-based interfaces, both in research and end-user communities. However, one major problem with touch interfaces is the lack of dynamic tactile feedback. A lack of haptic feedback decreases the realism of visual environments, breaks the metaphor of direct interaction, and reduces interface efficiency because the user cannot rely on familiar haptic cues for accomplishing even the most basic interaction tasks.


In general, adding tactile feedback to touch interfaces is challenging. In one conventional approach, the touch surface itself can be actuated with various electromechanical actuators, such as piezoelectric bending motors, voice coils, and solenoids. The actuation can be designed to create surface motion either in the normal or lateral directions. Such an approach has been used in the design of tactile feedback for touch interfaces on small handheld devices by mechanically vibrating the entire touch surface. With low frequency vibrations, a simple “click” sensation can be simulated. A major challenge in using mechanical actuation with mobile touch surfaces is the difficulty of creating actuators that fit into mobile devices and produce sufficient force to displace the touch surface. Creating tactile interfaces for large touch screens, such as interactive kiosks and desktop computers, allows for larger actuators. Larger actuated surfaces, however, begin to behave as a flexible membrane instead of a rigid plate. Complex mechanical deformations occur when larger plates are actuated, making it difficult to predictably control tactile sensation or even provide enough power for actuation.


An alternative approach to actuation of the touch surface is to decouple the tactile and visual displays. In the case of mobile devices, tactile feedback can be provided by vibrating the backside of the device, stimulating the holding hand. Alternatively, it is possible to embed localized tactile actuators into the body of a mobile device or into tools used in conjunction with touch interfaces. This approach, however, breaks the metaphor of direct interaction, requires external devices, and still does not solve the problem of developing tactile feedback for large surfaces.


SUMMARY

One embodiment provides a method that receives a tactile feedback map based on an image, where the tactile feedback map stores a spatial location of at least one object in the image and at least one tactile sensation associated with the object. The method also includes receiving a position of user contact on a touch screen and identifying a tactile sensation by correlating the position of the user contact to a location within the tactile feedback map. The method includes generating a first electrical signal corresponding to the tactile sensation on at least one electrode associated with the touch screen.


Another embodiment provides a touch device that includes a touch screen configured to identify a position of user contact, where the touch screen is configured to receive a tactile feedback map based on an image. The tactile feedback map stores the spatial location of at least one object in the image and at least one tactile sensation associated with the object. Moreover, the touch device identifies a tactile sensation by correlating the position of the user contact to a location within the tactile feedback map. The touch device also includes a signal driver configured to generate a first electrical signal corresponding to the tactile sensation on at least one electrode in the touch device.


Another embodiment provides a touch device that includes a touch screen configured to identify a position of user contact and an image processing module. The image processing module receives an image of an environment generated from an image capturing device and generates a tactile feedback map based on the image. The tactile feedback map stores the spatial location of at least one object in the image and at least one tactile sensation associated with the object. The image processing module identifies a tactile sensation by correlating the position of the user contact received from the touch screen to a location within the tactile feedback map. The touch device includes a signal driver configured to generate a first electrical signal corresponding to the tactile sensation on at least one electrode associated with the touch screen.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a system configured to implement one or more aspects of the embodiments disclosed herein.



FIGS. 2A-2B are conceptual diagrams of touch surfaces configured for providing electrovibration, according to embodiments disclosed herein.



FIGS. 3A-3C illustrate electrical charges corresponding to electrovibration actuation, according to embodiments disclosed herein.



FIG. 4A illustrates an attractive force induced between a finger and a touch surface, according to one embodiment disclosed herein.



FIGS. 4B-4C illustrate an attractive force induced between a finger and a touch surface and a friction force between the sliding finger and the touch surface, according to embodiments disclosed herein.



FIGS. 5A-5B are flow diagrams of method steps for providing electrovibration actuation, according to embodiments disclosed herein.



FIG. 6 is a graph of absolute detection thresholds for different frequencies of an input signal, according to embodiments disclosed herein.



FIG. 7 illustrates frequency just-noticeable-differences (JNDs) based on a user survey, according to one embodiment disclosed herein.



FIG. 8 illustrates amplitude JNDs based on a user survey, according to one embodiment disclosed herein.



FIG. 9 illustrates the results of a user survey of four textures produced by four frequency-amplitude combinations, according to one embodiment disclosed herein.



FIG. 10A is a conceptual diagram illustrating multiple electrodes each controlled by a separate wire, according to one embodiment disclosed herein.



FIG. 10B is a conceptual diagram that illustrates controlling multiple electrodes with switches, according to one embodiment disclosed herein.



FIG. 11A is a conceptual diagram illustrating implementing an impedance profile, according to one embodiment disclosed herein.



FIG. 11B is a flow diagram for providing electrovibration actuation based on an impedance profile, according to one embodiment disclosed herein.



FIG. 12 is a conceptual diagram that illustrates combining a low frequency signal and high frequency signal, according to one embodiment disclosed herein.



FIG. 13 is a conceptual diagram illustrating using an image capturing device for tactile feedback, according to one embodiment disclosed herein.



FIG. 14 is a system diagram for capturing an image used for providing tactile feedback, according to one embodiment disclosed herein.



FIGS. 15A-15B illustrate the electrodes in a touch screen, according to embodiments disclosed herein.



FIG. 16 is a flow diagram for providing tactile feedback based on a captured image, according to one embodiment disclosed herein.





DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS

Embodiments of the invention provide an interface that allows users to feel a broad range of tactile sensations on touch screens. Unlike other tactile technologies, embodiments of the invention do not use any mechanical motion. In one embodiment, a touch panel includes a transparent electrode covered by a thin insulation layer. An electrical signal is coupled to the electrode. As described in greater detail below, in another embodiment, a signal can be applied directly to the user via the back side of the device. The signal may be a time-varying signal. In some embodiments, the time-varying signal is periodic. When a finger, or other conductive object such as a pen, slides along the insulation layer of the touch panel, a sensation of tactile texture is perceived.


Embodiments of the invention can be easily combined with different display and input technologies and can be used in many applications. For example, a touch screen can simulate the feeling of various textures. Another example application includes enhancing drawing applications with the feeling of paint on a virtual canvas. Embodiments of the invention can also simulate friction between objects. For example, dragging a virtual car could feel different depending on the type of virtual pavement on which the car is being dragged. In another example, dragging large files using the touch screen could create more friction than dragging smaller files. Similarly, embodiments of the invention allow the user to feel constraints, such as snapping to a grid in a manipulation task. There are many more applications of embodiments of the invention. Combined with other input modalities such as video, embodiments of the invention create many new applications and exciting user experiences. Specifically, the embodiments disclosed herein may use an image capturing device to capture an image of an environment, which is then processed and used to map a point of user contact on a touch screen to a particular tactile sensation, as sketched below. This system may, for example, help visually impaired users to locate objects around them, determine physical characteristics of the objects in the environment, navigate the environment, and the like.
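
For illustration only, the following minimal sketch shows how such a pipeline might correlate a touch position to a sensation stored in a tactile feedback map. The TactileFeedbackMap class, the grid layout, and the two-parameter (frequency, amplitude) sensation model are assumptions for this sketch, not structures defined by the embodiments.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Sensation:
    frequency_hz: float   # lower frequencies tend to feel rougher
    amplitude_vpp: float  # amplitude drives perceived intensity

class TactileFeedbackMap:
    """Grid of sensations derived from a processed camera image; cell
    (row, col) holds the sensation for the object, or free space, seen
    in that region of the environment."""

    def __init__(self, grid: List[List[Sensation]], screen_w: int, screen_h: int):
        self.grid = grid
        self.rows = len(grid)
        self.cols = len(grid[0])
        self.screen_w = screen_w
        self.screen_h = screen_h

    def lookup(self, x_px: int, y_px: int) -> Sensation:
        """Correlate a touch position to a location within the map."""
        row = min(y_px * self.rows // self.screen_h, self.rows - 1)
        col = min(x_px * self.cols // self.screen_w, self.cols - 1)
        return self.grid[row][col]

# A 2 x 2 map: a "near, rough" object in the upper-left region, smooth
# free space elsewhere (values are illustrative only).
near_rough = Sensation(frequency_hz=80.0, amplitude_vpp=115.0)
free_space = Sensation(frequency_hz=400.0, amplitude_vpp=80.0)
tmap = TactileFeedbackMap([[near_rough, free_space],
                           [free_space, free_space]], 800, 480)
print(tmap.lookup(100, 100))  # touch in upper-left -> near_rough sensation
```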


System Overview

Embodiments may be implemented as a system, method, apparatus or computer program product. Accordingly, various embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects. Furthermore, embodiments may take the form of a computer program product embodied in one or more computer-readable medium(s) having computer-readable program code embodied therewith.


Any combination of one or more computer-readable medium(s) may be utilized. The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. A computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein. Computer program code for carrying out operations of various embodiments may be written in any combination of one or more programming languages (including an object oriented programming language such as Java™, Smalltalk, C++ or the like and conventional procedural programming languages, such as the “C” programming language or similar programming languages). The program code may execute entirely on the user's computer (device), partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. The remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer.


It will be understood that certain embodiments can be implemented by a device such as a computer executing a program of instructions. These computer program instructions may be provided to a processor of a special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, implement the functions/acts specified.


These computer program instructions may also be stored in a computer-readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer-readable medium produce an article of manufacture including instructions which implement the function/act specified.


The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus or the like to produce a computer implemented process such that the instructions that execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified.



FIG. 1 is a block diagram of a system configured to implement one or more aspects of the invention. An example device that may be used in connection with one or more embodiments includes a computing device in the form of a computer 110. Components of computer 110 may include, but are not limited to, a processing unit 120, a system memory 130, and a system bus 122 that couples various system components including the system memory 130 to the processing unit 120. Computer 110 may include or have access to a variety of computer-readable media. The system memory 130 may include computer-readable storage media, for example in the form of volatile and/or nonvolatile memory such as read only memory (ROM) and/or random access memory (RAM). By way of example, and not limitation, system memory 130 may also include an operating system, application programs, other program modules, and program data.


A user can interface with (for example, enter commands and information) the computer 110 through input devices 140. A monitor or other type of display surface can also be connected to the system bus 122 via an interface, such as an output interface 150. In addition to a monitor, computers may also include other peripheral output devices. The computer 110 may operate in a networked or distributed environment using logical connections to one or more other remote device(s) 170 such as other computers. The logical connections may include network interface(s) 160 to a network, such as a local area network (LAN), a wide area network (WAN), and/or a global computer network, but may also include other networks/buses.


Certain embodiments are directed to systems and associated methods for creating tactile interfaces for touch surfaces that do not use any form of mechanical actuation. Instead, certain embodiments exploit the principle of “electrovibration,” which allows creation of a broad range of tactile sensations by controlling electrostatic friction between an instrumented touch surface and a user's fingers. When combined with an input-capable interactive display, embodiments enable the creation of a wide variety of interactions augmented with tactile feedback. Various example embodiments are described in further detail below. The details regarding the example embodiments provided below are not intended to be limiting, but are merely illustrative of example embodiments.


Electrovibration for Touch Surfaces

Embodiments of the invention provide mechanisms for creating tactile interfaces for touch surfaces that do not use any form of mechanical actuation. Instead, the proposed technique exploits the principle of electrovibration, which allows embodiments to create a broad range of tactile sensations by controlling electrostatic friction between an instrumented touch surface and the user's finger or fingers. When combined with an input-capable interactive display, embodiments of the invention enable a wide variety of interactions augmented with tactile feedback.



FIG. 2A is a conceptual diagram of a touch surface 200 configured for providing electrovibration, according to one embodiment of the invention. The touch surface 200 includes a transparent electrode sheet 202 applied onto a glass plate 204 coated with an insulator layer 206. A controller causes the transparent electrode 202 to be excited with a periodic electrical signal V(t) coupled to connectors. For example, the connectors could be those normally used by a position sensing driver (not shown) of the touch surface 200. When an input signal of sufficient amplitude is provided, an electrically induced attractive force fe develops between a sliding finger 208 and the underlying electrode 202, increasing the dynamic friction fr between the finger 208 and the touch surface 200. Because the amplitude of fe varies with the signal amplitude, changes in friction fr are also periodic, resulting in periodic skin deformations as the finger 208 slides on the touch surface 200. These deformations are perceived as vibration or friction and can be controlled by modulating the amplitude and frequency of the applied signal. Also, the input signal V(t) is uniformly propagated across the transparent electrode 202; therefore, the resulting tactile sensation is spatially uniform.
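
As a rough numerical illustration of this effect, the sketch below uses the textbook parallel-plate approximation fe ~ eps_r * eps_0 * A * V(t)^2 / (2 * d^2); this model and all parameter values are assumptions chosen for illustration, not figures taken from the embodiments. Note that because fe varies with the square of V(t), a sinusoidal input at frequency f modulates friction at 2f.

```python
import math

EPS_0 = 8.854e-12  # vacuum permittivity, F/m

def attractive_force(v, area_m2=1e-4, gap_m=25e-6, eps_r=3.0):
    """Parallel-plate estimate of the induced force fe between the
    finger and the electrode across the insulator layer (all
    parameters are illustrative assumptions)."""
    return eps_r * EPS_0 * area_m2 * v ** 2 / (2.0 * gap_m ** 2)

def friction_force(v, normal_force_n=0.5, mu=0.5):
    """Dynamic friction fr = mu * (fn + fe): the induced force fe adds
    to the finger's own normal force fn."""
    return mu * (normal_force_n + attractive_force(v))

# With a 100 Hz sinusoidal V(t), fe tracks V(t)^2, so the friction
# ripple the finger feels repeats at twice the electrical frequency.
for t_ms in range(0, 11, 2):
    v = 80.0 * math.sin(2.0 * math.pi * 100.0 * t_ms / 1000.0)
    print(f"t={t_ms:2d} ms  V={v:+6.1f} V  fr={friction_force(v):.4f} N")
```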


In one embodiment, the electrical signal V(t) comprises a sinusoidal waveform. In other embodiments, the electrical signal V(t) comprises other waveforms, including square or triangular waveforms. In some embodiments, the signal can be mono-phasic or bi-phasic. In some embodiments, the signal is rectified. In some embodiments, the signal includes a DC (direct current) offset. In some embodiments, coupling the electrical signal V(t) to the electrode 202 comprises providing the signal directly to the electrode 202. In other embodiments, coupling the electrical signal V(t) to the electrode 202 comprises indirectly coupling the signal via capacitive, resistive, and/or inductive elements.


As shown, the user's finger can be connected to a ground 210. In one embodiment, the user can be placed at a potential difference from the electrode. Although our bodies provide a natural link to the ground, creating a direct ground connection can increase the intensity of the tactile sensation. Without such grounding, the voltage could be increased to provide the same intensity of sensation. Grounding can be achieved by wearing a simple ground electrode. For example, the user can wear an anti-static wristband. Users can also sit or stand on a grounded pad. In the case of mobile devices, the backside of the enclosure, which contacts the user when the mobile device is grasped, could be used as the ground. In other embodiments, the ground electrode comprises a grounded pad that the user stands on, sits on, holds, rests on a lap, wears, touches, or is otherwise coupled to, including via intermediate objects and materials.


In yet another embodiment, a ground plane (not shown) can be included in the touch surface 200. The ground plane can comprise a mesh or include a pattern of holes. When the user touches a finger to the touch surface, the user is effectively grounded by the ground plane. The signal, in this embodiment, is applied to the user.


In yet another embodiment, the electrode layer itself can include both grounding and signal elements. Accordingly, part of the touching finger would be connected to the ground and part to the signal; hence, the ground connection occurs at the finger itself.



FIG. 2B is a conceptual diagram of a touch surface 200 configured for providing electrovibration, according to one embodiment of the invention. As shown, the electrical signal V(t) can be applied to the finger 208 and a path to ground 210 is provided to the electrode 202. In some embodiments, the electrical signal can be applied to the back side of the apparatus and pass through the user's body to the finger 208. A tactile sensation is also perceived when the finger 208 slides on the insulation layer in the configuration shown in FIG. 2B.


According to various embodiments, the insulator layer 206 can be made of different materials and can have different textures, i.e. a different finish. The electrode 202 can also be made of different materials, including ITO (Indium tin oxide), silver, conductive rubber, copper, aluminum, conductive ink, conductive glue, conductive paint or any other conductive material.


In some cases, the critical factor for safe operation of electrical devices is current, rather than voltage. According to embodiments of the invention, induced charge in the finger causes a force on the finger, and the amount of induced current flowing through the user's hand is negligible. For example, the current supplied to the touch surface 200 can be limited to 0.5 mA, which is typically considered safe for humans. In some embodiments, the current limitation is defined by the power rating of an operational amplifier used in the driving circuit. In fact, users experience the same amount of current while using conventional capacitive touch panels. To further protect the user, some embodiments can implement a current limiting circuit.


“Electrovibration” tactile actuation differs from “electrocutaneous” and “electrostatic” tactile actuation. Electrocutaneous displays stimulate tactile receptors in human fingers with electric charge passing through the skin. In contrast, there is no passing charge in electrovibration: the charge in the finger is induced by a charge moving on a conductive surface. Furthermore, unlike electrocutaneous tactile feedback, where current is directly stimulating the nerve endings, stimulation with electrovibration is mechanical, created by a periodic electrostatic force deforming the skin of the sliding finger.


In the electrostatic approach, a user manipulates an intermediate object, such as a piece of aluminum foil, over an electrode pattern. A periodic signal applied to this pattern creates a weak electrostatic attraction between the object and an electrode, which is perceived as vibration when the object is moved by the user's finger. The tactile sensation, therefore, is created indirectly: the vibration induced by the electrostatic force on the object is transferred to the touching human finger. In the case of electrovibration, no intermediate elements are required; the tactile sensation is created by directly actuating the fingers.


Tactile feedback based on electrovibration has several compelling advantages. Embodiments of the invention provide a mechanism that is fast, low-powered, dynamic, and can be used in a wide range of interaction scenarios and applications, including multi-touch interfaces. Embodiments of the invention demonstrate a broad bandwidth and uniformity of response across a wide range of frequencies and amplitudes. Furthermore, the technology is highly scalable and can be used efficiently on touch surfaces of any size, shape, and/or configuration, including large interactive tables, hand-held mobile devices, as well as curved, flexible, and/or irregular touch surfaces. Lastly, because embodiments of the invention do not have any moving parts, they can be easily implemented in existing devices with minimal physical modification to the devices.


One implementation of an electrovibration touch surface includes implementing a multi-touch interactive tabletop, a wall mounted surface, or any other technically feasible configuration. A touch panel in accordance with FIGS. 2A-2B can be used as a projection and input surface. An additional diffuser plane can be installed behind the panel. A projector can be used to render graphical content. To capture the user input, the panel can be illuminated from behind with infrared illuminators. An infrared camera captures reflections of user fingers touching the surface. For example, the multi-touch tracking can be performed at 60 frames per second. Finger positions are transmitted to a hardware mechanism and/or software application responsible for controlling interactive features, visual display, and tactile output. This implementation is scalable and can be adapted to other input techniques, including frustrated total internal reflection and surface acoustic tracking, among others. It can be easily extended, modified, and applied to any surface or device. Indeed, since there is no mechanical motion, almost any object can be instrumented with electrovibration-based tactile feedback. The electrodes can be transparent or opaque, be painted on curved and irregular surfaces, and added to any display, hand tool, or appliance. In other embodiments, other sensing technologies can be used in combination with the electrovibration techniques described herein, such as distance tracking, pressure input, and contact area tracking, among others.



FIGS. 3A-3C illustrate electrical charges corresponding to electrovibration actuation, according to embodiments of the invention. As shown in FIG. 3A, a touch surface comprises a glass plate 304, an electrode 306, and an insulation layer 308. An input signal V(t) is applied to the electrode 306. The input signal V(t) can oscillate and cause positive and negative charges to alternate within the electrode. At the time shown in FIG. 3A, the charges in the electrode are negative. Negative charges in the electrode 306 cause positive charges to accumulate along the bottom portion of the insulation layer 308 and negative charges to accumulate along the top portion of the insulation layer 308. This causes positive charges to be induced in the user's finger 302 when placed in contact with the insulation layer 308.


As described, as the input signal V(t) oscillates, so do the charges in electrode 306. This causes the charges in the insulation layer 308 to “flip-flop” within the insulation layer 308. As shown in FIG. 3B, the positive charges within the insulation layer 308 are moving upwards (i.e., towards the user's finger 302), and the negative charges within the insulation layer 308 are moving downwards (i.e., towards the electrode 306). FIG. 3B also illustrates that some of the charges in the electrode 306 are now positive. The positive charges within the insulation layer 308 continue moving upwards, and the negative charges within the insulation layer 308 continue moving downwards. Negative charges have also started to accumulate within the user's finger tip.



FIG. 3C illustrates the changes within the touch surface at yet another point in time. As shown, the charges in the electrode 306 are now positive. Positive charges in the electrode 306 cause negative charges to accumulate along the bottom portion of the insulation layer 308 and positive charges to accumulate along the top portion of the insulation layer 308. This causes negative charges to accumulate in the user's finger 302 when placed in contact with the insulation layer 308.


As described, an input signal V(t) applied to the electrode 306 displaces charges within the insulation layer 308, creating an oscillating electric field. When the finger 302 is placed on the surface of the touch panel, a periodic motion of electrical charges is induced in the tip of the finger 302. As described above, in other embodiments, the electrical signal V(t) can be applied to the finger 302 and a path to ground is provided to the electrode 306.



FIG. 4A illustrates an attractive force fe induced between a finger 402 and a touch surface, according to one embodiment of the invention. The touch surface comprises a glass plate 404, an electrode 406, and an insulation layer 408. An input signal V(t) is applied to the electrode 406. When an input signal V(t) of sufficient amplitude is provided, the electrically induced attractive force fe develops between the finger 402 and the underlying electrode 406. The induced attractive force fe oscillates between a stronger force and a weaker force as the charges oscillate within the finger 402. The oscillation of the magnitude of the induced attractive force fe is illustrated in FIG. 4A with the dotted arrow representing the induced attractive force fe.



FIGS. 4B-4C illustrate an attractive force fe induced between a finger 402 and a touch surface and a friction force fr between the sliding finger 402 and the touch surface as the finger 402 slides in the direction of the finger motion, according to embodiments of the invention. Because the amplitude of fe varies with the signal amplitude, changes in friction fr are also periodic, resulting in periodic skin deformations as the finger 402 slides on the touch surface. These deformations are perceived as vibration or friction and can be controlled by modulating the amplitude and frequency of the applied signal.



FIGS. 4B-4C illustrate the finger 402 sliding along the touch surface. As shown, the magnitude of the attractive force fe and the friction force fr shown in FIG. 4B (i.e., at one finger position) is greater than the magnitude of the attractive force fe and the friction force fr shown in FIG. 4C (i.e., at another finger position). In some embodiments, these changes in the magnitude of the friction force fr are periodic as the user slides the finger 402 along the touch surface, resulting in periodic skin deformations that are perceived as texture.



FIG. 5A is a flow diagram of method steps for providing electrovibration actuation, according to one embodiment of the invention. Persons skilled in the art would understand that, even though the method 500 is described in conjunction with the systems of FIGS. 1-4C, any system configured to perform the method steps, in any order, is within the scope of embodiments of the invention.


As shown, the method 500 begins at step 502, where a signal is provided to an electrode placed between a substrate and an insulation layer. In one embodiment, the substrate is a glass plate. In some embodiments, the electrode and/or the insulation surface is transparent and forms part of a touch screen surface. The signal provided to the electrode comprises a periodic, modulated, and/or complex waveform. According to various embodiments, the insulation layer can be made of different materials and can have different textures, i.e., a different finish. The electrode can be made of different materials, including ITO (Indium tin oxide), conductive rubber, copper, silver, aluminum, conductive ink, conductive glue, or any other conductive material.


At step 504, responsive to a digit sliding along the insulation layer, a tactile sensation is perceived by the digit. In some embodiments, the digit comprises a finger. As described, changes in the magnitude of a friction force fr between the digit and the insulation layer can be periodic as the user slides the digit along the touch surface, resulting in periodic skin deformations that are perceived as texture.



FIG. 5B is a flow diagram of method steps for providing electrovibration actuation, according to one embodiment of the invention. Persons skilled in the art would understand that, even though the method 550 is described in conjunction with the systems of FIGS. 1-4C, any system configured to perform the method steps, in any order, is within the scope of embodiments of the invention.


As shown, the method 550 begins at step 552, where a signal is provided to a user of a device that includes a touch surface. The signal can be generated by a controller included within the device. In one example, the signal is coupled to the back side of the device, which includes a metal surface. Coupling the signal to the back side of the device can include providing the signal directly to the back surface, inductively coupling the signal to the back surface, or any other technique for coupling a signal to a surface. For example, the user that is holding the device can receive the signal through the user's hand.


At step 554, responsive to a digit sliding along an insulation layer of the device, a tactile sensation is perceived by the digit. As described herein, the device can include an electrode placed between a substrate and an insulation layer. In one embodiment, the substrate is a glass plate. In some embodiments, the electrode and/or the insulation surface are transparent and form part of a touch screen surface. In some embodiments, the digit comprises a finger.


In some embodiments, the method 550 described in FIG. 5B corresponds to the arrangement shown in FIG. 2B, where the signal is applied to the user and the electrode is connected to a path to ground. As described, changes in the magnitude of a friction force fr between the digit and the insulation layer can be periodic as the user slides the digit along the touch surface, resulting in periodic skin deformations that are perceived as texture.


Perception-Based Characteristics of Electrovibration

As described above, varying the frequency, amplitude, DC offset, and/or any other properties of the input signal to the electrode causes the user to feel different tactile feedback. The tactile feedback perceived by a particular individual may be different than the sensation perceived by another individual.


In some embodiments, there is a baseline of human sensitivity that defines an absolute detection threshold and frequency and amplitude discrimination thresholds. In the case of electrovibration, the absolute detection threshold is the minimum voltage amplitude that creates a barely detectable sensation at a specific frequency. Voltages below the detection threshold are not usable in creating haptic sensations. In some embodiments, the frequency of the input signal affects the absolute detection threshold.



FIG. 6 is a graph of absolute detection thresholds for different frequencies of an input signal, according to some embodiments of the invention. The data shown in FIG. 6 is based on a user survey and is not meant to be limiting. The data shown in FIG. 6 merely shows one example of absolute detection thresholds for different frequencies.


The absolute detection thresholds for five reference frequencies are shown in FIG. 6. The mean detection thresholds of electrovibrations with standard error bars are shown on the left axis, and a force detection threshold curve is shown with units along the right axis. The thresholds are defined in "dB re 1 V peak" units, computed as 20 log10(A), where A is the signal amplitude in Volts. Using this unit is a standard practice in psychophysical experiments due to the linearity of human perception on a logarithmic scale. For comparison, a force detection threshold curve is also plotted in FIG. 6. In this example, there was a statistically significant effect of frequency on the threshold levels (F(4,36)=12.8; p<0.001), indicating that the threshold levels depend on the stimulus frequency.


The amplitude and frequency discrimination thresholds are typically referred to as just-noticeable-differences (JNDs), which are the smallest detectable differences between two stimuli. The detection and discrimination thresholds together form a set of fundamental measures that describe the dynamic range and processing capabilities of electrovibration sensations. These measures can be used to design interfaces and applications using embodiments of the invention.


In some embodiments, the detection threshold levels for electrovibrations closely coincide with the force detection threshold levels for sinusoidal stimulus. Experiments have shown that sensations created with embodiments of the invention are closely related to perception of forces lateral to the skin. The relation between electrovibration voltages and perceived forces may not be linear.


In some embodiments, the detection threshold levels provide guidelines for designing tactile interfaces using electrovibration. For example, the detection threshold levels inform the designer that, at each frequency, the applied voltage must be above the corresponding detection threshold level in order to provide a tactile sensation that a user can perceive. They also allow power requirements to be optimized. For example, at 400 Hz the tactile signal could create an easily discernable tactile sensation at an 18 dB re 1 V peak level, or 16 Vpp. On the other hand, at 180 Hz the voltage threshold level is half of that (12 dB re 1 V peak, or 8 Vpp), requiring significantly less power. Therefore, tactile feedback can be optimized to require less power, which can be especially important for mobile devices.
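
The dB conversions in this example can be checked directly; the helper names below are illustrative only:

```python
import math

def db_re_1v_peak(amplitude_v_peak):
    """Level in dB re 1 V peak, computed as 20*log10(A)."""
    return 20.0 * math.log10(amplitude_v_peak)

def vpp_from_db(level_db):
    """Peak-to-peak voltage corresponding to a dB re 1 V peak level."""
    return 2.0 * 10.0 ** (level_db / 20.0)

print(round(vpp_from_db(18.0), 1))  # ~15.9 Vpp -> the 16 Vpp, 400 Hz example
print(round(vpp_from_db(12.0), 1))  # ~8.0 Vpp  -> the 8 Vpp, 180 Hz example
```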


The frequency and amplitude discrimination thresholds describe the resolution of human perception: they determine the granularity of tactile sensations that can be used in designing interfaces. For example, if designers want to create two distinct tactile sensations, they should make sure that the voltage amplitudes of the two sensations are separated by at least the amplitude JND so that the user is able to differentiate them. Similar considerations also apply to the frequency of the stimuli.



FIG. 7 illustrates frequency just-noticeable-differences (JNDs) based on a user survey, according to one embodiment of the invention. Five subjects were subjected to a test at five different frequency levels. The results for each subject are shown in FIG. 7 with a different symbol corresponding to each subject. Also shown are the average values with standard error bars. It should be understood that the results shown in FIG. 7 are not meant to be limiting, but rather show one example of frequency discrimination thresholds.



FIG. 8 illustrates amplitude just-noticeable-differences (JNDs) based on a user survey, according to one embodiment of the invention. Five subjects were subjected to a test at five different frequency levels. The results for each subject are shown in FIG. 8 with a different symbol corresponding to each subject. Also shown are the average values with standard error bars. It should be understood that the results shown in FIG. 8 are not meant to be limiting, but rather show one example of amplitude discrimination thresholds.


As described, the sensations felt by individual users can vary from person to person. FIG. 9 illustrates the results of a user survey of four textures produced by four frequency-amplitude combinations, according to one embodiment of the invention. As shown, users were subjected to four combinations of frequency and amplitude, including 80 Hz-80 Vpp (voltage peak-to-peak), 80 Hz-115 Vpp, 400 Hz-80 Vpp, and 400 Hz-115 Vpp.


Low frequency stimuli were perceived as rougher compared to high frequencies. They were often likened to “wood” and “bumpy leather,” versus “paper” and “a painted wall” for higher frequency stimuli.


The effect of amplitude depends on stimuli frequency. For high frequency textures (e.g., 400 Hz), an increase of amplitude increased the perceived smoothness of tactile sensations. For example, at 80 Vpp textures were mostly compared to "cement surface" and "cheap paper," while at 115 Vpp they were compared to "paper" or "a painted wall." Some participants explicitly pointed out this increase in perceived smoothness.


At low frequencies (e.g., 80 Hz), an increase in stimuli amplitude heightens the perception of stickiness. While some participants referred explicitly to a “sticky” sensation, others compared the sensation to that of touching a “motorcycle handle” or “rubber.” Other participants associated viscosity with this type of texture. One participant compared his experience to “running fingers through viscous liquid.”


Again, it should be understood that the results shown in FIG. 9 are not meant to be limiting, but rather show one example of how users perceive different frequency-amplitude combinations.


Exemplary Use Cases and Implementations of Electrovibration Actuation

As described above, embodiments of the invention can be implemented in a wide variety of use cases and applications. Some of these use cases and applications are outlined below. The examples provided below are merely exemplary and are not meant to limit the scope of embodiments of the invention.


One implementation of the electrovibration techniques described herein includes simulation applications. This class of applications includes such tactile effects as textures for virtual objects, simulation of friction between objects or objects and a virtual surface, and/or activities like painting and drawing, where tools are manipulated on top of a canvas.


In addition, tactile feedback on touch screens allows for non-visual information layers. For example, a visual image of a star field could be supplemented with a “tactile image” of radiation intensity felt by fingers running over the areas of interest. The tactile channel can be dynamic in both amplitude and frequency, potentially offering two additional channels of information.


Another example includes incorporating tactile feedback with conventional GUI (graphical user interface) elements. For example, a slider in a GUI window can report its drag extent by changing the tactile feedback frequency. Similarly, a user could run his or her fingers over a list of emails to sense those that are new or have the highest priority. There are numerous other interaction design ideas that can be implemented using embodiments of the invention.


According to various embodiments, the frequency of the input signal can be changed by modulating the input signal with a different Pulse-Amplitude-Modulated waveform.


In yet another example, direct manipulation is ripe for tactile augmentation, especially in touch interfaces where occlusion can be problematic. Files, icons, and other “dragable” items could be augmented with variable levels of friction to not only confirm that the target was successfully captured, but also convey properties like file size and drag-and-drop applicability. For example, larger files may be associated with greater friction than smaller files. Object alignment, snapping, and grid-based layouts could be also supplemented with tactile feedback. Such tactile augmentation could enable eyes-free interaction with sufficient practice.


Repeated cursor motion over a region, i.e. rubbing, has been used in image editing applications for erasing, smoothing, desaturating, and other procedures that incrementally increase or decrease some attribute of the image. Rubbing interaction offers an interesting application of dynamic tactile feedback. For example, as a user progressively wipes out pixels in an area of an image, the tactile sensation could decrease.


In some embodiments, fingers in motion could be stimulated, while static fingers would not receive any tactile stimulation. Therefore, embodiments of the invention allow for multi-touch tactile feedback so long as at each moment only one finger is moving on the surface. There are at least two examples where this can be employed in a unique and useful manner. One implementation includes gestures where one finger defines a reference point, while another finger is used for manipulation. A selection from a pie menu is one example, where one finger is static while another moves rotationally to select an item. Similarly, shape transformations can be implemented, where one finger defines a static reference point while a moving finger specifies the amount of transformation, e.g. stretching, rotation, or zooming. In all such operations, a moving finger can be easily supplemented with tactile feedback using embodiments of the invention.


Still further examples include gestures that employ asymmetric separation of labor between the two hands. For example, a non-dominant hand could perform a gross manipulation, such as orienting a sheet of paper, while the dominant hand performs a fine-grained interaction, such as writing. Another implementation could use one or more modal buttons to define operation of a common slider. As in the previous example, one or more fingers are static, while one or more are engaged in movement and provided with tactile feedback using embodiments of the invention.


In yet another embodiment, embodiments of the invention can be implemented in a multi-touch surface having multiple electrodes addressed individually. The tactile display could include one or more individually addressable and individually controlled transparent electrode plates, each of which is covered with the thin insulating layer. The electrodes can provide independent tactile feedback when a finger slides on them. Each electrode can be addressed independently from other surrounding electrodes. Accordingly, different sensations can be created for different fingers.


In one embodiment, each of the multiple electrodes is controlled by an independent wire. FIG. 10A is a conceptual diagram illustrating multiple electrodes 1000 each controlled by a separate wire, according to one embodiment of the invention. A controller 1002 is coupled by a separate wire to each electrode. Each of the multiple electrodes 1000 receives a separate input signal. The advantage of using an independent wire for each electrode is that it is relatively easy to implement; a disadvantage is that using many wires may not be scalable as the number of electrodes increases.


In one embodiment, a driver can sweep over all connections to create a dynamic high frequency AC signal. In other words, the driver turns each of the electrodes on for a very short time, then turns it off, moves to the next electrode, turns it on and off, and so on, as sketched below. If the driver switches very fast, the resulting signal on each electrode can be a Pulse-Amplitude-Modulated (PAM) wave, or a derivative of a PAM wave. In another embodiment, an equivalent number of signal sources and electrodes can be created, with each signal source connected to one electrode. This approach may be particularly advantageous in embodiments that include a relatively small number of electrodes.
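
A minimal sketch of such a sweeping driver, assuming a hypothetical set_level(index, volts) hardware hook; a real driver would program a DAC or high-voltage switch bank rather than the stub used here:

```python
import time

def sweep_electrodes(amplitudes, dwell_us, set_level):
    """One pass of a time-multiplexed driver: each electrode is turned
    on at its own level for a short dwell, then turned off before the
    driver moves on. Repeated fast enough, each electrode sees a
    Pulse-Amplitude-Modulated (PAM) waveform."""
    for index, volts in enumerate(amplitudes):
        set_level(index, volts)      # electrode on at its amplitude
        time.sleep(dwell_us / 1e6)   # hold for the dwell time
        set_level(index, 0.0)        # electrode off; go to the next one

def set_level(index, volts):
    """Stub hardware hook for illustration only."""
    pass

# Four electrodes with independent intensities, 50 us dwell each.
for _ in range(10):
    sweep_electrodes([40.0, 0.0, 80.0, 20.0], 50.0, set_level)
```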


In yet another embodiment, one or more conductive pathways or layers of conductive material can be used to control the independent electrodes. FIG. 10B is a conceptual diagram that illustrates controlling multiple electrodes with switches, according to one embodiment of the invention. As shown, an underlying electrode 1004 is provided with an input signal 1006. Small electronically-controlled switches 1010 can be used to bridge top patches 1008 that are touched by the users and the underlying electrode 1004. Control paths 1014 can be used to connect and disconnect the top patches 1008 from the signal electrode 1004 with a driver 1012. In other embodiments, the multiple electrodes can be controlled with a switching circuit coupled to each of the electrodes.


The electronically-controlled switches 1010 can comprise transistors, diodes, relays, or other components, such as flexible electronics materials and/or organic electronics. The switches 1010 can be controlled by a grid of wires for addressing. In one implementation, a single wire may be provided per row of patches 1008, and a single wire per column of the array of patches, as sketched below.
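
One possible shape for that addressing grid is sketched below; the drive_row/drive_col control hooks and the row-and-column selection logic are illustrative assumptions, not circuitry defined by the embodiments:

```python
class SwitchMatrix:
    """Row/column addressing for the electronically-controlled
    switches: one control wire per row and one per column, so an
    R x C array of patches needs only R + C wires instead of R * C.
    A switch closes only when both its row and column lines are
    asserted."""

    def __init__(self, rows, cols, drive_row, drive_col):
        self.rows, self.cols = rows, cols
        self.drive_row, self.drive_col = drive_row, drive_col

    def connect_patch(self, r, c):
        """Close the switch at (r, c), bridging that top patch to the
        underlying signal electrode; all other patches stay isolated."""
        for i in range(self.rows):
            self.drive_row(i, i == r)
        for j in range(self.cols):
            self.drive_col(j, j == c)

# Stub control-line hooks for illustration.
matrix = SwitchMatrix(4, 4,
                      drive_row=lambda i, on: None,
                      drive_col=lambda j, on: None)
matrix.connect_patch(1, 2)  # connect only the patch in row 1, column 2
```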


In yet another embodiment (not shown), the driving electrodes and the electrodes that are touched are separated and connected via capacitive coupling. Each electrode is driven through capacitive coupling from one patterned layer to another patterned layer. In other words, the top layer contains only the electrode patterns. An advantage of this approach is that simple techniques can be used to design the control pads, and the electrodes do not have to be transparent. In addition, some embodiments would not require wires on the conductive top-most layer. In some embodiments, an LED (light emitting diode) screen can be placed between the driving electrodes and the electrodes that are touched. In other embodiments, a display, a wall, clothes, or other materials can be placed in between the two layers of the electrode.


Various implementations can be used to drive the signal for the electrovibration surface. A conductive layer, such as conductive layer 1004 shown in FIG. 10B, can be powered with a high frequency AC (e.g., 1 MHz) signal that is switched on and off by the electronically-controlled switches to modulate a lower frequency signal (e.g., 100 Hz) for tactile response on a particular patch 1008. Doing so creates a train of bursts where each burst includes high frequency pulses. These stimuli are perceived as low frequency tactile sensations.


Alternatively, embodiments of the invention can modulate the amplitude of the high frequency signal as well, thereby creating a low frequency tactile wave represented as a Pulse-Amplitude-Modulated (PAM) signal. Humans would perceive only the low frequency signal and would not feel the high frequency component. Furthermore, in some embodiments, the carrier frequency can be chosen so that the impedance path to ground for the tactile signal frequency is minimal. This could allow the effect to be perceived without an explicit return electrode for the ground.
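
The burst-train and PAM drives described in the last two paragraphs can be sketched as follows; the 1 MHz carrier and 100 Hz envelope echo the example above, and everything else is an illustrative assumption:

```python
import math

def drive_voltage(t, carrier_hz=1.0e6, tactile_hz=100.0, v_peak=80.0,
                  amplitude_modulated=True):
    """Instantaneous drive voltage at time t (seconds): a high
    frequency carrier shaped by a low frequency tactile envelope. The
    skin follows only the envelope, so the user feels the 100 Hz
    component rather than the 1 MHz carrier."""
    carrier = math.sin(2.0 * math.pi * carrier_hz * t)
    envelope = math.sin(2.0 * math.pi * tactile_hz * t)
    if amplitude_modulated:
        return v_peak * abs(envelope) * carrier  # PAM: scaled bursts
    # Simple gating: the carrier is on only while the envelope is high,
    # producing the train of bursts described above.
    return v_peak * carrier if envelope > 0.0 else 0.0

print(drive_voltage(1.25e-3))  # sample mid-way through one envelope cycle
```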


The properties of the actuating signal, such as signal amplitude, signal frequency, signal carrier frequency, and DC offset, among others, can be dynamically adjusted depending on the total impedance profile of the system. FIG. 11A is a conceptual diagram illustrating implementing an impedance profile, according to embodiments of the invention. A sensor 1104 can be connected to an electrode 1102 and measure the overall impedance profile Zm for the finger touching the touch panel. According to some embodiments, the impedance can depend on a variety of factors, including the resistance of the circuit, both through the user's body and through electronic components, the moisture of the digit in contact with the surface, and the amount of contact surface between the digit and the device, among others. Various techniques can be used to measure the impedance profile. In one embodiment, a sweep of the frequencies is performed and then the response is measured.


In some embodiments, the tactile perception felt by the user is based on the impedance value. In one embodiment, the potential difference between the user and ground can cause the actuation to be perceived differently. For example, the sensation that a user feels could be different when standing on a metal bridge and interacting with the device versus standing on a wooden floor and interacting with the device. Also, in some embodiments, the impedance can vary in time during the interaction with the touch surface. For example, a user can walk up a flight of stairs while interacting with the device, which would change the user's potential difference to ground.


The impedance profile measured by the sensor 1104 is transmitted to a controller 1106 that provides an actuation signal 1108 to the electrode 1102. According to some embodiments, the parameters of the signal 1108 are based on the measured impedance Zm so that the tactile perception of the electrovibration is maintained. In some embodiments, the signal amplitude and/or DC offset can be adjusted so that the potential difference of the user to ground is compensated and the tactile feedback is perceived similarly. In some embodiments, the impedance value is used to control a carrier frequency of a PAM modulation or other modulation of the signal and/or adjust a value of a DC component of the signal. Thus, for example, as a user interacting with the device walks over a metal bridge and up a flight of wooden stairs, the signal that is output from the controller is adjusted so that the perceived tactile feedback remains similar during the entire interaction. Without dynamically controlling the signal, as described herein, the strength of feedback would change as the user's potential difference to ground changes, which could be jarring to the user.
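
A minimal sketch of such impedance-based compensation, assuming a simple linear scaling against a reference impedance; the scaling law, the reference value, and the clamp are illustrative assumptions, and a real controller would use a calibrated model of the user's path to ground:

```python
def compensated_amplitude(z_measured_ohm, v_nominal_vpp=80.0,
                          z_reference_ohm=1.0e6, v_max_vpp=115.0):
    """Scale the drive amplitude with the measured impedance profile Zm
    so that the perceived intensity stays roughly constant as the
    user's path to ground changes."""
    scale = z_measured_ohm / z_reference_ohm
    return min(v_nominal_vpp * scale, v_max_vpp)  # clamp for safety

# Higher impedance to ground (e.g., wooden stairs) -> stronger drive;
# lower impedance (e.g., a metal bridge) -> weaker drive.
print(compensated_amplitude(1.5e6))  # 115.0 (clamped at the maximum)
print(compensated_amplitude(0.8e6))  # 64.0
```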


Although FIG. 11A shows the signal output from the controller being applied to the electrode 1102, another embodiment of the invention provides for a grounded electrode and the signal being applied to the user (i.e., through the back side of the device). Also, the controller 1106 can be implemented as hardware, software, or a combination of hardware and software.


In addition, an amplifier 1110 is included in the path from the controller 1106 to the electrode 1102 or finger. In one embodiment, the amplifier is a transistor-based amplifier. According to some embodiments, the amplifier 1110 can be included within the controller 1106 or separately from the controller. In some embodiments, the gain of the amplifier can be adjusted to dynamically control the signal that is output from the controller. Also, using a transistor-based amplifier allows for a DC offset to be included in the output signal. In prior art techniques, a transformer amplifier was used. However, a transformer-based amplifier can only drive an AC (alternating current) signal and cannot pass a DC offset. Additionally, with smaller devices such as hand-held devices, transformer amplifiers may be too large. Accordingly, a transistor-based amplifier is smaller and can easily fit within the housing of a hand-held device.


In some embodiments, the tactile signal can be modulated using the PAM modulation techniques described herein, and then a high frequency (i.e., carrier) signal can be used to encode information that is used for other purposes. Examples include creating a sound, watermarking tactile sensations, sending information to objects touching the surface, e.g. device placed on the surface, sending information to a device worn by the user, and/or sending power to the devices placed on the table.


In some embodiments, a DC (direct current) offset is added to the periodic signal. Adding a DC offset to a signal can increase the perceived strength of the signal and allows for stronger sensations. In some embodiments, the control electronics that control the DC offset are independent of the control electronics that control the variability of tactile signal, allowing embodiments to optimize the tactile perception.


In other embodiments, only a positive or only a negative periodic signal is provided.



FIG. 11B is a flow diagram of method steps for providing electrovibration actuation based on an impedance profile, according to one embodiment of the invention. Persons skilled in the art would understand that, even though the method 1150 is described in conjunction with the systems of FIGS. 1-11A, any system configured to perform the method steps, in any order, is within the scope of embodiments of the invention.


As shown, the method 1150 begins at step 1152, where a sensor determines an impedance profile of a user touching a device. According to some embodiments, the impedance can depend on a variety of factors, including the resistance of the circuit, both through the user's body and through electronic components, the moisture of the digit in contact with the surface, and the amount of contact surface between the digit and the device, among others. Various techniques can be used to measure the impedance profile. In one embodiment, a sweep of the frequencies is performed and then the response is measured.


At step 1154, a controller generates a signal based on the impedance profile. As described, the parameters of the signal can be modified so that the perceived haptic feedback remains similar throughout the user's interaction with the device. In some embodiments, the signal amplitude and/or DC offset can be adjusted so that the potential difference of the user to ground is compensated and the tactile feedback is perceived similarly.


At step 1156, the controller transmits the signal to the electrode or to the finger in contact with the device. In one embodiment, as shown in FIG. 2A, the signal can be transmitted to the electrode. In another embodiment, as shown in FIG. 2B, the signal can be transmitted to the finger by coupling the signal to the back side of the device and having the signal pass through the user's body to the finger in contact with the device. Additionally, in some embodiments, the signal passes through a transistor-based amplifier before arriving at the electrode or the finger in contact with the device.


In other embodiments, a low frequency signal is combined with a higher frequency signal, creating a combined signal that is perceived as a single sensation. FIG. 12 is a conceptual diagram that illustrates combining a low frequency signal and a high frequency signal, according to one embodiment of the invention. As shown, a low frequency signal 1202 is combined with a high frequency signal 1204 to produce a combined signal 1206. A human could perceive both the low frequency and the high frequency components of the combined signal 1206 independently, in certain embodiments. The control techniques could control both signal frequencies independently, and different information can be represented using different frequencies embedded into the same combined signal 1206.
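
A minimal numerical sketch of such a combination follows; the sample rate, the 40 Hz and 400 Hz component frequencies, and the equal weighting are illustrative assumptions.

```python
import numpy as np

fs = 44_100                     # sample rate in Hz (assumed)
t = np.arange(0, 0.5, 1 / fs)   # half a second of signal

low = np.sin(2 * np.pi * 40 * t)    # low frequency component
high = np.sin(2 * np.pi * 400 * t)  # high frequency component

# Equal-weight sum; each frequency band can carry independent information
# and can be controlled independently by the signal generator.
combined = 0.5 * low + 0.5 * high
```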


In embodiments that include multiple individually controlled electrodes, the electrodes can be arranged in a pattern. The electrode patterns can have various shapes, sizes, and arrangements. For example, the electrode pattern can include electrodes shaped as squares, triangles, hexagons, circles, or any other shape. For example, in some embodiments, the electrode patterns may allow the user to feel the edge of a push button.


In some embodiments, the electrode is transparent. In other embodiments, the electrode is opaque or translucent.


Further, embodiments of the invention can be combined with other actuation technologies. In one embodiment, electrovibration actuation can be combined with mechanical vibrotactile actuation to provide a wider range of sensations. In another embodiment, electrovibration actuation can be combined with an ultrasonic horizontal actuator. Ultrasonic motion can be provided below the perception level, i.e., the ultrasonic motion moves the plate instead of the finger. Accordingly, the surface would be sliding in relation to the finger while the user is sensing the electrovibration. Such an implementation would allow the user to feel tactile feedback when the finger is not moving, i.e., to feel button presses. In yet another embodiment, electrovibration actuation can be combined with temperature-change capabilities. In one example, a marble surface could feel cold and a wood surface could feel warm.


Conventional surface capacitive input sensing includes interpreting the voltage and/or current drops at the corners of the touch panel when a user is touching the surface. This conventional voltage and/or current drop is interpreted relative to a reference signal that is a low voltage AC signal of constant amplitude and frequency injected into the electrode of the capacitive touch panel. In some embodiments of the invention, a high voltage, arbitrarily-shaped signal is injected into the electrode of the capacitive touch panel to provide tactile sensation. The signal can then be used as a reference voltage for capacitive sensing. The voltage and/or current drops at the corners of the panel are interpreted relative to the arbitrary, high voltage signal injected into the electrode in order to compute touch coordinates.
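
A common surface-capacitive approximation, assumed here for illustration (the disclosure does not prescribe a specific formula), is that the finger's current splits among the four corner electrodes in proportion to its proximity to each corner:

```python
def touch_coordinates(i_ul, i_ur, i_ll, i_lr):
    """Estimate normalized touch coordinates from the four corner currents,
    measured relative to the injected drive signal (which here doubles as
    the haptic waveform)."""
    total = i_ul + i_ur + i_ll + i_lr
    x = (i_ur + i_lr) / total  # 0.0 = left edge, 1.0 = right edge
    y = (i_ll + i_lr) / total  # 0.0 = top edge, 1.0 = bottom edge
    return x, y
```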


In some embodiments, the reference signal and the signal according to embodiments of the invention can be injected alternately using a high frequency switching mechanism. The switching frequency can be beyond the user's sensitivity threshold. Thus, the rapidly alternating input sensing and haptic feedback signals can be perceived as parallel mechanisms from the user's perspective.


In yet another embodiment, other objects placed on or near the electrovibration surface can provide electrovibration tactile feedback. In one embodiment, the objects are capacitively coupled to the electrovibration surface and, therefore, can provide tactile feedback. For example, a toy placed on the electrovibration surface can be coated with an insulating film. The electrovibration haptic effect can be sensed by a user that touches the object.


In some embodiments, the conductive surface can be a wire along which fingers slide up and down. For example, the wire can be created as a combination of resistive and conductive threads. In other embodiments, actuation can be done through a graphite or lead pencil on paper. For example, an electrode placed underneath the paper provides the voltage, and the pencil would feel different in the user's hand, i.e., the tactile sensation occurs between the pencil and the user. In yet another embodiment, the conductive layer can be conductive paint, and the insulation layer can be non-conductive paint.


In some embodiments, electrovibration tactile feedback can be used to supplement the display of real-world objects. Embodiments of the invention can be overlaid on top of an image of an object or an actual 3D object. When the user touches the screen, the user can "feel" the image of the object or an actual 3D physical object. To accomplish this sensation, embodiments of the invention receive a digital representation of the image, which can be either 2D or 3D, convert the digital representation into a tactile representation, and then overlay the tactile representation over the digital representation. To correctly determine which part of the object the user intends to touch, at least one camera or other sensor can be used to track the user's viewpoint direction and correlate it with the finger position on top of the touch sensitive surface. In one embodiment, the user can feel the image of the object where the picture is stored on a computing device, and in other embodiments, the user can feel the image of the object where the picture is received over a network, such as the Internet. In some embodiments, the object comprises a key on a keypad or a text entry interface.


In some embodiments, the signal coupled to the electrode is based on a controller detecting that a first digit is placed in contact with the insulation surface and that a second digit is placed in contact with the insulation surface, where the first digit is stationary and the second digit is sliding along the insulation surface. In some embodiments, the system recognizes multiple touch points, and different tactile feedback is provided when different touch points are moved asynchronously. For example, the left and right hands could be placed on the touch panel. When the left hand is moving (while the right hand is kept static), signal A is provided. When the right hand is moving (while the left hand is kept static), signal B is provided. When both hands are moving, no signal or signal C is provided.
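
The two-hand example above reduces to a small selection function; a sketch follows, with the signal names A, B, and C taken from the example and everything else illustrative.

```python
from typing import Optional

def select_signal(left_moving: bool, right_moving: bool) -> Optional[str]:
    """Choose a haptic signal based on which touch points are moving."""
    if left_moving and not right_moving:
        return "A"
    if right_moving and not left_moving:
        return "B"
    if left_moving and right_moving:
        return "C"   # or None, per the alternative in the text
    return None      # both hands static: no feedback
```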


In another embodiment, a camera can be used in conjunction with the electrovibration techniques described herein. A user holds a camera, e.g., a camera included in a mobile device, and points the camera at an object. Software associated with the camera can detect the object that is being captured and cause a corresponding tactile feedback to be sensed when the user touches the screen.


In yet another embodiment, an object can be placed inside a protective case that covers the object. For example, in a museum setting, articles in the museum can be placed behind a glass case, e.g., artifacts, paintings, animals. The protective case can be implemented with embodiments of the invention. Thus, when a user touches the protective case, the user can feel electrovibration actuation that can correspond to the feel of the object within the protective case. In some embodiments, additional tracking techniques could be used to understand which part of the tactile map should be presented to the user. The tracking techniques could utilize gaze or face tracking techniques, as well as other techniques.


Some embodiments of the invention could allow for collaboration applications, where different sensations are provided to different users. Other embodiments may provide an application where erasing and/or sketching can be done using a finger with variable friction. Similarly, embodiments allow for tactile feedback for rubbing-based interaction. Other embodiments include tactile scroll wheels, where GUI items are augmented with tactile feedback provided by the scroll wheel of a mouse.


Yet another example of application of embodiments of the invention is augmenting a text entry technique with tactile sensations. For example, input keys can be displayed on a display screen, such that when a user touches the keys, the user feels a tactile feedback, e.g., while touch-typing or during a “swype”-style text or password input.


In yet another embodiment, amusement park rides can be augmented with electrovibration technology according to embodiments of the invention. For example, a slide that a person slides down can provide electrovibration feedback.


Electrovibration for Hand-Held Touch Surfaces


FIG. 13 is a conceptual diagram illustrating using an image capturing device for tactile feedback, according to one embodiment disclosed herein. The system 1300 includes an image capturing device 1305, an image processing module 1310, a tactile feedback driver 1315, and a touch device 1320. The image capturing device 1305 captures images, which may be still images or video, of an environment (e.g., an open-air or enclosed space), which are then transmitted to the image processing module 1310. The image capturing device 1305 has a view area 1345 that corresponds to an image taken by the device 1305. As shown, the view area 1345, and the resulting image, includes two objects 1330A-B with defined spatial locations in the environment and relative to each other. The image capturing device 1305 may be a camera (either a still-frame or video camera), an infrared (IR) detector (or any other RF spectrum detector), an ultrasonic imaging device, radar, sonar, and the like. Moreover, the image capturing device 1305 may be passive—e.g., it detects incoming visible light to generate an image—or active—e.g., the device 1305 includes an IR projector that transmits IR light which is then reflected and detected by an IR-sensitive camera. Moreover, the image capturing device may be either a stand-alone unit that is communicatively coupled to the image processing module 1310 or integrated into other elements in system 1300.


The image processing module 1310 may include different software and/or hardware elements that process the image received from the image capturing device 1305. In one embodiment, the image processing module 1310 may also transmit commands to the image capturing device 1305, such as changing the view area 1345 (and thus the image captured by the device 1305) by panning or zooming in or out. The image processing module 1310 may identify the objects 1330A-B within the image using any type of image processing technique. In one embodiment, the processing module 1310 may detect different colors or shapes that are used to classify and identify different objects 1330 in the image. For example, the image processing module 1310 may detect an octagonal shape that is red. Based on this information, the image processing module 1310 may identify that portion of the image as a stop sign. Nonetheless, the embodiments disclosed herein are not limited to any particular method or algorithm of processing images to identify objects.
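
To make the stop-sign example concrete, the sketch below flags red, roughly octagonal regions using OpenCV. It is one possible approach only (the disclosure expressly leaves the algorithm open); the color thresholds and area cutoff are assumptions, and OpenCV 4.x is assumed for the findContours return signature.

```python
import cv2
import numpy as np

def find_stop_signs(image_bgr):
    """Flag red, roughly octagonal regions as candidate stop signs."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around the hue axis, so combine two hue bands.
    mask = cv2.inRange(hsv, np.array([0, 120, 70]), np.array([10, 255, 255])) | \
           cv2.inRange(hsv, np.array([170, 120, 70]), np.array([180, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    signs = []
    for c in contours:
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        if len(approx) == 8 and cv2.contourArea(c) > 500:  # eight sides, not noise
            signs.append(cv2.boundingRect(c))  # (x, y, w, h) in pixels
    return signs
```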


Based on the identified objects, the image processing module 1310 may associate one or more tactile sensations with each object. Specifically, the module 1310 may assign a different electrical signal (i.e., electrovibration) to triangle 1330A than to circle 1330B, which provides different tactile sensations when the electric signal flows through a user as shown by FIGS. 2A-2B. The electrical signals may vary based on intensity (i.e., voltage amplitude), frequency, and wave shape, which can be combined to generate a plurality of different electrical signals, and thus, unique tactile sensations. The assignment of a tactile sensation to an object may be based on the object's shape, color, perceived texture, distance from the image capturing device 1305 to the object 1330, and the like. Moreover, different parts of the object may be assigned different electrical signals that yield different tactile sensations from other parts of the object. For example, if the image processing module 1310 detects an object that includes both a rough and a smooth portion, the module 1310 may assign two different electrical signals to the different textured portions.


The image processing module 1310 (or tactile feedback driver 1315) may use a perceptual code that defines how electrical signals are assigned to the identified objects in an image. The perceptual code may be stored in the image processing module 1310 or the tactile feedback driver 1315 when the module is fabricated or configured by a user of the system 1300 (or a combination of both). Generally, the perceptual code is a haptics rendering algorithm optimized for electrovibration devices for driving electrical signals that generate perceivable tactile sensations. If the user has learned how the perceptual code assigns electrical signals, once the user feels a particular tactile sensation, the user can correlate the perceived tactile sensation to a visual characteristic of the object—e.g., its shape, type, texture, color, distance from the image capturing device 1305, etc. For example, the perceptual code may require that all objects of a particular type (e.g., a stop sign) are assigned a certain tactile sensation generated by an electric signal with a particular frequency and wave shape. However, the image processing module 1310 may vary the intensities (i.e., voltage peak-to-peak values) of the electric signals to inform the user of the distance between the image capturing device 1305 and the object. As the distance between the object and the device 1305 decreases, the image processing module 1310 may increase the intensity of the electric signal assigned to the object. Thus, if the user is at or near the same location as the image capturing device 1305, the intensifying electric signal may indicate to the user that the object is approaching her location.
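
One way to realize such a perceptual code is a lookup table keyed by object type, with amplitude scaled by distance; the sketch below is illustrative only, and every frequency, wave shape, and voltage value in it is an assumption.

```python
# Illustrative perceptual code: frequency and wave shape identify the
# object type, while peak-to-peak amplitude encodes distance.
PERCEPTUAL_CODE = {
    "stop_sign": {"freq_hz": 80,  "shape": "square"},
    "triangle":  {"freq_hz": 200, "shape": "sine"},
    "circle":    {"freq_hz": 50,  "shape": "sine"},
}

def signal_for(object_type, distance_m, v_min=10.0, v_max=100.0, d_max=20.0):
    """Look up the object's signature and scale amplitude so that nearer
    objects produce stronger sensations."""
    params = dict(PERCEPTUAL_CODE[object_type])
    nearness = 1.0 - min(distance_m, d_max) / d_max   # 1.0 when adjacent
    params["v_pp"] = v_min + (v_max - v_min) * nearness
    return params
```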


In one embodiment, the image processing module 1310 may use the image and perceptual code to generate a tactile feedback map. The map may include the spatial relationship between the objects 1330A-B as well as the particular tactile sensation (or sensations) associated with each object 1330A-B. The image processing module 1310 may use the tactile feedback map when communicating with a visually impaired user. That is, instead of using visual cues for identifying a spatial relationship of an object (or objects) in an environment, the tactile feedback map is used to generate an electrovibration that provides spatial relationship information to a user, as well as other information such as the shape, color, or texture of the object. In one embodiment, the image processing module 1310 may constantly receive updated images (i.e., real-time updates) from the image capturing device 1305 which are used to revise the tactile feedback map.


The touch device 1320 includes a touch screen 1325 configured to track user interaction with the device. The touch screen 1325 may be similar to the touch screen discussed above—e.g., a capacitive sensing device used to track the location of a user's finger. In one embodiment, the touch device 1320 may be capable of tracking multiple points of user contact with the touch screen 1325 (i.e., multi-touch).


The tactile feedback driver 1315 may be used to facilitate communication between the touch device 1320 and the image processing module 1310. In one embodiment, the tactile feedback driver 1315 may receive location data that provides a location of user contact on the touch screen 1325. For example, the touch screen 1325 may inform the tactile feedback driver 1315 that the user's finger 1350 is tracing a path on the screen 1325 as shown by arrows 1335A-C. The tactile feedback driver 1315 may forward this location data to the image processing module 1310 to map the location of the user's finger 1350 onto the tactile feedback map. If the location of the user's finger 1350 is at a spatial location that matches the location of an identified object on the tactile feedback map, the image processing module transmits the electric signal corresponding to that object to the tactile driver 1315. The tactile driver 1315 may then generate the electric signal on one or more electrodes in the touch device to provide the electrovibration in the user's finger 1350 as shown in FIGS. 2A-2B and as discussed above. Once the user's finger 1350 moves from a location on the touch screen 1325 that no longer correlates to the spatial location of object 1330A in the tactile feedback map, the image processing module 1310 may instruct the tactile feedback driver 1315 to no longer provide the tactile sensation corresponding to object 1330A to the user. Ghosted portion 1340B illustrates the position in the touch screen 1325 that corresponds to the position of object 1330B in the tactile feedback map. If the user were to move her finger 1350 to portion 1340B, the image processing module 1310 would then instruct the tactile feedback driver 1315 to provide the electric signal assigned to object 1330B to the user. In this manner, the system 1300 is able to translate a captured image into a tactile feedback map that can be used to generate electrovibration to the user that corresponds to objects in the image.


In one embodiment, the tactile feedback driver 1315 and image processing module 1310 may communicate using digital signals. If the tactile feedback map dictates that the tactile feedback driver 1315 should generate a tactile sensation in the touch device 1320, the image processing module 1310 may send digital data identifying the electric signal to be provided to the user. The driver 1315 decodes the digital data and generates the analog electric signal corresponding to the tactile sensation in the touch device 1320. In one embodiment, the tactile feedback driver 1315 may not provide any electric signal when the user is not contacting the portions 1340A-B of the screen 1325 that correspond to identified objects 1330A-B. Alternatively, the driver 1315 may provide a baseline tactile sensation to the touch device 1320 when the user is not contacting the portions 1340A-B.
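
The digital handshake between module and driver might look like the following; the bit-field encoding is purely hypothetical, offered only to show how a compact code could be expanded into an analog waveform.

```python
import numpy as np

def decode_to_waveform(code: int, fs: int = 44_100, duration_s: float = 0.1):
    """Expand a digital sensation code into waveform samples.

    Hypothetical encoding: the low byte selects a frequency step, the
    next byte selects a peak-to-peak amplitude step.
    """
    freq_hz = 20 * (1 + (code & 0xFF))
    v_pp = 5.0 * ((code >> 8) & 0xFF)
    t = np.arange(0, duration_s, 1 / fs)
    return (v_pp / 2) * np.sin(2 * np.pi * freq_hz * t)
```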



FIG. 14 is a system diagram for capturing an image used for providing tactile feedback, according to one embodiment disclosed herein. The system 1400 includes the image capturing device 1305 communicatively coupled to a compute element 1410. The image capturing device 1305 may include a memory (not shown) that stores one or more images 1405 recorded by the device 1305. As mentioned previously, the image capturing device 1305 may be either active or passive and is not limited to any particular method for generating an image of a physical environment.


The image capturing device 1305 may forward the images 1405 to the compute element 1410, which includes a processor 1415 and memory 1420. The processor 1415 represents one or more processors (e.g., microprocessors) or multi-core processors. The memory 1420 may represent random access memory (RAM) devices comprising the main storage of the compute element 1410, as well as supplemental levels of memory, e.g., cache memories, non-volatile or backup memories (e.g., programmable or flash memories), read-only memories, and the like. In addition, the memory 1420 may be considered to include memory storage physically located in the compute element 1410 or on another computing device coupled to the compute element 1410. The memory 1420 includes an image processing module 1310, which may be implemented in software, hardware, or some combination of both. As discussed previously, the image processing module 1310 processes a received image 1405 and generates a tactile feedback map 1430 based on, for example, objects identified in the image 1405 and a defined perceptual code.


The compute element 1410 may be coupled to the touch device 1320 which includes a tactile feedback driver 1315, touch screen 1325, and touch detection module 1435. The tactile feedback driver 1315 may transmit positional data 1425 to the image processing module 1310 defining a position on the touch screen 1325 that the user is currently contacting. For example, the touch detection module 1435 may be coupled to a plurality of electrodes in the touch screen 1325 which are used to detect a change of capacitance that occurs when the user contacts the screen 1325. Once the image processing module 1310 receives the positional data 1425, it determines what, if any, tactile sensation should be provided to the user based on correlating the positional data 1425 to the tactile feedback map 1430. The tactile feedback driver 1315 may then receive a signal back from the image processing module 1310 which identifies a particular tactile sensation and generates an electric signal in the electrodes of the screen 1325 as discussed previously.


The components of system 1400 may be combined to form one or more integrated devices. For example, the image capturing device 1305, the compute element 1410, and the touch device 1320 may be integrated into a single device such as a cell phone, laptop, tablet computer, and the like. In this manner, the different elements may communicate using internal busses or other short-distance communication methods. Alternatively, only two of the three elements shown may be integrated. For example, the image capturing device 1305 may be integrated with the touch device 1320. Here, the compute element 1410 may communicate with the touch device 1320 and image capturing device 1305 using wired or wireless communication protocols—e.g., Ethernet, IEEE 802.11b, and the like. Further, one or more of the elements shown in FIG. 14, or an integrated device that is a combination of these elements, may be handheld such that they are easily portable by a user.



FIGS. 15A-15B illustrate the electrodes in a touch screen, according to embodiments disclosed herein. Specifically, FIG. 15A illustrates a tactile feedback system 1500 that provides a single tactile sensation across the electrodes 1505A-G. For clarity, any outer layers of the touch screen 1325 that may cover the electrodes 1505A-G have been removed. Once the tactile feedback driver 1315 receives an instruction from the image processing module (not shown) to provide a specific tactile sensation, the driver 1315 generates the corresponding tactile sensation on all of the electrodes. However, unlike mechanical vibrotactile actuation where the entire device vibrates, electrovibration is localized such that it is only felt at a location that is contacting the touch screen 1325—e.g., at or near the tip of the finger touching the screen 1325. However, the electrovibration may still generate the tactile sensation in the user even if the user is holding a conductive object—e.g., a stylus—that is moving across the screen 1325. Moreover, the user may also feel residual effects of the tactile sensation when a user's appendage is above the screen 1325—i.e., the screen 1325 and the user are separated by a layer of air. As used herein, a user (or conductive object) can "contact" a tactile feedback screen either by directly contacting the screen or by being close enough to the screen to feel the residual effects without direct contact.


Using the techniques shown in FIGS. 2A-2B, the tactile feedback driver 1315 may provide the electric signal V(t) either at the electrodes 1505A-G associated with the touch screen 1325 or at the electrode 1510 associated with the handle 1515, while the other electrode (or electrodes) is set to a reference voltage (e.g., ground). For example, the user may use one hand to contact the screen 1325 using a finger or palm while holding the device with the other hand at the handle 1515. Grasping the handle electrically connects the user with the electrode 1510. Once the user contacts the touch screen 1325, the electrodes 1505 and 1510 are electrically coupled and the electrovibration is perceived by the user. As such, a second user, who is not electrically coupled to electrode 1510, would not perceive the tactile sensation if she also contacts the touch screen 1325.


Nonetheless, any design of the touch device 1320 is contemplated so long as the user is able to contact electrode 1510. For example, the touch device 1320 may include a connector mechanism that connects the user to electrode 1510 so that the user does not need to come into direct contact with the electrode 1510 at the outside surface of the device 1320. Accordingly, the user may hold the device 1320 in any manner and still receive tactile feedback. The connector mechanism may be, for example, a wire with a conductor that wraps around a user's wrist.


The tactile feedback driver 1315 is connected to each electrode 1505A-G at a single node. Thus, the driver 1315 generates one electrical signal that is transmitted to each of the electrodes 1505. Stated differently, the tactile feedback driver 1315 controls the electrodes 1505A-G as if they were one single electrode. Although not shown, the tactile feedback driver 1315 may also be connected to the electrode 1510. In one embodiment, the electrodes 1505 may also be used for determining the location of user contact on the screen (e.g., capacitive touch detection) or for manipulating a display material (e.g., liquid crystal) for displaying an image on the screen 1325. Alternatively, the electrodes 1505 for tactile feedback may be independent of other systems in the touch device 1320, such as the display or touch detection systems. Moreover, instead of a plurality of electrodes 1505, the system 1500 may include only one electrode for providing tactile feedback. Also, the electrodes 1505 may be configured to provide electrovibration in only a portion of the touch screen 1325.



FIG. 15B illustrates a system 1501 compatible with multi-touch electrovibration sensing. In contrast to FIG. 15A, FIG. 15B illustrates that the tactile feedback driver 1315 electrically connects to each electrode via a separate electrical connection. These connections enable the driver 1315 to generate different electrical signals on the different electrodes 1505A-G. That is, the tactile feedback driver 1315 may be configured to generate different electric signals such that a finger contacting the touch screen 1325 in proximity to electrode 1505A receives a different tactile sensation than a different finger contacting the touch screen 1325 in proximity to electrode 1505B. In one embodiment, vertical electrodes may be added and separately controlled by the tactile feedback driver 1315 to provide additional granularity. For example, if the user had two fingers that were both in proximity to the same electrode 1505A—i.e., contacting the touch screen 1325 at the same horizontal line—the tactile feedback driver 1315 may be instructed by the image processing module to instead use two vertical electrodes that are each proximate to only one of the fingers for providing two different tactile sensations to the fingers. Of course, other electrode designs are contemplated for supporting multi-touch—e.g., concentric circles or rectangles, straight electrodes arranged at a slant, and the like.


In one embodiment, two or more of the electrodes 1505 may be arranged into groups, where the driver 1315 generates an electric signal for each group. Further, certain electrodes may not be connected to the driver 1315, or alternatively, the electrodes 1505 may be spaced in such a way as to mitigate the likelihood that two different electric signals may be felt at the same point of user contact. In another embodiment, the tactile feedback driver 1315 may include a switching network for switching between system 1500 shown in FIG. 15A and system 1501 shown in FIG. 15B. That is, a single driver 1315 may be used either to generate a single electric signal on all the electrodes 1505 (single-touch) or to generate a plurality of electric signals that are transmitted in parallel to multiple electrodes 1505 (multi-touch).



FIG. 16 is a flow diagram for providing tactile feedback based on a captured image, according to one embodiment disclosed herein. The method 1600 begins at step 1605, where an image capturing device captures an image of an environment. The image may be based on detecting the electromagnetic spectrum (e.g., visible light, IR, or radar) as well as physical waves (e.g., sound waves). The image capturing device (e.g., a camera on a mobile phone that detects visible light) may then process the detected information to produce an image of the environment.


In one embodiment, in addition to producing an image of the environment, the image capturing device may also process the resulting image to identify any location identifiers, such as a landmark, Quick Response (QR) Code®, street sign, GPS marker, and the like. For example, if the image capturing device detects a location identifier within the image, the device may generate data (e.g., metadata) associated with the location identifier. The location identifier may be used to retrieve a predefined tactile feedback map from a memory rather than having to generate the map directly from the captured image. For example, if the image capturing device detects a QR code corresponding to a particular exhibit in a museum, the image processing module may retrieve from memory a predefined tactile feedback map associated with the QR code. Moreover, the tactile feedback map may be created using a computer generated image that is based on a physical real-world environment. For example, the tactile feedback map may be generated based on a computer rendering of the museum.


At step 1610, the image capturing device may transmit the captured image to the image processing module for generating (or retrieving) a tactile feedback map based on data contained within the image. In one embodiment, the image processing module identifies one or more objects and their spatial locations within the image. The spatial location may include a three dimensional mapping of the objects relative to each other as well as a distance from the object to the image capturing device. For example, the image processing module may determine the distance from the object to the camera and assign a particular intensity (i.e., voltage amplitude) to an electric signal corresponding to the object. Changing the intensity based on distance may change the perceived tactile sensation—e.g., the sensation varies from smooth to bumpy as shown in FIG. 9. Moreover, the area in the tactile feedback map dedicated to the object may vary depending on the distance from the object to the camera. That is, the map may outline an area for an object that grows larger as the object moves closer to the camera. The method for assigning electrovibrational signals to particular objects based on the physical characteristics of the object or its location may be defined by a perceptual code.


In one embodiment, the image processing module may calibrate the tactile feedback map to correspond to the dimensions of a touch screen. This may require the image processing module to increase or decrease the resolution of the captured image in order to coordinate the spatial locations of the objects to the physical dimensions of the touch screen, or even ignore certain parts of the captured image—e.g., the captured image is a circle but the touch screen is a square.
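
One simple calibration policy is uniform scaling with centering; the sketch below assumes that policy (the disclosure only requires that the two resolutions be reconciled, not this particular approach).

```python
def map_to_screen(obj_x, obj_y, map_w, map_h, screen_w, screen_h):
    """Scale a tactile-map coordinate to touch-screen coordinates,
    preserving aspect ratio and centering the map on the screen."""
    scale = min(screen_w / map_w, screen_h / map_h)
    off_x = (screen_w - map_w * scale) / 2
    off_y = (screen_h - map_h * scale) / 2
    # Screen regions outside the scaled map produce no feedback.
    return obj_x * scale + off_x, obj_y * scale + off_y
```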


In one embodiment, the image capturing device may be integrated with other components in a tactile feedback system, such as a touch device, the image processing module, and the like. Moreover, the image capturing device may provide updated images in real time. Accordingly, the module may be configured to account for visual displacement such as parallax—i.e., as the image capturing device moves from side to side, objects in the distance appear to move more slowly than objects closer to the device. In this manner, the tactile feedback map may be updated to provide to the user the same spatial locations of the objects as would be obtained by visually inspecting the captured image. Alternatively, the image capturing device may capture a single image, or only capture an image based on a user prompt.


In one embodiment, the image or a signal from the image capturing device may be used to retrieve the tactile feedback map from memory or from a remote storage location (e.g., the map is downloaded from a server via a wireless communication link). For example, the image capturing device may detect a location identifier (e.g., a QR code) which prompts the image processing module to fetch a corresponding tactile feedback map from memory or a remote server via a network connection. In this embodiment, the image capturing device that took the image on which the preconfigured tactile feedback map is based may be located remotely from the tactile device. For example, the user may enter a museum and use an image capturing device integrated into a handheld tactile device to identify a QR code which instructs the tactile device to retrieve a preconfigured tactile map from memory or via a communication network. However, the retrieved tactile feedback map may be derived from an image captured previously by a camera that is not directly or indirectly coupled to the tactile feedback device. This process enables the image processing module to avoid having to generate the tactile feedback map from a captured image since this may have been performed previously (or contemporaneously) by a separate image capturing device and image processing system.
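
Retrieval of a preconfigured map keyed by a detected identifier could be as simple as the following; the endpoint URL and JSON payload schema are hypothetical, invented for illustration.

```python
import json
import urllib.request

def fetch_predefined_map(identifier: str):
    """Fetch a preconfigured tactile feedback map keyed by a location
    identifier (e.g., a decoded QR payload)."""
    url = f"https://example.com/tactile-maps/{identifier}.json"  # hypothetical
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read())
```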


In one embodiment, instead of using an integrated camera to adjust the tactile feedback map, the tactile device may include a different location system (e.g., GPS, accelerometers, IR location systems, and the like) to change the tactile feedback map, regardless whether the tactile device generated the tactile feedback map from an image captured by a built-in camera or retrieved a preconfigured tactile feedback map. For example, the tactile feedback device may use a GPS receiver to determine that the user has moved and update the size or intensity of the electrovibration associated with the objects in the tactile feedback map to represent the distance between the objects and the tactile feedback device. Here, the tactile device uses a separate location system—i.e., not an image processing system—to update the tactile feedback map.


At step 1615, the image processing module may receive a location (or locations) of user contact on a touch screen. The user contact may be initiated by a user's finger, palm, or other appendage contacting the touch screen. Moreover, in some touch systems the user may not have to come into physical contact with the touch screen to identify a location of user contact. Nonetheless, although the touch system may be able to identify a location of the user's appendage without direct physical contact, to perceive the electrovibration resulting from the electrostatic and frictional forces discussed above, the user may need to be in direct contact with the touch screen. The embodiments disclosed herein are not limited to any particular touch screen or touch detection method.


Additionally, the user may be able to zoom in or out of the tactile feedback map using the touch screen. The user may provide input—e.g., a certain motion or motions on the touch screen—that indicates the user would like to zoom in at a particular area. Referring to FIG. 13, the area 1340A corresponding to the object may be too small for the user to correctly identify its shape. After receiving the command to zoom in, the image processing module may update the tactile feedback map to increase the area occupied by the object in the map. As the area of the object in the tactile feedback map increases, so does the area 1340A within the touch screen 1325 that corresponds to the object. Thus, the user may more easily identify the tactile sensation or the shape of the object after zooming in. If the user continues to zoom in on the portion 1340A, object 1330B may no longer be within the tactile feedback map and cannot be felt on the touch screen 1325. Once the user is satisfied that she has correctly identified the object corresponding to portion 1340A, she may use the touch screen 1325 to indicate a zoom out. This may lead to object 1330B once again being included within the tactile feedback map, and thus, portion 1340B will once again produce the tactile sensation of object 1330B if the user contact is within that defined area.


At step 1620, the image processing module determines a position on the tactile feedback map based on the location of the user contact on the touch screen. That is, the location of the user contact is correlated to the tactile feedback map. In one embodiment, each location (an X and Y coordinate) on the touch screen may correspond to a unique location within the tactile feedback map. Thus, a location on the touch screen may directly correlate to a location in the tactile feedback map.


The tactile feedback map may be a data structure (e.g., lookup table, database, virtual map, etc.) stored in memory with a plurality of locations of the touch screen that correspond to objects captured in the image. The image processing module may update the map as new image data is received. For example, at one time, a particular location of the touch screen may correspond to an object but at a second time, because either the image capturing device or the object has moved, the object may correspond to a different location of the touch screen or be removed completely from the tactile feedback map.
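
As a data-structure sketch, the map can be a grid-backed lookup table; the cell size, field names, and class layout below are all assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class MapEntry:
    object_id: str
    sensation_code: int  # digital code the driver expands into a waveform

class TactileFeedbackMap:
    """Grid of screen-aligned cells, each optionally holding an object."""

    def __init__(self, cell_px: int = 8):
        self.cell_px = cell_px
        self.cells: Dict[Tuple[int, int], MapEntry] = {}

    def set_region(self, x0, y0, x1, y1, entry: MapEntry):
        """Mark the rectangle (x0, y0)-(x1, y1), in pixels, as the object."""
        for cx in range(x0 // self.cell_px, x1 // self.cell_px + 1):
            for cy in range(y0 // self.cell_px, y1 // self.cell_px + 1):
                self.cells[(cx, cy)] = entry

    def lookup(self, touch_x, touch_y) -> Optional[MapEntry]:
        """Return the entry under a touch location, or None (no feedback)."""
        return self.cells.get((touch_x // self.cell_px,
                               touch_y // self.cell_px))
```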


In other embodiments, the image processing module may use the tactile feedback map or the captured image to display an image on a device. For example, the touch screen may be integrated with a display system (e.g., an LCD display) that permits the image processing module to use the tactile feedback map to display an image on the touch screen. Doing so may enhance the experience of the user by permitting her to see the objects as well as sense tactile feedback from contacting the objects on the touch screen. Referring to FIG. 13, both the triangle object 1340A and circle object 1340B may be displayed on the touch screen 1325 rather than only being sensed by tactile feedback as shown.


At step 1625, the image processing module may determine a tactile feedback sensation corresponding to the identified position of the user contact. For example, the data structure of the tactile feedback map may also include an entry that stores the tactile sensation that was assigned to each object using the perceptual code. The X and Y coordinates provided by the touch screen may then be used to look up a location in the tactile feedback map. If an object is at that location, the map provides the tactile sensation associated with that object. As new images are received, the image processing module may update the data structure to reflect new objects or an object's positional change.


Once the sensation is identified, the image processing module may send an instruction to a tactile feedback driver to generate the electric signal which provides the tactile sensation to a user contacting the touch screen. For example, the tactile feedback map may store a digital code that the image processing module transmits using a digital communication technique or an internal bus to the tactile feedback driver. Based on the code, the driver generates the electric signal that produces the desired tactile sensation. In one embodiment, the image processing module may determine two different tactile sensations based on two different user contact locations on the touch screen. The module then transmits the instructions to the tactile feedback driver to provide the different sensations at the different locations on the touch screen.


At step 1630, the tactile feedback driver generates the electric signal on one or more electrodes in the touch screen. For example, the tactile feedback driver may use electrodes arranged as shown in FIGS. 15A-15B. Thus, the tactile feedback driver may generate a single electric signal or multiple signals that correspond to different electrodes in the touch screen. For example, the electrodes may form a grid, which permits the tactile feedback driver to transmit the electric signal to only the electrodes that are proximate to the location of the user contact without the signal being felt at other points of user contact.


In one embodiment, auditory data may be provided along with the tactile feedback. The image processing module may identify different characteristics of the objects in a captured image and transmit these characteristics to the touch device which may then use a speaker for conveying the characteristics to the user. For example, the image processing module may identify the color of the object in an image. Once the image processing module determines that the location of the user contact correlates to the location of the object in the tactile feedback map, the module may transmit instructions to the touch device to use the speakers to convey the object's color to the user.


CONCLUSION

Embodiments of the invention offer several significant advantages over conventional mechanical vibrotactile actuation on touch screens. The absence of mechanical motion in electrovibration actuation techniques is the most immediate difference between embodiments of the invention and conventional mechanical actuation. This feature has several notable implications.


Regardless of the type of material, any plate of material will flex when actuated. This problem is exacerbated when the plate is large and actuation forces are applied on its periphery, which is common when designing tactile feedback for touch surfaces. Consequently, not only are vibrotactile solutions not feasible for large interactive surfaces, but even for small screen sizes, different parts of the screen would have different magnitudes of physical displacement and, therefore, different tactile sensations. Electrovibration, on the other hand, does not suffer from this effect because electrical potential is evenly and instantaneously distributed over the entire plate. The tactile feedback in embodiments of the invention is uniform across surfaces of any size.


When a periodic force vibrates a plate of material, as is the case for vibrotactile displays, the plate's spring properties combine with damping, which is inherent due to the attachment of the plate to an enclosure or chassis, and together they result in a highly attenuated frequency response of the plate. As a result, for a signal of the same amplitude, the mechanical displacement of the plate will be different at different frequencies, peaking close to the plate's resonant mechanical frequency and then dramatically decreasing. These complex signal attenuations make it essentially impossible to engineer a flat response—even software amplitude correction cannot hope to counter these laws of physics. Because electrovibration requires no moving parts, it suffers from neither of these effects.


Byproduct noise is a serious consideration when designing end-user interactive systems. For example, most people accept that their kitchen blenders are noisy, but people use them rarely and only briefly, so the noise level is tolerated. This level of noise would not be acceptable in a computing device we hope to use for extended periods of time. Unfortunately, physical vibrations are often noisy, e.g., consider a mobile phone vibrating on a table. Compounding this problem is the fact that interactive surfaces tend to have large surface areas, which displace a considerable volume of air, essentially turning their screens into speakers. Because there is no physical motion in embodiments of the invention, the electrovibration surface is silent.


Moving parts naturally wear over time, which alters their performance characteristics and may eventually lead to failure. In addition, the vibrating screen must be separated from the enclosure with a small seam to accommodate movement, which allows dust, liquid and other debris inside the device. Sealing this seam, however, dampens vibrations, which decreases the intensity of tactile feedback. None of these issues are relevant in embodiments of the invention. In addition, embodiments of the invention can be practiced without requiring a high-voltage signal. In some examples, tactile sensations can be perceived with as little as 8 Volts peak-to-peak.


As described, vibrotactile actuation delivers tactile feedback by displacing the entire surface. As a result, all fingers resting on the surface will be stimulated, and any physical object located on the surface is likely to chatter and move around, which is undesirable. In general, there is no way to localize tactile feedback to particular digits when vibrotactile feedback is used with interactive surfaces.


This disclosure has been presented for purposes of illustration and description but is not intended to be exhaustive or limiting. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiments were chosen and described in order to explain principles and practical application, and to enable others of ordinary skill in the art to understand the disclosure for various embodiments with various modifications as are suited to the particular use contemplated.


Although illustrative embodiments have been described herein, it is to be understood that the embodiments are not limited to those precise embodiments, and that various other changes and modifications may be effected therein by one skilled in the art without departing from the scope or spirit of the disclosure.

Claims
  • 1. A method, comprising: receiving a tactile feedback map based on an image, wherein the tactile feedback map stores a spatial location of at least one object in the image and at least one tactile sensation associated with the object; receiving a position of user contact on a touch screen; identifying a tactile sensation by correlating the position of the user contact to a location within the tactile feedback map; and generating a first electrical signal corresponding to the tactile sensation on at least one electrode associated with the touch screen.
  • 2. The method of claim 1, further comprising, before generating the first electrical signal, determining that the spatial location of the object in the tactile feedback map includes the location within the tactile feedback map.
  • 3. The method of claim 1, wherein the electrical signal is predefined to produce an electrovibration in a user in contact with the touch screen.
  • 4. The method of claim 1, wherein the tactile feedback map represents the spatial relationship between the at least one object and at least one other object identified from the received image.
  • 5. The method of claim 1, further comprising: capturing the image of an environment; and generating the tactile feedback map based on the image.
  • 6. The method of claim 1, wherein at least one of the tactile feedback map and the image are retrieved from a computing device separate from the touch screen via a communication network.
  • 7. The method of claim 1, further comprising: after receiving the tactile feedback map, receiving updated images; and updating the spatial relationship of the object in the tactile feedback map based on the updated images.
  • 8. The method of claim 7, further comprising assigning a different tactile sensation to the object or a size of the object in the tactile feedback map based on a distance from the object to an image capturing device capturing the updated images, wherein the distance is derived from the updated images.
  • 9. The method of claim 1, further comprising: receiving a second position of user contact on the touch screen; identifying a second tactile sensation by correlating the second position of the user contact to a second location within the tactile feedback map; and generating a second electrical signal corresponding to the second tactile sensation on a different electrode associated with the touch screen, wherein the first and second electrical signals are generated on the respective electrodes simultaneously.
  • 10. A touch device, comprising: a touch screen configured to identify a position of user contact, wherein the touch screen is configured to receive a tactile feedback map based on an image, wherein the tactile feedback map stores the spatial location of at least one object in the image and at least one tactile sensation associated with the object, and wherein the touch device identifies a first electrical signal by correlating the position of the user contact to a location within the tactile feedback map; and a signal driver configured to generate the first electrical signal corresponding to the tactile sensation on at least one electrode in the touch device.
  • 11. The touch device of claim 10, wherein the touch screen further comprises: an electrode; and an insulation surface disposed on the electrode, wherein the signal driver is configured to cause the first electrical signal to be coupled to a user of the touch device such that a tactile sensation is perceived in at least one digit of the user that slides on the insulation surface.
  • 12. The touch device of claim 10, further comprising at least one electrode in direct electrical contact with the user.
  • 13. The touch device of claim 10, further comprising a plurality of electrodes in the touch screen, wherein the signal driver generates the first electric signal on a first one of the plurality of electrodes and a different, second electric signal on a second one of the plurality of electrodes simultaneously.
  • 14. The touch device of claim 10, wherein, upon determining that the spatial location of the object in the tactile feedback map matches the correlated location of the user contact within the tactile feedback map, the signal driver is configured to generate the first electrical signal.
  • 15. A touch device, comprising: a touch screen configured to identify a position of user contact; an image processing module that receives an image of an environment generated from an image capturing device and generates a tactile feedback map based on the image, wherein the tactile feedback map stores the spatial location of at least one object in the image and at least one tactile sensation associated with the object, and wherein the image processing module identifies a tactile sensation by correlating the position of the user contact received from the touch screen to a location within the tactile feedback map; and a signal driver configured to generate a first electrical signal corresponding to the tactile sensation on at least one electrode associated with the touch screen.
  • 16. The touch device of claim 15, wherein the image capturing device is configured to provide, at a periodic interval, updated images of the environment, and wherein the image capturing device is integrated into the touch device.
  • 17. The touch device of claim 15, further comprising a first electrode associated with the touch screen and a second electrode configured to be in direct electric contact with a user, wherein the first and second electrodes create an electrical circuit that permits the tactile sensation to be felt by the user once the user contacts the touch screen and moves a user appendage contacting the touch screen.
  • 18. The touch device of claim 17, wherein the touch screen further comprises: the first electrode; and an insulation surface disposed on the first electrode.
  • 19. The touch device of claim 17, wherein the second electrode is disposed on an outer surface of the touch device.
  • 20. The touch device of claim 15, wherein the image processing module determines whether the spatial location of the object in the tactile feedback map includes the location within the tactile feedback map.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation-in-part of co-pending U.S. patent application Ser. No. 13/092,564 entitled “ELECTROVIBRATION FOR TOUCH SURFACES” attorney docket number DISN/0062, filed Apr. 22, 2011 and Ser. No. 13/092,572 entitled “ELECTROVIBRATION FOR TOUCH SURFACES” attorney docket number DISN/0062.02, filed Apr. 22, 2011. These related patent applications are herein incorporated by reference in their entirety.

Provisional Applications (4)
Number Date Country
61430125 Jan 2011 US
61347068 May 2010 US
61430125 Jan 2011 US
61347068 May 2010 US
Continuation in Parts (2)
Number Date Country
Parent 13092564 Apr 2011 US
Child 13603833 US
Parent 13092572 Apr 2011 US
Child 13092564 US