The present invention generally relates to haptic feedback and more particularly to systems and methods for using textures in graphical user interface widgets.
Over the past several years, the use of devices that incorporate touch-screens and haptic feedback has grown exponentially. These devices are used as portable organizers, telephones, music players, and gaming systems. As haptic technology improves, devices may incorporate haptic effects configured to simulate textures. Accordingly, systems and methods for using textures in graphical user interface widgets are needed.
Embodiments of the present invention provide systems and methods for using textures in graphical user interface widgets. For example, in one embodiment, a system for using textures in graphical user interface widgets comprises: an actuator configured to receive a haptic signal and output a haptic effect based at least in part on the haptic signal, the haptic effect configured to simulate a texture; a touch-sensitive interface configured to detect a user interaction and output an interface signal; and a processor in communication with the actuator and the touch-sensitive interface, the processor configured to: receive the interface signal; receive a display signal comprising a plurality of pixels defining a display area; determine a first texture associated with a first group of pixels defining a first section of the display area; determine a second texture associated with a second group of pixels defining a second section of the display area; and transmit a haptic signal configured to cause the actuator to: output a first haptic effect configured to simulate the first texture if the user interaction is associated with the first section of the display area, and output a second haptic effect configured to simulate the second texture if the user interaction is associated with the second section of the display area.
These illustrative embodiments are mentioned not to limit or define the invention, but rather to provide examples to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, which provides further description of the invention. Advantages offered by various embodiments of this invention may be further understood by examining this specification.
These and other features, aspects, and advantages of the present invention are better understood when the following Detailed Description is read with reference to the accompanying drawings.
Embodiments of the present invention provide systems and methods for using textures in graphical user interface widgets.
One illustrative embodiment of the present invention comprises a messaging device, such as a mobile phone. In the illustrative embodiment, the messaging device comprises the Samsung Haptic Phone (SCH-W420) equipped with Immersion Corporation's TouchSense® 3000, TouchSense® 4000, or TouchSense® 5000 vibrotactile feedback systems, formerly known as Immersion Corporation's VibeTonz® vibrotactile feedback system. In other embodiments, different messaging devices and haptic feedback systems may be utilized.
The illustrative messaging device comprises a display, a speaker, a network interface, a memory, and a processor in communication with each of these elements. The illustrative messaging device also comprises a touch-sensitive interface and an actuator, both of which are in communication with the processor. The touch-sensitive interface is configured to sense a user's interaction with the messaging device, and the actuator is configured to output a haptic effect. The illustrative messaging device may further comprise a manipulandum configured to detect a user interaction and transmit an interface signal associated with the user interaction to the processor.
In the illustrative messaging device, the display is configured to display a graphical user interface to the user. The graphical user interface may comprise virtual objects, for example icons, buttons, or a virtual keyboard. The illustrative messaging device further comprises a touch-sensitive interface, such as a touch-screen, mounted overtop of the display. The touch-sensitive interface allows the user to interact with the virtual objects displayed in the graphical user interface. For example, in one embodiment, the graphical user interface may comprise a virtual keyboard. In such an embodiment, the touch-sensitive interface allows the user to touch a key on the virtual keyboard to input the alphanumeric character associated with that key. This functionality may be used to type messages, or otherwise interact with objects in the graphical user interface.
In the illustrative messaging device the processor is configured to determine a haptic effect and transmit a haptic signal corresponding to the haptic effect to an actuator configured to output the haptic effect. In the illustrative messaging device, this haptic effect simulates a texture that the user feels on the surface of the touch-sensitive interface. The simulated texture may be associated with the user interface shown on the display. For example, the display may show an icon comprising the shape of a rock. In such an embodiment, the processor may determine a haptic effect configured to simulate the texture of the rock on the surface of the touch-sensitive interface. Then, the processor will transmit a haptic signal to an actuator configured to output the haptic effect. When the actuator receives the haptic signal it will output a haptic effect, such as a vibration, at a frequency configured to cause the surface of the touch-sensitive interface to approximate the texture of the rock.
In the illustrative embodiment, the processor may implement a haptic map to determine the haptic effect. For example, in the illustrative embodiment, the processor may receive a display signal comprising a plurality of pixels, each of the pixels associated with a color. For example, in the illustrative embodiment, each pixel in the display signal may be associated with the color red, green, or blue, and may further be associated with an intensity for each color. In the illustrative embodiment, the processor will assign a haptic value to each color and further assign a haptic intensity associated with the intensity of each color. Then, the processor will transmit a haptic signal comprising the haptic values and haptic intensities to an actuator configured to output the haptic effect.
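By way of a non-limiting, hypothetical sketch, the pixel-to-haptic mapping described above might be expressed as follows in Python. The per-color haptic values, the pixel representation, and the function name are illustrative assumptions and do not correspond to any particular embodiment.

```python
# Hypothetical haptic map: each pixel's color is assigned a haptic value, and
# the color's intensity (0-255) is scaled into a normalized haptic intensity.
HAPTIC_VALUE_BY_COLOR = {"red": 1, "green": 2, "blue": 3}  # illustrative values

def build_haptic_map(pixels):
    """pixels: iterable of (color_name, color_intensity) pairs, intensity 0-255."""
    haptic_map = []
    for color, intensity in pixels:
        haptic_value = HAPTIC_VALUE_BY_COLOR[color]
        haptic_intensity = intensity / 255.0  # scale color intensity to 0..1
        haptic_map.append((haptic_value, haptic_intensity))
    return haptic_map

# Example: three pixels of differing color and intensity.
print(build_haptic_map([("red", 255), ("green", 128), ("blue", 32)]))
```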
In the illustrative embodiment, the processor may further determine the haptic effect based on an external trigger. For example, in the illustrative embodiment, the processor is configured to receive an interface signal from a touch-sensitive interface configured to detect a user interaction. Then, in the illustrative embodiment, the processor will determine the haptic effect based at least in part on the interface signal. For example, the processor may modify the haptic value or haptic intensity based at least in part on the interface signal. In the illustrative embodiment, if the touch-sensitive interface detects a high speed or high pressure user interaction, the processor will determine a higher intensity haptic effect.
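A minimal sketch of this kind of trigger-based modulation is shown below, assuming normalized speed and pressure measurements from the touch-sensitive interface; the scaling rule and reference values are invented for illustration.

```python
def modulate_intensity(base_intensity, speed, pressure,
                       speed_ref=1.0, pressure_ref=1.0):
    """Scale a base haptic intensity by detected interaction speed and pressure.

    speed_ref and pressure_ref are hypothetical reference values; the result is
    clamped to the 0..1 drive range of the actuator.
    """
    scale = max(speed / speed_ref, pressure / pressure_ref)
    return min(1.0, base_intensity * scale)

# A high-pressure interaction roughly doubles the effect, up to the clamp.
print(modulate_intensity(0.4, speed=0.5, pressure=2.0))  # 0.8
```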
The illustrative messaging device may output a haptic effect for a multitude of purposes. For example, in one embodiment, the haptic effect may act as a confirmation that the processor has received an interface signal associated with a user interaction. For example, the graphical user interface may comprise a button, and the touch-sensitive interface may detect user interaction associated with pressing the button and transmit an interface signal to the processor. In response, the processor may determine a haptic effect to confirm receiving the interface signal. In such an embodiment, the haptic effect may cause the user to feel a texture on the surface of the touch-sensitive interface. In the illustrative embodiment, the processor may further determine haptic effects for other purposes. For example, the illustrative messaging device may output a texture to alert the user to boundaries on the display or as an identification for objects such as icons on the surface of the display.
This illustrative example is given to introduce the reader to the general subject matter discussed herein. The invention is not limited to this example. The following sections describe various additional non-limiting embodiments and examples of systems and methods for using textures in graphical user interface widgets.
Referring now to the drawings in which like numerals indicate like elements throughout the several figures,
The processor 110 is configured to execute computer-executable program instructions stored in memory 122. For example, processor 110 may execute one or more computer programs for messaging or for generating haptic feedback. Processor 110 may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), one or more field programmable gate arrays (FPGAs), or state machines. Processor 110 may further comprise a programmable electronic device such as a programmable logic controller (PLC), a programmable interrupt controller (PIC), a programmable logic device (PLD), a programmable read-only memory (PROM), an electronically programmable read-only memory (EPROM or EEPROM), or other similar devices.
Memory 122 comprises a computer-readable medium that stores instructions which, when executed by processor 110, cause processor 110 to perform various steps, such as those described herein. Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage or transmission device capable of providing processor 110 with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. In addition, various other devices may include computer-readable media, such as a router, private or public network, or other transmission device. The processor 110 and the processing described may be in one or more structures, and may be dispersed throughout one or more structures.
Processor 110 is in communication with the network interface 112. The network interface 112 may comprise one or more methods of mobile communication, such as infrared, radio, Wi-Fi, or cellular network communication. In other variations, the network interface 112 comprises a wired network interface, such as Ethernet. The messaging device 102 can be configured to exchange messages or virtual message objects with other devices (not shown) over networks such as a cellular network and/or the Internet. Embodiments of messages exchanged between devices may comprise voice messages, text messages, data messages, or other forms of digital messages.
The processor 110 is also in communication with one or more touch-sensitive interfaces 114. In some embodiments, touch-sensitive interface 114 may comprise a touch-screen or a touch-pad. For example, in some embodiments, touch-sensitive interface 114 may comprise a touch-screen mounted overtop of a display configured to receive a display signal and output an image to the user. In other embodiments, touch-sensitive interface 114 may comprise an optical sensor or another type of sensor. In one embodiment, touch-sensitive interface 114 may comprise an LED detector. For example, in one embodiment, touch-sensitive interface 114 may comprise an LED finger detector mounted on the side of display 116. In some embodiments, the processor is in communication with a single touch-sensitive interface 114; in other embodiments, the processor is in communication with a plurality of touch-sensitive interfaces, for example, a first touch-screen and a second touch-screen. The touch-sensitive interface 114 is configured to detect user interaction and, based on the user interaction, transmit signals to processor 110. In some embodiments, touch-sensitive interface 114 may be configured to detect multiple aspects of the user interaction. For example, touch-sensitive interface 114 may detect the speed and pressure of a user interaction, and incorporate this information into the interface signal.
In the embodiment shown in
In some embodiments, processor 110 receives signals from touch-sensitive interface 114 that are associated with an interaction with the graphical user interface shown on display 116. For example, in one embodiment, touch-sensitive interface 114 may comprise a touch-screen and a graphical user interface on display 116 may comprise a virtual keyboard. In such an embodiment, when the user interacts with a section of the touch-screen that overlays one of the keys of the virtual keyboard, the touch-screen will send an interface signal to processor 110 corresponding to that user interaction. Based on the interface signal, processor 110 will determine that the user pressed one of the keys on the virtual keyboard. This functionality allows the user to interact with other icons and virtual objects on the display 116. For example, in some embodiments the user may flick the touch-screen to move a virtual ball or turn a virtual knob.
As shown in
In some embodiments, determining the haptic effect may comprise using a haptic map. In such an embodiment, determining the haptic effect may comprise mapping the display signal to the actuators. For example, the display signal may comprise a plurality of pixels, each of the pixels associated with a color. In such an embodiment, each pixel may be associated with the color red, green, or blue; each color may further be associated with an intensity, for example, an intensity of 1-8. In such an embodiment, determining the haptic effect may comprise assigning a haptic effect to each color. In some embodiments, the haptic effect may comprise a direction and intensity of operation; for example, in one embodiment the haptic signal may be configured to cause a rotary actuator to rotate clockwise at one-half power. In some embodiments, the intensity of operation may be associated with the intensity of the color. Once processor 110 determines a haptic effect, it transmits a haptic signal comprising the haptic effect. In some embodiments, processor 110 may assign a haptic effect to only some of the pixels in the display signal. For example, in such an embodiment, the haptic effect may be associated with only a portion of the display signal.
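The clockwise-at-one-half-power example could be sketched roughly as below; the color-to-direction table and the helper name are assumptions for illustration only.

```python
# Hypothetical per-color haptic effects for a rotary actuator: the color picks
# a direction of rotation and the 1-8 color intensity sets the drive power.
COLOR_TO_DIRECTION = {"red": "clockwise", "green": "counterclockwise",
                      "blue": "clockwise"}  # illustrative mapping

def effect_for_pixel(color, color_intensity):
    """color_intensity: 1-8, as in the example above; returns (direction, power)."""
    direction = COLOR_TO_DIRECTION[color]
    power = color_intensity / 8.0  # e.g., intensity 4 -> one-half power
    return direction, power

print(effect_for_pixel("red", 4))  # ('clockwise', 0.5)
```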
In some embodiments, processor 110 may utilize a haptic map to determine the haptic effect and then output the display signal to display 116. In other embodiments, processor 110 may determine the haptic effect using a haptic map, and then not transmit the display signal to display 116. In such an embodiment, the display 116 may stay dark, or off, while actuator 118 is outputting the haptic effect. For example, in such an embodiment, processor 110 may receive a display signal from a digital camera associated with messaging device 102. In some embodiments, in order to conserve battery power, the user may have deactivated display 116. In such an embodiment, the processor may utilize a haptic map to provide the user with a haptic effect simulating a texture on the surface of the display. This texture may be used to alert the user when the camera is in focus, or when some other event has occurred. For example, processor 110 may use facial recognition software to determine haptic effects simulating textures at locations on display 116 that would be associated with faces if display 116 were activated.
In some embodiments, processor 110 may determine the haptic effect based at least in part on a user interaction or trigger. In such an embodiment, processor 110 receives an interface signal from touch-sensitive interface 114, and determines the haptic effect based at least in part on the interface signal. For example, in some embodiments, processor 110 may determine the haptic effects based on the location of the user interaction detected by touch-sensitive interface 114. For example, in one embodiment, processor 110 may determine a haptic effect that simulates the texture of a virtual object that the user is touching on display 116. In other embodiments, processor 110 may determine the intensity of the haptic effect based at least in part on the interface signal. For example, if touch-sensitive interface 114 detects a high pressure user interaction, processor 110 may determine a high intensity haptic effect. In another embodiment, if touch-sensitive interface 114 detects a low pressure user interaction, processor 110 may determine a low intensity haptic effect. In still other embodiments, processor 110 may determine the intensity of the haptic effect based at least in part on the speed of the user interaction. For example, in one embodiment, processor 110 may determine a low intensity haptic effect when touch-sensitive interface 114 detects low speed user interaction. In still other embodiments, processor 110 may determine no haptic effect, unless it receives an interface signal associated with user interaction from touch-sensitive interface 114.
Once processor 110 determines the haptic effect, it transmits a haptic signal associated with the haptic effect to actuator 118. Actuator 118 is configured to receive a haptic signal from processor 110 and generate the haptic effect. Actuator 118 may be, for example, a piezoelectric actuator, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (ERM), or a linear resonant actuator (LRA). In some embodiments, actuator 118 may comprise a plurality of actuators, for example an ERM and an LRA.
In some embodiments of the present invention, the haptic effect generated by actuator 118 is configured to simulate a texture that the user feels on the surface of touch-sensitive interface 114 or display 116. This texture may be associated with the graphical user interface shown on display 116. For example, display 116 may show an icon comprising the shape of a rock. In such an embodiment, processor 110 may determine a haptic effect configured to simulate the texture of a rock on the surface of touch-sensitive interface 114. Then, processor 110 will transmit a haptic signal associated with the haptic effect to actuator 118, which outputs the haptic effect. For example, when actuator 118 receives the haptic signal, it may output a vibration at a frequency configured to cause the surface of the touch-sensitive interface to comprise the texture of a rock. In other embodiments, actuator 118 may be configured to output a vibration at a frequency that causes the surface of display 116 or touch-sensitive interface 114 to comprise the texture of: water, ice, leather, sand, gravel, snow, skin, fur, or some other surface. In some embodiments, the haptic effect may be output onto a different portion of messaging device 102, for example onto its housing. In some embodiments, actuator 118 may output a multitude of vibrations configured to simulate multiple textures at the same time. For example, actuator 118 may output a vibration configured to cause the surface of display 116 to comprise the texture of sand, and actuator 118 may also output additional vibrations configured to cause the user to feel the texture of rocks in the sand.
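One way to think about layering textures is as a superposition of vibration components, as in the rough sketch below; the frequencies and amplitudes are placeholders rather than measured texture parameters.

```python
import math

def combined_texture_sample(t, components):
    """Sum sinusoidal vibration components at time t (seconds).

    components: list of (frequency_hz, amplitude) pairs; the values used below
    are illustrative only.
    """
    return sum(a * math.sin(2 * math.pi * f * t) for f, a in components)

# A hypothetical fine "sand" component layered with a stronger, lower-frequency
# "rocks in the sand" component.
sand_and_rocks = [(80.0, 0.3), (25.0, 0.7)]
samples = [combined_texture_sample(n / 1000.0, sand_and_rocks) for n in range(5)]
print(samples)
```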
In some embodiments, not shown in
Processor 110 may determine a haptic effect for many reasons. For example, in some embodiments, processor 110 may output a haptic effect that corresponds to the texture of an object shown on display 116. In such an embodiment, the display may show multiple objects, and the processor may determine a different haptic effect as the user moves his/her finger from object to object, thus simulating a different texture for each object. In some embodiments, the haptic effect may act as a confirmation that processor 110 has received a signal associated with user interaction. For example, in one embodiment, the graphical user interface may comprise a button and touch-sensitive interface 114 may detect user interaction associated with pressing the button. When touch-sensitive interface 114 transmits an interface signal associated with the user interaction to processor 110, processor 110 may determine a haptic effect to confirm receipt of the interface signal. In such an embodiment, the haptic effect may cause the user to feel a texture on the surface of touch-sensitive interface 114. For example, the processor may output a haptic effect that simulates the texture of sand to confirm that processor 110 has received the user input. In other embodiments, the processor may determine a different texture, for example, the texture of water, ice, oil, rocks, or skin. In some embodiments, the haptic effect may serve a different purpose, for example, alerting the user of boundaries on display 116, or providing the user with haptic information about the image on display 116. For example, in some embodiments, each icon on display 116 may comprise a different texture and when the user moves their finger from one icon to another, the processor will determine a haptic effect that simulates the texture of each icon. In further embodiments, the processor may change the texture when the user's finger moves from contact with an icon to contact with the background of the display, thus alerting the user that he/she is no longer touching the icon.
As shown in
In some embodiments, not shown in
The sensor signals may comprise one or more parameters associated with a position, a movement, an acceleration, or a “jerk” (i.e. the derivative of acceleration) of the messaging device 102. For example, in one embodiment, the sensor may generate and transmit a sensor signal comprising a plurality of parameters, each parameter associated with a movement along or about one measured translational or rotational axis. In some embodiments, the sensor outputs voltages or currents that processor 110 is programmed to interpret to indicate movement along one or more axes.
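Since jerk is simply the time derivative of acceleration, it can be approximated from successive accelerometer samples by finite differences, as in this small sketch (the sample values and sampling interval are illustrative).

```python
def jerk(accel_samples, dt):
    """Approximate jerk (the derivative of acceleration) from samples spaced
    dt seconds apart, using a first-order finite difference."""
    return [(a1 - a0) / dt for a0, a1 in zip(accel_samples, accel_samples[1:])]

# Acceleration along one axis sampled every 10 ms (illustrative values).
print(jerk([0.0, 0.2, 0.5, 0.5, 0.1], dt=0.01))
```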
In some embodiments, processor 110 will receive the sensor signal and determine that it should activate the virtual workspace and interpret sensed movement of the messaging device 102 in an X, Y, or Z direction as corresponding to a virtual movement “within” the virtual workspace. The user may then move device 102 within the virtual workspace to select functions or files by gesturing within the virtual space, for example, by moving the messaging device 102 in the Z-axis overtop of a function within the virtual workspace. In some embodiments, the user may use gestures within the virtual workspace to modify the haptic effects output by messaging device 102.
As shown in
Referring still to
Manipulandum 214 and touch-sensitive interface 218 are configured to detect user interaction and transmit interface signals corresponding to the user interaction to the processor. In some embodiments, the user interaction is associated with a graphical user interface shown on display 216. In such an embodiment, the processor receives the interface signal and, based at least in part on the interface signal, modifies the graphical user interface on display 216. For example, in the embodiment shown in
Messaging device 200 further comprises an actuator configured to receive a haptic signal and output a haptic effect (not shown in
In the embodiment shown in
As shown in
As shown in
In the embodiment shown in
For example, in the embodiment shown in
These haptic effects are configured to cause the user to feel a texture on the surface of display 350 as the user moves his/her finger over its surface. In some embodiments, the messaging device may comprise multiple actuators. In such an embodiment, the processor may be configured to determine which actuator to output the haptic signal to. For example, in one embodiment a messaging device may comprise two actuators configured to output different intensity haptic effects. In such an embodiment, the processor may determine that all haptic effects with an intensity of less than 3 should be output by the first actuator, and all haptic effects with an intensity of greater than or equal to 3 should be output by a second actuator. In other embodiments, each color may be mapped to a specific actuator. For example, in such an embodiment all haptic effects associated with the color blue may be output by a first actuator, all haptic effects associated with the color red may be output by a second actuator, and all haptic effects associated with the color green may be output by a third actuator. In other embodiments, the messaging device may implement different combinations of colors, intensities, haptic effects, and actuators to simulate various textures on the surface of the display.
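The two routing policies mentioned above, by intensity threshold and by color, could be sketched as follows; the actuator identifiers are placeholders.

```python
def actuator_by_intensity(intensity, threshold=3):
    """Route effects below the threshold to a first actuator, the rest to a second."""
    return "actuator_1" if intensity < threshold else "actuator_2"

# Hypothetical color-to-actuator assignment.
COLOR_TO_ACTUATOR = {"blue": "actuator_1", "red": "actuator_2",
                     "green": "actuator_3"}

def actuator_by_color(color):
    return COLOR_TO_ACTUATOR[color]

print(actuator_by_intensity(2))    # actuator_1
print(actuator_by_intensity(5))    # actuator_2
print(actuator_by_color("green"))  # actuator_3
```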
Then, touch-sensitive interface 114 transmits an interface signal to processor 110, which receives the interface signal 404. In some embodiments, touch-sensitive interface 114 may comprise a touch-screen or a touch-pad. For example, in some embodiments, touch-sensitive interface 114 may comprise a touch-screen mounted overtop of a display configured to receive a display signal and output an image to the user. In other embodiments, touch-sensitive interface 114 may comprise a button, switch, mouse, scroll wheel, roller ball, or some other type of physical device interface known in the art. In some embodiments, processor 110 is in communication with a single touch-sensitive interface 114. In other embodiments, processor 110 is in communication with a plurality of touch-sensitive interfaces 114, for example, a touch-screen and a roller ball. Touch-sensitive interface 114 is configured to detect user interaction, and based on the user interaction, transmit signals to processor 110. In some embodiments, touch-sensitive interface 114 may be configured to detect multiple aspects of the user interaction. For example, touch-sensitive interface 114 may detect the speed and pressure of a user interaction and incorporate this information into the interface signal. In some embodiments, touch-sensitive interface 114 is capable of detecting multi-touch.
Next, processor 110 determines a first texture associated with a first group of pixels defining a first section of the display area 406. The section of the display area defined by the first group of pixels may define an object in a graphical user interface, for example text, figures, or an icon. In some embodiments, the processor 110 will determine a texture associated with the characteristics of the object. For example, if the object comprises text, processor 110 may determine a coarse texture for tall letters such as “l” and a softer texture for shorter letters such as “o.” In another embodiment, processor 110 may determine a texture based on the contents of a file associated with the icon. For example, processor 110 may determine a coarse texture for a file that contains more than a user-defined amount of data, and a softer texture for a file that contains less than that amount. The first texture may comprise one of many textures known in the art, for example, the texture of steel, ice, fur, skin, leather, sand, sandpaper, rocks, snow, water, or oil. Or, in some embodiments, processor 110 may determine that the first texture comprises no texture.
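A hypothetical sketch of these character- and file-based texture choices is given below; the set of “tall” letters and the size threshold are assumptions made for illustration.

```python
TALL_LETTERS = set("bdfhklt")  # illustrative classification of tall letters

def texture_for_letter(letter):
    """Coarse texture for tall letters, softer texture for short ones."""
    return "coarse" if letter.lower() in TALL_LETTERS else "soft"

def texture_for_file(size_bytes, user_defined_limit=1_000_000):
    """Coarse texture for files above a user-defined size, softer otherwise."""
    return "coarse" if size_bytes > user_defined_limit else "soft"

print(texture_for_letter("l"), texture_for_letter("o"))  # coarse soft
print(texture_for_file(5_000_000))                       # coarse
```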
Then, processor 110 determines a second texture associated with a second group of pixels defining a second section of the display area 408. In some embodiments, the second section of the display area may comprise all of the display area not occupied by the first section of the display area. In other embodiments, the second section of the display area may comprise a specific object in the graphical user interface, for example, text, figures, or an icon. In yet other embodiments, the second section may comprise some subset of the area not occupied by the first section. In some embodiments, the processor 110 will determine that the second texture is associated with the characteristics of the object. The second texture may comprise one of many textures known in the art, for example, the texture of steel, ice, fur, skin, leather, sand, sandpaper, rocks, snow, water, or oil. In some embodiments, processor 110 may determine that the second texture is similar or identical to the first texture. Or, in some embodiments, processor 110 may determine that the second texture comprises no texture.
In some embodiments, processor 110 may implement a haptic map to determine the first and second haptic effects. In such an embodiment, processor 110 may map the display signal to one or more actuators. In some embodiments, mapping the display signal to an actuator comprises determining haptic effects at various locations on the display, based at least in part on the display signal. For example, the display signal may comprise a plurality of pixels, each of the pixels associated with a color. In such an embodiment, processor 110 may determine the haptic effect by assigning a haptic value to each color in the display signal. Then processor 110 will determine a haptic effect based, at least in part, on the haptic values. In some embodiments, processor 110 may assign a haptic value to only some of the pixels in the display signal. For example, in such an embodiment, the haptic effect may be associated with only a portion of the display signal.
In some embodiments, processor 110 may determine the first haptic effect and the second haptic effect based, at least in part on, a user interaction or trigger. In such an embodiment, processor 110 receives an interface signal from touch-sensitive interface 114, and determines the haptic effect based at least in part on the interface signal. For example, in some embodiments, processor 110 may determine a different intensity haptic effect based on the interface signal received from touch-sensitive interface 114. For example, if touch-sensitive interface 114 detects a high pressure user interaction, processor 110 may determine a high-intensity haptic effect. In another embodiment, if touch-sensitive interface 114 detects a low pressure user interaction, processor 110 may determine a low-intensity haptic effect.
Next, processor 110 transmits a haptic signal to an actuator 118 configured to receive the haptic signal and output a haptic effect 410. Actuator 118 may be, for example, a piezoelectric actuator, an electric motor, an electro-magnetic actuator, a voice coil, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (ERM), or a linear resonant actuator (LRA). The haptic effect may comprise one of several haptic effects known in the art, for example, vibrations, knocking, buzzing, jolting, or torquing the messaging device. In some embodiments, the haptic signal is configured to cause actuator 118 to output a haptic effect that simulates a texture. In some embodiments, if processor 110 determines that the user interaction is associated with the first section of the display area, the texture will comprise the first texture. In other embodiments, if processor 110 determines that the user interaction is associated with the second section of the display area, the texture will comprise the second texture. In some embodiments, processor 110 may determine the location of the user interaction based at least in part on the interface signal received from touch-sensitive interface 114. In other embodiments, processor 110 may determine the location of the user interaction based on another factor, for example a sensor signal received from a sensor or manipulandum such as a mouse, scroll wheel, or roller ball.
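The section-based dispatch described in this step might look roughly like the following, assuming rectangular sections in display coordinates; the geometry helper and the example textures are illustrative.

```python
def section_contains(section, x, y):
    """section: (left, top, right, bottom) rectangle in display coordinates."""
    left, top, right, bottom = section
    return left <= x < right and top <= y < bottom

def texture_for_touch(x, y, first_section, second_section,
                      first_texture, second_texture):
    """Return the texture to simulate for a touch at (x, y), if any."""
    if section_contains(first_section, x, y):
        return first_texture
    if section_contains(second_section, x, y):
        return second_texture
    return None  # no texture outside either section

# Example: an icon in the upper-left quadrant of a 480x800 display.
print(texture_for_touch(100, 100, (0, 0, 240, 400), (240, 400, 480, 800),
                        "rock", "sand"))  # rock
```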
Finally, display 116 receives the display signal and outputs an image based at least in part on the display signal. In some embodiments, display 116 comprises a flat-screen display, such as a Liquid Crystal Display (LCD) or Plasma Screen Display. In other embodiments display 116 comprises a Cathode Ray Tube (CRT) or other type of display known in the art. In still other embodiments, display 116 may comprise touch-sensitive interface 114, for example, display 116 may comprise a touch-screen LCD. In some embodiments, processor 110 is configured to generate a graphical representation of a user interface to be shown on display 116, then transmit a display signal comprising the graphical representation to display 116. In other embodiments, display 116 is configured to receive a display signal from another device. For example, in some embodiments, display 116 may comprise an external display such as a computer monitor.
As shown in
Messaging device 502 further comprises an actuator (not shown in
In some embodiments, the texture may be associated with the text within selection box 506. For example, in some embodiments, the actuator may output a coarse texture when the user interacts with tall letters such as “l” and a soft texture when the user interacts with short letters such as “a.” In another embodiment, the actuator may output a coarse texture when the user interacts with uppercase letters and a softer texture when the user interacts with lowercase letters. In still other embodiments, the texture may be associated with other factors, for example, font, font size, length of the text, or length of individual words.
In some embodiments, messaging device 502 may comprise more than one actuator. In such an embodiment, these actuators may be used in combination to generate haptic effects. For example, when the haptic effects of each actuator are combined, they may form a single haptic effect that simulates a texture. In other embodiments, messaging device 502 may use the actuators separately. For example, a first actuator may output a first haptic effect when the user interacts with the section of display 516 associated with selection box 506, and a second actuator may output a second haptic effect when the user interacts with the remainder of display 516.
As shown in
Messaging device 602 further comprises an actuator (not shown in
As shown in
Messaging device 702 further comprises an actuator (not shown in
In some embodiments, messaging device 702 may be configured to output more complex haptic effects configured to simulate unique textures associated with each key on numeric keypad 704. For example, in some embodiments, messaging device 702 may output haptic effects configured to simulate edges for each key on numeric keypad 704. In some embodiments, these haptic effects may comprise vibrations that simulate four edges on each key. In some embodiments, the haptic effects may be further configured to simulate a depth or surface feature for each key. In one embodiment, the haptic effect may be configured to simulate keys that are not perfectly flat, for example keys that are slightly concave. This functionality may enable the user to distinguish one key from another, and may further allow the user to distinguish the center of a key from the edge of a key. In some embodiments, similar functionality may be applied to simulate textures on a larger keyboard, for example, a full QWERTY keyboard. In some embodiments, messaging device 702 may comprise more than one actuator, as described herein in relation to system 500.
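A rough sketch of distinguishing a key's edges from its center, so that an edge or slightly “concave” texture can be applied, is shown below; the key geometry and edge width are assumptions.

```python
def key_region(x, y, key_rect, edge_width=4):
    """key_rect: (left, top, right, bottom). Returns 'edge', 'center', or None."""
    left, top, right, bottom = key_rect
    if not (left <= x < right and top <= y < bottom):
        return None  # touch is outside this key
    near_edge = (x - left < edge_width or right - x <= edge_width or
                 y - top < edge_width or bottom - y <= edge_width)
    return "edge" if near_edge else "center"

# A hypothetical key laid out in display coordinates.
print(key_region(42, 30, (40, 20, 80, 60)))  # edge
print(key_region(60, 40, (40, 20, 80, 60)))  # center
```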
As shown in
Messaging device 802 further comprises an actuator (not shown in
In some embodiments, messaging device 802 may determine the texture based on the files associated with the folder. For example, in some embodiments, folder 808 may comprise audio files. In such an embodiment, messaging device 802 may determine the texture based on the type of audio files, for example a coarse texture if the files are hard rock, and a soft texture if the files are classical. In another example, messaging device 802 may determine the texture based on the properties of the files in the folder. For example, folder 806 may comprise protected or read-only files, while folder 808 may comprise modifiable audio files. In such an embodiment, messaging device 802 may determine a coarse texture when the user interacts with folder 806, and a gentle or soft texture when the user interacts with folder 808. In other embodiments, different factors associated with the folders may be used to determine the haptic effect, for example, folder size, contents of the folder, age of the folder, title of the folder, creator of the files or folder, or some other factor known in the art. In some embodiments, messaging device 802 may comprise more than one actuator, as described herein in relation to system 500.
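A minimal sketch of choosing a folder texture from the properties of the files it contains is given below; the file representation and the coarse/soft choice are illustrative assumptions.

```python
def folder_texture(files):
    """files: list of dicts with a boolean 'read_only' flag (illustrative)."""
    if any(f["read_only"] for f in files):
        return "coarse"  # protected or read-only contents
    return "soft"        # freely modifiable contents

protected_folder = [{"name": "report.pdf", "read_only": True}]
audio_folder = [{"name": "track1.mp3", "read_only": False}]
print(folder_texture(protected_folder))  # coarse
print(folder_texture(audio_folder))      # soft
```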
As shown in
Messaging device 902 further comprises an actuator (not shown in
As shown in
Messaging device 1002 further comprises an actuator (not shown in
As shown in
Messaging device 1102 further comprises an actuator (not shown in
As shown in
Messaging device 1202 further comprises an actuator (not shown in
In still other embodiments, messaging device 1202 may automatically assign textures to other buildings along the user's route. For example, in some embodiments, the messaging device may automatically assign a texture to certain types of buildings, for example all gas stations, restaurants, or hospitals. In one embodiment, building 1208 may comprise a hospital, building 1210 may comprise a mall, and building 1212 may comprise a gas station. In such an embodiment, the user may search for a gas station. As a part of this search, the user may enter a search menu that allows the user to assign a texture to all gas stations along his/her route. Then, the user may run his/her finger over the surface of display 1216 to find a gas station. When the user touches display 1216, he/she will feel a texture on the section of display 1216 associated with building 1212 and know that it is a gas station. In other embodiments, different sections of the interface may be associated with a texture. For example, in one embodiment, one or more of the turns along route 1204 may be associated with a texture. In another embodiment, one or more waypoints along route 1204 may be associated with a texture.
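A hypothetical sketch of assigning one texture to every building of a searched-for category along a route appears below; the data layout, building names, and chosen texture are placeholders.

```python
def assign_category_texture(buildings, category, texture):
    """Return a mapping from building name to texture for the given category.

    buildings: list of dicts with 'name' and 'category' keys (illustrative).
    """
    return {b["name"]: texture for b in buildings if b["category"] == category}

route_buildings = [
    {"name": "building_1208", "category": "hospital"},
    {"name": "building_1210", "category": "mall"},
    {"name": "building_1212", "category": "gas station"},
]
print(assign_category_texture(route_buildings, "gas station", "gravel"))
```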
In some embodiments, messaging device 1202 may comprise more than one actuator, as described herein in relation to system 500.
As shown in
Messaging device 1302 further comprises an actuator (not shown in
In some embodiments, messaging device 1302 may comprise more than one actuator, as described herein in relation to system 500.
There are many advantages of systems and methods for using textures in graphical user interface widgets. For example, systems and methods for using textures in graphical user interface widgets add a previously unused haptic effect to a mobile device. This new effect provides a new avenue for the user to receive information from the device, without the user having to look at the device's display. For example, systems and methods for using textures in graphical user interface widgets may allow the user to assign different textures to different icons, buttons, or other components of their display. Thus, users may be able to determine which icon they are touching without having to look at that icon. This may increase the usability of the device, and may make a device more useful to the visually impaired. It may also increase the adoption of different types of applications that had not previously been utilized by users who often use mobile devices in distracting situations, such as while walking or driving.
Further, systems and methods for using textures in graphical user interface widgets may provide the user with more information, without distracting the user from other tasks. Therefore, it may reduce the likelihood of user error. For example, users will be less likely to hit the wrong icon or press the wrong key if they are utilizing systems and methods for using textures in graphical user interface widgets. This functionality may serve both to increase user satisfaction and increase the adoption rate for technology that incorporates systems and methods for using textures in graphical user interface widgets.
The use of “adapted to” or “configured to” herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. Additionally, the use of “based on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting.
Embodiments in accordance with aspects of the present subject matter can be implemented in digital electronic circuitry, in computer hardware, firmware, software, or in combinations of the preceding. In one embodiment, a computer may comprise a processor or processors. The processor comprises or has access to a computer-readable medium, such as a random access memory (RAM) coupled to the processor. The processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs including a sensor sampling routine, a haptic effect selection routine, and suitable programming to produce signals to generate the selected haptic effects as noted above.
Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
Such processors may comprise, or may be in communication with, media, for example tangible computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, all electronic, optical, magnetic, or other storage devices capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. Also, various other devices may include computer-readable media, such as a router, private or public network, or other transmission device. The processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
While the present subject matter has been described in detail with respect to specific embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation, and does not preclude inclusion of such modifications, variations, and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.
This patent application is a continuation of and claims priority to application Ser. No. 12/697,037, entitled “Systems and Methods for Using Textures in Graphical User Interface Widgets,” filed on Jan. 29, 2010, which claims priority to U.S. Provisional Patent Application No. 61/159,482, entitled “Locating Features Using a Friction Display,” filed Mar. 12, 2009; and also claims priority to U.S. Provisional Patent Application No. 61/262,041, entitled “System and Method for Increasing Haptic Bandwidth in an Electronic Device,” filed Nov. 17, 2009; and also claims priority to U.S. Provisional Patent Application No. 61/262,038, entitled “Friction Rotary Device for Haptic Feedback,” filed Nov. 17, 2009, the entirety of each of which is hereby incorporated by reference herein. Application Ser. No. 12/697,037 is related to U.S. patent application Ser. No. 12/697,010, filed the same day and entitled “Systems and Methods for a Texture Engine,” which is incorporated by reference herein in its entirety. Application Ser. No. 12/697,037 is related to U.S. patent application Ser. No. 12/697,042, filed the same day and entitled “Systems and Methods for Using Multiple Actuators to Realize Textures,” which is incorporated by reference herein in its entirety. Application Ser. No. 12/697,037 is related to U.S. patent application Ser. No. 12/696,893, filed the same day and entitled “Systems and Methods for Providing Features in a Friction Display,” which is incorporated by reference herein in its entirety. Application Ser. No. 12/697,037 is related to U.S. patent application Ser. No. 12/696,900, filed the same day and entitled “Systems and Methods for Friction Displays and Additional Haptic Effects,” which is incorporated by reference herein in its entirety. Application Ser. No. 12/697,037 is related to U.S. patent application Ser. No. 12/696,908, filed the same day and entitled “Systems and Methods for Interfaces Featuring Surface-Based Haptic Effects,” which is incorporated by reference herein in its entirety.
Number | Date | Country
---|---|---
20180173312 A1 | Jun 2018 | US
20190187795 A9 | Jun 2019 | US
Number | Date | Country
---|---|---
61159482 | Mar 2009 | US
61262041 | Nov 2009 | US
61262038 | Nov 2009 | US
 | Number | Date | Country
---|---|---|---
Parent | 12697037 | Jan 2010 | US
Child | 15894966 | | US