The present disclosure relates generally to systems and methods for multi-pressure interaction on touch-sensitive surfaces.
With the increase in popularity of handheld devices, especially mobile phones having touch-sensitive surfaces (e.g., touch screens), the physical tactile sensations traditionally provided by mechanical buttons are no longer available on these new generations of devices. Instead, haptic effects may be output by handheld devices to alert the user to various events. Such haptic effects may include vibrations to indicate a button press, an incoming call, or a text message, or to indicate error conditions.
Embodiments of the present invention provide systems and methods for multi-pressure interaction on touch-sensitive surfaces. For example, in one embodiment of a method disclosed herein, the method comprises receiving a first sensor signal from a touch-sensitive input device in response to a first contact of a first object on the touch-sensitive input device, the first sensor signal comprising a first location and a first pressure of the first contact, receiving a second sensor signal from the touch-sensitive input device in response to a second contact of a second object on the touch-sensitive input device substantially simultaneously with the first contact, the second sensor signal comprising a second location of the second contact and a second pressure of the second contact, generating a signal based at least in part on the first sensor signal and the second sensor signal, the signal configured to cause a haptic effect, and outputting the signal. In another embodiment, a computer-readable medium comprises program code for causing a processor to execute such a method.
These illustrative embodiments are mentioned not to limit or define the invention, but rather to provide examples to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, which provides further description of the invention. Advantages offered by various embodiments of this invention may be further understood by examining this specification.
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate one or more examples of embodiments and, together with the description of example embodiments, serve to explain the principles and implementations of the embodiments.
Example embodiments are described herein in the context of systems and methods for multi-pressure interaction on touch-sensitive surfaces. Those of ordinary skill in the art will realize that the following description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will now be made in detail to implementations of example embodiments as illustrated in the accompanying drawings. The same reference indicators will be used throughout the drawings and the following description to refer to the same or like items.
In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, such as compliance with application- and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another.
Referring to
In addition to scrolling the web page based on the user-applied multi-pressure inputs, the device 100 also outputs haptic effects to indicate the action taken in response to the input. For example, while the user scrolls down the web page, the device 100 may output a haptic effect that seems to travel from the top of the device 100 to the bottom of the device 100, cycling repeatedly while the user continues to scroll. Or, if the user is scrolling up the web page, the haptic effect starts at the bottom of the device 100 and seems to travel toward the top of the device 100, again cycling repeatedly while the user continues to scroll. Thus, a user is able to provide multi-touch, multi-pressure input to interact with a device 100 and receive haptic feedback based on the input.
This illustrative example is given to introduce the reader to the general subject matter discussed herein. The invention is not limited to this example. The following sections describe various additional non-limiting embodiments and examples of devices, systems, and methods for multi-pressure interaction on touch-sensitive surfaces.
Referring now to
In the embodiment shown in
The multi-pressure touch-sensitive input device 200 can be any device that comprises or is in communication with a touch-sensitive surface that is capable of detecting pressures associated with at least two contacts on the touch-sensitive surface. For example, the device 200 of
In some embodiments, one or more touch-sensitive surfaces may be on one or more sides of the device 200. For example, in one embodiment, a touch-sensitive surface is disposed within or comprises a rear surface of the device 200. In another embodiment, a first touch-sensitive surface is disposed within or comprises a rear surface of the device 200 and a second touch-sensitive surface is disposed within or comprises a side surface of the device 200. Furthermore, in embodiments where the device 200 comprises at least one touch-sensitive surface on one or more sides of the device 200 or in embodiments where the device 200 is in communication with an external touch-sensitive surface, the display 230 may or may not comprise a touch-sensitive surface. In some embodiments, one or more of the touch-sensitive surfaces may be flexible. In other embodiments, one or more of the touch-sensitive surfaces may be rigid. In various embodiments, the device 200 may comprise both flexible and rigid touch-sensitive surfaces.
In various embodiments, the device 200 may comprise or be in communication with fewer or additional components than the embodiment shown in
The housing 205 of the device 200 shown in
In the embodiment shown in
In the embodiment shown in
An actuator, such as actuators 240 or 260, can be any component or collection of components that is capable of outputting one or more haptic effects. For example, an actuator can be one of various types including, but not limited to, an eccentric rotational mass (ERM) actuator, a linear resonant actuator (LRA), a piezoelectric actuator, a voice coil actuator, an electro-active polymer (EAP) actuator, a shape memory alloy, a pager, a DC motor, an AC motor, a moving magnet actuator, an E-core actuator, a smartgel, an electrostatic actuator, an electrotactile actuator, or any other actuator or collection of components that performs the functions of an actuator. Multiple actuators or different-sized actuators may be used to provide a range of vibrational frequencies, which may be actuated individually or simultaneously. Various embodiments may include a single actuator or multiple actuators and may have the same type or a combination of different types of actuators.
In various embodiments, one or more haptic effects may be produced in any number of ways or in a combination of ways. For example, in one embodiment, one or more vibrations may be used to produce a haptic effect, such as by rotating an eccentric mass or by linearly oscillating a mass. In some such embodiments, the haptic effect may be configured to impart a vibration to the entire device or to only one surface or a limited part of the device. In another embodiment, friction between two or more components or friction between at least one component and at least one contact may be used to produce a haptic effect, such as by applying a brake to a moving component to provide resistance to its movement or to provide a torque. In other embodiments, deformation of one or more components can be used to produce a haptic effect. For example, in an embodiment, one or more haptic effects may be output to change the shape of a surface or a coefficient of friction of a surface. In other embodiments, an array of transparent deforming elements may be used to produce a haptic effect, such as one or more areas comprising a smartgel.
In
Referring now to
The method 300 begins in block 310 when a sensor signal is received. For example, in one embodiment, the processor 210 receives a signal from the touch-sensitive display 230 when a user contacts the touch-sensitive display 230 and the signal includes information associated with an input on—or a status of—the touch-sensitive display 230 such as the x, y location and pressure of a contact on the touch-sensitive display 230. In other embodiments, the processor 210 receives a plurality of sensor signals. For example, the processor 210 may receive a first signal including information associated with a first input on the touch-sensitive display 230, a second signal including information associated with a second input on the touch-sensitive display 230, and a third signal including information associated with a third input on the touch-sensitive display 230. In one embodiment, the processor 210 receives a first signal including information containing the x, y location of a contact on the touch-sensitive display 230 and a second signal including information containing the pressure of the contact. In another embodiment, the processor 210 receives a first signal including information containing the x, y locations of two contacts on the touch-sensitive display 230 and a second signal including information containing the pressures of the two contacts. The processor 210 may also receive a single signal that includes information associated with two or more inputs on the touch-sensitive display 230. For example, in one embodiment, the processor 210 receives a single signal that includes the x, y location and pressure of a first contact and the x, y location and pressure of a second contact.
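By way of illustration, the sketch below models a sensor signal that carries per-contact location and pressure data. It is a minimal sketch only: the Contact and SensorSignal names, the coordinate units, and the normalized pressure range are assumptions, not part of this disclosure.

```python
# Hypothetical model of a sensor signal carrying one or more contacts,
# each with an x, y location and a pressure value.
from dataclasses import dataclass
from typing import List

@dataclass
class Contact:
    x: float         # x location on the touch-sensitive display
    y: float         # y location on the touch-sensitive display
    pressure: float  # contact pressure, assumed normalized to 0.0-1.0

@dataclass
class SensorSignal:
    contacts: List[Contact]  # one entry per contact reported in this signal

# A single signal may describe two substantially simultaneous contacts.
signal = SensorSignal(contacts=[Contact(120, 40, 0.35), Contact(130, 310, 0.60)])
```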
As discussed above, in one embodiment, the processor 210 receives a signal from the touch-sensitive display 230. In some embodiments, the device 200 may comprise a touch-sensitive surface separate from, or instead of, a touch-sensitive display 230. In such an embodiment, the processor 210 may receive sensor signal(s) from the touch-sensitive surface, or if a plurality of touch-sensitive surfaces are employed, from one or more of the plurality of touch-sensitive surfaces.
In some embodiments, the processor 210 may receive one or more sensor signals from the network interface 250. For example, in one embodiment, the network interface 250 is in communication with and receives information from one or more components or devices, or both. In this embodiment, the network interface 250 sends one or more signals to the processor 210 that contain information from the other components or devices, or both. For example, the network interface 250 may receive a signal from another multi-pressure touch-sensitive input device and the signal may contain information regarding an input on a touch-sensitive display of the other device. The network interface 250 may send information regarding the input on the display of the other device to the processor 210. In another embodiment, the network interface 250 receives a signal from a wireless touch-sensitive surface that is in communication with the network interface 250 and the network interface 250 sends one or more signals containing information about an input on or the status of the touch-sensitive surface to the processor 210.
In other embodiments, the network interface 250 may receive a plurality of sensor signals from one or more components or devices in communication with the network interface 250 and can send one or more signals to the processor 210. For example, in one embodiment, the network interface 250 is in communication with a wireless touch-sensitive surface and another multi-pressure touch-sensitive input device. In such an embodiment, the network interface 250 may receive one signal from the wireless touch-sensitive surface and another signal from the multi-pressure touch-sensitive input device. In addition, the network interface 250 may send one or more signals containing information from the wireless touch-sensitive surface or from the other multi-pressure touch-sensitive input device, or both, to the processor 210. Thus, the processor 210 may receive one or more signals from both the touch-sensitive display 230 and the network interface 250. For example, in one embodiment, the processor 210 receives a first signal from the touch-sensitive display 230 containing information about an input on the touch-sensitive display 230 and the processor 210 receives a second signal from the network interface 250 containing information about an input on the display of another multi-pressure touch-sensitive input device that is in communication with the network interface 250.
As discussed above, in one embodiment, the processor 210 receives a signal when a user contacts the touch-sensitive display 230. In such an embodiment, the processor 210 may receive a signal from the touch-sensitive display 230 only when an input is made on the display. Or the processor 210 may receive a signal from the touch-sensitive display 230 when an input is initially made on the touch-sensitive display 230 and when a change to an existing input is made. For example, the processor 210 may receive one or more signals when a user contacts the touch-sensitive display 230 and each time the user moves the contact along the touch-sensitive display 230. In other embodiments, the processor 210 may receive successive signals from the touch-sensitive display 230 for the entire duration of one or more contacts. In one embodiment, the processor 210 receives a signal from the touch-sensitive display 230 at specified time intervals. For example, the processor 210 may receive a signal from the touch-sensitive display 230 periodically, such as every 0.1 ms. In other embodiments, the processor 210 receives a signal containing status information from the touch-sensitive display 230 regardless of whether a contact is made on the touch-sensitive display 230. For example, in one embodiment, the processor 210 receives successive signals from the touch-sensitive display 230 at specified time intervals regardless of whether a contact is made on the touch-sensitive display 230, but if a contact exists on the touch-sensitive display 230 the signal may contain information regarding the contact, such as the location and pressure of the contact.
In the embodiment discussed above, the signal that the processor 210 receives includes information associated with an input on—or a status of—the touch-sensitive display 230 such as the x, y location and pressure of a contact on the touch-sensitive display 230. In various embodiments, a signal that is received by the processor 210 can provide information relating to one or more contacts on the device 200, information relating to a component of the device 200, or information related to other components or devices that the processor 210 can use to determine a contact. For example, in one embodiment, a signal contains information indicating that a contact has occurred. In another embodiment, the signal may contain the change in pressure of a contact from a previous measurement to the current measurement. Similarly, a signal may contain information regarding the change in location of a contact from a previous location. In various embodiments, a signal can contain data including, but not limited to, location data, contact data, interaction data, gesture data, duration data, pressure data, thermal data, waveform data, capacitive data, infrared data, photodetection data, optical data, or other data necessary or relevant to determining a contact.
Referring again to method 300, once a sensor signal has been received, the method 300 proceeds to block 320. In block 320, a contact is determined. As discussed above, in one embodiment, the processor 210 only receives a signal once a contact is made with the touch-sensitive display 230. Thus, in this embodiment, the display 230 receives a sensor signal, determines a contact, and sends a signal to the processor 210. The processor 210, on the other hand, does not have to determine a contact because the processor 210 only receives a signal from the display 230 once a contact has been determined. Thus, in some embodiments, the display 230 receives sensor signals as specified in block 310 and determines a contact as specified in block 320, and the processor 210 determines a response as specified in block 330.
In some embodiments, the processor 210 determines whether a contact has occurred as specified in block 320. For example, a display may receive sensor signals as specified in block 310 and the display may send information associated with the sensor signals to the processor 210, either directly if the display is in communication with the processor 210 or through the network interface 250, which the processor 210 receives and uses to determine whether a contact has occurred as specified in block 320. In one embodiment, the information that the processor 210 receives comprises an instruction specifying that a contact has occurred. In another embodiment, the information that the processor 210 receives is indicative of whether a contact has occurred. For example, if the processor 210 receives information containing an x coordinate, a y coordinate, and a pressure, the processor 210 may be able to use this information to determine that a contact has occurred. In another embodiment, the processor 210 receives pressure information at periodic intervals that the processor 210 uses to determine whether a contact has occurred based upon changes in the pressure information. In other embodiments, if the pressure information the processor 210 receives is less than a threshold pressure, the processor 210 may determine that a contact has not occurred, and if the pressure is greater than or equal to the threshold pressure, the processor 210 may determine that a contact has occurred.
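A minimal sketch of the threshold test described above follows; the threshold value and the normalized pressure units are assumptions for illustration only.

```python
# Hypothetical threshold-based contact determination from a pressure sample.
PRESSURE_THRESHOLD = 0.05  # assumed tuning value, normalized pressure units

def contact_occurred(pressure_sample: float) -> bool:
    # A sample at or above the threshold is treated as a contact;
    # anything below it is treated as no contact.
    return pressure_sample >= PRESSURE_THRESHOLD
```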
As discussed previously, a contact with the device 200 can be made in numerous ways. For example, a contact can be made with the touch-sensitive display 230 by one or more objects such as, for example, a single finger, multiple fingers, or a pencil. In one embodiment, a contact may include physically contacting the touch-sensitive display 230 and, in another embodiment, a contact may include hovering an object over the touch-sensitive display 230 without physically contacting the touch-sensitive display 230. Thus, in some embodiments the processor 210 can determine a contact based on a physical contact with the touch-sensitive display 230 and, in other embodiments, the processor 210 may determine a contact based on a near-contact with or object hovering over the touch-sensitive display 230.
The device 200 may use various technologies to determine whether a contact has occurred or to obtain information related to a contact. For example, temperatures on or near the touch-sensitive display 230 may be measured to determine whether a contact has occurred. Thus, a finger approaching the touch-sensitive display 230 may be detected and a contact determined based at least in part on the difference in the ambient temperature surrounding the device 200 and the temperature of the approaching finger. In one embodiment, the device 200 comprises one or more capacitive sensors that are used to detect a contact based on an object approaching the touch-sensitive display 230. The device 200 may comprise other components including, but not limited to, an infrared LED, a photodetector, an image sensor, an optical camera, or a combination thereof that may be used to determine, at least in part, whether a contact on the touch-sensitive display 230 has occurred or to obtain information related to a contact. Thus, the device 200 may use any suitable technology that allows the touch-sensitive display 230 to determine, or assists the processor 210 in determining, a contact on the touch-sensitive display 230.
In some embodiments, the device may receive information from the network interface 250 which the processor 210 uses to determine whether a contact has occurred as shown in block 320. For example, the processor 210 may receive information from the network interface 250 that is in communication with another device. In one embodiment, the other device may send the network interface 250 information when a display associated with the other device receives an input and the processor 210 may receive information from the network interface 250 related to the input on the other device. In some embodiments, the processor 210 may receive periodic information from the network interface 250 about another device that is in communication with the network interface. In one embodiment where the network interface 250 is in communication with a remote touch-sensitive surface, the network interface 250 receives information from the touch-sensitive surface and sends information to the processor 210 which the processor 210 uses to determine a contact. In still further embodiments, another component, such as a separate microprocessor or co-processor may be responsible for determining a contact and providing such information to the processor 210. In various embodiments, software stored on the memory 220 and executed by the processor 210 may also be used in determining whether a contact has occurred, such as by implementing various techniques discussed above.
Referring again to method 300, once a contact has been determined as specified in block 320, the method 300 proceeds to block 330. In block 330, a response is determined. As discussed above, in one embodiment, the processor 210 receives a signal from the touch-sensitive display 230 when a user contacts the touch-sensitive display 230 and the signal includes the x, y location and pressure of the contact on the touch-sensitive display 230. In this embodiment, if the user is viewing a web page displayed on the touch-sensitive display 230 of the device 200, if the processor 210 determines that the user is touching the touch-sensitive display 230 substantially simultaneously with two fingers so that two contacts are determined, and if the user is applying more pressure with the finger located nearer the bottom of the screen than with the other finger, then the processor 210 determines that the response should be to update the touch-sensitive display 230 to scroll down the web page and to output a haptic effect indicating that the page is scrolling down. Alternatively, in this embodiment, the processor 210 may determine that the response should be to update the touch-sensitive display 230 to scroll up the web page and to output a haptic effect indicating that the page is scrolling up, such as when the processor 210 detects two substantially simultaneous contacts on the touch-sensitive display 230 and the pressure of the contact located nearer the top of the screen is greater than the pressure of the contact located nearer the bottom of the screen.
In some embodiments, the rate or speed of scrolling is based at least in part on the pressures. For example, the scrolling rate may increase as the difference in pressures between two contacts increases. In one embodiment, one or more haptic effects are output corresponding to the rate of scrolling, such as by vibrating a device at frequencies or magnitudes that vary based on the rate of scrolling. Thus, in some embodiments, the processor 210 determines a response, if any, as specified in block 330. In other embodiments, the touch-sensitive display 230 determines a response, if any. In still further embodiments, another component, such as a separate microprocessor or co-processor in communication with the processor 210, the touch-sensitive display 230, or the network interface 250, may be responsible for determining a response and providing such information to the processor 210 or the network interface 250. In various embodiments, software stored on the memory 220 and executed by the processor 210 may also be used in determining a response.
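The following sketch illustrates the pressure-based scrolling described above: the contact nearer the bottom of the screen pressing harder scrolls down, the reverse scrolls up, and the rate grows with the pressure difference. The rate scale factor and the convention that a larger y coordinate is nearer the bottom of the screen are assumptions.

```python
# Hypothetical scroll determination from two contacts' y positions and pressures.
def scroll_response(y1, p1, y2, p2, rate_scale=100.0):
    # Identify the pressures of the lower and upper contacts (a larger y
    # is assumed to be nearer the bottom of the screen).
    p_lower, p_upper = (p1, p2) if y1 > y2 else (p2, p1)
    diff = p_lower - p_upper
    if diff == 0:
        return None  # equal pressures: no scrolling in this sketch
    direction = "down" if diff > 0 else "up"
    rate = abs(diff) * rate_scale  # rate increases with the pressure difference
    return direction, rate

print(scroll_response(y1=700, p1=0.8, y2=100, p2=0.3))  # -> ('down', 50.0)
```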
The processor 210, touch-sensitive display 230, or other component may use any or all of the information received in determining a contact when determining a response. Thus, in embodiments, components of the device 200, components in communication with the device 200, or components of another device in communication with the device 200 may use various data including, but not limited to, location data, contact data, interaction data, gesture data, duration data, pressure data, thermal data, waveform data, capacitive data, infrared data, photodetection data, optical data, or other data necessary or relevant to determining a response. For example, in one embodiment, pressure data for two contacts is used by the processor 210 to determine a response. In another embodiment, the touch-sensitive display 230 may compare the pressure of a contact against a threshold pressure to determine a response. In other embodiments, information regarding one or more contacts is sent by the device 200 through the network interface 250 to another device that determines a response, if any, and sends information regarding any response back to the device 200.
The processor 210, touch-sensitive display 230, or other component may use the information received in any number of ways to determine whether a response is needed and, if so, what the response should be. For example, in one embodiment, the processor 210 may determine that an image associated with the touch-sensitive display 230 should be moved. In another embodiment, the touch-sensitive display 230 may determine that the color of an object on the touch-sensitive display 230 should be changed. In other embodiments, the processor 210 may determine that one or more actuators need to output one or more haptic effects. Various additional responses are discussed below.
In some embodiments, duration data received by the processor 210 from the touch-sensitive display 230 or the network interface 250 may be used to determine a response, if any. For example, in one embodiment, the processor 210 may determine a particular response if the length of time that a contact has contacted the touch-sensitive display 230 exceeds a threshold duration. In other embodiments, a response may be determined if the duration of a contact is below a threshold duration. The processor 210 may also determine a response based upon the durations of two or more contacts with the touch-sensitive display 230. For example, in one embodiment, if the duration of a first contact exceeds the duration of a second contact, the processor 210 may determine a response. In other embodiments, a response may be determined if a second contact occurs within a predetermined time after a first contact with the touch-sensitive display 230. For example, in one embodiment, a second contact must be substantially simultaneous with a first contact for the processor 210 to determine a response.
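The sketch below illustrates the two timing tests described above: whether a contact's duration exceeds a threshold, and whether two contacts began closely enough in time to count as substantially simultaneous. The threshold and window values are assumptions.

```python
# Hypothetical duration and simultaneity tests on contact timestamps (seconds).
def exceeds_duration(t_down: float, t_now: float, threshold_s: float = 0.5) -> bool:
    # True if the contact has persisted longer than the threshold duration.
    return (t_now - t_down) > threshold_s

def substantially_simultaneous(t_first: float, t_second: float,
                               window_s: float = 0.05) -> bool:
    # True if the second contact began within the assumed window of the first.
    return abs(t_second - t_first) <= window_s
```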
In some embodiments, location data received by the processor 210 from the touch-sensitive display 230 or the network interface 250 may be used to determine a response, if any. The location of a contact may be determined in any number of ways. For example, the touch-sensitive display 230 may be addressable using Cartesian x and y coordinates or polar coordinates. Thus, in one embodiment, if the location of a first contact has an x coordinate that is larger than the x coordinate of the second location of a second contact, then the device 200 may determine that the first location is greater than the second location. In another embodiment, if the location of a first contact has a y coordinate larger than the y coordinate of the second location of a second contact, then the device 200 may determine that the first location is greater than the second location. In still other embodiments, a formula based on the x and y coordinates of each contact may be used to determine the response, if any, of the device 200. For example, in one embodiment the formula sqrt(x^2 + y^2) may be used to determine whether a contact is within a particular area or distance from a specified location on the touch-sensitive display 230. In another embodiment, the formula x + 2y may be used to determine whether a contact is within a rectangle from a specified coordinate on the touch-sensitive display 230. In one embodiment, the device 200 may determine the location of a contact by logically dividing the touch-sensitive display 230 into sections. For example, the device 200 may logically divide a rectangular touch-sensitive display 230 into three rows and three columns, thus creating nine contact cells as shown in
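By way of illustration, the sketch below implements two of the location tests described above: a distance check using sqrt(x^2 + y^2) relative to a specified point, and the logical division of a rectangular display into a 3x3 grid of contact cells. The display dimensions and the "A" through "I" cell labels are assumptions (the labels echo the sections "B" and "F" referenced below).

```python
import math

# Hypothetical distance test: is the contact within `radius` of point (cx, cy)?
def within_distance(x, y, cx, cy, radius) -> bool:
    return math.sqrt((x - cx) ** 2 + (y - cy) ** 2) <= radius

# Hypothetical 3x3 logical division of a rectangular display into nine cells.
def contact_cell(x, y, width=480, height=800) -> str:
    col = min(int(x / (width / 3)), 2)   # column index 0-2
    row = min(int(y / (height / 3)), 2)  # row index 0-2
    return "ABCDEFGHI"[row * 3 + col]    # nine cells labeled "A" through "I"

print(contact_cell(100, 100))  # -> 'A' (top-left cell in this sketch)
```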
Referring again to
For example, the processor 210 may determine that a haptic effect should be output each time a contact is made with the touch-sensitive display 230. Thus, as a user contacts sections “B” and “F,” the processor 210 determines a haptic effect should be output in response to each contact. Further, once the contacts are recognized as a gesture, such as a scroll gesture, the processor 210 may determine a haptic effect associated with the gesture.
In another embodiment, the processor 210 may determine that a response to a detected contact or sequence of contacts is to update an image displayed on the touch-sensitive display 230 and to output a haptic effect. For example, a response may be that an image displayed on the touch-sensitive display 230 is moved. In one embodiment, a response may be that an image displayed on the touch-sensitive display 230 is rotated. For example, referring again to
In one embodiment, a response may be that the graphics displayed on the touch-sensitive display 230 are zoomed in or out. For example, referring still to
In other embodiments, a response may be determined based on a change in location of one or more contacts on the touch-sensitive display 230. For example, the processor 210 may determine a response based on the location of a first contact changing in a northern direction and the location of a second contact changing in an eastern direction. In another embodiment, the processor 210 may determine a response based on the location of a first contact moving in a western direction and a second contact moving in an eastern direction. In other embodiments, the processor 210 can determine a response based on whether the location of a first contact is moving in an opposite direction of the location of a second contact on the touch-sensitive display 230.
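A sketch of classifying the change in location of two contacts follows, for example to detect two contacts moving in opposite horizontal directions as described above. The convention that the x coordinate grows in the eastern direction is an assumption.

```python
# Hypothetical classification of a contact's horizontal direction of motion.
def horizontal_direction(old_x: float, new_x: float) -> str:
    if new_x > old_x:
        return "east"
    if new_x < old_x:
        return "west"
    return "none"

# True if the two contacts are moving in opposite horizontal directions.
def opposite_directions(c1_old_x, c1_new_x, c2_old_x, c2_new_x) -> bool:
    d1 = horizontal_direction(c1_old_x, c1_new_x)
    d2 = horizontal_direction(c2_old_x, c2_new_x)
    return {d1, d2} == {"east", "west"}

print(opposite_directions(100, 80, 300, 340))  # -> True (west and east)
```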
In some embodiments, a response may be determined based on a specified interaction with the device 200. An interaction can include any number of actions based on one or more contacts. For example, in one embodiment, the processor 210 may determine a response based on an interaction in which a first contact has a location corresponding with a graphical object on the touch-sensitive display 230 and a second contact has a location not corresponding with the graphical object. In other embodiments, an interaction may be based on two contacts having locations corresponding with a graphical object on the touch-sensitive display 230. In various embodiments, an interaction may be based on two graphical objects on the touch-sensitive display 230, where the location of a first contact corresponds with the first graphical object and the location of a second contact corresponds with the second graphical object.
In other embodiments, the processor 210 can determine a response to a contact on the touch-sensitive display 230 based on a combination of the various data the processor 210 receives from the touch-sensitive display 230 or the network interface 250, or one or more of the factors such as a change in location or an interaction. For example, in one embodiment, a response can be determined by the processor 210 based on both pressure and location of one or more contacts on the touch-sensitive display 230. In another embodiment, the processor 210 can determine a response based on pressure and an interaction. For example, the processor 210 may determine that the color of a graphical object displayed on the touch-sensitive display 230 needs to be changed based upon a first contact having a location corresponding to the graphical object, a second contact not having a location corresponding to the graphical object, and the first contact having a specified pressure. Other embodiments are described herein and still other embodiments would be apparent to one of skill in the art.
Referring again to the embodiment shown in
In some embodiments, the processor 210 generates a single signal after determining a response. For example, if the processor 210 determines that the touch-sensitive display 230 needs to be updated, then the processor 210 can generate a display signal and send the signal to the touch-sensitive display 230 that causes the graphics associated with the touch-sensitive display 230 to be updated. In other embodiments, the processor 210 generates two, three, or more signals. For example, in one embodiment, the processor 210 generates a different signal for each response that is determined in block 330 of the method 300 shown in
In one embodiment, a generated signal includes a command for a device or component to perform a specified function, such as to output a haptic effect, display an image, play a sound, or transmit a message to a remote device. In another embodiment, a generated signal includes parameters which are used by a device or component receiving the command to determine a response or some aspect of a response. Parameters may include various data related to, for example, magnitudes, frequencies, durations, or other parameters that an actuator can use to determine a haptic effect, output a haptic effect, or both. For example, in one embodiment, the processor 210 generates a signal configured to cause actuator 240 to output a haptic effect. In such an embodiment, the signal may include a pressure parameter that the actuator 240 uses to determine the intensity of the haptic effect to output. For example, according to one embodiment, the larger the pressure parameter the actuator 240 receives, the more intense the haptic effect that is output. Thus, a signal may include data that is configured to be processed by an actuator, display, network interface, speaker, or other component of a device or in communication with a device in order to determine an aspect of a particular response.
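As a sketch of such a parameterized signal, the function below builds a command whose magnitude grows with the pressure parameter, as described above. The command name, parameter keys, default frequency, and the linear pressure-to-magnitude mapping are all assumptions.

```python
# Hypothetical generated signal: a command plus parameters an actuator
# could use to determine and output a haptic effect.
def make_haptic_signal(pressure: float, duration_ms: int = 50) -> dict:
    return {
        "command": "output_haptic_effect",
        "magnitude": min(1.0, pressure),  # larger pressure -> more intense effect
        "frequency_hz": 175,              # assumed default vibration frequency
        "duration_ms": duration_ms,
    }

print(make_haptic_signal(0.6))
```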
Referring again to
In various embodiments, the processor 210 may output one or more generated signals to any number of devices. For example, the processor 210 may output one signal to the network interface 250. In one embodiment, the processor 210 may output one generated signal to the touch-sensitive display 230, another generated signal to the network interface 250, and another generated signal to the actuator 260. In other embodiments, the processor 210 may output a single generated signal to multiple components or devices. For example, in one embodiment, the processor 210 outputs one generated signal to both actuator 240 and actuator 260. In another embodiment, the processor 210 outputs one generated signal to actuator 240, actuator 260, and network interface 250. In still another embodiment, the processor 210 outputs one generated signal to both actuator 240 and actuator 260 and outputs a second generated signal to the touch-sensitive display 230.
As discussed above, the processor 210 may output one or more signals to the network interface 250. For example, the processor 210 may output a signal to the network interface 250 instructing the network interface 250 to send data to another component or device in communication with the device 200. In such an embodiment, the network interface 250 may send data to the other device and the other device may perform a function such as updating a display associated with the other device, or the other device may output a haptic effect. Thus, in embodiments of the present invention, a second device may output a haptic effect based at least in part upon an interaction with a first device in communication with the second device. In other embodiments, a second device may perform any number of functions such as, for example, updating a display associated with the second device or outputting a sound to a speaker associated with the second device based at least in part on an interaction with a first multi-pressure touch-sensitive input device 200.
In various embodiments, after the processor 210 outputs a signal to a component, the component may send the processor 210 a confirmation indicating that the component received the signal. For example, in one embodiment, actuator 260 may receive a command from the processor 210 to output a haptic effect. Once actuator 260 receives the command, the actuator 260 may send a confirmation response to the processor 210 that the command was received by the actuator 260. In another embodiment, the processor 210 may receive completion data indicating that a component not only received an instruction but that the component has performed a response. For example, in one embodiment, actuator 240 may receive various parameters from the processor 210. Based on these parameters, actuator 240 may output a haptic effect and send the processor 210 completion data indicating that actuator 240 received the parameters and output a haptic effect.
Another embodiment of the present invention that implements the method 300 shown in
An embodiment of a sculpting application that implements the method 300 shown in
One embodiment of the present invention is directed to a texture-based application that implements the method 300 shown in
In a further embodiment, an image of a keyboard is displayed on the touch-sensitive display 230. A user can interact with the device by contacting the touch-sensitive display 230 at locations which correspond to keys on the keyboard. In some embodiments, a user may use multiple fingers to type on the keyboard. In this embodiment, a haptic effect may be output based on the pressure of one or more contacts. For example, in one embodiment, the magnitude of a haptic effect is a function of the pressure with which a user contacts the touch-sensitive display 230. Thus, the harder (i.e., with more pressure) a user contacts the touch-sensitive display 230, the stronger the haptic effect.
While the steps of method 300 have been shown and described in a particular order, other embodiments of the present invention may comprise the same or additional steps or may perform the steps shown in
Referring now to
The method 500 shown in
In block 520, the processor 210 determines whether the pressure of the first contact is greater than the pressure of the second contact. If the pressure of the first contact is greater than the pressure of the second contact, the method 500 proceeds to block 530; otherwise, it proceeds to block 550.
In block 530, the processor 210 generates a first actuator signal. In the embodiment shown in
Once the processor 210 generates the first actuator signal as shown in block 530, the processor 210 outputs the first actuator signal as shown in block 540. For example, in the embodiment shown in
In block 550, the processor 210 generates a second actuator signal and, as shown in block 560, outputs the second actuator signal to actuator 260. In this embodiment, the second actuator signal includes a magnitude parameter that actuator 260 uses to determine the desired haptic effect and actuator 260 outputs the haptic effect. For example, in the embodiment shown in
Thus, in the embodiment shown in
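A compact sketch of this branch of method 500 follows: the contact with the greater pressure selects which actuator signal is generated and which actuator receives it. The actuator interface, the stand-in actuator class, and the magnitude values are assumptions.

```python
# Hypothetical implementation of blocks 520-560 of method 500.
def method_500_step(p1: float, p2: float, actuator_240, actuator_260):
    if p1 > p2:                            # block 520
        signal = {"magnitude": p1 - p2}    # block 530: first actuator signal
        actuator_240.output(signal)        # block 540
    else:
        signal = {"magnitude": p2 - p1}    # block 550: second actuator signal
        actuator_260.output(signal)        # block 560

class PrintActuator:  # stand-in actuator for illustration only
    def __init__(self, name):
        self.name = name
    def output(self, signal):
        print(self.name, "->", signal)

method_500_step(0.8, 0.3, PrintActuator("actuator 240"), PrintActuator("actuator 260"))
```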
Another embodiment of the present invention that implements the method 500 shown in
In some embodiments, the pressure of one or more contacts may be used to determine the rate of turning. For example, in one embodiment an increase in pressure of a contact results in an increased rate of turning. In other embodiments, the pressure of one or more contacts is used to determine both direction and rate of turning. For example, in an embodiment, the pressure of one contact determines the direction of the snowboarder (i.e. left or right) and a pressure of another contact determines the rate of turning. In this embodiment, the direction of the snowboarder may be a function of a threshold pressure. Thus, if the pressure of the contact associated with the direction of the snowboarder is greater than the threshold pressure, the snowboarder may move to the right. If the pressure of the contact associated with the direction of the snowboarder is less than the threshold pressure, the snowboarder may move to the left. Furthermore, in this embodiment, the rate of turning may be a function of the pressure. Thus, an increase in pressure of the contact associated with the rate of turning may result in an increase in the rate of turning of the snowboarder. Likewise, a decrease in pressure of the contact associated with the rate of turning may result in a decrease in the rate of turning of the snowboarder.
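The steering scheme just described can be sketched as follows: one contact's pressure, compared against a threshold, selects the snowboarder's direction, while the other contact's pressure sets the rate of turning. The threshold and the rate scaling are assumptions.

```python
# Hypothetical snowboarder steering from two contact pressures.
def steering(direction_pressure: float, rate_pressure: float,
             threshold: float = 0.5):
    # Pressure above the threshold steers right; below it steers left.
    direction = "right" if direction_pressure > threshold else "left"
    # The rate of turning rises and falls with the second contact's pressure.
    rate = rate_pressure * 90.0  # e.g. degrees per second, assumed scaling
    return direction, rate

print(steering(0.7, 0.4))  # -> ('right', 36.0)
```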
In embodiments, one or more haptic effects may also be output based at least in part on the pressure of one or more of the contacts to indicate to the user the direction or the rate of turning, or both. For example, in one embodiment, a haptic effect may be output that indicates that the snowboarder is moving to the left and another haptic effect may be output that indicates that the snowboarder is moving to the right. For example, a vibration may be output on a right side of the device, or a vibration may be output on a left side of the device and move to the right side of the device at a rate corresponding to the rate of the snowboarder's turning. In another embodiment, a haptic effect may be output that indicates that the rate of turning of the snowboarder is increasing and another haptic effect may be output that indicates that the rate of turning of the snowboarder is decreasing, such as by increasing or decreasing a frequency or magnitude of a vibration.
Referring now to
In another embodiment, a user may touch locations on the touch-sensitive display 230 with one hand corresponding with notes "C", "E", and "G" and substantially simultaneously the user may touch locations on the touch-sensitive display 230 with another hand corresponding with notes "D", "F", and "A". In response, the device 200 may output a sound having a frequency corresponding with the "C-E-G" chord and a sound having a frequency corresponding with the "D-F-A" chord. In some embodiments, the device 200 may output one or more haptic effects to alert the user that a particular chord or combination of chords, or both, is being pressed by the user. For example, one or more haptic effects may be output that indicate which chord is being played. In such an embodiment, one haptic effect is output if a user plays the "C-E-G" chord and a different haptic effect is output if a user plays the "D-F-A" chord. Thus, a hearing-impaired user, or a user that wants sound on the device to be muted, can practice playing the simulated piano 710 and determine which chords are being played based upon one or more haptic effects output by the device 200. In another embodiment, the intensity of one or more haptic effects output by the device 200 may be increased or decreased as a user increases or decreases, respectively, the pressure on various contacts on the simulated keyboard 710. Thus, a user can simulate playing a keyboard by pressing locations on the touch-sensitive display 230 corresponding with the various notes that the user wishes to play and can receive haptic feedback indicating the note or notes that the user presses.
In one embodiment, the processor 210 executes software that determines whether the user is playing the correct notes at the correct time for a given song. For example, for a particular song the notes "C" and "E" may need to be played simultaneously, followed by the notes "D", "F", and "A" played simultaneously. If the user incorrectly presses notes "C" and "F" instead of notes "C" and "E", the device 200 may output a haptic effect alerting the user that an incorrect note has been played. Likewise, if the user correctly plays notes "C" and "E" simultaneously and plays notes "D", "F", and "A" simultaneously but with incorrect timing (i.e., the notes are played too quickly or too slowly), the device 200 may output a different haptic effect alerting the user that the timing was incorrect.
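Such a practice check might be sketched as follows: each step of a song expects a set of notes played substantially simultaneously, and a wrong note set or a step played outside its timing window selects a distinct haptic alert. The song encoding, effect labels, and tolerance value are assumptions.

```python
# Hypothetical per-step check of played notes against a song's expected notes.
def check_step(expected_notes, expected_time, played_notes, played_time,
               time_tolerance_s=0.25):
    if set(played_notes) != set(expected_notes):
        return "wrong_note_effect"    # e.g. user played C and F instead of C and E
    if abs(played_time - expected_time) > time_tolerance_s:
        return "wrong_timing_effect"  # correct notes, played too quickly or too slowly
    return None                       # correct notes at the correct time

# Expected C+E at t=0.0; the user played C+F at t=0.1.
print(check_step({"C", "E"}, 0.0, {"C", "F"}, 0.1))  # -> wrong_note_effect
```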
In another embodiment, a first multi-pressure touch-sensitive input device 200 is in communication with a second multi-pressure touch-sensitive input device 200. In this embodiment, the touch-sensitive display 230 of the first device 200 may display the same information as the touch-sensitive display 230 of the second device 200. For example, both devices may display a keyboard as shown in
Referring now to
The method shown in
In block 820, a first contact and a second contact are received. For example, in the embodiment shown in
Once a first contact and a second contact are received, the method 800 proceeds to block 830. In block 830, a determination is made as to whether the first contact is in a location corresponding to the graphical object. For example, in the embodiment shown in
In block 840, a determination is made as to whether the second contact is in a location corresponding to the graphical object. For example, in
In block 845, the processor 210 generates a first actuator signal. For example, in
Referring still to
Once the first actuator signal has been generated as shown in block 845, the processor 210 outputs the first actuator signal as shown in block 850. For example, in the embodiment shown in
In block 855, the processor 210 generates a second actuator signal. For example, in
Referring still to
Once the second actuator signal has been generated as shown in block 855, the processor 210 outputs the second actuator signal as shown in block 860. For example, in the embodiment shown in
If it was determined in block 830 that the first contact was not in a location corresponding to the graphical object, the method proceeds to block 865. In block 865, a determination is made as to whether the second contact is in a location corresponding to the graphical object. For example, in
In block 870, the processor 210 generates a third actuator signal. For example, in
Referring still to
Once the third actuator signal has been generated as shown in block 870, the processor 210 outputs the third actuator signal as shown in block 875. For example, in some embodiments disclosed above, the processor 210 outputs the generated third actuator signal to actuator 240. Actuator 240 receives the third actuator signal from the processor 210 and outputs a haptic effect indicating that the location of where the graphical object 910 is displayed on the touch-sensitive display 230 is being changed. In addition, in embodiments shown with respect to
In block 880, the processor 210 generates a fourth actuator signal. For example, in
Referring still to
Once the fourth actuator signal has been generated as shown in block 880, the processor 210 outputs the fourth actuator signal as shown in block 885. For example, in some embodiments discussed above with respect to
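The four-way dispatch of method 800 (blocks 830 through 885) can be sketched compactly: which actuator signal is generated depends on which of the two contacts falls on the graphical object. The boolean tests and signal labels below are stand-ins for the location comparisons and actuator signals described above.

```python
# Hypothetical dispatch over the four outcomes of method 800.
def method_800_dispatch(first_on_object: bool, second_on_object: bool) -> str:
    if first_on_object:                       # block 830
        if second_on_object:                  # block 840
            return "first_actuator_signal"    # block 845: both contacts on the object
        return "second_actuator_signal"       # block 855: only the first contact on it
    if second_on_object:                      # block 865
        return "third_actuator_signal"        # block 870: only the second contact on it
    return "fourth_actuator_signal"           # block 880: neither contact on the object

print(method_800_dispatch(True, False))  # -> second_actuator_signal
```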
Another embodiment of the present invention that implements the method 800 shown in
In various embodiments, one or more haptic effects may be output based on two or more contacts on one device and two or more contacts on another device. For example, two devices 200 may be in communication with each other and a user of one device 200 may touch a first location on the display 230 with a first finger and may touch a second location on the display 230 with a second finger. Likewise, a user of the second device 200 may touch a first location on the display 230 with a first finger and may touch a second location on the display 230 with a second finger. In one embodiment, if the location of the first contact on the first device substantially corresponds with the location of the first contact on the second device and the location of the second contact on the first device substantially corresponds with the location of the second contact on the second device, then a response may occur. For example, in one embodiment, the response may be that access is granted to either or both users to a file, website, application, etc. In embodiments, the response may include one or more haptic effects indicating that access is granted or that the locations of both contacts on each device are substantially at the same location. In other embodiments, one or more haptic effects may be output to either device or both devices indicating that at least one of the contacts does not match if any of the contacts are not at a substantially similar location.
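A sketch of this cross-device match follows: access is granted only when each contact on the first device lands substantially at the same location as the corresponding contact on the second device. The tolerance value and the pairing of contacts by index are assumptions.

```python
# Hypothetical location match between contact lists from two devices.
def locations_match(contacts_a, contacts_b, tolerance: float = 10.0) -> bool:
    if len(contacts_a) != len(contacts_b):
        return False
    return all(abs(ax - bx) <= tolerance and abs(ay - by) <= tolerance
               for (ax, ay), (bx, by) in zip(contacts_a, contacts_b))

# Grant access only if both users' contact patterns substantially coincide.
print(locations_match([(100, 200), (300, 400)], [(103, 198), (305, 396)]))  # -> True
```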
In some embodiments, one or more haptic effects may be output based on the pressure of a contact on a first device and a pressure of a contact on a second device where the first device and the second device are in communication with each other. For example, in a wrestling application where two or more devices are in communication with each other, a user of one of the devices may contact the touch-sensitive display 230 at one location and with a first pressure. A user of another device may contact the touch-sensitive display 230 at a second location corresponding with the first location on the display of the first device and with a second pressure. In this embodiment, one or more haptic effects may be output on either device or both devices based on the pressure of the contacts. For example, in one embodiment, if the pressure of the first contact on the first device is greater than the pressure of the second contact on the second device, then a haptic effect may be output on the second device indicating that the first user is punching harder than the second user. In another embodiment, if the pressure of the second contact on the second device is greater than the pressure of the first contact on the first device, then a haptic effect may be output on the first device indicating that the second user is pushing or grappling harder than the first user and another haptic effect may be output on the second device indicating that the second user is currently winning the match.
While the methods and systems herein are described in terms of software executing on various machines, the methods and systems may also be implemented as specifically-configured hardware, such as a field-programmable gate array (FPGA) configured to execute the various methods. For example, referring again to
Such processors may comprise, or may be in communication with, media, for example computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage device capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. The processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
The foregoing description of some embodiments of the invention has been presented only for the purpose of illustration and description and is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Numerous modifications and adaptations thereof will be apparent to those skilled in the art without departing from the spirit and scope of the invention.
Reference herein to “one embodiment” or “an embodiment” means that a particular feature, structure, operation, or other characteristic described in connection with the embodiment may be included in at least one implementation of the invention. The invention is not restricted to the particular embodiments described as such. The appearance of the phrase “in one embodiment” or “in an embodiment” in various places in the specification does not necessarily refer to the same embodiment. Any particular feature, structure, operation, or other characteristic described in this specification in relation to “one embodiment” may be combined with other features, structures, operations, or other characteristics described in respect of any other embodiment.
This application is a continuation of U.S. application Ser. No. 15/405,550, filed Jan. 13, 2017, entitled “Systems and Methods for Multi-Pressure Interaction on Touch-Sensitive Surfaces,” which is a continuation of U.S. application Ser. No. 13/290,502, filed Nov. 7, 2011, entitled “Systems and Methods for Multi-Pressure Interaction on Touch-Sensitive Surfaces,” now U.S. Pat. No. 9,582,178, the entireties of which are hereby incorporated by reference for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
2972140 | Hirsch | Feb 1961 | A |
3157853 | Hirsch | Nov 1964 | A |
3220121 | Cutler | Nov 1965 | A |
3497668 | Hirsch | Feb 1970 | A |
3517446 | Corlyon et al. | Jun 1970 | A |
3623064 | Kagan | Nov 1971 | A |
3875488 | Crocker et al. | Apr 1975 | A |
3902687 | Hightower | Sep 1975 | A |
3903614 | Diamond et al. | Sep 1975 | A |
3911416 | Feder | Oct 1975 | A |
3919691 | Noll | Nov 1975 | A |
3923166 | Fletcher et al. | Dec 1975 | A |
4000383 | Lockard | Dec 1976 | A |
4127752 | Lowthorp | Nov 1978 | A |
4160508 | Salsbury | Jul 1979 | A |
4221941 | Genovese | Sep 1980 | A |
4236325 | Hall et al. | Dec 1980 | A |
4262549 | Schwellenbach | Apr 1981 | A |
4311980 | Prudenziati | Jan 1982 | A |
4333070 | Barnes | Jun 1982 | A |
4334280 | McDonald | Jun 1982 | A |
4362408 | Cordes et al. | Dec 1982 | A |
4383154 | Sorenson | May 1983 | A |
4398889 | Lam et al. | Aug 1983 | A |
4436188 | Jones | Mar 1984 | A |
4464117 | Forest | Aug 1984 | A |
4477043 | Repperger | Oct 1984 | A |
4484191 | Vavra | Nov 1984 | A |
4513235 | Acklam et al. | Apr 1985 | A |
4560983 | Williams | Dec 1985 | A |
4581491 | Boothroyd | Apr 1986 | A |
4581972 | Hoshino | Apr 1986 | A |
4599070 | Hladky et al. | Jul 1986 | A |
4603284 | Pea ley | Jul 1986 | A |
4604016 | Joyce | Aug 1986 | A |
4689449 | Rosen | Aug 1987 | A |
4692756 | Clark | Sep 1987 | A |
4706294 | Ouchida | Nov 1987 | A |
4708656 | De Vries et al. | Nov 1987 | A |
4713007 | Alban | Dec 1987 | A |
4725817 | Wihlborg | Feb 1988 | A |
4758165 | Tieman et al. | Jul 1988 | A |
4772205 | Chlumsky | Sep 1988 | A |
4782327 | Kiev et al. | Nov 1988 | A |
4791416 | Adler | Dec 1988 | A |
4794384 | Jackson | Dec 1988 | A |
4794392 | Selinko | Dec 1988 | A |
4795296 | Jau | Jan 1989 | A |
4798919 | Miessler et al. | Jan 1989 | A |
4800721 | Cemenska et al. | Jan 1989 | A |
4821030 | Batson et al. | Apr 1989 | A |
4823106 | Lovell | Apr 1989 | A |
4823634 | Culver | Apr 1989 | A |
4837734 | Ichikawa et al. | Jun 1989 | A |
4839838 | laBiche et al. | Jun 1989 | A |
4840634 | Muller et al. | Jun 1989 | A |
4853874 | Iwamoto et al. | Aug 1989 | A |
4861269 | Meenen, Jr. | Aug 1989 | A |
4868549 | Affinito et al. | Sep 1989 | A |
4885565 | Embach | Dec 1989 | A |
4891764 | McIntosh | Jan 1990 | A |
4896554 | Culver | Jan 1990 | A |
4906843 | Jones et al. | Mar 1990 | A |
4926879 | Sevraln et al. | May 1990 | A |
4930770 | Baker | Jun 1990 | A |
4934694 | McIntosh | Jun 1990 | A |
4935728 | Kiev | Jun 1990 | A |
4949119 | Moncrief et al. | Aug 1990 | A |
4961038 | MacMinn | Oct 1990 | A |
4982918 | Kaye | Jan 1991 | A |
4983786 | Stevens et al. | Jan 1991 | A |
4983901 | Lehmer | Jan 1991 | A |
5004391 | Burden | Apr 1991 | A |
5007300 | Siva | Apr 1991 | A |
5019761 | Kraft | May 1991 | A |
5022384 | Freels | Jun 1991 | A |
5022407 | Horch et al. | Jun 1991 | A |
5035242 | Franklin et al. | Jul 1991 | A |
5038089 | Szakaly | Aug 1991 | A |
5044956 | Behensky et al. | Sep 1991 | A |
5053585 | Yaniger | Oct 1991 | A |
5065145 | Purcell | Nov 1991 | A |
5076517 | Ferranti et al. | Dec 1991 | A |
5078152 | Bond | Jan 1992 | A |
5095303 | Clark et al. | Mar 1992 | A |
5103404 | Mcintosh | Apr 1992 | A |
5107080 | Rosen | Apr 1992 | A |
5107262 | Cadoz et al. | Apr 1992 | A |
5116051 | Moncrief | May 1992 | A |
5116180 | Fung et al. | May 1992 | A |
5121091 | Fujiyama | Jun 1992 | A |
5139261 | Openiano | Aug 1992 | A |
5146566 | Hollis, Jr. et al. | Sep 1992 | A |
5165897 | Johnson | Nov 1992 | A |
5175459 | Daniel et al. | Dec 1992 | A |
5182557 | Lang | Jan 1993 | A |
5184319 | Kramer | Feb 1993 | A |
5185561 | Good et al. | Feb 1993 | A |
5186629 | Rohen | Feb 1993 | A |
5186685 | Mangseth et al. | Feb 1993 | A |
5189355 | Larkins et al. | Feb 1993 | A |
5193963 | McAffee et al. | Mar 1993 | A |
5197003 | Moncrief et al. | Mar 1993 | A |
5203563 | Loper, III | Apr 1993 | A |
5212473 | Louis | May 1993 | A |
5220260 | Schuler | Jun 1993 | A |
5223658 | Suzuki | Jun 1993 | A |
5223776 | Radke et al. | Jun 1993 | A |
5235868 | Culver | Aug 1993 | A |
5237327 | Saitoh et al. | Aug 1993 | A |
5240417 | Smithson et al. | Aug 1993 | A |
5241308 | Young | Aug 1993 | A |
5246316 | Smith | Sep 1993 | A |
5264768 | Gregory et al. | Nov 1993 | A |
5271290 | Fischer | Dec 1993 | A |
5275174 | Cook | Jan 1994 | A |
5275565 | Moncrief | Jan 1994 | A |
5283970 | Aigner | Feb 1994 | A |
5286203 | Fuller et al. | Feb 1994 | A |
5289273 | Lang | Feb 1994 | A |
5296871 | Paley | Mar 1994 | A |
5299810 | Pierce et al. | Apr 1994 | A |
5302132 | Corder | Apr 1994 | A |
5309140 | Everett | May 1994 | A |
5313230 | Venolia et al. | May 1994 | A |
5334027 | Wherlock | Aug 1994 | A |
5341459 | Backes | Aug 1994 | A |
5354162 | Burden et al. | Oct 1994 | A |
5355148 | Anderson | Oct 1994 | A |
5376948 | Roberts | Dec 1994 | A |
5381080 | Schnell et al. | Jan 1995 | A |
5389849 | Asano et al. | Feb 1995 | A |
5389865 | Jacobus et al. | Feb 1995 | A |
5390128 | Ryan et al. | Feb 1995 | A |
5390296 | Crandall et al. | Feb 1995 | A |
5396266 | Brimhall | Mar 1995 | A |
5398044 | Hill | Mar 1995 | A |
5399091 | Mitsumoto | Mar 1995 | A |
5402499 | Robison et al. | Mar 1995 | A |
5402680 | Korenaga | Apr 1995 | A |
5405152 | Katanics et al. | Apr 1995 | A |
5414337 | Schuler | May 1995 | A |
5436622 | Gutman et al. | Jul 1995 | A |
5437607 | Taylor | Aug 1995 | A |
5451924 | Massimino et al. | Sep 1995 | A |
5457479 | Chen | Oct 1995 | A |
5459382 | Jacobus et al. | Oct 1995 | A |
5461711 | Wang et al. | Oct 1995 | A |
5466213 | Hogan et al. | Nov 1995 | A |
5471571 | Smith et al. | Nov 1995 | A |
5473235 | Lance et al. | Dec 1995 | A |
5473344 | Bacon et al. | Dec 1995 | A |
5489812 | Furuhata et al. | Feb 1996 | A |
5491477 | Clark et al. | Feb 1996 | A |
5496174 | Garner | Mar 1996 | A |
5506605 | Paley | Apr 1996 | A |
5512919 | Araki | Apr 1996 | A |
5513100 | Parker et al. | Apr 1996 | A |
5514150 | Rostoker | May 1996 | A |
5521336 | Buchanan et al. | May 1996 | A |
5530455 | Gillick et al. | Jun 1996 | A |
5542672 | Meredith | Aug 1996 | A |
5547382 | Yamasaki et al. | Aug 1996 | A |
5565887 | McCambridge et al. | Oct 1996 | A |
5575761 | Hajianpour | Nov 1996 | A |
5576727 | Rosenberg et al. | Nov 1996 | A |
5577981 | Jarvik | Nov 1996 | A |
5583407 | Yamaguchi | Dec 1996 | A |
5586033 | Hall | Dec 1996 | A |
5587937 | Massie et al. | Dec 1996 | A |
5589828 | Armstrong | Dec 1996 | A |
5589854 | Tsai | Dec 1996 | A |
5591082 | Jensen et al. | Jan 1997 | A |
5596347 | Robertson et al. | Jan 1997 | A |
5600777 | Wang et al. | Feb 1997 | A |
5619180 | Massimino | Apr 1997 | A |
5625576 | Massie et al. | Apr 1997 | A |
5629594 | Jacobus et al. | May 1997 | A |
5631861 | Kramer | May 1997 | A |
5638060 | Kataoka et al. | Jun 1997 | A |
5642469 | Hannaford et al. | Jun 1997 | A |
5642806 | Karadimas | Jul 1997 | A |
5643087 | Marcus et al. | Jul 1997 | A |
5656901 | Kurita | Aug 1997 | A |
5666138 | Culver | Sep 1997 | A |
5666473 | Wallace | Sep 1997 | A |
5684722 | Thorner et al. | Nov 1997 | A |
5690582 | Ulrich et al. | Nov 1997 | A |
5691747 | Amano | Nov 1997 | A |
5691898 | Rosenberg et al. | Nov 1997 | A |
5694013 | Stewart et al. | Dec 1997 | A |
5709219 | Chen et al. | Jan 1998 | A |
5714978 | Yamanaka | Feb 1998 | A |
5719561 | Gonzales | Feb 1998 | A |
5721566 | Rosenberg et al. | Feb 1998 | A |
5724106 | Autry et al. | Mar 1998 | A |
5724278 | Chen et al. | Mar 1998 | A |
5729249 | Yasutake | Mar 1998 | A |
5731804 | Rosenberg | Mar 1998 | A |
5734373 | Rosenberg et al. | Mar 1998 | A |
5736978 | Hasser et al. | Apr 1998 | A |
5739811 | Rosenberg et al. | Apr 1998 | A |
5742278 | Chen et al. | Apr 1998 | A |
5745715 | Pickover et al. | Apr 1998 | A |
5754023 | Rosten et al. | May 1998 | A |
5755577 | Gillio | May 1998 | A |
5757358 | Osga | May 1998 | A |
5760764 | Martinelli | Jun 1998 | A |
5766016 | Sinclair et al. | Jun 1998 | A |
5767457 | Gerpheide et al. | Jun 1998 | A |
5767839 | Rosenberg | Jun 1998 | A |
5769640 | Jacobus et al. | Jun 1998 | A |
5771037 | Jackson | Jun 1998 | A |
5781172 | Engel et al. | Jul 1998 | A |
5784052 | Keyson | Jul 1998 | A |
5785630 | Bobick et al. | Jul 1998 | A |
5790108 | Salcudean et al. | Aug 1998 | A |
5791992 | Crump et al. | Aug 1998 | A |
5796057 | Nakajima et al. | Aug 1998 | A |
5802353 | Avila et al. | Sep 1998 | A |
5803243 | Nestor et al. | Sep 1998 | A |
5805140 | Rosenberg et al. | Sep 1998 | A |
5805165 | Thorne, III et al. | Sep 1998 | A |
5808601 | Leah et al. | Sep 1998 | A |
5808603 | Chen | Sep 1998 | A |
5821921 | Osborn et al. | Oct 1998 | A |
5823876 | Unbehand | Oct 1998 | A |
5825308 | Rosenberg | Oct 1998 | A |
5826710 | Kurek et al. | Oct 1998 | A |
5831408 | Jacobus et al. | Nov 1998 | A |
5835080 | Beeteson et al. | Nov 1998 | A |
5836443 | Gernhardt et al. | Nov 1998 | A |
5844392 | Peurach et al. | Dec 1998 | A |
5857986 | Moriyasu | Jan 1999 | A |
5865303 | Gernhardt et al. | Feb 1999 | A |
5877748 | Redlich | Mar 1999 | A |
5880714 | Rosenberg et al. | Mar 1999 | A |
5887995 | Holehan | Mar 1999 | A |
5889670 | Schuler et al. | Mar 1999 | A |
5889672 | Schuler et al. | Mar 1999 | A |
5896125 | Niedzwiecki | Apr 1999 | A |
5897437 | Nishiumi et al. | Apr 1999 | A |
5903257 | Nishiumi et al. | May 1999 | A |
5912661 | Siddiqui | Jun 1999 | A |
5914705 | Johnson et al. | Jun 1999 | A |
5914708 | LeGrange et al. | Jun 1999 | A |
5917906 | Thornton | Jun 1999 | A |
5929846 | Rosenberg et al. | Jul 1999 | A |
5943044 | Martinelli et al. | Aug 1999 | A |
5944151 | Jakobs et al. | Aug 1999 | A |
5945772 | Macnak et al. | Aug 1999 | A |
5956016 | Kuenzner et al. | Sep 1999 | A |
5956484 | Rosenberg et al. | Sep 1999 | A |
5959613 | Rosenberg et al. | Sep 1999 | A |
5973689 | Gallery | Oct 1999 | A |
5977867 | Blouin | Nov 1999 | A |
5984785 | Takeda et al. | Nov 1999 | A |
5986643 | Harvill et al. | Nov 1999 | A |
5988902 | Holehan | Nov 1999 | A |
5990869 | Kubica et al. | Nov 1999 | A |
5999168 | Rosenberg et al. | Dec 1999 | A |
6001014 | Ogata | Dec 1999 | A |
6004134 | Marcus et al. | Dec 1999 | A |
6005551 | Osborne et al. | Dec 1999 | A |
6008800 | Pryor | Dec 1999 | A |
6020876 | Rosenberg et al. | Feb 2000 | A |
6024576 | Bevirt et al. | Feb 2000 | A |
6028593 | Rosenberg et al. | Feb 2000 | A |
6037927 | Rosenberg | Mar 2000 | A |
6059506 | Kramer | May 2000 | A |
6061004 | Rosenberg | May 2000 | A |
6067081 | Hahlganss et al. | May 2000 | A |
6067871 | Markyvech et al. | May 2000 | A |
6078126 | Rollins et al. | Jun 2000 | A |
6088017 | Tremblay et al. | Jun 2000 | A |
6088019 | Rosenberg | Jun 2000 | A |
6097964 | Nuovo et al. | Aug 2000 | A |
6100874 | Schena et al. | Aug 2000 | A |
6102803 | Takeda et al. | Aug 2000 | A |
6111577 | Zilles et al. | Aug 2000 | A |
6118435 | Fujita et al. | Sep 2000 | A |
6125385 | Wies et al. | Sep 2000 | A |
6128006 | Rosenberg | Oct 2000 | A |
6130393 | Chu | Oct 2000 | A |
6131097 | Peurach et al. | Oct 2000 | A |
6140987 | Stein et al. | Oct 2000 | A |
6147422 | Delson et al. | Nov 2000 | A |
6147674 | Rosenberg et al. | Nov 2000 | A |
6160489 | Perry et al. | Dec 2000 | A |
6161126 | Wies et al. | Dec 2000 | A |
6166723 | Schena et al. | Dec 2000 | A |
6175090 | Blossfeld | Jan 2001 | B1 |
6195592 | Schuler et al. | Feb 2001 | B1 |
6198206 | Saarmaa et al. | Mar 2001 | B1 |
6218966 | Goodwin et al. | Apr 2001 | B1 |
6219034 | Elbing et al. | Apr 2001 | B1 |
6225976 | Yates et al. | May 2001 | B1 |
6239790 | Martinelli et al. | May 2001 | B1 |
6243078 | Rosenberg | Jun 2001 | B1 |
6243080 | Molne | Jun 2001 | B1 |
6262717 | Donahue et al. | Jul 2001 | B1 |
6292173 | Rambaldi et al. | Sep 2001 | B1 |
6307465 | Kayama et al. | Oct 2001 | B1 |
6326901 | Gonzales | Dec 2001 | B1 |
6337678 | Fish | Jan 2002 | B1 |
6339201 | Balaban et al. | Jan 2002 | B1 |
6342880 | Rosenberg et al. | Jan 2002 | B2 |
6344791 | Armstrong | Feb 2002 | B1 |
6347997 | Armstrong | Feb 2002 | B1 |
6368016 | Smith et al. | Apr 2002 | B1 |
6369803 | Brisebois et al. | Apr 2002 | B2 |
6373463 | Beeks | Apr 2002 | B1 |
6374255 | Peurach et al. | Apr 2002 | B1 |
6388655 | Leung | May 2002 | B1 |
6389302 | Vance | May 2002 | B1 |
6414674 | Kamper et al. | Jul 2002 | B1 |
6422941 | Thorner et al. | Jul 2002 | B1 |
6429846 | Rosenberg et al. | Aug 2002 | B2 |
6445284 | Cruz-Hernandez et al. | Sep 2002 | B1 |
6447069 | Terris et al. | Sep 2002 | B1 |
6469695 | White | Oct 2002 | B1 |
6473069 | Gerpheide | Oct 2002 | B1 |
6476794 | Kataoka et al. | Nov 2002 | B1 |
6487421 | Hess et al. | Nov 2002 | B2 |
6498601 | Gujar et al. | Dec 2002 | B1 |
6509892 | Cooper et al. | Jan 2003 | B1 |
6518958 | Miyajima et al. | Feb 2003 | B1 |
6525283 | Leng | Feb 2003 | B2 |
6529122 | Magnussen et al. | Mar 2003 | B1 |
6535201 | Cooper et al. | Mar 2003 | B1 |
6543487 | Bazinet | May 2003 | B2 |
6587091 | Serpa | Jul 2003 | B2 |
6597347 | Yasutake | Jul 2003 | B1 |
6610936 | Gillespie et al. | Aug 2003 | B2 |
6628195 | Coudor | Sep 2003 | B1 |
6636197 | Goldenberg et al. | Oct 2003 | B1 |
6636202 | Ishmael et al. | Oct 2003 | B2 |
6639582 | Shrader | Oct 2003 | B1 |
6647145 | Gay | Nov 2003 | B1 |
6657617 | Paolini et al. | Dec 2003 | B2 |
6781569 | Gregorio et al. | Sep 2004 | B1 |
6801191 | Mukai et al. | Oct 2004 | B2 |
6819312 | Fish | Nov 2004 | B2 |
6976562 | Perret et al. | Dec 2005 | B1 |
7168042 | Braun et al. | Jan 2007 | B2 |
8098235 | Heubel et al. | Jan 2012 | B2 |
8264465 | Grant et al. | Sep 2012 | B2 |
8482483 | Hasuike | Jul 2013 | B2 |
9030419 | Freed | May 2015 | B1 |
9582178 | Grant | Feb 2017 | B2 |
20010000663 | Shahoian et al. | May 2001 | A1 |
20010035854 | Rosenberg et al. | Nov 2001 | A1 |
20020033795 | Shahoian et al. | Mar 2002 | A1 |
20020044132 | Fish | Apr 2002 | A1 |
20020103025 | Murzanski et al. | Aug 2002 | A1 |
20020128048 | Aaltonen et al. | Sep 2002 | A1 |
20020149561 | Fukumoto et al. | Oct 2002 | A1 |
20020149570 | Knowles et al. | Oct 2002 | A1 |
20020156807 | Dieberger | Oct 2002 | A1 |
20020171621 | Johnson | Nov 2002 | A1 |
20020177471 | Kaaresoja et al. | Nov 2002 | A1 |
20030006892 | Church | Jan 2003 | A1 |
20030016211 | Woolley | Jan 2003 | A1 |
20030022701 | Gupta | Jan 2003 | A1 |
20030025679 | Taylor et al. | Feb 2003 | A1 |
20030030628 | Salo et al. | Feb 2003 | A1 |
20030038776 | Rosenberg et al. | Feb 2003 | A1 |
20030048260 | Matusis | Mar 2003 | A1 |
20030058265 | Robinson et al. | Mar 2003 | A1 |
20030067449 | Yoshikawa et al. | Apr 2003 | A1 |
20030071795 | Baldauf et al. | Apr 2003 | A1 |
20030095105 | Vaananen | May 2003 | A1 |
20030122779 | Martin | Jul 2003 | A1 |
20030128191 | Strasser et al. | Jul 2003 | A1 |
20030128192 | Van Os | Jul 2003 | A1 |
20030151597 | Roberts et al. | Aug 2003 | A1 |
20030174121 | Poupyrev et al. | Sep 2003 | A1 |
20030179190 | Franzen | Sep 2003 | A1 |
20040227721 | Moilanen et al. | Nov 2004 | A1 |
20050099393 | Johnson | May 2005 | A1 |
20060149495 | Mazalek et al. | Jul 2006 | A1 |
20090002328 | Ullrich et al. | Jan 2009 | A1 |
20090085878 | Heubel et al. | Apr 2009 | A1 |
20100026647 | Abe et al. | Feb 2010 | A1 |
20100045619 | Birnbaum et al. | Feb 2010 | A1 |
20100088654 | Henhoeffer | Apr 2010 | A1 |
20100127983 | Irani et al. | May 2010 | A1 |
20100156818 | Burrough et al. | Jun 2010 | A1 |
20100171712 | Cieplinski et al. | Jul 2010 | A1 |
20100231508 | Cruz-Hernandez et al. | Sep 2010 | A1 |
20100277431 | Klinghult | Nov 2010 | A1 |
20110014983 | Miller, IV | Jan 2011 | A1 |
20110018695 | Bells et al. | Jan 2011 | A1 |
20110018806 | Yano | Jan 2011 | A1 |
20110043527 | Ording et al. | Feb 2011 | A1 |
20110261021 | Modarres et al. | Oct 2011 | A1 |
20120038582 | Grant | Feb 2012 | A1 |
20120081337 | Camp et al. | Apr 2012 | A1 |
20120126941 | Coggill | May 2012 | A1 |
20120274662 | Kim et al. | Nov 2012 | A1 |
20120327172 | El-Saban | Dec 2012 | A1 |
20130113715 | Grant et al. | May 2013 | A1 |
20140083279 | Little et al. | Mar 2014 | A1 |
20150077345 | Hwang et al. | Mar 2015 | A1 |
Number | Date | Country |
---|---|---|
101809526 | Aug 2010 | CN |
101910978 | Dec 2010 | CN |
19 831 808 | Jan 1999 | DE |
0 085 518 | Jan 1983 | EP |
0 265 011 | Apr 1988 | EP |
0 349 086 | Jan 1990 | EP |
0 607 580 | Jul 1994 | EP |
0 626 614 | Nov 1994 | EP |
0 688 125 | Dec 1995 | EP |
0 556 999 | May 1998 | EP |
1 213 188 | Jun 2002 | EP |
2527966 | Nov 2012 | EP |
2 327 366 | Jan 1999 | GB |
H2-185278 | Jul 1990 | JP |
H4-8381 | Jan 1992 | JP |
H5-192449 | Aug 1993 | JP |
H7-24147 | Jan 1995 | JP |
2001-350592 | Dec 2001 | JP |
2002-259059 | Sep 2002 | JP |
2008-305174 | Dec 2008 | JP |
2010-033455 | Feb 2010 | JP |
2010-211509 | Sep 2010 | JP |
2010-541071 | Dec 2010 | JP |
2011-048606 | Mar 2011 | JP |
2011-528826 | Nov 2011 | JP |
2011-242386 | Dec 2011 | JP |
2013-500517 | Jan 2013 | JP |
10-2011-0086502 | Jul 2011 | KR |
WO 92-00559 | Jan 1992 | WO |
WO 95-20787 | Aug 1995 | WO |
WO 95-20788 | Aug 1995 | WO |
WO 95-32459 | Nov 1995 | WO |
WO 96-28777 | Sep 1996 | WO |
WO 97-12357 | Apr 1997 | WO |
WO 97-18546 | May 1997 | WO |
WO 97-21160 | Jun 1997 | WO |
WO 97-31333 | Aug 1997 | WO |
WO 98-08159 | Feb 1998 | WO |
WO 98-24183 | Jun 1998 | WO |
WO 98-43825 | Oct 1998 | WO |
WO 98-58323 | Dec 1998 | WO |
WO 00-030026 | May 2000 | WO |
WO 02-27645 | Apr 2002 | WO |
WO 02-31807 | Apr 2002 | WO |
WO 01-9110 | Nov 2002 | WO |
WO 03-042805 | May 2003 | WO |
WO 2009042424 | Apr 2009 | WO |
WO 2009-085378 | Jul 2009 | WO |
WO 2010009552 | Jan 2010 | WO |
WO 2011011552 | Jan 2011 | WO |
WO 11-076248 | Jun 2011 | WO |
WO 2011-090324 | Jul 2011 | WO |
Entry |
---|
Adachi et al., “Sensory Evaluation of Virtual Haptic Push-Buttons,” 1994, Suzuki Motor Corp., pp. 1-7. |
Adelstein, et al., “A High Performance Two-Degree-of-Freedom Kinesthetic Interface,” MIT, 1992, pp. 108-112. |
Akamatsu et al., “Multimodal Mouse: A Mouse-Type Device with Tactile and Force Display,” 1994, Presence, vol. 3, pp. 73-80. |
Atkinson et al., “Computing with Feeling,” Comput. & Graphics, vol. 2, 1977, pp. 97-103. |
Batter et al., “Grope-1: A Computer Display to the Sense of Feel,” Proc. IFIP Congress, 1971, pp. 759-763. |
Bejczy et al., “The Phantom Robot: Predictive Displays for Teleoperation with Time Delay,” IEEE CH2876, Jan. 1990, pp. 546-550. |
Brooks, Jr. et al., “Project GROPE: Haptic Displays for Scientific Visualization,” Computer Graphics, vol. 24, No. 4, 1990, pp. 177-184. |
Buttolo et al., “Pen-Based Force Display for Precision Manipulation in Virtual Environments,” IEEE 0-8186-7084-3, 1995, pp. 217-224. |
Colgate et al., “Implementation of Stiff Virtual Walls in Force-Reflecting Interfaces,” Northwestern University, IL, 1993, pp. 1-8. |
Dennerlein et al., “Vibrotactile Feedback for Industrial Telemanipulators,” 1997, Sixth Annual Symp. on Haptic Interfaces for Virtual Env. and Teleoperator Systems, ASME IMECE, Dallas, pp. 1-7. |
Ellis et al., “Design & Evaluation of a High-Performance Prototype Planar Haptic Interface,” Dec. 1993, Advances in Robotics, pp. 55-64. |
Fischer, et al., “Specification and Design of Input Devices for Teleoperation,” IEEE CH2876, Jan. 1990, pp. 540-545. |
Gotow et al., “Perception of Mechanical Properties at the Man—Machine Interface,” IEEE CH2503-1, 1987, pp. 688-690. |
Hannaford et al., “Force-Feedback Cursor Control,” NASA Tech Briefs, vol. 13, No. 11, 1989, pp. 1-7. |
Hannaford et al., “Performance Evaluation of a 6-Axis Generalized Force-Reflecting Teleoperator,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 21, No. 3, 1991, pp. 621-623, 631-633. |
Hasser, C. et al., “Tactile Feedback with Adaptive Controller for a Force-Reflecting Haptic Display,” Parts 1 and 2, IEEE 0-7803-3131-1, 1996, pp. 526-533. |
Hasser, C., “Tactile Feedback for a Force-Reflecting Haptic Display,” School of Eng., Univ. of Dayton, Dayton, OH, 1995, pp. 1-98. |
Hirota et al., “Development of Surface Display,” IEEE 0-7803-1363-1, 1993, pp. 256-262. |
Howe et al., “Task Performance with a Dextrous Teleoperated Hand System,” Proc. of SPIE, vol. 1833, 1992, pp. 1-9. |
Iwata, H., “Artificial Reality with Force-Feedback: Development of Desktop Virtual Space with Compact Master Manipulator,” Computer Graphics, vol. 24, No. 4, Aug. 1990, pp. 165-170. |
Kelley et al., “MagicMouse: Tactile and Kinesthetic Feedback in the Human-Computer Interface Using an Electromagnetically Actuated Input-Output Device,” Oct. 19, 1993, University of British Columbia, pp. 1-27. |
Kelley et al., “On the Development of a Force-Feedback Mouse and Its Integration into a Graphical User Interface,” Nov. 1994, Engineering Congress and Exhibition, pp. 1-8. |
Kilpatrick et al., “The Use of Kinesthetic Supplement in an Interactive Graphics System,” University of North Carolina, 1976, pp. 1-172. |
Kotoku, “A Predictive Display with Force Feedback and its Application to Remote Manipulation System with Transmission Time Delay,” Proc. of IEEE/RSJ Int'l Conf. on Intelligent Robots and Systems, Jul. 1992. |
Kotoku et al., “Environment Modeling for the Interactive Display (EMID) Used in Telerobotic Systems,” IEEE/RSJ Int'l Workshop on Intelligent Robots and Systems, Nov. 1991, pp. 999-1004. |
Millman et al., “Design of a 4 Degree of Freedom Force-Reflecting Manipulandum with a Specified Force-Torque Workspace,” IEEE CH2969-4, 1991, pp. 1488-1493. |
Minsky et al., “Feeling and Seeing: Issues in Force Display,” ACM 0-89791-351-5, 1990, pp. 235-242, 270. |
Munch et al., “Intelligent Control for Haptic Displays,” Eurographics '96, vol. 15, No. 3, 1996, pp. 217-226. |
Ouh-Young et al., “Creating an Illusion of Feel: Control Issues in Force Display,” Univ. of N. Carolina, 1989, pp. 1-14. |
Ouh-Young et al., “Using a Manipulator for Force Display in Molecular Docking,” IEEE CH2555, 1988, pp. 1824-1829. |
Patrick et al., “Design, Construction, and Testing of a Fingertip Tactile Display for Interaction with Virtual and Remote Environments,” Master of Science Thesis, MIT, Nov. 8, 1990. |
Payette et al., “Evaluation of a Force Feedback (Haptic) Computer Pointing Device in Zero Gravity,” Oct. 17, 1996, ASME Dynamics Systems, vol. 58, pp. 547-553. |
Ramstein et al., “The Pantograph: A Large Workspace Haptic Device for a Multimodal Human—Computer Interaction,” Computer—Human Interaction, CHI 1994, pp. 1-3. |
Ramstein, “Combining Haptic & Braille Technologies: Design Issues and Pilot Study,” SIGGRAPH 1996, pp. 37-44. |
Rosenberg et al., “Commercially Viable Force Feedback Controller for Individuals with Neuromotor Disabilities,” Armstrong Laboratory, AL/CF-TR-1997-0016, 1996, pp. 1-33. |
Rosenberg et al., “Perceptual Decomposition of Virtual Haptic Surfaces,” Proc. IEEE Symposium on Research Frontiers in Virtual Reality, 1993, pp. 1-8. |
Rosenberg et al., “The use of force feedback to enhance graphical user interfaces,” Stereoscopic Displays & Virtual Reality Systems, 1996, pp. 243-248. |
Rosenberg, “Perceptual Design of a Virtual Rigid Surface Contact,” Armstrong Laboratory AL/CF-TR-1995-0029, 1993, pp. 1-45. |
Rosenberg, “Virtual Haptic Overlays Enhance Performance in Telepresence Tasks,” Dept. of Mech. Eng., Stanford Univ., 1994. |
Rosenberg, L., “Virtual fixtures as tools to enhance operator performance in telepresence environments,” SPIE Manipulator Technology, 1993, pp. 1-12. |
Schmult et al., “Application Areas for a Force-Feedback Joystick,” 1993, Advances in Robotics, vol. 49, pp. 47-54. |
Su et al., “The Virtual Panel Architecture: A 3D Gesture Framework,” University of Maryland, pp. 387-393. |
Tan, et al., “Manual Resolution of Compliance When Work and Force Cues are Minimized,” DSC-vol. 49, Advances in Robotics, Mechatronics, and Haptic Interfaces, ASME 1993, pp. 99-104. |
Wiker et al., “Development of Tactile Mice for Blind Access to Computers, Importance of Stimulation Locus, Object Size, and Vibrotactile Display Resolution,” 1991, Human Factors Society Mtg., pp. 708-712. |
Winey III, “Computer Simulated Visual & Tactile Feedback as an Aid to Manipulator & Vehicle Control,” MIT, 1981, pp. 1-79. |
Yamakita, et al., “Tele-Virtual Reality of Dynamic Mechanical Model,” Proc. of IEEE/RSJ Int'l. Conf. On Intelligent Robots and Systems, Jul. 1992, pp. 1103-1110. |
Yokokohji et al., “What You Can See is What You Can Feel - Development of a Visual-Haptic Interface to Virtual Environment,” Proc. VRAIS 1996. |
Noll, “Man-Machine Tactile,” SID Journal, Jul./Aug. 1972 Issue. |
Rosenberg, “Virtual Fixtures: Perceptual Overlays Enhance Operator Performance in Telepresence Tasks,” Ph.D. Dissertation, Stanford University, Jun. 1994. |
Coaxial Control Shaker part No. C-25502, Safe Flight Instrument Corporation, 26 pages, Jul. 1, 1967; Revised Jan. 28, 2002. |
Adelstein, “A Virtual Environment System For The Study of Human Arm Tremor,” Ph.D. Dissertation, Dept. of Mechanical Engineering, MIT, Jun. 1989. |
Adelstein, “Design and Implementation of a Force Reflecting Manipulandum for Manual Control Research,” DSC-vol. 42, Advances in Robotics, Edited by H. Kazerooni, pp. 1-12, 1992. |
Aukstakalnis et al., “Silicon Mirage: The Art and Science of Virtual Reality,” ISBN 0-938151-82-7, pp. 129-180, 1992. |
Baigrie, “Electric Control Loading - A Low Cost, High Performance Alternative,” Proceedings, pp. 247-254, Nov. 6-8, 1990. |
Bejczy, “Sensors, Controls, and Man-Machine Interface for Advanced Teleoperation,” Science, vol. 208, No. 4450, pp. 1327-1335, 1980. |
Bejczy, “Generalization of Bilateral Force-Reflecting Control of Manipulators,” Proceedings of Fourth CISM-IFToMM, Sep. 8-12, 1981. |
Bejczy, et al., “Universal Computer Control System (UCCS) For Space Telerobots,” CH2413-3/87/0000/0318$01.00 1987 IEEE, 1987. |
Bejczy et al., “A Laboratory Breadboard System for Dual-Arm Teleoperation,” SOAR '89 Workshop, JSC, Houston, TX, Jul. 25-27, 1989. |
Bliss, “Optical-to-Tactile Image Conversion for the Blind,” IEEE Transactions on Man-Machine Systems, vol. MMS-11, No. 1, Mar. 1970. |
Brooks et al., “Hand Controllers for Teleoperation—A State-of-the-Art Technology Survey and Evaluation,” JPL Publication 85-11; NASA-CR-175890; N85-28559, pp. 1-84, Mar. 1, 1985. |
Burdea et al., “Distributed Virtual Force Feedback, Lecture Notes for Workshop on Force Display in Virtual Environments and its Application to Robotic Teleoperation,” 1993 IEEE International Conference on Robotics and Automation, pp. 25-44, May 2, 1993. |
Calder, “Design of a Force-Feedback Touch-Introducing Actuator For Teleoperator Robot Control,” Bachelor of Science Thesis, MIT, Jun. 23, 1983. |
Caldwell et al., “Enhanced Tactile Feedback (Tele-Traction) Using a Multi-Functional Sensory System,” 1050-4729-93, pp. 955-960, 1993. |
Eberhardt et al., “OMAR - A Haptic Display for Speech Perception by Deaf and Deaf-Blind Individuals,” IEEE Virtual Reality Annual International Symposium, Seattle, WA, Sep. 18-22, 1993. |
Eberhardt et al., “Including Dynamic Haptic Perception by The Hand: System Description and Some Results,” DSC-vol. 55-1, Dynamic Systems and Control: vol. 1, ASME 1994. |
Fukumoto, “Active Click: Tactile Feedback for Touch Panels,” ACM CHI 2001 Extended Abstracts, pp. 121-122, Apr. 2001. |
Gobel et al., “Tactile Feedback Applied to Computer Mice,” International Journal of Human-Computer Interaction, vol. 7, No. 1, pp. 1-24, 1995. |
Gotow et al., “Controlled Impedance Test Apparatus for Studying Human Interpretation of Kinesthetic Feedback,” WA11-11:00, pp. 332-337. |
Howe, “A Force-Reflecting Teleoperated Hand System for the Study of Tactile Sensing in Precision Manipulation,” Proceedings of the 1992 IEEE International Conference on Robotics and Automation, Nice, France, May 1992. |
IBM Technical Disclosure Bulletin, “Mouse Ball-Actuating Device With Force and Tactile Feedback,” vol. 32, No. 9B, Feb. 1990. |
Iwata, “Pen-based Haptic Virtual Environment,” 0-7803-1363-1-93 IEEE, pp. 287-292, 1993. |
Jacobsen et al., “High Performance, Dextrous Telerobotic Manipulator With Force Reflection,” Intervention-ROV '91 Conference & Exposition, Hollywood, Florida, May 21-23, 1991. |
Johnson, “Shape-Memory Alloy Tactile Feedback Actuator,” Armstrong Aerospace Medical Research Laboratory, AAMRL-TR-90-039, Aug. 1990. |
Jones et al., “A Perceptual Analysis of Stiffness,” ISSN 0014-4819, Springer International (Springer-Verlag); Experimental Brain Research, vol. 79, No. 1, pp. 150-156, 1990. |
Kaczmarek et al., “Tactile Displays,” Virtual Environment Technologies. |
Kontarinis et al., “Display of High-Frequency Tactile Information to Teleoperators,” Telemanipulator Technology and Space Telerobotics, Won S. Kim, Editor, Proc. SPIE vol. 2057, pp. 40-50, Sep. 7-9, 1993. |
Kontarinis et al., “Tactile Display of Vibratory Information in Teleoperation and Virtual Environments,” Presence, 4(4):387-402, 1995. |
Lake, “Cyberman from Logitech,” GameBytes, 1994. |
Marcus, “Touch Feedback in Surgery,” Proceedings of Virtual Reality and Medicine The Cutting Edge, Sep. 8-11, 1994. |
McAffee, “Teleoperator Subsystem-Telerobot Demonstrator: Force Reflecting Hand Controller Equipment Manual,” JPL D-5172, pp. 1-50, A1-A36, B1-B5, C1-C36, Jan. 1988. |
Minsky, “Computational Haptics: The Sandpaper System for Synthesizing Texture for a Force-Feedback Display,” Ph.D. Dissertation, MIT, Jun. 1995. |
Ouh-Young, “Force Display in Molecular Docking,” Order No. 9034744, pp. 1-369, 1990. |
Ouh-Young, “A Low-Cost Force Feedback Joystick and Its Use in PC Video Games,” IEEE Transactions on Consumer Electronics, vol. 41, No. 3, Aug. 1995. |
Ouh-Young et al., “The Development of a Low-Cost Force Feedback Joystick and Its Use in the Virtual Reality Environment,” Proceedings of the Third Pacific Conference on Computer Graphics and Applications, Pacific Graphics '95, Seoul, Korea, Aug. 21-24, 1995. |
Patrick et al., “Design and Testing of a Non-reactive, Fingertip, Tactile Display for Interaction with Remote Environments,” Cooperative Intelligent Robotics in Space, Rui J. deFigueiredo et al., Editor, Proc. SPIE vol. 1387, pp. 215-222, 1990. |
Pimentel et al., “Virtual Reality: through the new looking glass,” 2nd Edition; McGraw-Hill, ISBN 0-07-050167-X, pp. 41-202, 1994. |
Rabinowitz et al., “Multidimensional Tactile Displays: Identification of Vibratory Intensity, Frequency, and Contactor Area,” Journal of the Acoustical Society of America, vol. 82, No. 4, Oct. 1987. |
Russo, “The Design and Implementation of a Three Degree of Freedom Force Output Joystick,” MIT Libraries Archives Aug. 14, 1990, pp. 1-131, May 1990. |
Russo, “Controlling Dissipative Magnetic Particle Brakes in Force Reflective Devices,” DSC-vol. 42, Advances in Robotics, pp. 63-70, ASME 1992. |
Scannell, “Taking a Joystick Ride”, Computer Currents, Nov. 1994, Boston Edition, vol. 9 No. 11. |
Shimoga, “Finger Force and Touch Feedback Issues in Dexterous Telemanipulation,” Proceedings of Fourth Annual Conference on Intelligent Robotic Systems for Space Exploration, Rensselaer Polytechnic Institute, Sep. 30-Oct. 1, 1992. |
SMK Corporation, “Multi-Functional Touch Panel, Force-Feedback Type, Developed: A Touch Panel Providing a Clicking Feeling,” http://www.smk.co.jp/whatsnew_e/628csc_e.html, Sep. 30, 2002. |
SMK Corporation, “Force Feedback Type Optical Touch Panel Developed,” SMK Corporation Website, Oct. 30, 2002. |
Snow et al., “Model-X Force-Reflecting-Hand-Controller,” NT Control No. MPO-17851; JPL Case No. 5348, pp. 1-4, Jun. 15, 1989. |
Stanley et al., “Computer Simulation of Interacting Dynamic Mechanical Systems Using Distributed Memory Parallel Processors,” DSC-vol. 42, Advances in Robotics, pp. 55-61, ASME 1992. |
Tadros, “Control System Design for a Three Degree of Freedom Virtual Environment Simulator Using Motor-Brake Pair Actuators,” MIT Archive © Massachusetts Institute of Technology, pp. 1-88, Feb. 1990. |
Terry et al., “Tactile Feedback in a Computer Mouse,” Proceedings of Fourteenth Annual Northeast Bioengineering Conference, University of New Hampshire, Mar. 10-11, 1988. |
Wiker, “Teletouch Display Development: Phase 1 Report,” Technical Report 1230, Naval Ocean Systems Center, San Diego, Apr. 17, 1989. |
Wakiwaka et al., “Influence of Mover Support Structure on Linear Oscillatory Actuator for Cellular Phones,” The Third International Symposium on Linear Drives for Industry Applications, 2001, pp. 260-263, Japan. |
Bejczy et al., Kinesthetic Coupling between Operator and Remote Manipulator, International Computer Technology Conference, the American Society of Mechanical Engineers, San Francisco, CA, Aug. 12-15, 1980. |
ComTouch: A Vibrotactile Communication Device, 2002. |
Nissha Printing Co. Ltd., News, web page available at http://www.nissha.co.jp/english/news/2010/02/news-382.html, as available via the Internet and printed Apr. 5, 2011. |
Japanese Patent Office, Notice of Reasons for Rejection, Application No. JP 2012-241448 dated May 10, 2016. |
European Patent Office, Communication pursuant to Article 94(3) EPC, Application No. 12191217 dated Mar. 11, 2016. |
European Patent Office, Extended European Search Report, European Application No. EP13178283, dated Mar. 21, 2014. |
Chinese Patent Office, Chinese Application No. 201210442566.0, Office Action dated Jul. 26, 2016, 28 pages. |
Chinese Patent Office, Chinese Application No. 201210442566.0, Office Action dated Mar. 14, 2017, 9 pages. |
Japanese Patent Office, Japanese Application No. 2012-241448, Office Action dated Feb. 21, 2017, 7 pages. |
Japanese Patent Office, Application No. 2012-241448, Office Action dated Oct. 3, 2017, 4 pages. |
European Patent Office, Application No. 12191217.4, Summons to Attend Oral Proceedings dated Sep. 18, 2017, 9 pages. |
JP 2017-157477, “Office Action”, dated Jun. 19, 2018, 6 pages. |
EP 12191217.4, “Office Action,” dated Jan. 12, 2017, 10 pages. |
EP 18175392.2, “Extended European Search Report,” dated Dec. 12, 2018, 12 pages. |
KR 10-2012-0124681, “Office Action,” dated Oct. 30, 2018, 11 pages. |
KR 10-2012-0124681, “Office Action,” dated Aug. 7, 2019, 4 pages. |
Chinese Application No. CN201710876181.8, “Office Action”, Nov. 19, 2019, 27 pages. |
Number | Date | Country | |
---|---|---|---|
20190227629 A1 | Jul 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15405550 | Jan 2017 | US |
Child | 16201594 | US | |
Parent | 13290502 | Nov 2011 | US |
Child | 15405550 | US |