This invention relates generally to human/computer interfaces, and more particularly to human/computer interfaces with force feedback that can operate over a network.
Computer networks have become essential for users and computers to communicate with each other. Users transmit and receive data over networks in offices, at home, or on portable devices, and do so for a variety of tasks and applications, including communication, distribution, and entertainment. Many different types of networks are used. Local Area Networks (LANs) are typically provided in a limited area and include a relatively small number of computer nodes. The largest-scale example of a network is the Internet, which has become extremely popular. The Internet is a Wide Area Network (WAN) that is structured and distributed such that no one authority or entity manages the network. Different communication protocols can be used over networks to allow computers to communicate with each other; for example, protocols such as “Transmission Control Protocol/Internet Protocol” (TCP/IP) and the “World Wide Web” (WWW) are used over the Internet. TCP/IP sends “packets” of data between a host machine, e.g. a server computer on the Internet, and a client machine, e.g. a user's personal computer connected to the Internet, or between two client machines. The WWW is an Internet interface protocol that is supported by the same TCP/IP transmission protocol. Intranets are private networks based upon Internet standards, and have become quite common for managing information and communications within an organization. Since Intranets adhere to Internet standards, they can often use the same interface software as is used on the Internet, such as a web browser.
A variety of information is currently transferred over computer networks. For example, visual data, text data, and sound data can be transmitted over the Internet and the WWW. Image data can be sent from one client machine to another (or from a server machine to a client machine) in a variety of formats. Or, for example, data packets coded in TCP/IP format can be sent from one client machine to another over the Internet to transmit sound data. This last-mentioned technique forms the basis for Internet telephony.
While the transmission of visual images (both static and dynamic, i.e. video), text, and sound over networks such as the Internet is well known, the transmission of other types of sensory data has not been well explored. In particular, the transmission of data pertaining to the sense of touch and/or force over networks has not been established. “Force feedback” allows a user to experience or “feel” tactile sensations as provided through computational information. Using computer-controlled actuators and sensors on a force feedback device, a variety of realistic sensations can be modeled and experienced by the user. This useful and highly immersive sensory modality for interacting with other users over the Internet has heretofore been unavailable.
The present invention is related to the transmission of information pertaining to a subset of the sense of touch, i.e. the transmission of forces to a user over a computer network system. The “force feedback” provided by the methods and apparatus of the present invention enhances the sensory experience of user-to-user interactions to provide a richer, more interesting, and more enjoyable experience.
A network force feedback system in accordance with the present invention includes a network, a first computer coupled to the network, and a second computer coupled to the network. The first and second computers each include a visual display and a force feedback interface device. The interface device is capable of providing a computer input to the computer and also includes an actuator to output force feedback to a user in response to a force feedback signal provided by the computer. At least one of the computers develops an image on the visual display that is associated with stored force feedback information, and produces the image and the force feedback signal based on information received from the other, remote computer. Preferably, the computers produce the images and the force feedback signals based on information received from the remote computer and based on the computer input from the local force feedback device. The force feedback device can include a local microprocessor that communicates with the computer such that the force feedback signal can take the form of a relatively high-level force command. The present invention therefore permits two computer users to interact using force feedback provided over a network on a client-to-client (peer-to-peer) basis.
A method for providing force feedback between two computers over a network includes establishing a connection between a first computer and a second computer over a network, sending first computer information to the second computer from the first computer over the network, and providing a force feedback signal to a second force feedback device from the second computer, where the force feedback signal is based on the first computer information. The force feedback signal causes the second force feedback device to output forces to a second user using an actuator of the force feedback device. Similarly, second computer information is sent to the first computer from the second computer over the network, and a force feedback signal is provided to a first force feedback device from the first computer. This force feedback signal is based on the second computer information and causes the first force feedback device to output forces to a first user using an actuator of the first force feedback device. The force feedback signals can also be based on the input information from the force feedback devices. The information sent over the network can include position information describing a position of a manipulandum of the force feedback devices, and/or can include force feedback information indicating a force sensation to be output by the remote force feedback device. The computers can each display a graphical environment having a first graphical object controlled by the first user and a second graphical object controlled by the second user.
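By way of illustration only, the following Python sketch shows one iteration of this symmetric exchange. The ForceFeedbackDevice class and the JSON-over-socket message format are hypothetical placeholders rather than part of the disclosed system; the sketch assumes `sock` is a connected socket and `reader = sock.makefile()`.

```python
import json

class ForceFeedbackDevice:
    """Hypothetical stand-in for a force feedback device driver."""

    def read_position(self):
        # Return the manipulandum position in its degrees of freedom.
        return (0.0, 0.0)

    def output_force(self, fx, fy):
        # Command the actuator to apply force (fx, fy) to the user.
        pass

def exchange_step(sock, reader, device):
    """One iteration of the symmetric method: send local position
    information, receive the remote computer's information, and derive
    a force feedback signal from it."""
    local = device.read_position()
    sock.sendall((json.dumps({"pos": local}) + "\n").encode())
    remote = json.loads(reader.readline())["pos"]
    # Example force feedback signal: a spring pulling the local
    # manipulandum toward the remote one (gain chosen arbitrarily).
    k = 0.5
    device.output_force(k * (remote[0] - local[0]), k * (remote[1] - local[1]))
```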
In a different aspect, a method is disclosed for allowing two users to interact physically over a computer network, wherein a first manipulandum is physically contacted and moved by a first user in at least one degree of freedom and a second manipulandum is physically contacted and moved by a second user in at least one degree of freedom. First information, including an indication of movement of the first manipulandum, is transmitted over the computer network to the second manipulandum. A force is applied to the second manipulandum based on the indication of movement of the first manipulandum such that the second user feels an interaction based on movement of the first manipulandum. Second information is similarly transmitted to the first manipulandum such that the first user feels an interaction based on movement of the second manipulandum. Two users can thus physically exchange information and interact over a computer network.
The present invention adds a new sensory modality when interacting with a networked computer system. More particularly, force information can be either downloaded to a client machine from a server machine connected to the network, or force information can be passed between two or more client machines on the network. Peer-to-peer or server-to-peer direct interaction allows two or more users to interact using a different sensory modality, the sense of touch. The interaction may be subject to some transmission (“latency”) delays on networks such as the Internet, but permits remote interactivity with a client's force feedback device in new ways to enhance communication and interaction between users.
These and other advantages of the present invention will become apparent upon reading the following detailed descriptions and studying the various figures of the drawings.
FIG. 4a is a perspective view of a preferred human/computer interface (“force feedback device”) of the present invention;
FIG. 4b is a cross-sectional view taken along line 4b-4b of FIG. 4a;
FIG. 5a is a perspective view of another preferred embodiment for a force feedback device in accordance with the present invention;
FIG. 5b is a perspective view of a first alternate embodiment for the force feedback device of FIG. 5a;
FIG. 5c is a perspective view of a second alternate embodiment of the force feedback device of FIG. 5a;
FIG. 7a is a flow diagram of an “Acquire URL” process in accordance with the present invention;
FIG. 7b is an example of an HTML file of the present invention sent from a web server machine to a client machine;
FIG. 11a is an illustration of an image displayed on a visual display of a client computer as generated from a downloaded HTML web page file;
FIGS. 12a and 12b are diagrammatic illustrations of a client-to-client game embodiment displayed on a display screen of a client computer.
As noted previously, both the Internet 12 and Intranets operate using the same TCP/IP protocols. This allows Intranets to use similar or the same server machine software and client machine software as are used in Internet 12 applications. Therefore, it will be apparent to those skilled in the art that the following descriptions apply equally well to Internet, Intranet, and other forms of network systems that are compatible with the processes and apparatus disclosed herein.
The Internet 12 includes a number of nodes 20 that are interconnected by data transmission media 22. These nodes are typically routers, switches, and other intelligent data transmission apparatus which route “packets” of TCP/IP information to the desired destination. In some instances, the nodes 20 comprise an Internet service provider (ISP) 20a which allows a client machine to access the “backbone” of the Internet. Alternatively, client machines and web servers can be coupled directly into the backbone of the Internet.
As noted previously, the present invention is directed to the implementation of force feedback over a network, such as the Internet 12. To provide a user of a client machine with the experience of force feedback, force feedback human/computer interfaces (hereafter “force feedback devices”) 24 and 26 can be provided as part of the client machines 14 and 16, respectively. The client machines 14 and 16 are typically provided with computer video monitors 28 and 30 (which is one example of a “visual display”), respectively, which can display images I1 and I2, respectively. Preferably, forces developed by force feedback devices 24 and 26 are correlated with the images I1 and I2 of the client machines 14 and 16, respectively.
The machines 14-18 are considered, in the language of the Internet, to be “resources,” and each has its own unique Uniform Resource Locator or “URL.” In one embodiment of the present invention, a client machine, such as client machine 14 or 16, sends a request for a “web page” residing on, for example, web server machine 18. This is accomplished by the client machine sending a connection request and a URL which specifies the address of the web page to the web server machine 18. The web server machine 18 then sends a web page 32 written in HTML format back to the requesting client machine where it is “cached” in the memory (typically the RAM, hard disk, or a combination of the two) of the client machine. In this embodiment of the invention, the image on the video display of the client machine is generated from the HTML web page file cached on the client machine, and force feedback is provided to a user through the force feedback device as he manipulates a user manipulable object of the force feedback device.
In a peer-to-peer aspect of the present invention, a first client machine, such as client machine 14, and a second client machine, such as client machine 16, directly communicate force feedback commands to each other in standard TCP/IP protocol over the Internet 12. More particularly, client machine 14 can send force feedback and other information to the URL of the client machine 16, and the client machine 16 can send force feedback and other information in standard TCP/IP packets to the URL of the client machine 14. In this way, users of client machine 14 and client machine 16 can interact physically over the Internet 12. Of course, a server machine 18 can likewise directly communicate force feedback commands to a client machine 14 or 16, or all three machines can interact. The client machines can also communicate directly over other types of networks and/or using other communication protocols. Peer-to-peer (i.e. client-to-client) communication is described in further detail below.
The personal computer system 34 includes a microprocessor 36 clocked by a system clock CLK and which is coupled to a high speed or memory bus 38 and to a lower speed or I/O bus 40. The system RAM 42 and ROM 44 are typically coupled to the high speed memory bus, while various peripherals, such as the video display, hard disk drive, Internet interface (often either a modem or an Ethernet connection), and force feedback device, are typically coupled to the slower I/O bus. The microprocessor executes programs stored in the various memories (RAM, ROM, hard disk, etc.) of the personal computer 34 to control, for example, the image display on the video display and the forces provided by the force feedback device. The manufacture and use of personal computers, such as personal computer 34, are well-known to those skilled in the art.
The personal computer system 48 includes the microprocessor 36, the system clock 62, a video monitor 64 (which is one type of “visual display”), and an audio device 66. The system clock 62, as explained previously, provides a system clock signal CLK to the microprocessor 36 and to other components of the personal computer system 48. The display device 64 and the audio output device 66 are typically coupled to the I/O bus 40 (not shown in this figure).
In this preferred embodiment, the force feedback device 50 preferably includes a local microprocessor 68, a local clock 70, optional local memory 71 for the local microprocessor 68, a sensor interface 72, sensors 74, a user manipulatable object 76, “other” input interface 78, an actuator interface 80, a safety switch 82, and actuators 84 which provide a force F to the object 76, and an optional power supply 86 to provide power for the actuator interface 80 and actuator 84.
The microprocessor 36 of the personal computer system 48 is coupled for communication with the local microprocessor 68 of the force feedback device 50. This communication coupling can be through a serial port coupling 88 to the personal computer system, or through a game port coupling 90 to the personal computer system. Virtually all personal computer systems built to the IBM PC/AT standards will include a serial port and a game port. The serial port will permit two-way communication between microprocessor 36 and local microprocessor 68, and thus is preferable over the game port coupling, which only permits one-way communication from the local processor 68 to the microprocessor 36. In consequence, a serial port connection between the personal computer system 48 and the force feedback device 50 will permit force feedback commands to be sent from the microprocessor 36 to the local microprocessor 68, while a game port connection alone will not be able to provide this function. However, some simpler forms of “reflex” type force feedback can still be provided by the force feedback device 50 under the control of the local microprocessor 68 even if only a slower interface is used. It should also be noted that the microprocessor 36 and the local microprocessor 68 may communicate over both the serial port and game port connection to provide a greater communication bandwidth. A preferred serial port is the Universal Serial Bus (USB) of a personal computer, although an RS-232 serial bus, other serial busses, a parallel bus, an Ethernet bus, or other types of communication links can also be used.
In use, the user 52 of the client machine 46 grasps the user object 76 (or “manipulandum”) of the force feedback device 50 and manipulates (i.e. exerts a force to move or attempt to move) the user object to cause a “pointer” or other graphical object to move in the image displayed by the display device 64. For example, a pointer typically takes the form of a small arrow, a pointing hand, or the like. The sensors 74 sense the movement of the user object 76 and communicate the movement to the local microprocessor 68 through the sensor interface 72. The local microprocessor 68 then communicates through serial port 88, game port 90, or both to the microprocessor 36 to cause the microprocessor 36 to create a corresponding movement of the pointer on the image displayed upon the visual display 64. In some embodiments, the sensors 74 can communicate directly to microprocessor 36 without the use of local microprocessor 68. The user can also create other input, such as a “button click,” through the other input 78, which is communicated to the microprocessor 36 by the local microprocessor 68 or directly, e.g., using a game port. The user object 76 can take many forms, including a joystick, mouse, trackball, steering wheel, medical instrument, representation of a body part, gamepad controller, etc., as described in U.S. Pat. Nos. 5,734,373, 6,028,593, and 6,100,874, all incorporated by reference herein.
If the pointer on the display device 64 is at a position (or time) that correlates to a desired force feedback to the user 52, or an event occurs that dictates that force feedback should be output, the microprocessor 36 sends a force feedback command to the local microprocessor 68 over the serial port connection 88. The local microprocessor 68 parses this force feedback command and sends signals to the actuator interface 80 which causes the actuator 84 to create forces F on user object 76, which are experienced by the user 52 as indicated at 60. The safety switch 82, sometimes referred to as a “deadman switch”, blocks the signal from the actuator interface 80 if, for example, the user 52 is no longer grasping the object 76. In this way, the user 52 can interact with the client machine 46 in a visual, auditory, and tactile fashion.
For example, when using the local microprocessor 68 to offload computational burden from the host computer, the host can send high level commands to the local microprocessor 68. The local microprocessor can parse or interpret the commands and implement a local force routine that is stored in local memory 71. Such a force routine might instruct the microprocessor 68 to read sensor positions, determine a force based on the sensor positions, and command the actuators 84 to output the force, all in a local control loop independent from the host computer (the microprocessor 68 would also preferably relay the sensor positions to the host computer). Different force routines can be provided to command different types of force sensations (spring forces, damping forces, vibration forces, etc.). This local control loop can be helpful in increasing the response time for forces applied to the user object 76, which is essential in creating realistic and accurate force feedback. The hardware architecture described above is also described in U.S. Pat. No. 5,739,811, and the high level command protocol between the computer and the force feedback device is also described in U.S. Pat. No. 5,734,373, the disclosures of which are incorporated herein by reference.
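A minimal sketch of such a local “reflex” control loop follows, assuming hypothetical callbacks for the sensor read, stored force routine, actuator command, and host-reporting functions (none of which are specified by the text):

```python
import time

def reflex_loop(read_sensors, force_routine, command_actuators, report_to_host):
    """Local control loop run on the device microprocessor, independent
    of the host computer."""
    while True:
        position = read_sensors()          # read sensor positions
        force = force_routine(position)    # e.g. a spring, damping, or vibration routine
        command_actuators(force)           # command the actuators to output the force
        report_to_host(position)           # relay sensor positions to the host
        time.sleep(0.001)                  # fast local loop improves force response time
```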
In addition to sending force feedback commands, it may be convenient for host computer 48 to send a “spatial representation” to microprocessor 68, i.e., data describing the layout of some or all of the graphical objects displayed in the host's application program or graphical environment which are associated with forces, along with the types of these graphical objects (in the Web page embodiment, the layout/type of graphical objects can be downloaded from a remote computer providing the Web page).
The microprocessor 68 can store such a spatial representation in memory 71, for example. In addition, the microprocessor 68 can be provided with the necessary instructions or data to correlate sensor readings with the position of the cursor on the display screen. The microprocessor would then be able to check sensor readings, determine cursor and target positions, and determine output forces independently of host computer 48. The host can implement operating system functions (such as displaying images) when appropriate, and low-speed handshaking signals can be communicated between processor 68 and host 48 to correlate the microprocessor and host processes. Also, memory 71 can be a permanent form of memory such as ROM or EPROM which stores predetermined force sensations (force models, values, reflexes, etc.) for microprocessor 68 that are to be associated with particular types of graphical objects.
The host can also send the microprocessor a positional offset that may have occurred between the graphical object or user object controlled by the user and the graphical object or user object controlled by a remote user in a game or simulation. The microprocessor can use the positional offset in the determination of forces. For example, a spring force can be implemented between the user manipulatable objects of two networked host computers, where the magnitude of the spring force is proportional to the positional offset between the two user objects. The spring force thus biases the user objects to synchronized positions.
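As a sketch (one degree of freedom, arbitrary spring constant), such a force could be computed as follows; the function name and gain are illustrative assumptions only:

```python
def offset_spring_force(local_pos, remote_pos, k=0.8):
    """Spring force proportional to the positional offset between two
    networked user objects, biasing them toward synchronized positions."""
    return k * (remote_pos - local_pos)   # Hooke's law applied to the offset
```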
The link 98 can move in and out of a housing 112 as indicated by arrow 114, and link 106 can move in and out of a housing 116 of voice coil 96 as indicated by the arrow 118. The pivots 102, 104, and 110 allow the object 76a to move within the constraints of an x-y plane, but do not permit movement in a z direction orthogonal to the x-y plane. Therefore, the force feedback device is a two degree-of-freedom (2D) device. That is, the user manipulatable object 76a can move with a first degree of freedom in an x direction, and with a second degree of freedom in the y direction. A 2D force feedback device 50a is considered preferable in the present invention since it correlates well to the two-dimensional screen of a monitor of a client machine.
The flexible members 134 and 138 serve the same functions as the links of the force feedback device 50a described previously. As the object 76a is moved back and forth along an x-y plane, the flexible members 134 and 138 can move in and out of the voice coil housings 94 and 96, respectively, and can bend to accommodate angular movement with respect to the x and y axes. This permits the connectors 132 and 136 to move back and forth within the voice coils 94 and 96, respectively.
As noted previously, a preferred embodiment of the present invention provides a user manipulatable object that has two degrees of freedom. Other user manipulatable objects having one degree of freedom, or three or more degrees of freedom, are also within the scope of the present invention. For example, one embodiment of the present invention provides only one degree of freedom. Other force feedback devices of the present invention include mice, joysticks, joypads, steering wheels, and yokes having two or more degrees of freedom.
In addition to communicating with the server machine, the client machines can communicate directly with each other over the Internet using an Internet communication protocol. For example, client machine 14 can communicate with client machine 16 through a TCP/IP connection. This is accomplished by making the URL of the client machine 16 known to the client machine 14, and vice versa. In this fashion, direct communication between client machines can be accomplished without involving the server machine 18. These connections can send force feedback information and other information to the other client machine. For example, a process on the client machine 16 can send force feedback information over a TCP/IP Internet connection to the client machine 14, which will then generate a force feedback command to the force feedback device 24. When the user reacts to the force feedback at force feedback device 24, this information can be sent from client machine 14 to client machine 16 to provide force feedback to the user on force feedback device 26.
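A minimal sketch of establishing such a direct link, assuming one client listens at a known address while the other connects (the host address and port here are placeholders, not values from the disclosure):

```python
import socket

def open_peer_link(listen=False, peer_host="192.0.2.10", port=7070):
    """Direct client-to-client TCP/IP link, with no server machine involved:
    one client listens on a known address and the other connects to it."""
    if listen:
        server = socket.create_server(("", port))
        connection, _address = server.accept()
        return connection
    return socket.create_connection((peer_host, port))
```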
The process 146 begins at 148 and, in a step 150, a connection request is sent to the “host” of the desired URL. The host, in this example, is a server machine 18, and the desired URL is the URL of the desired web page residing on the server machine 18, the web page including force feedback commands. Alternatively, the desired web page can reside on another server or resource and be retrieved by server machine 18. In response to the connection request of step 150, the server machine 18 sends the HTML file representing the web page over the Internet to be received by the client machine. The HTML file includes a number of “components,” which are typically commands, command fragments, instructions, and data which permit the display of the web page and other web browser functionality. In a step 154, an HTML component is obtained. If this component is the end of file (“eof”), a step 156 detects that fact and the process is completed at 158. Otherwise, the HTML component is parsed and interpreted at a step 160 and process control is returned to step 154. It should be noted that most web browser software will start parsing and interpreting (i.e. processing) the HTML components even before the entire HTML file is received at the client machine. Alternatively, the entire HTML file can be received before the processing begins.
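The numbered steps form a simple loop, sketched below in Python; the component tokenizer and parser are hypothetical placeholders for the browser's internals:

```python
def acquire_url(stream):
    """Sketch of the "Acquire URL" processing loop (steps 154-160)."""
    while True:
        component = next_component(stream)   # step 154: obtain an HTML component
        if component is None:                # step 156: end of file ("eof") detected
            return                           # step 158: process completed
        parse_and_interpret(component)       # step 160: parse/interpret, then loop

def next_component(stream):
    # Hypothetical tokenizer; reading line by line mimics browsers that
    # begin processing before the entire file has arrived.
    line = stream.readline()
    return line if line else None

def parse_and_interpret(component):
    # Placeholder for the browser's handling of each command,
    # command fragment, instruction, or datum.
    pass
```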
The example HTML file sent from the web server to the client machine includes an <EMBED . . . > command 170 for a “force button” object.
In a first line 172 of the <EMBED . . . > command, the force button object is defined by an “IFF” extension file, namely “FORCEBUTTON.IFF.” Next, in a line 174, the size of the button is indicated to be 100 pixels by 100 pixels. In a line 176, the initial state of the button is indicated to be “up” (i.e., unselected), and a line 178 defines the force effect to be “vibration.” A number of parameters 180 defining the character and nature of the vibration are also provided (start time, length, frequency, magnitude, etc.). In a line 182, the “trigger” for the force effect is given by the function “MOUSEWITHIN” with its associated parameters, and by the function “BUTTONSTATE.” The function MOUSEWITHIN determines whether the pointer, the position of which is controlled by the force feedback device, is within the specified boundaries defining a region of the force button. This region can be specified by the parameters and, for example, can be defined as the exact displayed area of the button, or can be defined as a sub-region within the button that is smaller than the displayed size of the button. The function BUTTONSTATE determines whether a button or switch of the force feedback device is in the desired state to trigger the force object event (e.g., a button event in this example). In a line 184, the icon representing the force button is specified as “LOUIS.GIF,” and the text associated with the button is defined as “Hi, I'm Louis” in a line 186. The font of the text is given as “Helvetica” in a line 188. Other force effects, triggers, and parameters can also be associated with the force object. For example, a force (such as a vibration) can be triggered if the pointing icon is moved at a predetermined velocity or within a predefined range of velocities within the force object. Or, a trajectory of the pointing icon on a force object, such as a circle gesture, can trigger a force.
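The trigger logic can be restated in Python for clarity; this is an illustrative re-expression of the MOUSEWITHIN and BUTTONSTATE semantics, not the actual plug-in API:

```python
def mouse_within(pointer, region):
    """MOUSEWITHIN: true while the force-feedback-controlled pointer is
    inside the region's boundaries (the button's full displayed area or
    a smaller sub-region, per the parameters)."""
    x, y = pointer
    left, top, width, height = region
    return left <= x <= left + width and top <= y <= top + height

def effect_triggered(pointer, region, button_state, desired_state):
    """The force effect fires only when both trigger functions are
    satisfied: MOUSEWITHIN and BUTTONSTATE."""
    return mouse_within(pointer, region) and button_state == desired_state
```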
The <EMBED . . . > command is an existing functionality of HTML. It essentially embeds function calls which are handled by the web browser. If the suffix of the specified file is a known, standard suffix type, the call is executed directly by the web browser. If, however, the suffix (.IFF in this instance) is not a standard feature of the web browser, the browser will first look for a “plug-in” to implement this feature and, if a suitable plug-in is not found, it will look for application programs implementing this feature. In one embodiment, a plug-in including a reference to a Dynamically Linked Library (DLL) is provided to give functionality to the .IFF suffix. The DLL can be provided local to the client machine or on another linked resource.
The present invention also provides for programmability of the embedded force feedback object. An example of this programmability is shown at 198. This optional programmable command can be inserted into the EMBED command 170 and can include, for example, an iterative loop. In line 200, a “FOR” command initializes a counter i to 0, indicates that the counter i is incremented by one on each pass through the loop, and indicates that the loop should be completed five times, i.e. while i<5. The body of the loop includes a command line 202, which indicates that a force feedback “vibrate” with associated parameters should be evoked, and a line 204, which indicates that a 5 second wait should be provided after the vibration has occurred. This loop will repeat five times, i.e. the command 198 will cause five vibration sequences separated by four 5 second pauses, and followed by a final 5 second pause. By providing programmability to the force feedback object, force feedback effects based upon past events and upon a complex interaction of factors can be provided.
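The behavior of command 198, restated as a runnable Python sketch (the `vibrate` callback stands in for the force feedback “vibrate” effect and its parameters):

```python
import time

def run_programmed_effect(vibrate):
    """Equivalent of the iterative command at 198: five vibration
    sequences, each followed by a 5 second pause."""
    for i in range(5):          # line 200: FOR i = 0; i < 5; i++
        vibrate()               # line 202: evoke the "vibrate" effect
        time.sleep(5)           # line 204: 5 second wait after the vibration
```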
As generated from a downloaded HTML web page file, an image displayed on the visual display of a client computer includes a body portion 232.
The area within the body portion 232 has been provided with a number of regions and buttons to illustrate some of the concepts of the present invention. The force feedback device controls the position of a pointer icon 240 which can be caused to interact with the various regions and buttons. As an example, when the force feedback device is manipulated by the user to cause the pointer icon 240 to move within a “texture” region 242, force feedback commands can be created for the force feedback device to provide a desired “texture” to the force feedback device. For example, the texture can feel “rough” to the user by causing the force feedback device to place forces on the user manipulatable object that emulate a rough or bumpy surface. In a region 244, a “viscosity” force feedback can be provided. With this form of force feedback, as the pointer icon is moved through field 244, a viscous “drag” force is emulated on the user manipulatable object. In a region 246, inertial forces can be felt. Therefore, a pointer icon being moved through an “inertia” region would require relatively little or no force to move in a straight line, but would require greater forces to accelerate in a new direction or to be stopped. The inertial force sensations can be applied to the user manipulatable object and felt by the user. In a “keep out” region 248, the pointer image is prevented from entering the region. This is accomplished by creating a repulsive force on the user manipulatable object using a force feedback command to the force feedback device which prevents or inhibits the user from moving the user manipulatable object in a direction of the region 248 when the pointer icon 240 contacts the periphery of the region 248. In contrast, a “snap-in” region 250 will pull a pointer icon 240 to a center 252 whenever the pointer icon engages the periphery of the snap-in region 250 and apply a corresponding attractive force on the user manipulatable object. A “spring” region 243 emulates a spring function such that a pointer icon moving into the spring region “compresses” a spring, which exerts a spring force on the user manipulatable object which opposes the movement of the pointer icon. A region 256 is a “Force To Left” region where the pointer icon within the region 256 is forced to the left side of the region and the user manipulatable object is forced in a corresponding direction as if influenced by some invisible magnetic force or gravitational force. A region 258 illustrates that regions can be of any size or shape and that within a region different force effects can be developed. In this example, within region 258 there is a texture core 260 surrounded by a vibration ring 262. Therefore, as the pointer icon 240 moves into the region 258, the user first experiences vibration from the ring 262, and then experiences a texture as the pointer icon moves within the core 260.
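Two of these region effects are sketched below; the force laws and gains are illustrative assumptions, since the text specifies only the qualitative sensations:

```python
def snap_in_force(pointer, center, k=1.0):
    """Attractive force pulling the user manipulatable object toward the
    snap-in region's center once the pointer engages the region."""
    return (k * (center[0] - pointer[0]), k * (center[1] - pointer[1]))

def viscosity_force(velocity, b=0.3):
    """Viscous "drag" force opposing the user object's motion while the
    pointer moves through the viscosity region."""
    return (-b * velocity[0], -b * velocity[1])
```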
These and other forces resulting from a pointing icon interacting with various objects displayed on a computer screen are also described in co-pending patent application Ser. No. 08/571,606 filed Dec. 13, 1995, the disclosure of which is incorporated herein by reference.
It should be noted that the force feedback driver (e.g., browser plug-in or DLL) can have the ability to interact with JAVA code. In this embodiment, the plug-in reads and executes JAVA commands using the browser's run-time JAVA interpreter. JAVA can optionally be used to make “applets” which perform dynamic models, such as creating complex force feedback sensations.
It should also be noted that the force feedback device itself can have a JAVA interpreting chip on board, permitting the plug-in driver to download JAVA code to the force feedback device to be executed on the device. JAVA and JAVA interpreting chips are available under license from Sun Microsystems of Mountain View, Calif.
Furthermore, the force feedback driver (e.g., browser plug-in or DLL) can have the ability to interact with instructions provided in other languages besides HTML. For example, virtual reality 3-D graphical environments are increasingly being created and implemented over the World Wide Web and Internet using languages such as the Virtual Reality Modeling Language (VRML) and software such as ActiveX available from Microsoft Corporation. In these 3-D graphical environments, users may interact with programmed 3-D objects and constructs using client computer 14 or 16, and may also interact with 3-D graphical representations (or “avatars”) controlled by other users over the World Wide Web/Internet from other client computers. Force feedback commands and parameters can be provided in the instructions or files of these other protocols and languages and received by a client computer system in an equivalent manner to that described above so that force feedback can be experienced in simulated 3-D space. For example, embedded force feedback routines can be included in the VRML data for a virtual environment so that when the user moves into a virtual wall, an obstruction force is generated on the user-manipulatable object. Or, when the user carries a virtual object in a controlled virtual glove, the user might feel a simulated weight of the virtual object on the user manipulatable object. In such an embodiment, the force feedback device preferably provides the user with three or more degrees of freedom of movement so that input in three dimensions can be provided to the client computer.
In one embodiment, a first site 310 includes computer 312 that implements a graphical environment, such as a web browser, simulation, or game application, and a first user utilizes display device 314 and force feedback interface device 316. Optionally, local microprocessor 318 is coupled to interface device 316 as described previously. A second site similarly includes a computer 322 implementing the graphical environment, where a second user utilizes a display device and force feedback interface device 326.
Each local computer 312 and 322 has direct access to its own interface device 316 and 326, respectively, but does not have direct access to the remote interface device used by the other user. Thus, the information which describes the position, orientation, other motion or state characteristics, button data, and other information related to each local interface device (collectively considered “motion/state information” herein) is conveyed to the other remote computer. Each local computer 312 and 322 therefore has direct access to the local interface device and networked access to the motion/state information of the remote interface device, allowing a consistent interaction for both users.
The computers 312 and 322 need only exchange the information that is necessary to update the simulated graphical objects controlled by the remote users and any other simulated characteristics that may have been affected by a user's input. This minimal information exchange is often necessary when using networks having low or limited bandwidth and a slow rate of information transfer, such as many current connections to the Internet/World Wide Web, which (for many home computer users) are implemented using low-bandwidth telephone connections and relatively low-bandwidth modems or similar telecommunication devices. The computationally-intensive force feedback calculations to implement the interactions between a user-controlled object (e.g. cursor or paddle) and other objects (e.g. icons, GUI elements, the other paddle) are preferably handled locally. The resulting outcomes of the force feedback calculations/interactions are transmitted to remote users so as to minimize the information that is transmitted to other computer systems.
One type of information which is sent between the networked computers is motion/location/state information. For example, in a multi-user game interaction, when a local user controls one paddle and a remote user controls a different paddle, the position of the remote user's manipulandum is needed to determine paddle interaction and appropriate forces. Or, if a moving graphical object interacts with a paddle controlled by a local user, the local computer processes the interaction, generates the required local force feedback sensations, computes the new location and velocity of the moving graphical object as a result of the interaction, and conveys the new graphical information to the remote computer(s) so that all game applications can be re-coordinated after the object interaction. The remote computer then computes any force feedback sensations occurring at its own site resulting from the new object position, motion, etc.
When using a network having low- or limited-bandwidth, there may still be a substantial time delay from when a local graphical object, such as a cursor or paddle, changes its location/motion/state information and when the remote web browsers or application programs receive and are updated with that information. Thus, a user at a given site may be viewing a remote-user-controlled graphical object at a time delay while viewing his own cursor in real time without a time delay. For example, the user may witness a cursor-icon interaction a few seconds after the actual event happened on the remote user's local implementation of the interaction. Obviously, this can cause problems in the experience of networked interactions and game play. To compensate for this problem, a networked graphical environment may introduce a short time delay before events occur locally. For example, a short delay can be implemented on the local computer before a ball bounces off of a paddle to reduce the timing discontinuity between remote and local users.
In addition, force feedback or “feel sensation information” can be transferred from one host computer to another over the network. This type of information can be provided, for example, if a force should be output that is not based on position or motion of the user manipulatable objects or interacting graphical objects. Thus, if a button press on a joystick manipulandum of force feedback device 316 designates that a vibration is to be output on the other joystick manipulandum of force feedback device 326, a force feedback command or other similar information can be sent from computer 312 to computer 322, preferably including parameters describing the vibration feel sensation. Computer 322 parses and interprets the command and then commands the force feedback device 326 to output the vibration on the joystick of device 326. Such commands and parameters can be implemented similarly to the HTML or VRML embodiments described above, or in a different format. The computer 322 thus receives the feel sensation information directly from the other computer 312. Alternatively, the computer 312 can simply send the button press information, so that the computer 322 interprets the button press as a particular force sensation and outputs that sensation. However, with such an embodiment, the computer 322 would need mapping information that indicates which feel sensation corresponds to the received button press, and this mapping information has to be updated periodically so as to provide synchronization. If force feedback information is sent directly, there is no need for the computer 322 and/or force feedback device 326 to store data mapping a particular button press to a feel sensation, saving memory and synchronization steps. Such feel sensation information can also be useful to characterize graphical objects in a game or simulation which one computer generates or updates and needs to convey to any other linked computers to provide synchronization. For example, a wall graphical object can be characterized as having a hard or soft surface, having a smooth or frictional surface, and having other force characteristics.
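A sketch of such feel sensation information on the wire follows, assuming a hypothetical JSON encoding and a hypothetical `device.vibrate` driver call; the disclosed system does not mandate any particular format:

```python
import json

def encode_vibration(frequency, magnitude, duration):
    """Feel sensation information sent host-to-host: the sensation type
    plus the parameters that describe it."""
    command = {"effect": "vibration", "frequency": frequency,
               "magnitude": magnitude, "duration": duration}
    return (json.dumps(command) + "\n").encode()

def handle_feel_sensation(raw, device):
    """The receiving host parses the command and directs its local force
    feedback device to output the described sensation, with no stored
    button-to-sensation mapping required."""
    command = json.loads(raw)
    if command["effect"] == "vibration":
        # Hypothetical device call standing in for the actual driver.
        device.vibrate(command["frequency"], command["magnitude"],
                       command["duration"])
```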
The force feedback command or other information relayed from a host computer to its force feedback device can be determined based on the motion/state information and/or force feedback information received over the network. In many cases, the force feedback command or other information can be also based on input from the local force feedback device. For example, a force need not be commanded until a controlled graphical object impacts a different graphical object. To determine whether the user-controlled graphical object has impacted another object, position (or other motion) information is received from the local force feedback device which indicates the current position of the user object in its degrees of freedom. From this information, a new position of the user-controlled graphical object is determined, and any interactions of this object with other objects are realized.
Many different applications of force feedback implementation over networks in a client-to-client configuration can be implemented. For example, two graphical objects, each controlled by a different user, can interact and the users can experience forces based on the interaction. In one implementation, a first computer displays a first user controlled graphical object that is moved in conjunction with the first user's manipulation of a first force feedback device connected to the first computer. The first computer also displays a second graphical object. The second computer, connected to the first computer by a network, also displays the first and second graphical objects on a display screen of a second host computer. The second graphical object is moved in conjunction with the second user's manipulation of a second force feedback device connected to the second computer. Force feedback can be provided to both first and second computers based on the interaction of the first and second object.
One example of using client-to-client interaction in a game is shown in FIGS. 12a and 12b, in which a first user controls a paddle 362, a second user controls a paddle 360, and a ball is moved between the paddles.
The user can also move the user object so that the paddle moves in a direction 366. The user will thus feel like he or she is “carrying” the weight of the ball, as in a sling. The ball will then be released from the paddle and move toward the other paddle 360. As is well known, a goal in such a game might be to direct the ball into the opposing goal. Thus, the first user can try to direct the ball into goal 368, and the second user can control paddle 360 to direct the ball into goal 370. Paddles 360 and 362 are used to block the ball from moving into the defended goal and to direct the ball back at the desired goal. By moving the paddle in a combination of direction 366 and up and down movement, the user can influence the movement of the ball to a fine degree, thus allowing a player's skill to influence game results to a greater degree than in previous games without force feedback. In addition, other features can be included to further influence the ball's direction and the forces felt by the user. For example, the orientation of the paddle can be changed by rotating the paddle about a center point of the paddle, and force feedback can be appropriately applied in that degree of freedom. Other features can also be provided, such as allowing a ball to “stick” to a paddle when the two objects collide and/or when a button is pressed by the user. The user could then activate the button, for example, to release the ball at a desired time.
Each player can feel the forces on their respective paddle from the ball directed by the other player. In addition, if the two paddles 360 and 362 were brought into contact with one another, each player can feel the direct force of the other player on each player's user object. That is, the first user's force on his user object causes his paddle 362 to move into the other paddle 360, which would cause both the first and second users to feel the collision force. If the first paddle 362 were allowed to push the other paddle 360 across the screen, then the second user would feel the first user's pushing force. The first user would feel similar forces from the second user.
In a “tug of war” game example, the first and second graphical objects (such as two paddles or other objects) can be visually connected. When the first user controls the first graphical object to move left, this position information is transferred to the second computer, and the second user controlling the second graphical object feels a force in a left direction resulting from the first player's manipulation. A similar result occurs for the first player when the second player manipulates the second graphical object. This creates the effect as if each player were pushing the other player directly. Furthermore, force information (“feel sensation information”) can also be transmitted between computers. For example, if a flexible “rope” is modelled connecting the first and second graphical objects, and the first user manipulates the first force feedback device so that the rope is made to oscillate or vibrate, then the first computer can send feel sensation information to the second computer that informs the second computer to command a vibration feel sensation on the second force feedback device, with appropriate parameters describing the vibration such as frequency, amplitude, and duration. The second user thus immediately feels appropriate forces caused by the first user. The winner of the tug-of-war can be the first user to move his or her graphical object to a specific goal or displayed location in opposition to forces from the other player and/or other obstacles. Alternatively, a user can be designated to win the tug-of-war or other game if that user can maintain a particular position of the user manipulatable object amid the forces output based on the interaction of the graphical objects and caused by both users.
A different example of a client-to-client communication of force feedback information can take the form of a “massage” interface. Two users, for example, can interact with each other by each feeling the presence of the other in an “intimate” way through the use of forces influenced by the device manipulation of the other user. The “input” of the user at one client computer is felt as “output” by the user at the other client computer, and vice-versa, much like the tug-of-war example above.
In a simple application of such an embodiment, the first user can massage the back of the second user by linking the first force feedback device connected to the first client machine with the second force feedback device connected to the second client machine. A user manipulatable object of the first force feedback device can be grasped or otherwise physically contacted by the hand of the first user, tracking the motion of the user's hand and outputting forces to the user's hand. The second force feedback device can have a user manipulatable object shaped like a hand that can engage the back of the second user. The motion of the user object of the first force feedback device can be linked to the motion of the hand object of the second force feedback device such that when the first user moves his or her hand, the hand object connected to the second client machine moves around and engages the back of the second user. Using this embodiment, the first user can massage the back of the second user, where the second user feels the hand object with a motion and pressure dependent on the first user's input. The first user also receives force feedback from the interaction of the hand object with the back of the second user, where the pressure and motion from the second user's back applied to the hand object is relayed to the user object held by the first user. Thus, if the second user leans back and applies force on the hand object, the first user feels that force.
In addition to the feel sensation information sent between the first client and the second client, other sensory information can be sent and displayed. For example, audio and video information can be transferred between the client computers: both client computers can be connected to video cameras and microphones. The data from the video camera and microphone used by the first user can be sent to and displayed to the second user via a display device, such as a video screen, and via an audio output device, such as speakers; the data from the second user can likewise be displayed to the first user. These other modalities can complete an environment that allows two or more users to interact through sight, sound, and touch.
In one embodiment of a client-to-client interaction, the force feedback device used by the first user can be linked to the force feedback device used by the second user such that the motion of the two user manipulatable objects are desired to be synchronized. If either user pushes his or her device such that it diverges from synchronization, a restoring force such as a spring force, for example, can be applied in the direction of the diverging device (or the non-diverging device, or both devices, as desired) needed to restore synchronization. The restoring force is applied as long as the two user manipulatable objects are out of synchronization. To determine if the user objects are in synchronization, the positions of the two user objects can be compared, where the position of each remote manipulandum is sent across the network to the other client to allow the comparison. For example, if the two positions maintain a constant relative distance between them within a predefined threshold distance, then no restoring force need be applied. If graphical objects are displayed based on the two user object positions, the graphical objects in some embodiments can be maintained in visual synchronization even though the user manipulatable objects are no longer in actual positional synchronization (if such visual synchronization is desired).
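A sketch of the synchronization check and restoring force (one degree of freedom; the threshold and gain are illustrative assumptions only):

```python
def restoring_force(local_pos, remote_pos, threshold=0.05, k=1.0):
    """Spring-like restoring force applied only while the two user
    objects have diverged beyond a threshold, using the remote position
    received over the network for the comparison."""
    offset = remote_pos - local_pos
    if abs(offset) <= threshold:
        return 0.0                # still in synchronization: no force
    return k * offset             # bias the diverging device back into sync
```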
Many types of feel sensations can be sent across the network and combined in various ways. For example, a constant force or a spring force can be commanded to be applied to the force feedback interface device over the network, and other feel sensations/forces, such as vibration sensations, can also be commanded over the network to be simultaneously overlaid on the constant or spring force. For example, the first user can press a button that causes the force feedback massage interface of the second user to output a vibration sensation over any forces already being experienced. A user can design a feel sensation using a feel sensation editor, such as shown in U.S. Pat. Nos. 6,147,674 and 6,169,540, both assigned to the assignee of the present application and incorporated herein by reference. This allows the users to design, for example, a massage vibration sensation, including its magnitude, direction, envelope, and waveform, and to send the created sensation to a different user's site to be experienced by that user or used in that user's own force sensations. In addition, it should be noted that a single client can be interfaced to multiple clients such that a force sensation sent from one client is received by many clients over the network.
The first and second client computers can also be directly connected though a phone line or other transmission medium. The client computers can be networked through a direct TCP/IP connection or other LAN connection, or can be connected through the Internet as described above. For example, both client computers can be connected to the same server that provides a web page or other information to the clients. Information can be provided from one client to the server, then from the server to the desired client computer (or, a server can perform processing on data before it is sent to the receiving client computer). Alternatively, the clients can be connected directly over the Internet with data provided directly between clients (or as directly as possible over the distributed Internet), which tends to reduce the time delay of transmission. The clients need not send visual page information or feel sensation information to each other unless the interaction of one user causes the visual display to change on the other user's visual display.
In yet another embodiment, the peer-to-peer configurations described above can also include communications with a server machine 18.
While this invention has been described in terms of several preferred embodiments, there are alterations, permutations, and equivalents which fall within the scope of this invention. It should also be noted that there are many alternative ways of implementing both the process and apparatus of the present invention. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.
The present application is a continuation-in-part of U.S. patent application Ser. No. 09/050,665, now U.S. Pat. No. 6,219,033, filed Mar. 30, 1998, entitled “Method and Apparatus for Providing Force Feedback for a Graphical User Interface”; is a continuation of U.S. patent application Ser. No. 08/691,852, now U.S. Pat. No. 5,956,484, filed Aug. 1, 1996, entitled “Method and Apparatus for Providing Force Feedback over a Computer Network”; is a continuation of U.S. patent application Ser. No. 08/664,086, now U.S. Pat. No. 6,028,593, filed Jun. 14, 1996, entitled “Method and Apparatus for Providing Simulated Physical Interactions within Computer Generated Environments,” which claims priority to U.S. Provisional Patent Application No. 60/017,803, filed May 17, 1996; and is a continuation of U.S. patent application Ser. No. 08/571,606, now U.S. Pat. No. 6,219,032, filed Dec. 13, 1995, entitled “Method for Providing Force Feedback to a User of an Interface Device Based on Interactions of a Controlled Cursor with Graphical Elements in a Graphical User Interface,” which is a continuation of U.S. patent application Ser. No. 08/566,282, now U.S. Pat. No. 5,734,373, filed Dec. 1, 1995, entitled “Method and Apparatus for Controlling Force Feedback Interface Systems Utilizing a Host Computer,” the entirety of all of which are hereby incorporated by reference.
Number | Name | Date | Kind |
---|---|---|---|
2906179 | Bower | Sep 1959 | A |
2972140 | Hrisch | Feb 1961 | A |
3157853 | Hirsch | Nov 1964 | A |
3220121 | Cutler | Nov 1965 | A |
3490059 | Paulsen et al. | Jan 1970 | A |
3497668 | Hirsch | Feb 1970 | A |
3517446 | Corlyon et al. | Jun 1970 | A |
3531868 | Stevenson | Oct 1970 | A |
3623064 | Kagan | Nov 1971 | A |
3795150 | Eckhardt | Mar 1974 | A |
3875488 | Crocker et al. | Apr 1975 | A |
3890958 | Fister et al. | Jun 1975 | A |
3902687 | Hightower | Sep 1975 | A |
3903614 | Diamond et al. | Sep 1975 | A |
3911416 | Feder | Oct 1975 | A |
3919691 | Noll | Nov 1975 | A |
3923166 | Fletcher et al. | Dec 1975 | A |
3944798 | Eaton | Mar 1976 | A |
4114882 | Mau | Sep 1978 | A |
4125800 | Jones | Nov 1978 | A |
4127752 | Lowthorp | Nov 1978 | A |
4131033 | Wright et al. | Dec 1978 | A |
4148014 | Burson | Apr 1979 | A |
4160508 | Salisbury, Jr. et al. | Jul 1979 | A |
4216467 | Colston | Aug 1980 | A |
4236325 | Hall et al. | Dec 1980 | A |
4262549 | Schwellenbach | Apr 1981 | A |
4333070 | Barnes | Jun 1982 | A |
4398889 | Lam et al. | Aug 1983 | A |
4448083 | Hayashi | May 1984 | A |
4464117 | Foerst | Aug 1984 | A |
4477043 | Repperger | Oct 1984 | A |
4477973 | Davies | Oct 1984 | A |
4484191 | Vavra | Nov 1984 | A |
4513235 | Acklam et al. | Apr 1985 | A |
4550221 | Mabusth | Oct 1985 | A |
4550617 | Fraignier et al. | Nov 1985 | A |
4560983 | Williams | Dec 1985 | A |
4571834 | Fraser et al. | Feb 1986 | A |
4581491 | Boothroyd | Apr 1986 | A |
4593470 | Davies | Jun 1986 | A |
4599070 | Hladky et al. | Jul 1986 | A |
4601206 | Watson | Jul 1986 | A |
4603284 | Perzley | Jul 1986 | A |
4604016 | Joyce | Aug 1986 | A |
4632341 | Repperger et al. | Dec 1986 | A |
4638798 | Shelden et al. | Jan 1987 | A |
4653011 | Iwano | Mar 1987 | A |
4654648 | Herrington et al. | Mar 1987 | A |
4676002 | Slocum | Jun 1987 | A |
4679331 | Koontz | Jul 1987 | A |
4686397 | Becker | Aug 1987 | A |
4688983 | Lindbom | Aug 1987 | A |
4689449 | Rosen | Aug 1987 | A |
4692756 | Clark | Sep 1987 | A |
4703443 | Moriyasu | Oct 1987 | A |
4704909 | Grahn et al. | Nov 1987 | A |
4708656 | De Vries et al. | Nov 1987 | A |
4713007 | Alban | Dec 1987 | A |
4734685 | Watanabe | Mar 1988 | A |
4750487 | Zanetti | Jun 1988 | A |
4769763 | Trieb et al. | Sep 1988 | A |
4782327 | Kley et al. | Nov 1988 | A |
4787051 | Olson | Nov 1988 | A |
4791934 | Brunnett | Dec 1988 | A |
4794392 | Selinko | Dec 1988 | A |
4795296 | Jau | Jan 1989 | A |
4795929 | Elgass et al. | Jan 1989 | A |
4798919 | Miessler et al. | Jan 1989 | A |
4800721 | Cemenska et al. | Jan 1989 | A |
4803413 | Kendig et al. | Feb 1989 | A |
4811608 | Hilton | Mar 1989 | A |
4819195 | Bell et al. | Apr 1989 | A |
4823634 | Culver | Apr 1989 | A |
4839838 | LaBiche et al. | Jun 1989 | A |
4849692 | Blood | Jul 1989 | A |
4853874 | Iwamoto et al. | Aug 1989 | A |
4868549 | Affinito et al. | Sep 1989 | A |
4879556 | Duimel | Nov 1989 | A |
4885565 | Embach | Dec 1989 | A |
4888538 | Dimitrov et al. | Dec 1989 | A |
4888877 | Enderle et al. | Dec 1989 | A |
4891764 | McIntosh | Jan 1990 | A |
4891889 | Tomelleri | Jan 1990 | A |
4896554 | Culver | Jan 1990 | A |
4907970 | Meenen, Jr. | Mar 1990 | A |
4907973 | Hon | Mar 1990 | A |
4925312 | Onaga et al. | May 1990 | A |
4930770 | Baker | Jun 1990 | A |
4934694 | McIntosh | Jun 1990 | A |
4935728 | Kley | Jun 1990 | A |
4942538 | Yuan et al. | Jul 1990 | A |
4942545 | Sapia | Jul 1990 | A |
4945305 | Blood | Jul 1990 | A |
4945501 | Bell et al. | Jul 1990 | A |
4949119 | Moncrief et al. | Aug 1990 | A |
4961038 | MacMinn | Oct 1990 | A |
4961138 | Gorniak | Oct 1990 | A |
4961267 | Herzog | Oct 1990 | A |
4962448 | DeMaio et al. | Oct 1990 | A |
4962591 | Zeller et al. | Oct 1990 | A |
4982504 | Soderberg et al. | Jan 1991 | A |
4982618 | Culver | Jan 1991 | A |
4983786 | Stevens et al. | Jan 1991 | A |
4983901 | Lehmer | Jan 1991 | A |
5007085 | Greanias et al. | Apr 1991 | A |
5007300 | Siva | Apr 1991 | A |
5018922 | Yoshinada et al. | May 1991 | A |
5019761 | Kraft | May 1991 | A |
5022384 | Freels | Jun 1991 | A |
5022407 | Horch et al. | Jun 1991 | A |
5035242 | Franklin | Jul 1991 | A |
5038089 | Szakaly | Aug 1991 | A |
5040306 | McMurtry et al. | Aug 1991 | A |
5044956 | Behensky et al. | Sep 1991 | A |
5050608 | Watanabe et al. | Sep 1991 | A |
5065145 | Purcell | Nov 1991 | A |
5072361 | Davis et al. | Dec 1991 | A |
5076517 | Ferranti et al. | Dec 1991 | A |
5078152 | Bond | Jan 1992 | A |
5080377 | Stamper et al. | Jan 1992 | A |
5088046 | McMurtry | Feb 1992 | A |
5088055 | Oyama | Feb 1992 | A |
5095303 | Clark et al. | Mar 1992 | A |
5103404 | McIntosh | Apr 1992 | A |
5107080 | Rosen | Apr 1992 | A |
5107262 | Cadoz et al. | Apr 1992 | A |
5116051 | Moncrief et al. | May 1992 | A |
5116180 | Fung et al. | May 1992 | A |
5126948 | Mitchell et al. | Jun 1992 | A |
5128671 | Thomas, Jr. | Jul 1992 | A |
5131844 | Marinaccio et al. | Jul 1992 | A |
5132672 | Clark | Jul 1992 | A |
5139261 | Openiano | Aug 1992 | A |
5142506 | Edwards | Aug 1992 | A |
5142931 | Menahem | Sep 1992 | A |
5143505 | Burdea et al. | Sep 1992 | A |
5146566 | Hollis, Jr. et al. | Sep 1992 | A |
5148377 | McDonald | Sep 1992 | A |
5165897 | Johnson | Nov 1992 | A |
5175459 | Daniel et al. | Dec 1992 | A |
5178012 | Culp | Jan 1993 | A |
5181181 | Glynn | Jan 1993 | A |
5182557 | Lang | Jan 1993 | A |
5184306 | Erdman et al. | Feb 1993 | A |
5184319 | Kramer | Feb 1993 | A |
5185561 | Good et al. | Feb 1993 | A |
5186629 | Rohen | Feb 1993 | A |
5186695 | Mangseth et al. | Feb 1993 | A |
5187874 | Takahashi et al. | Feb 1993 | A |
5189355 | Larkins et al. | Feb 1993 | A |
5189806 | McMurtry et al. | Mar 1993 | A |
5193963 | McAffee et al. | Mar 1993 | A |
5195179 | Tokunaga | Mar 1993 | A |
5197003 | Moncrief et al. | Mar 1993 | A |
5203563 | Loper, III | Apr 1993 | A |
5204824 | Fujimaki | Apr 1993 | A |
5209131 | Baxter | May 1993 | A |
5212473 | Louis | May 1993 | A |
5220260 | Schuler | Jun 1993 | A |
5223776 | Radke et al. | Jun 1993 | A |
5228356 | Chuang | Jul 1993 | A |
5230623 | Guthrie et al. | Jul 1993 | A |
5235868 | Culver | Aug 1993 | A |
5240417 | Smithson et al. | Aug 1993 | A |
5243266 | Kasagami et al. | Sep 1993 | A |
5251127 | Raab | Oct 1993 | A |
5251156 | Heier et al. | Oct 1993 | A |
5259120 | Chapman et al. | Nov 1993 | A |
5259894 | Sampson | Nov 1993 | A |
5262777 | Lowe et al. | Nov 1993 | A |
5264768 | Gregory et al. | Nov 1993 | A |
5266875 | Slotine et al. | Nov 1993 | A |
5271290 | Fischer | Dec 1993 | A |
5275174 | Cook | Jan 1994 | A |
5275565 | Moncrief | Jan 1994 | A |
5283970 | Aigner | Feb 1994 | A |
5286203 | Fuller et al. | Feb 1994 | A |
5289273 | Lang | Feb 1994 | A |
5296846 | Ledley | Mar 1994 | A |
5296871 | Paley | Mar 1994 | A |
5298890 | Kanamaru et al. | Mar 1994 | A |
5299810 | Pierce et al. | Apr 1994 | A |
5309140 | Everett | May 1994 | A |
5313230 | Venolia et al. | May 1994 | A |
5334027 | Wherlock | Aug 1994 | A |
5341459 | Backes | Aug 1994 | A |
5351692 | Dow et al. | Oct 1994 | A |
5354162 | Burdea et al. | Oct 1994 | A |
5355148 | Anderson | Oct 1994 | A |
5374942 | Gilligan et al. | Dec 1994 | A |
5379663 | Hara | Jan 1995 | A |
5381080 | Schnell et al. | Jan 1995 | A |
5384460 | Tseng | Jan 1995 | A |
5386507 | Teig et al. | Jan 1995 | A |
5389865 | Jacobus et al. | Feb 1995 | A |
5396266 | Brimhall | Mar 1995 | A |
5396267 | Bouton | Mar 1995 | A |
5397323 | Taylor et al. | Mar 1995 | A |
5402582 | Raab | Apr 1995 | A |
5405152 | Katanics et al. | Apr 1995 | A |
5412880 | Raab | May 1995 | A |
5414337 | Schuler | May 1995 | A |
5417696 | Kashuba et al. | May 1995 | A |
5428748 | Davidson et al. | Jun 1995 | A |
5429140 | Burdea et al. | Jul 1995 | A |
5435554 | Lipson | Jul 1995 | A |
5436542 | Petelin et al. | Jul 1995 | A |
5436622 | Gutman et al. | Jul 1995 | A |
5436638 | Bolas et al. | Jul 1995 | A |
5436640 | Reeves | Jul 1995 | A |
5437607 | Taylor | Aug 1995 | A |
5445166 | Taylor | Aug 1995 | A |
5451924 | Massimino et al. | Sep 1995 | A |
5459382 | Jacobus et al. | Oct 1995 | A |
5466213 | Hogan | Nov 1995 | A |
5467763 | McMahon et al. | Nov 1995 | A |
5471571 | Smith et al. | Nov 1995 | A |
5482051 | Reddy et al. | Jan 1996 | A |
5512919 | Araki | Apr 1996 | A |
5513100 | Parker et al. | Apr 1996 | A |
5526480 | Gibson | Jun 1996 | A |
5530455 | Gillick et al. | Jun 1996 | A |
5547382 | Yamasaki | Aug 1996 | A |
5550562 | Aoki et al. | Aug 1996 | A |
5551701 | Bouton et al. | Sep 1996 | A |
5565840 | Thorner et al. | Oct 1996 | A |
5565887 | McCambridge et al. | Oct 1996 | A |
5565888 | Selker | Oct 1996 | A |
5570111 | Barrett et al. | Oct 1996 | A |
5575761 | Hajianpour | Nov 1996 | A |
5576727 | Rosenberg et al. | Nov 1996 | A |
5577981 | Jarvik | Nov 1996 | A |
5583407 | Yamaguchi | Dec 1996 | A |
5583478 | Renzi | Dec 1996 | A |
5586257 | Perlman | Dec 1996 | A |
5587937 | Massie et al. | Dec 1996 | A |
5589828 | Armstrong | Dec 1996 | A |
5589854 | Tsai | Dec 1996 | A |
5591924 | Hilton | Jan 1997 | A |
5596347 | Robertson et al. | Jan 1997 | A |
5619180 | Massimino et al. | Apr 1997 | A |
5623582 | Rosenberg | Apr 1997 | A |
5623642 | Katz et al. | Apr 1997 | A |
5625576 | Massie et al. | Apr 1997 | A |
5629594 | Jacobus et al. | May 1997 | A |
5629597 | Imanaka | May 1997 | A |
5631861 | Kramer | May 1997 | A |
5642469 | Hannaford et al. | Jun 1997 | A |
5643087 | Marcus et al. | Jul 1997 | A |
5656901 | Kurita | Aug 1997 | A |
5666138 | Culver | Sep 1997 | A |
5666473 | Wallace | Sep 1997 | A |
5685775 | Bakoglu et al. | Nov 1997 | A |
5690582 | Ulrich et al. | Nov 1997 | A |
5691747 | Amano | Nov 1997 | A |
5691898 | Rosenberg et al. | Nov 1997 | A |
5694013 | Stewart et al. | Dec 1997 | A |
5695400 | Fennell et al. | Dec 1997 | A |
5701140 | Rosenberg et al. | Dec 1997 | A |
5709219 | Chen et al. | Jan 1998 | A |
5714978 | Yamanaka et al. | Feb 1998 | A |
5721566 | Rosenberg et al. | Feb 1998 | A |
5724068 | Sanchez et al. | Mar 1998 | A |
5731804 | Rosenberg | Mar 1998 | A |
5734373 | Rosenberg et al. | Mar 1998 | A |
5736978 | Hasser et al. | Apr 1998 | A |
5739811 | Rosenberg et al. | Apr 1998 | A |
5742278 | Chen et al. | Apr 1998 | A |
5745715 | Pickover et al. | Apr 1998 | A |
5754023 | Roston et al. | May 1998 | A |
5755577 | Gillio | May 1998 | A |
5755620 | Yamamoto et al. | May 1998 | A |
5757358 | Osga | May 1998 | A |
5760764 | Martinelli | Jun 1998 | A |
5766016 | Sinclair | Jun 1998 | A |
5767839 | Rosenberg | Jun 1998 | A |
5769640 | Jacobus et al. | Jun 1998 | A |
5771037 | Jackson | Jun 1998 | A |
5781172 | Engel et al. | Jul 1998 | A |
5784052 | Keyson | Jul 1998 | A |
5785630 | Bobick et al. | Jul 1998 | A |
5786818 | Brewer et al. | Jul 1998 | A |
5790108 | Salcudean et al. | Aug 1998 | A |
5791992 | Crump et al. | Aug 1998 | A |
5802353 | Avila et al. | Sep 1998 | A |
5805140 | Rosenberg et al. | Sep 1998 | A |
5805165 | Thorne, III et al. | Sep 1998 | A |
5808601 | Leah et al. | Sep 1998 | A |
5818423 | Pugliese et al. | Oct 1998 | A |
5825308 | Rosenberg | Oct 1998 | A |
5831408 | Jacobus et al. | Nov 1998 | A |
5844392 | Peurach et al. | Dec 1998 | A |
5877748 | Redlich | Mar 1999 | A |
5880714 | Rosenberg et al. | Mar 1999 | A |
5884029 | Brush, II et al. | Mar 1999 | A |
5889670 | Schuler et al. | Mar 1999 | A |
5889672 | Schuler et al. | Mar 1999 | A |
5917725 | Thacher et al. | Jun 1999 | A |
5956484 | Rosenberg et al. | Sep 1999 | A |
5959382 | Dauwalter | Sep 1999 | A |
5973689 | Gallery | Oct 1999 | A |
5984880 | Lander et al. | Nov 1999 | A |
5990869 | Kubica et al. | Nov 1999 | A |
6004134 | Marcus et al. | Dec 1999 | A |
6020876 | Rosenberg et al. | Feb 2000 | A |
6028593 | Rosenberg et al. | Feb 2000 | A |
6037927 | Rosenberg | Mar 2000 | A |
6046726 | Keyson | Apr 2000 | A |
6057828 | Rosenberg et al. | May 2000 | A |
6061004 | Rosenberg | May 2000 | A |
6078308 | Rosenberg et al. | Jun 2000 | A |
6088017 | Tremblay et al. | Jul 2000 | A |
6088019 | Rosenberg | Jul 2000 | A |
6100874 | Schena et al. | Aug 2000 | A |
6101530 | Rosenberg et al. | Aug 2000 | A |
6111577 | Zilles et al. | Aug 2000 | A |
6125385 | Wies et al. | Sep 2000 | A |
6131097 | Peurach et al. | Oct 2000 | A |
6160489 | Perry et al. | Dec 2000 | A |
6161126 | Wies et al. | Dec 2000 | A |
6162123 | Woolston | Dec 2000 | A |
6183364 | Trovato | Feb 2001 | B1 |
6184868 | Shahoian et al. | Feb 2001 | B1 |
6349301 | Mitchell et al. | Feb 2002 | B1 |
6353850 | Wies et al. | Mar 2002 | B1 |
6422941 | Thorner et al. | Jul 2002 | B1 |
6859819 | Rosenberg et al. | Feb 2005 | B1 |
6985133 | Rodomista | Jan 2006 | B1 |
Number | Date | Country |
---|---|---|
0085518 | Aug 1983 | EP |
0265011 | Apr 1988 | EP |
0349086 | Jan 1990 | EP |
0626634 | May 1994 | EP |
0607580 | Jul 1994 | EP |
2254611 | Oct 1992 | GB |
01-003664 | Jul 1990 | JP |
H2-185278 | Jul 1990 | JP |
02-109714 | Jan 1992 | JP |
H4-8381 | Jan 1992 | JP |
4-34610 | Feb 1992 | JP |
04-007371 | Aug 1993 | JP |
H5-192449 | Aug 1993 | JP |
05-193862 | Jan 1995 | JP |
H7-24147 | Jan 1995 | JP |
WO 9510080 | Jan 1995 | WO |
WO 9502801 | Jan 1995 | WO |
WO 9520787 | Aug 1995 | WO |
WO 9520788 | Aug 1995 | WO |
WO 9532459 | Nov 1995 | WO |
WO 9616397 | May 1996 | WO |
WO 9622591 | Jul 1996 | WO |
WO 9642078 | Dec 1996 | WO |
WO 9712337 | Apr 1997 | WO |
WO 9712357 | Apr 1997 | WO |
WO 9719440 | May 1997 | WO |
WO 9721160 | Jun 1997 | WO |
WO 9731333 | Aug 1997 | WO |
Entry |
---|
Ouhyoung et al., “A Low-Cost Force Feedback Joystick and Its Use in PC Video Games,” IEEE Transactions on Consumer Electronics, vol. 41, No. 3, Aug. 1995, pp. 787-794. |
Jacobsen, S.C. et al., “High Performance, High Dexterity, Force Reflective Teleoperator II,” ANS Topical Meeting on Robotics & Remote Systems, Albuquerque, New Mexico, Feb. 24-27, 1991, pp. 1-10. |
Kotoku, Tetsuo et al., “Environment Modeling for the Interactive Display (EMID) Used in Telerobotic Systems,” IEEE Nov. 3-5, 1991, pp. 999-1004. |
Bejczy, Antal K., “The Phantom Robot: Predictive Displays for Teleoperation with Time Delay,” IEEE 1990, pp. 546-550. |
Buttolo, Pietro et al., “Pen-Based Force Display for Precision Manipulation in Virtual Environments,” IEEE Mar. 1995, pp. 1-8. |
Tan, Hong Z. et al., “Human Factors for the Design of Force-Reflecting Haptic Interfaces,” Tan, Srinivasan, Eberman, & Chang, ASME WAM 1994, pp. 1-11. |
Ellis, R.E. et al., “Design and Evaluation of a High-Performance Prototype Planar Haptic Interface,” ASME Dec. 3, 1993, DSC—vol. 49, pp. 55-64. |
Adelstein, Bernard D. et al., “A High Performance Two Degree-of-Freedom Kinesthetic Interface,” Massachusetts Institute of Technology, 1992, pp. 108-112. |
Iwata, Hiroo et al., “Volume Haptization,” IEEE 1993, pp. 16-18. |
Fischer, Patrick et al., “Specification and Design of Input Devices for Teleoperation,” 1990. |
Rosenberg, Louis B., “The Use of Virtual Fixtures as Perceptual Overlays to Enhance Operator Performance in Remote Environments,” Air Force Material Command, Sep. 1992, pp. 1-42. |
Rosenberg, Louis B., The Use of Virtual Fixtures to Enhance Operator Performance in Time Delayed Teleoperation, Armstrong Laboratory, Mar. 1993, pp. 1-45. |
Rosenberg, Louis B., “Perceptual Design of a Virtual Rigid Surface Contact,” Center for Design Research, Stanford University, Air Force Material Command, Apr. 1993, pp. 1-41. |
Rosenberg, Louis B. et al., “Perceptual Decomposition of Virtual Haptic Surfaces,” IEEE, Oct. 1993. |
Rosenberg, Louis B., “Virtual Fixtures as Tools to Enhance Operator Performance in Telepresence Environments,” SPIE Telemanipulator Technology, 1993. |
Burdea, Grigore et al., “Dextrous Telerobotics with Force Feedback—An Overview,” Robotica 1991, vol. 9. |
Colgate, J. Edward et al., “Implementation of Stiff Virtual Walls in Force-Reflecting Interfaces,” 1993, pp. 1-9. |
Yamakita, M. et al., “Tele-Virtual Reality of Dynamic Mechanical Model,” Proceedings of the 1992 IEEE/RSJ International Conference on Intelligent Robots and Systems, Raleigh, NC, Jul. 7-10, 1992, pp. 1103-1110. |
Adelstein, Bernard D. et al., “Design and Implementation of a Force Reflecting Manipulandum for Manual Control Research,” 1992, pp. 1-24. |
Ouh-young, Ming et al., “Force Display Performs Better than Visual Display in a Simple 6-D Docking Task,” IEEE 1989, pp. 1462-1466. |
Kim, Won S. et al., “Graphics Displays for Operator Aid in Telemanipulation,” IEEE 1991, pp. 1059-1067. |
Hannaford, Blake et al., “Performance Evaluation of a Six-Axis Generalized Force-Reflecting Teleoperator,” IEEE May/Jun. 1991, vol. 21, No. 3, pp. 620-633. |
Kim, Won S. et al., A Teleoperation Training Simulator with Visual and Kinesthetic Force Virtual Reality. |
Burdea, Grigore et al., “A Portable Dextrous Master with Force Feedback,” Presence: Teleoperators and Virtual Environments, MIT Press, Jun. 1991. |
Fisher, S.S. et al., “Virtual Environment Display System,” ACM Interactive 3D Graphics, Oct. 1986. |
Herndon, J.N. et al., “The State-of-the-Art Model M-2 Maintenance System,” Proc. of the 1984 Nat'l Meeting on Robotics and Remote Handling in Hostile Environments, American Nuclear Society, 1984, pp. 59-65. |
Minsky, Margaret et al., “Feeling and Seeing: Issues in Force Display,” ACM 1990, pp. 235-242. |
Batter, James J. et al., “Grope-1: A Computer Display to the Sense of Feel,” pp. TA-4-188-TA-4-192. |
Gotow, J.K., et al., “Perception of Mechanical Properties at the Man-Machine Interface,” IEEE 1987, pp. 688-689. |
Atkinson, William D. et al., “Computing with Feeling,” Comput. & Graphics, vol. 2, No. 2-E, pp. 97-103. |
Noll, A. Michael, “Man-Machine Tactile Communication Dissertation,” Polytechnic Institute of Brooklyn, Jun. 1971, pp. 1-88. |
Ouh-Young, Ming, “Force Display in Molecular Docking,” Chapel Hill 1990, pp. 1-85. |
Ouh-young, Ming et al., “Using a Manipulator for Force Display in Molecular Docking,” IEEE 1988, pp. 1824-1829. |
Wiker, S. et al., “Development of Tactile Mice for Blind Access to Computers: Importance of Stimulation Locus, Object Size, and Vibrotactile Display Resolution,” Proc. of the Human Factors Society 35th Annual Meeting 1991, pp. 708-712. |
Adachi, Yoshitaka et al., “Sensory Evaluation of Virtual Haptic Push-Buttons,” Technical Research Center, Suzuki Motor Corporation, Nov. 1994. |
Su, S. Augustine et al., “The Virtual Panel Architecture: A 3D Gesture Framework,” IEEE 1993, pp. 387-393. |
Tan, Hong Z et al., “Manual Resolution of Compliance When Work and Force Cues are Minimized,” ASME 1993, DSC—vol. 49, pp. 99-104. |
Iwata, Hiroo, “Pen-based Haptic Virtual Environment,” Institute of Engineering Mechanics, University of Tsukuba, Japan, pp. 287-292. |
Kotoku, Tetsuo, “A Predictive Display with Force Feedback and its Application to Remote Manipulation System with Transmission Time Display,” IEEE 1992, 1992, pp. 239-246. |
Howe, Robert D., “Task Performance with a Dextrous Teleoperated Hand System,” Proceedings of SPIE, Nov. 1992, vol. 1833, pp. 1-9. |
Schmult, Brian et al., “Application Areas for a Force-Feedback Joystick,” ASME 1993, DSC—vol. 49, pp. 47-54. |
Hasser, Christopher John, “Tactile Feedback for a Force-Reflecting Haptic Display,” The School of Engineering, University of Dayton, Dec. 1995, pp. iii-xii & 1-96. |
Russo, Massimo Andrea, “The Design and Implementation of a Three Degree-of-Freedom Force Output Joystick,” Department of Mechanical Engineering, May 11, 1990, pp. 9-40 & 96 & 97. |
Kelley, A. J. et al., “MagicMouse: Tactile and Kinesthetic Feedback in the Human-Computer Interface using an Electromagnetically Actuated Input/Output Device,” Dept. of Elec. Eng., Univ. of Brit. Columbia, 1993, pp. 1-27. |
Kelley, A.J. et al., “On the Development of a Force-Feedback Mouse and Its Integration into a Graphical User Interface,” Symp. on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 1994 Int'l Mechanical Engineering Congress and Exhibition, 1994, pp. 1-8. |
Ramstein, C., “Combining Haptic and Braille Technologies: Design Issues and Pilot Study,” ASSETS '96, ACM 0-89791-776-6, 1996, pp. 37-44. |
Akamatsu, M. et al., “Multimodal Mouse: A Mouse-Type Device with Tactile and Force Display,” Presence, vol. 3, No. 1, 1994, pp. 73-80. |
Munch, S. et al., “Intelligent Control for Haptic Displays,” Eurographics '96, vol. 15, No. 3, Eurographics Association, 1996, pp. C217-C226. |
Payette, J. et al., “Evaluation of a Force Feedback (Haptic) Computer Pointing Device in Zero Gravity,” DSC—vol. 58, Proc. of ASME Dynamic Systems, ASME 1996, pp. 547-553. |
Hannaford, B. et al., “Force Feedback Cursor Control,” NASA Tech Brief, vol. 13, No. 11, Item #21, 1989, pp. 1-4. |
Rosenberg et al., “Commercially Viable Force Feedback Controller for Individuals with Neuromotor Disabilities,” Crew Systems Directorate, AL/CF-TR-1997-0016, 1996, pp. 1-33. |
Millman, P. et al., “Design of a Four Degree-of-Freedom Force-Reflecting Manipulandum with a Specified Force/Torque Workspace,” IEEE CH2969-4, 1991, pp. 1488-1492. |
Ramstein et al., “The Pantograph: A Large Workspace Haptic Device for a Multimodal Human-Computer Interaction,” Computer-Human Interaction, CHI '94, 1994, pp. 1-3. |
Rosenberg et al., “The use of force feedback to enhance graphical user interfaces,” Stereoscopic Displays and Virtual Reality Systems III, 1996, pp. 243-248. |
Winey III, C., “Computer Simulated Visual and Tactile Feedback as an Aid to Manipulator and Vehicle Control,” MIT 1981, pp. 1-79. |
Yokokohji et al., “What you can see is what you can feel-development of a visual/haptic interface to virtual environment,” IEEE 0-8186-7295-1, 1996, pp. 46-54. |
Kilpatrick, P., “The use of a Kinesthetic Supplement in an Interactive Graphics System,” University of North Carolina, 1976, pp. 1-175. |
Ming Ouh-young et al., “Creating an Illusion of Feel: Control Issues in Force Display,” University of North Carolina, 1989, pp. 1-14. |
Hirota, K., et al., “Development of Surface Display,” IEEE 0-7803-1363-1, 1993, pp. 256-262. |
Iwata, Hiroo, “Artificial Reality with Force-Feedback: Development of Desktop Virtual Space with Compact Master Manipulator,” Computer Graphics, vol. 24, No. 4, 1990, pp. 165-170. |
Brooks et al., “Project GROPE—Haptic Displays for Scientific Visualization,” Computer Graphics, vol. 24, No. 4, 1990, pp. 177-185. |
Albers, F. Gerry, “Microcomputer Base for Control Loading,” Naval Training Equipment Center 11th NTEC-Industry Conference Proceedings, NAVTRAEQUIPCEN IH-306, 1978. |
Baigrie, “Electric Control Loading—A Low Cost, High Performance Alternative,” Proceedings, pp. 247-254, Nov. 6-8, 1990. |
Russo, “The Design and Implementation of a Three Degree of Freedom Force Output Joystick,” MIT Libraries Archives Aug. 14, 1990, pp. 1-131, May 1990. |
Brooks et al., “Hand Controllers for Teleoperation—A State-of-the-Art Technology Survey and Evaluation,” JPL Publication 85-11; NASA-CR-175890; N85-28559, pp. 1-84, Mar. 1, 1985. |
Jones et al., “A perceptual analysis of stiffness,” ISSN 0014-4819 Springer International (Springer-Verlag); Experimental Brain Research, vol. 79, No. 1, pp. 150-156, 1990. |
Burdea et al., “Distributed Virtual Force Feedback, Lecture Notes for Workshop on Force Display in Virtual Environments and its Application to Robotic Teleoperation,” 1993 IEEE International Conference on Robotics and Automation, pp. 25-44, May 2, 1993. |
Snow et al., “Model-X Force-Reflecting-Hand-Controller,” NT Control No. NPO-17851; JPL Case No. 5348, pp. 1-4, Jun. 15, 1989. |
Ouh-Young, “Force Display in Molecular Docking,” Order No. 9034744, pp. 1-369, 1990. |
Tadros, Control System Design for a Three Degree of Freedom Virtual Environment Simulator Using Motor/Brake Pair Actuators, MIT Archive © Massachusetts Institute of Technology, pp. 1-88, Feb. 1990. |
Caldwell et al., “Enhanced Tactile Feedback (Tele-Taction) Using a Multi-Functional Sensory System,” 1050-4729/93, pp. 955-960, 1993. |
Adelstein, “Design and Implementation of a Force Reflecting Manipulandum for Manual Control research,” DSC—vol. 42, Advances in Robotics, Edited by H. Kazerooni, pp. 1-12, 1992. |
Gotow et al., “Controlled Impedance Test Apparatus for Studying Human Interpretation of Kinesthetic Feedback,” WA11-11:00, pp. 332-337. |
Stanley et al., “Computer Simulation of Interacting Dynamic Mechanical Systems Using Distributed Memory Parallel Processors,” DSC—vol. 42, Advances in Robotics, pp. 55-61, ASME 1992. |
Russo, “Controlling Dissipative Magnetic Particle Brakes in Force Reflective Devices,” DSC—vol. 42, Advances in Robotics, pp. 63-70, ASME 1992. |
Kontarinis et al., “Display of High-Frequency Tactile Information to Teleoperators,” Telemanipulator Technology and Space Telerobotics, Won S. Kim, Editor, Proc. SPIE vol. 2057, pp. 40-50, Sep. 7-9, 1993. |
Patrick et al., “Design and Testing of a Non-reactive, Fingertip, Tactile Display for Interaction with Remote Environments,” Cooperative Intelligent Robotics in Space, Rui J. deFigueiredo et al., Editor, Proc. SPIE vol. 1387, pp. 215-222, 1990. |
Adelstein, “A Virtual Environment System for the Study of Human Arm Tremor,” Ph.D. Dissertation, Dept. of Mechanical Engineering, MIT, Jun. 1989. |
Bejczy, “Sensors, Controls, and Man-Machine Interface for Advanced Teleoperation,” Science, vol. 208, No. 4450, pp. 1327-1335, 1980. |
Bejczy, “Generalization of Bilateral Force-Reflecting Control of Manipulators,” Proceedings of Fourth CISM-IFToMM, Sep. 8-12, 1981. |
McAffee, “Teleoperator Subsystem/Telerobot Demonstrator: Force Reflecting Hand Controller Equipment Manual,” JPL D-5172, pp. 1-50, A1-A36, B1-B5, C1-C36, Jan. 1988. |
Minsky, “Computational Haptics: The Sandpaper System for Synthesizing Texture for a Force-Feedback Display,” Ph.D. Dissertation, MIT, Jun. 1995. |
Jacobsen et al., “High Performance, Dextrous Telerobotic Manipulator With Force Reflection,” Intervention/ROV '91 Conference & Exposition, Hollywood, Florida, May 21-23, 1991. |
Shimoga, “Finger Force and Touch Feedback Issues in Dexterous Telemanipulation,” Proceedings of Fourth Annual Conference on Intelligent Robotic Systems for Space Exploration, Rensselaer Polytechnic Institute, Sep. 30-Oct. 1, 1992. |
IBM Technical Disclosure Bulletin, “Mouse Ball-Actuating Device With Force and Tactile Feedback,” vol. 32, No. 9B, Feb. 1990. |
Terry et al., “Tactile Feedback in a Computer Mouse,” Proceedings of Fourteenth Annual Northeast Bioengineering Conference, University of New Hampshire, Mar. 10-11, 1988. |
Howe, “A Force-Reflecting Teleoperated Hand System for the Study of Tactile Sensing in Precision Manipulation,” Proceedings of the 1992 IEEE International Conference on Robotics and Automation, Nice, France, May 1992. |
Eberhardt et al., “OMAR—A Haptic display for speech perception by deaf and deaf-blind individuals,” IEEE Virtual Reality Annual International Symposium, Seattle, WA, Sep. 18-22, 1993. |
Rabinowitz et al., “Multidimensional tactile displays: Identification of vibratory intensity, frequency, and contactor area,” Journal of The Acoustical Society of America, vol. 82, No. 4, Oct. 1987. |
Bejczy et al., “Kinesthetic Coupling Between Operator and Remote Manipulator,” International Computer Technology Conference, The American Society of Mechanical Engineers, San Francisco, CA, Aug. 12-15, 1980. |
Bejczy et al., “A Laboratory Breadboard System for Dual-Arm Teleoperation,” SOAR '89 Workshop, JSC, Houston, TX, Jul. 25-27, 1989. |
Marcus, “Touch Feedback in Surgery,” Proceedings of Virtual Reality and Medicine The Cutting Edge, Sep. 8-11, 1994. |
Bejczy, et al., “Universal Computer Control System (UCCS) for Space Telerobots,” CH2413-3/87/0000/0318$01.00, IEEE, 1987. |
Noll, “Man-Machine Tactile,” SID Journal, Jul./Aug. 1972 Issue. |
Eberhardt et al., “Inducing Dynamic Haptic Perception by The Hand: System Description and Some Results,” DSC—vol. 55-1, Dynamic Systems and Control: vol. 1, ASME 1994. |
Gobel et al., “Tactile Feedback Applied to Computer Mice,” International Journal of Human-Computer Interaction, vol. 7, No. 1, pp. 1-24, 1995. |
Pimentel et al., “Virtual Reality: through the new looking glass,” 2nd Edition; McGraw-Hill, ISBN 0-07-050167-X, pp. 41-202, 1994. |
Ouhyoung et al., “The Development of a Low-Cost Force Feedback Joystick and Its Use in the Virtual Reality Environment,” Proceedings of the Third Pacific Conference on Computer Graphics and Applications, Pacific Graphics '95, Seoul, Korea, Aug. 21-24, 1995. |
Scannell, “Taking a Joystick Ride,” Computer Currents, Boston Edition, vol. 9, No. 11, Nov. 1994. |
The American Society of Mechanical Engineers (copyright 1997), Proceedings of the ASME Dynamic Systems and Control Division, presented at the 1997 ASME International Mechanical Engineering Congress and Exposition, Nov. 16-21, 1997, Dallas, Texas. |
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 10/615,927, mailed Dec. 11, 2007. |
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 11/232,576, mailed Oct. 21, 2008. |
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 10/314,400, mailed Nov. 10, 2008. |
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 11/227,610, mailed Nov. 17, 2008. |
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 10/615,927, mailed Dec. 9, 2008. |
“High Performance Model of the Immersion Probe,” Immersion Probe-MD™, Immersion Human Interface Corporation. |
“The Personal Digitizer™,” Immersion Human Interface Corporation 1994. |
3D Human Interface Tool, Immersion Probe™, Immersion Human Interface Corporation 1994. |
Slocum, Precision Machine Design, Prentice Hall, pp. 661, 664. |
Ansley, D., “Thimble gets in touch with reality,” New Scientist, 1994, p. 19. |
Gossweiler et al., “An Introductory Tutorial for Developing Multi-User Virtual Environments,” Presence: Teleoperators and Virtual Environments, MIT Press, 3(4), Fall 1994, pp. 255-264. |
Krishna R., “Virtual Presence Takes Surgeons through the Virtual Keyhole to Hone Their Skills”, Business & Industry, Jul. 4, 1995. |
Krueger, M., Artificial Reality 1988, pp. 54-75. |
Meyer, K. et al., A Survey of Position Trackers, The Massachusetts Institute of Technology 1992, Presence, vol. 1, No. 2. |
Aukstakalnis et al., “Silicon Mirage: The Art and Science of Virtual Reality,” ISBN 0-938151-82-7, pp. 129-180, 1992. |
Bliss, “Optical-to-Tactile Image Conversion for the Blind,” IEEE Transactions on Man-Machine Systems, vol. MMS-11, No. 1, Mar. 1970. |
Calder, “Design of a Force-Feedback Touch-Introducing Actuator for Teleoperator Robot Control,” Bachelor of Science Thesis, MIT, May 1983, archived Jun. 23, 1983. |
Johnson, “Shape-Memory Alloy Tactile Feedback Actuator,” Armstrong Aerospace Medical Research Laboratory, AAMRL-TP-90-039, Aug. 1990. |
Kontarinis et al., “Tactile Display of Vibratory Information in Teleoperation and Virtual Environments,” Presence, 4(4):387-402, Harvard Univ., 1995. |
Patrick, “Design, Construction, and Testing of a Fingertip Tactile Display for Interaction with Virtual and Remote Environments,” Master of Science Thesis, MIT, Aug. 1990, archived Nov. 8, 1990. |
Repperger, D.W., “Active Force Reflection Devices in Teleoperation,” IEEE Control Systems. |
Rosenberg, L., “‘Virtual Fixtures’: Perceptual Overlays Enhance Operator Performance in Telepresence Tasks,” Stanford University, Jun. 1994, pp. 1-214. |
Rosenberg, L., “Virtual Haptic Overlays Enhance Performance in Telepresence Tasks,” SPIE, 1994. |
Smith, G., “Call It Palpable Progress,” Business Week, Oct. 9, 1995, pp. 93, 96. |
Snow, E. et al., “Compact Force-Reflecting Hand Controller,” JPL, Apr. 1991, vol. 15, No. 3, Item No. 153, pp. 1-15a. |
Tavkhelidze, D.S., “Kinematic Analysis of Five-link Spherical Mechanisms,” Mechanism and Machine Theory, 1974, vol. 9, pp. 181-190. |
Wiker, “Teletouch Display Development: Phase 1 Report,” Technical Report 1230, Naval Ocean Systems Center, San Diego, Jul. 1988. |
“Component Maintenance Manual With Illustrated Parts List, Coaxial Control Shaker Part No. C-25502,” Safe Flight Instrument Corporation, Revised Jan. 28, 2002 (3 pages). |
“Technical Manual Overhaul Instructions With Parts Breakdown, Coaxial Control Shaker Part No. C-25502,” Safe Flight Instrument Corporation, Revised Jul. 15, 1980 (23 pages). |
“Cyberman Technical Specification,” Logitech Cyberman SWIFT Supplement to Logitech Mouse Technical Reference and Programming Guide, Apr. 5, 1994. |
Kaczmarek et al., “Tactile Displays,” Virtual Environment Technologies, Chap. 9, pp. 349-414. |
Lake, “Cyberman from Logitech,” at http://www.ibiblio.org/GameBytes/issue21/greviews/cyberman.html, 1994. |
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 10/615,927, mailed Jul. 21, 2008. |
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 12/638,263, mailed Nov. 8, 2010. |
United States Patent and Trademark Office, Office Action, U.S. Appl. No. 12/638,263, mailed Mar. 7, 2011. |
Number | Date | Country | |
---|---|---|---|
60017803 | May 1996 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 08566282 | Dec 1995 | US |
Child | 09050665 | US | |
Parent | 08571606 | Dec 1995 | US |
Child | 08566282 | US | |
Parent | 08691852 | Aug 1996 | US |
Child | 08571606 | US | |
Parent | 08664086 | Jun 1996 | US |
Child | 08691852 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 09050665 | Mar 1998 | US |
Child | 09153781 | US |