The present disclosure relates generally to haptic feedback and more specifically relates to systems and methods for enhanced television interaction.
Conventional televisions allow users to view various types of content, such as broadcast, cable, or satellite programming and pre-recorded shows and movies, and to play video games. Such content may be provided to a television and displayed to one or more viewers or participants. Frequently, such programming includes other content that is associated with, or supersedes, the programming the viewer desires to watch, such as advertisements or emergency broadcast messages, which may cause the user to lose focus on the television or other display device. Or the user, while enjoying a particular program, may not feel fully immersed within the program.
Embodiments according to the present disclosure provide systems and methods for enhanced television interaction. For example, one disclosed embodiment comprises a method having the steps of receiving notification information, the notification information indicating an event associated with video content displayed by a television device; determining a haptic effect associated with the notification information; and generating and transmitting a haptic signal to a haptic output device, the haptic signal configured to cause the haptic output device to output the haptic effect. In another embodiment, a computer-readable medium comprises program code for causing a processor to carry out such a method.
These illustrative embodiments are mentioned not to limit or define the invention, but rather to provide examples to aid understanding thereof. Illustrative embodiments are discussed in the Detailed Description, which provides further description of the invention. Advantages offered by various embodiments of this invention may be further understood by examining this specification.
The accompanying drawings, which are incorporated into and constitute a part of this specification, illustrate one or more examples of embodiments and, together with the description of example embodiments, serve to explain the principles and implementations of the embodiments.
Example embodiments are described herein in the context of systems and methods for enhanced television interaction. Those of ordinary skill in the art will realize that the following description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure. Reference will now be made in detail to implementations of example embodiments as illustrated in the accompanying drawings. The same reference indicators will be used throughout the drawings and the following description to refer to the same or like items.
In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be appreciated that in the development of any such actual implementation, numerous implementation-specific decisions must be made in order to achieve the developer's specific goals, such as compliance with application- and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another.
Illustrative Systems for Enhanced Television Interaction
Referring now to
As the smart television 110 continues to monitor the television channel, it transmits signals to the user's device to cause haptic effects to be output each time a commercial ends. After a few minutes, the television determines that the program the user is watching is resuming. The smart television again transmits a haptic signal to the user's tablet, which causes the tablet to output a strong haptic effect to indicate that the television program is about to restart, at which time the user saves her email to finish later and returns to watching television.
Somewhat later, the user elects to watch a program that she has previously recorded on a digital video recorder (DVR). She uses her tablet to select the program and to begin its replay. After a time, the recording begins to play back commercials that aired during the broadcast of the recorded program. The user selects a fast-forward function on her tablet to fast forward through the commercials. As she fast forwards, the television monitors the content streaming from the DVR and, as each commercial ends, the smart TV transmits a haptic message to the tablet to output a low-intensity haptic effect to indicate that a commercial has finished. Or as shown in
Referring now to
These illustrative examples are given to introduce the reader to the general subject matter discussed herein and the disclosure is not limited to these examples. The following sections describe various additional non-limiting embodiments and examples of systems and methods for enhanced television interaction.
Referring now to
In some embodiments, one or more touch-sensitive surfaces may be included on or disposed within one or more sides of the system 200. For example, in one embodiment, a touch-sensitive surface is disposed within or comprises a rear surface of the system 200. In another embodiment, a first touch-sensitive surface is disposed within or comprises a rear surface of the system 200 and a second touch-sensitive surface is disposed within or comprises a side surface of the system 200. In some embodiments, the system may comprise two or more housing components, such as in a clamshell arrangement or in a slideable arrangement. For example, one embodiment comprises a system having a clamshell configuration with a touch-sensitive display disposed in each of the portions of the clamshell. Furthermore, in embodiments where the system 200 comprises at least one touch-sensitive surface on one or more sides of the system 200 or in embodiments where the system 200 is in communication with an external touch-sensitive surface, the display 250 may or may not comprise a touch-sensitive surface. In some embodiments, one or more touch-sensitive surfaces may have a flexible touch-sensitive surface. In other embodiments, one or more touch-sensitive surfaces may be rigid. In various embodiments, the system 200 may comprise both flexible and rigid touch-sensitive surfaces.
In the embodiment shown in
In addition, the processor 220 is in communication with haptic output device 240 and haptic output device 280, and is further configured to output signals to cause haptic output device 240 or haptic output device 280, or both, to output one or more haptic effects. Furthermore, the processor 220 is in communication with speaker 270 and is configured to output signals to cause speaker 270 to output sounds. In various embodiments, the system 200 may comprise or be in communication with fewer or additional components or devices. For example, other user input devices such as a mouse or a keyboard, or both, or an additional touch-sensitive device may be comprised within the system 200 or be in communication with the system 200. As another example, system 200 may comprise and/or be in communication with one or more accelerometers, gyroscopes, digital compasses, and/or other sensors.
The housing 210 of the system 200 shown in
In the embodiment shown in
In order to generate vibration effects, many devices utilize some type of actuator or haptic output device. Known haptic output devices used for this purpose include an electromagnetic actuator such as an Eccentric Rotating Mass (“ERM”) in which an eccentric mass is moved by a motor, a Linear Resonant Actuator (“LRA”) in which a mass attached to a spring is driven back and forth, or a “smart material” such as piezoelectric, electro-active polymers or shape memory alloys. Haptic output devices also broadly include other devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on.
In other embodiments, deformation of one or more components can be used to produce a haptic effect. For example, one or more haptic effects may be output to change the shape of a surface or a coefficient of friction of a surface. In an embodiment, one or more haptic effects are produced by creating electrostatic forces and/or ultrasonic forces that are used to change friction on a surface. In other embodiments, an array of transparent deforming elements may be used to produce a haptic effect, such as one or more areas comprising a smartgel. Haptic output devices also broadly include non-mechanical or non-vibratory devices such as those that use electrostatic friction (ESF), ultrasonic surface friction (USF), or those that induce acoustic radiation pressure with an ultrasonic haptic transducer, or those that use a haptic substrate and a flexible or deformable surface, or those that provide projected haptic output such as a puff of air using an air jet, and so on. In some embodiments comprising haptic output devices 240, 280 that are capable of generating frictional or deformation effects, the haptic output devices 240 or 280 may be overlaid on the touch-sensitive display or otherwise coupled to the touch-sensitive display 250 such that the frictional or deformation effects may be applied to a touch-sensitive surface that is configured to be touched by a user. In some embodiments, other portions of the system may provide such forces, such as portions of the housing that may be contacted by the user or a separate touch-sensitive input device coupled to the system. Co-pending U.S. patent application Ser. No. 13/092,484, filed Apr. 22, 2011, entitled "Systems and Methods for Providing Haptic Effects," the entirety of which is hereby incorporated by reference, describes ways that one or more haptic effects can be produced and describes various haptic output devices.
It will be recognized that any type of input synthesis method may be used to generate the interaction parameter from one or more haptic effect signals, including, but not limited to, the synthesis method examples listed in TABLE 1 below.
In
The embodiment shown in
Environmental factors can include any of the environmental factors noted above or any other quantities representative of an ambient condition or force applied to or directed to the user device 200. Additionally, environmental factors may be evaluated directly from sensor data or may be processed by the device to derive other environmental factors. For example, acceleration data may be used to determine a device orientation, velocity and/or a pattern of motion. As a further example, physiological data such as heart rate, skin resistance, and other factors can be used to determine a physiological state of a device user (e.g., awake, stressed, asleep, REM sleep, etc.).
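By way of non-limiting illustration only, the following Python sketch shows one way raw sensor readings might be reduced to derived environmental factors such as those described above. The function names, thresholds, and heuristics are hypothetical and do not form part of the disclosed embodiments.

```python
import math

def derive_motion_factors(accel_samples):
    """Derive coarse motion factors from raw accelerometer samples.

    accel_samples: list of (x, y, z) readings in m/s^2.
    Returns a dict of derived environmental factors.
    """
    if not accel_samples:
        return {"moving": False, "motion_intensity": 0.0}
    # Magnitude of each sample, with gravity (~9.8 m/s^2) removed.
    magnitudes = [abs(math.sqrt(x * x + y * y + z * z) - 9.8)
                  for x, y, z in accel_samples]
    avg = sum(magnitudes) / len(magnitudes)
    return {
        "moving": avg > 0.5,        # hypothetical threshold
        "motion_intensity": avg,
    }

def derive_physiological_state(heart_rate_bpm, skin_resistance_kohm):
    """Map simple physiological readings to a coarse user state."""
    if heart_rate_bpm < 55:
        return "asleep"
    if heart_rate_bpm > 100 and skin_resistance_kohm < 50:
        return "stressed"
    return "awake"
```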
Referring now to
Communications interface 340 comprises at least one communications device, such as a television receiver, an HDMI receiver, a network connection (e.g. Ethernet, 802.11), or other communications interfaces, such as those discussed above with respect to the handheld device 200. In some embodiments, communications interface 340 may comprise a plurality of communications devices. For example, in one embodiment, the communications interface comprises an HDMI interface and an 802.11 interface. In one such embodiment, the television 310 is configured to receive content from the HDMI interface and to transmit notification signals, such as to the device 200, using the 802.11 interface. Other embodiments may employ other suitable communications interfaces for connecting to or transmitting notification signals to device 200 or other devices, such as Bluetooth, infrared, or other wireless communications interfaces.
While
Referring now to
In the embodiment shown in
In the embodiment shown in
Reference is made throughout this specification to "notification signals." No particular function or intent should be ascribed to the modifier "notification." Instead, "notification signal" is a convenient label to differentiate these signals from other types of signals referred to in this specification. The term "notification signal" is intended to be read broadly and to encompass, for example, any signal carrying information related to events or other occurrences within, or portions of, content, or information related to or associated with content. Further, "notification signals" need not be triggered by any particular type of event or occurrence. Rather, a "notification signal" may simply correspond to a portion of the content having haptic information, as opposed to audio or video information.
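As a concrete, non-limiting illustration of the kind of information a notification signal may carry, a minimal sketch of one possible representation follows; the field names are hypothetical and chosen only to mirror the items described in this specification.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class NotificationSignal:
    """Hypothetical notification signal carrying haptic information
    associated with an event or portion of content."""
    event_type: str                            # e.g. "commercial_end", "scene_event"
    haptic_effect_id: Optional[str] = None     # identifier of a pre-installed effect
    magnitude: float = 1.0                     # relative intensity, 0.0-1.0
    duration_ms: int = 0                       # how long the effect should last
    effect_url: Optional[str] = None           # where effect data may be retrieved
    payload: dict = field(default_factory=dict)  # e.g. advertisement information
```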
In some embodiments discussed above, notification signals may be generated and transmitted to a user device 200 to direct the user's attention or to provide additional, secondary information associated with content displayed on the television 310 or the device 200, such as to shift the user's focus from her device 200 to the television 310, or from the television 310 to the device. Some embodiments, however, may provide effects intended to enhance the content displayed on the television. For example, in some embodiments, the television (or other device) may generate and transmit signals to the user's device 200 to provide a more immersive experience.
Referring now to
A number of embodiments discussed herein have been discussed in the context of a single user viewing content and interacting with a user device. However, the disclosure is not limited to a single user or a single user device. Rather, embodiments may comprise multiple users or multiple user devices 200, and may further include multiple display devices 310. In some such embodiments, users may be provided with different notification signals. For example, users may have differing preferences regarding the types of effects they receive, or may specify different intensity levels for applied effects. Further, in some embodiments, a user's location relative to a display screen may affect the types or contents of notification signals. For example, users located to the left side of a display 310 may receive different effects, or different intensities for effects, than those located on the right side of the screen. In one embodiment, users located on the left side of the screen may receive higher-intensity effects associated with events occurring on the left side of the screen, and lower-intensity effects associated with events occurring on the right side of the screen. Similarly, users located further away from the screen may receive lower-intensity effects than those closer to the screen. In some embodiments, users may receive different secondary information associated with content on the screen based upon user preferences or profile information about the user, such as age, sex, education level, income, etc. In some embodiments, the users may view differing secondary information on a separate display screen, such as on their respective user device or on a wearable device, and thus different haptic effects may be provided to correspond to the secondary information for each particular user.
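One possible way of scaling effect intensity by a viewer's position relative to the screen, as described in the preceding paragraph, is sketched below. The scaling rules and parameter names are hypothetical assumptions made solely for illustration.

```python
def scale_intensity(base_magnitude, event_side, user_side, user_distance_m,
                    reference_distance_m=2.0):
    """Scale a haptic effect magnitude for one user.

    event_side, user_side: "left" or "right" of the screen.
    user_distance_m: the user's distance from the screen in meters.
    """
    # Stronger effects for events on the user's side of the screen.
    side_factor = 1.0 if event_side == user_side else 0.5
    # Attenuate with distance, but never amplify beyond the base magnitude.
    distance_factor = min(1.0, reference_distance_m / max(user_distance_m, 0.1))
    return base_magnitude * side_factor * distance_factor
```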
Referring now to
As may be seen in
In some embodiments, the device may also reconfigure its user interface into a simpler user interface having fewer commands with larger areas on the screen. For example, in one embodiment, the user's device is configured to operate as a remote control for the television. In one such embodiment, when the device determines that the user is likely not focused on the device, or is only partially focused on the device, the user interface may change to display a reduced set of controls (e.g. user interface 510), such that the user need not closely examine the device or use fine movements to effectuate a particular action. However, if the user changes posture or grip, the user interface may change to provide a greater number of controls or more intricate user interface capabilities, such as may be seen in user interface 530. For example, if the user is holding the device in two hands in a horizontal orientation, the device may determine that the user is highly focused on the device and may provide a more detailed user interface, such as providing additional options or a graphical keyboard or number pad for accepting user input.
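The following sketch illustrates one way a device might choose between a reduced and a detailed remote-control interface based on sensed grip and orientation, in the spirit of the behavior described above. The thresholds are hypothetical, and the identifiers simply borrow the labels of user interfaces 510 and 530 for readability.

```python
def select_remote_ui(is_grasped, orientation, num_hands):
    """Pick a remote-control UI layout from coarse posture information.

    orientation: "horizontal" or "vertical"
    num_hands: estimated number of hands holding the device (0, 1, or 2)
    Returns the identifier of the UI layout to display.
    """
    if not is_grasped:
        # User likely focused on the television: show a few large controls.
        return "ui_510_reduced"
    if num_hands == 2 and orientation == "horizontal":
        # User likely focused on the device: show detailed controls.
        return "ui_530_detailed"
    return "ui_510_reduced"
```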
While embodiments may provide a rearrangement of a user interface based on the user's detected posture, in some embodiments the user interface may also be rearranged based on a user's usage pattern of a device. For example, if a user frequently watches channel 31, the user interface may incorporate a button to quickly change the channel to channel 31. Or, in some embodiments, if the user rarely or never uses a particular feature, such as a mute button, or rarely changes the input source for the television, such buttons may be reduced in size, or moved onto a secondary screen of the user interface, such that the most frequently used controls are both presented on the primary user interface screen and sized such that they are easily identifiable and usable. In some embodiments, certain preferences associated with a user may be stored by the device. For example, if the user frequently watches one or more television programs, that information may be stored on the device such that if the user arrives in a new location, the device is able to determine the local television channels and present controls to the user to allow the user to quickly navigate to the user's favorite television shows, without needing to learn a new channel arrangement. In some embodiments, the device may provide a display of a guide control that provides information regarding the local playtimes of the user's favorite shows, thus allowing the user to learn and accommodate a different programming schedule.
In addition to, or instead of, modifying the visual appearance of the user interface, some embodiments may modify haptic effects associated with the user interface. For example, haptic effects may be added to certain user interface elements, such as frequently used buttons, which may allow a user to use the user interface without looking at it, or to warn the user before using an infrequently used interface element. In one embodiment, frequently used buttons or interface elements may be associated with stronger haptic effects, while infrequently used buttons may have weak or no associated haptic effects.
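A minimal sketch of assigning stronger haptic effects to more frequently used interface elements, as suggested above, might look like the following; the mapping is a hypothetical example rather than a prescribed scheme.

```python
def haptic_strength_for_button(use_count, max_use_count):
    """Return a relative haptic magnitude (0.0-1.0) for a button based on
    how often it has been used, so frequently used buttons feel stronger."""
    if max_use_count == 0:
        return 0.0
    return min(1.0, use_count / max_use_count)

# Example: a rarely used mute button vs. a heavily used channel-up button.
# haptic_strength_for_button(2, 40)   -> 0.05 (weak or negligible effect)
# haptic_strength_for_button(40, 40)  -> 1.0  (strong effect)
```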
In some embodiments, the user interface may also adjust to accommodate the user's preferred methods of performing certain tasks. For example, in one embodiment, the user may frequently issue voice commands to the device. Thus, the device may incorporate such voice commands with traditional input controls. For example, a user who dislikes using the keyboard on the device may instead speak the name of a desired program to watch and press a channel-change button. The device may then interpret the user's input to indicate that the user wishes to watch the program and then issue commands to the television or other device to power the device on (if it is not already) and to change the channel to the desired program.
By allowing the user interface to evolve and incorporate user preference information, the user may have an easier time adjusting to different locations or display devices. For example, the user may not need to research channel and guide information in different locations (e.g. if the user is a frequent traveler), or need to learn different remote controls or control interfaces for different devices.
Referring now to
The embodiment shown in
In block 604, the processor determines a haptic effect associated with the notification information. For example, in one embodiment, the processor 220 receives a notification comprising an identifier of a pre-installed haptic effect. The processor extracts the identifier from the notification and accesses haptic effect information associated with the identifier. For example, the haptic effect information may be stored in a computer-readable medium within the device, such as on a flash memory device. In some embodiments, the identifier may comprise a uniform resource locator (URL) indicating a location on a remote device where the haptic effect information may be found and retrieved.
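Purely as an illustration of block 604, the sketch below resolves a haptic effect from a notification: it first checks a local store of pre-installed effects and, failing that, retrieves the effect data from a URL carried in the notification. The function, store, and field names are hypothetical.

```python
import json
import urllib.request

# Hypothetical local store of pre-installed haptic effects, keyed by identifier.
LOCAL_EFFECT_LIBRARY = {}

def resolve_haptic_effect(notification):
    """Return haptic effect parameters for a notification (block 604)."""
    effect_id = notification.get("haptic_effect_id")
    if effect_id and effect_id in LOCAL_EFFECT_LIBRARY:
        return LOCAL_EFFECT_LIBRARY[effect_id]
    url = notification.get("effect_url")
    if url:
        # Retrieve effect information from the remote location and cache it
        # locally so later notifications can refer to it by identifier only.
        with urllib.request.urlopen(url) as response:
            effect = json.loads(response.read().decode("utf-8"))
        if effect_id:
            LOCAL_EFFECT_LIBRARY[effect_id] = effect
        return effect
    return None
```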
Further embodiments may comprise both an identifier and haptic effect information to be used to generate a haptic effect. In one such embodiment, such a notification may be used both to generate a haptic effect and to temporarily or permanently install a haptic effect on the device for later reuse. For example, a television series may have one or more haptic effects associated with it that may be installed onto a user's device by the smart television; after the effects, or library of effects, have been installed on the user's device, the television may subsequently refer to effects by identifier rather than re-transmitting the haptic effect information. In some embodiments, if a haptic identifier indicates a haptic effect that is not installed on the device, the device may transmit a signal to the television requesting the haptic effect information associated with the identifier, which may cause the television to provide the haptic effect information or to provide a URL from which the haptic effect information, or library of haptic effect information, may be retrieved and installed. For example, a user who enjoys a particular television series may have the series' haptic library installed on her device the first time she watches a show in the series. After the haptic effect has been determined, the method proceeds to block 606.
In block 606, the processor 220 generates and transmits a haptic signal to a haptic output device, such as haptic output device 240, the haptic signal configured to cause the haptic output device to output the haptic effect. For example, after retrieving haptic effect information associated with a notification signal, the processor may generate a haptic signal based on parameters from the haptic information, such as frequency, duration, magnitude, etc. In some embodiments, the processor 220 may generate a haptic signal identifying a haptic effect already installed within a haptic output device, such that the processor 220 need only identify the desired haptic effect to output. After generating the haptic signal, the processor 220 transmits the signal to the haptic output device to cause the haptic output device to output the haptic effect. After the processor 220 transmits the haptic signal, the method concludes, or it may return to either of blocks 602 or 604 to receive further notification signals, or to determine additional haptic effects, such as in the case where a notification signal comprises information associated with multiple haptic effects.
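Block 606 might be realized along the following lines: a haptic signal is built from the effect parameters (frequency, duration, magnitude) and handed to a haptic output device driver. The driver interface and the default parameter values shown here are assumptions made only for illustration.

```python
def generate_and_send_haptic_signal(effect, output_device):
    """Build a haptic signal from effect parameters and send it to a haptic
    output device (block 606). `output_device` is assumed to expose a
    play(signal) method; that interface is hypothetical."""
    if effect is None:
        return
    signal = {
        "frequency_hz": effect.get("frequency_hz", 175),
        "duration_ms": effect.get("duration_ms", 50),
        "magnitude": effect.get("magnitude", 0.8),
    }
    output_device.play(signal)
```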
Referring now to
The method of
In some embodiments, content may not comprise any haptic information. In one such embodiment, the television 310 may monitor the content stream for audio or video characteristics that might indicate certain events. For example, the television 310 may comprise one or more audio recognition components to identify certain audio events, such as running engines, creaking floorboards, screeching tires, suspenseful music, screaming or yelling, fighting, whispered speech, or other audio characteristics that may indicate one or more events occurring within the content. In some embodiments, the television 310 may be configured to monitor a content stream to identify, recognize, and analyze speech within a content stream. In some embodiments, the television 310 may be configured to recognize certain video information, such as planes, trains, cars, boats, beaches, explosions, fires, etc., or to learn to recognize people that frequently appear within video content, or to have information to allow recognition of prominent actors, which may allow the television to recognize main characters within the content. While monitoring a content stream, the method proceeds to block 704.
In block 704, the television 310 detects an event within the content stream. For example, while monitoring the content stream, in one embodiment, the television may detect an event by identifying certain information associated with the content stream. For example, as discussed above, information associated with a content stream may indicate the occurrence of an event, such as an explosion or a car accident. In some embodiments, an event may occur during a very brief period of time, such as during a single frame of video or over a period of a second or two, while in some embodiments, an event may span a significant amount of time or an entire scene (or scenes) within a content stream, such as a scene on a ship or in an airplane, where an event such as a running engine may be detected. In some embodiments, the television 310 may detect that an advertisement has interrupted a television program or that the content stream includes a featured product associated with an advertisement. For example, in one embodiment, the content stream may have associated information indicating an advertisement associated with a particular scene in a content stream, and the television 310 may detect that an advertising event has occurred.
In some embodiments, the television 310 may be configured to detect an event based on recognized audio or video. As discussed above, the television 310 may be configured to identify certain audio or video information, such as certain sounds or images. Further, in some embodiments, the television 310 may recognize events based on a correspondence between certain audio information and certain video information. For example, if the television detects a creaking floorboard and a darkened scene with a human silhouette, the television may determine a scary scene is occurring. In another embodiment, if the television detects the sound of a wave crashing and detects video of a beach, the television may determine that a relaxing scene is occurring. After detecting an event, the method proceeds to block 706.
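A toy illustration of detecting an event (block 704) from a correspondence between recognized audio and video features, as described above, follows. The label sets and rules are hypothetical stand-ins for whatever recognition components an embodiment actually employs.

```python
def detect_scene_event(audio_labels, video_labels):
    """Infer a scene-level event from recognized audio and video labels.

    audio_labels, video_labels: sets of strings produced by (hypothetical)
    audio and video recognition components.
    """
    if "creaking_floorboard" in audio_labels and "dark_silhouette" in video_labels:
        return "scary_scene"
    if "crashing_wave" in audio_labels and "beach" in video_labels:
        return "relaxing_scene"
    if "explosion" in audio_labels or "explosion" in video_labels:
        return "explosion"
    return None
```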
In block 706, the television 310 generates and transmits a notification signal to a user device 200. For example, in one embodiment, the television 310 generates a notification signal comprising a haptic signal. The haptic signal, in this embodiment, comprises an identification of a haptic effect to be output and a magnitude and duration of the effect to be output. In some embodiments, however, the haptic signal may comprise waveform information describing the effect to be output, or spatial information indicating a region on a device to which a frictional or deformation effect is to be applied, as well as information regarding tactile characteristics of a frictional or deformation effect that is to be applied. In some embodiments, a plurality of haptic effects may be incorporated within a notification signal, or a notification signal may transmit one or more haptic effects to be stored on the device 200 for later access. For example, as discussed above, a television series may have a certain set of haptic effects associated with it. If the television 310 detects that an episode of the television series is beginning, the television 310 may generate and transmit a notification signal to the device requesting information regarding whether the device has the haptic effect library associated with the television program stored within its memory. If the device responds negatively, the television 310 may generate and transmit a notification signal comprising part or all of the set of haptic effects, or the notification signal may comprise a URL of a location from which the haptic effect library may be accessed or retrieved.
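For block 706, the television-side logic might resemble the following sketch, which packages a detected event into a notification signal and sends it to a user device over a network socket. The JSON message format and TCP transport are assumptions made only for illustration; any suitable communications interface could be used.

```python
import json
import socket

def send_notification(event, device_address, device_port=5555):
    """Package a detected event as a notification signal and transmit it
    to a user device (block 706)."""
    notification = {
        "event_type": event["type"],
        "haptic_effect_id": event.get("haptic_effect_id"),
        "magnitude": event.get("magnitude", 1.0),
        "duration_ms": event.get("duration_ms", 100),
        "effect_url": event.get("effect_url"),
    }
    data = json.dumps(notification).encode("utf-8")
    # Hypothetical transport: one JSON message per TCP connection.
    with socket.create_connection((device_address, device_port), timeout=2.0) as sock:
        sock.sendall(data)
```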
In some embodiments, a notification signal may comprise information that complements or enhances content being displayed by the television. For example, in one embodiment, the television may generate and transmit a notification signal comprising a haptic effect that is configured to enhance the suspense in a scene, such as by providing a subtle vibration to the device associated with a person moving through a darkened house and then providing a sharp jolting effect when the villain appears in a doorway. In some embodiments, the notification signal may comprise a haptic signal configured to generate a haptic effect associated with an ambient condition within the content stream, such as a car engine or a boat sailing through rough water.
In some embodiments, a generated notification signal may comprise a URL of a haptic effect to be output, or of an advertisement to be displayed. For example, as discussed above, the television 310 may detect an event associated with an advertisement or with a product shown within a content stream. After detecting such an event, the television 310 may generate and transmit a notification signal comprising advertisement information. For example, in one embodiment, the notification signal may comprise a URL to a product's website, or the notification signal may comprise an advertisement or coupon to be displayed on the user's device. In some embodiments, the notification signal may comprise a haptic effect configured to draw the user's attention to an advertisement displayed on the user's device. After (or while) generating and transmitting the notification signal, the method may return to block 702 and the television may continue to monitor the content stream.
While the method of
Referring now to
The method of
In some embodiments, the device 200 may determine that the user rarely changes a video source on a television and only uses volume controls associated with a separate audio/visual receiver device. In one such embodiment, the device 200 may determine that the video source selector control should not be displayed on the primary interface screen, but should only be available through a menu of options available if the user swipes the primary screen to present a secondary (or tertiary, or other) set of controls. In addition, the device may select controls that initiate commands to a television 310, but only select volume controls that initiate commands to the A/V receiver, while not displaying volume controls for the television 310. In some embodiments, the device may determine that the user prefers to hold the device in her right hand and thus, controls should be arranged to accommodate a right-handed grip on the device.
In some embodiments, the device may determine a user preference based on sensed information about the device. For example, in some embodiments, the device 200 may comprise one or more sensors 290 capable of sensing an orientation or movement of the device, or whether the device is being grasped or not. Such sensor information may be used to determine an applicable user preference. For example, if a device determines that it is not being grasped by a user based on sensor information, the device may determine a first set of user preferences. If the device later receives a sensor signal indicating the device is being grasped, the device may determine a second set of user preferences is applicable. Still further sets of user preferences may be determined based on other sensed information, such as an orientation that indicates that the user is focused on the device as may be seen in
In block 804, the device 200 displays the user interface. For example, after determining the user's preferences, the device displays the controls at sizes and locations determined to correspond to the user's preferences. In the embodiment shown in
In block 806, the device 200 determines a usage pattern of the device or the user interface. For example, the device may monitor the user's use of one or more controls displayed on the device. If a user frequently uses a particular control, the device may increase a weighting score associated with the control. Or, if the user frequently watches particular programs or applies particular volume settings, the device may store information associated with such usage patterns. In one embodiment, the device may monitor other applications the user uses while watching content on television, such as a web browsing application or an application that provides information about actors, plot summaries, or reviews. In some embodiments, the device may monitor sensed information to determine usage patterns. For example, in one embodiment, the device may monitor an orientation of the device while the user watches content displayed on a television to learn whether a user is focusing on the device or the television. After determining a usage pattern, the method proceeds to block 808.
In block 808, the device modifies the user interface. For example, in one embodiment, the device 200 modifies the user interface by adding or removing controls within the user interface. In one embodiment, the device 200 determines that the user frequently accesses an application installed on the device 200 to retrieve detailed information about the casts of various shows or movies. The device 200 modifies the user interface to include a control to allow the user to quickly access the application. In one embodiment, the device 200 determines that the user usually sets the volume to a particular setting when viewing content, thus the device 200 re-sizes a volume control that would be used to adjust the volume to the desired level. For example, if the current volume is below the user's typical desired level, the user interface may be modified to increase the size of the “volume up” control to allow the user to easily increase the volume to the typical level. As discussed above, in some embodiments, the device 200 may modify the user interface by displaying an alternate user interface or by modifying or changing haptic effects associated with interface elements. For example, as discussed above with respect to
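Blocks 806 and 808 could be sketched as follows: control usage is tallied into weighting scores, and the interface is then rebuilt so that heavily used controls appear on the primary screen while the rest move to a secondary screen. The data structures and the number of primary slots are hypothetical.

```python
from collections import Counter

usage_counts = Counter()  # hypothetical per-control usage tally (block 806)

def record_control_use(control_id):
    """Increment the weighting score for a control each time it is used."""
    usage_counts[control_id] += 1

def rebuild_user_interface(all_controls, primary_slots=6):
    """Reorder controls by usage so the most-used ones occupy the primary
    screen and the rest move to a secondary screen (block 808)."""
    ordered = sorted(all_controls, key=lambda c: usage_counts[c], reverse=True)
    return {
        "primary_screen": ordered[:primary_slots],
        "secondary_screen": ordered[primary_slots:],
    }
```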
While the methods and systems herein are described in terms of software executing on various machines, the methods and systems may also be implemented as specifically-configured hardware, such as a field-programmable gate array (FPGA) specifically configured to execute the various methods. For example, embodiments can be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in a combination thereof. In one embodiment, a device may comprise a processor or processors. The processor comprises a computer-readable medium, such as a random access memory (RAM) coupled to the processor. The processor executes computer-executable program instructions stored in memory, such as executing one or more computer programs for editing an image. Such processors may comprise a microprocessor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), field programmable gate arrays (FPGAs), and state machines. Such processors may further comprise programmable electronic devices such as PLCs, programmable interrupt controllers (PICs), programmable logic devices (PLDs), programmable read-only memories (PROMs), electronically programmable read-only memories (EPROMs or EEPROMs), or other similar devices.
Such processors may comprise, or may be in communication with, media, for example computer-readable media, that may store instructions that, when executed by the processor, can cause the processor to perform the steps described herein as carried out, or assisted, by a processor. Embodiments of computer-readable media may comprise, but are not limited to, an electronic, optical, magnetic, or other storage device capable of providing a processor, such as the processor in a web server, with computer-readable instructions. Other examples of media comprise, but are not limited to, a floppy disk, CD-ROM, magnetic disk, memory chip, ROM, RAM, ASIC, configured processor, all optical media, all magnetic tape or other magnetic media, or any other medium from which a computer processor can read. The processor, and the processing, described may be in one or more structures, and may be dispersed through one or more structures. The processor may comprise code for carrying out one or more of the methods (or parts of methods) described herein.
The foregoing description of some embodiments of the invention has been presented only for the purpose of illustration and description and is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Numerous modifications and adaptations thereof will be apparent to those skilled in the art without departing from the spirit and scope of the invention.
Reference herein to “one embodiment” or “an embodiment” means that a particular feature, structure, operation, or other characteristic described in connection with the embodiment may be included in at least one implementation of the invention. The invention is not restricted to the particular embodiments described as such. The appearance of the phrase “in one embodiment” or “in an embodiment” in various places in the specification does not necessarily refer to the same embodiment. Any particular feature, structure, operation, or other characteristic described in this specification in relation to “one embodiment” may be combined with other features, structures, operations, or other characteristics described in respect of any other embodiment.