Haptics is a tactile and force feedback technology that engages a user's sense of touch through haptic effects such as vibrations, motions, and other forces and stimuli. Devices, such as mobile devices, gaming devices, touchscreen devices, and personal computers, can be configured to generate haptic effects. Haptic feedback can provide kinesthetic feedback (such as active and resistive force feedback) and/or tactile feedback (such as vibration, vibrotactile feedback, texture, heat, etc.) to a user. Haptic effects may be useful to alert the user to specific events or to provide realistic feedback to create greater sensory immersion within a simulated or virtual environment.
While a haptic enabled device can render consistent haptic effects, such intended haptic effects may be perceived differently by different users. They also may be perceived differently by the same user, depending on factors such as how the user interacts with the haptic enabled device, and/or various physical properties of the user, the haptic enabled device, and/or the surrounding environment. It may be desirable to enable the user to perceive haptic effects consistently and closer to the intended sensation of such haptic effects.
The present disclosure generally relates to adaptive haptic effect rendering based on identification of a dynamic system. Various aspects are described in this disclosure, which include, but are not limited to, the following aspects.
One aspect is a method for generating a haptic effect. The method includes receiving, in real time, a dynamic system parameter signal from an input device of a haptic enabled apparatus, the dynamic system parameter signal representative of one or more dynamic system parameters of a dynamic system; determining a haptic parameter modification value based on the dynamic system parameter signal; modifying at least one of a plurality of haptic data parameters based on the haptic parameter modification value; generating a modified haptic signal based on the haptic data parameters; and applying the modified haptic signal to a haptic actuator, thereby providing a haptic effect adapted to the dynamic system.
Another aspect is an apparatus for generating haptic effects. The apparatus includes an actuator, an actuator drive circuit configured to operate the actuator, an input device configured to monitor a dynamic system, and a processing device connected to the actuator drive circuit and the input device. The processing device operates to receive, in real time, a dynamic system parameter signal from the input device, the dynamic system parameter signal representative of one or more dynamic system parameters of the dynamic system; determine a haptic parameter modification value based on the dynamic system parameter signal; modify at least one of a plurality of haptic data parameters based on the haptic parameter modification value; generate a modified haptic signal based on the haptic data parameters; and transmit the modified haptic signal to the actuator drive circuit, the modified haptic signal enabling the actuator drive circuit to control the actuator, thereby providing a haptic effect adapted to the dynamic system.
Yet another aspect is a computer-readable storage medium comprising software instructions that, when executed, cause a haptic enabled apparatus to, while the haptic enabled apparatus is in use, receive a dynamic system parameter signal from an input device of the haptic enabled apparatus, the dynamic system parameter signal representative of one or more dynamic system parameters of a dynamic system; determine a haptic parameter modification value based on the dynamic system parameter signal; modify at least one of a plurality of haptic data parameters based on the haptic parameter modification value; generate a modified haptic signal based on the haptic data parameters; and operate a haptic actuator using the modified haptic signal, thereby providing a haptic effect adapted to the dynamic system.
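By way of illustration only, the receive-determine-modify-generate-apply flow shared by the aspects above may be sketched as follows. The helper names, the use of grip force as the monitored dynamic system parameter, and all numeric values are hypothetical assumptions; a real implementation would be tuned to the specific actuator and input device.

```python
import numpy as np

def determine_modification_value(grip_force_n, baseline_n=5.0):
    """Map a measured dynamic system parameter (here, a hypothetical grip
    force in newtons) to a gain on haptic amplitude. A tighter grip damps
    the vibration, so the gain rises above 1.0 to compensate."""
    return max(1.0, grip_force_n / baseline_n)

def render_adapted_haptic(haptic_params, grip_force_n, duration=0.05, rate=8000):
    """Modify the haptic data parameters and generate a modified haptic
    signal (a sampled waveform) for the actuator drive circuit."""
    gain = determine_modification_value(grip_force_n)
    params = dict(haptic_params)
    params["amplitude"] = min(1.0, params["amplitude"] * gain)  # clamp to drive limits
    t = np.arange(int(duration * rate)) / rate
    signal = params["amplitude"] * np.sin(2 * np.pi * params["frequency"] * t)
    return params, signal

params, signal = render_adapted_haptic({"amplitude": 0.4, "frequency": 175.0},
                                       grip_force_n=10.0)
```

In this sketch a grip force of twice the baseline doubles the drive amplitude (subject to the clamp), so the user's felt intensity stays close to the intended effect.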
Yet another aspect is a method for generating a haptic effect. The method includes: generating a dynamic system characterization model representative of a dynamic system, the dynamic system indicative of inputs received through an input device of a haptic enabled apparatus; receiving, in real time, an input device signal from the input device, the input device signal representative of dynamic system parameters of the dynamic system; updating the dynamic system characterization model based on the input device signal; modifying a haptic effect rendering model based on the updated dynamic system characterization model; generating a haptic signal based on the modified haptic effect rendering model; and controlling a haptic actuator using the haptic signal, thereby providing a haptic effect adapted to the dynamic system. In certain examples, the method may further include: storing the dynamic system parameters in the haptic enabled apparatus.
Yet another aspect is an apparatus for generating haptic effects. The apparatus includes an actuator, an actuator drive circuit configured to operate the actuator, an input device configured to monitor a dynamic system, and a processing device connected to the actuator drive circuit and the input device. The processing device is configured to generate a dynamic system characterization model representative of the dynamic system; receive, in real time, an input device signal from the input device, the input device signal representative of dynamic system parameters of the dynamic system; update the dynamic system characterization model based on the input device signal; modify a haptic effect rendering model based on the updated dynamic system characterization model; generate a haptic effect signal based on the modified haptic effect rendering model; and transmit the haptic effect signal to the actuator drive circuit, the haptic effect signal enabling the actuator drive circuit to control the actuator, thereby providing a haptic effect adapted to the dynamic system. In certain examples, the processing device may be further configured to store the dynamic system parameters in the haptic enabled apparatus.
Yet another aspect is a computer-readable storage medium comprising software instructions that, when executed, cause a haptic enabled apparatus to: store a dynamic system characterization model representative of a dynamic system; store a haptic effect rendering model; while the haptic enabled apparatus is in use, receive an input device signal from an input device associated with the haptic enabled apparatus, the input device signal representative of dynamic system parameters of the dynamic system; update the dynamic system characterization model based on the input device signal; store the updated dynamic system characterization model; modify the haptic effect rendering model based on the updated dynamic system characterization model; generate a haptic effect signal based on the modified haptic effect rendering model; and operate a haptic actuator using the haptic effect signal, thereby providing a haptic effect adapted to the dynamic system.
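One possible shape for the model-update flow recited in these aspects is sketched below, with deliberately simplified stand-ins for the characterization and rendering models. The exponential-smoothing update rule and the frequency values are illustrative assumptions, not part of the disclosure.

```python
class DynamicSystemModel:
    """Minimal stand-in for the dynamic system characterization model:
    tracks an estimate of the system's effective resonant frequency."""
    def __init__(self, resonant_hz=175.0):
        self.resonant_hz = resonant_hz

    def update(self, measured_hz, alpha=0.3):
        # Exponential smoothing of noisy estimates from the input device.
        self.resonant_hz += alpha * (measured_hz - self.resonant_hz)

class HapticRenderingModel:
    """Stand-in for the haptic effect rendering model: drives the
    actuator at the currently characterized resonant frequency."""
    def __init__(self):
        self.drive_hz = 175.0

    def adapt(self, system_model):
        self.drive_hz = system_model.resonant_hz

system = DynamicSystemModel()
renderer = HapticRenderingModel()
system.update(measured_hz=160.0)   # e.g., the user's grip lowered the resonance
renderer.adapt(system)
```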
In certain examples, the dynamic system parameters include object parameters representative of physical characteristics of the haptic enabled apparatus. The object parameters may include position, velocity and acceleration effects associated with the haptic enabled apparatus.
In certain examples, the dynamic system parameters include user parameters associated with a user's behavioral and physiological properties with respect to the haptic enabled apparatus. The user parameters may include a user grip strength, a user grip pattern, and a user skin sensitivity with respect to the haptic enabled apparatus.
In certain examples, the dynamic system parameters include environmental parameters representative of physical characteristics of an environment surrounding the haptic enabled apparatus.
Various embodiments will be described in detail with reference to the drawings, wherein like reference numerals represent like parts and assemblies throughout the several views. Reference to various embodiments does not limit the scope of the claims attached hereto. Additionally, any examples set forth in this specification are not intended to be limiting and merely set forth some of the many possible embodiments for the appended claims.
Whenever appropriate, terms used in the singular also will include the plural and vice versa. The use of “a” herein means “one or more” unless stated otherwise or where the use of “one or more” is clearly inappropriate. The use of “or” means “and/or” unless stated otherwise. Terms such as “comprise,” “comprises,” “comprising,” “include,” “includes,” “including,” “such as,” “has,” and “having” are interchangeable and not intended to be limiting. For example, the term “including” shall mean “including, but not limited to.”
In general, the present disclosure relates to systems and methods for generating haptic effects adapted to a dynamic system. In certain examples, the dynamic system includes various properties associated with a user's dynamic interaction with a haptic enabled apparatus. The dynamic system may further include a dynamic change in an environment surrounding the haptic enabled apparatus and the user thereof.
A user's perception of haptic rendering for a particular haptic enabled apparatus can change as the dynamic system constantly varies. By way of example, if a user tightens the grip on a smartphone, the effective mass and other dynamic properties of the dynamic system, including at least the user's hand and the smartphone, can change the user's feel of the haptic effect generated from the smartphone. The systems and methods of the present disclosure operate to monitor a dynamic change in the dynamic system and automatically modify haptic rendering in real time so that the user can feel consistent haptic effects even though the factors that affect the haptic effect may change. The haptic signal controls the haptic effect adapted to a change in the dynamic system. As such, the haptic rendering is dynamically updated in response to a status of the dynamic system.
The haptic enabled apparatus 102 includes an adaptive haptic effect rendering device 104. The adaptive haptic effect rendering device 104 operates to monitor a dynamic system 106 and generate a haptic effect which is dynamically adapted to a change in the dynamic system 106, thereby providing a consistent, effective haptic effect to the user's perception.
In at least some embodiments, the dynamic system 106 indicates one or more inputs received through an input device of the haptic enabled apparatus 102, such as an input device 162 and/or a dynamic system monitoring device 152 illustrated in
The dynamic system 106 changes as physical properties of the haptic enabled apparatus 102, the user's interaction with the apparatus 102, and/or the environment of the apparatus 102 vary. As a result, a resonant frequency of the apparatus 102 may also change because of, for example, the user or other surrounding elements that are in contact with, or arranged adjacent to, the apparatus 102. In some embodiments, the changed resonant frequency can determine a method of adjusting an operation of a haptic actuator (e.g., how to change a frequency of the haptic data) to provide haptic rendering adapted to the change in the dynamic system.
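The dependence of resonant frequency on coupled mass can be illustrated with the textbook spring-mass relation f = (1/2π)√(k/m): anything that raises the effective mass of the vibrating system lowers its resonance. The stiffness and mass figures below are hypothetical and chosen only to produce representative LRA-scale frequencies.

```python
import math

def resonant_frequency(stiffness_n_per_m, effective_mass_kg):
    """Resonant frequency of a spring-mass model of the actuator plus
    whatever is mechanically coupled to it: f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(stiffness_n_per_m / effective_mass_kg) / (2 * math.pi)

# Hypothetical values: the same spring, with and without extra coupled mass
# from a tightened grip.
free = resonant_frequency(stiffness_n_per_m=12000.0, effective_mass_kg=0.010)
gripped = resonant_frequency(stiffness_n_per_m=12000.0, effective_mass_kg=0.015)
```

Shifting the drive frequency toward the new resonance is one way the rendering can be adapted to the changed dynamic system.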
The dynamic system 106 may be represented with a plurality of dynamic system parameters 120, as illustrated in
The object parameters 122 include parameters representative of physical characteristics of the haptic enabled apparatus 102. Such physical characteristics of the haptic enabled apparatus 102 can influence the user's sensation of a haptic effect generated from the haptic enabled apparatus 102. In some embodiments, the haptic enabled apparatus 102 includes one or more devices (e.g., attachment devices) attached or coupled to the apparatus 102, such as a head-mounted display, a controller, or a screen. Therefore, the object parameters 122 also can indicate physical characteristics of the haptic enabled apparatus 102 and other devices associated with the apparatus 102. For example, any device, such as a controller or a screen, coupled to the haptic enabled apparatus 102 can determine the object parameters 122.
In at least some embodiments, the object parameters 122 include position, velocity, and acceleration of the haptic enabled apparatus 102. Further, the object parameters 122 include stiffness, damping, effective mass, acceleration, friction, or any other physical properties associated with the haptic enabled apparatus 102. In some embodiments, the position, velocity, and acceleration effects are associated with stiffness, damping, and inertia. In addition or alternatively, the object parameters 122 include a location, an arrangement, an orientation, and any other positional information associated with the haptic enabled apparatus 102. By way of example, the user may feel a haptic effect generated from the haptic enabled apparatus 102 differently depending on where the haptic enabled apparatus 102 is placed, such as when the apparatus 102 is laid on a table or when the apparatus 102 is placed in a plastic case. In addition or alternatively, the object parameters 122 include a shape or any other structural properties of the haptic enabled apparatus 102. In addition or alternatively, the object parameters 122 include product specifications of the haptic enabled apparatus 102 (including any auxiliary devices attachable to the apparatus 102, such as a head-mounted display or a controller as illustrated in
The user parameters 124 include parameters associated with the user's behavior with respect to the haptic enabled apparatus 102, and/or the user's physiological properties in contact with the haptic enabled apparatus 102. Similar to the object parameters 122, the user parameters 124 can influence the user's sensation of a haptic effect generated from the haptic enabled apparatus 102. For example, the user parameters 124 can indicate physical properties of the user's body in contact with the haptic enabled apparatus 102. In at least some embodiments, the user parameters 124 include a user's grip strength or force with respect to the apparatus 102, a user's grip pattern with respect to the apparatus 102 (e.g., holding with one hand or two hands), a user's skin sensitivity, a hand size, a finger size, a user's height, a user's posture (e.g., the user's sitting or standing while using the apparatus 102), a user's movement (e.g., the user's walking or running while using the apparatus 102), presence of a glove in hand, or any other user related properties which may influence the user's perception of haptic rendering.
In at least some embodiments, the user parameters 124 can further include the user's medical information, such as age, disability, illness, or any other information that may affect the user's feeling of haptic rendering.
The environmental parameters 126 include parameters associated with the environment 108 that surrounds the apparatus 102 and/or the user U using the apparatus 102. The environmental parameters 126 can influence the user's sensation of a haptic effect generated from the haptic enabled apparatus 102. Examples of environmental parameters 126 include weather information (e.g., temperature, humidity, precipitation, cloud cover, etc.), darkness (e.g., amount of light), loudness, geographic information (e.g., elevation and altitude), atmospheric pressure, or any other environmental factors that may affect the user's sensation of haptic rendering.
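The three parameter groups described above might be organized as a simple record, as sketched below. The field names, units, and defaults are illustrative assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field

@dataclass
class ObjectParameters:
    """Physical characteristics of the haptic enabled apparatus."""
    position: tuple = (0.0, 0.0, 0.0)
    velocity: tuple = (0.0, 0.0, 0.0)
    effective_mass_kg: float = 0.15

@dataclass
class UserParameters:
    """Behavioral and physiological properties of the user."""
    grip_force_n: float = 0.0
    grip_pattern: str = "one_hand"
    skin_sensitivity: float = 1.0

@dataclass
class EnvironmentalParameters:
    """Physical characteristics of the surrounding environment."""
    temperature_c: float = 20.0
    humidity_pct: float = 50.0

@dataclass
class DynamicSystemParameters:
    """Object, user, and environmental parameters gathered in one record."""
    object_params: ObjectParameters = field(default_factory=ObjectParameters)
    user_params: UserParameters = field(default_factory=UserParameters)
    environmental_params: EnvironmentalParameters = field(
        default_factory=EnvironmentalParameters)

params = DynamicSystemParameters()
params.user_params.grip_force_n = 12.5  # updated from a monitored sensor reading
```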
In some embodiments, the haptic enabled apparatus 102 can be a single device. In other embodiments, the haptic enabled apparatus 102 can collectively be a set of devices connected together.
In this embodiment, the haptic enabled apparatus 102 includes a bus 140, a processor 142, an input/output (I/O) controller 144, memory 146, a network interface controller (NIC) 148, a user interface 150, a dynamic system monitoring device 152, an actuator drive circuit 154, a haptic actuator 156, and a dynamic system characterization database 158.
In some embodiments, the elements, devices, and components of the apparatus 102, as illustrated in
The bus 140 includes conductors or transmission lines for providing a path to transfer data between the components in the apparatus 102 including the processor 142, the I/O controller 144, the memory 146, the NIC 148, the dynamic system monitoring device 152, and the actuator drive circuit 154. The bus 140 typically comprises a control bus, address bus, and data bus. However, the bus 140 can be any bus or combination of busses, suitable to transfer data between components in the apparatus 102.
The processor 142 can be any circuit configured to process information and can include any suitable analog or digital circuit. The processor 142 also can include a programmable circuit that executes instructions. Examples of programmable circuits include microprocessors, microcontrollers, application specific integrated circuits (ASIC), programmable logic arrays (PLA), field programmable gate arrays (FPGA), or any other processor or hardware suitable for executing instructions. In various embodiments, the processor 142 can be a single unit or a combination of two or more units. If the processor 142 includes two or more units, the units can be physically located in a single controller or in separate devices.
The processor 142 may be the same processor that operates the entire apparatus 102, or may be a separate processor. The processor 142 can decide what haptic effects are to be played and the order in which the effects are played based on high level parameters. In general, the high level parameters that define a particular haptic effect include magnitude, frequency and duration. Low level parameters such as streaming motor commands could also be used to determine a particular haptic effect.
The processor 142 receives signals or data from the input device 162 and outputs control signals to drive the actuator drive circuit 154. Data received by the processor 142 can be any type of parameters, instructions, flags, or other information that is processed by the processors, program modules, and other hardware disclosed herein.
The I/O controller 144 is circuitry that monitors operation of the apparatus 102 and peripheral or external devices such as the user interface 150. The I/O controller 144 also manages data flow between the apparatus 102 and the peripheral devices and frees the processor 142 from details associated with monitoring and controlling the peripheral devices. Examples of other peripheral or external devices with which the I/O controller 144 can interface include external storage devices, monitors, input devices such as controllers, keyboards and pointing devices, external computing devices, antennas, other articles worn by a person, and any other remote devices.
The memory 146 can be any type of storage device or computer-readable medium such as random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory, magnetic memory, optical memory, or any other suitable memory technology. The memory 146 also can include a combination of volatile and nonvolatile memory. The memory 146 stores instructions executed by the processor 142. The memory 146 may also be located internal to the processor 142, or any combination of internal and external memory.
The network interface controller (NIC) 148 is in electrical communication with a network to provide communication (either wireless or wired) between the apparatus 102 and remote devices. Communication can be according to any wireless transmission technique, including standards such as Bluetooth, cellular standards (e.g., CDMA, GPRS, GSM, 2.5G, 3G, 3.5G, 4G), WiGig, IEEE 802.11a/b/g/n/ac, or IEEE 802.16 (e.g., WiMax). The NIC 148 also can provide wired communication between the apparatus 102 and remote devices through wired connections using any suitable port and connector for transmitting data and according to any suitable standard such as RS-232, USB, FireWire, Ethernet, MIDI, eSATA, or Thunderbolt.
The user interface 150 can include an input device 162 and an output device 164. The input device 162 includes any device or mechanism through which a user can input parameters, commands, and other information into the apparatus 102. In at least some embodiments, the input device 162 is configured to monitor or detect one or more events associated with the haptic enabled apparatus 102 or a user of the haptic enabled apparatus 102, or one or more events performed by the user, of which the user can be informed with a haptic feedback. The input device 162 is any device that inputs a signal into the processor 142.
Examples of the input device 162 include touchscreens, touch sensitive surfaces, cameras, mechanical inputs such as buttons and switches, and other types of input components, such as a mouse, touchpad, mini-joystick, scroll wheel, trackball, game pads or game controllers. Other examples of the input device 162 include a control device such as a key, button, switch, or other type of user interface. Yet other examples of the input device 162 include a transducer that inputs a signal into the processor 142. Examples of transducers that can be used as the input device 162 include one or more antennas and sensors. For example, the input device 162 includes the dynamic system monitoring device 152 as described herein. In other examples, the dynamic system monitoring device 152 includes the input device 162. Yet other examples of the input device 162 include removable memory readers for portable memory, such as flash memory, magnetic memory, optical memory, or any other suitable memory technology.
The output device 164 includes any device or mechanism that presents information to a user in various formats, such as visual and audible formats. Examples of output device 164 include display screens, speakers, lights, and other types of output components. The output device 164 can also include removable memory readers. In one embodiment, the input device 162 and the output device 164 are integrally formed, such as a touch-sensitive display screen.
With reference still to
The dynamic system monitoring device 152 includes one or more sensors of various types, which may be incorporated in the apparatus 102 or connected to the apparatus 102. In some embodiments, the dynamic system monitoring device 152 can include the input device 162 of the apparatus 102. The dynamic system monitoring device 152 can also be referred to as the input device. Sensors can be any instruments or other devices that output signals in response to receiving stimuli. The sensors can be hardwired to the processor or can be connected to the processor wirelessly. The sensors can be used to detect or sense a variety of different conditions, events, environmental conditions, the operation or condition of the apparatus 102, the presence of other people or objects, or any other condition or thing capable of stimulating sensors.
Examples of sensors include acoustical or sound sensors such as microphones; vibration sensors; chemical and particle sensors such as breathalyzers, carbon monoxide and carbon dioxide sensors, and Geiger counters; electrical and magnetic sensors such as voltage detectors or hall-effect sensors; flow sensors; navigational sensors or instruments such as GPS receivers, altimeters, gyroscopes, magnetometers or accelerometers; position, proximity, and movement-related sensors such as piezoelectric materials, rangefinders, odometers, speedometers, shock detectors; imaging and other optical sensors such as charge-coupled devices (CCD), CMOS sensors, infrared sensors, and photodetectors; pressure sensors such as barometers, piezometers, and tactile sensors; force sensors such as piezoelectric sensors and strain gauges; temperature and heat sensors such as thermometers, calorimeters, thermistors, thermocouples, and pyrometers; proximity and presence sensors such as motion detectors, triangulation sensors, radars, photo cells, sonars, and hall-effect sensors; biochips; biometric sensors such as blood pressure sensors, pulse/ox sensors, blood glucose sensors, and heart monitors. Additionally, sensors can be formed with smart materials, such as piezo-electric polymers, which in some embodiments function as both a sensor and an actuator.
Various embodiments can include a single sensor or can include two or more sensors of the same or different types. Additionally, various embodiments can include different types of sensors.
The actuator drive circuit 154 is a circuit that receives a haptic signal (which is also referred to herein as a control signal) from the actuator drive module 178. The haptic signal embodies haptic data associated with haptic effects, and the haptic data defines parameters the actuator drive circuit 154 uses to generate an actuator drive signal. In exemplary embodiments, such parameters relate to, or are associated with, electrical characteristics. Examples of electrical characteristics that can be defined by the haptic data include frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event. The actuator drive signal is applied to the actuator 156 to cause one or more haptic effects.
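A subset of these haptic data parameters (frequency, amplitude, duration, attack time, and fade time) can be rendered into a drive waveform as sketched below. This is a minimal illustration assuming a sine carrier and a linear attack/sustain/fade envelope; actual drive circuits and parameter sets may differ.

```python
import numpy as np

def build_drive_signal(frequency=175.0, amplitude=1.0, duration=0.1,
                       attack=0.02, fade=0.02, rate=8000):
    """Render haptic data parameters into an actuator drive waveform:
    a sine carrier shaped by a linear attack/sustain/fade envelope."""
    n = int(duration * rate)
    t = np.arange(n) / rate
    # Ramp up over `attack` seconds, hold at 1.0, ramp down over `fade`.
    envelope = np.minimum(1.0, np.minimum(t / attack, (duration - t) / fade))
    envelope = np.clip(envelope, 0.0, 1.0)
    return amplitude * envelope * np.sin(2 * np.pi * frequency * t)

signal = build_drive_signal()
```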
The actuator 156, which also is referred to herein as a haptic actuator or a haptic output device, operates to generate haptic effects. The actuator 156 is controlled by the processor 142 that executes the actuator drive module 178, which sends a haptic signal to the actuator drive circuit 154. The actuator drive circuit 154 then generates and applies an actuator drive signal to the actuator 156 to drive the actuator 156. When applied to the actuator 156, an actuator drive signal causes the actuator 156 to generate haptic effects by activating and braking the actuator 156.
The actuator 156 can be of various types. In the illustrated embodiments, the actuator is a resonant actuator, such as a Linear Resonant Actuator (LRA) in which a mass attached to a spring is driven back and forth. In other embodiments, the actuator is a solenoid resonant actuator (SRA).
Other types of electromagnetic actuators are also used, such as an Eccentric Rotating Mass (ERM) in which an eccentric mass is moved by a motor, or actuators made of a "smart material" such as piezoelectric materials, electro-active polymers, or shape memory alloys. Actuators 156 also broadly include non-mechanical or non-vibratory devices, such as those that use electrostatic friction (ESF) or ultrasonic surface friction (USF), those that induce acoustic radiation pressure with an ultrasonic haptic transducer, those that use a haptic substrate and a flexible or deformable surface, those that provide projected haptic output such as a puff of air using an air jet, and so on.
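For a resonant actuator such as an LRA, activating and braking can be illustrated by driving at resonance and then appending a short phase-inverted segment to counteract the moving mass, as sketched below. The cycle counts, frequency, and sample rate are hypothetical.

```python
import numpy as np

def lra_drive_with_braking(frequency=175.0, active_cycles=8, brake_cycles=2,
                           rate=8000):
    """Drive an LRA at its resonant frequency, then append a few
    phase-inverted cycles to actively brake the moving mass."""
    samples_per_cycle = rate / frequency
    n_active = int(active_cycles * samples_per_cycle)
    n_brake = int(brake_cycles * samples_per_cycle)
    t_active = np.arange(n_active) / rate
    t_brake = np.arange(n_active, n_active + n_brake) / rate
    drive = np.sin(2 * np.pi * frequency * t_active)
    brake = -np.sin(2 * np.pi * frequency * t_brake)  # 180 degrees out of phase
    return np.concatenate([drive, brake])

waveform = lra_drive_with_braking()
```

Active braking of this kind shortens the ring-down of the spring-mass system, giving crisper effect boundaries than simply cutting the drive.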
The apparatus 102 may include more than one actuator 156, and each actuator may include a separate actuator drive circuit 154, all coupled to the processor 142. In embodiments with more than one actuator, each actuator can have a different output capability in order to create a wide range of haptic effects on the device.
The dynamic system characterization database 158 operates to store various data from the dynamic system characterization module 174 and/or the haptic effect rendering module 176. For example, the dynamic system characterization database 158 stores data of characteristics or properties of the dynamic system 106, such as dynamic system characterization model 200 and/or dynamic system parameters 120 (as shown in
In some embodiments, the dynamic system characterization database 158 is configured as a secondary storage device (such as a hard disk drive, flash memory cards, digital video disks, compact disc read only memories, digital versatile disk read only memories, random access memories, or read only memories) for storing digital data. The secondary storage device is connected to the bus 140. The secondary storage device and its associated computer readable media provide nonvolatile storage of computer readable instructions (including application programs and program modules), data structures, and other data for the apparatus 102. Although it is illustrated that the secondary storage device for the database 158 is included in the apparatus 102, it is understood that the secondary storage device is a separate device from the apparatus 102 in other embodiments. In yet other embodiments, the database 158 is included in the memory 146.
Referring still to
The user input acquisition module 172 includes instructions that, when executed by the processor 142, cause the processor 142 to receive user inputs of one or more parameters associated with haptic effects or haptic effect modifiers. The user input acquisition module 172 can communicate with the input device 162 of the user interface 150 and enable a user to input such parameters through the input device 162. By way of example, the user input acquisition module 172 provides a graphical user interface on a display screen (i.e., the input device 162) that allows a user to enter or select one or more parameters for haptic effects.
The dynamic system characterization module 174 includes instructions that, when executed by the processor 142, cause the processor 142 to receive signals from the dynamic system monitoring device 152 and obtain the dynamic system parameters 120 based on the received signals. The dynamic system characterization module 174 further operates to generate and update a dynamic system characterization model 200 (
The haptic effect rendering module 176 includes instructions that, when executed by the processor 142, cause the processor 142 to render haptic effects on the haptic enabled apparatus 102. In at least some embodiments, the haptic effect rendering module 176 generates haptic data or a haptic effect rendering model 210 (
The actuator drive module 178 includes instructions that, when executed by the processor 142, cause the processor 142 to generate control signals for the actuator drive circuit 154. The actuator drive module 178 can also determine feedback from the actuator 156 and adjust the control signals accordingly.
The communication module 180 facilitates communication between the apparatus 102 and remote devices. Examples of remote devices include computing devices, sensors, actuators, networking equipment such as routers and hotspots, vehicles, exercise equipment, and smart appliances. Examples of computing devices include servers, desktop computers, laptop computers, tablets, smartphones, home automation computers and controllers, and any other device that is programmable. The communication can take any form suitable for data communication including communication over wireless or wired signal or data paths. In various embodiments, the communication module may configure the apparatus 102 as a centralized controller of the system 100 or other remote devices, as a peer that communicates with other computing devices or other remote devices, or as a hybrid centralized controller and peer such that the controller can operate as a centralized controller in some circumstances and as a peer in other circumstances.
Alternative embodiments of the program modules are possible. For example, some alternative embodiments might have more or fewer program modules than the modules illustrated in
In some embodiments, the adaptive haptic effect rendering device 104 as illustrated in
In other embodiments, the adaptive haptic effect rendering device 104 can be configured separately from the haptic enabled apparatus 102. For example, the adaptive haptic effect rendering device 104 is configured as part of a server computing device that communicates with the haptic enabled apparatus 102 via a network.
At operation 302, the haptic enabled apparatus 102 obtains a dynamic system characterization model 200. In some embodiments, the haptic enabled apparatus 102 operates to generate a dynamic system characterization model 200. In other embodiments, the dynamic system characterization model 200 is generated using a separate computing device (e.g., a server computing device) and provided to the haptic enabled apparatus 102.
The dynamic system characterization model 200 is configured to represent the dynamic system 106. In at least some embodiments, the dynamic system characterization model 200 is built based on the dynamic system parameters 120 and can be configured in the form of a transformation matrix that correlates the dynamic system parameters 120 with the haptic data parameters 202 of the haptic data, as further illustrated in
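The transformation-matrix formulation above can be sketched as follows. This is an illustrative example only, not code from the disclosure; the choice of amplitude and frequency as the haptic data parameters, the coefficient values, and the parameter ordering are all hypothetical.

```python
# Illustrative sketch: a dynamic system characterization model as a
# transformation matrix that maps dynamic system parameters onto
# modification values for haptic data parameters.

def characterize(transform, system_params):
    """Multiply the transformation matrix by the dynamic system
    parameter vector, yielding one modification value per haptic
    data parameter."""
    return [sum(w * p for w, p in zip(row, system_params)) for row in transform]

# Rows correspond to haptic data parameters (here: amplitude, frequency);
# columns correspond to dynamic system parameters (here: mass, grip strength).
transform = [
    [0.5, 1.0],   # amplitude scales with both mass and grip strength
    [0.2, 0.0],   # frequency scales with mass only
]
system_params = [1.0, 0.8]  # e.g., normalized mass and grip strength

mods = characterize(transform, system_params)
# mods[0] -> amplitude modification, mods[1] -> frequency modification
```

The matrix layout makes the correlation explicit: updating a single matrix entry changes how one sensed parameter influences one haptic data parameter.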
In at least some embodiments, the characterization data of the dynamic system 106, which include the dynamic system parameters 120, can be stored in a database, such as the database 158 (
In at least some embodiments, the dynamic system characterization model 200 is configured to render a standard haptic effect representation by default. For example, the dynamic system characterization model 200 is configured such that standard haptic effects are rendered until any change to the dynamic system parameters 120 of the dynamic system 106 is detected and thereby the dynamic system characterization model 200 is updated.
At operation 304, the haptic enabled apparatus 102 operates to detect characteristics of the dynamic system 106. In some embodiments, the haptic enabled apparatus 102 monitors a change to the dynamic system parameters 120 in real time when the haptic enabled apparatus 102 is used or manipulated by a user U. In other embodiments, the haptic enabled apparatus 102 monitors the dynamic system parameters 120 periodically (or at predetermined intervals). In yet other embodiments, the haptic enabled apparatus 102 obtains the dynamic system parameters 120 when a change to the dynamic system parameters 120 is detected. In yet other embodiments, the haptic enabled apparatus 102 detects the dynamic system parameters 120 at random times.
In at least some embodiments, the haptic enabled apparatus 102 operates the dynamic system monitoring device 152 to dynamically monitor the dynamic system parameters 120 of the dynamic system 106. For example, the dynamic system monitoring device 152 operates to detect the dynamic system parameters 120 or a change thereto, and generates a sensor signal (also referred to herein as an input device signal) representative of the dynamic system parameters 120. The haptic enabled apparatus 102 operates to receive the sensor signal from the dynamic system monitoring device 152 and process the sensor signal to obtain the dynamic system parameters 120.
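The monitoring flow described above (detect the dynamic system parameters or a change thereto, then generate an input device signal) can be sketched as a simple polling loop. The sensor source, the change threshold, and the sampled values below are hypothetical placeholders.

```python
# Hypothetical sketch of dynamic system monitoring: poll a sensor and
# record an update only when a parameter changes beyond a threshold,
# approximating change-triggered detection over periodic sampling.

def monitor(read_sensor, samples, threshold=0.05):
    """Return the sequence of detected parameter updates."""
    updates = []
    last = None
    for _ in range(samples):
        value = read_sensor()
        if last is None or abs(value - last) > threshold:
            updates.append(value)
            last = value
    return updates

# Simulated sensor readings, e.g., a normalized grip-strength signal.
readings = iter([0.50, 0.51, 0.70, 0.70, 0.40])
detected = monitor(lambda: next(readings), samples=5)
# Only significant changes are kept: 0.50, 0.70, 0.40
```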
As such, the haptic enabled apparatus 102 actively uses one or more sensors to monitor changes in the dynamic system 106, and, as described below, implements an algorithm to constantly re-characterize the dynamic system 106. As described herein, a few examples of the sensors include accelerometers to parameterize mass, cameras to parameterize stiffness, and force sensors to measure grip strength and biometrics (e.g., to monitor muscle tension).
As described herein, the sensed properties (e.g., the dynamic system parameters) of the dynamic system 106 can include physical properties (e.g., mass, friction, and damping parameters), abstract system characteristic properties (e.g., amplitude and controller settings), and user characteristics (e.g., the user's grip strength (such as a weak grip or a tight grip), the user's likely sensitivity level to vibrations at different frequencies at different body sites, and the user's skin conductance level (e.g., a change in conductance as a user touches a touch screen with a bare finger or a gloved finger)).
Regarding the user characteristics, by way of example, if the Meissner corpuscles in a user's right hand have been identified or detected to be less sensitive to haptic feedback than the Pacinian corpuscles in the same hand, the dynamic system associated with the user (or the user's right hand) is created and/or modified such that lower frequency vibrotactile effects are amplified more than higher frequency vibrotactile effects when the haptic effect is to be generated at the user's right hand. As described herein, such user characteristics can be identified or detected in various manners, such as using sensing devices of the haptic enabled apparatus or using the user's medical records.
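One possible reading of this body-site sensitivity adaptation, sketched with a hypothetical two-band (low/high frequency) effect representation and hypothetical sensitivity values:

```python
# Illustrative sketch (assumed representation): scale each frequency
# band of a vibrotactile effect by the user's sensed sensitivity at the
# target body site. A less sensitive low-frequency (Meissner) channel
# receives a larger gain than the high-frequency (Pacinian) channel.

def adapt_gains(band_amplitudes, sensitivity):
    """Boost bands where sensitivity is low; sensitivity values are
    relative, with 1.0 meaning nominal sensitivity."""
    return {band: amp / sensitivity[band] for band, amp in band_amplitudes.items()}

effect = {"low": 0.4, "high": 0.4}       # nominal band amplitudes
right_hand = {"low": 0.5, "high": 1.0}   # low-frequency channel half as sensitive
adapted = adapt_gains(effect, right_hand)
# low band amplified to 0.8, high band unchanged at 0.4
```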
At operation 306, the haptic enabled apparatus 102 operates to update the dynamic system characterization model 200 based on the detected dynamic system parameters 120. In some embodiments, the dynamic system characterization model 200 can be updated based on the received sensor signal representative of the detected dynamic system parameters 120. The updated dynamic system characterization model 200 is used to transform standard haptic data (i.e., a standard haptic rendering model), thereby rendering more appropriate haptic effects for the changed dynamic system. Standard haptic data can also be referred to herein as base haptic data or universal haptic data.
In embodiments where the characterization data of the dynamic system 106 is stored in the database 158, the database 158 can be updated according to the detected dynamic system parameters 120.
At operation 308, the haptic enabled apparatus 102 operates to modify the haptic effect rendering model 210 (also referred to herein as the haptic data) based on the updated dynamic system characterization model 200. The haptic effect rendering model 210 can define the haptic data parameters 202. Some examples of haptic data parameters 202 relate to, or are associated with, characteristics of the haptic drive signals, such as frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event. As described herein and illustrated in
An example of the haptic effect rendering model 210 is represented as Equation (1) below:
τ = M_acc·θ̈ + C_vel−·sgn(θ̇−) + B_vel−·θ̇− + C_vel+·sgn(θ̇+) + B_vel+·θ̇+ + A_pos·sin(θ/P_pos + S_pos)   (1)
In some embodiments, Equation (1) is used for a rotary control, such as a torque control or a radian control. For example, where a vertical dial or knob is associated with a haptic effect as the dial or knob is spun with a finger (e.g., on a touch-sensitive display screen), the haptic effect may be rendered using Equation (1). As described herein, the output of Equation (1) (e.g., a torque value) is used to generate a haptic signal of various characteristics, such as frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event. The haptic signal is then applied to the actuator to cause one or more haptic effects.
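Under stated assumptions (hypothetical coefficient values; sgn taken as the standard signum function, with the velocity split into its negative and positive parts so that each direction engages its own friction and damping terms), Equation (1) might be evaluated as:

```python
import math

# Sketch of Equation (1) for a rotary control: torque from an inertia
# term, direction-dependent Coulomb friction (C) and viscous damping
# (B), and a sinusoidal position-based detent. All coefficient values
# below are hypothetical.

def rotary_torque(theta, theta_dot, theta_ddot,
                  M_acc, C_neg, B_neg, C_pos, B_pos,
                  A_pos, P_pos, S_pos):
    vel_neg = theta_dot if theta_dot < 0 else 0.0   # negative velocity part
    vel_pos = theta_dot if theta_dot > 0 else 0.0   # positive velocity part
    sgn = lambda v: (v > 0) - (v < 0)               # signum function
    return (M_acc * theta_ddot
            + C_neg * sgn(vel_neg) + B_neg * vel_neg
            + C_pos * sgn(vel_pos) + B_pos * vel_pos
            + A_pos * math.sin(theta / P_pos + S_pos))

# Spinning the dial in the positive direction engages only the
# positive-velocity friction and damping terms.
tau = rotary_torque(theta=0.0, theta_dot=2.0, theta_ddot=0.0,
                    M_acc=0.01, C_neg=0.05, B_neg=0.02,
                    C_pos=0.05, B_pos=0.02,
                    A_pos=0.1, P_pos=0.25, S_pos=0.0)
# tau = 0.05*1 + 0.02*2.0 + 0.1*sin(0) = 0.09
```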
Another example of the haptic effect rendering model 210 is represented as Equation (2) below, which can be used for a linear force implementation:
F = M·ẍ + B_vel−·ẋ− + B_vel+·ẋ+ + C_vel−·sgn(ẋ−) + C_vel+·sgn(ẋ+) + k·x   (2)
where F = force (N).
Similarly, the output of Equation (2) (e.g., a force value) is used to generate a haptic signal of various characteristics, such as frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event. The haptic signal is then applied to the actuator to cause one or more haptic effects.
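A corresponding sketch of Equation (2), again with hypothetical coefficient values; the application of sgn to the Coulomb friction terms is assumed here by analogy with Equation (1):

```python
# Sketch of Equation (2) for a linear force implementation: an inertia
# term, direction-dependent damping (B) and Coulomb friction (C), and
# a spring term k*x. All coefficient values below are hypothetical.

def linear_force(x, x_dot, x_ddot, M, B_neg, B_pos, C_neg, C_pos, k):
    vel_neg = x_dot if x_dot < 0 else 0.0   # negative velocity part
    vel_pos = x_dot if x_dot > 0 else 0.0   # positive velocity part
    sgn = lambda v: (v > 0) - (v < 0)       # signum function
    return (M * x_ddot
            + B_neg * vel_neg + B_pos * vel_pos
            + C_neg * sgn(vel_neg) + C_pos * sgn(vel_pos)
            + k * x)

# Moving in the negative direction against a spring preload.
F = linear_force(x=0.01, x_dot=-0.5, x_ddot=0.0,
                 M=0.2, B_neg=0.1, B_pos=0.1,
                 C_neg=0.05, C_pos=0.05, k=100.0)
# F = 0.1*(-0.5) + 0.05*(-1) + 100*0.01 = 0.9
```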
At operation 310, the haptic enabled apparatus 102 operates to generate a haptic signal 212 based on the modified haptic effect rendering model 210. As described herein and further illustrated in
At operation 312, the haptic enabled apparatus 102 operates to control the haptic actuator 156 using the haptic signal 212. In some embodiments, the haptic signal 212 is provided to the actuator drive circuit 154, which then generates and applies an actuator drive signal to the haptic actuator 156, thereby driving the haptic actuator 156. The haptic actuator 156 is driven by the actuator drive signal to provide a haptic effect adapted to the monitored dynamic system parameters 120 of the dynamic system 106.
Referring to
Later, the user's grip changes to Situation B (a right diagram of
In addition, where two users operate the controllers separately, haptic effects can be rendered differently to the users to accommodate their different gripping forces, stiffness, and mass with respect to the controllers. By way of example, when a first user grips the controllers with more force than a second user, and the first user's hands are heavier and less stiff than the second user's, the haptic effects are adapted (e.g., magnitude increased) to accommodate the first user's hands and controller configuration which have been actively monitored and updated as described herein.
Referring to
As in
As in
As described herein, the haptic profile can represent one or more characteristics of the haptic drive signals, such as frequency, amplitude, phase, inversion, duration, waveform, attack time, rise time, fade time, and lag or lead time relative to an event. The haptic profile can be one or a combination of various characteristics of the haptic drive signals, thereby mapping different haptic responses for different system configurations. As described herein, the system configurations can change due to various factors, examples of which include the apparatus 102 being held by a user hand with variable parameters, such as grip strength, effective mass, and temperature.
In one example, the haptic profile being set up can be a haptic amplitude. As described herein, different amplitudes can compensate for different grip strengths. For example, a higher amplitude haptic effect can be rendered to compensate for the reduced vibration intensity that results from a user holding a device tightly. In some embodiments, the amplitude can be expressed on a unitless scale from 0 to 1, as a voltage (i.e., a voltage driving a haptic actuator), or in any other unit. In another example, alternatively or in addition, the haptic profile being set up can be a haptic frequency. In some embodiments, the haptic frequency can map to a resonant frequency of the apparatus 102 (which can change in the context of the dynamic system 106; in this sense, a resonant frequency may also be referred to as a resonant frequency of the dynamic system 106). Such mapping of the haptic frequency to the resonant frequency of the apparatus 102 can help convey a stronger haptic effect.
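A minimal sketch of the amplitude profile described above, assuming a normalized grip-strength input and a linear compensation gain (both hypothetical):

```python
# Hypothetical mapping from measured grip strength to a haptic
# amplitude on a unitless 0-1 scale: a tighter grip damps the
# vibration, so the drive amplitude is raised to compensate,
# clamped to the valid range.

def amplitude_for_grip(base_amplitude, grip_strength, gain=0.5):
    """grip_strength is normalized to 0 (loose) .. 1 (tight)."""
    return min(1.0, base_amplitude * (1.0 + gain * grip_strength))

loose = amplitude_for_grip(0.5, grip_strength=0.0)  # unchanged at 0.5
tight = amplitude_for_grip(0.5, grip_strength=1.0)  # boosted to 0.75
```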
In some embodiments, the dynamic system characterization model 200 incorporates a machine learning algorithm configured to learn the dynamic system 106 (e.g., a user behavior) with respect to the haptic enabled apparatus 102. For example, such a machine learning algorithm allows anticipating a user behavior interacting with the haptic enabled apparatus 102 and updating the dynamic system characterization model 200 accordingly, thereby providing improved user experience of haptic effects adapted to the anticipated change in the user behavior.
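One very simple instance of such learning, shown only as a sketch: an exponential moving average that tracks an observed user-behavior parameter, so the characterization model can be pre-adjusted toward the anticipated value before the next interaction. A production system could substitute any richer machine learning model; the parameter choice and smoothing factor here are hypothetical.

```python
# Sketch of a minimal learning rule for anticipating a user-behavior
# parameter (e.g., grip strength): each observation pulls the
# prediction a fraction alpha of the way toward the observed value.

def update_prediction(prediction, observation, alpha=0.3):
    return (1 - alpha) * prediction + alpha * observation

pred = 0.5                     # initial estimate of the parameter
for obs in [0.8, 0.8, 0.8]:    # user repeatedly grips more tightly
    pred = update_prediction(pred, obs)
# pred has moved from 0.5 toward the observed 0.8
```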
The systems and methods for generating adaptive haptic effects in accordance with the present disclosure can also be applied to kinesthetic and temperature haptic feedback conditions. By way of example, an adjustment in force feedback can be made depending on a monitored stiffness of a user's muscles. If tense muscles are monitored, more force feedback is provided, and if relaxed muscles are monitored, less force feedback is provided. Similarly, a temperature adjustment can be made depending on the temperature of a haptic enabled device or the environment thereof. By way of example, if a warm climate is detected, a higher rendered temperature is generated and provided, and if a cool climate is detected, a lower rendered temperature is generated and provided.
The systems and methods of the present disclosure can be used for physics-based models involving velocity and acceleration based effects such as friction and mass, respectively. The mathematical stability of physics-based models can be sensitive to the accuracy of the dynamic system identification. In some embodiments, a step for calibrating the dynamic system identification can be provided to improve the accuracy.
The various examples and teachings described above are provided by way of illustration only and should not be construed to limit the scope of the present disclosure. Those skilled in the art will readily recognize various modifications and changes that may be made without following the examples and applications illustrated and described herein, and without departing from the true spirit and scope of the present disclosure.