Orientation adjustable multi-channel haptic device

Information

  • Patent Grant
  • Patent Number
    10,209,776
  • Date Filed
    Thursday, August 24, 2017
  • Date Issued
    Tuesday, February 19, 2019
Abstract
Embodiments generate haptic effects on a device that is grasped by a user on a first side having a corresponding first haptic output device and on a second side having a corresponding second haptic output device. Embodiments receive a first haptic effect channel and receive a second haptic effect channel. Embodiments determine that the first side is more tightly grasped by the user than the second side. Embodiments then, based on the determining, assign the first haptic effect channel to the first haptic output device and assign the second haptic effect channel to the second haptic output device.
Description
FIELD

One embodiment is directed generally to haptic effects, and in particular to haptic effects generated by a multi-channel device.


BACKGROUND INFORMATION

Portable/mobile electronic devices, such as mobile phones, smartphones, camera phones, cameras, personal digital assistants (“PDAs”), etc., typically include output mechanisms to alert the user of certain events that occur with respect to the devices. For example, a cell phone normally includes a speaker for audibly notifying the user of an incoming telephone call event. The audible signal may include specific ringtones, musical tunes, sound effects, etc. In addition, cell phones may include display screens that can be used to visually notify the users of incoming phone calls.


In some mobile devices, kinesthetic feedback (such as active and resistive force feedback) and/or tactile feedback (such as vibration, texture, and heat) is also provided to the user; these are collectively known as “haptic feedback” or “haptic effects”. Haptic feedback can provide cues that enhance and simplify the user interface. Specifically, vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users of electronic devices to alert the user to specific events, or provide realistic feedback to create greater sensory immersion within a simulated or virtual environment.


SUMMARY

Embodiments generate haptic effects on a device that is grasped by a user on a first side having a corresponding first haptic output device and on a second side having a corresponding second haptic output device. Embodiments receive a first haptic effect channel and receive a second haptic effect channel. Embodiments determine that the first side is more tightly grasped by the user than the second side. Embodiments then, based on the determining, assign the first haptic effect channel to the first haptic output device and assign the second haptic effect channel to the second haptic output device.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 is a block diagram of a haptically-enabled system in accordance with one embodiment of the present invention.



FIG. 2 is a flow diagram of the functionality of the orientated haptic effects module and the system of FIG. 1 when generating orientation adjustable haptic effect signals for actuators in accordance with one embodiment.



FIG. 3 is a block diagram of an embodiment of the present invention that uses sound-to-haptic conversion to generate haptic effect channels.





DETAILED DESCRIPTION

One embodiment is a haptically-enabled device/system that includes more than one haptic channel. For example, the device can include a left haptic channel that generates haptic effects predominantly on the left side of the device, and a right haptic channel that generates haptic effects substantially independent of the left haptic channel and predominantly on the right side of the device. The haptic device may be handheld/mobile and may change orientation (e.g., be turned 180 degrees) during usage. Therefore, embodiments determine the current orientation and route the haptic channels accordingly so that they match up with the current orientation. In general, embodiments map haptic signals with respect to the actuator spatial arrangement of the device.



FIG. 1 is a block diagram of a haptically-enabled system 10 in accordance with one embodiment of the present invention. System 10 includes a touch sensitive surface 11 or other type of user interface mounted within a housing 15, and may include mechanical keys/buttons 13. Internal to system 10 is a haptic feedback system that generates vibrations on system 10. In one embodiment, the vibrations are generated on touch surface 11.


The haptic feedback system includes a processor or controller 12. Coupled to processor 12 is a memory 20 and a left actuator drive circuit 16, which is coupled to a left actuator 18. Actuator 18 may be any type of actuator that can generate and output a haptic effect including, for example, an electric motor, an electro-magnetic actuator, a voice coil, a piezoelectric actuator, a shape memory alloy, an electro-active polymer, a solenoid, an eccentric rotating mass motor (“ERM”), or a linear resonant actuator (“LRA”).


Processor 12 may be any type of general purpose processor, or could be a processor specifically designed to provide haptic effects, such as an application-specific integrated circuit (“ASIC”). Processor 12 may be the same processor that operates the entire system 10, or may be a separate processor. Processor 12 can decide what haptic effects are to be played and the order in which the effects are played based on high-level parameters. In general, the high-level parameters that define a particular haptic effect include magnitude, frequency, and duration. Low-level parameters such as streaming motor commands could also be used to determine a particular haptic effect.
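For illustration only, a minimal sketch (in Python, with hypothetical names; the patent does not specify any particular representation) of how such high-level parameters might be carried:

```python
from dataclasses import dataclass

@dataclass
class HapticEffect:
    """High-level parameters defining one haptic effect (names are illustrative)."""
    magnitude: float  # normalized drive strength, 0.0 to 1.0
    frequency: float  # vibration frequency in Hz
    duration: float   # effect length in seconds

# Example: a short, strong pulse such as a button-click confirmation
click = HapticEffect(magnitude=0.8, frequency=175.0, duration=0.05)
```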


Processor 12 outputs the control signals to left actuator drive circuit 16, which includes electronic components and circuitry used to supply left actuator 18 with the required electrical current and voltage (i.e., “motor signals”) to cause the desired haptic effects. In addition, system 10 includes a right actuator drive circuit 26 and a right actuator 28 that operate substantially the same as the corresponding left side devices. Left actuator 18 can be positioned within system 10 to generate a vibratory haptic effect 30 predominantly on the left side of system 10, and right actuator 28 can be positioned within system 10 to generate a vibratory haptic effect 40 predominantly on the right side of system 10. System 10 further includes a sensor 25 that detects the orientation of system 10, such as an accelerometer, tilt sensor, three-dimensional detection sensor, etc. Signals from sensor 25 can be used by processor 12 to determine the location or overall spatial arrangement of all of the haptic output devices of system 10.


In addition to or in place of actuators 18, 28, system 10 may include other types of haptic output devices (not shown) that may be non-mechanical or non-vibratory, such as devices that use electrostatic friction (“ESF”) or ultrasonic surface friction (“USF”), devices that induce acoustic radiation pressure with an ultrasonic haptic transducer, devices that use a haptic substrate and a flexible or deformable surface, shape-changing devices that may be attached to a user's body, devices that provide projected haptic output such as a puff of air using an air jet, etc.


In other embodiments, actuators 18, 28 and sensor 25 may be in remote communication with processor 12. In these embodiments, processor 12 may receive signals from sensor 25, determine a mapping of haptic effects based on the signals, and transmit the haptic effects to the corresponding remote haptic output devices. For example, processor 12 and system 10 may be a central controller that controls and provides haptic effects to wearable haptic devices such as wrist bands, headbands, eyeglasses, rings, leg bands, arrays integrated into clothing, etc., or any other type of device that a user may wear on a body or can be held by a user and that is haptically enabled. The wearable devices include one or more haptic output devices that generate haptic effects on the wearable devices and are remote from system 10.


Memory 20 can be any type of storage device or computer-readable medium, such as random access memory (“RAM”) or read-only memory (“ROM”). Memory 20 stores instructions executed by processor 12. Among the instructions, memory 20 includes an orientated haptic effects module 22, which comprises instructions that, when executed by processor 12, generate orientation adjustable drive signals sent to drive circuits 16, 26 to generate haptic effects, as disclosed in more detail below. Memory 20 may also be located internal to processor 12, or be any combination of internal and external memory.


Touch surface 11 recognizes touches, and may also recognize the position and magnitude of touches on the surface. The data corresponding to the touches is sent to processor 12, or another processor within system 10, and processor 12 interprets the touches and in response generates haptic effect signals. Touch surface 11 may sense touches using any sensing technology, including capacitive sensing, resistive sensing, surface acoustic wave sensing, pressure sensing, optical sensing, etc. Touch surface 11 may sense multi-touch contacts and may be capable of distinguishing multiple touches that occur at the same time. Touch surface 11 may be a touchscreen that generates and displays images for the user to interact with, such as keys, dials, etc., or may be a touchpad with minimal or no images.


System 10 may be a handheld device, such as a cellular telephone, personal digital assistant (“PDA”), smartphone, computer tablet, gaming console, wearable device, or may be any other type of device that includes a haptic effect system with one or more actuators. The user interface may be a touch sensitive surface, or can be any other type of user interface such as a mouse, touchpad, mini-joystick, scroll wheel, trackball, game pad or game controller, etc. In embodiments with more than one actuator, each actuator may have a different rotational capability in order to create a wide range of haptic effects on the device. Not all elements illustrated in FIG. 1 will be included in each embodiment of system 10. In many embodiments, only a subset of the elements is needed.


In one embodiment, system 10 is a multi-channel haptic device, meaning processor 12 generates more than one haptic effect channel (i.e., a haptic effect signal that generates a haptic effect), and each channel is output/sent to a separate actuator or other haptic output device. In one embodiment, system 10 generates a haptic effect channel that corresponds to each channel of audio data, such as the left and right channels of stereo audio data. In one embodiment, the haptic effect channels/signals can be automatically generated from the audio channels, as disclosed in, for example, U.S. patent application Ser. Nos. 13/365,984 and 13/366,010, the disclosures of which are herein incorporated by reference.


On devices such as system 10 with multiple actuators, it is sometimes necessary to change the haptic signals sent to the actuators based on the posture or orientation of the device. For example, on system 10, actuators 18, 28 may be stereo piezo actuators, one on the left and one on the right. The tactile effects in games and videos executed on system 10 will typically be designed with the intention that some effects are felt on the left and others are felt on the right. However, most devices such as system 10 allow the user to flip the device completely around, and the visual image will spin to adapt so that it is still right-side-up. Embodiments, therefore, flip the tactile effects so that the effects intended for the left side still play on the left side, and the ones intended for the right side play on the right.
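As a minimal sketch (Python, with hypothetical names; the patent does not prescribe an implementation), the flip amounts to a conditional swap of the channel-to-actuator assignment driven by the orientation sensor:

```python
def map_stereo_channels(left_channel, right_channel, upside_down: bool):
    """Return (signal for left actuator, signal for right actuator).

    When the device has been rotated 180 degrees, the physical actuators
    have swapped sides, so the channels are swapped to compensate.
    """
    if upside_down:
        return right_channel, left_channel
    return left_channel, right_channel
```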



FIG. 2 is a flow diagram of the functionality of orientated haptic effects module 22 and system 10 of FIG. 1 when generating orientation adjustable haptic effect signals for actuators 18, 28 in accordance with one embodiment. In one embodiment, the functionality of the flow diagram of FIG. 2 is implemented by software stored in memory or other computer readable or tangible medium, and executed by a processor. In other embodiments, the functionality may be performed by hardware (e.g., through the use of an application specific integrated circuit (“ASIC”), a programmable gate array (“PGA”), a field programmable gate array (“FPGA”), etc.), or any combination of hardware and software.


At 202, the device orientation is determined based on one or more sensors, such as sensor 25. In one embodiment, sensor 25 is an accelerometer.


At 204, the haptic effect data/channels are obtained. The haptic effect data can be obtained directly from an application that generates the data, or can be generated based on sound-to-haptic conversion techniques or other conversion techniques. The haptic effect data includes multiple haptic effect channels, where each channel is configured to be directed to a different haptic output device, such as a left or right actuator.


At 206, each haptic effect channel obtained at 204 is assigned/mapped to an individual haptic output device.


At 208, each haptic effect channel is sent to the corresponding assigned haptic output device. The haptic output devices can be local or remote from system 10.


At 210, each haptic output device generates haptic effects in response to receiving the haptic effect channel.
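A minimal end-to-end sketch of this flow (Python; the sensor and actuator APIs are hypothetical stand-ins, not taken from the patent):

```python
def play_oriented_effects(sensor, haptic_channels, actuators):
    """Steps 202-210 of FIG. 2 for a two-actuator device (illustrative only)."""
    # 202: determine the device orientation from the sensor (e.g., accelerometer)
    orientation = sensor.orientation_degrees() % 360
    flipped = 90 <= orientation < 270

    # 204: the haptic effect channels are supplied by the caller
    # 206: assign/map each channel to an individual haptic output device
    mapping = list(reversed(haptic_channels)) if flipped else list(haptic_channels)

    # 208: send each channel to its assigned device, and
    # 210: each device generates its haptic effect on receipt
    for actuator, channel in zip(actuators, mapping):
        actuator.play(channel)
```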



FIG. 3 is a block diagram of an embodiment of the present invention that uses sound-to-haptic conversion to generate haptic effect channels. In the embodiment of FIG. 3, audio data is played at 300 and includes a left audio channel and a right audio channel. However, in the example of FIG. 3, because the orientation of the playback device (e.g., system 10 of FIG. 1) has changed, the actuator 304 that was originally on the left side is now on the right side, and vice versa. Therefore, it is determined that the left audio channel should be swapped with the right audio channel.


In Option 1, the audio channels can be swapped at 301, before being received by a sound-to-haptic conversion module 310, which converts each audio channel to a haptic effect channel, and before being received by a mixer 320. If Option 1 is used, a flag will be set in swap flag 330 to indicate that the channels have already been swapped. Therefore, sound-to-haptic conversion module 310 can proceed to generate haptic effect channels without concern for the orientation of the device.


In Option 2, sound-to-haptic conversion module 310 receives un-swapped data, and determines from swap flag 330 that the channels need to be swapped. Module 310 can then include the swapping functionality as part of the sound-to-haptic conversion functionality before outputting haptic channels to the mapped actuators 340.
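A sketch of the two options (Python; `sound_to_haptic` is a hypothetical stand-in for module 310, and its envelope-follower body is illustrative, not the patent's conversion algorithm):

```python
def sound_to_haptic(audio):
    """Stand-in for conversion module 310: a naive envelope follower."""
    return [abs(sample) for sample in audio]

def render_haptics(audio_left, audio_right, device_flipped, swap_before=True):
    """Return (haptic channel for left actuator, haptic channel for right)."""
    swap_flag = device_flipped  # does a swap still need to happen?
    if swap_before and swap_flag:
        # Option 1: swap the audio channels ahead of conversion and clear
        # the flag so the converter need not consider orientation
        audio_left, audio_right = audio_right, audio_left
        swap_flag = False
    left = sound_to_haptic(audio_left)
    right = sound_to_haptic(audio_right)
    if swap_flag:
        # Option 2: the conversion stage consults the flag and swaps its outputs
        left, right = right, left
    return left, right
```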


Although embodiments described above consider two directions when determining mapping (i.e., left and right), other embodiments can consider four directions (i.e., top, bottom, left, right), or any other number of directions, depending on the number and placement of the haptic output devices. Further, rather than left and right, the haptic output devices could be on the front and back of a device, or in some other arrangement. The mapping of haptic channels may also be based on hand position, grasp strength or grasp style (i.e., the way the user is holding the device) in addition to or in place of the orientation of the device. For example, if the user is tightly grasping the device on the left side, while barely grasping the device on the right side, one of the haptic channels may be mapped to the tightly grasped side, and the other haptic channel may be mapped to the lightly grasped side. Further, the volumes or “magnitudes” of the haptic channels can be adjusted according to the grasps. For example, the side being tightly grasped could be left at regular magnitude, while the side that is lightly grasped could be increased in magnitude.
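A hedged sketch of such grasp-based assignment (Python; the pressure inputs and gain factor are illustrative assumptions, not values from the patent):

```python
def assign_by_grasp(channel_a, channel_b, left_pressure, right_pressure):
    """Map channel_a to the tightly grasped side at regular magnitude and
    channel_b to the lightly grasped side with its magnitude increased.
    Returns (signal for left side, signal for right side)."""
    boost = 1.5  # illustrative gain for the lightly grasped side
    if left_pressure >= right_pressure:
        return channel_a, [s * boost for s in channel_b]
    return [s * boost for s in channel_b], channel_a
```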


In some embodiments disclosed above, the device is rotated 180 degrees so that, for example, the left side is swapped with the right side. However, in some instances, a device with two actuators, one on each side, may be rotated 90 degrees or some other amount less than 180 degrees. In one embodiment, one or more of the following mapping/assigning of channels may occur:


The left and right haptic channels are mixed into a single “center” channel to be played on both actuators.


No mixing; instead, the left haptic effect is played on the actuator that was most recently on the left side, and the right haptic effect is played on the actuator that was most recently on the right side. This will provide the most consistent experience.


Some other attributes are used to determine which actuator to play on (e.g., one may be larger than the other).


In another embodiment with two actuators, the haptic signal comprises four channels (e.g., left/right/front/back), and the device is rotated 90 degrees or some other amount less than 180 degrees. In this embodiment, the two channels that correspond to the current orientation (left/right OR front/back) are selected and those channels are rendered. The two off-axis channels are either dropped, or are mixed and treated as a center channel to be played on one or both of the actuators.
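A sketch of this selection (Python; the channel names and the simple nearest-axis rule are hypothetical stand-ins for whatever orientation logic an implementation would use):

```python
def select_axis_channels(channels, rotation_deg, keep_center=True):
    """channels: dict with 'left', 'right', 'front', 'back' sample lists of
    equal length. Render the pair matching the actuators' current axis; the
    off-axis pair is dropped, or mixed in as a center channel on both."""
    r = rotation_deg % 180
    if r < 45 or r >= 135:  # actuators currently lie on the left/right axis
        a, b = channels["left"], channels["right"]
        off = channels["front"], channels["back"]
    else:                   # rotated about 90 degrees: front/back axis
        a, b = channels["front"], channels["back"]
        off = channels["left"], channels["right"]
    if keep_center:
        center = [(x + y) / 2 for x, y in zip(*off)]  # mix off-axis pair
        a = [x + c for x, c in zip(a, center)]
        b = [x + c for x, c in zip(b, center)]
    return a, b
```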


In general, when embodiments are assigning tracks of a multi-channel haptic effect to individual actuators, the first consideration is to use the effects that match the positions/axes of the actuators in the system, followed by an optional mapping of off-axis effects to one or more actuators.


Further, the haptic mapping may be modified according to where the user is touching a touch screen in a control widget. For example, a device may include a graphic touchscreen slider along the side of a multi-actuator device. The actuator closest to the user could receive a haptic command signal, while the other actuators receive no haptic signals.
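One way to sketch this nearest-actuator selection (Python; the coordinates and names are hypothetical):

```python
def nearest_actuator(touch_y, actuator_y_positions):
    """Return the index of the actuator closest to a touch on a slider along
    the device edge; only that actuator receives the haptic command signal."""
    return min(range(len(actuator_y_positions)),
               key=lambda i: abs(actuator_y_positions[i] - touch_y))

# e.g., three actuators along the edge; a touch at y=0.7 selects index 2
index = nearest_actuator(0.7, [0.1, 0.5, 0.9])
```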


The mapping may also apply to flexible devices and be dependent, for example, on whether the screen is rolled up or stretched out, and may apply to multi-cell touch screens.


Embodiments using mapping further include wearable haptic devices. For example, in one embodiment the haptic output device is a ring with multiple vibration elements. The mapping of the haptic channels can be adjusted according to the orientation of the ring. For example, which actuator is currently at the top of the ring will change depending on the ring's current orientation. When sending an “up” haptic signal (i.e., a haptic signal that imparts “up” information), the actuator currently at the top of the ring will receive the up haptic effect. In another example, the user may be wearing a haptically-enabled watch on each arm. If the user swaps the watches, system 10 will determine which arm has which watch and map the haptic effects accordingly.
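For the ring example, a minimal sketch (Python; the even spacing and zero-angle convention are assumptions for illustration):

```python
def topmost_element(ring_rotation_deg, num_elements):
    """Index of the vibration element currently at the top of the ring,
    assuming evenly spaced elements with element 0 on top at rotation 0.
    An 'up' haptic signal would be routed to this element."""
    step = 360.0 / num_elements  # angular spacing between elements
    return round((-ring_rotation_deg % 360) / step) % num_elements
```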


Further, the mapping in embodiments can occur “mid-stream”, such as during the playing of a movie. For example, a user may be watching a movie that has stereo spatialized tactile effects on a tablet. The user pauses the movie and puts down the tablet. When the user returns and picks up the tablet, it is in the opposite orientation from before. The tablet rotates the display image to accommodate this. Further, embodiments also rotate the spatialized haptic image by changing the mapping.


Although embodiments disclosed above include multiple actuators, in one embodiment the device can include a single actuator. In this embodiment, the haptic effect is changed based on how the user is holding the device rather than the orientation of the device. For example, if the actuator is on the left side, and the user is holding the device on the left side, the magnitude of the haptic effect may be relatively weak. However, if sensor 25 detects that the user is holding the device on the right side, the magnitude of the haptic effect may be relatively strong so that the user can feel that haptic effect at a distance from the placement of the actuator.
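A sketch of this single-actuator compensation (Python; the gain values are illustrative assumptions, not from the patent):

```python
def single_actuator_magnitude(base_magnitude, actuator_side, grip_side):
    """Scale the haptic magnitude on a one-actuator device: weaker when the
    grip is on the actuator's side, stronger when the grip is on the far
    side so the effect is still felt at a distance from the actuator."""
    gain = 1.0 if grip_side == actuator_side else 1.8  # illustrative gains
    return base_magnitude * gain

# e.g., actuator on the left, user gripping the right side
strength = single_actuator_magnitude(0.5, "left", "right")  # -> 0.9
```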


As disclosed, embodiments map haptic channels to actuators based on the orientation of the device, among other factors. Therefore, spatial haptic effects will always be generated by the appropriate haptic output device, regardless of how a mobile device, for example, is being held by a user.


Several embodiments are specifically illustrated and/or described herein. However, it will be appreciated that modifications and variations of the disclosed embodiments are covered by the above teachings and within the purview of the appended claims without departing from the spirit and intended scope of the invention.

Claims
  • 1. A method of generating haptic effects on a device that is grasped by a user on a first side corresponding to a first haptic output device and on a second side corresponding to a second haptic output device, the method comprising: receiving a first haptic effect channel; receiving a second haptic effect channel; determining that the first side is more tightly grasped by the user than the second side; and based on the determining, assigning the first haptic effect channel to the first haptic output device and assigning the second haptic effect channel to the second haptic output device.
  • 2. The method of claim 1, wherein the first side is a left side of the device, and the second side is a right side of the device.
  • 3. The method of claim 1, wherein the first side is a right side of the device, and the second side is a left side of the device.
  • 4. The method of claim 1, further comprising increasing a magnitude of the second haptic effect channel.
  • 5. The method of claim 1, further comprising: receiving a third haptic effect channel; receiving a fourth haptic effect channel; determining an orientation of the device; and based on the orientation, assigning the third haptic effect channel to a third haptic output device and assigning the fourth haptic effect channel to a fourth haptic output device.
  • 6. The method of claim 5, wherein the third haptic output device is located on a top side of the device and the fourth haptic output device is located on a bottom side of the device.
  • 7. The method of claim 5, wherein the third haptic output device is located on a front side of the device and the fourth haptic output device is located on a back side of the device.
  • 8. The method of claim 1, further comprising decreasing a magnitude of the first haptic effect channel.
  • 9. A haptically enabled portable device comprising: a processor; a first side and a first haptic output device positioned on the first side; a second side and a second haptic output device positioned on the second side; and a grasp sensor; wherein the processor: receives a first haptic effect channel; receives a second haptic effect channel; determines from the grasp sensor that the first side is more tightly grasped by a user than the second side; and based on the determining, assigns the first haptic effect channel to the first haptic output device and assigns the second haptic effect channel to the second haptic output device.
  • 10. The haptically enabled portable device of claim 9, wherein the first side is a left side of the device, and the second side is a right side of the device.
  • 11. The haptically enabled portable device of claim 9, wherein the first side is a right side of the device, and the second side is a left side of the device.
  • 12. The haptically enabled portable device of claim 9, the processor further increases a magnitude of the second haptic effect channel.
  • 13. The haptically enabled portable device of claim 9, the processor further: receives a third haptic effect channel; receives a fourth haptic effect channel; determines an orientation of the device; and based on the orientation, assigns the third haptic effect channel to a third haptic output device and assigns the fourth haptic effect channel to a fourth haptic output device.
  • 14. The haptically enabled portable device of claim 13, wherein the third haptic output device is located on a top side of the device and the fourth haptic output device is located on a bottom side of the device.
  • 15. The haptically enabled portable device of claim 13, wherein the third haptic output device is located on a front side of the device and the fourth haptic output device is located on a back side of the device.
  • 16. The haptically enabled portable device of claim 9, the processor further decreases a magnitude of the first haptic effect channel.
  • 17. A non-transitory computer-readable medium having instructions stored thereon that, when executed by a processor, cause the processor to generate haptic effects on a device that is grasped by a user on a first side having a corresponding first haptic output device and on a second side having a corresponding second haptic output device, the generating haptic effects comprising: receiving a first haptic effect channel; receiving a second haptic effect channel; determining that the first side is more tightly grasped by the user than the second side; and based on the determining, assigning the first haptic effect channel to the first haptic output device and assigning the second haptic effect channel to the second haptic output device.
  • 18. The non-transitory computer-readable medium of claim 17, further comprising increasing a magnitude of the second haptic effect channel.
  • 19. The non-transitory computer-readable medium of claim 17, the generating haptic effects further comprising: receiving a third haptic effect channel; receiving a fourth haptic effect channel; determining an orientation of the device; and based on the orientation, assigning the third haptic effect channel to a third haptic output device and assigning the fourth haptic effect channel to a fourth haptic output device.
  • 20. The non-transitory computer-readable medium of claim 17, the generating haptic effects further comprising decreasing a magnitude of the first haptic effect channel.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. patent application Ser. No. 14/959,077, filed on Dec. 4, 2015, which is a continuation of U.S. patent application Ser. No. 14/030,181, filed on Sep. 18, 2013 and issued as U.S. Pat. No. 9,207,764 on Dec. 8, 2015. The specification of each of these applications is hereby incorporated by reference.

US Referenced Citations (70)
Number Name Date Kind
6131097 Peurach et al. Oct 2000 A
7082570 von Wiegand et al. Jul 2006 B1
9207764 Birnbaum Dec 2015 B2
9367136 Latta et al. Jun 2016 B2
9370459 Mahoney Jun 2016 B2
9370704 Marty Jun 2016 B2
9392094 Hunt et al. Jul 2016 B2
9462262 Worley, III et al. Oct 2016 B1
9626805 Lampotang et al. Apr 2017 B2
9645646 Cowley et al. May 2017 B2
9652037 Rubin et al. May 2017 B2
9760166 Ammi et al. Sep 2017 B2
9778744 Birnbaum Oct 2017 B2
9811854 Lucido Nov 2017 B2
9851799 Keller et al. Dec 2017 B2
9933851 Goslin et al. Apr 2018 B2
9948885 Kurzweil Apr 2018 B2
20060066574 Kim et al. Mar 2006 A1
20060119572 Lanier Jun 2006 A1
20070226646 Nagiyama et al. Sep 2007 A1
20070242040 Ullrich et al. Oct 2007 A1
20080150911 Harrison Jun 2008 A1
20080174550 Laurila Jul 2008 A1
20090001855 Lipton et al. Jan 2009 A1
20090085882 Grant et al. Apr 2009 A1
20090280860 Dahlke Nov 2009 A1
20110128132 Ullrich et al. Jun 2011 A1
20110163946 Tartz et al. Jul 2011 A1
20120041436 Ullrich Feb 2012 A1
20120138658 Ullrich Jun 2012 A1
20120143182 Ullrich Jun 2012 A1
20140282051 Cruz-Hernandez Sep 2014 A1
20150015500 Lee Jan 2015 A1
20160070348 Cowley et al. Mar 2016 A1
20160084605 Monti Mar 2016 A1
20160086457 Baron et al. Mar 2016 A1
20160163227 Penake et al. Jun 2016 A1
20160166930 Bray et al. Jun 2016 A1
20160169635 Hannigan et al. Jun 2016 A1
20160170508 Moore et al. Jun 2016 A1
20160171860 Hannigan et al. Jun 2016 A1
20160171908 Moore et al. Jun 2016 A1
20160187969 Larsen et al. Jun 2016 A1
20160187974 Mallinson Jun 2016 A1
20160201888 Ackley et al. Jul 2016 A1
20160209658 Zalewski Jul 2016 A1
20160214015 Osman et al. Jul 2016 A1
20160214016 Stafford Jul 2016 A1
20160375170 Kursula et al. Dec 2016 A1
20170102771 Lei Apr 2017 A1
20170103574 Faaborg et al. Apr 2017 A1
20170131775 Clements May 2017 A1
20170148281 Do et al. May 2017 A1
20170154505 Kim Jun 2017 A1
20170168576 Keller et al. Jun 2017 A1
20170168773 Keller et al. Jun 2017 A1
20170178407 Gaidar et al. Jun 2017 A1
20170203221 Goslin et al. Jul 2017 A1
20170203225 Goslin Jul 2017 A1
20170206709 Goslin et al. Jul 2017 A1
20170214782 Brinda Jul 2017 A1
20170257270 Goslin et al. Sep 2017 A1
20170352185 Bonilla Acevedo et al. Dec 2017 A1
20180050267 Jones Feb 2018 A1
20180053351 Anderson Feb 2018 A1
20180077976 Keller et al. Mar 2018 A1
20180081436 Keller et al. Mar 2018 A1
20180093181 Goslin et al. Apr 2018 A1
20180107277 Keller et al. Apr 2018 A1
20180120936 Keller et al. May 2018 A1
Foreign Referenced Citations (14)
Number Date Country
1727858 Feb 2006 CN
1983125 Jun 2007 CN
102395940 Mar 2012 CN
102736732 Oct 2012 CN
103167381 Jun 2013 CN
103247296 Aug 2013 CN
2487557 Aug 2012 EP
2605490 Jun 2013 EP
2010086411 Apr 2010 JP
2011090575 May 2011 JP
2013044706 Mar 2013 JP
2007033244 Mar 2007 WO
2010119397 Oct 2010 WO
2011085242 Jul 2011 WO
Non-Patent Literature Citations (1)
Entry
Any information that is not included with this Information Disclosure Statement can be found in U.S. Appl. No. 14/959,077.
Related Publications (1)
Number Date Country
20170351334 A1 Dec 2017 US
Continuations (2)
Number Date Country
Parent 14959077 Dec 2015 US
Child 15685291 US
Parent 14030181 Sep 2013 US
Child 14959077 US