Modern electronic devices are increasingly being designed to engage users via multiple sensory modes. For example, personal communication devices may utilize a combination of visual, auditory, and haptic modes to interact with a user. With respect to the visual and auditory effects produced by modern electronic devices, the conventional art includes a wide variety of tools, application programming interfaces (APIs), and editing software for working with audio-visual content. In addition, there presently exists considerable expertise in producing audio-visual experiences providing educational, therapeutic, social, and entertainment-focused interactions. However, the conventional art lacks a comparable richness in hardware technologies, software tools, and technical expertise for the development of haptic-based interactions.
There are provided haptic effect generation systems and methods, substantially as shown in and/or described in connection with at least one of the figures, and as set forth more completely in the claims.
The following description contains specific information pertaining to implementations in the present disclosure. One skilled in the art will recognize that the present disclosure may be implemented in a manner different from that specifically discussed herein. The drawings in the present application and their accompanying detailed description are directed to merely exemplary implementations. Unless noted otherwise, like or corresponding elements among the figures may be indicated by like or corresponding reference numerals.
As noted above, modern electronic devices are increasingly being designed to engage users via multiple sensory modes, including visual, auditory, and haptic modes. As also noted above, the conventional art includes a wide variety of tools, application programming interfaces (APIs), and editing software for working with audio-visual content, as well as considerable expertise in producing audio-visual experiences. However, the conventional art lacks a comparable richness in hardware technologies, software tools, and technical expertise for the development of haptic-based interactions.
The present application is directed to haptic effect generation systems and methods. The haptic effect generation systems and methods disclosed in the present application enable the creation, editing, storing, sharing, and broadcasting of haptic data files corresponding respectively to a broad range of haptic effects. Such a haptic effect generation system can be implemented through use of a computing platform coupled to a haptic transformer and including a hardware processor for executing a haptic software code. Moreover, the haptic transformer can advantageously be implemented using readily available audio-based hardware components. As a result, the haptic effect generation systems and methods disclosed in the present application are advantageously easy to use, simple to adopt, and can be implemented to produce a wide variety of haptic user interactions.
It is noted that, in some implementations, haptic effect generation system 100 may include one or more input sources 140 and/or haptic actuators 150. However, in other implementations, haptic effect generation system 100 may receive input signal 142 from one or more input sources 140 external to haptic effect generation system 100. Moreover, in some implementations, haptic effect generation system 100 may send haptic actuator signal 134 to haptic actuators 150 external to haptic effect generation system 100. It is further noted that although computing platform 102 is shown as a personal computer (PC) in FIG. 1, that representation is merely exemplary.
According to the exemplary implementation shown in FIG. 1, haptic effect generation system 100 includes computing platform 102 coupled to haptic transformer 130. Haptic transformer 130 receives input signal 142 from input sources 140 and transforms input signal 142 to first audio signal 132, which is converted to first audio data 122 by analog-to-digital converter (ADC) 108 of computing platform 102. Hardware processor 104, executing haptic software code 120, receives first audio data 122 from ADC 108 and uses first audio data 122 to generate second audio data 124.
Second audio data 124 corresponds to a desired haptic effect, and may be utilized in a process to produce the desired haptic effect, or may be stored in haptic data file 126, by hardware processor 104, for later use. It is noted that, when stored by hardware processor 104 in haptic data file 126, second audio data 124 may subsequently be edited, shared (such as by being copied), and/or broadcast (such as by being transmitted to another computing platform, not shown in FIG. 1).
When used to produce the desired haptic effect, second audio data 124 is converted to second audio signal 112 by DAC 110. Haptic transformer 130 receives second audio signal 112 from DAC 110 and transforms second audio signal 112 to haptic actuator signal 134, which is produced as an output to haptic actuators 150. Haptic actuators 150 may then instantiate the desired haptic effect based on haptic actuator signal 134.
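For illustration, the digital portion of this signal path can be sketched in Python (a language chosen here purely for illustration; the present disclosure does not specify one). The adc, dac, and haptic_software functions below are simplified, hypothetical stand-ins for ADC 108, DAC 110, and haptic software code 120, and the sample rate and 16-bit sample format are assumptions rather than requirements of the disclosure:

```python
import numpy as np

FS = 48_000  # assumed sample rate for the audio channel

def adc(analog: np.ndarray) -> np.ndarray:
    """Model ADC 108: quantize an analog-valued signal to 16-bit samples."""
    return np.clip(np.round(analog * 32767), -32768, 32767).astype(np.int16)

def dac(samples: np.ndarray) -> np.ndarray:
    """Model DAC 110: map 16-bit samples back to analog-valued amplitudes."""
    return samples.astype(np.float64) / 32767.0

def haptic_software(first_audio_data: np.ndarray) -> np.ndarray:
    """Stand-in for haptic software code 120: normalize the data to full scale.
    A real implementation would mix, filter, and shape the audio."""
    peak = max(int(np.max(np.abs(first_audio_data.astype(np.int32)))), 1)
    return (first_audio_data.astype(np.float64) * (32767 / peak)).astype(np.int16)

# First audio signal as produced upstream by haptic transformer 130 (placeholder).
t = np.arange(FS) / FS
first_audio_signal = 0.5 * np.sin(2 * np.pi * 440 * t)

first_audio_data = adc(first_audio_signal)             # signal -> data (ADC 108)
second_audio_data = haptic_software(first_audio_data)  # generate haptic data
second_audio_signal = dac(second_audio_data)           # data -> signal (DAC 110)
# second_audio_signal would then drive haptic transformer 130, which produces
# haptic actuator signal 134 for haptic actuators 150.
```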
Haptic transformer 230, including filtering and modulation circuit 236 and frequency converter, driver, and amplifier circuit 238, corresponds in general to haptic transformer 130, in FIG. 1. As a result, haptic transformer 230 and haptic transformer 130 may share any of the characteristics attributed to either feature in the present disclosure.
Filtering and modulation circuit 236 and frequency converter, driver, and amplifier circuit 238 may each include an analog circuit. For example, each of filtering and modulation circuit 236 and frequency converter, driver, and amplifier circuit 238 may be implemented using readily available audio circuit components, such as audio mixers, filters, drivers, and amplifiers. Haptic transformer 130/230 uses filtering and modulation circuit 236 to transform input signal 142/242, which has a non-zero frequency, to first audio signal 132/232 having a frequency in the audio band, i.e., up to approximately twenty kilohertz (20 kHz). Haptic transformer 130/230 uses frequency converter, driver, and amplifier circuit 238 to transform second audio signal 112/212, which may have a frequency of up to 20 kHz, to lower-frequency haptic actuator signal 134/234. In some implementations, for example, haptic actuator signal 134/234 may have a frequency of less than or approximately equal to 300 Hz.
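One way to read the filtering-and-modulation stage is as amplitude modulation of an audio-band carrier. The following sketch, using assumed frequencies (a 5 Hz input and a 1 kHz carrier), illustrates that reading; the actual analog circuit is not limited to this scheme:

```python
import numpy as np

FS = 48_000
t = np.arange(FS) / FS  # one second of samples

# A slowly varying, non-zero-frequency input signal (a 5 Hz component here).
input_signal = np.sin(2 * np.pi * 5 * t)

# Filtering-and-modulation stage: amplitude-modulate an audio-band carrier
# (1 kHz chosen arbitrarily; any frequency up to ~20 kHz fits the description)
# so the input can travel over an ordinary audio channel.
carrier = np.sin(2 * np.pi * 1_000 * t)
first_audio_signal = (1.0 + 0.5 * input_signal) * carrier
```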
Alternating signal sources 344 may include one or more of a microphone, an accelerometer, and a pulse sensor, for example. Examples of non-alternating signal sources 346 include stretch sensors, potentiometers, switches and dials, and force or pressure sensors. In implementations in which one or more non-alternating signal sources 346 are utilized, haptic transformer 130/230 or input sources 140/340 may include circuitry for passing an oscillating signal through each non-alternating signal source to produce one or more input signals corresponding to input signal 142/242. By contrast, in implementations in which one or more alternating signal sources 344 are utilized, the alternating signal source or sources 344 may produce input signal 142/242 directly.
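To make the non-alternating case concrete, a resistive sensor can be modeled as one leg of a voltage divider through which an oscillating excitation is passed. The sketch below is a simplified model with assumed component values, not a circuit prescribed by the disclosure:

```python
import numpy as np

FS = 48_000
t = np.arange(FS) / FS

# Oscillating excitation assumed to be supplied by the transformer (2 kHz here).
excitation = np.sin(2 * np.pi * 2_000 * t)

# A non-alternating source (e.g., a stretch sensor) modeled as a resistance
# that drifts slowly with user motion; values are illustrative.
r_sensor = 2_000 + 1_000 * np.sin(2 * np.pi * 1.0 * t)   # ohms
r_fixed = 2_200                                          # ohms, divider leg

# Passing the oscillation through the sensor (a simple voltage divider)
# yields an input signal whose envelope tracks the sensor state.
input_signal = excitation * r_sensor / (r_sensor + r_fixed)
```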
First and second haptic actuators 451 and 452 may take the form of vibratory elements, and may be implemented using one or more of speakers, subwoofers, buzzers, bone conductors, and piezo elements, for example. Use of at least two haptic actuators, such as first and second haptic actuators 451 and 452, advantageously enables generation of haptic stereo effects.
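As a sketch of such a stereo effect, the following generates two actuator channels with an equal-power pan that sweeps a vibration from one actuator to the other; the 200 Hz frequency and two-second duration are assumptions:

```python
import numpy as np

FS = 48_000
t = np.arange(2 * FS) / FS  # two seconds

# A 200 Hz vibration, within the roughly <=300 Hz actuator band noted above.
vibration = np.sin(2 * np.pi * 200 * t)

# Equal-power pan sweeping the effect from the first actuator to the second,
# producing the sensation of movement between the two contact points.
pan = np.linspace(0.0, 1.0, t.size)            # 0 = actuator 451, 1 = actuator 452
channel_451 = vibration * np.cos(pan * np.pi / 2)
channel_452 = vibration * np.sin(pan * np.pi / 2)
haptic_stereo = np.stack([channel_451, channel_452], axis=1)
```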
The features shown in FIGS. 1, 2, 3, and 4 will be further described below by reference to FIG. 5, which presents flowchart 500 outlining an exemplary method for use by a system to generate haptic effects.
Referring to FIG. 5 in combination with FIGS. 1, 2, and 3, flowchart 500 begins with receiving input signal 142/242/342 from one or more input sources 140/340 (action 510). As noted above, in implementations in which one or more alternating signal sources 344 are utilized, the alternating signal source or sources 344 may produce input signal 142/242/342 directly.
However, and as also noted above, in some implementations, one or more input sources 140/340 may be non-alternating signal sources 346, such as resistance sensors, for example, incapable of producing input signal 142/242/342 having a signal frequency other than zero. In those implementations, haptic transformer 130/230 or input sources 140/340 may include circuitry for generating an oscillating signal for passing through each of one or more non-alternating signal sources 346 to produce input signal 142/242/342 having a non-zero frequency.
Flowchart 500 continues with transforming input signal 142/242/342 to first audio signal 132/232 corresponding to input signal 142/242/342 (action 520). As shown in FIGS. 1 and 2, transformation of input signal 142/242/342 to first audio signal 132/232 may be performed by haptic transformer 130/230, for example using filtering and modulation circuit 236.
Flowchart 500 continues with converting first audio signal 132/232 to first audio data 122 (action 530). Conversion of first audio signal 132/232 to first audio data 122 may be performed by ADC 108 of computing platform 102, under the control of hardware processor 104, for example.
Flowchart 500 continues with receiving first audio data 122 from ADC 108 (action 540). First audio data 122 may be received from ADC 108 by hardware processor 104 executing haptic software code 120.
Flowchart 500 continues with generating second audio data 124 corresponding to a desired haptic effect, using first audio data 122 (action 550). Generation of second audio data 124 corresponding to a desired haptic effect, using first audio data 122, may be performed by hardware processor 104 executing haptic software code 120.
Haptic software code 120 includes audio processing software for performing audio mixing and audio production. When executed by hardware processor 104, haptic software code 120 may apply those mixing and production operations to first audio data 122 to generate second audio data 124 corresponding to any of a wide variety of desired haptic effects.
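As a minimal sketch of such mixing and production, the following combines two waveform layers and a decay envelope into a single normalized haptic track; the frequencies, levels, and envelope shape are all illustrative assumptions:

```python
import numpy as np

FS = 48_000
t = np.arange(FS) / FS

# Two layers standing in for processed first audio data (values illustrative):
rumble = np.sin(2 * np.pi * 60 * t)    # low, continuous rumble
tap = np.sin(2 * np.pi * 180 * t)      # sharper 180 Hz component

# Shape the taps with a repeating exponential-decay envelope (4 pulses/second).
envelope = np.exp(-8.0 * (t % 0.25))

# Mix the layers at chosen levels and normalize to full scale: an elementary
# example of the mixing/production operations described for the software.
second_audio = 0.6 * rumble + 0.4 * tap * envelope
second_audio = second_audio / np.max(np.abs(second_audio))
```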
Flowchart 500 continues with converting second audio data 124 to second audio signal 112/212 (action 560). Conversion of second audio data 124 to second audio signal 112/212 may be performed by DAC 110 of computing platform 102, under the control of hardware processor 104, for example.
Flowchart 500 continues with transforming second audio signal 112/212 to haptic actuator signal 134/234 for producing the desired haptic effect (action 570). As shown in FIGS. 1 and 2, transformation of second audio signal 112/212 to haptic actuator signal 134/234 may be performed by haptic transformer 130/230, for example using frequency converter, driver, and amplifier circuit 238.
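One simple digital analogue of this frequency conversion is rectification followed by a low-pass filter at the actuator band edge. The sketch below assumes SciPy for the filter design and uses illustrative signal parameters; the disclosure itself describes an analog circuit:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt  # assumed DSP dependency

FS = 48_000
t = np.arange(FS) / FS

# Second audio signal: an audio-band waveform whose envelope carries the effect
# (a 4 kHz carrier with a 3 Hz amplitude variation, both assumed).
second_audio_signal = (0.5 + 0.5 * np.sin(2 * np.pi * 3 * t)) * np.sin(2 * np.pi * 4_000 * t)

# Frequency conversion: rectify, then low-pass at 300 Hz so the resulting
# drive stays within the actuator band described above.
sos = butter(4, 300, btype="low", fs=FS, output="sos")
haptic_actuator_signal = sosfiltfilt(sos, np.abs(second_audio_signal))
```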
Flowchart 500 can conclude with producing haptic actuator signal 134/234 as an output (action 580). As shown in FIG. 1, haptic actuator signal 134/234 may be produced as an output to haptic actuators 150 by haptic transformer 130/230, enabling haptic actuators 150 to instantiate the desired haptic effect.
The audio processing capabilities provided by haptic software code 120, combined with the functionality provided by haptic transformer 130/230, advantageously enable the use of high-speed, high-bandwidth audio channels for the generation of desired haptic effects using haptic actuator signal 134/234. According to the implementations disclosed in the present application, input signal 142/242/342 can be received and processed so as to generate haptic actuator signal 134/234 corresponding to a desired haptic effect in real time with respect to input signal 142/242/342 received from input sources 140/340.
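The real-time character of this processing can be sketched as a full-duplex audio stream in which each incoming block of first audio data is processed and written back out within the same callback. The example below assumes the third-party sounddevice library and uses a bare gain as a placeholder for the processing performed by haptic software code 120:

```python
import numpy as np
import sounddevice as sd  # assumed third-party audio I/O library

FS = 48_000

def callback(indata, outdata, frames, time, status):
    """Per-block processing: stand-in for haptic software code 120.
    A bare gain is applied here; a real system would mix and shape the data."""
    if status:
        print(status)
    outdata[:] = np.clip(indata * 2.0, -1.0, 1.0)

# Full-duplex stream: samples arrive from the ADC path and leave via the DAC
# path within one audio block, illustrating real-time haptic generation over
# a high-bandwidth audio channel.
with sd.Stream(samplerate=FS, channels=1, callback=callback):
    sd.sleep(5_000)  # run for five seconds
```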
The haptic effect generation systems and methods disclosed in the present application can be implemented in any of a wide variety of use cases in which coherent, real-time haptic feedback complements a user experience. Examples of such use cases include video games, movies, sporting events, and theme park attractions such as rides and virtual reality interactive experiences. In addition, the haptic effect generation systems and methods disclosed herein can be utilized to enable couches, beds, tables, walls, and other furniture and architectural features to react to user actions.
Moreover, in some implementations, the present haptic effect generation systems and methods may be employed to provide therapeutic and/or assistive services. For example, haptic feedback can be used to complement relaxation and meditation, monitor and guide breathing, and provide therapeutic massage. In yet another use case, the present haptic effect generation systems and methods may be incorporated into smart vests, smart belts, or headgear configured to provide directional and awareness cues to motorcycle riders, athletes, and construction workers, for example.
Thus, the present application discloses haptic effect generation systems and methods enabling the creation, editing, storing, sharing, and broadcasting of haptic data files corresponding respectively to a broad range of haptic effects. The haptic effect generation systems disclosed in the present application can be implemented through use of a computing platform coupled to a haptic transformer and including a hardware processor for executing a haptic software code. In addition, the haptic transformer can advantageously be implemented using readily available audio-based hardware components. As a result, the haptic effect generation systems and methods disclosed in the present application are advantageously easy to use, simple to adopt, and can be implemented to produce a wide variety of haptic user interactions.
From the above description it is manifest that various techniques can be used for implementing the concepts described in the present application without departing from the scope of those concepts. Moreover, while the concepts have been described with specific reference to certain implementations, a person of ordinary skill in the art would recognize that changes can be made in form and detail without departing from the scope of those concepts. As such, the described implementations are to be considered in all respects as illustrative and not restrictive. It should also be understood that the present application is not limited to the particular implementations described herein, but many rearrangements, modifications, and substitutions are possible without departing from the scope of the present disclosure.