Touch sensitive devices can use sensors to determine that a touch has occurred on a surface of the device. Present-day touch sensitive devices, however, are largely limited to non-conductive surfaces, because the sensing technologies they typically rely on, such as capacitive sensing, do not operate reliably through conductive materials.
In an embodiment, a touch sensitive device includes a front panel having a touch surface and a back surface opposite the touch surface. The touch sensitive device further includes one or more vibration transducers mounted to the back surface, and a controller electronically connected to the one or more vibration transducers. The controller receives, from the one or more vibration transducers, a vibration signal, extracts feature information corresponding to predetermined features from the vibration signal, determines, based on the feature information, that a touch occurred within a predefined area of the touch surface, and outputs a signal indicating that the touch occurred within the predefined area of the touch surface.
In an embodiment, a method for detecting touch by a controller includes receiving, from one or more vibration transducers of a touch sensitive device, a vibration signal; extracting feature information from the vibration signal, the feature information corresponding to predetermined features; determining, based on the feature information, that a touch has occurred within a predefined area on a touch surface of the touch sensitive device; and outputting a signal indicating that the touch occurred within the predefined area.
The foregoing and other features of the present disclosure will become more fully apparent from the following description and appended claims, taken in conjunction with the accompanying drawings. Understanding that these drawings depict embodiments in accordance with the disclosure and are, therefore, not to be considered limiting of its scope, the disclosure will be described with additional specificity and detail through use of the accompanying drawings.
In the following detailed description, reference is made to the accompanying drawings, which form a part hereof. In the drawings, similar symbols identify similar components. The illustrative embodiments described in the detailed description, drawings, and claims are not meant to be limiting. Other embodiments may be used, and other changes may be made, without departing from the spirit or scope of the subject matter presented here. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, and designed in a wide variety of different configurations, all of which are explicitly contemplated and make part of this disclosure.
The present disclosure describes systems and methods for detecting touch by a touch sensitive device. Different touch sensitive devices can use different sensors for detecting touch. For example, some touch sensitive devices use capacitors to detect touch.
The touch detection described with respect to
The systems and methods described herein can be used for detecting touch using vibration sensors, and can provide advantages over other types of touch detection. For example, the systems and methods described herein allow for accurate touch detection even with gloved or dirty fingers and can be used for touch detection on devices having surfaces comprised of a conductive material. It should be understood, however, that the systems and methods described are also suitable for touch detection on surfaces comprised of non-conductive materials.
In embodiments of devices and techniques using vibrational sensors for user input, a user interface is incorporated onto a substrate. In one or more embodiments, the substrate includes stainless steel, glass or other rigid or non-rigid materials, and in some embodiments, a substrate including such materials may be used in appliances and other devices. Other materials may additionally or alternatively be used in the substrate. A substrate may include multiple layers of a same or similar material, or multiple layers with one or more of the layers being a material different than the other layers.
Button representations can be provided on a front facing surface of a substrate facing the user, and one or more vibrational sensors can be mounted on a rear surface opposing the front facing surface of the substrate. Pressing on or touching a button representation causes vibrations in the substrate. These vibrations are sensed and measured by the vibrational sensors to identify an intended user input.
Button representations may be provided, for example, by painting, printing, inscribing, lighting or etching the front facing surface of the substrate, or by painting, printing, inscribing, lighting or etching a material which is then attached (e.g., by gluing or laminating) to the front facing surface of the substrate, or a combination thereof. Such a material may be, for example, a film; and the film may be, but is not necessarily, a transparent or translucent film.
Vibrational sensors can be mounted on button areas on the rear surface of the substrate. In some embodiments, the button areas can be defined directly behind corresponding button representations, and one button area can correspond to one button representation. In one or more embodiments, one or more vibrational sensors can be mounted per button area. In some embodiments in which the substrate is multi-layered, one or more of the vibrational sensors may be mounted to a surface of an intermediate layer of the substrate. For convenience, mounting to the rear surface of the substrate is described with respect to the embodiments of the present disclosure; however, it is to be understood that mounting to an intermediate layer of the substrate instead is within the scope of the present disclosure.
In some embodiments, the button representations are omitted, and the vibrational sensors are arranged to detect pressing upon the substrate within a predefined area of the substrate. For convenience, the embodiments described herein are described as having button representations; however, it is to be understood that the button representations may be omitted. Thus, the substrate may not have visible indicators of a button representation for the user interface on the front facing surface of the substrate, though the user interface is available.
Vibrations caused by a user touching a button representation are sensed and measured by the vibrational sensors adjacent to the button area corresponding to the button representation touched by the user, and by other vibrational sensors mounted on other button areas. Electrical signals generated by the vibrational sensors can be processed to identify a valid user input.
While
A user can press or tap, such as with a finger (or fingers) or some other object, the front surface of the substrate 230 over the button representations 232, 234 to enter an input. The user's pressing on the substrate 230 will cause vibrations in the substrate. In one or more embodiments, these vibrations can be sensed by a vibrational sensor. The vibrations may be in any frequency range detectable by the vibrational sensor, such as, for example, infrasonic, acoustic, or ultrasonic.
The apparatus 200 shown in the example of
In one or more embodiments, the first vibrational transducer 202 and the second vibrational transducer 208 may output digital outputs instead of analog voltage levels. For example, in one or more embodiments, the first vibrational transducer 202 and the second vibrational transducer 208 may output pulse density modulated (PDM) data or pulse width modulated (PWM) data. In some such embodiments, the digital outputs of the first vibrational transducer 202 and the second vibrational transducer 208 may be provided directly to the decoder 214.
The decoder 214 receives signals originating from the first and second vibrational transducers 202 and 208, and, based on the received signals, determines which ones of the actuation lines 222 to actuate. The actuation lines 222 can represent and control one or more functions. For example, if the apparatus 200 were deployed in a refrigerator, one of the actuation lines 222 may activate a motor, another of the actuation lines 222 may turn on a light, another may turn off a light, and yet another may increase or decrease a temperature setting. Other example functions are additionally or alternatively possible based on the device in which the apparatus is deployed.
It will be appreciated that the decoder 214 may be any type of processing device such as a microprocessor, controller or the like. For example, the device may execute computer programmed instructions stored in memory to determine which button was touched by the user and assert the appropriate one of the actuation lines 222. In addition, the decoder 214 may be a device that is constructed of discrete or integrated analog components. Combinations of hardware and/or software elements may also be used to implement the decoder 214. In one or more embodiments, the decoder 214 may also include a demodulator to demodulate PDM or PWM data signals received from vibrational transducers that output digital data. In one or more embodiments, the decoder 214 may include additional modules such as one or more sample-and-hold modules, one or more ADCs, or one or more DACs. In one or more embodiments, the decoder 214 may include a timing module that records a time when an input from a vibrational transducer is received. In one or more embodiments, the decoder 214 may sample an analog input, generate a corresponding digital representation, and store the digital representation along with a corresponding time-stamp.
The first amplifier 204, the second amplifier 210, the first comparator 206, the second comparator 212, and the decoder 214 may each be implemented in analog circuitry, in digital circuitry, or in a combination of analog and digital circuitry.
Although shown as discrete devices in
In some embodiments, further analysis may be performed on vibrations that are sensed. For example, in one or more embodiments, vibration patterns from known anomalies in the devices being controlled may be stored (at the decoder or some other processing device) and the sensed vibrations compared to these patterns to detect defects or other types of anomalous situations. For example, if the apparatus 200 is deployed in a refrigerator, the apparatus may sense vibrations and compare these vibrations to vibrational patterns stored in memory, where the stored patterns are from defective compressors. If the sensed patterns match the stored patterns, then a defect is potentially detected. A user can be alerted, for example, by displaying a message on a screen of the refrigerator. It is to be understood that analyses for detecting other types of defects and anomalies are also possible.
In one or more embodiments, the decoder 214 processes the received signals based on parameters such as timing, amplitude, and frequency. For example, in one or more embodiments, the decoder 214 compares a relative timing of the receipt of the various signals at the decoder 214.
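By way of illustration only, a minimal sketch of such a relative-timing comparison is shown below, assuming the decoder records a timestamp for each transducer whose output first reaches a logic high; the function name, data structure, and example values are hypothetical and not part of the disclosed embodiments.

```python
# Illustrative sketch only: decode a button press from the relative arrival
# times of threshold crossings at each vibrational transducer.

def decode_button(crossings):
    """crossings: dict mapping transducer_id -> timestamp (seconds) at which
    that transducer's output first went to a logic high."""
    if not crossings:
        return None  # no transducer crossed its threshold; no input detected
    # The transducer nearest the touched button representation receives the
    # vibration first, so the earliest crossing identifies the button area.
    return min(crossings, key=crossings.get)

# Example: the transducer behind button 1 fires 0.3 ms before transducer 2,
# so the actuation line assigned to button 1 would be asserted.
pressed = decode_button({"transducer_1": 0.0123, "transducer_2": 0.0126})
```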
As mentioned above, the actuation lines activated by the decoder 214 may perform various functions. For example, they may activate devices (or portions of devices), deactivate devices (or portions of devices), or serve to control operation of another device or electrical or electronic equipment.
While
In one or more embodiments, the decoder may measure the relative amplitudes or the relative frequencies of the received signals instead of the relative timing of when the signals go to a logic high (or a logic low, or make a transition), and determine the decoded inputs and the corresponding actions from the relative amplitudes or relative frequencies.
The process 400 includes receiving vibrations (stage 402) and converting received vibrations into corresponding electrical signals (stage 404). Examples of these process stages have been discussed above in relation to
The process 400 also includes receiving electrical signals that exceed a threshold value (stage 406). One example of this process stage has been discussed above in relation to
According to certain aspects, process 400, implemented using only the hardware shown in
The process 400 also includes activating the appropriate actuation line (stage 410). One example of this process stage has been discussed above in relation to
Example embodiments of touch sensitive devices incorporating MEMS devices as vibrational sensors will now be described in more detail. As in the previous examples, these embodiments operate by detecting an object contacting, and causing vibrations through, the front panel of the touch sensitive device. The front panel can be any hard surface material (e.g., metal, plastic, or glass); other, non-rigid surface materials are also possible. Contact is detected using a set (e.g., an array) of two or more small vibration detecting transducers. In one embodiment, these vibration detectors are small accelerometers made from MEMS devices, which provide small, low cost acceleration sensors. The MEMS devices are mounted behind the front panel, thus isolating them from the environment. The present embodiments can be used with gloved hands and are resistant to contaminants that might be encountered in routine use of the device being controlled (e.g., dust, dirt, oil, grease, acids, cleansers). By using an array of vibration sensors and detection circuitry, a touch control panel with several buttons can be implemented. By assigning part of the vibration sensor array as background listeners and using appropriate signal processing algorithms, the system can accurately locate contact in the presence of background vibrations (i.e., noise). Since the front panel of the touch sensitive device is used as the Human Machine Interface (HMI), the material(s) used for the front panel can be selected to meet the environmental, aesthetic, and use requirements of the device.
The front panel 502 has a front surface, i.e. touch surface 504 and a back surface 506. At least a portion of the touch surface 504 is exposed such that a user has physical access to the touch surface 504. The front panel 502 can include, for example, metal, ceramic, leather, plastic, glass, acrylic, Plexiglas, composite materials such as carbon fiber or fiberglass, or a combination thereof. In some embodiments, the touch surface 504 includes a covering, such as a plastic or film covering. The touch surface 504 can optionally include button representations to help inform or guide a device user's touch; however, such button representations may be omitted.
The MEMS devices 508 can be any MEMS device that detects vibration. For example, MEMS devices 508 can be MEMS accelerometers. In another example, MEMS devices can be MEMS microphones. In these and other examples, the MEMS microphones can comprise unplugged MEMS microphones, plugged MEMS microphones or MEMS microphones with no ports. Example embodiments of MEMS microphones that can be used to implement MEMS devices 508 are described in more detail in co-pending application No. [K-210PR2], the contents of which are incorporated by reference herein in their entirety.
In one or more embodiments, the MEMS device 508 can be mounted on the front panel 502 (e.g., on the back surface 506) using the adhesive 510. In one or more embodiments, the MEMS device 508 is a MEMS mic mounted such that a sound inlet or port of the MEMS mic is sealed against the back surface 506 of the front panel 502. In other embodiments, the MEMS device 508 is a MEMS mic with the sound inlet or port plugged, and the plugged MEMS mic is mounted against the back surface 506. An adhesive 510 can be applied around a perimeter of the port of the MEMS mic to adhere the MEMS mic to the front panel 502. In one or more embodiments, a two-sided adhesive sealant 510 can be used to adhere the MEMS mic to the front panel 502. In some other embodiments, layers of insulating material, such as rubber or plastic, can be applied around the port of the MEMS mic, and adhered to the front panel 502. These and other embodiments are described in more detail in the co-pending application.
The substrate 512 can electrically connect the MEMS devices 508 to a controller 600 (
In one or more embodiments, the filler 514 provides structural support to the substrate 512, the MEMS devices 508, the front panel 502, and/or the controller 600. In some embodiments, the filler 514 can distribute a pressure applied to the filler 514 across the MEMS devices 508 such that the MEMS devices 508 are in contact with the front panel 502. In some embodiments, this can improve an effectiveness of the MEMS devices 508 in detecting vibration. The filler 514 can be any suitable material for providing structural support and/or pressure in the manner described above. For example, the filler 514 can be a foam, a sponge material, a rubber, other material, or a combination thereof. In other embodiments, the touch sensitive device 500 does not include filler 514, and structural support for components can be provided in another appropriate manner, such as, for example, another supporting structure such as a clamp, or by attachment, directly or indirectly, to the back surface 506 of the front panel 502, or to the side panel 518.
In some embodiments, the touch sensitive device 500 includes a frame or body. In an example embodiment, the touch sensitive device 500 includes a body that includes the back panel 516 and the side panels 518. The back panel 516 and the side panels 518, together with the front panel 502, can frame the touch sensitive device 500. The back panel 516 and the side panels 518 can include rigid materials such that other components of the touch sensitive device 500 are shielded from impacts. Non-limiting examples of rigid materials include metal, ceramic, plastic, glass, acrylic, Plexiglas, carbon fiber and fiberglass. The back panel 516 and the side panels 518 can provide structural support for ones of, or all of, the other components of the touch sensitive device 500. In some embodiments, including the embodiment depicted in
In the example embodiment, the front panel 502 is a steel plate and is approximately 0.6 millimeters (mm) thick. The rubber layer 503 is approximately 1/64 inch thick and is disposed between the front panel 502 and the adhesive 510. The rubber layer 503 cushions a MEMS device 508 and provides a surface well-suited to adhesion by the adhesive 510. The rubber layer can also help to dampen vibrations between microphones. In some other embodiments, the touch sensitive device 500 includes a layer of foam or sponge dampening material. The electrical connector 505 can be any electrical connector, such as a flexible electrical connector, and serves to connect the substrate 512 to an external controller 600 (not shown in
It should be noted that the present embodiments are not limited to vibration sensors mounted on a back surface opposite a touch surface as shown in
In one or more embodiments, the controller 600 includes at least one processor 602 and at least one memory 604. The memory 604 can include one or more digital memory devices, such as RAM, ROM, EPROM, EEPROM, MROM, or Flash memory devices. The processor 602 can be configured to execute instructions stored in the memory 604 to perform one or more operations described herein. The memory 604 can store one or more applications, services, routines, servers, daemons, or other executable logics for detecting touch on the touch surface. The applications, services, routines, servers, daemons, or other executable logics stored in the memory 604 can include any of an event detector 606, an event data store 612, a feature extractor 616, a touch identifier 620, a long term data store 614, and a transmission protocol logic 618.
In one or more embodiments, the event detector 606 can include one or more applications, services, routines, servers, daemons, or other executable logics for determining that a potential touch event has occurred. The event detector 606 can monitor and store signals received from one or more vibration transducers, and can determine when the signals indicate that a potential touch event has occurred. The event detector 606 may include or be coupled to a buffer data store 608 and a noise floor calculator 610.
In one or more embodiments, the event detector 606 can store a vibration signal received from at least one MEMS device 508 frame by frame. For example, the event detector 606 can continuously or repeatedly store the incoming vibration signal in buffer data store 608, and can continuously or repeatedly delete oldest signal data from buffer data store 608 after some time has passed, such as after a predetermined amount of time. In this manner, the event detector 606 can maintain the buffer data store 608 such that only a most recent portion of the vibration signal is stored. For example, the event detector 606 can store only a most recent half second (or another time period) of the vibration signal. This can reduce data storage needs and can allow for efficient use of computer resources.
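By way of illustration only, the following sketch shows one way such a rolling buffer might be maintained so that only the most recent portion (e.g., a half second) of the vibration signal is retained; the sample rate, buffer length, and names are example assumptions and not part of the disclosed embodiments.

```python
from collections import deque

# Illustrative sketch only: keep the most recent half second of a vibration
# signal, discarding the oldest samples frame by frame.

SAMPLE_RATE_HZ = 16000
BUFFER_SECONDS = 0.5

# A deque with maxlen drops the oldest samples automatically, so only the most
# recent portion of the vibration signal remains stored.
buffer_data_store = deque(maxlen=int(SAMPLE_RATE_HZ * BUFFER_SECONDS))

def store_frame(frame):
    """Append one frame (an iterable of samples) to the rolling buffer."""
    buffer_data_store.extend(frame)
```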
In one or more embodiments, the event detector 606 can monitor the portion of the vibration signal stored in the buffer data store 608 for an indication that a potential touch event has occurred. For example, the event detector 606 can determine, based on the stored portion of the vibration signal, that the vibration signal or an average or accumulation thereof has crossed a noise floor threshold, or that the vibration signal or average or accumulation thereof is above the noise floor threshold. When the event detector 606 determines that the vibration signal or an average or accumulation thereof is above the noise floor threshold, the event detector 606 can determine that a potential touch event has occurred and can store at least part of the portion of the signal stored in buffer data store 608 in the event data store 612 as a potential event signal, or can associate the at least part of the portion of the signal with a potential event and can store an indicia of that association in the event data store 612. The event detector 606 can set a time at which the vibration signal crossed a noise floor threshold as an event start time. In some embodiments, the event detector 606 can store a portion of a vibration signal as a potential event signal in the event data store 612, the portion of the vibration signal corresponding to a time frame that includes a first amount of time prior to the event start time and a second amount of time after the event start time. For example, when the event detector 606 determines that the vibration signal or an average or accumulation thereof is above the noise floor threshold, or has crossed the noise floor threshold, the event detector 606 can continue to store the vibration signal frame by frame for a predetermined amount of time, such as for an additional half second, and can then store the portion of the vibration signal stored in the buffer data store 608 as a potential event signal in the event data store 612.
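By way of illustration only, the following sketch detects a crossing of a noise floor threshold by a moving average of the buffered signal and captures a window spanning a first amount of time before, and a second amount of time after, the event start time; the window sizes, averaging length, and names are hypothetical.

```python
import numpy as np

# Illustrative sketch only: flag a potential touch event when the short-term
# average of the buffered signal rises above a noise floor threshold, then
# capture a window around the event start time for the event data store.

PRE_SAMPLES = 2000    # samples kept before the event start time
POST_SAMPLES = 8000   # samples collected after the event start time

def detect_potential_event(buffer_samples, noise_floor, window=64):
    """Return the index at which the moving average of the signal magnitude
    crosses the noise floor threshold, or None if no crossing occurs."""
    x = np.abs(np.asarray(buffer_samples, dtype=float))
    avg = np.convolve(x, np.ones(window) / window, mode="same")
    above = np.nonzero(avg > noise_floor)[0]
    return int(above[0]) if above.size else None

def capture_event_signal(buffer_samples, start_index):
    """Copy a window around the event start time as a potential event signal."""
    lo = max(0, start_index - PRE_SAMPLES)
    hi = start_index + POST_SAMPLES
    return list(buffer_samples)[lo:hi]
```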
In some embodiments, the noise floor threshold is a predetermined threshold. In other embodiments, the noise floor calculator 610 calculates the noise floor threshold based on an adaptive algorithm, such that the noise floor threshold is adaptive to a potentially changing noise floor. For example, the noise floor calculator 610 can calculate a first noise floor at a first time based on a portion of a vibration signal stored in the buffer data store 608 at the first time, and at a second time can calculate a second noise floor based on a portion of a vibration signal stored in the buffer data store 608 at the second time, or based on an accumulation value (e.g., an accumulated average value of the vibration signal). Example techniques for adaptively calculating the noise floor threshold according to these and other embodiments are described in more detail in J. F. Lynch Jr, J. G. Josenhans, R. E. Crochiere, “Speech/Silence Segmentation for Real-Time Coding via Rule Based Adaptive Endpoint Detection.”
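By way of illustration only, the following sketch shows a simple adaptive noise floor estimate based on an exponentially weighted running average of signal magnitude; it is not the rule-based endpoint detection algorithm cited above, and its parameters and names are assumptions.

```python
# Illustrative sketch only: a noise floor threshold that tracks a potentially
# changing noise floor using an exponentially weighted running average.

class NoiseFloorCalculator:
    def __init__(self, alpha=0.01, margin=3.0):
        self.alpha = alpha    # smoothing factor for the running average
        self.margin = margin  # multiple of the average used as the threshold
        self.level = 0.0

    def update(self, sample):
        """Fold one sample's magnitude into the running noise estimate and
        return the current noise floor threshold."""
        self.level = (1 - self.alpha) * self.level + self.alpha * abs(sample)
        return self.margin * self.level
```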
In one or more embodiments, when the event detector 606 determines that a potential touch event has occurred and stores the portion of the signal stored in the buffer data store 608 in the event data store 612, the event detector 606 can also store, in the event data store 612, a portion of a second vibration signal that corresponds to a second MEMS device 508. In some embodiments, the portion of the first vibration signal and the portion of the second vibration signal correspond to a same time frame. The event detector 606 can store vibration signals as potential event signals for any number of signals that correspond to the MEMS devices 508, in any appropriate manner, including in the manner described above. It should be noted that the number of signals stored can depend on a number of factors, such as a storage capacity of the buffer data store 608.
The feature extractor 616 can include one or more applications, services, routines, servers, daemons, or other executable logics for extracting features or identifying values corresponding to features from signals or from portions of signals stored in a data store, such as the event data store 612, or any other appropriate data store, such as the buffer data store 608. The features can be predetermined features. For example, the features can include: (i) a maximum signal amplitude, (ii) a minimum signal amplitude, (iii) a time at which a signal achieves a maximum amplitude, (iv) a time at which a signal achieves a minimum amplitude, (v) a time at which a signal amplitude crosses a predetermined amplitude threshold, (vi) an energy contribution to the signal by frequencies equal to or below a first predetermined frequency threshold, and (vii) an energy contribution to the signal by frequencies equal to or above a second predetermined frequency threshold, where the first and second predetermined frequency thresholds can be any appropriate frequency threshold. Without limitation or loss of generality, in some embodiments, the first and/or second predetermined frequency threshold is in a range of 50-150 Hertz (“Hz”). In some embodiments, the first and/or second predetermined frequency threshold is in a range of 90-110 Hz. In some embodiments, the first and/or second predetermined frequency threshold is 100 Hz.
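By way of illustration only, the following sketch extracts the seven example features listed above from one stored event signal; the sample rate, amplitude threshold, and 100 Hz frequency threshold are example values consistent with the ranges described herein, and the names are hypothetical.

```python
import numpy as np

# Illustrative sketch only: compute the seven example features (i)-(vii) from
# one event signal stored in the event data store.

def extract_features(signal, sample_rate_hz=16000,
                     amp_threshold=0.5e-3, freq_threshold_hz=100.0):
    x = np.asarray(signal, dtype=float)
    t = np.arange(x.size) / sample_rate_hz

    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / sample_rate_hz)
    crossings = np.nonzero(np.abs(x) > amp_threshold)[0]

    return {
        "max_amplitude": float(x.max()),      # feature (i)
        "min_amplitude": float(x.min()),      # feature (ii)
        "time_of_max": float(t[x.argmax()]),  # feature (iii)
        "time_of_min": float(t[x.argmin()]),  # feature (iv)
        # feature (v): first time the amplitude crosses the amplitude threshold
        "time_of_crossing": float(t[crossings[0]]) if crossings.size else None,
        # features (vi) and (vii): energy below / above the frequency threshold
        "energy_below": float(spectrum[freqs <= freq_threshold_hz].sum()),
        "energy_above": float(spectrum[freqs >= freq_threshold_hz].sum()),
    }
```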
In one or more embodiments, the feature extractor 616 can extract features from two or more signals. For example, the feature extractor 616 can extract features from two signals stored in the event data store 612 that respectively correspond to different respective vibration transducers, and/or that correspond to a same time frame. In some embodiments, a touch sensitive device (e.g., the touch sensitive device 500) can include two or more vibration transducers, the event data store 612 can store a set of two or more signals that respectively correspond to the two or more vibration transducers, and the feature extractor 616 can extract a same set of features from the two or more signals. For example, the feature extractor 616 can extract a minimum amplitude for each of two or more signals stored in the event data store 612.
In some embodiments, the touch identifier 620 can include one or more applications, services, routines, servers, daemons, or other executable logics for determining that a touch event has occurred, and/or for determining at which area of a predefined set of areas of the touch surface the touch event occurred. The touch identifier 620 can determine that a touch event has occurred at an area of the touch surface based on, for example, one or more event signals stored in the event data store 612, and/or based on features extracted by the feature extractor 616. In some embodiments, the touch identifier 620 includes a classifier that can classify extracted features of vibration signals as corresponding to a touch event at an area of the touch surface. The classifier can be, for example, a model that takes features or feature values as inputs, and outputs a determination that a touch event has occurred, or has not occurred, at an area of the touch surface. For example, the feature extractor 616 can extract a minimum amplitude for each of a set of signals stored in the event data store 612, the signals respectively corresponding to different vibration transducers and corresponding to a same time frame. The classifier can output a determination as to whether and where a touch has occurred based on the minimum amplitudes.
A classifier or model of the touch identifier 620 can be generated by a machine learning algorithm trained on annotated training data. For example, the model can be a linear combination of a number of features, and weights for those features can be determined by a machine learning algorithm. Examples of features and classifiers that make use of those features are described in reference to
Training can be done either with, or without, the touch sensitive device being installed in the end device (e.g., an oven or other appliance). Training can involve collecting “labeled” data with the touch sensitive device and feeding the data through the algorithm to train it. Note that it is also possible to have a short training session during production of the end device, essentially to calibrate the touch sensitive device to the end device.
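By way of illustration only, a minimal sketch of how weights for such a linear combination of features might be learned from labeled data is shown below; the use of scikit-learn logistic regression, the example feature vectors, and the labels are assumptions for illustration and do not represent the specific machine learning algorithm of the disclosed embodiments.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression  # assumed library choice

# Each row is a feature vector extracted from one candidate event (e.g., the
# predetermined features described above); each label is the "labeled" ground
# truth: 1 for a tap on a button area, 0 for a non-event. Values are made up.
X = np.array([[0.90, -0.85, 0.010],
              [0.05, -0.04, 0.200],
              [0.80, -0.75, 0.015]])
y = np.array([1, 0, 1])

model = LogisticRegression().fit(X, y)
# model.coef_ holds the learned per-feature weights; at run time the same
# feature extraction feeds model.predict() to classify newly captured events.
```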
The touch identifier 620 can be used to determine whether a touch event occurred at one area of a predetermined set of areas of the touch surface. For example, at least a portion (not necessarily contiguous) of the touch surface can be divided into two or more designated areas, and the touch identifier 620 can determine which area a touch event corresponds to. In some embodiments, the touch surface includes a single designated area. In some embodiments, the areas can correspond to locations at which one or more vibration transducers are disposed. In some embodiments, the areas can be designated based on button representations on a touch surface (e.g., the touch surface 504).
In one or more embodiments, the touch identifier 620 can be used to determine a touch score for one or more of the areas. The touch score can be, for example, equal to a linear combination of the features. The touch identifier 620 can determine that the area corresponding to the highest touch score is an area at which the touch event occurred. In some embodiments, the touch identifier 620 can determine that a touch event has occurred at multiple areas. For example, the touch identifier 620 can determine that a touch event has occurred at any area corresponding to a touch score above a predetermined threshold. In some embodiments, the touch score can be generated by the classifier or model of the touch identifier 620.
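By way of illustration only, the following sketch computes a touch score for each predefined area as a linear combination of extracted feature values and reports the highest-scoring area above a threshold; the weights, threshold, and function names are hypothetical.

```python
import numpy as np

# Illustrative sketch only: score each predefined area of the touch surface
# and identify the area, if any, at which the touch event occurred.

def identify_touch(features_by_area, weights, score_threshold):
    """features_by_area: dict mapping area name -> feature value vector.
    weights: per-feature weights of the linear model.
    Returns the highest-scoring area above the threshold, or None."""
    scores = {area: float(np.dot(weights, f))
              for area, f in features_by_area.items()}
    best_area = max(scores, key=scores.get)
    return best_area if scores[best_area] > score_threshold else None
```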
In one or more embodiments, the controller 600 can include or can access, directly or indirectly, the long term data store 614. The controller 600 can receive vibration signal data from at least one of the vibration transducers and can store the vibration signal data in the long term data store 614. In some embodiments, the controller 600 can store vibration signals in the long term data store 614 corresponding to a longer period of time than the vibration signals stored in the buffer data store 608. In some embodiments, the controller 600 can store vibration signals in the long term data store 614 corresponding to data that is deleted by the event detector 606 from the buffer data store 608. In some embodiments, the controller 600 can store vibration signals in parallel to both the long term data store 614 and the buffer data store 608. In some embodiments, the data stored in the long term data store 614 can be used to train or evaluate a machine learning classifier, such as, for example, a machine learning classifier of the touch identifier 620, or a machine learning classifier trained to classify data, including features of vibration signals, as corresponding to touch events. The training can occur locally, remotely, or as some combination of the two.
In some embodiments, the transmission protocol logic 618 can include one or more applications, services, routines, servers, daemons, or other executable logics for transmitting or uploading data stored in the long term data store 614 to a remote data store, such as, for example, a cloud data store. In some embodiments, the controller 600 further includes a transmitter, or can access a transmitter of the touch sensitive device, and the transmission protocol logic 618 can cause the transmitter to transmit data from the long term data store 614 to a remote data store. In some embodiments, the transmission protocol logic 618 can cause the transmitter to transmit the data from the long term data store 614 on a fixed schedule, such as, for example, every hour, every day, every week, every month, or on any other appropriate fixed schedule. In some embodiments, the transmission protocol logic 618 can cause the transmitter to transmit the data from the long term data store 614 responsive to the long term data store 614 storing an amount of data above a threshold. In some embodiments, the threshold is based on an amount of available space or memory available in the long term data store 614. In some embodiments, the controller 600 can delete data stored in the long term data store 614 responsive to the data being transmitted to a remote data store.
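By way of illustration only, the following sketch shows one way such a transmission policy might be expressed, uploading either on a fixed schedule or when the stored amount exceeds a size threshold and then deleting the transmitted data; the interval, threshold, and the upload_fn callable are assumptions.

```python
import time

# Illustrative sketch only: decide when to upload the long term data store to
# a remote data store, then delete the transmitted data locally.

UPLOAD_INTERVAL_S = 24 * 3600      # example: once per day
SIZE_THRESHOLD_BYTES = 1_000_000   # example: upload when ~1 MB is stored

def maybe_upload(stored_bytes, last_upload_time, upload_fn):
    """stored_bytes: bytes currently held in the long term data store.
    Returns (remaining_bytes, new_last_upload_time)."""
    due = time.time() - last_upload_time >= UPLOAD_INTERVAL_S
    full = len(stored_bytes) >= SIZE_THRESHOLD_BYTES
    if due or full:
        upload_fn(stored_bytes)     # transmit to the remote data store
        return b"", time.time()     # delete transmitted data locally
    return stored_bytes, last_upload_time
```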
In one or more embodiments, at blocks 706 and 708, a change detection algorithm can detect that the signal has exhibited a change indicative of a potential touch event. For example, an event detector (e.g., the event detector 606) can determine that signal data stored in the buffer data store corresponds to a potential touch event, based on, for example, the signal crossing a noise floor threshold calculated by a noise floor calculator (e.g., the noise floor calculator 610). Responsive to this determination, the event detector can store at least a portion of the signal data from the buffer data store in the event data store (e.g., the event data store 612).
In one or more embodiments, at block 710, a feature extractor (e.g., the feature extractor 616) can extract features from the signal data stored in the event data store. In other embodiments, the feature extractor can extract features from the signal data stored in the buffer data store. The extracted feature data can correspond to one or more predetermined features.
In one or more embodiments, at block 712, a touch identifier (e.g., the touch identifier 620) can classify the extracted feature data as corresponding to a touch event, or as not corresponding to a touch event. The touch identifier can so classify the extracted feature data using a classifier or model, such as a machine learning classifier, as described above in reference to
In one or more embodiments, a noise floor calculator (e.g., the noise floor calculator 610) can determine a noise floor threshold, such as a noise floor threshold of 0.5 mV as illustrated in
In one or more embodiments, a feature extractor (e.g., the feature extractor 616) can analyze the signal data stored in the event data store to extract features, such as any of the predetermined features described above. In some embodiments, the feature extractor can extract predetermined features from multiple signals, and each extracted feature value for each signal can be used by a touch identifier (e.g., the touch identifier 620) as an independent parameter value for determining whether and where a touch event occurred. As set forth above, the extracted features can include: (i) a maximum signal amplitude, (ii) a minimum signal amplitude, (iii) a time at which a signal achieves a maximum amplitude, (iv) a time at which a signal achieves a minimum amplitude, (v) a time at which a signal amplitude crosses a predetermined event threshold, (vi) an energy contribution to the signal by frequencies equal to or below a first predetermined frequency threshold, and (vii) an energy contribution to the signal by frequencies equal to or above a second predetermined frequency threshold, where the first and second predetermined frequency thresholds can be any appropriate frequency threshold.
In the example embodiment shown in
The button MEMS microphones 508a correspond to MEMS microphones disposed behind the front panel 502 at locations that correspond to button areas 1-9. In other embodiments, the button MEMS microphones 508a are MEMS microphones that are closest to respective button areas. The additional MEMS microphones 508b are MEMS microphones that are disposed adjacent to or near the button MEMS microphones 508a. The additional MEMS microphones 508b are similar to the button MEMS microphones 508a, except for their placement. Signals from the button MEMS microphones 508a and from the additional MEMS microphones 508b can be received and used by a controller (e.g., the controller 600) to determine whether and where a touch event has occurred. In the example of
In other embodiments, the MEMS devices 508 can be disposed or spaced in any appropriate manner, and need not be disposed in an evenly spaced configuration. For example, the disposition of sensors behind the button areas on the front panel can be designed to maximize the classification success of the algorithm. While the previously described algorithm can function with any disposition of sensors, it is advantageous in some embodiments to place sensors directly underneath and surrounding the desired touch sensitive area. In this configuration, the “button” sensor (e.g., a button MEMS microphone 508a) directly underneath the touch sensitive area records a substantially larger signal than the adjacent “keep out” sensors (e.g., the additional MEMS microphones 508b), whereas pressing outside the contact area results in signals at the adjacent “keep out” sensors that are larger than, or comparable in magnitude to, the signal at the “button” sensor, enabling reliable classification.
In general, the number and spacing of “keep out” sensors is a function of the layout of the touch locations themselves as well as the “resolution” of the touch on the surface. In the case of a dense grid of touch locations, the “keep out” sensors may only be necessary around the perimeter of the array. In the case of sparsely distributed touch locations, each touch location may require two to three “keep out” sensors to prevent touches outside of the contact area from producing a false classification. The “resolution” characterizes how the measured features of the received signals change as a function of the touch location. A setup with low resolution will require additional sensors to provide sufficient information to the classification algorithm.
It should be noted that a frequency threshold of 100 Hz has been found advantageous in many embodiments. In other embodiments, a frequency threshold in the range of 50-150 Hz provides sufficient results, and in other embodiments, a frequency threshold in a range of 0-1000 Hz can be used. Moreover, in still further embodiments, a frequency range is divided up into frequency bins, with a frequency threshold for each.
In the performance matrix 1000, results from each of tests A, B, C, D are shown in a matrix of two rows and three columns of numbers: row 1, column 1 corresponds to a number of correct button classifications (correct identification by a touch identifier that a touch event, such as a finger tap, has occurred, and that the touch event has occurred at a particular area); row 1, column 2 corresponds to a number of incorrect button classifications (correct identification by the touch identifier that a touch event has occurred, but incorrect identification of the area at which the touch event occurred); row 1, column 3 corresponds to a number of missed button classifications (touch events occurred but were not identified as touch events by a touch identifier); row 2, column 1 corresponds to a number of non-events classified as a button tap (false positives where the touch identifier determined that a touch event had occurred, when in fact it had not); row 2, column 2 is always zero, and row 2, column 3 corresponds to a number of non-events correctly classified as non-events. Non-events can include, for example, touch events outside of the button areas or in between button areas, or other types of vibrational input to the touch sensitive device 500 that are not touch events in the button area, such as knocks outside the button areas and shaking of the device. As can be seen from the results, the tests were very successful. For example, in test A, when only feature 2 was used, all 862 touch events were correctly classified as touch events at a correct location, and 1234 out of 1238 non-events were correctly classified as non-events. In test D, when all seven features were used, all 862 touch events were correctly classified as touch events at correct locations, and all 1238 non-events were correctly classified as non-events.
Note that features can be determined for all of the button MEMS microphones 508a and the additional MEMS microphones 508b. Thus, for a number ‘x’ of features and a combined number ‘y’ of sensors (the button MEMS microphones 508a plus the additional MEMS microphones 508b), the number ‘z’ of values used for touch detection can be z = x·y.
As can be seen from the performance matrix 1000, the combinations of features tested were each successful in identifying actual touch events and rejecting non-events. Notably, test A was performed using a classifier that used a single feature, feature (ii), minimum signal amplitude (min peak value), illustrating that the systems and techniques of the present disclosure, using vibration transducers, provide for accurate and consistent touch detection.
In the example architecture of
Another example architecture is shown in
The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures can be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “associated” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “associated with” each other such that the desired functionality is achieved, irrespective of architectures or intermediate components. Likewise, any two components so associated can also be viewed as being “operably connected,” or “operably coupled,” to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable,” to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components and/or wirelessly interactable and/or wirelessly interacting components and/or logically interacting and/or logically interactable components.
With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations may be expressly set forth herein for sake of clarity.
It will be understood by those within the art that, in general, terms used herein, and especially in the appended claims (e.g., bodies of the appended claims) are generally intended as “open” terms (e.g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.).
It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to inventions containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations).
Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). It will be further understood by those within the art that virtually any disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms. For example, the phrase “A or B” will be understood to include the possibilities of “A” or “B” or “A and B.” Further, unless otherwise noted, the use of the words “approximate,” “about,” “around,” “substantially,” etc., mean plus or minus ten percent.
The foregoing description of illustrative embodiments has been presented for purposes of illustration and of description. It is not intended to be exhaustive or limiting with respect to the precise form disclosed, and modifications and variations are possible in light of the above teachings or may be acquired from practice of the disclosed embodiments. It is intended that the scope of the invention be defined by the claims appended hereto and their equivalents.
The present application claims priority to U.S. Provisional Appln. No. 62/296,919 filed Feb. 18, 2016, the contents of which are incorporated herein in their entirety.