The present application is directed to force detection and, more specifically, to force detection using a piezoelectric sensor.
Touch displays have become increasingly popular in electronic devices. Smart phones, cell phones, tablet computers, notebook computers, computer monitors, and so forth are increasingly equipped with displays that are configured to sense touch as a user input. The touch may be sensed in accordance with one of several different touch sensing techniques including, but not limited to, capacitive touch sensing.
Touch sensitive devices generally provide position identification of where the user touches the device. The touching may include movement, gestures, and other effects related to position detection. For example, touch sensitive devices can provide information to a computing system regarding user interaction with a graphical user interface (GUI) of a display, such as pointing to elements, reorienting or repositioning elements, editing or typing, and other GUI features. In another example, touch sensitive devices can provide information to a computing system for a user to interact with an application program, such as relating to input or manipulation of animation, photographs, pictures, slide presentations, sound, text, other audiovisual elements, and so forth.
While the touch sensitive devices provide an input mechanism that gives the appearance that the user is interacting directly with elements displayed in the GUI, the input is generally limited to the x-, y-positioning of the touch. In some cases, the input sensitivity has been increased to allow for multi-touch inputs, but these are still limited to the positional constraints of the surface upon which the touch is sensed. Some applications and programs may benefit from additional input modes beyond those provided strictly by the touch sensing.
The present application includes techniques directed to additional input modes for touch devices. In particular, embodiments may be directed to sensing force on a touch device using piezoelectric sensors. The force sensing may be in addition to the touch sensing to enable an additional user input mode for the touch device.
One embodiment, for example, may take the form of an apparatus including a touch device having a deformable device stack and a piezoelectric element positioned relative to the deformable device stack such that the piezoelectric element deforms with the deformable stack. Deformation of the piezoelectric element generates a signal having a magnitude discernable as representative of an amount of force applied to the touch device.
Another embodiment may take the form of a touch device having a dielectric cover glass (CG). The touch device further includes a piezoelectric structure adjacent the cover glass. The piezoelectric structure includes piezoelectric material, a first set of electrodes on a first surface of the piezoelectric material, and a second set of electrodes on a second surface of the piezoelectric material and located between the piezoelectric material and the cover glass. The piezoelectric material is a dielectric material and the second set of electrodes is configured to sense both electrical charge generated by the piezoelectric material and capacitance when a conductive material is brought into proximity with the cover glass.
As an alternative to the above, a single piezoelectric structure may be used and placed on one side of the CG. A set of electrodes may be sandwiched by the CG and the piezoelectric material. In such an embodiment, touch locations may be determined by a capacitive-sensing structure associated with the CG, while force may be estimated based on the lateral stretching of the electrodes operating in a d33 mode.
While multiple embodiments are disclosed, still other embodiments of the present invention will become apparent to those skilled in the art from the following Detailed Description. As will be realized, the embodiments are capable of modifications in various aspects, all without departing from the spirit and scope of the embodiments. Accordingly, the drawings and detailed description are to be regarded as illustrative in nature and not restrictive.
As alluded to above, when interfacing with a GUI, or with an application program, it may be advantageous for the user to be able to indicate an amount of force applied when manipulating, moving, pointing to, touching, or otherwise interacting with, a touch device. For example, it might be advantageous for the user to be able to manipulate a screen element or other object in a first way with a relatively lighter touch, or in a second way with a relatively more forceful or sharper touch. In one such case, it might be advantageous if the user could move a screen element or other object with a relatively lighter touch, while the user could alternatively invoke or select that same screen element or other object with a relatively more forceful or sharper touch. Hence, the ability to sense force may provide the touch device with greater capabilities by supplying additional input information to the touch device.
In some embodiments, the force sensing device may be incorporated into a variety of electronic or computing devices, such as, but not limited to, computers, smart phones, tablet computers, track pads, and so on. The force sensing device may be used to detect one or more user force inputs on an input surface and then a processor (or processing element) may correlate the sensed inputs into a force measurement and provide those inputs to the computing device. In some embodiments, the force sensing device may be used to determine force inputs to a track pad, a display screen, or other input surface.
The force sensing device may include an input surface, a force sensing module, a substrate or support layer, and optionally a sensing layer that may detect an input characteristic other than force. The input surface provides an engagement surface for a user, such as the external surface of a track pad or the cover glass for a display. In other words, the input surface may receive one or more user inputs directly or indirectly.
The force sensing module may include an ultrasonic module or sensor that may emit and detect ultrasonic pulses. In one example, the ultrasonic module may include a plurality of sensing elements arranged in rows or columns, where each of the sensing elements may selectively emit an ultrasonic pulse or other signal. The pulse may be transmitted through the components of the force sensing device, such as through the sensing layer and the input surface. When the pulse reaches the input surface, it may be reflected by a portion of the user (e.g., a finger) or other object. The reflection of the pulse may vary based on the distance between the input and the particular sensing element receiving the reflection. Additionally, the degree of attenuation of the pulse may be associated with a force magnitude of the input. For example, generally, as the input force on the input surface increases, the contacting object exerting the force may absorb a larger percentage of the pulse, such that the reflected pulse is diminished correspondingly.
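As a rough illustration of this relationship, the following Python sketch converts pulse attenuation to a force estimate, assuming a pre-characterized monotonic (here, linear) attenuation-to-force mapping; all names and constants are hypothetical, not taken from the disclosure:

```python
# Hypothetical sketch: estimating input force from ultrasonic pulse attenuation.
def estimate_force(emitted_amplitude: float,
                   reflected_amplitude: float,
                   newtons_per_unit_absorption: float) -> float:
    """Return an estimated force from one sensing element's reflection."""
    if emitted_amplitude <= 0.0:
        raise ValueError("emitted amplitude must be positive")
    # A harder press absorbs more of the pulse, diminishing the reflection.
    absorbed_fraction = 1.0 - (reflected_amplitude / emitted_amplitude)
    absorbed_fraction = max(0.0, min(1.0, absorbed_fraction))
    return newtons_per_unit_absorption * absorbed_fraction

# Example: 40% of the pulse is absorbed; with a (made-up) 10 N-per-unit
# calibration, the element reads out 4 N.
print(estimate_force(1.0, 0.6, 10.0))
```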
In embodiments where it is present, the sensing layer may be configured to sense characteristics different from those sensed by the force sensing module. For example, the sensing layer may include capacitive sensors or other sensing elements. In a specific implementation, a multi-touch sensing layer may be incorporated into the force sensing device and may be used to enhance data regarding user inputs. As an example, touch inputs detected by the sensing layer may be used to further refine the force input location, confirm the force input location, and/or correlate the force input to an input location. In the last example, the device need not use the force sensing device's own sensing to estimate a location, which may reduce the processing required for the force sensing device. Additionally, in some embodiments, a touch sensitive device may be used to determine force inputs for a number of different touches. For example, the touch positions and force inputs may be used to estimate the input force at each touch location.
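As a rough illustration of correlating the two layers, the following Python sketch sums a coarse force-pixel (strain) map in a small neighborhood around each capacitively sensed touch location to estimate per-touch force; the grid, radius, and scale constant are hypothetical:

```python
from typing import List, Tuple

def per_touch_forces(strain_map: List[List[float]],
                     touches: List[Tuple[int, int]],
                     newtons_per_strain: float,
                     radius: int = 1) -> List[float]:
    """Estimate force at each touch by summing nearby force-pixel strain."""
    rows, cols = len(strain_map), len(strain_map[0])
    forces = []
    for r, c in touches:
        total = 0.0
        for rr in range(max(0, r - radius), min(rows, r + radius + 1)):
            for cc in range(max(0, c - radius), min(cols, c + radius + 1)):
                total += strain_map[rr][cc]
        forces.append(newtons_per_strain * total)
    return forces

# Two simultaneous touches on a 3x3 force-pixel grid (values are made up).
pixels = [[0.0, 0.1, 0.0],
          [0.1, 0.8, 0.1],
          [0.0, 0.1, 0.9]]
print(per_touch_forces(pixels, [(1, 1), (2, 2)], newtons_per_strain=2.0))
```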
In some specific embodiments described herein, piezoelectric sensors may be used to determine a force applied to the touch device. In particular, a d31 sensing mode of the piezoelectric sensors may be utilized as a measure of the force applied to the touch device. The d31 sensing mode is related to the stretching of the piezoelectric sensor, as will be discussed in greater detail below with reference to example embodiments. In some embodiments, a d33 sensing mode of the piezoelectric sensors may be utilized in addition to or in lieu of the d31 mode. The d33 sensing mode is related to the compression of the piezoelectric sensor and, as such, may operate as a secondary piezoelectric effect adding to the total charge generated as the piezoelectric sensor stretches during a force sensing event.
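As a rough illustration of how the two modes combine, the following Python sketch applies a first-order model in which in-plane stress contributes charge through the d31 coefficient and through-thickness stress through d33; the coefficient values are placeholders of roughly PVDF-like magnitude, not data from the disclosure:

```python
# First-order model: charge from in-plane stretch (d31) plus through-thickness
# compression (d33). Coefficients are rough PVDF-like placeholders.
def generated_charge(d31: float, d33: float,
                     youngs_modulus: float, in_plane_strain: float,
                     normal_stress: float, electrode_area: float) -> float:
    """Return total charge (coulombs) appearing on the sense electrodes."""
    in_plane_stress = youngs_modulus * in_plane_strain   # T1 = Y * S1
    q_d31 = d31 * in_plane_stress * electrode_area       # stretching term
    q_d33 = d33 * normal_stress * electrode_area         # compression term
    return q_d31 + q_d33  # both active modes add to the sensed charge

q = generated_charge(d31=23e-12, d33=-33e-12,      # pC/N-scale placeholders
                     youngs_modulus=3e9,           # ~3 GPa film (placeholder)
                     in_plane_strain=1e-4,
                     normal_stress=1e4,            # 10 kPa press (placeholder)
                     electrode_area=1e-4)          # 1 cm^2 electrode
print(f"{q:.3e} C")
```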
The piezoelectric sensors may generally be configured to sense deformation of a touch display stack. As such, the piezoelectric sensors may be located within the display stack or attached to the stack (e.g., laminated to the bottom of the display stack). For displays that include a backlight, such as a liquid crystal display (LCD), the piezoelectric sensor may be located between a rear polarizer and the backlight. Alternately, the piezoelectric sensor may be located on the back of the cover glass, whether or not the system includes a backlight.
FORCE SENSITIVE DEVICE AND SYSTEM
Turning now to the figures, illustrative electronic devices that may incorporate the force sensing device will be discussed in more detail.
In some embodiments, the force sensing device may be incorporated into a tablet computer.
In yet other embodiments, the force sensing device may be incorporated into a mobile computing device, such as a smart phone.
Additionally, the device 10 may include one or more buttons 15 and/or other input devices. In some embodiments, the button 15 may take the form of a home button. Further, in some embodiments, the button 15 may be integrated as part of a cover glass of the device and the piezoelectric based force measurements may be utilized to determine actuation of the button.
The force sensing device will now be discussed in more detail.
The sensing layer 22 may be configured to sense one or more parameters correlated to a user input. In some embodiments, the sensing layer 22 may be configured to sense characteristics or parameters that may be different from the characteristics sensed by the force sensing module 24. For example, the sensing layer 22 may include one or more capacitive sensors that may be configured to detect input touches, e.g., a multi-touch input surface including intersecting rows and columns. The sensing layer 22 may be omitted where additional data regarding the user inputs may not be desired. Additionally, the sensing layer 22 may provide additional data that may be used to enhance data sensed by the force sensing module 24 or may be different from the force sensing module. In some embodiments, there may be an air gap between the sensing layer 22 and the force sensing module 24. In other words, the force sensing module 24 and the sensing layer may be spatially separated from each other, defining a gap or spacing distance.
The substrate 28 may be substantially any support surface, such as a portion of a printed circuit board, the enclosure 16 or frame, or the like. Additionally, the substrate 28 may be configured to surround or at least partially surround one or more sides of the sensing device 18.
In some embodiments, a display (e.g., a liquid crystal display) may be positioned beneath the input surface 20 or may form a portion of the input surface 20. Alternatively, the display may be positioned between other layers of the force sensing device. In these embodiments, visual output provided by the display may be visible through the input surface 20.
As generally discussed above, the force sensing device may be incorporated into one or more touch sensitive devices. It should be appreciated that although
Touch I/O device 1006 may include a touch sensitive panel which is wholly or partially transparent, semitransparent, non-transparent, opaque or any combination thereof. Touch I/O device 1006 may be embodied as a touch screen, touch pad, a touch screen functioning as a touch pad (e.g., a touch screen replacing the touchpad of a laptop), a touch screen or touchpad combined or incorporated with any other input device (e.g., a touch screen or touchpad disposed on a keyboard) or any multi-dimensional object having a touch sensitive surface for receiving touch input.
In one example, touch I/O device 1006 embodied as a touch screen may include a transparent and/or semitransparent touch sensitive panel partially or wholly positioned over at least a portion of a display. According to this embodiment, touch I/O device 1006 functions to display graphical data transmitted from computing system 1008 (and/or another source) and also functions to receive user input. In other embodiments, touch I/O device 1006 may be embodied as an integrated touch screen where touch sensitive components/devices are integral with display components/devices. In still other embodiments a touch screen may be used as a supplemental or additional display screen for displaying supplemental or the same graphical data as a primary display and to receive touch input.
Touch I/O device 1006 may be configured to detect the location of one or more touches or near touches on device 1006 based on capacitive, resistive, optical, acoustic, inductive, mechanical, or chemical measurements, or any phenomena that can be measured with respect to the occurrences of the one or more touches or near touches in proximity to device 1006. Software, hardware, firmware or any combination thereof may be used to process the measurements of the detected touches to identify and track one or more gestures. A gesture may correspond to stationary or non-stationary, single or multiple, touches or near touches on touch I/O device 1006. A gesture may be performed by moving one or more fingers or other objects in a particular manner on touch I/O device 1006 such as tapping, pressing, rocking, scrubbing, twisting, changing orientation, pressing with varying pressure and the like at essentially the same time, contiguously, or consecutively. A gesture may be characterized by, but is not limited to, a pinching, sliding, swiping, rotating, flexing, dragging, or tapping motion between or with any other finger or fingers. A single gesture may be performed with one or more hands, by one or more users, or any combination thereof.
Computing system 1008 may drive a display with graphical data to display a graphical user interface (GUI). The GUI may be configured to receive touch input via touch I/O device 1006. Embodied as a touch screen, touch I/O device 1006 may display the GUI. Alternatively, the GUI may be displayed on a display separate from touch I/O device 1006. The GUI may include graphical elements displayed at particular locations within the interface. Graphical elements may include but are not limited to a variety of displayed virtual input devices including virtual scroll wheels, a virtual keyboard, virtual knobs, virtual buttons, any virtual UI, and the like.
A user may perform gestures at one or more particular locations on touch I/O device 1006 which may be associated with the graphical elements of the GUI. In other embodiments, the user may perform gestures at one or more locations that are independent of the locations of graphical elements of the GUI. Gestures performed on touch I/O device 1006 may directly or indirectly manipulate, control, modify, move, actuate, initiate or generally affect graphical elements such as cursors, icons, media files, lists, text, all or portions of images, or the like within the GUI. For instance, in the case of a touch screen, a user may directly interact with a graphical element by performing a gesture over the graphical element on the touch screen. Alternatively, a touch pad generally provides indirect interaction.
Gestures may also affect non-displayed GUI elements (e.g., causing user interfaces to appear) or may affect other actions within computing system 1008 (e.g., affect a state or mode of a GUI, application, or operating system). Gestures may or may not be performed on touch I/O device 1006 in conjunction with a displayed cursor. For instance, in the case in which gestures are performed on a touchpad, a cursor (or pointer) may be displayed on a display screen or touch screen and the cursor may be controlled via touch input on the touchpad to interact with graphical objects on the display screen. In other embodiments in which gestures are performed directly on a touch screen, a user may interact directly with objects on the touch screen, with or without a cursor or pointer being displayed on the touch screen.
Feedback may be provided to the user via communication channel 1010 in response to or based on the touch or near touches on touch I/O device 1006. Feedback may be transmitted optically, mechanically, electrically, olfactorily, acoustically, or the like, or any combination thereof and in a variable or non-variable manner.
Attention is now directed towards embodiments of a system architecture that may be embodied within any portable or non-portable device including but not limited to a communication device (e.g. mobile phone, smart phone), a multi-media device (e.g., MP3 player, TV, radio), a portable or handheld computer (e.g., tablet, netbook, laptop), a desktop computer, an All-In-One desktop, a peripheral device, or any other system or device adaptable to the inclusion of system architecture 2000, including combinations of two or more of these types of devices.
It should be apparent that the architecture shown in
RF circuitry 2008 is used to send and receive information over a wireless link or network to one or more other devices and includes well-known circuitry for performing this function. RF circuitry 2008 and audio circuitry 2010 are coupled to processing system 2004 via peripherals interface 2016. Interface 2016 includes various known components for establishing and maintaining communication between peripherals and processing system 2004. Audio circuitry 2010 is coupled to audio speaker 2050 and microphone 2052 and includes known circuitry for processing voice signals received from interface 2016 to enable a user to communicate in real-time with other users. In some embodiments, audio circuitry 2010 includes a headphone jack (not shown).
Peripherals interface 2016 couples the input and output peripherals of the system to processor 2018 and computer-readable medium 2001. One or more processors 2018 communicate with one or more computer-readable mediums 2001 via controller 2020. Computer-readable medium 2001 can be any device or medium that can store code and/or data for use by one or more processors 2018. Medium 2001 can include a memory hierarchy, including but not limited to cache, main memory and secondary memory. The memory hierarchy can be implemented using any combination of RAM (e.g., SRAM, DRAM, DDRAM), ROM, FLASH, magnetic and/or optical storage devices, such as disk drives, magnetic tape, CDs (compact disks) and DVDs (digital video discs). Medium 2001 may also include a transmission medium for carrying information-bearing signals indicative of computer instructions or data (with or without a carrier wave upon which the signals are modulated). For example, the transmission medium may include a communications network, including but not limited to the Internet (also referred to as the World Wide Web), intranet(s), Local Area Networks (LANs), Wireless Local Area Networks (WLANs), Storage Area Networks (SANs), Metropolitan Area Networks (MANs) and the like.
One or more processors 2018 run various software components stored in medium 2001 to perform various functions for system 2000. In some embodiments, the software components include operating system 2022, communication module (or set of instructions) 2024, touch processing module (or set of instructions) 2026, graphics module (or set of instructions) 2028, one or more applications (or set of instructions) 2030, and force module (or set of instructions) 2038. Each of these modules and above noted applications correspond to a set of instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. In some embodiments, medium 2001 may store a subset of the modules and data structures identified above. Furthermore, medium 2001 may store additional modules and data structures not described above.
Operating system 2022 includes various procedures, sets of instructions, software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communication module 2024 facilitates communication with other devices over one or more external ports 2036 or via RF circuitry 2008 and includes various software components for handling data received from RF circuitry 2008 and/or external port 2036.
Graphics module 2028 includes various known software components for rendering, animating and displaying graphical objects on a display surface. In embodiments in which touch I/O device 2012 is a touch sensitive display (e.g., touch screen), graphics module 2028 includes components for rendering, displaying, and animating objects on the touch sensitive display.
One or more applications 2030 can include any applications installed on system 2000, including without limitation, a browser, address book, contact list, email, instant messaging, word processing, keyboard emulation, widgets, JAVA-enabled applications, encryption, digital rights management, voice recognition, voice replication, location determination capability (such as that provided by the global positioning system (GPS)), a music player, etc.
Touch processing module 2026 includes various software components for performing various tasks associated with touch I/O device 2012 including but not limited to receiving and processing touch input received from I/O device 2012 via touch I/O device controller 2032.
System 2000 may further include force module 2038 for performing the method/functions as described herein in connection with
The force module 2038 may generally relate to interpretation of force measurements and/or their effect on the current operating context of the system 2000. Generally, the force module 2038 and the touch processing module 2026 may be configured to operate in cooperation to determine the effect of a force measurement. For example, the touch processing module 2026 may be utilized to help discern a location of touch on a surface. This location information may be used in determining an effect of a force measurement. Specifically, if a threshold amount of force is sensed over the button 15, the force module 2038 may determine that the button has been actuated.
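As a rough illustration of such cooperation, a minimal Python sketch follows, assuming a hypothetical rectangular button region and a hypothetical force threshold (neither value comes from the disclosure):

```python
BUTTON_REGION = (0, 940, 320, 1000)   # x0, y0, x1, y1 in pixels (hypothetical)
FORCE_THRESHOLD_N = 1.5               # hypothetical actuation threshold

def button_actuated(touch_x: float, touch_y: float, force_n: float) -> bool:
    """Report actuation only for a sufficiently forceful touch on the button."""
    x0, y0, x1, y1 = BUTTON_REGION
    inside = x0 <= touch_x <= x1 and y0 <= touch_y <= y1
    return inside and force_n >= FORCE_THRESHOLD_N

print(button_actuated(160, 970, 2.0))   # True: on the button, above threshold
print(button_actuated(160, 970, 0.5))   # False: too light a touch
```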
I/O subsystem 2006 is coupled to touch I/O device 2012, the piezoelectric sensor 2042, and one or more other I/O devices 2014 for controlling or performing various functions. Touch I/O device 2012 communicates with processing system 2004 via touch I/O device controller 2032, which includes various components for processing user touch input (e.g., scanning hardware). The piezoelectric sensor 2042 communicates with piezoelectric controller 2043 as part of the force determination for force measurements. In particular, for example, signals generated by the piezoelectric sensor 2042 are controlled or otherwise received by the piezoelectric controller 2043 as part of the I/O subsystem 2006. One or more other input controllers 2034 receive/send electrical signals from/to other I/O devices 2014. Other I/O devices 2014 may include physical buttons, dials, slider switches, sticks, keyboards, touch pads, additional display screens, or any combination thereof.
If embodied as a touch screen, touch I/O device 2012 displays visual output to the user in a GUI. The visual output may include text, graphics, video, and any combination thereof. Some or all of the visual output may correspond to user-interface objects. Touch I/O device 2012 forms a touch-sensitive surface that accepts touch input from the user. Touch I/O device 2012 and touch screen controller 2032 (along with any associated modules and/or sets of instructions in medium 2001) detect and track touches or near touches (and any movement or release of the touch) on touch I/O device 2012 and convert the detected touch input into interaction with graphical objects, such as one or more user-interface objects. In the case in which device 2012 is embodied as a touch screen, the user can directly interact with graphical objects that are displayed on the touch screen. Alternatively, in the case in which device 2012 is embodied as a touch device other than a touch screen (e.g., a touch pad) the user may indirectly interact with graphical objects that are displayed on a separate display screen embodied as I/O device 2014.
Touch I/O device 2012 may be analogous to the multi-touch sensitive surface described in the following U.S. Pat. Nos. 6,323,846; 6,570,557; and/or 6,677,932; and/or U.S. Patent Publication 2002/0015024A1, each of which is hereby incorporated by reference.
In embodiments in which touch I/O device 2012 is a touch screen, the touch screen may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, OLED (organic light emitting diode) technology, or OEL (organic electro luminescence) technology, although other display technologies may be used in other embodiments.
Feedback may be provided by touch I/O device 2012 based on the user's touch input as well as a state or states of what is being displayed and/or of the computing system. Feedback may be transmitted optically (e.g., light signal or displayed image), mechanically (e.g., haptic feedback, touch feedback, force feedback, or the like), electrically (e.g., electrical stimulation), olfactorily, acoustically (e.g., beep or the like), or the like, or any combination thereof and in a variable or non-variable manner.
The I/O subsystem 2006 may include and/or be coupled to one or more sensors configured to be utilized in the force determination. In particular, the I/O subsystem 2006 may include an LED 3002 and a sensor 3004, and/or an additional sensor 4000. Each of the LED 3002, sensor 3004 and additional sensors 4000 may be coupled to the touch I/O device controller 2032, or another I/O controller (not shown). The LED 3002, sensor 3004 and additional sensor 4000 may be utilized, for example, as part of a proximity sense routine to determine if a user or object is close to the system. If the user or object is not near the system 2000, any force measurement and/or sensed touch may be false and, therefore, discarded.
System 2000 also includes power system 2044 for powering the various hardware components and may include a power management system, one or more power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator and any other components typically associated with the generation, management and distribution of power in portable devices.
In some embodiments, peripherals interface 2016, one or more processors 2018, and memory controller 2020 may be implemented on a single chip, such as processing system 2004. In some other embodiments, they may be implemented on separate chips.
Turning to
In other embodiments, the piezoelectric elements may be located at different positions within a display stack. The positioning may depend upon the type of display into which the piezoelectric element is placed. Additionally, or alternatively, the location of the piezoelectric element within the stack may depend upon the optical characteristics of the piezoelectric element. For example, if the piezoelectric element might negatively impact the image of the display, it may be preferable to position the piezoelectric element behind the rear polarizer and display.
The electrical structure that communicates the electrical charge from the piezoelectric elements may take any suitable form. Several different embodiments are presented herein as examples.
The piezoelectric elements 10000 may have a length sufficient to allow them to be coupled to the display stack and flex when force is applied to the stack. In other embodiments, the piezoelectric elements 10000 and the electrodes 10010, 10012 may take another geometrical form. For example, the piezoelectric elements 10000 may be cubical, rectangular, or any other shape. As such, the piezoelectric elements 10000 may be configured as discrete pixels that may sense force at a particular location.
In addition to the electrode strips discussed herein, an array or grid of smaller electrodes may be employed. Each electrode may function to sense force (or strain) separately from other electrodes in the array and may be driven and/or sensed separately from the other array electrodes.
In each of the foregoing embodiments, the force measurement is derived from measuring local deformation. The local deformation is a function of force translated through the cover glass and/or the display stack. That is, the display stack curves and the bottom of the stack strains (e.g., stretches). Thus, the force measurement typically is not a measurement of the deflection or displacement of the stack or cover glass. The piezoelectric element of the sensors may be laminated to the bottom of the stack. In backlit displays, such as liquid crystal displays, the piezoelectric elements may be located behind the rear polarizer but in front of the backlight. Because they are located in front of the backlight, they are generally transparent. The transistors and electrodes associated with the piezoelectric elements may be formed through a process that provides for transparency, such as an indium tin oxide (ITO) deposition process. Additionally, as discussed above, a pixel or grid geometry of electrodes may be created so that force location may be determined. That is, a single film with many electrodes may be used. In such embodiments, a multi-force determination may be made, or an estimate of multi-force inputs may be approximated. That is, discrete, simultaneous force inputs at different locations may be discerned.
Alternatively, the piezoelectric based sensors may be placed on the back of the display cover glass while providing the same advantages discussed above.
It should be appreciated that embodiments discussed herein may be boundary-independent. That is, neither the piezoelectric film nor the element to which it is applied (such as a display) need be bounded with a rubber or other elastic boundary, such as a gasket, in order to operate. Instead, force determination may be performed even without such a boundary.
As multiple modes of charge generation may be active (e.g., d33 mode and d31 mode), a calibration may be performed to account for both or all active modes. Hence, the force determination may take into account all of the generated charge as a result of force, even if one mode is dominant. As one example of calibration, various known forces may be applied to specific locations on the force-sensing surface of the device. This may occur, for example, in a factory prior to shipment or sale of the device. Force may be applied across the device's surface, including at locations near or on a boundary of the force-sensing area. The output of the piezoelectric sensors may be read out and generated as a strain map. This strain map may be used to calculate calibration constants for the different locations at which force was applied; since the applied forces are known, they may be correlated to the output strain map and the various constants required to scale the force to the output may be determined. Calibration constants may vary across the force-sensing surface or may be relatively constant. Generally, the calibration constants may relate sensor signal to force or strain to force, and may be stored in a memory of the device. These constants may later be retrieved and used to estimate a force applied to a particular location. Certain embodiments may employ plate deformation-based algorithms to correlate a deflection map of the force-sensing surface to one or more force inputs.
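As a rough illustration of the calibration step, the following Python sketch fits a per-location calibration constant by least squares from known applied forces and the corresponding sensor outputs; the values shown are hypothetical:

```python
import numpy as np

def fit_calibration_constant(applied_forces_n: np.ndarray,
                             sensor_outputs: np.ndarray) -> float:
    """Least-squares fit (through the origin) of force = k * sensor output."""
    num = float(np.dot(sensor_outputs, applied_forces_n))
    den = float(np.dot(sensor_outputs, sensor_outputs))
    return num / den

# Three known factory loads applied at one location and the readings there.
forces = np.array([0.5, 1.0, 2.0])       # newtons (known applied forces)
outputs = np.array([0.11, 0.21, 0.43])   # sensor units (made-up readings)
k = fit_calibration_constant(forces, outputs)
print(f"k = {k:.3f} N per unit output")  # later: force ≈ k * reading
```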
The piezoelectric elements generate their own signal and do not have to be powered. However, the generated signal may generally be low energy. This low energy signal may be difficult to sense, as there may be noise and/or leakage in the circuit. The electrical model may take one of several suitable forms. For example, in one embodiment, a high pass filter may be applied. However, the high pass filter effect may make low frequency elements in the signal difficult to read. In another embodiment, the signal may be amplified and/or processed to help obtain suitable readings. For example, a low leakage op amp may be implemented to limit drift. Additionally, a shield may be provided in some embodiments to protect against capacitance from a user's finger. For example, a thin-film transistor layer in the display stack may be grounded to serve as a shield. Further, temperature changes may affect the piezoelectric element, and steps may be taken to mitigate any temperature change impact. For example, temperature may be canceled out by using multiple sensors (e.g., one sensing the thermal effect alone and the other sensing both thermal effects and force); that is, a dual mode rejection method may be implemented. Alternatively, films that are not thermally sensitive may be used, or a film that has a non-uniform directional response may be used. The directional properties may be used to cancel out thermal effects. In one example, the films may be stacked or positioned side by side so that their respective thermal effects cancel each other out.
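As a rough illustration of the dual-sensor approach, the following Python sketch subtracts a thermal-only channel from a thermal-plus-force channel to cancel the common thermal drift; the gain ratio is a hypothetical placeholder:

```python
def compensated_force_signal(force_plus_thermal: float,
                             thermal_only: float,
                             thermal_gain_ratio: float = 1.0) -> float:
    """Subtract the scaled thermal-only channel to cancel thermal drift."""
    # A ratio of 1.0 assumes the two films have matched thermal sensitivity.
    return force_plus_thermal - thermal_gain_ratio * thermal_only

print(compensated_force_signal(0.53, 0.11))   # -> 0.42 (force term only)
```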
As may be appreciated, in the presently disclosed embodiments the piezoelectric element and electrodes are located within the display stack and are not exposed. Additionally, the force sensing may not merely be binary. Rather, force pixels may be sensed so that a multi-force parameter may be read.
In some embodiments, a force sensing structure may further be utilized as a touch input structure. As piezoelectric elements are generally dielectric in nature (e.g., they are not conductive), they may be utilized as a dielectric member for a capacitor. That is, when a user's finger (or other capacitively-sensed element, such as a capacitive stylus) is in proximity to the cover glass surface, the position of the finger/element may be sensed by the touch sensor. The touch sensor may use the piezoelectric element as one plane or part of a mutual-capacitance array to detect such touches or near-touch events. Likewise, force applied to the cover glass may be translated into strain on the piezoelectric element, thereby generating a charge that may be read and utilized to approximate, estimate or otherwise measure the force. This generated charge may further modulate the capacitance of the touch sensor, and may be accounted for when detecting a touch event. Additionally, the cover glass of a device may serve as a dielectric for another capacitive circuit. By multiplexing the operation of a piezoelectric element, it may function as both a force sensor and a touch sensor. In particular, the force and touch sensing structure may generally take the form of a piezoelectric structure with electrodes located on either side of the piezoelectric element. The electrode sheets may be used to sense charge generated by deflection of the piezoelectric element. Additionally, a top electrode (or electrodes) may be configured to capacitively couple with a conductive element interacting with a display screen. For example, the electrode(s) may be configured to capacitively couple with a user's finger. The electrodes may be configured as self-capacitive members or mutual capacitive members (e.g., a bottom layer may be non-patterned).
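As a rough illustration of such multiplexing, the following Python sketch alternates between a capacitive touch read and a piezoelectric charge read of the same structure; the controller interface shown is hypothetical, standing in for whatever scanning hardware a real implementation would use:

```python
class FakeController:
    """Stand-in for a hypothetical multiplexed touch/force controller."""
    def read_capacitance(self):
        return [(120, 480)]      # simulated touch location(s)
    def read_piezo_charge(self):
        return 2.4e-10           # simulated generated charge, coulombs

def scan_loop(controller, cycles: int = 3):
    for _ in range(cycles):
        touches = controller.read_capacitance()   # touch-sensing time slot
        charge = controller.read_piezo_charge()   # force-sensing time slot
        print(touches, charge)

scan_loop(FakeController())
```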
The piezoelectric based force sensor is an impulse sensor due to the nature of the piezoelectric element. That is, when the piezoelectric element is deformed to generate a charge, the charge generated is an impulse signal. A circuit may be provided to integrate the impulse signal and determine how much force is applied and how the force changes. Generally, the size of the generated impulse signal is linearly related to the amount of force applied. The touch sensor, by contrast, does not generate an impulse signal.
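As a rough illustration, the following Python sketch integrates sampled charge impulses with a leaky integrator to track the applied force over time; the scale factor and leak constant are hypothetical:

```python
def track_force(charge_samples, newtons_per_coulomb: float, leak: float = 0.999):
    """Yield a running force estimate from sampled charge impulses."""
    accumulated = 0.0
    for q in charge_samples:
        # Leaky integration of the impulse train limits long-term drift.
        accumulated = leak * accumulated + q
        yield newtons_per_coulomb * accumulated

# A press (positive impulses), a hold (no charge), then a release (negative).
samples = [2e-10, 2e-10, 0.0, 0.0, -4e-10]
for f in track_force(samples, newtons_per_coulomb=5e9):
    print(f"{f:.2f} N")
```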
The piezoelectric force and touch sensor 10030 may generally be created by any suitable process. In one embodiment, ITO electrodes may be formed on an underside of a device cover glass. A piezoelectric element may be positioned over the ITO electrodes and a second set of ITO electrodes may be positioned or deposited on the piezoelectric element. The ITO electrodes may take any suitable form. In one embodiment, the ITO electrodes may be sheets. In another embodiment, the ITO may be strips or pixels to enable a location determination of both the force and touch sensing. In some embodiments, the piezoelectric element and electrodes may take the form of a piezoelectric package that may be installed beneath a cover glass. Example piezoelectric films that may be utilized in certain embodiments discussed herein include poly-L-lactic acid (PLLA) piezoelectric elements, some of which are manufactured and distributed by Murata Manufacturing Co., Ltd. One example of such a film is a d14 mode piezoelectric material cut in a 45 degree orientation in order to permit operation in a d31 mode. As another example, a polyvinylidene fluoride (PVDF) material may be used in certain applications. In embodiments employing a PVDF material as a strain sensor, thermal effects on the PVDF may be accounted for, or the PVDF may be thermally isolated.
The piezoelectric film may generally have high transparency. Hence, the piezoelectric film generally does not appreciably impact the optics of the system in which it is incorporated (at least as seen by the human eye), which means it may be installed directly under the cover glass. This may prove advantageous, as the film may be attached only to the cover glass rather than to other components or layers in a display stack. Therefore, the deformation of the piezoelectric element is dependent only upon the deformation of the cover glass. Also, the film or other transparent piezoelectric element installed directly under the cover glass may be used with any display technology. That is, it is not display device dependent. Additionally, because the force and touch sensor may be installed directly under the cover glass, there is no change to existing display architecture. The piezoelectric based sensor may be directly inserted into existing architecture.
Timing Diagram
In some embodiments various components of the computing device and/or touch screen device may be driven or activated separately from each other and/or on separate frequencies. Separate drive times and/or frequencies for certain components, such as the display, touch sensor or sensors (if any), and/or force sensors may help to reduce cross-talk and noise in various components.
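As a rough illustration, the following Python sketch schedules display refresh, touch scanning, and force read-out in separate time slots at separate rates; the rates are hypothetical placeholders, and the integer-millisecond periods are rounded approximations:

```python
DISPLAY_HZ, TOUCH_HZ, FORCE_HZ = 60, 120, 100   # hypothetical rates

def scheduled(t_ms: int):
    """Return which subsystems are driven at millisecond tick t_ms."""
    events = []
    if t_ms % (1000 // DISPLAY_HZ) == 0:   # periods rounded to whole ms
        events.append("display")
    if t_ms % (1000 // TOUCH_HZ) == 0:
        events.append("touch")
    if t_ms % (1000 // FORCE_HZ) == 0:
        events.append("force")
    return events

for t in range(0, 33):
    if scheduled(t):
        print(t, scheduled(t))   # the three activities mostly interleave
```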
With respect to
With respect to
With respect to
The foregoing describes some example techniques using piezoelectric elements to sense force in a touch sensitive stack. The sensing of force gives an additional input mode for touch sensitive input devices. Although the foregoing discussion has presented specific embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the embodiments.
This application is a 35 U.S.C. § 371 application of PCT/US2013/032607, filed Mar. 15, 2013, and entitled "Force Detection in Touch Devices Using Piezoelectric Sensors," and further claims the benefit under 35 U.S.C. § 119(e) of U.S. provisional application No. 61/738,381, filed Dec. 17, 2012, and entitled "Force Detection In Touch Devices Using Piezoelectric Sensors," both of which are incorporated by reference as if fully disclosed herein.