Touch-based user interface with haptic feedback

Information

  • Patent Grant
  • Patent Number
    10,013,058
  • Date Filed
    Tuesday, September 21, 2010
  • Date Issued
    Tuesday, July 3, 2018
Abstract
One embodiment of a touch-based user interface may include a haptic feedback layer with one or more actuators configured to supply a haptic feedback. The one or more actuators may be embedded in a nonconductive material. The touch-based user interface may further include a printed circuit board layer underlying the haptic feedback layer. The printed circuit board layer may include one or more conductive traces configured to supply a voltage to the one or more actuators.
Description
BACKGROUND

I. Technical Field


Embodiments described herein relate generally to touch-based user interfaces, such as a track pad or a touch screen, and more particularly, to touch-based user interfaces capable of providing localized haptic feedback to a user.


II. Background Discussion


Existing touch-based user interfaces typically have a touch panel and a visual display component. The touch panel may include a touch sensitive surface that, in response to detecting a touch event, generates a signal that can be processed and utilized by other components of an electronic device. The touch sensitive surface may be separate from the display component, such as in the case of a trackpad, or may be integrated into or positioned in front of the viewable area of the display screen, such as in the case of a display touchscreen.


In either case, the display component may display textual and/or graphical display elements representing selectable virtual buttons or icons, and the touch sensitive surface may allow a user to navigate the content displayed on the display screen. Typically, a user may move one or more objects, such as a finger or a stylus, across the touch sensitive surface in a pattern that the device translates into an input command. As an example, some electronic devices allow the user to select a virtual button by tapping a portion of the touch sensitive surface corresponding to the virtual button. Other electronic devices include a touch sensitive surface that can detect more than one simultaneous touch event in different locations on the touchscreen.


Existing touch-based user interfaces typically do not provide haptic feedback to a user. Haptic feedback may be any type of tactile feedback that takes advantage of a user's sense of touch, for example, by applying forces, vibrations, and/or motions to the user. The user can typically only feel the rigid surface of the touch screen, making it difficult to find icons, hyperlinks, textboxes, or other user-selectable input elements that are being displayed. A touch-based user interface may help a user navigate content displayed on the display screen by incorporating haptic feedback. For example, localized haptic feedback can enable a user to feel what is being displayed by providing feedback when a user locates a virtual button, selects the virtual button and/or confirms the selection of the virtual button.


SUMMARY

Embodiments described herein relate to touch-based user interface devices that can both receive an input from a user and provide haptic feedback based on the input from the user. In one embodiment, a touch-based user interface device may include a haptic feedback layer that includes one or more piezoelectric actuators that are embedded in a nonconductive material. The haptic feedback layer may be the outermost layer of the touch-based user interface device so that the mechanical stimulation provided by the actuators can be felt by a user. However, in other embodiments, the haptic feedback layer may be covered by a protective coating or cover layer. In some embodiments, a printed circuit board layer may be positioned underneath the haptic feedback layer. The printed circuit board layer may include one or more metallic traces that are configured to supply a voltage to each of the piezoelectric actuators embedded in the haptic feedback layer. Some embodiments may also include input sensors, such as a displacement sensor and/or force sensor for recognizing and distinguishing between various touch-based input gestures from a user.


One embodiment may take the form of a touch-based user interface that includes a haptic feedback layer including one or more actuators configured to supply a haptic feedback. The one or more actuators may be embedded in a nonconductive material. The touch-based user interface may further include a printed circuit board layer underlying the haptic feedback layer. The printed circuit board layer may include one or more conductive traces configured to supply a voltage to the one or more actuators.


Another embodiment may take the form of a method for manufacturing a haptic feedback layer. The method may include arranging one or more piezoelectric actuators so that the one or more piezoelectric actuators are spaced apart from one another, and filling any spaces between the piezoelectric actuators with a nonconductive material.


Another embodiment may take the form of a method for manufacturing a haptic feedback layer. The method may include arranging one or more piezoelectric actuator strands so that the one or more piezoelectric actuator strands are spaced apart from one another, filling any spaces between the piezoelectric actuator strands with a nonconductive material to form a blank, and cutting the blank to form a haptic feedback layer.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1A illustrates one embodiment of an electronic device that incorporates an embodiment of a touch-based user interface.



FIG. 1B illustrates another embodiment of an electronic device that incorporates an embodiment of a touch-based user interface.



FIG. 1C illustrates another embodiment of an electronic device that incorporates an embodiment of a touch-based user interface.



FIG. 2 illustrates a perspective view of a single piezoelectric actuator, as used in accordance with some embodiments.



FIG. 3A illustrates a top down view of one embodiment of a touch-based user interface.



FIG. 3B illustrates a side cross-sectional view of the touch-based user interface shown in FIG. 3A, as taken along line 3B-3B.



FIG. 4A illustrates a top down view of one embodiment of a displacement sensor overlaying one or more piezoelectric actuators.



FIG. 4B illustrates a close-up and partially cut-away view of the embodiment shown in FIG. 4A.



FIG. 4C illustrates an exploded view of the embodiment shown in FIG. 4A.



FIG. 5A illustrates a top down view of another embodiment of a touch-based user interface.



FIG. 5B illustrates a perspective view of the embodiment of the touch-based user interface shown in FIG. 5A.



FIG. 6A illustrates a perspective view of a sample embodiment shown in FIGS. 3A and 3B, shown during manufacturing of the embodiment before an adhesive is applied around the edges of the piezoelectric actuators.



FIG. 6B illustrates a perspective view of the sample embodiment of FIGS. 3A and 3B, shown during manufacturing of the embodiment before nonconductive material is added to the mold.



FIG. 6C illustrates a perspective view of the sample embodiment of FIGS. 3A and 3B, shown during manufacturing of the embodiment after the spaces between the piezoelectric actuators have been filled with nonconductive material.



FIG. 7A illustrates a perspective view of the sample embodiments shown in FIGS. 3A and 3B and FIGS. 5A and 5B, shown during manufacturing of these embodiments from a composite blank before the composite blank is cut.



FIG. 7B illustrates a perspective view of a sample embodiment shown in FIGS. 3A and 3B, shown during manufacturing of the embodiment after the composite blank is cut.



FIG. 7C illustrates a perspective view of a sample embodiment shown in FIGS. 5A and 5B, shown during manufacturing of the embodiment after the composite blank is cut.



FIG. 8 is a flowchart setting forth a method for manufacturing a haptic feedback layer.



FIG. 9 is a flowchart setting forth a method for manufacturing a haptic feedback layer.





DETAILED DESCRIPTION

Embodiments described herein relate to touch-based user interface devices that can both receive an input from a user and provide haptic feedback based on the input from the user. In one embodiment, a touch-based user interface device may include a haptic feedback layer that includes one or more piezoelectric actuators that are embedded in a nonconductive material. The haptic feedback layer may be the outermost layer of the touch-based user interface device so that the mechanical stimulation provided by the actuators can be felt by a user. However, in other embodiments, the haptic feedback layer may be covered by a protective coating or cover layer. In some embodiments, a printed circuit board layer may be positioned underneath the haptic feedback layer. The printed circuit board layer may include one or more metallic traces that are configured to supply a voltage to each of the piezoelectric actuators embedded in the haptic feedback layer. Some embodiments may also include input sensors, such as displacement and/or force sensors for recognizing and distinguishing between various touch-based input gestures from a user.


The term “vertical” as used herein is defined as a plane perpendicular to the plane or surface of the haptic feedback layer, regardless of its orientation. The term “horizontal” refers to a direction perpendicular to the vertical direction just defined. Terms such as “above,” “below,” “bottom,” “beneath,” “top,” “side,” “higher,” “lower,” “upper,” “over,” and “under” (e.g., as in “underlying,” “underneath,” and so on) are defined with respect to the plane perpendicular to the plane or surface of the haptic feedback layer, regardless of its orientation. The term “outermost” refers to the surface positioned closest to a user engaging the surface. The term “outer,” as in “outer surface,” refers to any surface of an object, which can include the outermost surface.



FIGS. 1A-1C illustrate some examples of electronic devices that incorporate various embodiments of touch-based user interfaces. In one embodiment, shown in FIG. 1A, a laptop 111 may incorporate a trackpad 104 that serves as a user input-output (I/O) device. The trackpad 104 may be separate from the display screen 103 of the laptop 111.


As will be further described below, the trackpad 104 may include one or more input sensors that allow a user to interact with the laptop 111, as well as a surface capable of providing dynamic localized haptic feedback. In one embodiment, the trackpad 104 may be configured to sense various touch-based input gestures, such as swiping, tapping, scrolling, and so on, applied across the surface of the trackpad 104. The touch-based input gestures may be applied by an object, such as a finger, a stylus, and so on. The input sensors may obtain information regarding the sensed gestures and transmit the information to a processing device provided in the laptop 111, which may translate the received information into a particular input command. As an example, the input sensors may derive distance and/or direction information regarding a sensed gesture, and the processing device may move a graphical pointer on the screen based on the received distance and/or direction information. As another example, the input sensors may be configured to sense a particular motion or pattern of motions and associate the sensed motion with a particular command. For example, a tap may be associated with a mouse click, while sliding the object along the trackpad in a particular manner may be associated with scrolling. The processing device may be any known processing device, including, but not limited to, a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a microcontroller, a graphics processing unit (GPU), and so on.
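As an illustrative sketch of the translation step described above (the tap-distance threshold and the command names are assumptions, not taken from the patent), a processing device might map a sensed touch path to an input command as follows:

```python
def translate_gesture(path, tap_max_distance=2.0):
    """Map a sensed touch path to an input command (illustrative only).

    `path` is a list of (x, y) samples reported by the input sensors.
    A short path is treated as a tap (mouse click); a longer one as a
    scroll, with distance and direction derived from the endpoints.
    """
    (x0, y0), (x1, y1) = path[0], path[-1]
    dx, dy = x1 - x0, y1 - y0
    distance = (dx ** 2 + dy ** 2) ** 0.5
    if distance <= tap_max_distance:
        return {"command": "click"}
    return {"command": "scroll", "dx": dx, "dy": dy}
```

A real input driver would also weigh timing, pressure, and multi-touch state; this only shows the distance-and-direction idea.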


As discussed above, the trackpad 104 may be configured to provide haptic feedback based on the input gestures from the user. The haptic feedback may be used to enhance the user's interaction with the laptop 111 by providing mechanical stimulation to the user when the user is engaging the trackpad 104. For example, the haptic feedback may confirm the user's selection of a particular virtual icon or button, or may be provided when the user's cursor passes a selectable icon or button. Other embodiments may include other ways of providing haptic feedback to the user. The haptic feedback may be provided by one or more actuators configured to apply forces, vibration, and/or other motions to the object engaging the trackpad 104. As will be further discussed below, in one embodiment, the actuators may be distributed throughout the surface of the trackpad 104 so that a user may receive the feedback from different portions of the trackpad 104. In other embodiments, the actuators may be provided only in certain sections of the surface of the trackpad 104, so that the user may only receive feedback when engaging those sections. As will be discussed below, the actuators may be piezoelectric actuators.



FIG. 1B illustrates another embodiment, in which the touch-based user interface may be incorporated into the housing of a mouse 108. One example of an existing mouse 108 incorporating such a touch-based user interface is Apple Inc.'s Magic Mouse™. The mouse 108 may include one or more sensors for detecting various touch-based input gestures, such as swiping, tapping, single- and two-finger scrolling, and so on, across the top surface 107 of the mouse 108 for allowing a user to interact with a desktop computer 105. In one embodiment, the top surface 107 of the mouse 108 may include a number of actuators that may provide haptic feedback to the user based on the user's interactions with the desktop computer 105. Like the trackpad 104 of the embodiment shown in FIG. 1A, the mouse 108 may be separate from the display screen 102 of the desktop computer 105.


In yet another embodiment, illustrated in FIG. 1C, the touch-based user interface may take the form of a touchscreen input component 106. The touchscreen input component 106 may be provided on an electronic device 101 that can function as, for example, a media device, a communications device, a digital camera, a video camera, a storage device, or any other electronic device. Some examples of electronic devices 101 incorporating touch-based user interfaces include Apple Inc.'s iPhone™ and iPad™. The electronic device 101 may include one or more sensors for detecting various touch-based input gestures, such as swiping, tapping, scrolling, and so on, across a surface 109 overlaying the display screen of the electronic device 101 for allowing a user to interact with the device. In some embodiments, the surface 109 may include a number of actuators that may provide haptic feedback in response to the input gestures from the user.



FIG. 2 shows a perspective view of a single piezoelectric actuator 100, as used in accordance with some embodiments. As discussed above, the piezoelectric actuator 100 may provide some type of mechanical stimulation, such as a pulse, vibration, or other feedback, upon actuation. The surface area of actuator 100 can be, for example, 10 square millimeters, 10 square micrometers, 10 square nanometers, or any other size that is physically possible. Additionally, while the illustrated piezoelectric actuator 100 has a rectangular configuration, other embodiments may take other shapes. For example, the piezoelectric actuator may be circular, oval, or triangular, or may be formed as an elongated strand.


The piezoelectric actuator 100 may include electrodes 102 and 104 and piezoelectric material 106, any or all of which can be transparent, opaque, or a combination thereof. The piezoelectric material 106 can include, for example, a ceramic, polyvinylidene fluoride, one or more natural crystals (such as, e.g., berlinite, cane sugar, quartz, Rochelle salt, topaz, and/or any tourmaline group mineral(s)), man-made crystals (such as, e.g., gallium orthophosphate or langasite), bone, polymers, and/or any other material that is able to mechanically deform in response to an applied voltage.


The piezoelectric material 106 may be connected to two electrodes 102 and 104. One of the electrodes 102 may be connected to a positive terminal of a voltage source and the other of the electrodes 104 may be connected to a negative terminal of a voltage source. When a sufficient voltage is applied across the electrodes 102 and/or 104, the piezoelectric material 106 can expand or contract in height (H). In other embodiments, the piezoelectric actuator 100 can be made to expand in other directions, such as in width, as opposed to height. The amount of voltage required to deform the piezoelectric material 106 may vary, and may depend on the type of piezoelectric material 106 used to manufacture the piezoelectric actuator 100. When no voltage is supplied by the voltage source, or when the voltage across the electrodes 102, 104 is less than the threshold amount of voltage required to deform the piezoelectric material 106, the piezoelectric material 106 may return to its original dimensions (i.e., the dimensions of the material in its undeformed state).


The magnitude of expansion or contraction of the piezoelectric material 106 may be determined by the level or amount of voltage across the electrodes 102, 104, with a larger amount of voltage corresponding to a higher magnitude of expansion or contraction. Additionally, the polarity of the voltage across the piezoelectric material 106 may determine whether the piezoelectric material 106 contracts or expands. For example, the piezoelectric material 106 may expand in response to a positive voltage and contract in response to a negative voltage. Alternatively, the piezoelectric material may contract in response to a positive voltage and expand in response to a negative voltage.
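The threshold, magnitude, and polarity behavior described in the preceding two paragraphs can be sketched as a toy displacement model; the threshold voltage and gain values here are hypothetical, and real piezoelectric response is material-dependent:

```python
def piezo_displacement(voltage, threshold=1.0, gain=0.02):
    """Toy model of piezoelectric deformation (all values hypothetical).

    Below the threshold voltage the material keeps its original
    dimensions. Above it, the voltage magnitude sets how far the
    material deforms, and the polarity sets expansion vs. contraction.
    """
    if abs(voltage) < threshold:
        return 0.0  # material stays at (or returns to) its undeformed state
    direction = 1 if voltage > 0 else -1  # positive expands, negative contracts
    return direction * gain * abs(voltage)
```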


In one embodiment, the piezoelectric actuator 100 can be made to vibrate by applying a control signal to one or both of the electrodes 102 and 104 of the piezoelectric actuator 100. The control signal may be a wave having a predetermined amplitude and/or frequency. When the control signal is applied to one or both of the electrodes 102, 104, the piezoelectric actuator 100 may vibrate at the frequency of the control signal. The frequency of the control signal may be adjusted according to various embodiments to alter the rate of expansion and contraction of the piezoelectric actuators 100 if a more or less rapid vibration is desired. The amplitude of the control signal may be correlated to the magnitude of expansion or contraction of the piezoelectric material 106, and may be adjusted to alter the intensity of the vibration.
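A control signal like the one described above might be generated as a sampled sine wave, where the frequency sets the vibration rate and the amplitude sets its intensity; the parameter values below are illustrative:

```python
import math

def control_signal(amplitude, frequency_hz, sample_rate_hz, duration_s):
    """Generate a sinusoidal drive waveform for a piezoelectric actuator.

    The actuator vibrates at `frequency_hz`; `amplitude` scales the
    magnitude of each expansion/contraction. All values are examples.
    """
    n = int(sample_rate_hz * duration_s)
    return [amplitude * math.sin(2 * math.pi * frequency_hz * t / sample_rate_hz)
            for t in range(n)]
```

For instance, `control_signal(1.0, 100, 1000, 0.01)` yields ten samples of a 100 Hz wave; raising `frequency_hz` produces a more rapid vibration and raising `amplitude` a more intense one.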



FIG. 3A illustrates a top down view of one embodiment of a touch-based user interface 300. A cross-sectional view of the touch-based user interface 300 shown in FIG. 3A is illustrated in FIG. 3B. As shown in FIGS. 3A and 3B, the touch-based user interface 300 may include an optional cover layer 305, a haptic feedback layer 301, a printed circuit board (“PCB”) layer 307, and one or more force sensors 315. In one embodiment, the optional cover layer 305 may be positioned above the haptic feedback layer 301, the PCB layer 307 may be positioned below the haptic feedback layer 301, and the force sensors 315 may be positioned below the PCB layer 307. Other embodiments may include other arrangements of the haptic feedback layer 301, the PCB layer 307, force sensors 315 and cover layer 305. For example, in one embodiment, the haptic feedback layer 301 may be positioned underneath the PCB layer 307. Another embodiment may not include a cover layer 305. Instead, the haptic feedback layer 301 may form the outermost surface of the touch-based user interface 300. In a further embodiment, the force sensors 315 may be positioned above the haptic feedback layer 301.


In one embodiment, the haptic feedback layer 301 may include one or more piezoelectric actuators 100 embedded in a nonconductive material 311. Each of the piezoelectric actuators 100 in the haptic feedback layer 301 may be the same as or similar to the piezoelectric actuator 100 shown and described in FIG. 2. In one embodiment, each piezoelectric actuator 100 may be individually controlled. In other embodiments, two or more piezoelectric actuators 100 can be grouped together and controlled as a single entity. For example, two or more piezoelectric actuators can be grouped together to represent a single virtual button. In further embodiments, any number of piezoelectric actuators 100 can be grouped to form a single entity.


One skilled in the art will appreciate that, despite the actuators shown in FIGS. 3A and 3B having the same physical dimensions, the piezoelectric actuators can be any size, or combination of sizes. For example, the piezoelectric actuators can be larger around the edges of the touch-based user interface 300 and proportionately smaller towards the middle of the touch-based user interface 300. One skilled in the art would also appreciate that the space between piezoelectric actuators and/or the piezoelectric actuators' piezoelectric material can also be adjusted accordingly.


As shown in FIGS. 3A and 3B, the piezoelectric actuators 100 may be embedded into the haptic feedback layer 301 in any configuration. For example, as shown in FIG. 3A, the piezoelectric actuators 100 may be arranged in a grid configuration to form a plurality of rows and columns. The number of rows and columns of piezoelectric actuators 100 on the touch-based user interface 300 may vary according to different embodiments. For example, one embodiment may include more rows than columns, while another embodiment may include equal numbers of rows and columns, and so on and so forth.


The piezoelectric actuators 100 may be embedded in a nonconductive material 311 that may serve to insulate the actuators 100 and separate the actuators 100 from one another. The nonconductive material 311 may be an inorganic or rigid material that has a sufficiently high modulus of rigidity to resist deformation when the embedded piezoelectric actuators 100 deform in response to a supplied voltage. In this embodiment, the nonconductive material 311 may maintain the same dimensions as the attached actuators 100 increase and decrease in height relative to the nonconductive material 311. Some examples of inorganic materials that may be used include glass, ceramic, plastic, and so on and so forth. In other embodiments, the nonconductive material 311 may be an organic or compliant material that has a sufficiently high modulus of elasticity to deform with the attached embedded piezoelectric actuators 100. In this embodiment, the nonconductive material 311 may increase and decrease in height as the attached embedded actuators 100 increase and decrease in height. Some examples of organic materials that may be used include elastomers, silicone, thermoplastics, and so on and so forth.


In one embodiment, the piezoelectric actuators 100 may be bonded to the nonconductive material 311 by an adhesive 308. For example, the adhesive 308 may be applied around at least a portion of the perimeter of the piezoelectric actuators 100 to bond the actuators to the nonconductive material 311. In some embodiments, the adhesive 308 may have a high modulus of elasticity so as to allow the piezoelectric actuators 100 to move relative to the nonconductive material 311 while resisting debonding of the actuators 100 and the nonconductive material, as well as cracking or wear of the adhesive itself. Some examples of suitable adhesives include, but are not limited to, a thermoplastic adhesive, a hot melt adhesive, a solvent-based adhesive, and so on and so forth.


The properties of the adhesive 308 may vary according to the properties of the nonconductive material 311 used to form the haptic feedback layer 301. For example, an adhesive having a higher modulus of elasticity may be more suitable for embodiments utilizing a rigid nonconductive material 311 that resists deformation as the embedded piezoelectric actuators 100 are deformed. In contrast, an adhesive having a lower modulus of elasticity may be more suitable for embodiments utilizing a compliant or elastic nonconductive material 311 that is deformed with the embedded piezoelectric actuators 100.


As discussed above, a PCB layer 307 may be positioned underneath the haptic feedback layer 301. The PCB layer 307 may include a nonconductive matrix 309 configured to support the electrodes 102, 104 corresponding to each of the piezoelectric actuators 100. As shown in FIG. 3B, in one embodiment, each pair of electrodes 102, 104 may be positioned directly beneath a corresponding piezoelectric actuator so that each of the electrodes 102, 104 is aligned with a corresponding actuator 100 along at least one vertical axis. However, in other embodiments, the electrodes may not be vertically aligned with a corresponding actuator 100. For example, in one embodiment, one or both of the electrodes 102, 104 may be positioned to one side of a corresponding actuator 100.


In one embodiment, the electrodes 102, 104 may take the form of conductive metallic traces that are embedded within the nonconductive matrix 309. As shown in FIG. 3B, the top ends of the metallic traces may contact the piezoelectric actuators 100, and the metallic traces may extend from a top surface 312 of the PCB layer 307 through a bottom surface 314 of the PCB layer. The metallic traces may be formed from any suitable electrically conductive material, including, but not limited to, copper, aluminum, silver, gold, iron, and so on and so forth. In other embodiments, the electrodes 102, 104 may be insulated wires, rather than uninsulated traces.


The nonconductive matrix 309 may be formed from any nonconductive material, including a low-temperature co-fired ceramic, an elastomer-based polymer, glass, Teflon, and so on and so forth. In one embodiment, the nonconductive matrix 309 may be formed from a rigid or semi-rigid material that may provide structural support to the haptic feedback layer 301. For example, the nonconductive matrix 309 may prevent the haptic feedback layer 301 from cracking when depressed. The nonconductive matrix 309 may completely surround each of the electrodes 102, 104 so as to insulate the individual electrodes and prevent contact between adjacent electrodes. However, in other embodiments, such as where insulated wires are used rather than uninsulated traces, the nonconductive matrix 309 may only partially surround each of the electrodes 102, 104.


In some embodiments, the haptic feedback layer 301 may be fully or partially covered by an optional cover layer 305. The optional cover layer 305 may serve to insulate and protect the haptic feedback layer 301 from wear. The cover layer 305 may be sufficiently thin so as to allow a user to feel the forces supplied by the actuators 100. In one embodiment, the optional cover layer 305 may be formed from a transparent nonconductive material, such as glass, a clear cosmetic glaze, plastic, and so on. However, in other embodiments, the cover layer 305 may be formed from a fully or partially opaque material, such as a ceramic or an opaque paint. In another embodiment, the cover layer 305 may be a clear material that is sprayed or otherwise coated by an opaque paint. For example, the cover layer 305 may be a glass layer that is coated in paint.


As alluded to above, the touch-based user interface 300 may also include one or more force sensors 315. In one embodiment, the force sensors 315 may be located beneath the PCB layer 307. However, in other embodiments, the force sensors 315 may be positioned above the haptic feedback layer 301 or embedded into the PCB layer 307 or the haptic feedback layer 301. The force sensors 315 may be capable of sensing the amount of force or pressure being exerted on the sensors. When a force is applied to the touch-based user interface 300, the force may be transmitted through the outer layers of the interface to a force sensor underneath. Some examples of force sensors 315 that may be used in conjunction with the touch-based user interface may include, but are not limited to, force sensitive resistors, force sensitive capacitors, load cells, pressure plates, piezoelectric transducers, strain gauges, and so on and so forth.


In one embodiment, the force sensors 315 may be positioned underneath or incorporated into the outermost surface of the touch-based user interface 300. In this embodiment, the outermost surface of the touch-based user interface 300 may allow for a slight amount of flex so that any forces on the surface can be distributed to a respective force sensor. Accordingly, when a force is applied to the touch-based user interface 300, for example, due to squeezing or pushing on the outermost surface, the force may be transmitted through the outermost surface to a force sensor 315 located underneath the outermost surface. That is, the outermost surface may flex minimally, but still enough to be sensed by the force sensor 315 embedded in the outermost surface or sandwiched between the outermost surface and another intermediate layer of the touch-based user interface 300.


The force sensors 315 may produce signals indicative of the sensed forces. In one embodiment, the sensors 315 may be configured to generate input signals when forces are applied to the touch-based user interface 300. The processing device of the electronic device may then process the input signals to distinguish between various touch-based input gestures and initiate commands according to the different input gestures. Accordingly, the force sensors 315 may allow for distinguishing between various input gestures that may be associated with different commands. In one embodiment, the force sensors may be used to differentiate between a click and a scroll command. As an example, the processing device may associate a higher amount of force, such as from a tapping motion, with a click command and a lower amount of force, such as from a gliding motion, with a scroll command (or vice versa). Accordingly, if the force measured by the force sensors 315 is over a threshold level of force, the input gesture may be interpreted as a click command. On the other hand, if the force measured by the force sensors 315 is less than the threshold level of force, the input gesture may be interpreted as a scroll command.
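The thresholding logic described above can be sketched in a few lines; the threshold value and the tap-means-click convention are assumptions for illustration:

```python
def classify_press(force, click_threshold=2.0):
    """Interpret a sensed force as a click or a scroll (illustrative).

    Forces at or above the threshold (e.g. from a tapping motion) map
    to a click; lighter forces (e.g. from a gliding motion) map to a
    scroll. The threshold value is hypothetical.
    """
    return "click" if force >= click_threshold else "scroll"
```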


The touch-based user interface 300 may also include a displacement sensor that may derive spatial data relating to the position of the object on the interface, as well as proximity data relating to the distance of the object from the interface. In one embodiment, illustrated in FIGS. 4A-4C, the displacement sensor may be a capacitance sensor 320 that can detect the location of a finger (or other object) using mutual capacitance sensing. In one embodiment, the capacitance sensor 320 may be incorporated into the PCB layer 307 underlying the haptic feedback layer 301. However, in another embodiment, the capacitance sensor 320 may be sandwiched between the haptic feedback layer 301 and the PCB layer 307. In other embodiments, the capacitance sensor 320 may be incorporated into any layer of the touch-based user interface 300 described above, or may be an additional layer that is positioned above or below the other layers of the interface 300 or sandwiched between two layers of the interface 300.


In one embodiment, the capacitance sensor 320 may include electrically conductive electrodes 335 that are deposited in varying patterns onto two flexible substrate sheets 331, 333. The substrate sheets 331, 333 may be formed from a flexible, yet rigid nonconductive material, such as plastic, polyester, rubber, glass, and so on and so forth. In one embodiment, the electrodes 335 may be deposited on the inner surface of one sheet 331 to form a row pattern, and on the corresponding inner surface of the other sheet 333 to form a column pattern. The spacing between the rows 338 and columns 339 may vary according to different embodiments, with a smaller spacing size corresponding to a more sensitive capacitive sensor 320. When the two substrate sheets are positioned with one on top of the other with the electrodes facing one another, a grid pattern may be formed. A finger, or other object, placed near the intersection 336 of two electrodes modifies the capacitance between them. This change in capacitance can be measured, and the position of the finger may be determined based on these changes at various points along the capacitance sensor.
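A minimal sketch of locating a touch on the capacitance grid described above, assuming the controller keeps a baseline reading for each row/column intersection and that a nearby finger lowers the mutual capacitance there:

```python
def locate_touch(baseline, measured):
    """Find the grid intersection with the largest capacitance drop.

    `baseline` and `measured` are row-major grids of capacitance values,
    one per row/column intersection. Returns (row, col) of the touch,
    or None if no intersection changed. Units and layout are illustrative.
    """
    best, best_drop = None, 0.0
    for r, (b_row, m_row) in enumerate(zip(baseline, measured)):
        for c, (b, m) in enumerate(zip(b_row, m_row)):
            drop = b - m
            if drop > best_drop:
                best, best_drop = (r, c), drop
    return best
```

A production touch controller would interpolate between intersections and track multiple simultaneous touches; this shows only the single-touch peak-finding idea.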


In one embodiment, the piezoelectric actuators 100 may be embedded in the haptic feedback layer 301 so that the actuators 100 are aligned with the grid pattern formed by the electrodes 335 of the capacitance sensor 320. For example, the piezoelectric actuators 100 may be positioned above the spaces 322 defined between the rows 338 and columns 339 of the grid so that the spaces 322 and the actuators 100 are aligned along at least one vertical axis. As a change in capacitance is detected at a particular intersection 336 or group of intersections, a voltage may be supplied to the actuator 100 or group of actuators positioned proximate the intersections 336. The piezoelectric actuators 100 may or may not be positioned above every space of the grid. For example, a single piezoelectric actuator 100 may be provided for every other space of the grid or every third space of the grid. In another embodiment, multiple piezoelectric actuators 100 may be provided for some spaces.
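The mapping from a detected intersection to the actuator or group of actuators positioned proximate it can be illustrated with a short sketch. This is an assumption-laden example, not the patent's method: it treats each grid space as a candidate actuator site, indexes spaces by the rows and columns that bound them, and models the "every other space" population with a hypothetical `stride` parameter.

```python
# Illustrative mapping from a detected intersection (row, col) to the
# actuator-bearing grid spaces adjacent to it. One actuator may sit above
# a space only when (r + c) % stride == 0, modeling sparse population
# such as "every other space". All names are hypothetical.

def actuators_for_intersection(row, col, stride=2):
    """Return the (up to four) populated spaces touching the intersection."""
    spaces = [(row - 1, col - 1), (row - 1, col), (row, col - 1), (row, col)]
    return [(r, c) for r, c in spaces
            if r >= 0 and c >= 0 and (r + c) % stride == 0]
```

When a capacitance change is detected at an intersection, a controller could supply voltage only to the actuators at the returned spaces, producing feedback localized to the touch.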


As discussed above, the haptic feedback from the piezoelectric actuators 100 may allow for enhanced navigation of the content displayed on a display coupled to the touch-based user interface. In one embodiment, the piezoelectric actuators 100 may replace the mechanical “click” of a mouse, trackpad, or other user interface of an electronic device. For example, the touch-based user interface may confirm a “click” by supplying a voltage to the piezoelectric actuators 100 so that the user feels a vibration or other motion. In one embodiment, the electronic device may interpret a tapping motion on the surface of the touch-based user interface as corresponding to a click command. In contrast, when the user glides a finger or other object along the surface of the touch-based user interface, the piezoelectric actuators 100 may remain unactuated. Accordingly, a user may be able to ascertain whether the electronic device has interpreted an input gesture as a click or a scroll.
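The tap-versus-glide distinction above can be sketched as a simple classifier. The thresholds and event attributes (touch duration and travel distance) are assumptions for illustration; the patent does not specify how a device discriminates the gestures.

```python
# Minimal sketch of distinguishing a "click" (tap) from a scroll (glide),
# assuming each touch event reports its duration and travel distance.
# Threshold values are illustrative, not from the source.

def classify_gesture(duration_s, travel_mm,
                     max_tap_time=0.25, max_tap_travel=2.0):
    """Return 'click' for a short, stationary touch; 'scroll' otherwise."""
    if duration_s <= max_tap_time and travel_mm <= max_tap_travel:
        return "click"
    return "scroll"

def handle_gesture(duration_s, travel_mm, actuate):
    """Actuate the haptic layer only when the gesture is a click."""
    gesture = classify_gesture(duration_s, travel_mm)
    if gesture == "click":
        actuate()  # supply voltage so the user feels the confirming pulse
    return gesture
```

Because the actuators stay silent for scrolls, the felt pulse itself tells the user which interpretation the device chose.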


In another embodiment, the piezoelectric actuators 100 may allow the user to “feel” the selectable buttons or icons displayed by the electronic device. This embodiment may be particularly useful in a touch-based user interface that is not overlaid on a display screen, such as a trackpad or a mouse, in which the user cannot position a finger or other object directly over the displayed buttons and icons to select them. In one implementation, a voltage may be supplied to the piezoelectric actuators 100 when a cursor is positioned within selection range of a virtual button or icon. Accordingly, the user may feel a vibration or other motion indicating that the user may select the button with a selection input gesture.
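The "within selection range" test above amounts to a hit test of the cursor against the virtual buttons. Below is a hedged sketch under assumed conventions: buttons are axis-aligned rectangles given as (left, top, width, height), and a hypothetical `margin` parameter widens the region that triggers feedback.

```python
# Illustrative hit test: actuate when the cursor enters the selection
# range of a virtual button. Rectangle format and 'margin' are assumptions.

def within_selection_range(cursor, button, margin=0.0):
    """True if cursor (x, y) lies inside the button rect, grown by margin."""
    x, y = cursor
    left, top, width, height = button
    return (left - margin <= x <= left + width + margin and
            top - margin <= y <= top + height + margin)

def selectable_button(cursor, buttons, margin=2.0):
    """Return the first button the cursor could select, or None.
    A caller would supply voltage to the actuators when this is not None."""
    for button in buttons:
        if within_selection_range(cursor, button, margin):
            return button
    return None
```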



FIGS. 5A and 5B illustrate a top down view and a perspective view of another embodiment of a touch-based user interface 200. In this embodiment, the piezoelectric actuators 201 may take the form of one or more strands 203 that extend laterally across the haptic feedback layer 205. In one embodiment, the strands 203 may be parallel to one another, and may extend in a horizontal direction across the haptic feedback layer 205. In this embodiment, the electrically conductive traces connected to the strands 203 may be positioned on the sides of the haptic feedback layer 205, as opposed to underneath the haptic feedback layer 205 as in the embodiment shown in FIGS. 3A and 3B. The strands 203 may be embedded in the nonconductive material 207 such that the strands 203 are exposed and form part of the outer surface of the haptic feedback layer 205. Alternatively, the strands 203 may be covered by the nonconductive material 207.


In other embodiments, the traces may be positioned underneath the strands 203. In further embodiments, the strands 203 may not be parallel to one another, but may extend at angles with respect to one another. Additionally, the strands 203 may extend vertically or diagonally across the haptic feedback layer 205, rather than horizontally.



FIGS. 6A-6C illustrate one embodiment of a method for manufacturing a haptic feedback layer. In a first step, illustrated in FIG. 6A, one or more piezoelectric actuators 100 may be arranged in a mold 160. The mold 160 may define the shape of the formed haptic feedback layer. The actuators 100 may be arranged in any configuration. For example, the actuators 100 may be evenly spaced apart in the mold 160, or concentrated in one portion of the mold. The actuators 100 may have any shape. For example, the actuators may have a circular shape, a square shape, a triangular shape, or any other shape. The actuators may all be substantially identical, or some actuators may have a different configuration than other actuators.
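As a small aside on the "evenly spaced apart in the mold" arrangement, the actuator placement can be computed as a regular grid over the mold footprint. This is purely an illustrative layout helper; the mold dimensions, grid counts, and function name are assumptions, not taken from the source.

```python
# Illustrative layout helper: (x, y) centers for a rows x cols grid of
# actuators evenly spaced within a rectangular mold. All parameters are
# hypothetical; units are whatever the mold dimensions use.

def actuator_centers(mold_w, mold_h, rows, cols):
    """Return center points that divide the mold into equal cells, one
    actuator per cell, listed row by row."""
    dx, dy = mold_w / cols, mold_h / rows
    return [(dx * (c + 0.5), dy * (r + 0.5))
            for r in range(rows) for c in range(cols)]
```

Concentrating actuators in one portion of the mold, as the text also permits, would simply replace this uniform grid with a nonuniform point set.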


In a second step, illustrated in FIG. 6B, an adhesive 308 may be applied around all or a portion of the perimeter of the piezoelectric actuators 100. As discussed above, the adhesive 308 may bind the piezoelectric actuators 100 to the nonconductive material. In one embodiment, the adhesive 308 may be a hot melt adhesive that is applied around the perimeter of the actuators. Other embodiments may use other types of adhesive, as discussed above. In an alternate embodiment, the adhesive 308 may be applied after the nonconductive material is added to the mold. For example, the actuators 100 may be spaced apart from the nonconductive material, and the adhesive may be added after the nonconductive portion of the blank is formed. Additionally, some embodiments may not include an adhesive layer between the nonconductive material and the piezoelectric actuators 100. Accordingly, the adhesive application step described above is optional.


In a third step, illustrated in FIG. 6C, a nonconductive material 311 may be added to the mold to fill the spaces between the piezoelectric actuators 100. In one embodiment, the nonconductive material 311 may be heated to a liquid form and then poured into the mold 160 to fill the spaces between the actuators 100. In another embodiment, the nonconductive material 311 may be added to the mold 160 in solid form, and the actuators and the nonconductive material may be heated to melt the nonconductive material 311 so that it fills the spaces between the actuators 100. After the nonconductive material 311 is added to the mold 160, the composite layer may be heated, baked, or otherwise processed to form the final haptic feedback layer 301.


In one embodiment, the nonconductive material and the actuators may each define a portion of the outer surface of the haptic feedback layer. However, in other embodiments, the nonconductive material 311 may cover all or part of the actuators 100 to form one or more of the side, top, and/or bottom surfaces of the haptic feedback layer. Accordingly, the nonconductive material may define all of the outer surfaces of the haptic feedback layer, or the actuators may define a portion of one outer surface while the nonconductive material defines the remaining surfaces.



FIGS. 7A-7C illustrate another embodiment of a method for manufacturing a haptic feedback layer. In particular, FIG. 7A illustrates a perspective view of a composite blank 700 that may be used to form a haptic layer of a touch-based user interface. As shown in FIG. 7A, the composite blank may include one or more piezoelectric strands 201 that are conjoined with a nonconductive material 311. The strands 201 may have any cross-sectional configuration. For example, the strands may have a circular cross-section, a rectangular cross-section, a square cross-section, a triangular cross-section, and so on. In one embodiment, the piezoelectric strands may be parallel to one another such that the strands form one or more rows and one or more columns within the composite blank. However, in other embodiments, the piezoelectric strands may extend at angles from one another. As discussed above, the side surfaces of the piezoelectric strands may be joined to the nonconductive material by an adhesive material 308.


The composite blank 700 may be formed in a manner similar to that described with respect to the method for forming a haptic layer illustrated in FIGS. 6A-6C. That is, the composite blank 700 may be formed by arranging the piezoelectric strands 201 in an array configuration, and then filling the spaces between the strands with a nonconductive material 311. The array of strands may first be arranged in a mold defining the shape of the blank.


As discussed above, the spaces between the strands of the array may then be filled with the nonconductive material 311. In one embodiment, the nonconductive material 311 may be heated to a liquid state, and then poured over the array of piezoelectric strands 201. In other embodiments, the nonconductive material, in solid form, may be placed around the piezoelectric strands, and the strands and the nonconductive material may be heated so that the nonconductive material is melted and fills the gaps between the strands. In one embodiment, adhesive 308 may be applied to the side edges of the strands before the nonconductive material is added to the mold.


The formed composite blank 700 may then be cut to form different configurations of touch-based user interface devices. In one embodiment, shown in FIG. 7B, the blank may be cut along a plane perpendicular to the direction of extension of the strands to form a haptic feedback layer 301 similar to that shown in FIGS. 3A and 3B, with the shape of the piezoelectric actuators 100 varying according to the cross-sectional profile of the strands 201. The blank may be cut using, for example, a computer-numerical controlled laser cutting tool, or alternatively, a mechanical cutting tool such as a blade. In another embodiment, shown in FIG. 7C, the blank 700 may be cut along a plane parallel to the direction of extension of the strands 201 to form a haptic feedback layer 205 similar to that shown in FIGS. 5A and 5B. For example, the blank may be cut along the nonconductive areas between the strands so that the piezoelectric strands of the resulting touch-based user interface are covered by the nonconductive material. Alternatively, the blank may be cut to expose the strands 201 so that the strands form at least part of the outer surface of the resulting haptic feedback layer 205.



FIG. 8 is a flowchart illustrating one embodiment of a method 800 for manufacturing a haptic feedback layer. For example, the illustrated method 800 may be used to form an embodiment similar to that shown in FIGS. 3A and 3B. The method 800 may begin by arranging one or more piezoelectric actuators in a spaced-apart configuration within a mold, as indicated in block 801. As discussed above, the mold may define the shape of the formed haptic feedback layer. The actuators may be arranged in any configuration. For example, the actuators may be evenly spaced apart in the mold, or concentrated in one portion of the mold.


An adhesive may then be applied around at least a portion of the perimeter of the piezoelectric actuators, as indicated in block 803. As discussed above, the adhesive may bind the piezoelectric actuators to the nonconductive material. In some embodiments, the adhesive may have a high modulus of elasticity so as to allow the piezoelectric actuators to move relative to the nonconductive material while resisting debonding of the actuators and the nonconductive material, as well as cracking or wear of the adhesive itself.


The spaces between the actuators may be filled with a nonconductive material, as indicated in block 805. As discussed above, in one embodiment, the nonconductive material may be heated into liquid form and poured into the mold to fill the spaces between the actuators. In other embodiments, the nonconductive material may be inserted into the mold in solid form, and the actuators and the nonconductive material may be heated so that the nonconductive material fills the spaces between the actuators.



FIG. 9 is a flowchart illustrating another embodiment of a method 900 for manufacturing a haptic feedback layer. For example, the illustrated method 900 may be used to form embodiments similar to those shown in FIGS. 3A and 3B and FIGS. 5A and 5B. The method 900 may begin by arranging one or more piezoelectric strands in a spaced-apart configuration, as indicated in block 901. As discussed above, the spacing between the strands and the configuration of the strands may vary according to different embodiments. The strands may first be arranged in a mold defining the shape of the blank. The spaces between the strands may then be filled with a nonconductive material to form a composite blank, as indicated in block 903. As discussed above, the nonconductive material may be heated into liquid form and poured into the mold to fill the spaces between the strands. In other embodiments, the nonconductive material may be inserted into the mold in solid form, and the strands and the nonconductive material may be heated so that the nonconductive material fills the spaces between the strands.


The composite blank may then be cut, as indicated in block 905. As discussed above, in one embodiment, the composite blank may be cut along a plane perpendicular to the direction of extension of the strands. In another embodiment, the composite blank may be cut along a plane parallel to the direction of extension of the strands so that the formed haptic feedback layer includes one or more strands extending across it. The strands may be exposed, so that the strands form a portion of the outermost surface of the haptic feedback layer, or may be covered by the nonconductive material.


The order of execution or performance of the methods illustrated and described herein is not essential, unless otherwise specified. That is, elements of the methods may be performed in any order, unless otherwise specified, and the methods may include more or fewer elements than those disclosed herein. For example, it is contemplated that executing or performing a particular element before, contemporaneously with, or after another element are all possible sequences of execution.

Claims
  • 1. A touch-based user interface, comprising: a haptic feedback layer comprising a layer of nonconductive material and one or more actuators embedded within the layer of nonconductive material, the one or more actuators configured to supply a haptic feedback; one or more force sensors configured to differentiate among a plurality of input commands based, at least in part, on an amount of force sensed by the one or more force sensors; and a printed circuit board layer, disposed on a first side of the haptic feedback layer, and positioned between the haptic feedback layer and the one or more force sensors, the printed circuit board layer including one or more conductive traces configured to supply a voltage to the one or more actuators, wherein a second side of the haptic feedback layer opposite to the first side is positioned toward an outermost surface of the touch-based user interface.
  • 2. The touch-based user interface of claim 1, wherein at least one of the one or more actuators is a piezoelectric actuator.
  • 3. The touch-based user interface of claim 1, wherein the printed circuit board layer further comprises a capacitive sensor.
  • 4. The touch-based user interface of claim 3, wherein the capacitive sensor includes a first layer of electrodes and a second layer of electrodes, the first layer of electrodes overlaying the second layer of electrodes to form a grid defining one or more spaces between the first layer of electrodes and the second layer of electrodes.
  • 5. The touch-based user interface of claim 4, wherein at least one of the one or more actuators is aligned along at least one vertical axis with at least one of the one or more spaces of the grid.
  • 6. The touch-based user interface of claim 1, wherein the one or more force sensors are positioned underneath the printed circuit board layer.
  • 7. The touch-based user interface of claim 1, wherein the one or more actuators forms at least a portion of an outer surface of the haptic feedback layer.
  • 8. The touch-based user interface of claim 1, wherein the one or more actuators is joined to the nonconductive material by an adhesive.
  • 9. The touch-based user interface of claim 1, wherein at least one of the one or more actuators is configured to move independently from another actuator of the one or more actuators.
  • 10. The touch-based user interface of claim 1, wherein the one or more actuators form rows of actuators that extend laterally across the haptic feedback layer.
  • 11. The touch-based user interface of claim 1, further comprising a cover layer overlaying the haptic feedback layer.
  • 12. The touch-based user interface of claim 1, wherein the nonconductive material has a modulus of rigidity to prevent the nonconductive material from moving relative to the one or more actuators when a voltage is supplied to the one or more actuators.
  • 13. A method for manufacturing a haptic feedback layer, comprising: arranging one or more piezoelectric actuator strands so that each of the one or more piezoelectric actuator strands are spaced apart from one another; melting a nonconductive material to form a liquid; filling spaces between the piezoelectric actuator strands with the melted nonconductive material to form a blank so that the piezoelectric actuator strands are embedded within the nonconductive material; cutting the blank to form a haptic feedback layer; orienting one side of the haptic feedback layer toward a first side of a printed circuit board layer; and coupling one or more force sensors to a second side of the printed circuit board layer that is opposite to the first side, wherein the one or more force sensors are configured to differentiate among a plurality of input commands based, at least in part, on an amount of force detected by the one or more force sensors.
  • 14. The method of claim 13, wherein the blank is cut along a plane perpendicular to a direction of extension of the piezoelectric actuator strands.
  • 15. The method of claim 13, wherein the blank is cut along a plane parallel to a direction of extension of the piezoelectric actuator strands.
  • 16. The method of claim 15, wherein the blank is cut such that only the nonconductive material between the piezoelectric actuator strands is cut.
  • 17. A touch-based input device, comprising: one or more actuators embedded within a layer of nonconductive material and configured to supply a haptic feedback; and one or more force sensors embedded within a printed circuit board layer and configured to differentiate among a plurality of input commands based, at least in part, on an amount of force sensed by the one or more force sensors; wherein the printed circuit board layer is attached to the layer of nonconductive material and comprises one or more conductive traces that supply a voltage to the one or more actuators, and the layer of nonconductive material is positioned toward an outermost surface of the touch-based input device.
  • 18. The touch-based input device of claim 17, wherein the outermost surface of the touch-based input device is configured to flex in response to a received force.
  • 19. The touch-based input device of claim 17, wherein a first input command of the plurality of input commands is associated with a first gesture and a second input command of the plurality of input commands is associated with a second gesture.
  • 20. The touch-based user interface of claim 4, wherein the capacitive sensor measures a change in capacitance at a location at which a first electrode of the first layer of electrodes crosses above a second electrode of the second layer of electrodes.
2374430 Oct 2011 EP
2395414 Dec 2011 EP
2461228 Jun 2012 EP
2631746 Aug 2013 EP
2434555 Oct 2013 EP
H05301342 Nov 1993 JP
2002199689 Jul 2002 JP
2002102799 Sep 2002 JP
200362525 Mar 2003 JP
2004236202 Aug 2004 JP
20050033909 Apr 2005 KR
1020100046602 May 2010 KR
1020110101516 Sep 2011 KR
20130024420 Mar 2013 KR
200518000 Nov 2007 TW
200951944 Dec 2009 TW
201145336 Dec 2011 TW
201218039 May 2012 TW
201425180 Jul 2014 TW
WO199716932 May 1997 WO
WO2001059588 Aug 2001 WO
WO2002073587 Sep 2002 WO
WO2003038800 May 2003 WO
WO2006057770 Jun 2006 WO
WO2007114631 Oct 2007 WO
WO2008075082 Jun 2008 WO
WO2009038862 Mar 2009 WO
WO2009068986 Jun 2009 WO
WO2009097866 Aug 2009 WO
WO2009122331 Oct 2009 WO
WO2009150287 Dec 2009 WO
WO2010085575 Jul 2010 WO
WO2010087925 Aug 2010 WO
WO2011007263 Jan 2011 WO
WO2012052635 Apr 2012 WO
WO2012129247 Sep 2012 WO
WO2013069148 May 2013 WO
WO2013169302 Nov 2013 WO
WO2014018086 Jan 2014 WO
WO2013169299 Nov 2014 WO
WO2015023670 Feb 2015 WO
Non-Patent Literature Citations (30)
Entry
International Search Report and Written Opinion, PCT/US2011/048808, 23 pages, dated Apr. 2, 2012.
Hasser et al., “Preliminary Evaluation of a Shape-Memory Alloy Tactile Feedback Display,” Advances in Robotics, Mechatronics, and Haptic Interfaces, ASME, DSC—vol. 49, pp. 73-80, 1993.
Hill et al., “Real-time Estimation of Human Impedance for Haptic Interfaces,” Stanford Telerobotics Laboratory, Department of Mechanical Engineering, Stanford University, 6 pages, at least as early as Sep. 30, 2009.
Lee et al., “Haptic Pen: Tactile Feedback Stylus for Touch Screens,” Mitsubishi Electric Research Laboratories, http://www.merl.com, 6 pages, Oct. 2004.
Office Action dated May 15, 2014, TW 100132478, 8 pages.
Kim et al., “Tactile Rendering of 3D Features on Touch Surfaces,” UIST '13, Oct. 8-11, 2013, St. Andrews, United Kingdom, 8 pages.
European Search Report dated Jul. 7, 2015, EP 11752699.6, 7 pages.
U.S. Appl. No. 14/910,108, filed Feb. 4, 2016, Martinez et al.
U.S. Appl. No. 15/045,761, filed Feb. 17, 2016, Morrell et al.
U.S. Appl. No. 15/046,194, filed Feb. 17, 2016, Degner et al.
U.S. Appl. No. 15/047,447, filed Feb. 18, 2016, Augenbergs et al.
U.S. Appl. No. 15/102,826, filed Jun. 8, 2016, Smith et al.
U.S. Appl. No. 15/251,459, filed Aug. 30, 2016, Miller et al.
U.S. Appl. No. 15/260,047, filed Sep. 8, 2016, Degner.
U.S. Appl. No. 15/306,034, filed Oct. 21, 2016, Bijamov et al.
Astronomer's Toolbox, “The Electromagnetic Spectrum,” http://imagine.gsfc.nasa.gov/science/toolbox/emspectrum1.html, updated Mar. 2013, 4 pages.
U.S. Appl. No. 15/364,822, filed Nov. 30, 2016, Chen.
U.S. Appl. No. 15/583,938, filed May 1, 2017, Hill.
Nakamura, “A Torso Haptic Display Based on Shape Memory Alloy Actuators,” Massachusetts Institute of Technology, 2003, pp. 1-123.
U.S. Appl. No. 15/621,966, filed Jun. 13, 2017, Pedder et al.
U.S. Appl. No. 15/621,930, filed Jun. 13, 2017, Wen et al.
U.S. Appl. No. 15/622,017, filed Jun. 13, 2017, Yang et al.
U.S. Appl. No. 15/641,192, filed Jul. 3, 2017, Miller et al.
U.S. Appl. No. 15/800,630, filed Nov. 1, 2017, Morrell et al.
U.S. Appl. No. 13/630,867, filed Sep. 28, 2012, Bernstein.
U.S. Appl. No. 14/059,693, filed Oct. 22, 2013, Puskarich.
U.S. Appl. No. 14/165,475, filed Jan. 27, 2014, Hayskjold et al.
U.S. Appl. No. 14/493,190, filed Sep. 22, 2014, Hoen.
U.S. Appl. No. 14/512,927, filed Oct. 13, 2014, Hill.
U.S. Appl. No. 14/928,465, filed Oct. 30, 2015, Bernstein.
Related Publications (1)
Number Date Country
20120068957 A1 Mar 2012 US