1. Technical Field
This invention relates generally to user interface peripherals, and more particularly to a user interface configured to deliver a haptic response to a user input element.
2. Background Art
Compact portable electronic devices are becoming increasingly popular. As more and more users carry these electronic devices, manufacturers are designing smaller devices with increased functionality. By way of example, not too long ago a mobile telephone was a relatively large device; its only function was that of making telephone calls. Today, however, mobile telephones fit easily in a shirt pocket and often include numerous “non-phone” features such as cameras, video recorders, games, web browsers, and music players.
Just as the feature set included with compact portable electronic devices has become more sophisticated, so too has the hardware itself. Most portable electronic devices of the past included only manually operated buttons. Today, however, manufacturers are building devices with “touch sensitive” screens and user interfaces that include no physical buttons or keys. Instead of pressing a button, the user touches “virtual buttons” presented on the display to interact with the device.
Despite the convenience and flexibility of these devices, many users today still prefer the familiarity of a more classic user interface. Some find the small touch screen user interfaces cumbersome to operate and prefer, for example, a full size QWERTY keyboard. While some electronic devices allow a conventional keyboard to be coupled as a user interface, prior art keyboard technology results in large form-factor designs. Users generally do not want to carry large keyboards along with their compact electronic device. As a result, such keyboards are relegated to limited usage. It would be advantageous to have an improved user input device.
Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of embodiments of the present invention.
Various embodiments describe and illustrate a compact user interface, suitable for use with an electronic device, which provides a “legacy” feel. Embodiments include an electromechanical user interface design that delivers the tactile feedback of a conventional keypad or keyboard with a form factor suitable for use with modern, compact, electronic devices. In short, embodiments described below provide a conventional user interface experience with an interface peripheral that is very thin, simple, and compact.
In one or more embodiments, a user interface element configured as a key is disposed above an engagement layer that spans two or more keys and that can selectively engage a single key. The user interface elements can be supported on a common carrier, which may be a thin, flexible sheet.
The engagement layer can define a plurality of apertures, with each aperture corresponding to a boss extending distally away from the user interface element. If the user interface element has a single boss, for example, the engagement layer may have a single aperture corresponding to the user interface element. Where the user interface element has multiple bosses, multiple apertures of the engagement layer can correspond to the user interface element. As will be shown and described below, the boss and aperture can have similar or different shapes. In one embodiment, the boss has a round cross section while the aperture is a different shape, e.g., a rectangle.
A membrane switch can be disposed beneath the user interface element opposite the engagement layer. Separators or spacers can separate layers of the membrane switch beneath the engagement layer. The separators or spacers, which may be single devices or multiple stacked devices, can be configured to allow a user to rest his or her fingers on the user interface elements without those user interface elements traveling far enough along the z-axis (up and down) to close a switch. When a user actuates the user interface element by pressing upon it to deliver a sufficient magnitude of user input force, the membrane switch closes. A control module detects the switch closing. As the user presses the user interface element, the boss can pass through its corresponding aperture to contact a substrate. The boss can then expand to grasp or “engage” the engagement layer. Prior to or during engagement, the control module can fire a motion generation component coupled to the engagement layer to deliver a haptic response through the engagement layer to the pressed user interface element. Note that even though the engagement layer spans multiple user interface elements, a haptic response is delivered only to those user interface elements that are actuated by the user. Accordingly, a “localized” haptic response is delivered only to actuated user interface elements and not to those unactuated elements spanned by the engagement layer. With this construction, the user interface peripheral can be made thirty to sixty percent thinner than conventional keyboards. While the interface peripherals described below can deliver a tactile response to only a single key, multi-key tactile feedback can be delivered as well. For example, when the user presses multiple keys, e.g., CTRL+ALT+DEL, the haptic feedback can be delivered to all three actuated keys simultaneously.
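By way of illustration only, the following minimal firmware sketch (in C) shows one way a control module might implement this behavior. The routines membrane_switch_closed(), report_key_event(), and haptic_fire() are hypothetical hardware shims, not part of this disclosure; a single firing suffices even for multi-key presses, because the moving engagement layer reaches every key that is mechanically engaged.

```c
#include <stdbool.h>
#include <stdint.h>

#define NUM_KEYS 64U

/* Assumed hardware-access shims; real implementations are board-specific. */
extern bool membrane_switch_closed(uint8_t key);   /* read one switch      */
extern void haptic_fire(uint16_t duration_ms);     /* pulse the actuator   */
extern void report_key_event(uint8_t key, bool down);

void scan_and_respond(void)
{
    static bool was_down[NUM_KEYS];
    bool any_new_press = false;

    for (uint8_t key = 0; key < NUM_KEYS; key++) {
        bool down = membrane_switch_closed(key);
        if (down && !was_down[key]) {
            report_key_event(key, true);
            any_new_press = true;   /* CTRL+ALT+DEL: all three qualify */
        } else if (!down && was_down[key]) {
            report_key_event(key, false);
        }
        was_down[key] = down;
    }

    /* One firing reaches every engaged (i.e., pressed) key simultaneously,
     * so the response remains localized to actuated keys. */
    if (any_new_press)
        haptic_fire(10);
}
```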
User interface peripherals illustrated and described below can work in reverse as well. As will be shown, when the interface is integrated into a folio configuration, for example, a user may actuate the rear side of the user interface element to receive the haptic feedback as well. Said differently, if the interface peripheral is configured as a keypad in a folio, the user can close the folio and press the back layer to actuate one of the user interface elements. Other features can be included as well. For instance, in one or more embodiments the user interface elements also include light pipes that conduct light to provide a backlit user input interface experience. Thus, single user interface elements can be illuminated when they are pressed by a user.
In one or more embodiments, the interface is configured as a keypad that can use mechanical pressure, force sensing devices, resistive touch, and multi-touch technology to deliver haptic responses to the user. The keypad can be made of a thin pliable material, such as rubber, silicone, or polymer materials. The component interaction surfaces can take a variety of shapes, including semi-spherical, triangular, rectangular, and so forth. When keys are pressed, the component interaction surface forms a variable area contact point. When used with a force sensor, such as a force sensitive resistor, the variable area can be used to determine force. In one or more embodiments, the tactile response delivered to the key can be partially dependent upon the detected force. Although the user interfaces shown are described as separate peripheral devices, the user interfaces could easily be modified to be integrated into the main electronic device. Other form factors are also available, such as accessories for the main electronic device.
The explanatory haptic user interface system 100 comprises an interface peripheral 101 operable with an electronic device 102.
A bus 104 conveys electronic signals between the electronic device 102 and the interface peripheral 101 in this illustration. It will be clear to those of ordinary skill in the art having the benefit of this disclosure that the interface peripheral 101 and electronic device can be configured to exchange electronic signals in any of a variety of ways, including via wire, bus, wireless communications such as Bluetooth, Bluetooth Low Energy (BTLE), Wi-Fi, or other wireless communications, optical communication (including infrared), and so forth.
The folio configuration shown is illustrative only; the interface peripheral 101 can take other form factors as well.
A plurality of user input elements, e.g., user input elements 107, 108, 109, 110, 111, 112 are disposed along a major face of the interface peripheral 101. Each user input element 107, 108, 109, 110, 111, 112 is moveable along a first axis to close a switch. In this illustrative embodiment, the interface peripheral 101 is configured as a QWERTY keypad, with each user input element 107, 108, 109, 110, 111, 112 being configured as a key. Other configurations, including a musical keyboard, gaming keyboard, or learning keyboard, will be described below.
A user 116 actuates one or more of the user input elements 107, 108, 109, 110, 111, 112 by moving a user input element 112 along the first axis. Sufficient movement of the user input element 112 along the first axis closes a switch disposed beneath the user input element 112. Disposed between the user input element 112 and the switch is a mechanical layer that spans a plurality of the user input elements 107, 108, 109, 110, 111, 112 along the second and third axes. Examples of mechanical layers will be described in more detail below.
When the user 116 actuates the user input element 112 and the switch closes, in one embodiment a boss extending from the user input element 112 is configured to expand in response to the application of force to engage the mechanical layer. The control module actuates a haptic device coupled to the mechanical layer to deliver a haptic response 117 to the user input element 112. In one embodiment, the haptic response 117 is delivered to the user input element 112 when engaged with the mechanical layer. This embodiment will be described in more detail below.
Each user input element 207, 208, 209, 210 includes a user interaction surface 221 with which a user may press or otherwise actuate the user input element 207, 208, 209, 210. Each user input element 207, 208, 209, 210 in this explanatory embodiment also includes a boss 246 extending distally away from the user interaction surface 221. While each user input element 207, 208, 209, 210 shown here includes a single boss 246, other embodiments can include multiple bosses per user input element.
Each boss 246 terminates in a component interaction surface 247. Illustrative shapes for the component interaction surfaces 247, including semi-spherical, frustoconical, and contoured terminations, are described below.
Disposed beneath the user input elements 207, 208, 209, 210 is an engagement layer 222. The engagement layer 222 can be configured as a thin metal layer or thin plastic layer, and forms a mechanical layer that spans two or more of the user input elements 207, 208, 209, 210. As will be explained in more detail below, the engagement layer 222 can comprise a lightguide. In this explanatory embodiment, the engagement layer 222 spans all of the user input elements 207, 208, 209, 210.
The engagement layer 222 defines a plurality of apertures 223, 224, 225, 226 that correspond to the user input elements 207, 208, 209, 210. In one embodiment, the engagement layer 222 is a conduit for light projected by light sources of the interface peripheral 201, and accordingly can function as a light guide to backlight or otherwise illuminate the interface peripheral 201. Since only one boss 246 extends from each user input element 207, 208, 209, 210, the apertures 223, 224, 225, 226 correspond to the user input elements 207, 208, 209, 210 on a one-to-one basis.
The shape of the boss 246 and shape of the apertures 223, 224, 225, 226 can correspond to each other or can be different. For example, in one embodiment, a perimeter 227 of an aperture 223 can be the same shape as the cross section of the boss 246. The perimeter 227 can be circular when the cross section of the boss 246 is circular. In another embodiment, the perimeter 227 of an aperture 223 can be similar to, but different from, the cross section of the boss 246. For instance, if the cross section of the boss 246 is circular, the perimeter 227 of the aperture 223 can be oval. In yet another embodiment, the perimeter 227 of the aperture 223 can be different from the cross section of the boss 246.
In one or more embodiments, a width 230 of each aperture 223, 224, 225, 226 is greater than a diameter 231 of the boss 232 to which it corresponds. This configuration allows the boss 232 to initially pass through the corresponding aperture 233 when the user moves the corresponding user input element 234 along the z-axis 115 (in the negative direction).
One or more motion generation components 228, 229 can be coupled to the engagement layer 222. In one embodiment, the motion generation components 228, 229 are piezoelectric devices. Other devices can also be used, including vibrator motors, rotator motors, artificial muscles, electrostatic plates, or combinations thereof. While piezoelectric transducers are but one suitable type of motion generation component, they are well suited to embodiments of the present invention in that they provide a relatively fast response and a relatively high resonant frequency. Prior art haptic feedback systems have attempted to mount such devices directly to the device housing or the user interface surface. Such configurations are problematic, however, in that piezoelectric materials tend to be weak or brittle when subjected to impact forces. Consequently, when such a prior art configuration is dropped, these “directly coupled” configurations can break or malfunction. Embodiments of the present invention avoid such maladies in that the piezoelectric devices are coupled to the engagement layer 222, which is suspended within the interface peripheral 201. The piezoelectric devices are able to vibrate independently of the outer housing. This configuration is better able to withstand common drop tests.
Engagement of the engagement layer 222 with actuated user input elements will be described in more detail below.
“Engagement” as used with the engagement layer refers to mechanically grasping, clenching, holding, catching, seizing, grabbing, deforming, or latching to the user input element 207, 208, 209, 210. For example, a boss 232 can be configured to contact a lower layer 235 and a rigid substrate 245 when a user moves the corresponding user input element 234 along the z-axis 115 (in the negative direction). Where the boss 232 is manufactured from a pliant material, this contact can cause the diameter 231 of the boss 232 to expand along the x-axis 113 and y-axis 114 after being depressed against the rigid substrate 245. As the diameter 231 expands, the pliant material of the boss 232 “engages” the engagement layer by grasping the sides of the corresponding aperture 233. Said differently, the boss 232 in this embodiment is configured to expand upon actuation to grip a perimeter of its corresponding aperture 233. This is one example of engagement. Others will be described below.
In one embodiment, when the engagement layer 222 is engaged with an actuated user input element, the engagement layer 222 delivers a haptic response to an actuated user input element when the motion generation component 228, 229 actuates. This occurs as follows: actuation of the motion generation component 228, 229 causes movement of the engagement layer 222 along the x-axis 113, the y-axis 114, or combinations thereof. When engaged with an actuated user input element 234 that has been moved along the z-axis 115 such that its boss 232 has engaged with a corresponding aperture 233, a haptic response will be delivered to the engaged user input element 234.
A lower layer 235 is disposed on a side of the engagement layer 222 opposite the user input elements 207, 208, 209, 210. The lower layer 235 may be combined with a substrate 245 that serves as a base of the interface peripheral 201. The substrate 245 can be rigid. For example, the substrate 245 can be manufactured from FR4 printed wiring board material that can also function as a structural element. The lower layer 235 can be configured as a flexible material or as part of the substrate 245.
Disposed between the lower layer 235 and the engagement layer 222 is an array 236 of switches. In this explanatory embodiment, the switches of the array 236 are each membrane switches. Membrane switches, which are known in the art, are electrical switches capable of being turned on or off. The membrane switches of this embodiment each comprise a first conductor carried on a flexible layer above a corresponding second conductor.
When a boss, e.g., boss 232, passes through a corresponding aperture 233 of the engagement layer 222 along the z-axis 115, it contacts one of the first conductors 237, 238, 239 and deforms the flexible layer 240. As the boss 232 continues to move along the z-axis 115, the first conductor 239 engaged by the boss 232 contacts one of the second conductors 241, 242, 243, 244, 245. When this occurs, one switch of the array 236 closes and user input is detected.
When this user input is detected, a control module can actuate or fire one or more of the motion generation components 228, 229. In one embodiment, a delay between closing of the switch and firing of the motion generation component can be inserted. For example, in an embodiment where engagement of the boss 232 with a corresponding aperture 233 occurs when the boss 232 expands along the x-axis 113, y-axis 114, or both, the delay may be inserted to ensure enough time passes for engagement to occur.
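A sketch of this optional delay follows, under the same assumptions as the previous example; delay_us() is a further hypothetical shim, and the delay duration is an assumed value to be tuned empirically so that compression engagement completes before the engagement layer moves.

```c
#include <stdint.h>

#define ENGAGEMENT_DELAY_US 2000U   /* assumed value; tuned empirically */

extern void delay_us(uint32_t us);
extern void haptic_fire(uint16_t duration_ms);

void on_switch_closed(void)
{
    delay_us(ENGAGEMENT_DELAY_US);  /* allow the boss to expand and grip */
    haptic_fire(10);                /* then move the engagement layer    */
}
```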
Each of the first conductive layer 340 and the second conductive layer 335 has a resistance such that current passing through one or both of the first conductive layer 340 and the second conductive layer 335 can be varied by the amount of contact between the first conductive layer 340 and the second conductive layer 335. When a user applies force to user input element 307, the compressible spacers 331, 332, 333, 334, 336, 337, 338, 339 compress. When enough compression occurs, the first conductive layer 340 and second conductive layer 335 come into contact, thereby closing a switch and allowing a current to flow in accordance with a resistance established by the contact surface area between the first conductive layer 340 and the second conductive layer 335. In addition to triggering a motion generation component upon the closing of the switch, the amount of current flowing can be detected to determine a magnitude of force being applied to the user input element 307.
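As a hedged illustration, the sketch below estimates the contact resistance when the variable-resistance contact is read through a simple voltage divider into an ADC; the supply voltage, fixed resistor value, and the adc_read_mv() shim are assumptions, not part of this disclosure. A lower computed resistance corresponds to a larger contact area and hence a larger applied force.

```c
#include <stdint.h>

#define V_SUPPLY_MV 3300U    /* assumed 3.3 V supply             */
#define R_FIXED_OHM 10000U   /* assumed fixed divider resistor   */

extern uint16_t adc_read_mv(void);  /* assumed shim: divider tap, in mV */

/* With the variable contact on the high side of the divider,
 * Vout = Vs * Rf / (Rf + Rs), so Rs = Rf * (Vs - Vout) / Vout.
 * Lower resistance means more contact area, i.e., more force. */
uint32_t contact_resistance_ohm(void)
{
    uint32_t v = adc_read_mv();
    if (v == 0U)
        return UINT32_MAX;          /* open circuit: switch not closed */
    return (uint32_t)R_FIXED_OHM * (V_SUPPLY_MV - v) / v;
}
```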
In one illustrative embodiment using stacked spacers 431, 432, 433, 434, the stacked spacers 431, 432, 433, 434 can define different aperture shapes. For example, stacked spacers 431, 432 can define a square aperture, while stacked spacers 433, 434 define a round aperture, or vice versa. In other embodiments, each of the stacked spacers 431, 432, 433, 434 can define apertures with a common shape. For example, the perimeter of the defined apertures 450 can be the same shape as the boss 446. Where single spacers 531, 532 are used, the perimeter of the apertures 550 can be rectangular in shape, while the boss 546 has a round cross section. Testing has shown that a configuration using stacked spacers 431, 432, 433, 434, with stacked spacers 431, 432 defining a square aperture and stacked spacers 433, 434 defining a round aperture, allows a user to rest fingers on the user input elements without closing the membrane switch.
At step 660, the interface peripheral 400 is in a non-actuated state. The user input element 407 rests on the key carrier 403. At step 661, a force 666 is applied to the user interaction surface 421 of the user input element 407. The force 666 translates the user input element 407 along the z-axis 115 (in the negative direction). This translation moves the boss 446 through the engagement layer 422. The translation also closes the membrane switch by pressing flexible layer 440 against lower layer 435, thereby causing the contacts on each to electrically connect.
At step 662, the continued pressure upon the user input element 407 along the z-axis 115 when opposed by the substrate 430 causes the boss 446 to expand, thereby engaging the engagement layer 422 by expanding and gripping the perimeter of the aperture of the engagement layer 422. This is known as “compression engagement.”
At step 663, a control module, triggered by the membrane switch closing at step 661, fires a haptic element coupled to the engagement layer 422. This causes the engagement layer 422 to move along an axis substantially orthogonal to the z-axis 115 to deliver the haptic response 617 to the user input element 407. For instance, where a first haptic device coupled to the engagement layer 422 is oriented to impart a force upon the engagement layer 422 along the x-axis 113 and a second haptic device coupled to the engagement layer 422 is oriented to impart another force along the y-axis 114, firing the haptic elements can cause the engagement layer 422 to move along the x-axis 113, the y-axis 114, or combinations thereof.
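A minimal sketch of such combined firing, assuming separate drive lines for the two orthogonally oriented haptic devices; haptic_x_fire() and haptic_y_fire() are hypothetical shims.

```c
#include <stdbool.h>
#include <stdint.h>

extern void haptic_x_fire(uint16_t duration_ms);  /* assumed x-axis device */
extern void haptic_y_fire(uint16_t duration_ms);  /* assumed y-axis device */

void move_engagement_layer(bool use_x, bool use_y)
{
    if (use_x)
        haptic_x_fire(10);
    if (use_y)
        haptic_y_fire(10);  /* firing both moves the layer along x and y */
}
```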
It should be noted that steps 662 and 663 could occur in either order. In one embodiment, the haptic element will be fired before the boss 446 engages the engagement layer 422. Said differently, step 663 will occur before step 662. In another embodiment, the boss 446 will engage the engagement layer 422 prior to the haptic device firing. In other words, step 662 will occur before step 663. One way to ensure the latter embodiment occurs is to insert a delay between the closing of the switch occurring at step 661 and the firing of the haptic element that occurs at step 663.
At step 760, the interface peripheral 400 is in a non-actuated state. The user input element 407 rests on the key carrier 403. At step 761, a force 752 is applied to the user interaction surface 421 of the user input element 407. The force 752 translates the user input element 407 along the z-axis 115 (in the negative direction). This translation moves the boss 446 through the engagement layer 422. The translation also closes the membrane switch by pressing flexible layer 440 against lower layer 435, thereby causing the contacts on each to electrically connect.
At step 762, a control module fires a haptic element coupled to the engagement layer 422. This causes the engagement layer 422 to move along an axis substantially orthogonal to the z-axis 115. For instance, where a first haptic device coupled to the engagement layer 422 is oriented to impart a force upon the engagement layer 422 along the x-axis 113 and a second haptic device coupled to the engagement layer 422 is oriented to impart another force along the y-axis 114, firing the haptic elements can cause the engagement layer 422 to move along the x-axis 113, the y-axis 114, or combinations thereof.
At step 763, the continued translation of the engagement layer 422 along the x-axis 113, y-axis 114, or a combination thereof, causes the engagement layer 422 to engage the user input element 407. This engagement grips at least a portion of the boss 446 against the engagement layer 422 and delivers the haptic response 717 to the user input element 407. This is known as “translation engagement.”
As with the engagement layer (222) described above, the engagement layers 822, 922 span multiple user input elements and selectively engage only those that are actuated.
One or more motion generation components 828, 829, 928, 929 can be coupled to the engagement layers 822, 922.
Since multiple sheets 881, 882, 883, 991, 992, 993, 994 are used in these embodiments, a haptic response can be localized to the particular sheet spanning an actuated user input element.
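As a hedged illustration of per-sheet localization, a control module might map each key index to the sheet spanning it; the grouping below is an assumption for illustration only.

```c
#include <stdint.h>

#define KEYS_PER_SHEET 16U  /* assumed grouping */

/* Map a key index to the engagement sheet that spans it. */
uint8_t sheet_for_key(uint8_t key)
{
    return (uint8_t)(key / KEYS_PER_SHEET);  /* keys 0-15 on sheet 0, etc. */
}
```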
A magnified view of one embodiment of a force sensing resistive switch 1010 shows the switch configured as an electrode node 1016. This electrode node 1016 can be repeated on the contact layer 1035 to form the array of force sensing resistive switches.
The electrode node 1016 has two conductors 1017, 1018. The conductors 1017, 1018 may be configured as exposed copper or aluminum traces on a printed circuit board or flexible substrate 1030. The two conductors 1017, 1018 are not electrically connected with each other. In one embodiment, the two conductors 1017, 1018 terminate in an interlaced finger configuration where a plurality of fingers from the first conductor 1017 alternate in an interlaced relationship with a plurality of fingers from the second conductor 1018.
The electrode node 1016 can be configured in a variety of ways. For example, in one embodiment the electrode node 1016 can be simply left exposed along a surface of the substrate 1030. In another embodiment the electrode node 1016 can be sealed to prevent dirt and debris from compromising the operative reliability of the electrodes. In another embodiment, a conductive covering can be placed atop the electrode node 1016 to permit the electrode node 1016 to be exposed, yet protected from dirt and debris.
To function with the electrode node 1016, the boss 1046, its component interaction surface, or both, will be constructed from a conductive material. For example, the boss 1046 can be manufactured from a resilient, pliable material such as an elastomer that is further capable of conducting current. Such conductive elastomers are known in the art. The benefits of conductive elastomers as they relate to embodiments of the present invention are four-fold: First, they are compressible. This allows for varying surface contact areas to be created across the electrode node 1016. Second, conductive elastomers may be designed with resistances that are within acceptably accurate ranges. Third, the conductive elastomers may be doped with various electrically conductive materials to set an associated resistance, or to vary the resistances of each boss 1046. Fourth, conductive elastomers are easily shaped.
Compression of the boss 1046 against the electrode node 1016 forms a resistive path between the first conductor 1017 and the second conductor 1018. Compression of the boss 1046 with different amounts of force results in establishment of different resistances across the electrode node 1016. The boss 1046 effectively gets “squished” against the electrode node 1016 in a degree corresponding to the applied force. This results in more or fewer of the interlaced fingers of the electrode node 1016 coming into contact with the conductive portion of the boss 1046. Where the control module of the interface peripheral 1000 is capable of detecting current flowing through—or voltage across—the electrode node 1016, the control module can detect an electrical equivalent, i.e., voltage or current, corresponding to how “hard” the boss 1046 of the user input element 1007 is pressing against the electrode node 1016. When a user manipulates the user input element 1007, the compressible, conductive material of the boss 1046 can expand and contract against the electrode node 1016, thereby changing the impedance across the electrode node 1016. The control module can detect the resulting change in current or voltage and then interpret this as user input.
At contact view 1020, the boss 1046 is just barely touching the electrode node 1016. This initial engagement establishes a high impedance, Rhi, which corresponds to a minimal force being applied to the user input element 1007. At contact view 1021, a greater amount of contact is occurring between the boss 1046 and the electrode node 1016. This establishes a resistance, R1, which is less than Rhi and corresponds to a slightly larger force being applied to user input element 1007 than at contact view 1020.
At contact view 1025, a still greater amount of contact is occurring between the boss 1046 and the electrode node 1016. This establishes a second resistance, R2, with a value that is less than resistance R1, and that corresponds to a greater amount of force being applied to the user input element. At contact view 1026, a still larger amount of contact is occurring between the boss 1046 and the electrode node 1016. Presuming that this is maximum compression, a lowest resistance, Rlo, is created, which corresponds to maximum force being applied to the user input element 1007.
When force is detected, knowledge of the magnitude of force can be used in the delivery of haptic responses. For example, in one embodiment a predetermined resistance, e.g., R2, must be achieved prior to firing the motion generation devices or haptic components. Thus, light force touches, e.g., when a user's fingertips are resting on keys but not intentionally pressing down, and initial touches, e.g., at the beginning of a keystroke, will not activate a haptic component.
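The sketch below illustrates such gating, reusing contact_resistance_ohm() and haptic_fire() from the earlier sketches; the numeric threshold standing in for R2 is an assumed value that would be tuned per device.

```c
#include <stdint.h>

#define R2_THRESHOLD_OHM 2000U  /* assumed stand-in for R2; tuned per device */

extern uint32_t contact_resistance_ohm(void);   /* from the earlier sketch */
extern void haptic_fire(uint16_t duration_ms);

void poll_force_sensing_key(void)
{
    /* Resting fingertips read near Rhi; a deliberate press drives the
     * resistance down past R2, and only then is feedback delivered. */
    if (contact_resistance_ohm() < R2_THRESHOLD_OHM)
        haptic_fire(10);
}
```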
The amount of force can be used in other ways as well. For example, as noted above, the tactile response delivered to a key can be made partially dependent upon the detected force.
Rather than using a bus (104) to communicate with the electronic device 1302, the haptic user interface 1301 of this embodiment can communicate wirelessly, e.g., via Bluetooth or Wi-Fi.
A plurality of keys 1307, 1308, 1309, 1310, 1311, 1312 is disposed along the haptic user interface 1301. Each key 1307, 1308, 1309, 1310, 1311, 1312 is moveable along an axis to close a switch 1334. The switch 1334 can be a membrane switch as described above.
A user applies a force 1362 to one or more of the keys 1307, 1308, 1309, 1310, 1311, 1312 by moving a key, e.g., key 1312, along the first axis. Movement of the key 1312 along the first axis closes the switch 1334. Disposed between the key 1312 and the switch 1334 is a mechanical layer 1322 that spans multiple keys 1307, 1308, 1309, 1310, 1311, 1312 along axes orthogonal to the first axis. One or more haptic devices, which are operable with and coupled to the mechanical layer 1322, are configured to impart a force upon the mechanical layer 1322 upon being fired by a control module to deliver a haptic response 1317 to the key 1312.
Bosses can be made in other ways as well. User interaction element 1627 includes a hollow boss 1666 as one example. As noted above, the boss material can be conductive when the boss is to be used with a force sensing resistive switch. However, the boss material need not be conductive when a membrane switch or resistive touch panel is used.
Component interaction surface 1747 is configured as a convex contour. Such a contour is useful when using a force sensing resistive switch or resistive touch panel. This is one example of a non-linear contoured termination configuration. Component interaction surface 1757 is semi-spherical. Component interaction surface 1767 is frustoconical. Component interaction surface 1777 is frustoconical with a convex contour terminating the cone's frustum. This is another example of a non-linear contoured termination configuration. Component interaction surface 1787 is rectangular. These component interaction surfaces are illustrative only. Other shapes may be possible, as will be obvious to those of ordinary skill in the art having the benefit of this disclosure.
To this point, the interface peripherals described above have been primarily QWERTY keypads suitable for use with electronic devices, such as those having only touch sensitive surfaces and not having physical keys. However, as noted above, embodiments of the invention are not so limited.
A plurality of switches 2101 is operable with the control module 2105 to detect user input. The switches 2101 are operable with corresponding user input elements to detect user actuation of one or more user input elements by closing when a user input element is translated along an axis. The switches 2101 can be membrane switches, a resistive touch panel, or resistive force sensing switches 2102. Where membrane switches are employed, the control module can detect actuation of a user input element by detecting one or more of the membrane switches closing.
Where a resistive touch panel or resistive force sensing switches 2102 are employed, a plurality of electrode nodes can be coupled to, and is operable with, the control module 2105. In one embodiment, the control module 2105 can be configured to sense either current or voltage through each electrode node. The amount of current or voltage will depend upon the contact surface area of each compressible, conductive boss when actuated by a user, as that surface area defines a corresponding resistance across each electrode node. The control module 2105 detects this current or voltage across each electrode node and correlates it to an applied force.
When a switch actuates, the control module 2105 can fire a motion generation component 2103. Where additional motion generation components 2107 are included, the control module 2105 can fire them in combination, or separately. In one or more embodiments, an audio output 2104 can be configured to deliver an audible “click” or other suitable sound in conjunction with the haptic feedback.
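A brief sketch of coordinated haptic and audible feedback under these assumptions follows; haptic_fire_secondary() and audio_click() are hypothetical shims, not part of this disclosure.

```c
#include <stdbool.h>
#include <stdint.h>

extern void haptic_fire(uint16_t duration_ms);
extern void haptic_fire_secondary(uint16_t duration_ms); /* assumed 2nd device */
extern void audio_click(void);                           /* assumed tone shim  */

void deliver_feedback(bool use_both)
{
    haptic_fire(10);
    if (use_both)
        haptic_fire_secondary(10);  /* fire components in combination     */
    audio_click();                  /* audible "click" with the haptics   */
}
```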
At step 2201, user input resulting from translation of a user input element is received. In one embodiment, the user input is received by detecting a switch closing at step 2202 when a user input element is translated along the z-axis (115) in response to the application of force on a user interaction surface of the user input element. In another embodiment, the user input is received by detecting a user press along a pliable substrate, e.g., a folio layer, disposed opposite a mechanical sheet or engagement layer from a plurality of keys.
At optional step 2203, a magnitude of the applied force can be determined using a force sensing element or resistive touch layer. At optional step 2204, a delay of a predetermined time can be inserted.
Steps 2205 and 2206 can occur in either order. At step 2205, a motion generation component coupled to the mechanical sheet or engagement layer is actuated. In one embodiment, the mechanical sheet or engagement layer actuated is one of a plurality of sheets. In such an embodiment, step 2205 can also include determining which of the plurality of sheets corresponds to the user input element actuated at step 2201 and actuating only the motion generation component corresponding to a single actuated key or multiple actuated keys.
At step 2206, the user input element to which the force of step 2201 was applied engages with the mechanical sheet or engagement layer. As described above, the engagement can be translational engagement or compression engagement. Compression engagement can include grasping, with only a single key, the mechanical sheet or engagement layer.
After step 2205 has occurred, the mechanical sheet or engagement layer moves at step 2207. When both steps 2205, 2206 have occurred, regardless of order, the mechanical sheet or engagement layer delivers a haptic response to an engaged user input element at step 2208. In one embodiment, this haptic response is delivered to a single key by moving the mechanical sheet when engaged with the single key. In another embodiment, the haptic response can be delivered to a combination of keys actuated by the user.
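Tying the steps together, the following hedged sketch shows the firmware-side sequence once a switch closure (steps 2201 and 2202) has been detected. It reuses names from the earlier sketches, and the mechanical engagement of step 2206 occurs physically, represented here only by the optional delay.

```c
#include <stdbool.h>
#include <stdint.h>

extern uint32_t contact_resistance_ohm(void);              /* step 2203 aid */
extern uint8_t sheet_for_key(uint8_t key);                 /* sheet mapping */
extern void haptic_fire_sheet(uint8_t sheet, uint16_t ms); /* assumed shim  */
extern void delay_us(uint32_t us);

#define R2_THRESHOLD_OHM    2000U   /* assumed, as in the earlier sketch */
#define ENGAGEMENT_DELAY_US 2000U   /* assumed, as in the earlier sketch */

/* Called once a switch closure (steps 2201-2202) has been detected. */
void handle_keypress(uint8_t key, bool force_sensing)
{
    /* Step 2203 (optional): ignore light, unintentional touches. */
    if (force_sensing && contact_resistance_ohm() >= R2_THRESHOLD_OHM)
        return;

    /* Step 2204 (optional): wait for compression engagement (step 2206). */
    delay_us(ENGAGEMENT_DELAY_US);

    /* Steps 2205, 2207, 2208: fire only the sheet spanning the actuated
     * key; its movement delivers the haptic response to the engaged key. */
    haptic_fire_sheet(sheet_for_key(key), 10);
}
```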
It should be observed that the embodiments described above reside primarily in combinations of method steps and apparatus components related to haptic feedback delivery by moving a mechanical sheet or engagement layer that spans a plurality of keys, is capable of engaging any of the plurality of keys, but engages only those actuated by a user. Any process descriptions or blocks in flow charts should be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved. Accordingly, the apparatus components and method steps have been represented where appropriate by conventional symbols in the drawings, showing only those specific details that are pertinent to understanding the embodiments of the present invention so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.
It will be appreciated that embodiments of the invention described herein may be comprised of one or more conventional processors and unique stored program instructions that control the one or more processors to implement, in conjunction with certain non-processor circuits, some, most, or all of the functions of the control module described herein. As such, these functions may be interpreted as steps of a method to perform haptic feedback delivery. Alternatively, some or all functions could be implemented by a state machine that has no stored program instructions, or in one or more application specific integrated circuits (ASICs), in which each function or some combinations of certain of the functions are implemented as custom logic. Of course, a combination of the two approaches could be used. Thus, methods and means for these functions have been described herein. Further, it is expected that one of ordinary skill, notwithstanding possibly significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, when guided by the concepts and principles disclosed herein will be readily capable of generating such software instructions and programs and ICs with minimal experimentation.
As used in the description herein and throughout the claims, the following terms take the meanings explicitly associated herein, unless the context clearly dictates otherwise: the meaning of “a,” “an,” and “the” includes plural reference, the meaning of “in” includes “in” and “on.” Relational terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, reference designators shown herein in parenthesis indicate components shown in a figure other than the one in discussion. For example, talking about a device (10) while discussing figure A would refer to an element, 10, shown in a figure other than figure A.
In the foregoing specification, specific embodiments of the present invention have been described. However, one of ordinary skill in the art appreciates that various modifications and changes can be made without departing from the scope of the present invention as set forth in the claims below. Thus, while preferred embodiments of the invention have been illustrated and described, it is clear that the invention is not so limited. Numerous modifications, changes, variations, substitutions, and equivalents will occur to those skilled in the art without departing from the spirit and scope of the present invention as defined by the following claims. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive sense, and all such modifications are intended to be included within the scope of the present invention. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as critical, required, or essential features or elements of any or all of the claims.