This application relates to U.S. Application No. 2008/0010593, titled “USER INTERFACE INPUT DEVICE”, filed Jun. 30, 2006, now U.S. Pat. No. 7,916,002 B2, which is hereby incorporated by reference in its entirety, and to U.S. Patent Application No. 2009/0319893, titled “METHOD AND APPARATUS FOR ASSIGNING A TACTILE CUE”, concurrently filed, now abandoned, which is hereby incorporated by reference in its entirety.
The present application relates generally to electronic device user interfaces.
User interfaces, such as touchscreens, have become commonplace since the emergence of the electronic touch interface. Touchscreens are familiar in retail settings, on point of sale systems, on smart phones, on Automated Teller Machines (ATMs), and on Personal Digital Assistants (PDAs). The popularity of smart phones, PDAs, and many types of information appliances is increasing the demand for, and the acceptance of, touchscreens. Although the demand for and acceptance of touchscreens are growing, touchscreens are still limited.
Various aspects of the invention are set out in the claims.
In accordance with an example embodiment of the present invention, an electronic device is configured to provide a tactile cue associated with a feature. The electronic device is also configured to identify one or more user actuations. After identifying one or more user actuations, the electronic device is configured to execute the feature.
For a more complete understanding of example embodiments of the present invention, reference is now made to the following descriptions taken in connection with the accompanying drawings, in which:
An example embodiment of the present invention and its potential advantages are best understood by referring to
Traditional screens, such as touchscreens, provide a user with soft keys and other soft input devices on a user interface. Soft keys and soft input devices are, however, of limited use. In particular, they do not provide users with tactile cues that permit use without visual inspection, e.g., eyes-free use. Using a touchscreen without visual inspection is desirable for features such as music playback, volume control, Global Positioning System (GPS) navigation, and/or the like. Example embodiments of the invention use tactile cues to facilitate execution of a feature on a touchscreen, display cover, or electronic device.
In this example embodiment, the electronic device 100 comprises a touchscreen 120 having at least one selectable tactile cue, for example selectable tactile cues 105, 110. A user may associate a selectable tactile cue 105 with a particular feature, e.g., playing music. To employ the example embodiment, the user places a finger on the selectable tactile cue 105. As a result, the user receives a tactile sensation from the selectable tactile cue 105. The tactile sensation may indicate an association between the feature and the selectable tactile cue. For example, the user places a finger on selectable tactile cue 105 and associates the selectable tactile cue 105, e.g., a playback button, with a music playback feature. If the user would like to execute the feature, the user presses the selectable tactile cue 105 (or provides some other interface indication) and the electronic device 100 executes the feature associated with the selected tactile cue.
It should be understood that the above is merely an example of a selectable tactile cue 105 and that any number of features may be employed using, for example, selectable tactile cue 110 or the like. In an example embodiment, the tactile cue may be arranged in a pattern of a predetermined number of raised lines. In an alternative embodiment, the tactile cue may use a shape, another identifiable symbol, and/or the like. Thus, one tactile cue is distinguished from another by the pattern of raised lines, the shape, the identifiable symbol, and/or the like. In an alternative embodiment, the tactile cues may indicate a starting location or point on a screen to facilitate execution of a feature using a finger sweep, roll, gesture, and/or the like. In an embodiment, a sweep may comprise moving or carrying a finger across the touchscreen 120. In an embodiment, a roll may comprise turning a finger on an axis on the touchscreen 120. In an embodiment, a gesture may comprise making a sign or motion, such as an “x.” It should be understood that the above is merely an example and that a sweep, roll, or gesture may take many different forms and variations as known in the art.
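As a purely illustrative sketch of the association just described, the following Python snippet maps hypothetical tactile cue identifiers to feature callbacks. The cue names, the callback functions, and the dictionary structure are assumptions made for illustration only; they are not part of the described embodiments.

```python
# Illustrative sketch only: the cue identifiers and feature callbacks below are
# hypothetical and are not defined by the application itself.

def play_music():
    print("music playback started")

def activate_volume_control():
    print("volume control activated")

# Each selectable tactile cue (e.g., a pattern of raised lines or a shape)
# is associated with the feature it executes when actuated.
CUE_TO_FEATURE = {
    "two_raised_lines": play_music,                # e.g., selectable tactile cue 105
    "three_raised_lines": activate_volume_control, # e.g., selectable tactile cue 110
}

def on_cue_actuated(cue_id: str) -> None:
    """Execute the feature associated with the actuated tactile cue, if any."""
    feature = CUE_TO_FEATURE.get(cue_id)
    if feature is not None:
        feature()

on_cue_actuated("two_raised_lines")
```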
It should also be understood that while an electronic device 100 is shown in the drawings and will be used in describing example embodiments of the invention, the invention has application to the entire gamut of consumer electronics including, but not limited to, a mobile telephone, a personal digital assistant, a portable computer device, a GPS device, a mobile computer, a camera, a browsing device, an electronic book reader, a combination thereof, and/or the like. Further still, example embodiments of the invention may also be applicable to a touchscreen, a screen, a screen edge, a display cover, a touch pad, or a combination thereof.
It should be understood that it is also possible to select a tactile cue using a sweep, roll, gesture, and/or the like, as described below. In the case of a sweep, roll, gesture, and/or the like, the touch sensitive area 150 is further configured to detect a sweep, roll, or gesture, or to detect multiple sweep motions in sequence. It should also be understood that the touch sensitive area 150 may be a portion of touchscreen 120, the entire touchscreen 120, or a combination thereof. In an embodiment, a piezo actuator, as described below, may also be employed.
It should be further understood that the tactile cue 185 may be positioned on a touchscreen, on a screen, on a screen edge, on a display cover, adjacent to a screen, or a combination thereof. It should be further understood that the tactile cue 185 may be concave, convex, an embossed icon, a replaceable sticker, flat with a textured material, three-dimensional, and/or the like. In an embodiment, the tactile cue 185 may be opaque, transparent, and/or the like.
Moreover, in an example embodiment, the touch sensitive area 150 may use one of many touch sensor technologies. For example, the touch sensitive area 150 may use a capacitive touch sensor, e.g., an analog capacitive sensor or a projected capacitive sensor, a resistive touch sensor, an optical touch sensor, an acoustic touch sensor, a force sensor, a vibration touch sensor, or any other suitable touch sensor. Use of other touch sensor technologies is also possible. Several of these technologies are described briefly below.
A capacitive touch sensor comprises at least one conductive layer. The conductive layer is usually energized by an oscillator circuit. When a user touches the tactile cue 185, a signal is generated as a result of a capacitive coupling between the user and the conductive layer. The signal is converted to the location of the touch by a sensing circuit.
A resistive touch sensor typically comprises two transparent conductive layers separated by spacer dots. When a touch forces the two conductive layers to come into contact, the resulting voltage is sensed and the location of the touch is computed. It should be understood that the touch sensitive area 150 may also use these sensors for detecting sweeps, rolls, or gestures.
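As a hedged illustration of how “the resulting voltage is sensed and the location of the touch is computed,” the following Python sketch shows one common way an idealized 4-wire resistive controller might map measured voltages to screen coordinates. The function, reference voltage, and display resolution are assumptions for illustration; they are not details taken from the embodiments above.

```python
def touch_location(v_x: float, v_y: float, v_ref: float,
                   width_px: int, height_px: int) -> tuple[int, int]:
    """Map measured divider voltages to screen coordinates.

    Assumes an idealized 4-wire resistive panel in which the voltage measured
    on each axis is proportional to the touch position along that axis; real
    controllers add calibration and filtering.
    """
    x = int(round((v_x / v_ref) * (width_px - 1)))
    y = int(round((v_y / v_ref) * (height_px - 1)))
    return x, y

# Example: half-scale voltage on both axes maps near the screen centre.
print(touch_location(1.65, 1.65, 3.3, 320, 480))  # approximately (160, 240)
```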
In an embodiment, the electronic device may include a replaceable cover. The replaceable cover may be stenciled, embossed, or silk screened as desired with any number of tactile cues or embossed logos. For example, the tactile cues may resemble normal mechanical keys with key graphics. The tactile cues may also be concave instead of convex. Further, the tactile cues may use different materials, e.g., rubber or leather patches on a plastic or a metal cover. In an example embodiment, the tactile cues may also be dynamic (e.g., tactile cues appear and disappear) using an actuator, such as a mechanical actuator.
Referring now to the example process 200, the electronic device may identify one or more user actuations at 210. For example, the electronic device identifies that a user actuates a feature, such as increasing volume. Restated, the electronic device identifies that the user has selected the tactile cue or provided some other indication of selection (e.g., sweep, roll, gesture, and/or the like) at 210. In an embodiment, the electronic device may also detect multiple user actuations, such as sweep motions in sequence. At 215, the electronic device executes the feature. For example, the electronic device, after identifying the sweep (e.g., a user actuation), executes the volume control feature.
In an embodiment, the electronic device may also modify a parameter of a feature at 220. For example, the electronic device modifies a parameter, such as a volume level, of the volume control feature. Thus, the electronic device, using example process 200, may execute a feature and modify a parameter of that feature. It should be understood that the electronic device can identify a user interaction, such as a sweep, roll, gesture, button press, and/or the like.
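The following Python sketch, offered only as an illustration under assumed names, traces the flow of example process 200: identifying a user actuation (210), executing the associated feature (215), and optionally modifying a feature parameter (220). The VolumeControl class and the actuation strings are hypothetical.

```python
# Hedged sketch of example process 200; the feature object and actuation
# labels below are assumptions made for illustration only.

from dataclasses import dataclass

@dataclass
class VolumeControl:
    level: int = 5

    def execute(self) -> None:
        print(f"volume control active, level {self.level}")

    def modify(self, delta: int) -> None:
        self.level = max(0, min(10, self.level + delta))
        print(f"volume level now {self.level}")

def handle_actuation(feature: VolumeControl, actuation: str) -> None:
    # 210: identify the user actuation (cue press, sweep, roll, gesture, ...)
    if actuation in ("cue_press", "sweep_up", "sweep_down"):
        # 215: execute the feature associated with the actuation
        feature.execute()
        # 220: optionally modify a parameter of the feature
        if actuation == "sweep_up":
            feature.modify(+1)
        elif actuation == "sweep_down":
            feature.modify(-1)

volume = VolumeControl()
handle_actuation(volume, "sweep_up")
```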
Moreover, the piezo actuator may also use a similar piezo element to provide tactile feedback, such as vibration, to a user of the electronic device 300, thereby providing the user with confirmation of a successful feature activation.
It should be understood that both the piezo sensors and the piezo actuator may be fabricated from a single piezo-electric element so as to be both coplanar and electrically isolated from one another. The difference in operation between the piezo sensors and the piezo actuator is achieved by coupling the piezo sensors to a differential voltage measurement device and the piezo actuator to a voltage source, as known in the art. Other configurations are also possible.
It should be further understood that any number of sweeping variations may be employed by a user. The user's sweeping finger 320 from a first side 370 towards a second side 375 is merely for illustrative purposes. That is, a user's sweeping finger 320 may also move from the bottom of the screen 315 upwards towards the top of the screen 315, diagonally across the screen 315, and/or a combination thereof. Other variations are also possible.
It should be further understood that the user may adjust the volume or other electronic device 300 features by sweeping in a known direction; the upward/downward sweeping is merely for illustrative purposes. For example, the same sweeping motion used for volume control may also be used to allow the user to adjust the screen 315 by zooming in or out. Many other feature configurations are also possible. It should be further understood that the user is not limited to a sweeping motion. Rather, the user may also make a gesture, such as the letter “X,” to indicate closing a program or window. Other variations are also possible.
One benefit of selecting multiple features is that some features are typically executed in sequence, such as playback of a song and adjusting the volume. It should be understood that two sweeping motions or actuations may be performed in any number of directions or variations. Further, the sweeping motions or actuations may be performed by a single finger or multiple fingers.
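As a hedged illustration of executing features in sequence, the sketch below processes two actuations one after the other, e.g., starting playback and then raising the volume. The actuation names and functions are illustrative assumptions rather than details from the embodiments.

```python
# Illustrative only: processing two actuations in sequence, such as starting
# playback and then adjusting the volume. Names are hypothetical.

def start_playback() -> None:
    print("playback started")

def set_volume(level: int) -> None:
    print(f"volume set to {level}")

def process_sequence(actuations: list[str]) -> None:
    volume = 5
    for act in actuations:
        if act == "sweep_playback":
            start_playback()
        elif act == "sweep_volume_up":
            volume += 1
            set_volume(volume)

process_sequence(["sweep_playback", "sweep_volume_up"])
```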
It is useful to note that one benefit of placing the checkbox 338 on the edge of the screen 315 is that the checkbox 338 is not confused with sweeping, rolling, gestures, and/or the like used to activate other features. It should be understood that the checkbox 338 is just one example and that other features, such as a Graphical User Interface menu or drop-down menus, may also be employed.
It should be understood that the selectable menu 410 may be located adjacent to the screen 405, e.g., approximately 2 millimeters from the edge of the screen 405. Thus, electronic device 400 may distinguish the selection of the selectable menu 410 from a user sweeping from the tactile cue 435 on the cover based at least in part on the starting point (e.g., 2 millimeters from the edge). It should be further understood that the tactile cue 435 may be concave, convex, an embossed icon, a replaceable sticker, flat with a textured material, three-dimensional, and/or the like. In an embodiment, the tactile cue 435 may be opaque, transparent, and/or the like.
It should further be understood that, since the sweep begins, e.g., approximately 2 millimeters from the edge of the screen 405, example embodiments of the invention may distinguish this sweep from a sweep intended to initiate playback as shown in
Since example embodiments may be employed eyes-free, accidental activation of a graphical user interface is possible. In an example embodiment of the invention, a menu lock mode is used. In the menu lock mode, an electronic device is configured to allow sweeps starting from the edge of the touchscreen, for example, during screen 405 sweeps. That is, this example embodiment locks features not associated with the tactile cue on a screen edge when the sweep begins. It should be further understood that features in the vicinity of the tactile cue are locked. In an embodiment, all features not associated with the tactile cue are locked.
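The start-point distinction and menu lock behaviour described above might be sketched in Python as follows. The approximately 2 millimeter menu offset comes from the description; the edge threshold, tolerance, and return labels are illustrative assumptions only.

```python
# Hedged sketch: a sweep entering from the very edge of the screen is assumed
# to originate on a cover tactile cue, while a touch starting approximately
# 2 mm inside the edge is interpreted as a selection of the adjacent menu.

EDGE_MM = 0.5         # entering sweep: effectively starts at the screen edge
MENU_OFFSET_MM = 2.0  # menu located about 2 mm from the screen edge

def classify_touch(start_x_mm: float, menu_lock: bool) -> str:
    if start_x_mm <= EDGE_MM:
        return "cue_sweep"       # feature associated with the edge tactile cue
    if abs(start_x_mm - MENU_OFFSET_MM) <= 1.0:
        return "menu_selection"
    # Menu lock mode: features not associated with the edge cue are locked.
    return "ignored" if menu_lock else "feature_touch"

print(classify_touch(0.2, menu_lock=True))   # -> cue_sweep
print(classify_touch(2.1, menu_lock=True))   # -> menu_selection
print(classify_touch(25.0, menu_lock=True))  # -> ignored
```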
In an embodiment, a touch sensitive area, such as touch sensitive area 150 of
In use, a display cover of the electronic device 500, such as replaceable cover 505, may be removed and replaced by a user. In particular, the replaceable cover 505 of the electronic device 500 may be removed from the base 515. A new cover may then be installed. By replacing the replaceable cover 505, custom configurations of tactile cues 510 may be provided. That is, a user may have one replaceable cover 505 for work (e.g., work-related tactile cues 510) and another replaceable cover 505 for home (e.g., entertainment tactile cues 510). It should be understood that the replaceable cover 505 or new cover may be fastened by any technique known in the art to securely enclose the internal workings of the electronic device 500. It should be further understood that the replaceable cover 505 may be made of any suitable material known in the art.
In an embodiment, the electronic device 500 may not include a screen 520, but rather comprise a replaceable cover 505 configured to conform to the dimensions of the base 515. The replaceable cover 505 may be manufactured from injection-molded and/or vacuum-molded plastic, or another suitable material having sufficient rigidity. The replaceable cover 505 may be a single unit, thus making it easy to remove, replace, and reuse as the user desires. The replaceable cover 505 may also include stenciling or silk screening to identify the numbers and tactile cues 510 or function keys in any language, and thus reduce the cost of having to produce phone or pager units for different languages. The replaceable cover 505 may be stenciled, embossed, or silk screened as desired with any tactile cues 510 or logo. For example, the tactile cues 510 may resemble normal mechanical keys with key graphics. The tactile cues 510 may be concave, convex, or flat. Further, the tactile cues 510 may use different materials, e.g., rubber or leather patches on a plastic or a metal cover. In an embodiment, the tactile cues 510 can be flat and coupled to the replaceable cover 505 without indication. Therefore, the tactile cues 510 are distinguished from the replaceable cover 505 by their material or texture. In an example embodiment, the tactile cues 510 may also be dynamic (e.g., tactile cues 510 appear and disappear) using an actuator, such as a mechanical actuator. All figures are illustrative.
In this example embodiment, a mechanical actuator, such as, for example, a piezoelectric motor generally designated 630, is appropriately mounted to the circuit board 612 and comprises a shaft 632 extending axially lengthwise of the piezoelectric motor 630. A sheet spring steel band generally designated 634 has one end 636 attached to the circuit board 612 and its opposite end 638 suitably attached to the shaft 632 of the piezoelectric motor 630, for example by inserting the end 638 into a complementary shaped and sized slot 640 in the shaft 632. The sheet spring steel band 634 is substantially “C” shaped and is located over the dome switch 614. The sheet spring steel band 634 is in contact with a downward extending foot 642 of the elastomer portion 620 defining the key 624. When the key 624 is pressed or otherwise pushed downward in a direction toward the surface 616 of the circuit board 612, as indicated by the direction arrow 644, the bottom 646 of the foot 642 contacts the sheet spring steel band 634, pushing it into contact with the dome switch 614 to operate the switch 614. When the downward pressure is removed from the key 624, the sheet spring steel band 634 returns to its “C” shaped configuration, pushing the foot 642 upward to make the key 624 available.
In a situation in which the key 624 is not available, for example when the electronic device does not have a given function associated with the key 624 available for the particular mode selected, the surface area topology 626 of the key 624 is flush with the surface 622 of the user interface 618 indicating the key is unavailable. The elastomer portion defining the key 624 is permitted to return to its unstretched state when the shaft 632 of the piezoelectric motor 630 rotates in a clockwise direction such that the end 638 of the sheet spring steel band 634 rotates with the shaft 632, thereby shortening the length of the sheet spring steel band 634, causing the band 634 to flatten and approach the surface 616 of the circuit board 612, removing the upward bias on the elastomer foot 642. As shown in the
It should be understood that a piezoelectric motor which may be utilized in example embodiments of the invention is available, for example, from New Scale Technologies, Inc. under the trademark name Squiggle Motor to provide the desired actuation and appearance and disappearance of the keys as described above. The operation of such piezoelectric motors is well understood by those skilled in the art.
Without in any way limiting the scope, interpretation, or application of the claims appearing below, it is possible that a technical effect of one or more of the example embodiments disclosed herein may be facilitating use of a touchscreen. Another possible technical effect of one or more of the example embodiments disclosed herein may be ease of execution of one or more features.
Embodiments of the present invention may be implemented in software, hardware, application logic, or a combination of software, hardware, and application logic. The software, application logic, and/or hardware may reside on a mobile phone, personal digital assistant, or other electronic device. If desired, part of the software, application logic, and/or hardware may reside on an electronic device, and part of the software, application logic, and/or hardware may reside in memory. The application logic, software, or an instruction set is preferably maintained on any one of various conventional computer-readable media. In the context of this document, a “computer-readable medium” may be any media or means that may contain, store, communicate, propagate, or transport the instructions for use by or in connection with an instruction execution system, apparatus, or device.
If desired, the different functions discussed herein may be performed in any order and/or concurrently with each other. Furthermore, if desired, one or more of the above-described functions may be optional or may be combined.
Although various aspects of the invention are set out in the independent claims, other aspects of the invention comprise any combination of features from the described embodiments and/or the dependent claims with the features of the independent claims, and not solely the combinations explicitly set out in the claims.
It is also noted herein that while the above describes exemplifying embodiments of the invention, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the present invention as defined in the appended claims. For example, features associated with electronic devices may also employ example embodiments of the invention using tactile cues.
Number | Name | Date | Kind |
---|---|---|---|
4202615 | Nemoto | May 1980 | A |
4314750 | Orban | Feb 1982 | A |
4327985 | Urushihara et al. | May 1982 | A |
5496174 | Garner | Mar 1996 | A |
5748185 | Stephan et al. | May 1998 | A |
5926119 | Lindeman et al. | Jul 1999 | A |
6218966 | Goodwin et al. | Apr 2001 | B1 |
6433801 | Moon et al. | Aug 2002 | B1 |
6535201 | Cooper et al. | Mar 2003 | B1 |
6561600 | Seeley et al. | May 2003 | B1 |
6636202 | Ishmael et al. | Oct 2003 | B2 |
6667697 | Botich | Dec 2003 | B2 |
6667738 | Murphy | Dec 2003 | B2 |
6788294 | Takala et al. | Sep 2004 | B2 |
6967642 | SanGiovanni | Nov 2005 | B2 |
7009599 | Pihlaja | Mar 2006 | B2 |
7941786 | Scott et al. | May 2011 | B2 |
20010040558 | Takala et al. | Nov 2001 | A1 |
20020003469 | Gupta | Jan 2002 | A1 |
20020158836 | Ishmael, Jr. et al. | Oct 2002 | A1 |
20030022701 | Gupta | Jan 2003 | A1 |
20030153349 | Sun | Aug 2003 | A1 |
20040056877 | Nakajima | Mar 2004 | A1 |
20040121760 | Westman et al. | Jun 2004 | A1 |
20040169598 | Arling et al. | Sep 2004 | A1 |
20050099403 | Kraus et al. | May 2005 | A1 |
20050122313 | Ashby | Jun 2005 | A1 |
20050184959 | Kompe et al. | Aug 2005 | A1 |
20060017711 | Pihlaja | Jan 2006 | A1 |
20060046031 | Janevski | Mar 2006 | A1 |
20060098397 | Chou | May 2006 | A1 |
20060181515 | Fletcher et al. | Aug 2006 | A1 |
20060202803 | Yoon et al. | Sep 2006 | A1 |
20060256090 | Huppi | Nov 2006 | A1 |
20070035523 | Cohen | Feb 2007 | A1 |
20070132735 | Gil-Gomez | Jun 2007 | A1 |
20070152974 | Kim et al. | Jul 2007 | A1 |
20070157089 | Van Os et al. | Jul 2007 | A1 |
20070270179 | Lee et al. | Nov 2007 | A1 |
20070277124 | Shin et al. | Nov 2007 | A1 |
20080010593 | Uusitalo et al. | Jan 2008 | A1 |
20080040692 | Sunday et al. | Feb 2008 | A1 |
20080042978 | Perez-Noguera | Feb 2008 | A1 |
20080055273 | Forstall | Mar 2008 | A1 |
20080084400 | Rosenberg | Apr 2008 | A1 |
20080204418 | Cybart et al. | Aug 2008 | A1 |
20080234849 | Han | Sep 2008 | A1 |
20080244447 | Sagar | Oct 2008 | A1 |
20090002328 | Ullrich et al. | Jan 2009 | A1 |
20090019396 | McCarthy | Jan 2009 | A1 |
20090251420 | Do et al. | Oct 2009 | A1 |
20090319893 | Pihlaja | Dec 2009 | A1 |
20110047459 | Van Der Westhuizen | Feb 2011 | A1 |
Number | Date | Country |
---|---|---|
43 19 795 | Jan 1994 | DE |
1 098 241 | Mar 2000 | EP |
1 280 319 | Jan 2003 | EP
2 306 078 | Apr 1997 | GB |
2 445 445 | Jul 2008 | GB |
2004-117950 | Apr 2004 | JP |
2002-024903 | Jan 2005 | JP |
2005-223616 | Aug 2005 | JP |
2006-003746 | Jan 2006 | JP |
2003-0048697 | Jun 2003 | KR |
2006-0027655 | Mar 2007 | KR |
WO-9800775 | Jan 1998 | WO |
WO-2004042685 | May 2004 | WO |
WO-2004068521 | Aug 2004 | WO |
WO-2006009813 | Jan 2006 | WO |
WO-2007036596 | Apr 2007 | WO
WO-2008004049 | Jan 2008 | WO |
WO-2009156810 | Dec 2009 | WO |
Entry |
---|
HTC Smart Mobility, Touch Phone User Manual, www.htc.com, 135 pgs. |
HTC Touch, Navigating the Touch Cube, http://www.htc.com/uploadedFiles/Common/Training_Guides/HTC_Touch/e-Learning/htctouch.htm, 1 pg. |
Number | Date | Country | |
---|---|---|---|
20090315836 A1 | Dec 2009 | US |