The described embodiments relate generally to an input device. More particularly, the present embodiments relate to an adaptable user-selectable input area and techniques for providing feedback to guide a user input to the adaptable user-selectable input area.
Many electronic devices include touch-based input components that receive user inputs. For example, a touchscreen display is typically used as a touch-based input component. The touchscreen display is capable of displaying various text and graphics to a user, which the user can select by touching the touchscreen. More specifically, a touchscreen can be configured to display virtual buttons, icons, textboxes, hyperlinks, and other types of user-selectable input elements. The user may select an input element by tapping the portion of the touchscreen where the input element is displayed.
However, without looking at the display, it can be difficult for a user to find the virtual buttons, icons, textboxes, or other user-selectable input elements that are being displayed. The smooth hard surface of the touchscreen does not provide any indication of the shape, size, or location of the virtual buttons, textboxes, icons, and other user-selectable input elements. But in some instances, it may be inconvenient, or even dangerous, for the user to look at the touchscreen display. For example, a user may be unable to safely look at the touchscreen display while driving a motor vehicle or operating other machinery. Alternatively, a user may not want to display information on the touchscreen display for security reasons. Additionally, it can be difficult for visually impaired users to interact with an electronic device using a touchscreen display.
Embodiments described herein generally relate to guiding a user input (e.g., a touch or force input) to a user-selectable input element by providing one or more types of feedback to the user. The feedback can be tactile feedback, auditory feedback, olfactory feedback, visual feedback, and combinations thereof. The user-selectable input element may be situated at any suitable location in an electronic device. For example, the user-selectable input element can be displayed on a touchscreen display, positioned below a portion of an enclosure of the electronic device such as a side or back of the enclosure, and/or associated with an input or an input/output device coupled to, or included within, an electronic device (e.g., a trackpad, a physical button). Additionally, in some situations, the active area or boundary of a user-selectable input element can be adjusted and a user input that is detected in the adjusted active area may be associated with the user-selectable input element. As used herein, the terms “active area” and “adjusted active area” refer to a region in which a user input that is intended for a user-selectable input element is recognized and associated with the user-selectable input element.
In one aspect, an electronic device includes a touch-sensing layer positioned below an input surface of the electronic device, one or more feedback devices, and a processing device coupled to the touch-sensing layer and to the one or more feedback devices. The processing device is configured to adjust an active area of a user-selectable input area in response to the touch-sensing layer detecting a user input on the input surface that is outside of the user-selectable input area. The adjusted active area extends beyond the boundary of the user-selectable input area. The processing device is further configured to cause at least one feedback device to provide feedback to alert a user to the adjustment of the active area. Additionally or alternatively, the processing device is configured to cause at least one feedback device to provide feedback to guide the user input to the adjusted active area. The processing device is also configured to recognize the user input and associate the user input with the user-selectable input area when the user input is detected in at least a portion of the adjusted active area.
In another aspect, an electronic device includes multiple sensors that are each configured to detect touch events on a surface of the electronic device. A first subset of the sensors forms a user-selectable input element and a different second subset of the sensors forms an adjusted active area of the user-selectable input element. A processing device is coupled to the multiple sensors and configured to recognize a touch event and to associate the touch event with the input element when the touch event is detected by at least one sensor in the first or second subset of sensors.
In yet another aspect, a method of operating an electronic device includes detecting a user input on an input surface of the electronic device and determining if the user input is within an active area of an input element. In some embodiments, the active area comprises an area within a boundary of the input element. The active area is adjusted when the user input is not within the active area of the input element. A first feedback is provided to alert the user to the adjustment of the active area. A second feedback is provided to guide the user input to the adjusted active area. The user input is recognized and associated with the input element when the user input is detected in the adjusted active area.
In another aspect, an electronic device includes a touch-sensing layer positioned below an input surface of the electronic device and configured to detect user inputs on the input surface. A processing device is coupled to the touch-sensing layer and to a feedback device. The processing device is configured to cause the feedback device to provide feedback to indicate a function associated with the user-selectable input area and associate a user input with the user-selectable input area when the user input is detected in the user-selectable input area.
The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements.
The use of cross-hatching or shading in the accompanying figures is generally provided to clarify the boundaries between adjacent elements and also to facilitate legibility of the figures. Accordingly, neither the presence nor the absence of cross-hatching or shading conveys or indicates any preference or requirement for particular materials, material properties, element proportions, element dimensions, commonalities of similarly illustrated elements, or any other characteristic, attribute, or property for any element illustrated in the accompanying figures.
Additionally, it should be understood that the proportions and dimensions (either relative or absolute) of the various features and elements (and collections and groupings thereof) and the boundaries, separations, and positional relationships presented therebetween, are provided in the accompanying figures merely to facilitate an understanding of the various embodiments described herein and, accordingly, may not necessarily be presented or illustrated to scale, and are not intended to indicate any preference or requirement for an illustrated embodiment to the exclusion of embodiments described with reference thereto.
Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following descriptions are not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
The following disclosure relates to an electronic device that includes one or more user-selectable input areas or input elements. The user-selectable input area(s) may be situated at any suitable location in an electronic device. For example, a user-selectable input area can be displayed on a touchscreen (and/or force-sensitive) display, positioned below a portion of an enclosure of an electronic device, and/or included in an input or input/output device coupled to, or within, an electronic device (e.g., a trackpad, a physical input button). A touch event or user input, such as a touch or force input, may be guided to a user-selectable input area by providing one or more types of feedback to the user. This guiding or direction may be initiated by the touch or force input. The user input can be directed to a user-selectable input area in response to a user input occurring outside of a boundary of the user-selectable input area.
In an illustrative embodiment, an input/output device, such as headphones, may be coupled to a digital media player. One or more user-selectable input areas can be included in the headphones. For example, a user-selectable input area may be included in each ear pad (e.g., below the housing) of the headphones. The user-selectable input areas can receive user inputs that are used to control a function or application of the digital media player. For example, a user-selectable input area may receive a user input (e.g., a force input) to increase or decrease the volume of audio output or playback. In some embodiments, a user can press an input area to mute the audio output or playback.
Additionally or alternatively, feedback can be provided to alert a user to one or more locations of different input elements, to guide a user input to a user-selectable input element, and/or to alert a user to a distance to the user-selectable input element. For example, tactile feedback (e.g., haptic output) having a first set of characteristics (e.g., magnitude, duration, frequency) can be produced to alert a user to the locations of the next-track and previous-track input elements, and haptic output having a second set of characteristics can be provided to indicate to a user the location of a pause input element. In certain embodiments, haptic output having a third set of characteristics can be generated to guide a user input to a play input element or to indicate a proximity to the play input element.
In some embodiments, feedback can be provided in a section of the headphones other than where a user is touching or where the input elements are located. For example, haptic output can be produced around a perimeter of the ear pads to indicate what and/or where the user is touching. Alternatively, the haptic output can be provided in the headband to indicate the user is not touching an input element or to guide the user to an ear pad that includes the play input element.
In some embodiments, feedback may be provided proactively when a user is expected to provide a user input to the user-selectable input area. For example, an embodiment may predict or otherwise anticipate that a user will apply a force (e.g., a press) to the user-selectable input area based on, for example, an application program running on the electronic device. A processing device can cause one or more feedback devices to provide feedback to alert the user and/or to guide the user input to the user-selectable input area.
Additionally or alternatively, in some situations, the boundary or active area of a user-selectable input area can be adjusted from a default state or size to an “adjusted active area” of a different size. In this fashion, a user input that is detected in the adjusted active area is recognized and associated with the user-selectable input area. As used herein, the terms “active area” and “adjusted active area” refer to a region in which a user input that is intended for a user-selectable input area is recognized and associated with the input area. In some embodiments, the active area corresponds to the area of the input area or element. In other embodiments, the active area may have a different size than the input element. For example, an active area may be larger than an input element so that user inputs that only partially contact the input element are recognized and associated with the input element. Alternatively, in some situations, an active area can be smaller than an input element to ensure a user input is intended and received by the input element.
In one representative embodiment, tactile feedback (e.g., haptic output) may be provided in a feedback area within the active area to guide a user input to the input area. Additionally or alternatively, two or more feedback areas can be created within the boundary of an input area and/or within an active area. When a user input is to be directed to the input area, haptic output can be provided in one or more of the feedback areas. For example, haptic output may only be produced in a first feedback area until the user input is at a first distance from the center of the user-selectable input area. Thereafter, the haptic output may be produced only in a second feedback area until the user input is at a closer second distance from the center of the input area. Finally, haptic output may be produced only in a third feedback area until the user input is within the input area.
Additionally or alternatively, in some embodiments, the haptic output can home in on a center or a boundary of the input area. By “home in,” it is meant that the haptic output can appear to move toward the center or boundary from a different region; in some embodiments, the haptic output (or some characteristic of the haptic output) may vary with distance from the portion of the input area. Similarly, the haptic output can move inward from the boundary of a feedback area to the boundary of an adjacent feedback area to indicate to a user the direction a user input needs to move to reach the input area.
Additionally or alternatively, in some embodiments, the haptic output can be created throughout an entire feedback area. In other embodiments, the haptic output may be provided in one or more sections of a feedback area to guide a user input to a user-selectable input area.
In some embodiments, one or more feedback areas can be positioned outside of a user-selectable input area. In such embodiments, haptic output can be produced only when the touch input is moving in one direction. For example, haptic output can be provided only when the touch input is moving closer to the input area and no haptic output may be produced when the touch input is moving away from the input area.
Additionally, in some embodiments, one or more characteristics of the haptic output can differ between feedback areas. The characteristics of the haptic output include, but are not limited to, a frequency, a magnitude, and a duration. For example, the frequency of the haptic output can increase as the user input moves closer to a boundary of the user-selectable input area. Additionally or alternatively, the duration of the haptic output can decrease as the user input moves closer to (or away from) the boundary of the input area.
In certain embodiments, haptic feedback having one or more different characteristics can be produced to alert a user to a location of a user-selectable input element and/or to indicate a function associated with the input element (e.g., volume control, mute). Additionally or alternatively, the haptic output can indicate a location and a distance from a user-selectable input element. In other words, the haptic output can indicate or provide different data or information regarding a user-selectable input element.
Further, feedback other than tactile feedback (e.g., haptic output) can be used to direct a user input to a user-selectable input area. For example, audio feedback and/or visual feedback can be provided in addition to, or as an alternative to, the tactile feedback.
In some embodiments, the active area of a user-selectable input area can be adjusted to receive user inputs. The adjusted active area may be larger than the input area and surround the input area such that the entire input area is within the adjusted active area. Alternatively, an adjusted active area can include only a portion of an input area. For example, the active area can be expanded to include only half of an input area. A touch event can be recognized and associated with the input area when the touch event (or a portion of the touch event) is detected in the adjusted active area.
The active area can be adjusted based on several factors. A particular user may periodically or consistently submit user inputs (e.g., touch and/or force inputs) outside of the boundary of the input area. Once a sufficient number of such inputs is received, or when a sufficient number is received within a given time period, the active area may be expanded or moved to include the region in which the user submits his or her inputs. Additionally or alternatively, a processing device in the electronic device can receive output signals from one or more sensors that provide data on the orientation of the electronic device and/or the position of the input area on a display. The active area may be adjusted based on the orientation of the electronic device (e.g., landscape or portrait orientation) and/or the location of the input area in the electronic device. As yet another option, a user may create a user profile that specifies which active areas are to be adjusted. For example, the user can select different application programs in which an active area is to be adjusted.
As used herein, the terms “connected” and “coupled” are generally intended to be construed broadly to cover direct connections and indirect connections. In the context of the present invention, the terms “connected” and “coupled” are intended to cover circuits, components, and/or devices that are connected such that an electrical parameter passes from one to another. Example electrical parameters include, but are not limited to, voltages, currents, magnetic fields, control signals, and/or communication signals. Thus, the terms “coupled” and “connected” include circuits, components, and/or devices that are coupled directly together or through one or more intermediate circuits, components, and/or devices.
Additionally, in the context of the present invention, the terms “connected” and “coupled” are intended to cover mechanical or structural elements, components, and/or devices that are directly connected together or through one or more intervening elements, components, and/or devices.
These and other embodiments are discussed below with reference to
The electronic device 100 includes an enclosure 102 at least partially surrounding a display 104 and one or more input/output (I/O) devices 105. The enclosure 102 can form an outer surface or partial outer surface for the internal components of the electronic device 100. The enclosure 102 can be formed of one or more components operably connected together, such as a front piece and a back piece. Alternatively, the enclosure 102 can be formed of a single piece operably connected to the display 104.
The display 104 can provide a visual output to the user. The display 104 can be implemented with any suitable technology, including, but not limited to, a liquid crystal display (LCD) element, a light emitting diode (LED) element, an organic light-emitting diode (OLED) element, an organic electroluminescence (OEL) element, and the like.
In some embodiments, the I/O device 105 can take the form of a home button or input element, which may be a mechanical button, a soft button (e.g., a button that does not physically move but still accepts inputs), an icon or image on a display, and so on. Further, in some embodiments, the I/O device 105 can be integrated as part of a cover layer 106 and/or the enclosure 102 of the electronic device 100. Although not shown in
The cover layer 106 may be positioned over the front surface (or a portion of the front surface) of the electronic device 100. At least a portion of the cover layer 106 can function as an input surface that receives user inputs (e.g., touch and/or force inputs). The cover layer 106 can be formed with any suitable material, such as glass, plastic, sapphire, or combinations thereof. In one embodiment, the cover layer 106 covers the display 104 and the I/O device 105. User inputs can be received by the portion of the cover layer 106 that covers the display 104 and by the portion of the cover layer 106 that covers the I/O device 105.
In another embodiment, the cover layer 106 covers the display 104 but not the I/O device 105. User inputs can be received by the portion of the cover layer 106 that covers the display 104. In some embodiments, the I/O device 105 may be disposed in an opening or aperture formed in the cover layer 106. In such embodiments, the aperture can extend through the enclosure 102 with one or more components of the I/O device 105 positioned in the enclosure.
In some embodiments, the display 104 can function as an input device that allows the user to interact with the electronic device 100. For example, the display 104 can be a multi-touch touchscreen LED display. A user-selectable input element or area 108 can be associated with the display 104. The user-selectable input area 108 may be a virtual input area that is displayed on the display 104 (e.g., an icon, a textbox, a button) and/or the user-selectable input area 108 may be a designated touch-sensing section of the display 104. In other words, the user-selectable input element 108 can be fixed in a location or can be positioned anywhere on the display 104.
In some embodiments, the user-selectable input area 108 has a boundary 110 that designates the dimensions of the user-selectable input area 108. The area within the boundary 110, or a portion of the area within the boundary 110, can be an active area that receives user inputs (e.g., touch and force inputs). A user can use a body part (e.g., a finger) or an object, such as a stylus, to submit user inputs to the user-selectable input area 108.
Although the user-selectable input area 108 and the boundary 110 are depicted as circular, in other embodiments the user-selectable input area 108 and/or the boundary 110 may have any given shape and/or dimensions. Additionally,
Additionally or alternatively, a user-selectable input element or area 112 can be associated with the enclosure 102 (or one or more sections of the enclosure 102). Like the user-selectable input area 108, the user-selectable input area 112 has a boundary 114 that designates the dimensions of the user-selectable input area 112. The area within the boundary 114, or a portion of the area within the boundary 114, can be an active area that receives user inputs (e.g., touch and force inputs). Although the user-selectable input area 112 and the boundary 114 are depicted as circular, in other embodiments the user-selectable input area 112 and/or the boundary 114 may have any given shape and/or dimensions. Additionally, the user-selectable input area 112 may be one of multiple user-selectable input areas.
In some embodiments, one or more user-selectable input areas can be associated with, or incorporated into, other components of an electronic device. Example components include, but are not limited to, the I/O device 105, a trackpad, a physical input button, and a section or portion of the enclosure 102. For example, one or more user-selectable input areas can be provided to a side or back surface of the enclosure 102.
Embodiments described herein relate generally to providing feedback to a user to guide a user input to a user-selectable input area (e.g., user-selectable input area 108). The feedback can be any suitable type of feedback, including tactile feedback, auditory feedback, olfactory feedback, visual feedback, and combinations thereof. The feedback may be provided in response to a touch event occurring outside of a boundary of the user-selectable input area. Additionally or alternatively, feedback can be produced proactively when a user is expected to provide a user input to the user-selectable input area. For example, a user can be expected to apply a force (e.g., a press) to the user-selectable input area based on an application program running on the electronic device. A processing device can cause one or more feedback devices to provide feedback to the user to alert the user to the expected user input, to alert the user to a location of the user-selectable input area, to alert a user to a distance to the user-selectable input area, and/or to guide the user input to the user-selectable input area.
Additionally, in some embodiments, the active area associated with a user-selectable input area (e.g., user-selectable input area 108) can be adjusted to recognize user inputs at locations outside a boundary of the user-selectable input area and associate the user inputs with the user-selectable input area (e.g., user-selectable input area 108 and boundary 110).
In one example embodiment, tactile feedback having a longer duration may be produced when a user input is farther from the user-selectable input area 108 and tactile feedback having a shorter duration may be produced when a user input is closer to the user-selectable input area 108. In this manner, the duration of the tactile feedback is associated with the distance from the user-selectable input area 108.
In another example embodiment, tactile feedback having a lower frequency may be produced when a user input is moving toward the user-selectable input area 108 and tactile feedback having a higher frequency may be produced when a user input is moving away from the user-selectable input area 108. In such embodiments, the frequency of the tactile feedback is associated with guiding a user input toward the user-selectable input area 108.
The tactile feedback may be the result of haptic output produced by one or more haptic devices. The haptic output can be a force, movement, and/or a vibration that may be detected by a user as tactile feedback. The haptic output can produce planar movement (movement in the plane of the cover layer 106) and/or vertical movement (movement normal to the surface of the cover layer 106). In one embodiment, the haptic output creates vertical movement in the cover layer 106.
In some embodiments, the haptic output may be applied continuously or at select times to the entire feedback area 116, to a portion of the feedback area 116, or at the boundary 122 of the feedback area 116. The haptic output can continue until an object touching the cover layer 106 (e.g., a finger, a stylus) contacts or crosses the boundary 110. Alternatively, the haptic output can be produced until a user input (e.g., a press) is received within the user-selectable input area 108. When the haptic output is provided at select times, the frequency of the haptic output can be fixed or variable. For example, the frequency of the haptic output can increase as the touch input moves closer to the user-selectable input area 108.
Additionally, in some embodiments, one or more characteristics of the haptic output can vary over time and/or as the user input moves closer to the user-selectable input area 108. The characteristics of the haptic output include, but are not limited to, a frequency, a magnitude (e.g., amplitude), and a duration. For example, the frequency of the haptic output can increase as the user input moves closer to (or away from) the boundary 110 of the user-selectable input area 108. Additionally or alternatively, the duration of the haptic output can decrease as the user input moves closer to (or away from) the boundary 110 of the user-selectable input area 108.
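By way of a non-limiting illustration, the following Python sketch shows one possible mapping from the distance between a user input and the user-selectable input area 108 to haptic output characteristics; the function name, ranges, and units are hypothetical and are not drawn from the embodiments themselves.

```python
import math

def haptic_parameters(touch_xy, target_xy, max_distance=60.0):
    """Map the distance between a touch location and the center of an
    input area to hypothetical haptic characteristics (Hz and ms)."""
    distance = math.dist(touch_xy, target_xy)
    # Normalize: 0.0 at the input area, 1.0 at max_distance or farther.
    t = min(distance / max_distance, 1.0)
    frequency_hz = 80.0 + (1.0 - t) * 220.0  # frequency increases when closer
    duration_ms = 20.0 + t * 180.0           # duration decreases when closer
    return frequency_hz, duration_ms

# Example: a touch 30 units from the center of the input area.
print(haptic_parameters((130.0, 100.0), (100.0, 100.0)))
```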
In some embodiments, two or more feedback areas or zones can be created within the boundary 110 of the user-selectable input area 108. As shown in
When a user input is to be directed to the user-selectable input area 108, haptic output can be provided in one or more of the feedback areas 124, 126, 128. For example, haptic output may only be produced in the feedback area 124 until the user input is at a first distance from the center of the user-selectable input area 108. Thereafter, haptic output may be produced only in the feedback area 126 until the user input is at a closer second distance from the center of the user-selectable input area 108. Finally, haptic output may be produced only in the feedback area 128 until the user input is within the user-selectable input area 108.
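A minimal sketch of this staged behavior, assuming hypothetical distance thresholds and area identifiers, might select the producing feedback area as follows:

```python
def select_feedback_area(distance, first_distance=40.0, second_distance=20.0):
    """Choose which feedback area (124, 126, or 128) produces haptic
    output based on the distance of the user input from the center of
    user-selectable input area 108; the thresholds are hypothetical."""
    if distance > first_distance:
        return "feedback_area_124"
    if distance > second_distance:
        return "feedback_area_126"
    return "feedback_area_128"

for d in (55.0, 30.0, 10.0):
    print(d, "->", select_feedback_area(d))
```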
In some embodiments, one or more characteristics of the haptic output can differ between the feedback areas 124, 126, 128. The characteristics of the haptic output include, but are not limited to, a frequency, a magnitude (e.g., amplitude), and a duration. For example, the frequency of the haptic output can increase as the user input moves closer to the boundary 110 of the user-selectable input area 108. Additionally or alternatively, the duration of the haptic output can decrease as the user input moves closer to (or away from) the boundary 110 of the user-selectable input area 108.
Additionally or alternatively, the haptic output can move inward from the boundary of the feedback area 124 to the boundary of the feedback area 126 to the boundary of the feedback area 128 to indicate to a user the direction a user input needs to move to reach the center of the user-selectable input area 108. In other words, the haptic output can home in on the center of the user-selectable input area 108.
Further, in some embodiments, the haptic output can be created throughout an entire feedback area 124, 126, 128. In other embodiments, the haptic output may be provided in one or more sections of a feedback area 124, 126, 128. The one or more sections can each have any given shape and dimension.
As yet another option, various combinations of haptic output can be produced in the feedback areas 124, 126, 128. For example, haptic output can be created throughout the entire feedback area 124, in a first section of the feedback area 126, and in a different second section of the feedback area 128.
In some embodiments, haptic output can be produced only when the user input is moving in one direction. For example, haptic output can be provided only when the user input is moving closer to the user-selectable input area 108 (indicated by arrows 134). In such embodiments, haptic output may not be produced when the touch event is moving away from the user-selectable input area 108 (indicated by arrows 136).
In some embodiments, haptic output having a first set of characteristics (e.g., frequency, magnitude, duration) can be provided when the touch event is moving in one direction (e.g., closer to the user-selectable input area 108) while haptic output having a different second set of characteristics may be produced when the user input is moving in a second direction (e.g., away from the user-selectable input area 108). For example, haptic output having a first magnitude can be generated when a user input is moving closer to the user-selectable input area 108 while haptic output having a different second magnitude can be generated when a touch event is moving away from the user-selectable input area 108. In this manner, characteristics of the haptic output can be associated with guiding a user input toward the user-selectable input area 108.
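One way to realize such direction-dependent output, sketched in Python under the assumption that touch locations are sampled periodically (the magnitudes shown are hypothetical), is to compare consecutive distances to the input area:

```python
import math

def guidance_magnitude(prev_xy, curr_xy, target_xy):
    """Return a haptic magnitude that depends on whether the touch is
    moving toward or away from the input area; returning None instead
    of the second value would suppress output when moving away."""
    moving_closer = math.dist(curr_xy, target_xy) < math.dist(prev_xy, target_xy)
    return 1.0 if moving_closer else 0.3  # first vs. second magnitude

print(guidance_magnitude((60, 40), (55, 40), (30, 40)))  # moving closer -> 1.0
print(guidance_magnitude((55, 40), (60, 40), (30, 40)))  # moving away  -> 0.3
```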
Additionally or alternatively, one or more characteristics of the haptic output may vary as a user input crosses a boundary into a feedback area 130, 132. For example, haptic output having a first duration can be generated when a user input crosses into the feedback area 130, and haptic output having a different second duration can be generated when a user input moves into the feedback area 132. In such embodiments, the duration of the haptic output may be associated with guiding the user input toward the user-selectable input area 108. For example, the duration of the haptic feedback can become shorter as the user input moves closer to the user-selectable input area 108.
In some embodiments, the active area of the user-selectable input area 108 can be adjusted to receive user inputs. As described earlier, the active area is the region in which a user input intended for a user-selectable input area or user-selectable input element is recognized and associated with the user-selectable input area. For example, in
An adjusted active area can have any given shape and/or dimensions. A touch event 140 can be recognized and associated with the user-selectable input area 108 when the touch event 140 (or a portion of the touch event 140) is detected in the adjusted active area 138.
The active area can be adjusted based on several factors. A particular user may submit one or more user inputs (e.g., touch and/or force inputs) outside of the boundary 110 of the user-selectable input area 108. Once a sufficient number of such inputs is received, or when a sufficient number is received within a given time period, the active area may be expanded or moved to include the region in which the user submits his or her inputs. Additionally or alternatively, a processing device can receive output signals from one or more sensors in the electronic device that provide data on the orientation of the electronic device and/or the location of the user-selectable input area 108. The active area may be adjusted based on the orientation of the electronic device (e.g., landscape or portrait orientation) and/or the location of the user-selectable input area. As yet another option, a user may create a user profile that specifies which active areas are to be adjusted. For example, the user can select different application programs in which an active area is to be adjusted.
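As one non-limiting sketch of the first factor, the following Python class expands an active area once a sufficient number of out-of-boundary inputs arrive within a sliding time window; the threshold, window length, and expansion amount are hypothetical values chosen purely for illustration:

```python
class ActiveAreaAdjuster:
    """Expand an active area after `threshold` out-of-boundary inputs
    occur within `window_s` seconds (all values hypothetical)."""

    def __init__(self, threshold=5, window_s=60.0, expand_by=10.0):
        self.threshold = threshold
        self.window_s = window_s
        self.expand_by = expand_by
        self.miss_times = []  # timestamps of inputs outside the boundary

    def record_missed_input(self, now):
        self.miss_times.append(now)
        # Keep only the misses that fall within the sliding window.
        self.miss_times = [t for t in self.miss_times if now - t <= self.window_s]
        return len(self.miss_times) >= self.threshold

    def adjusted_radius(self, radius):
        return radius + self.expand_by

adjuster = ActiveAreaAdjuster()
for second in range(5):
    should_expand = adjuster.record_missed_input(now=float(second))
print("expand active area:", should_expand, "->", adjuster.adjusted_radius(20.0))
```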
In some embodiments, adjustment of an active area can be accomplished via software, hardware, or a combination thereof. For example, in some embodiments, multiple sensors may be configured to detect user inputs. The sensors can be any suitable type of sensor or sensors. In a non-limiting example, a capacitive touch-sensing device or layer can include multiple capacitive sensors. Each capacitive sensor may be formed by two electrodes that are aligned in at least one direction (e.g., vertically) but separated by an air gap or by a deformable or compliant material. A subset of sensors in the multiple sensors can form an input element or area. When a touch event is detected by one or more sensors outside of the subset of sensors, a processing device can adjust the active area (e.g., increase the active area) by associating a second subset of sensors in the multiple sensors with the first subset of sensors. Collectively, the first and second subsets of sensors form the adjusted active area. The touch event can be recognized and associated with the input element when at least one sensor in the second subset of sensors or in the first subset of sensors detects the touch event.
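A minimal sketch of this subset-based association, using hypothetical sensor identifiers, is shown below; a touch event is attributed to the input element when any sensor in either subset reports it:

```python
# A first subset of sensors forms the input element; a second, disjoint
# subset is associated with it to form the adjusted active area.
input_element_sensors = {10, 11, 18, 19}                # hypothetical IDs
adjusted_area_sensors = {2, 3, 9, 12, 17, 20, 26, 27}   # hypothetical IDs

def touch_hits_input_element(triggered_sensors):
    """Recognize a touch event and associate it with the input element
    when at least one sensor in the first or second subset detects it."""
    return bool((input_element_sensors | adjusted_area_sensors)
                & set(triggered_sensors))

print(touch_hits_input_element([9]))   # True: detected by the second subset
print(touch_hits_input_element([40]))  # False: outside both subsets
```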
As discussed earlier, the haptic output can be provided to alert a user to a user-selectable input element, to guide a user input to the user-selectable input area, to alert a user to an adjusted active area of a user-selectable input element, and/or to guide a touch event to an adjusted active area of the input element. The user-selectable input element can be displayed on a touchscreen display, positioned below a portion of an enclosure of the electronic device, and/or associated with an I/O device coupled to, or within, the electronic device (e.g., a trackpad, a physical button).
As described earlier, a user-selectable input element can be displayed on a touchscreen display, positioned below a portion of an enclosure of an electronic device, and/or included in an input device coupled to, or included within, an electronic device (e.g., a trackpad, a physical button). One embodiment of a display that is configured to detect touch and/or force inputs and provide feedback will now be described.
Any suitable touch-sensing device or layer 202 can be used. For example, in one embodiment, the touch-sensing device 202 is configured as a capacitive touch-sensing device. Other embodiments can use a different type of touch-sensing technology, including, but not limited to, resistive, ultrasonic, infrared, and surface acoustic wave touch-sensing devices or layers.
In some embodiments, the touch-sensing device 202 includes multiple sensors that are each configured to detect user inputs. For example, a capacitive touch-sensing device can include an array of capacitive sensors. A processing device can be configured to dynamically associate a subset of sensors in the array with an input area or element. The processing device may also be configured to adjust the active area of the input element by associating another subset of sensors with the first subset of sensors to produce an adjusted active area.
A display layer 206 is positioned below the touch-sensing layer 202. The display layer 206 includes the display 104, and may include additional layers such as one or more polarizers, conductive layers, adhesive layers, and a backlight unit.
The display 200 can also include a support structure 208. In the illustrated embodiment, the support structure 208 is a U-shaped support structure that includes a support plate 210 and sides 212 that extend from the support plate 210 to the cover layer 204. The support plate 210 is depicted as a substantially horizontal support plate, although this is not required.
The support structure 208 can be made of any suitable material or materials. In one non-limiting example, the support structure 208 is made from a conductive material (e.g., a metal or metal alloy). Other embodiments can form the support structure 208, the support plate 210, and/or the sides 212 with a different material or combination of materials (e.g., plastic or a ceramic). In the illustrated embodiment, the support plate 210 extends along a length and a width of the display layer 206, although this is not required. The support structure 208 and/or the support plate 210 can have any given shape and/or dimensions in other embodiments.
The sides 212 of the support structure 208 can be connected to the cover layer 204 such that the support structure 208 is suspended from the cover layer 204. In other embodiments, the support structure 208 may be connected to a component other than the cover layer 204. For example, the support structure 208 and/or the support plate 210 can be attached to an enclosure of the display 200 (e.g., enclosure 102 in
An array 214 of haptic devices or actuators 216 may be affixed, through a circuit layer 218, to a surface of the support structure 208 (e.g., to the support plate 210). Although the array 214 is depicted with three haptic actuators 216, other embodiments are not limited to this number. The array 214 can include one or more haptic actuators 216.
In the illustrated embodiment, each haptic actuator 216 is attached and electrically connected to the circuit layer 218. Any suitable circuit layer 218 can be used. For example, in one embodiment, the circuit layer 218 may be a flexible printed circuit layer or a circuit board. The circuit layer 218 includes signal lines that are electrically connected to the haptic actuators 216. The signal lines can be used to transmit electrical signals to each haptic actuator 216. As will be described in more detail later, the signal lines are used to send actuation signals that selectively actuate one or more haptic actuators 216 to produce a deflection or deflections (e.g., haptic output) in the cover layer 204.
Any suitable type of haptic actuator can be used. For example, in one embodiment, each haptic actuator 216 is a ceramic piezoelectric actuator, such as a lead zirconate titanate actuator. Other embodiments can use a different type of piezoelectric actuator. Example piezoelectric actuators include, but are not limited to, a piezoelectric polymer material such as a polyvinylidene fluoride, a piezoelectric semiconductor material, and a lead-free piezoelectric material such as a potassium-based material (e.g., potassium-sodium niobate).
In the illustrated embodiment, the haptic actuators 216 are actuated with an electrical signal. When activated, each haptic actuator 216 converts the electrical signal into mechanical movement, vibrations, and/or force(s). The mechanical movement, vibrations, and/or force(s) generated by the actuated haptic actuator(s) 216 can be used to produce localized haptic output. When the haptic output is applied to a surface, a user can detect or feel the haptic output and perceive the haptic output as haptic feedback.
Each haptic actuator 216 can be selectively activated in the embodiment shown in
The support structure 208 is constructed and attached to the cover layer 204 to define a gap 224 between the top surface of the support plate 210 and a bottom surface of the display layer 206. In some embodiments, a first force-sensing component 220 and a second force-sensing component 222 may be positioned within the gap 224. For example, the first force-sensing component 220 can be affixed to the bottom surface of the display layer 206 and the second force-sensing component 222 can be attached to the top surface of the support plate 210. Together, the first and second force-sensing components 220, 222 form a force-sensing device. The force-sensing device can be used to detect an amount of force that is applied to the cover layer 204. As will be described later, the force input can be used in the methods shown in
In some implementations, the first force-sensing component 220 represents a first array of electrodes and the second force-sensing component 222 represents a second array of electrodes. The first and second arrays of electrodes can each include one or more electrodes. Each electrode in the first array of electrodes is aligned in at least one direction (e.g., vertically) with a respective electrode in the second array of electrodes to form an array of capacitive sensors. The capacitive sensors are used to detect a force applied to the cover layer 204 through measured capacitances or changes in capacitances. For example, as the cover layer 204 deflects in response to an applied force, a distance between the electrodes in at least one capacitive sensor changes, which varies the capacitance of that capacitive sensor. Drive and sense circuitry can be coupled to each capacitive sensor and configured to sense or measure the capacitance of each capacitive sensor. A processing device may be coupled to the drive and sense circuitry and configured to receive signals representing the measured capacitance of each capacitive sensor. The processing device can be configured to correlate the measured capacitances into an amount of force.
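Under a simple parallel-plate model, the measured capacitance varies inversely with the electrode gap (C = ε0εrA/d), so a capacitance change can be inverted to recover the deflection and then correlated with force through an effective stiffness of the stack. The following Python sketch illustrates that correlation; the sensor area, capacitance values, and stiffness are hypothetical:

```python
EPS_0 = 8.854e-12  # vacuum permittivity, in farads per meter

def gap_from_capacitance(capacitance, area, eps_r=1.0):
    """Parallel-plate model C = eps_0 * eps_r * A / d, solved for d."""
    return EPS_0 * eps_r * area / capacitance

def force_from_capacitances(c_rest, c_pressed, area, stiffness):
    """Correlate a capacitance change with an applied force using a
    hypothetical effective stiffness (N/m) for the display stack."""
    deflection = (gap_from_capacitance(c_rest, area)
                  - gap_from_capacitance(c_pressed, area))
    return stiffness * deflection

# Example: a 1 mm^2 sensor whose capacitance rises as the gap closes.
print(force_from_capacitances(c_rest=88.5e-15, c_pressed=118.0e-15,
                              area=1e-6, stiffness=5e4))  # ~1.25 N
```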
In other embodiments, the first and second force-sensing components 220, 222 can employ a different type of sensor to detect force or the deflection of the first force-sensing component 220 relative to the second force-sensing component 222. In some representative examples, the first and second force-sensing components 220, 222 can each represent an array of optical displacement sensors, magnetic displacement sensors, or inductive displacement sensors.
In some embodiments, one or both force-sensing components 220, 222 can be used to detect one or more touches on the cover layer 204. In such embodiments, the force-sensing component(s) 220, 222 have a dual function in that they are used to detect both touch and force inputs. In such embodiments, the touch-sensing device 202 may be omitted.
In some embodiments, a battery 226 is positioned below the support structure 208. The battery 226 provides power to the various components of the display 200. The battery 226 can be positioned such that a gap 228 is defined between the array 214 of haptic actuators 216 and a top surface of the battery 226. A third force-sensing component 230 can be disposed on a top surface of the battery 226. The third force-sensing component 230 may be used to detect a second amount of force. In some embodiments, the amount of force applied to the cover layer 204 may be sufficient to cause the touch-sensing device 202 and the display layer 206 to deflect such that the first force-sensing component 220 traverses into the gap 224 and contacts the second force-sensing component 222. When the cover layer 204 is deflected to a point where the first force-sensing component 220 contacts the second force-sensing component 222, the amount of force detected by the force-sensing device reaches a maximum level (e.g., a first amount of force). The force-sensing device cannot detect force amounts that exceed that maximum level. In such embodiments, the third force-sensing component 230 can detect the amount of force that exceeds the maximum level of the force-sensing device (e.g., a second amount of force) by correlating an amount of deflection between the support plate 210 and the third force-sensing component 230 with the excess force. For example, in some embodiments, the third force-sensing component 230 represents one or more electrodes that can be used to measure a change in capacitance between the support plate 210 and the third force-sensing component 230.
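A minimal sketch of this two-stage measurement, with a hypothetical saturation level for the first force-sensing device, might combine the two readings as follows:

```python
def total_applied_force(first_stage_force, first_stage_max, second_stage_force):
    """Combine the force-sensing device reading with the third
    force-sensing component 230: once the first stage saturates at its
    maximum level, the second stage supplies the excess amount."""
    if first_stage_force < first_stage_max:
        return first_stage_force
    return first_stage_max + second_stage_force

print(total_applied_force(2.0, 3.0, 0.0))  # below saturation: 2.0 N
print(total_applied_force(3.0, 3.0, 1.5))  # saturated: 3.0 + 1.5 = 4.5 N
```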
It should be noted that
The electronic device 300 includes an enclosure 302 at least partially surrounding a display 304. The enclosure 302 and the display 304 can be configured similarly to the enclosure 102 and the display 104 shown in
One or more user-selectable input areas 306, 308, 310 are disposed below the enclosure 302. In the illustrated embodiment, the user-selectable input areas 306, 308, 310 are positioned on a side of the enclosure 302, although this is not required. One or more user-selectable input areas 306, 308, 310 can be disposed below a back surface of the enclosure 302, a top surface of the enclosure 302 (e.g., user-selectable input area 112 in
The user-selectable input areas 306, 308, 310 can receive user inputs that are used to control a function or application of the electronic device 300. For example, the user-selectable input area 306 may receive a user input (e.g., a force input) to increase the volume of audio playback while the user-selectable input area 308 receives a user input (e.g., a force input) to decrease the volume of the audio output or playback.
Additionally or alternatively, the user-selectable input area 310 can receive a user input (e.g., a force input) to provide input to a function or operation of the electronic device 300. For example, a user may apply a user input (e.g., a touch input) to answer a telephone call received by the electronic device 300. In some embodiments, a user can apply a user input (e.g., a force input) to mute an audio device (e.g., a speaker) in the electronic device 300, to open an application program of the electronic device 300 (e.g., a communication program such as email or text), and/or to open a received communication (e.g., email or text).
Additionally or alternatively, feedback can be provided to alert a user to one or more locations of different inputs, to guide a user input to a user-selectable input area, and/or to alert a user to a distance to the user-selectable input area. The feedback can include tactile feedback, auditory feedback, olfactory feedback, visual feedback, and combinations thereof. For example, tactile feedback (e.g., haptic output) having one or more different characteristics (e.g., frequency, duration, magnitude) can be produced to alert a user to one or more locations of different inputs, to guide a user input to a user-selectable input area, and/or to alert a user to a distance to the user-selectable input area. With respect to the electronic device 300, haptic output having a first set of characteristics can be provided to indicate the location of a volume control input element while haptic output having a different second set of characteristics can be provided to indicate the location of a mute input element or an input element to accept a telephone call.
In some embodiments, tactile feedback (e.g., haptic output) having one or more different characteristics (e.g., frequency, duration, magnitude) can be produced to alert a user to a distance of a user-selectable input element. For example, haptic output having a first set of characteristics can be provided to indicate a user input is positioned a first distance from a volume control input element while haptic output having a different second set of characteristics can be provided to indicate the user input is located a different second distance (e.g., closer) from the volume control input element.
As yet another option, feedback having one or more different characteristics (e.g., frequency, duration, volume, intensity) can be produced to alert a user to a location of a user-selectable input element and to a type of input element (e.g., volume control, mute). The feedback can indicate a location of, and a distance from, a user-selectable input element. In other words, the feedback can indicate or provide different data or information regarding a user-selectable input element.
In some embodiments, at least a portion of the cover layer 406 and/or the trackpad 402 can depress or deflect when a user presses or applies a force to the cover layer 406. For example, a user may depress or deflect a portion of the trackpad 402 to perform a “click” or a “double click” that selects an icon displayed on the display 414. Additionally or alternatively, in some embodiments, a user can apply a force to the trackpad 402 to submit a force input for an application or function.
In some situations, feedback is provided to a user in response to the user interaction with the trackpad 402. For example, haptic output can be applied to the cover layer 406 based on a touch and/or force input. The haptic output can indicate to the user that the force or touch input is recognized by the electronic device 400.
As described earlier, the haptic output may be a force, movement, and/or a vibration that may be detected by a user as haptic feedback. The haptic output can produce planar movement (movement in the plane of the cover layer 406) and/or vertical movement (movement normal to the surface of the cover layer 406).
One or more user-selectable input elements or input areas 408, 410 can be associated with the trackpad 402. Similar to the embodiments shown in
The feedback can be any suitable type of feedback, including tactile feedback, auditory feedback, olfactory feedback, visual feedback, and combinations thereof. Additionally, in some embodiments, the active area associated with the user-selectable input area 408 and/or the user-selectable input area 410 can be adjusted to recognize user inputs that are submitted outside of the boundary of a respective user-selectable input area 408, 410 but are intended for the input area 408, 410. The techniques described in conjunction with
Additionally or alternatively, feedback may be provided proactively when a user is expected to provide a user input to one or both user-selectable input areas 408, 410. For example, a user can be expected to apply a force (e.g., a press) to the user-selectable input area 408 based on an application program running on the electronic device 400. A processing device (e.g., processing device 1002 in
The touch-sensing device 418 is configured to detect user inputs or touch events on the input surface or cover layer 416. The touch-sensing device 418 can employ any suitable sensing technology, including, but not limited to, capacitive touch sensing, resistive touch sensing, and ultrasonic touch sensing. In
As described earlier, a subset of the capacitive sensors can form an input element or input area. In some embodiments, when a user input is detected by one or more capacitive sensors outside of the subset of capacitive sensors, a processing device can adjust the active area (e.g., increase the active area) by associating a second subset of capacitive sensors with the first subset of capacitive sensors. Collectively, the first and second subsets of capacitive sensors form the adjusted active area. The user input can be recognized and associated with the input element when at least one sensor in the second subset of sensors or in the first subset of sensors detects the user input.
The I/O or circuit board 420 can include one or more force sensors 438, 440. As will be described later, force inputs detected by the force sensors 438, 440 can be used in the methods shown in
In some embodiments, a first haptic device includes a first electromagnetic actuator 442 attached to the circuit board 420 and a first conductive plate 444 affixed to the support plate or structure 422. The position of the first electromagnetic actuator 442 on the I/O or circuit board 420 corresponds to the position of the first conductive plate 444 on the support structure 422. When the I/O or circuit board 420 and the support structure 422 are attached to one another, the first electromagnetic actuator 442 is located adjacent to the first conductive plate 444.
A second haptic device includes a second electromagnetic actuator 446 attached to the circuit board 420 and a second conductive plate 448 affixed to the support plate or structure 422. Like the first electromagnetic actuator 442 and the first conductive plate 444, the position of the second electromagnetic actuator 446 on the I/O board 420 corresponds to the position of the second conductive plate 448 on the support structure 422. When the circuit board 420 and the support structure 422 are attached to one another, the second electromagnetic actuator 446 is located adjacent to the second conductive plate 448.
The second electromagnetic actuator 446 and the second conductive plate 448 have an orientation that is different from the orientation of the first electromagnetic actuator 442 and the first conductive plate 444. In the illustrated embodiment, the second electromagnetic actuator 446 and the second conductive plate 448 are oriented orthogonally to the first electromagnetic actuator 442 and the first conductive plate 444. This permits haptic output to be provided along different axes.
When haptic output is to be produced to alert a user and/or to guide a user to an input area, an alternating current passes through the first and/or the second electromagnetic actuators 442, 446, which produces time-varying magnetic fields. The time-varying magnetic fields attract and repel the corresponding conductive plates 444, 448 to generate the haptic output. The haptic output can transfer to the cover layer 416 and move or translate the cover layer 416 in one or more directions to provide feedback to a user. In one embodiment, the tactile feedback can guide the user input to a respective user-selectable input element (e.g., user-selectable input element 410). Additionally or alternatively, the tactile feedback may be provided for different purposes. For example, tactile feedback can be provided to alert the user to an adjusted active area for a user-selectable input element (e.g., user-selectable input element 408) and/or to a location of the user-selectable input element (e.g., user-selectable input element 408).
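By way of a hypothetical sketch, driving the two orthogonally oriented actuators could amount to routing a sinusoidal (alternating) drive signal to the actuator for the requested axis; the sample rate, frequency, and identifiers below are illustrative only:

```python
import math

def drive_waveform(axis, frequency_hz=150.0, amplitude=1.0,
                   duration_s=0.05, sample_rate=8000):
    """Generate a sinusoidal drive signal (arbitrary units) for the
    electromagnetic actuator on the requested axis."""
    n = int(duration_s * sample_rate)
    samples = [amplitude * math.sin(2.0 * math.pi * frequency_hz * i / sample_rate)
               for i in range(n)]
    target = "actuator_442" if axis == "x" else "actuator_446"
    return target, samples

target, samples = drive_waveform("y")
print(target, len(samples), round(samples[1], 3))
```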
Although the present invention has been described in conjunction with a tablet computing device, a laptop computer, and a smart phone, other embodiments are not limited to these devices. As described earlier, an electronic device can be a wearable computing device, a digital music player, a kiosk, a standalone touch screen display, headphones, a smart stylus, and other types of input, input/output, accessory, and/or electronic devices that have one or more user-selectable input areas.
A determination is made at block 502 as to whether the user input is located within the active area of a user-selectable input area. If so, the method passes to block 504 where the user input is recognized. Recognition of the user input may include generating an input signal that is processed and utilized by other components in the electronic device. For example, a processing device can receive the input signal and associate the input signal with an application program (e.g., selection of an application program or an input to an application program).
When the user input is not situated within the active area of the user-selectable input area, the process continues at block 506 where feedback (e.g., haptic output) is provided to guide the user input to the user-selectable input area. In some embodiments, directing the user input to the user-selectable input area includes tracking the location of the object providing the user input (e.g., a finger, a stylus, or the like) with respect to the user-selectable input area. As discussed earlier, one or more characteristics of the feedback can vary based on the location of the object with respect to the user-selectable input area. Additionally or alternatively, the type or types of feedback may vary based on the location of the object with respect to the user-selectable input area.
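The guidance loop of blocks 502-506 can be sketched as follows, with hypothetical callbacks standing in for the touch-sensing layer, the active-area test, and the feedback device:

```python
import math

def guide_to_input_area(read_touch, in_active_area, emit_feedback,
                        target_xy, max_steps=1000):
    """Track the object providing the user input and vary feedback with
    its distance from the input area until the input lands inside it."""
    for _ in range(max_steps):
        touch_xy = read_touch()
        if in_active_area(touch_xy):
            return True  # block 504: recognize and generate an input signal
        emit_feedback(math.dist(touch_xy, target_xy))  # block 506
    return False

# Example with stub callbacks that walk a touch toward the origin.
positions = iter([(30.0, 0.0), (15.0, 0.0), (3.0, 0.0)])
print(guide_to_input_area(lambda: next(positions),
                          lambda p: math.dist(p, (0.0, 0.0)) < 5.0,
                          lambda d: print("feedback at distance", d),
                          (0.0, 0.0)))
```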
In some situations, a combination of feedback types can be used to guide a user input to a user-selectable input area, as in the embodiment shown in the accompanying figures.
Initially, as shown in block 600, a user of an electronic device may be identified. A user can be identified through a variety of techniques. For example, in one embodiment, a user is identified when he or she enters a password to unlock or otherwise access an electronic device, an application program running on the electronic device, and/or a website. In some instances, a user can submit biometric data (e.g., a fingerprint) to unlock or access the electronic device, the application program, and/or the website. Thus, a user can self-identify in some embodiments.
Additionally or alternatively, one or more characteristics of the user's interactions with an electronic device may be used to identify the user. For example, the application programs and/or websites a user accesses can be used to identify the user. In some embodiments, a user's typing speed, communications (e.g., emails, texts, telephone calls), calendar entries, and/or locations (e.g., via a location services application) may be used to identify the user.
After the user is identified in block 600, a determination is made at block 602 as to whether the active area for a user-selectable input element is to be adjusted based on the identity of the user. For example, a user profile may be associated with the user that specifies which active areas are to be adjusted. In some embodiments, the user can select different application programs in which an active area is to be adjusted and store these selections in a user profile.
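As one hypothetical sketch of blocks 600 and 602, a stored profile may record the user's selections. The profile layout, field names, and example entries below are assumptions made for illustration only.

```python
# Hypothetical profile store; the layout and field names are illustrative only.
USER_PROFILES = {
    "alice": {"adjust_active_areas": True, "apps": {"maps", "music"}},
    "bob":   {"adjust_active_areas": False, "apps": set()},
}

def should_adjust_active_area(user_id: str, app: str) -> bool:
    """Block 602: decide from the identified user's profile whether the
    active area is to be adjusted for the given application program."""
    profile = USER_PROFILES.get(user_id)
    return bool(profile
                and profile["adjust_active_areas"]
                and app in profile["apps"])

assert should_adjust_active_area("alice", "maps") is True
assert should_adjust_active_area("bob", "maps") is False
```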
If the active area is not to be adjusted, the method waits at block 600. When the active area of a user-selectable input element is to be adjusted, the process continues at block 604, where the active area of the user-selectable input element is adjusted. In some embodiments, feedback (e.g., haptic feedback) is provided to alert the user to the adjustment of the active area (block 606), although this is not required; block 606 may be omitted in other embodiments.
A user input (e.g., a touch or force input) is then detected, and a determination is made as to whether the user input is in the adjusted active area (blocks 608 and 610). If the user input is in the adjusted active area, the process passes to block 612, where the user input is recognized and associated with the user-selectable input element. In some embodiments, the user input is recognized and associated with the input element when the user input is only partially within the adjusted active area. As described earlier, recognition of the user input may include generating an input signal that is processed and utilized by other components in the electronic device.
If the user input is not in the adjusted active area, the method continues at block 614, where feedback is provided to guide the user input to the adjusted active area. The method then returns to block 610 and repeats until the user input is detected in the adjusted active area.
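Continuing the earlier sketch (and reusing its Rect, distance_to_rect, and play_haptic helpers), blocks 604 through 614 may be expressed as follows. The 20-point margin is an arbitrary illustrative value, not a value from the described embodiments.

```python
def expand(area: Rect, margin: float) -> Rect:
    """Block 604: the adjusted active area extends beyond the element's boundary."""
    return Rect(area.x - margin, area.y - margin,
                area.w + 2.0 * margin, area.h + 2.0 * margin)

def handle_adjusted_input(element_area: Rect, touches):
    adjusted = expand(element_area, margin=20.0)     # block 604
    play_haptic(0.5)                                 # block 606: optional alert
    for px, py in touches:                           # blocks 608-610
        if adjusted.contains(px, py):
            return {"event": "input", "element": element_area}           # block 612
        play_haptic(max(0.0, 1.0 - distance_to_rect(adjusted, px, py) / 100.0))  # block 614
    return None
```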
In some embodiments, feedback can be provided to a user to indicate a touch event (e.g., touch and/or force input) is recognized.
Initially, as shown in block 700, a touch event is detected. A determination is then made at block 702 as to whether the force component of the touch event equals or exceeds a force threshold. If the force component does not equal or exceed the force threshold, the process passes to block 704 where the touch event is rejected.
If the force component of the touch event equals or exceeds the force threshold, the method continues at block 706, where feedback is provided to the user. The feedback can be tactile, auditory, visual, or combinations thereof. The touch event is then recognized, as shown in block 708. As described earlier, recognition of the user input may include generating an input signal that is processed and utilized by other components in the electronic device.
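The logic of blocks 700 through 708 reduces to a single threshold comparison. In this sketch the threshold value, its units, and the provide_feedback stand-in are assumptions made for illustration.

```python
FORCE_THRESHOLD = 0.8  # illustrative value in arbitrary force units

def provide_feedback(kind: str) -> None:
    print(f"feedback: {kind}")  # stand-in for a tactile/auditory/visual device

def handle_force_touch(force: float):
    if force < FORCE_THRESHOLD:              # block 702
        return None                          # block 704: reject the touch event
    provide_feedback("press recognized")     # block 706
    return {"event": "input"}                # block 708: recognize, emit input signal
```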
Other events or factors may be considered when recognizing a touch event. For example, the amount of time a touch event that does not include a force component (e.g., a press) is in contact with (or hovers over) a user-selectable input element can be compared against a time threshold; when the amount of time equals or exceeds the time threshold, the touch event is recognized and associated with the user-selectable input element.
If the amount of time the touch event is in contact with (or hovers over) a user-selectable input element does not equal or exceed the time threshold, the method passes to block 808 where a determination is made as to whether the contact (or the hovering over) is maintained. In other words, a determination is made as to whether the touch event remains at the same location or at the same user-selectable input element. If so, the process returns to block 802. If the touch event is not maintained, the touch event is rejected at block 810.
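A sketch of this dwell-time logic (blocks 802 through 810) follows. Here sample_location and element_at are hypothetical callables standing in for the touch-sensing layer, and the 0.5-second threshold is an arbitrary illustrative value.

```python
import time

TIME_THRESHOLD_S = 0.5  # illustrative dwell time

def recognize_by_dwell(sample_location, element_at):
    """Recognize a forceless touch that stays on one user-selectable
    input element for at least the time threshold."""
    start = time.monotonic()
    element = element_at(*sample_location())
    while True:
        if time.monotonic() - start >= TIME_THRESHOLD_S:
            return {"event": "input", "element": element}   # threshold met: recognize
        if element_at(*sample_location()) != element:       # block 808: still maintained?
            return None                                     # block 810: reject
        time.sleep(0.01)                                    # loop back to block 802
```

For example, recognize_by_dwell(lambda: (10, 10), lambda x, y: "button-1") would recognize the touch after roughly half a second of maintained contact.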
In some embodiments, a user may want to present data on a display only after the user has taken an action. For example, some electronic devices may display data in response to an inadvertent touch event, but the user may want to keep the data confidential and present the information only in response to a particular user input.
If the force input equals or exceeds the force threshold, the method continues at block 906 where feedback is provided to the user to indicate the force input is recognized. As discussed earlier, the feedback can be tactile feedback, auditory feedback, visual feedback, olfactory feedback, or combinations thereof. Next, as shown in block 908, data including one or more user-selectable input elements is displayed on the display screen.
A determination is then made at block 910 as to whether a touch input to a user-selectable input element is detected. If not, the method waits at block 910. When a touch input is detected, the process continues at block 912 where feedback is provided to the user to indicate the touch input is recognized. The touch input is then recognized at block 914. As described earlier, recognition of the touch input may include generating an input signal that is processed and utilized by other components in the electronic device.
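Blocks 906 through 914 may be sketched as a force gate followed by a wait for a touch. The show-data print statement, wait_for_touch, and element_at are hypothetical helpers, and the force threshold and provide_feedback stand-in are reused from the earlier illustrative sketch.

```python
def reveal_and_select(force: float, wait_for_touch, element_at):
    if force < FORCE_THRESHOLD:                   # inadvertent touch: keep data hidden
        return None
    provide_feedback("force input recognized")    # block 906
    print("displaying data and input elements")   # block 908 (stand-in for the display)
    x, y = wait_for_touch()                       # block 910: wait for a touch input
    provide_feedback("touch input recognized")    # block 912
    return {"event": "input", "element": element_at(x, y)}   # block 914
```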
The one or more processing devices 1002 can control some or all of the operations of the electronic device 1000. The processing device(s) 1002 can communicate, either directly or indirectly, with substantially all of the components of the electronic device 1000. For example, one or more system buses 1016 or other communication mechanisms can provide communication between the processing device(s) 1002, the memory 1004, the I/O device(s) 1006, the power source 1008, the one or more sensors 1010, the network communication interface 1012, and/or the display 1014. At least one processing device 1002 can be configured to determine if an active area of a user-selectable input element is to be adjusted and if so, adjust the active area. Additionally or alternatively, the at least one processing device 1002 may be configured to cause one or more feedback devices (e.g., auditory and/or haptic feedback devices) to provide feedback to guide a user input to the input element and/or the adjusted active area.
The processing device(s) 1002 can be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the one or more processing devices 1002 can be a microprocessor, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), or combinations of multiple such devices. As described herein, the term “processing device” is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, or other suitably configured computing element or elements.
The memory 1004 can store electronic data that can be used by the electronic device 1000. For example, the memory 1004 can store electronic data or content such as audio files, document files, timing and control signals, operational settings and data, user profiles, and image data. The memory 1004 can be configured as any type of memory. By way of example only, the memory 1004 can be implemented as random access memory, read-only memory, flash memory, removable memory, or other types of storage elements, in any combination.
The one or more I/O devices 1006 can transmit and/or receive data to and from a user or another electronic device. Example I/O device(s) 1006 include, but are not limited to, a trackpad, one or more buttons, a microphone, a haptic device, a speaker, and/or one or more feedback devices 1018. Any suitable feedback device(s) can be used. For example, in some embodiments, the feedback device(s) 1018 may include one or more tactile feedback devices 1020 (e.g., haptic device) and/or one or more auditory feedback devices 1022 (e.g., audio device).
Any suitable auditory feedback device 1022 may be used. One example of an auditory feedback device (e.g., an audio device) is a speaker. In some embodiments, an auditory feedback device 1022 can include any device that produces sound when operating (or as a result of operating). For example, an actuator that is configured to move one or more components can produce sound as a result of moving the component(s). In a non-limiting embodiment, an electromagnetic linear actuator may produce different sounds as a result of moving a magnetic mass in the electromagnetic linear actuator. Thus, in some embodiments, an auditory feedback device 1022 may be configured as a tactile feedback device 1020.
Additionally, any suitable tactile feedback device 1020 can be used. For example, in one embodiment, at least one tactile feedback device 1020 may be configured as one or more haptic devices. The haptic device(s) can be configured as the haptic actuators discussed earlier in conjunction with the accompanying figures.
As one example, the I/O device 105 shown in an earlier figure may include one or more such haptic devices.
The power source 1008 can be implemented with any device capable of providing energy to the electronic device 1000. For example, the power source 1008 can be one or more batteries or rechargeable batteries, or a connection cable that connects the electronic device to another power source such as a wall outlet.
The electronic device 1000 may also include one or more sensors 1010 positioned substantially anywhere on or in the electronic device 1000. The sensor or sensors 1010 may be configured to sense substantially any type of characteristic, such as but not limited to, images, pressure, light, heat, force, touch, temperature, humidity, movement, relative motion, biometric data, and so on. For example, the sensor(s) 1010 may be an image sensor, a temperature sensor, a light or optical sensor, an accelerometer, an environmental sensor, a gyroscope, a magnet, a health monitoring sensor, and so on.
As one example, the electronic device 1000 may include a force-sensing device (e.g., the first and second force-sensing components 220, 222 described earlier) that detects the amount of force applied to an input surface.
The network communication interface 1012 can facilitate transmission of data to or from other electronic devices. For example, a network communication interface can transmit electronic signals via a wireless and/or wired network connection. Examples of wireless and wired network connections include, but are not limited to, cellular, Wi-Fi, Bluetooth, infrared, RFID, Ethernet, and NFC.
The display 1014 can provide a visual output to the user. The display 1014 can be implemented with any suitable technology, including, but not limited to, a multi-touch sensing touchscreen that uses an LCD element, LED element, OLED element, OEL element, or another type of display element. In some embodiments, the display 1014 can function as an input device that allows the user to interact with the electronic device 1000. For example, the display can include a touch-sensing device that permits the display 1014 to function as a touch or multi-touch display.
In some embodiments, the display 1014 may include one or more feedback devices 1018. Any suitable feedback device(s) 1018 can be used. For example, in some embodiments, the feedback device(s) 1018 can be configured as one or more tactile feedback devices 1020 and/or one or more auditory feedback devices 1022.
It should be noted that the electronic device 1000 is illustrative only. In other embodiments, an electronic device may include fewer or more components than those shown, and the components may be arranged or combined differently.
As described earlier, any suitable type of feedback or feedback combinations can be provided to a user. The feedback can be tactile feedback, auditory feedback, olfactory feedback, visual feedback, and combinations thereof. Additionally, one or more characteristics of the feedback can vary. For example, the volume and/or frequency of auditory feedback provided to a user can change, or the size, color, and/or intensity of visual feedback may vary.
The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.
This application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application No. 62/384,688, filed on Sep. 7, 2016, and entitled “Adaptable User-Selectable Input Area in an Electronic Device,” which is incorporated by reference as if fully disclosed herein.