Method for adjusting the user interface of a device

Information

  • Patent Grant
  • Patent Number
    9,229,571
  • Date Filed
    Tuesday, October 15, 2013
  • Date Issued
    Tuesday, January 5, 2016
Abstract
A method for adjusting a user interface experience for a device is disclosed. The method includes providing a user interface to retrieve a user input; providing a tactile interface layer that defines a surface and includes a volume of fluid and a displacement device that manipulates the volume of fluid to deform a particular region of the surface into a tactilely distinguishable formation; retrieving, through the user interface, a user preference between a first type, location, and/or timing and a second type, location, and/or timing; and manipulating the volume of fluid to deform a particular region of the surface into a tactilely distinguishable formation of one of the first and second type, location, and/or timing.
Description
TECHNICAL FIELD

This invention relates generally to touch sensitive user interfaces, and more specifically to new and useful systems and methods for selectively raising portions of touch sensitive displays.


BACKGROUND

The user interface system of U.S. application Ser. Nos. 11/969,848 and 12/319,334 is preferably used as the user interface for an electronic device, more specifically an electronic device that benefits from an adaptive user interface. The user interface system functions to provide a tactile guide and/or feedback to the user. Because of the variety of devices and uses the user interface system may serve, for example, an automotive console, a tablet computer, a smartphone, a personal navigation device, a personal media player, a watch, a remote control, a trackpad, or a keyboard, the user interface system must accommodate each application to provide the user with the kind of tactile guide and/or feedback that facilitates operation of the device 10. In addition, each user may have a different preference for the kind of tactile guide and/or feedback that is most useful in facilitating operation of the device. For example, while some users may prefer a larger surface area of tactile guidance, others may prefer a larger degree of deformation of that surface area. Because of the large range of usage scenarios, determining an average user interface system setting that accommodates a relatively large range of user preferences for each usage scenario requires a substantial amount of time and research. In addition, because of the large range of user preferences, configuring one set of settings for each use scenario may not provide a user with their preferred tactile guidance and/or feedback. This invention allows the user to adjust the characteristics of the user interface system so that it efficiently accommodates both the usage scenario and the user across a large range of devices and usage scenarios.





BRIEF DESCRIPTION OF THE FIGURES


FIGS. 1 and 2 are a first and second variation of the method of the preferred embodiments, respectively.



FIG. 3 is a top view of the user interface system of a preferred embodiment.



FIGS. 4a and 4b are cross-sectional views of the tactile interface layer of a first and second variation, respectively.



FIGS. 5a, 5b, and 5c are cross-sectional views illustrating the operation of a particular region of the surface of the tactile interface layer in accordance with the preferred embodiments.



FIGS. 6a and 6b are representations of a set of variations of the user interface system.



FIGS. 7-9 are examples of input interfaces provided to the user on the device.





DESCRIPTION OF THE PREFERRED EMBODIMENTS

The following description of the preferred embodiments of the invention is not intended to limit the invention to these preferred embodiments, but rather to enable any person skilled in the art to make and use this invention.


As shown in FIGS. 1 and 2, the method S100 of the preferred embodiments for adjusting a user interface for a device preferably includes providing a user interface to retrieve a user input Step S110, providing a tactile interface layer that defines a surface and includes a volume of fluid and a displacement device that manipulates the volume of fluid to deform a particular region of the surface into a tactilely distinguishable formation Step S120, retrieving a user preference between a first choice of type, location, and/or timing and a second choice of type, location, and/or timing through the user interface Step S130, and manipulating the volume of fluid to deform a particular region of the surface into a tactilely distinguishable formation of the chosen type, location, and/or timing Step S140. The tactile interface layer may also include a sensor that detects a user input at the tactilely distinguishable formation. In this variation, the step of retrieving a user preference S130 may also include retrieving a user preference between a first sensitivity and a second sensitivity for the sensor through the user interface, and the step of manipulating the volume of fluid to deform a particular region of the surface Step S140 may include manipulating the volume of fluid to deform a particular region of the surface into one of a first embodiment of a tactilely distinguishable formation for the first sensitivity of the sensor and a second embodiment of a tactilely distinguishable formation for the second sensitivity of the sensor, based on the user preference.
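The four steps of method S100 can be sketched as a short control flow. The sketch below is illustrative only; the class and function names (`TactileLayer`, `adjust_user_interface`) and the example formation types and locations are hypothetical, not part of the disclosed system, which is realized in hardware.

```python
# Illustrative model of method S100: retrieve a user preference between
# two choices (Step S130) and deform a region accordingly (Step S140).
# All identifiers and example values are hypothetical.

class TactileLayer:
    """Stands in for the tactile interface layer of Steps S120/S140."""

    def __init__(self):
        self.deformed = {}  # location -> formation type currently raised

    def deform(self, location, formation):
        # Step S140: manipulate the fluid volume to raise the region
        # into a tactilely distinguishable formation.
        self.deformed[location] = formation


def adjust_user_interface(layer, get_preference):
    """Steps S110-S140: present two choices of type/location through a
    user interface (modeled by the get_preference callable), then apply
    the chosen deformation."""
    choices = {"first": ("button", "top-left"),
               "second": ("slider", "bottom")}
    chosen = get_preference(list(choices))   # Step S130 via the UI
    formation, location = choices[chosen]
    layer.deform(location, formation)        # Step S140
    return formation, location
```

A caller might pass a callable that prompts the user; here a lambda picking the second choice stands in for that interaction.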
The step of providing a user interface to retrieve a user input S110 may include providing a user interface to retrieve a user input on the device, providing a user interface to retrieve a user input on the tactile interface layer, providing a user interface to retrieve a user input that is on both the device 10 and the tactile interface layer, providing a user interface on a remote control for the device 10 (for example, a wireless remote control), or providing a user interface in any other suitable arrangement.


1. Providing a Tactile Interface Layer


As shown in FIGS. 3 and 4, the tactile interface layer 100 provided in Step S120 of the preferred embodiment includes: a layer 110 defining a surface 115, a substrate 120 supporting the layer 110 and at least partially defining a fluid vessel 127, and a displacement device 130 coupled to the fluid vessel 127 that influences the volume of fluid 112 within the fluid vessel 127 to expand and retract at least a portion of the fluid vessel 127, thereby deforming a particular region 113 of the surface 115. The surface 115 is preferably continuous, such that when swiping a finger across the surface 115 a user would not feel any substantial seams or any other type of interruption in the surface 115. Alternatively, the surface 115 may include features that facilitate the user in distinguishing one region from another. The surface 115 is also preferably planar. The surface 115 is preferably arranged in a flat plane, but may alternatively be arranged in a curved plane or on a first plane and then wrapped around to a second plane substantially perpendicular to the first plane, or any other suitable arrangement. The surface 115 may alternatively include lumps, bumps, depressions, textures, or may be a surface of any other suitable type or geometry. The fluid vessel 127 preferably includes a cavity 125 and the displacement device 130 preferably influences the volume of fluid 112 within the cavity 125 to expand and retract the cavity 125. The fluid vessel 127 may alternatively be a channel 138 or a combination of a channel 138 and a cavity 125, as shown in FIG. 4b. In the variation shown in FIG. 4b, the substrate 120 preferably defines a fluid outlet 116 that allows fluid to flow between the channel 138 and the cavity 125 to deform and un-deform a particular region of the surface 113.
The fluid outlet may be formed into the substrate, for example, the fluid outlet 116 may be a series of bores that are machined into the substrate in between the channel 138 and the cavity 125 as shown in FIG. 4b or an open orifice between the cavity 125 and the channel 138 as shown in FIG. 4a, but may alternatively be a property of the material, for example, the substrate 120 may include a porous material that includes a series of interconnected cavities that allow fluid to flow through the substrate 120. The substrate 120 may define any suitable number of fluid outlets 116 that are of any suitable size and shape. The tactile interface layer may also include a fluid outlet layer (not shown) that defines the fluid outlets 116, is separate from the substrate 120, and is arranged in between the substrate 120 and the layer 110. However, any other suitable arrangement of the fluid outlets 116 may be used. As shown in FIG. 4b, the portion of the substrate 120 (or the fluid outlet layer) that includes the fluid outlets 116 may also function to provide a support for the layer 110 to substantially prevent the layer 110 from depressing into the channel 138 when force is applied over the particular region 113. However, the substrate 120 may be arranged in any other suitable manner and may provide support for the layer 110 in any other suitable way.


The layer 110 is preferably attached to the substrate 120 (or fluid outlet layer) at an attachment point 117 that at least partially defines the size and/or shape of the particular region 113. In other words, the attachment point 117 functions to define a border between a deformable particular region of the surface 113 and the rest of the surface 115, and the size of the particular region 113 is substantially independent of the size of the cavity 125 and/or the channel 138. The attachment point 117 may be a series of continuous points that define an edge, but may alternatively be a series of non-continuous points. The attachment point 117 may be formed using, for example, adhesive, chemical bonding, surface activation, welding, or any other suitable attachment material and/or method. The method and material used to form the attachment point 117 are preferably of a similar optical property as the layer 110 and the substrate 120, but may alternatively be of any other optical property. Other portions of the layer 110 and substrate 120 not corresponding to a particular region of the surface 113 may also be adhered using similar or identical materials and methods to the attachment point 117. Alternatively, the layer 110 and substrate 120 may be left unattached in other portions not corresponding to a particular region of the surface 113. However, the layer 110 and the substrate 120 may be arranged in any other suitable manner.


The fluid vessel 127 may also include a second cavity 125b. When the second cavity 125b is expanded, a second particular region 113 on the surface 115 is preferably deformed. The displacement device 130 preferably influences the volume of fluid 112 within the second cavity 125b independently of the cavity 125, but may alternatively influence the volumes of fluid 112 within both the cavity 125 and the second cavity 125b substantially concurrently. Alternatively, the user interface enhancement system 100 may include a second displacement device 130 that functions to influence the volume of fluid 112 within the second cavity 125b to expand and retract the second cavity 125b, thereby deforming a second particular region 113 of the surface. The second cavity 125b is preferably similar or identical to the cavity 125, but may alternatively be any other suitable kind of cavity. The following examples may be described as expanding a fluid vessel 127 that includes a cavity 125 and a channel 138, but the fluid vessel 127 may be any other suitable combination of cavity 125 and/or channel 138. The tactile interface layer 100 may also include a display 150 coupled to the substrate 120 and adapted to output images to the user. As described above, the tactile interface layer 100 may also include a sensor 140 that functions to detect inputs from the user. The sensor 140 may be a capacitive sensor, a pressure sensor, a touch sensitive display, or any other suitable sensor type that detects the presence of a user input. The sensor 140 may be located within the fluid vessel 127, substantially adjacent to the fluid vessel 127 (as shown in FIGS. 4a and 4b), remote from the fluid vessel 127, remote from a cavity 125 but fluidly coupled to the fluid vessel 127, or in any other suitable location.


The tactile interface layer 100 of the preferred embodiments has been specifically designed to be used as the user interface for an electronic device 10, more preferably in an electronic device 10 that benefits from an adaptive user interface. The electronic device 10 may or may not include a display and/or a touch sensor, for example, an automotive console, a desktop computer, a laptop computer, a tablet computer, a television, a radio, a desk phone, a mobile phone, a PDA, a personal navigation device, a personal media player, a camera, a watch, a remote control, a mouse, a trackpad, or a keyboard. The tactile interface layer 100 may, however, be used as the user interface for any suitable device 10 that interfaces with a user in a tactile and/or visual manner. The tactile interface layer 100 is preferably integrated with the device, for example, in the variation wherein the tactile interface layer 100 includes a sensor 140, the tactile interface layer 100 is preferably assembled into the device 10 and presented to the user as one unit. Alternatively, the tactile interface layer 100 may function as an accessory to a device 10, the user may be presented the tactile interface layer 100 and the device 10 as two separate units wherein, when coupled to each other, the tactile interface layer 100 functions to provide tactile guidance to the user and/or to receive user inputs. However, any other suitable arrangement of the tactile interface layer 100 may be used.


As shown in FIG. 5, the surface 115 of the tactile interface layer 100 preferably remains flat until tactile guidance is to be provided to the user at the location of the particular region 113. The displacement device 130 then preferably expands the cavity 125 to expand the particular region 113 outward, forming a deformation that may be felt by a user, and providing tactile guidance for the user. The expanded particular region 113 preferably also provides tactile feedback to the user when he or she applies force onto the particular region 113 to provide input. Alternatively, the displacement device 130 may retract the cavity 125 to deform the particular region 113 inward. However, any other suitable deformation of the particular region 113 may be used.


As shown in FIG. 5, the cavity 125 of the fluid vessel 127 of the preferred embodiment functions to hold a volume of fluid 112 and to have at least two volumetric settings: a retracted volume setting (shown in FIG. 5a) and an extended volume setting (shown in FIG. 5b). The fluid 112 is preferably a substantially incompressible fluid, but may alternatively be a compressible fluid. The fluid 112 is preferably a liquid (such as water, oil, glycerin, or ethylene glycol), but may alternatively be a gas (such as air, nitrogen, or argon) or any other substance (such as a gel or aerogel) that expands the cavity 125 and deforms the surface 115. In the extended volume setting, the cavity 125 deforms the particular region 113 of the surface 115 above the plane of the other regions of the surface 115. When used with a mobile phone device, the deformation of the particular region 113 preferably has a diameter of 2-10 mm and the cavity 125 may be of a substantially equal diameter as the deformation of the particular region 113 or may be of a smaller or larger diameter. When used with this or other applications, however, the cavity 125 may have any suitable dimension.


The displacement device 130 of the preferred embodiment functions to influence the volume of the fluid 112 within the fluid vessel 127 to expand and retract at least a portion of the fluid vessel 127, thereby deforming a particular region 113 (and/or a second particular region 113) of the surface 115. When used with a mobile phone device, the displacement device 130 preferably increases the volume of the fluid 112 within the fluid vessel 127 by approximately 0.003-0.1 ml to expand the cavity 125 to outwardly deform a particular region 113. When used with this or other applications, however, the volume of the fluid may be increased (or possibly decreased) by any suitable amount. The displacement device 130 preferably modifies the volume of the fluid 112 by (1) modifying the volume of the existing fluid 112 in the fluid vessel 127, or (2) adding and removing fluid 112 to and from the fluid vessel 127. The displacement device 130 may, however, influence the volume of the fluid 112 by any suitable device or method. Modifying the volume of the existing fluid 112 in the fluid vessel 127 most likely has an advantage of lesser complexity, while adding and removing fluid 112 to and from the fluid vessel 127 most likely has an advantage of maintaining the deformation of the surface 115 without the need for additional energy (if valves or other lockable mechanisms are used). Although the cause of the deformation of a particular region 113 of the surface 115 has been described as a modification of the volume of the fluid in the fluid vessel 127, it is possible to describe the cause of the deformation as an increase or decrease in the pressure below the surface 115 relative to the pressure above the surface 115. When used with a mobile phone device, an increase of approximately 0.1-10.0 psi of the pressure below the layer 110 relative to the pressure above the layer 110 is preferably enough to outwardly deform a particular region 113 of the surface 115.
When used with this or other applications, however, the pressure may be increased (or possibly decreased) by any suitable amount.
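As a numeric sketch of the volume figures quoted above (approximately 0.003-0.1 ml for a mobile-phone-scale deformation), a normalized user preference could be mapped linearly onto that range. The function below is an assumption for illustration only; the patent does not disclose a control law, and the name `volume_increase_ml` is hypothetical.

```python
def volume_increase_ml(preference, v_min=0.003, v_max=0.1):
    """Map a normalized preference in [0, 1] onto the 0.003-0.1 ml
    fluid-volume increase described for a mobile phone device.

    preference=0.0 yields the minimum increase (smallest deformation);
    preference=1.0 yields the maximum increase (largest deformation).
    """
    if not 0.0 <= preference <= 1.0:
        raise ValueError("preference must be in [0, 1]")
    return v_min + preference * (v_max - v_min)
```

A linear map is only one possible choice; the relationship between fluid volume and perceived deformation height in a real system would be calibrated, not assumed.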


The shape of the deformation of the particular region 113 is preferably one that is felt by a user through their finger and preferably acts as (1) a button that can be pressed by the user, (2) a slider that can be pressed by the user in one location along the slider or that can be swept in a sliding motion along the slider (such as the “click wheel” of the second generation Apple iPod), and/or (3) a pointing stick that can be pressed by the user from multiple directions and/or locations along the surface whereby the user is provided with tactile feedback that distinguishes a first directional touch from a second directional touch and/or a touch in a first location from a touch in a second location (such as the pointing stick trademarked by IBM as the TRACKPOINT and by Synaptics as the TOUCHSTYK (which are both informally known as the “nipple”)). The deformation may, however, act as any other suitable device or method that provides suitable tactile guidance and feedback. In the variation including a display 150, the shape of the deformation of the particular region 113 also preferably functions to minimize the optical distortion of the image underneath the deformed particular region 113.


2. Retrieving a User Preference and Manipulating the Volume of Fluid


The user preference retrieved in Step S130 is preferably one of the following embodiments: a first embodiment for the operation of the tactile interface layer 100, a second embodiment for interaction between the device and the tactile interface layer, and a third embodiment for operation of the device. The step of retrieving a user preference S130 of the first embodiment preferably includes retrieving a user preference for the operation of the tactile interface layer Step S132 and the step of manipulating the volume of fluid to deform a particular region of the surface of the first embodiment S140 preferably includes manipulating the volume of fluid to deform a particular region of the surface based on the user preference for the operation of the tactile layer Step S142. The step of retrieving a user preference S130 of the second embodiment preferably includes retrieving a user preference for the interaction between the device 10 and the tactile interface layer Step S134 and the step of manipulating the volume of fluid to deform a particular region of the surface S140 of the second embodiment preferably includes manipulating the volume of fluid to deform a particular region on the surface based on the user preference for the interaction between the device 10 and the tactile interface layer Step S144. The step of retrieving a user preference S130 of the third embodiment preferably includes retrieving a user preference for the operation of the device Step S133. A user preference for the operation of the device may be a user preference for vibrating and/or producing a sound when a particular region 113 is deformed or when a particular application of the device is actuated. Alternatively, a user preference for the operation of the device may include a user preference for the loudness of the sound produced and/or the magnitude of the vibration produced. 
However, the user preference for the operation of the device may be any other suitable kind of preference for an application of the device.


2.1 User Preference of a First Embodiment


A user preference of the first embodiment may be one of several variations: (1) a preference for the geometry of the deformation (e.g., the size of the deformed particular region 113), (2) a preference for the tactile feel of the deformation (e.g., the firmness of the deformation), (3) a preference for the performance of the deformation (e.g., the deformation rate of the particular region 113 and/or the time that the particular region 113 is deformed), (4) a preference for the sensitivity of the sensor 140 (for example, sensitivity at the deformed particular region 113, sensitivity at the un-deformed particular region 113, or sensitivity for any other suitable state or location along the surface 115), or (5) a preference for the location of the particular region 113 relative to the tactile interface layer 100. In the variation of the fluid vessel 127 that includes a second cavity 125b that corresponds to a second particular region 113, a sixth variation may include a preference for which of the particular region 113 and/or second particular region 113 to deform. In the variation of the tactile interface layer that includes a display 150, a seventh variation may include a preference for a tactilely distinguishable formation independent of the operation of the display 150. However, any other suitable user preference for the operation of the tactile interface layer may be retrieved through the user interface in Step S132.


The volume of fluid may be manipulated in one of several variations to deform a particular region of the surface based on the user preference for the operation of the tactile layer S142.


A first variation of manipulating the volume of fluid to deform a particular region of the surface based on the user preference for the operation of the tactile interface layer S142 preferably includes adjusting the operation of the displacement device 130 and is preferably applied to the first, second, and/or third variation of a user preference of the first embodiment. In particular, adjusting the operation of the displacement device 130 is preferably used to adjust the geometry, tactile feel, and performance of the deformation of the particular region 113. As mentioned above, the cause of the deformation of the particular region 113 may be thought of as an increase in the pressure below the surface 115 relative to the pressure above the surface 115. The displacement device 130 functions to provide this increase in pressure by modifying the volume of fluid 112 within the cavity 125. For example, the level of increase in the volume of fluid 112 within the cavity 125 directly influences the level of increase of the pressure below the surface 115, and by changing the level of increase in pressure below the surface 115 relative to the pressure above the surface 115, characteristics such as the firmness and the height of the deformation of the particular region 113 may be adjusted. The rate of increase of the pressure below the surface 115 relative to the pressure above the surface 115 may also affect the rate at which the deformation of the particular region 113 occurs. Similarly, the length of time that the displacement device 130 provides the increased pressure is directly related to the length of time that a particular region is deformed. By providing adjustments through varying the operation parameters of the displacement device 130 in this first variation, the number of available adjustment settings is directly related to the number of available variations in the operation parameters of the displacement device 130. 
For example, in adjusting the firmness of the deformation of the particular region 113, the tactile interface layer 100 may provide a minimum firmness and a maximum firmness with a substantially large number of firmness level settings in between the minimum and maximum firmness, each correlating with a volume increase within the cavity 125 that a displacement device 130 of the first variation may induce or a volume of fluid 112 that a displacement device 130 may provide. This may provide the user with the ability to apply an adjustment setting that is substantially close to their personal preference. The number of available settings may be less than the number of available variations in the operation parameters of the displacement device 130 to decrease complexity. However, any other suitable number of adjustment settings may be provided to the user.
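The firmness example above, where discrete settings between a minimum and a maximum each correlate with a fluid-volume increase, can be sketched as a simple quantization. The function name and values below are hypothetical, shown only to make the setting-to-volume correlation concrete.

```python
def firmness_settings(v_min, v_max, n_levels):
    """Enumerate n_levels evenly spaced firmness settings, each
    correlated with a fluid-volume increase between v_min (minimum
    firmness) and v_max (maximum firmness)."""
    if n_levels < 2:
        raise ValueError("need at least a minimum and a maximum setting")
    step = (v_max - v_min) / (n_levels - 1)
    return [v_min + i * step for i in range(n_levels)]
```

As the text notes, the number of exposed settings may be smaller than the number of volume increments the displacement device can actually produce; `n_levels` models that user-facing granularity.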


In another example of the first variation, adjusting the operation of the displacement device 130 may be applied to the fifth variation of the user preference of the first embodiment where the user provides a preference for the location of the particular region 113 relative to the tactile interface layer 100 and/or the sixth variation where there is a second cavity 125b and the user provides a preference for which of the particular region 113 and/or second particular region 113 to deform. In particular, the displacement device 130 may function to selectively expand the cavity 125 and/or the second cavity 125b corresponding to a particular region 113 that is indicated in the user preference. The user may select one particular region from a first and a second particular region that they desire to be expanded to provide tactile guidance in a certain user scenario. Alternatively, there may be a plurality of cavities 125 and second cavities 125b that are arranged into a first group and a second group. In an example of a user selection for a particular usage scenario, the first group may include a first spacing in between each particular region 113 of the first group and the second group may include a second spacing in between each particular region 113 of the second group, as shown in FIGS. 6a and 6b. A user may prefer the second spacing (for example, a larger spacing) and select to expand the second group during use. The displacement device 130 then functions to expand the second group for the particular usage scenario. Any other variations to the operation parameters of the displacement device 130 may be used to adjust the characteristics of the first embodiment of the tactile interface layer 100.
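The group-selection example above (FIGS. 6a and 6b) amounts to choosing which set of cavities to expand based on a spacing preference. The sketch below is hypothetical: the group names, spacing values, and cavity identifiers are invented for illustration and do not appear in the disclosure.

```python
# Hypothetical grouping of cavities by key spacing, per FIGS. 6a/6b.
# Spacing values (mm) and cavity ids are illustrative assumptions.
CAVITY_GROUPS = {
    "first": (4, ["c1", "c2", "c3"]),    # tighter spacing
    "second": (8, ["c4", "c5", "c6"]),   # larger spacing
}

def cavities_to_expand(preferred_spacing_mm):
    """Return the cavity ids of the group whose spacing is closest to
    the user's preferred spacing; the displacement device would then
    expand only those cavities for the usage scenario."""
    spacing, cavities = min(CAVITY_GROUPS.values(),
                            key=lambda g: abs(g[0] - preferred_spacing_mm))
    return cavities
```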


A second variation of manipulating the volume of fluid to deform a particular region of the surface based on the user preference for the operation of the tactile interface layer S142 preferably includes adjusting the deformation of the particular region 113 to set a user preference of the fourth variation for the sensitivity of the sensor 140. For example, the sensor 140 may be a capacitive sensor that detects the presence of the finger of the user at a distance away from the surface 115. To decrease the sensitivity of the sensor 140, the height of the deformation of the particular region 113 may be increased such that, when the finger of the user is resting on the top of the deformed particular region 113, a user input is not registered. In other words, the equivalent sensitivity of the sensor may be decreased while the actual sensitivity of the sensor remains the same. Alternatively, the sensitivity of the sensor 140 may be adjusted by adjusting the operation of the sensor 140. In one example, the thresholds for the sensor 140 to register a user input may be adjusted. In the variation wherein the sensor 140 is a touch sensitive display, a touch at any location along the display may register as a user input regardless of the presence of a particular region 113, preventing the user from resting their finger on a deformed particular region 113 as a user would normally be able to do on a static tactile interface such as those found on a remote control with mechanical buttons or a Blackberry mobile phone. 
In this variation, the user may input a user preference for a lower sensitivity for the sensor 140 wherein a user input is registered only if the finger is within a certain distance of the touch sensitive display, preferably a distance less than the distance between the most distant point of the deformation of the particular region 113 and the surface 115, allowing the user to rest a finger on the deformation while the sensor 140 registers a user input only when the deformation is inwardly deformed by force applied by the user. In the variation wherein the sensor 140 is a capacitive or a pressure sensor, the sensitivity of the sensor 140 may be adjusted such that a user input is registered only with a certain degree of change in the capacitive or pressure reading. However, any other suitable adjustment to the sensitivity of the sensor 140 may be provided to the user.
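The low-sensitivity preference described above reduces to a threshold test: a touch registers only when the finger is nearer to the display than the top of the deformation. The sketch below is an assumed model of that logic; the function names and the margin value are hypothetical.

```python
def registers_input(finger_distance_mm, threshold_mm):
    """A touch registers only when the finger is nearer to the display
    than the threshold. Choosing a threshold below the deformation
    height lets the user rest a finger on the raised region without
    triggering an input."""
    return finger_distance_mm < threshold_mm


def resting_safe_threshold(deformation_height_mm, margin_mm=0.1):
    """Pick a threshold just under the deformation height, per the
    low-sensitivity preference (the margin is an assumed value)."""
    return deformation_height_mm - margin_mm
```

With a 1.0 mm deformation, a finger resting on its top (1.0 mm from the display) does not register, while pressing the formation halfway in (0.5 mm) does.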


In another example of adjusting the operation of the sensor 140, readings from the sensor 140 may be ignored and/or the sensor 140 may be disabled. In the variation wherein the sensor 140 is a touch sensitive display, certain portions of the touch sensitive display may be disabled and/or readings from certain portions of the touch sensitive display may be ignored. For example, for certain usage scenarios, the particular region 113 that is deformed may be on a first portion of the touch sensitive display. The user may input a user preference to disable the remaining portions of the touch sensitive display to prevent undesired touch inputs, but may alternatively allow the remaining portions of the touch sensitive display to continue to receive touch inputs, allowing the user to select options that are displayed in a location wherein the particular region 113 is not deformed. However, any other suitable combination of ignored readings, disabled sensing, and/or enabled sensing may be used.
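Ignoring readings from the disabled portions of a touch sensitive display can be sketched as filtering touch points against the enabled regions. The coordinate scheme and names below are illustrative assumptions, not part of the disclosure.

```python
def filter_touches(touches, enabled_regions):
    """Discard touch readings that fall outside every enabled portion
    of the touch sensitive display.

    touches: list of (x, y) points; enabled_regions: list of
    (x0, y0, x1, y1) rectangles. Units are arbitrary/illustrative.
    """
    def inside(touch, region):
        x, y = touch
        x0, y0, x1, y1 = region
        return x0 <= x <= x1 and y0 <= y <= y1

    return [t for t in touches
            if any(inside(t, r) for r in enabled_regions)]
```

The alternative described in the text, leaving the remaining portions enabled, corresponds simply to passing the whole display area as an enabled region.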


A third variation of manipulating the volume of fluid to deform a particular region of the surface based on the user preference for the operation of the tactile interface layer S142 preferably includes manipulating the volume of fluid to deform a particular region of the surface independently of the state of the display 150 and is preferably applied to the seventh variation of a user preference of the first embodiment to set a user preference for a tactilely distinguishable formation independent of the operation of the display 150. For example, the user preference may include disabling the display 150 while enabling the sensor 140. Subsequently, the volume of fluid is manipulated to expand a particular region of the surface. Because the tactile interface layer 100 provides tactile guidance, the visual guidance provided by the display 150 is not necessary in certain scenarios to guide the user in the use of the device 10. Disabling the display 150 allows the device 10 to conserve energy, potentially extending the use time per charge of the device 10 if the device 10 is a portable device such as a camera or a cellular phone.


The user preferences for the operation of the tactile interface layer 100 retrieved in Step S132 are preferably one of the variations as described above but may alternatively be any other suitable combination of or any other kind of user preference for the operation of the tactile interface layer 100. The volume of fluid is preferably manipulated in Step S142 using a system or method described above, but may alternatively be a combination of the systems and/or methods described above or any other suitable system or method.


2.2 User Preference of a Second Embodiment


A user preference for the interaction between the device and the tactile interface layer retrieved in Step S134 may also be one of several variations. In a first variation, the user preference of the second embodiment may be a preference for the location of the particular region 113 relative to the device 10. For example, the user may indicate the location of the particular region 113 relative to the device 10 that best fits the size of his or her hand. In a second variation, the tactile interface layer 100 may include a second cavity 125b that corresponds to a second particular region 113, and the user preference of the second embodiment may be a preference for the location of a particular region 113 relative to another particular region 113. For example, the displacement device 130 may manipulate fluid to deform a plurality of particular regions 113 into tactilely distinguishable formations that cooperatively represent a keyboard layout and the user preference may be a preference for the relative location between the keys of the keyboard, as shown in FIGS. 6a and 6b. By allowing the user to provide a preference for the relative location between the keys of the keyboard, the tactile interface layer 100 is substantially customized to each individual user, which may increase the usability of the keyboard and may potentially decrease the risk of repetitive stress syndrome.


A third variation of a user preference of the second embodiment may include a preference for the timing for the actuation of a deformation. As an example, the user preference may include the preference for actuation of a deformation when a particular application of the device is actuated. The tactile interface layer 100 may define a plurality of particular regions 113 that cooperatively represent a numeric keypad and the device 10 may include a phone application, and the user preference may be to actuate the deformation of the plurality of particular regions 113 when the phone application is actuated. In another example, the displacement device 130 may manipulate fluid to deform a plurality of particular regions 113 into tactilely distinguishable formations that cooperatively represent a QWERTY keyboard and the device 10 may include a typing application, and the user preference may be to actuate the expansion of the QWERTY keyboard when the user initiates a typing application. In yet another example, the displacement device 130 may manipulate fluid to deform a plurality of particular regions 113 into tactilely distinguishable formations and the user preference may include a preference for the actuation of the deformation of a particular tactilely distinguishable formation at a particular timing. The plurality of tactilely distinguishable formations cooperatively represent a keyboard and the user preference preferably includes a preference for a tactilely distinguishable region representing a particular key.
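The application-triggered actuation preference amounts to a mapping from application events to formations. A minimal sketch, assuming hypothetical names (`USER_PREFERENCES`, `on_application_actuated`, and a callback standing in for the displacement device driver):

```python
# Hypothetical preference store: which formation to raise when a
# given application is actuated.
USER_PREFERENCES = {
    "phone": "numeric_keypad",
    "typing": "qwerty_keyboard",
}

def on_application_actuated(app_name, actuate_layout,
                            preferences=USER_PREFERENCES):
    """Raise the user's preferred formation, if any, for an application.

    actuate_layout is a callback standing in for the displacement
    device control; it receives the name of the layout to expand.
    Returns the layout name, or None if no preference is set.
    """
    layout = preferences.get(app_name)
    if layout is not None:
        actuate_layout(layout)
    return layout
```

Applications with no stored preference leave the surface unchanged, matching the behavior of actuating a deformation only for the applications the user selected.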


The user preference for interaction between the device 10 and the tactile interface layer 100 retrieved in Step S134 is preferably one of the variations as described above but may alternatively be any other suitable combination of or any other kind of user preference for the operation of the device 10 and/or interaction between the device 10 and the tactile interface layer 100.


The volume of fluid is preferably manipulated in Step S144 using a system or method described above for the step of manipulating the volume of fluid to deform a particular region of the surface Step S142, but may alternatively be a combination of the systems and/or methods described above or any other suitable system or method. The manipulation of the fluid is preferably actuated by a processing unit of the device 10, for example, actuating the expansion of the desired cavity 125 during certain usage scenarios such as incoming phone calls on a phone. However, any other suitable interaction between the device 10 and the tactile interface layer 100 may be used.


3. Providing a User Interface


As described above, the user interface provided in Step S110 to retrieve a user input may be provided on the tactile interface layer 100, which may allow the user to have a direct tactile comparison between different available settings for the tactile interface layer 100; on both the device 10 and the tactile interface layer 100, which may allow the device 10 and the tactile interface layer 100 to cooperatively provide a user interface for the user; on the device 10; or in any other suitable arrangement. The device 10 and/or the tactile interface layer 100 preferably enters a “customization mode” wherein the user is prompted to provide inputs for user preferences that preferably do not register as any other kind of input. The user interface may be tactile, visual, audible, or in any other suitable kind of media.


In a first variation of the user interface, the interface is provided on the tactile interface layer 100. In a first example of the user interface of the first variation, the user interface may provide a plurality of expanded cavities 125 and/or 125b that result in a plurality of deformed particular regions 113 on the tactile interface layer 100, wherein each of the plurality of deformed particular regions 113 is of a different characteristic such as a different degree of firmness and/or a different shape. The user then selects the particular region 113 that best fits their preferences and the selection is detected by the sensor 140 and sent to a processing unit in the tactile interface layer 100 and/or a processing unit in the device 10.


In a second example of the first variation, the user interface may provide a deformed particular region 113 in the form of a slider on the tactile interface layer 100. The slider may include a plurality of regions, each region representing a different degree of a characteristic such as firmness, size, and/or distance between deformations. The user may slide their finger along the slider to experience the various degrees of the characteristic and select the desired degree. The selection may be inputted by providing force at the location along the slider of the degree they select, but may alternatively be a selection inputted adjacent to the slider or any other suitable location or kind of input.
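The slider of this example discretizes a contact position into one of several degrees of a characteristic. A sketch under assumed names (`slider_degree`; a one-dimensional slider of known length):

```python
def slider_degree(position, length, min_degree, max_degree, steps):
    """Map a finger position along a slider to a discrete degree.

    position: contact location along the slider, clamped to 0..length.
    Returns one of `steps` evenly spaced degrees between min_degree
    and max_degree, e.g. firmness, size, or distance between
    deformations.
    """
    position = max(0.0, min(float(position), float(length)))
    index = min(int(position / length * steps), steps - 1)
    return min_degree + (max_degree - min_degree) * index / (steps - 1)
```

Each of the slider's regions thus corresponds to one index, and pressing within a region selects the degree that region represents.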


In a third example of the first variation, the user interface may provide a deformed particular region 113 in the form of a slider and another particular region 113 in the form of an “example region” on the tactile interface layer 100. The user may adjust the position of the slider to adjust the option for adjustment demonstrated by the “example region.” The user may tactilely feel the example region as they adjust the slider and then select their desired adjustment. The slider is preferably of a uniform characteristic to decrease the tactile variations felt by the user and to potentially decrease confusion, but may alternatively emulate the adjustment demonstrated by the example region to allow the user to tactilely feel the adjusted characteristic on more than one location or shape of deformed particular region.


In a fourth example of the first variation, the user interface may provide a deformed particular region 113 that transitions between different degrees of a characteristic such as firmness or size, and the user selects the desired degree. The transitions are preferably cyclic and repeat the range of degrees for the user to experience as many times as necessary before making a selection. The user may input the selection as the deformed particular region 113 is demonstrating the various available options, but may alternatively input the selection after the deformed particular region 113 has demonstrated the available options. The rate of demonstration by the deformed particular region 113 is preferably at a slow rate to allow the user to adequately examine the option for their decision, but may alternatively be an adjustable rate or any other suitable rate.
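The cyclic demonstration can be sketched as iterating repeatedly over the available degrees at a configurable dwell time. All names here (`demonstrate_degrees`, the `set_firmness` callback) are hypothetical stand-ins for the displacement-device control:

```python
import itertools
import time

def demonstrate_degrees(degrees, set_firmness, cycles=2, dwell_s=0.0):
    """Cycle the deformed region through each available degree.

    degrees: the ordered options to demonstrate (e.g. firmness levels).
    set_firmness: callback standing in for actuating the deformation.
    cycles: how many full passes over the range to repeat.
    dwell_s: pause per option, setting the (slow) demonstration rate.
    Returns the sequence of degrees actually demonstrated.
    """
    shown = []
    for degree in itertools.islice(itertools.cycle(degrees),
                                   cycles * len(degrees)):
        set_firmness(degree)
        shown.append(degree)
        if dwell_s:
            time.sleep(dwell_s)
    return shown
```

In a real system the loop would run until the user's selection interrupts it; a fixed cycle count is used here only to keep the sketch finite.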


In a fifth example of the first variation, the user interface may provide a plurality of cavities 125 that may correspond to, for example, a keyboard layout. A plurality of cavities 125 is expanded and a plurality of deformed particular regions of the surface 113 is presented to the user. The user may then select the set of deformed particular regions of the surface 113 that best fit their hand shape for a particular application as described in the second variation of a user preference of the second embodiment retrieved in Step S134 and as shown in FIGS. 4a and 4b. In the example of a keyboard layout, the user may select the set of deformed particular regions that best match their hand size and shape, allowing for a more personalized keyboard layout for each individual user, potentially decreasing the risk of repetitive stress disorder that may result from arranging the hand of the user in an uncomfortable and stressful position. In the example of the keyboard layout, the user may be presented with a plurality of options for the location of the deformed particular region that corresponds to each keyboard key. The options for the location of each key may be presented concurrently with the options for every other key in the keyboard, but may alternatively be presented to the user one after the other. However, any other suitable method to allow the user to select their desired location of each key may be used. Once the location of each key is determined, the user may then be prompted to select the desired height and/or firmness of each key, allowing the user interface system to accommodate to the natural angle of the user's hands, further decreasing the potential of repetitive stress syndrome.


In a second variation of the user interface, the user interface is provided on the device 10. This variation is particularly applicable in retrieving a user preference for the interaction of the device and the tactile interface layer S134. The user interface as provided on the device 10 is preferably applied to a device 10 that includes a display 150 that provides an image to communicate to the user, but may alternatively be applied to any other suitable kind of device, for example, a device that includes a speaker to communicate with the user, or a device that provides a vibration to communicate with the user. In a first example of the second variation of the user interface, as shown in FIG. 7, the user interface may provide a series of check boxes where the user may choose options for the actuation of the deformation of the particular region 113 (such as to retrieve a user preference for the actuation of a deformation in the third variation of the user preference of the second embodiment). As shown in FIG. 7, the user may select to actuate the deformation of the particular region 113 when the “place call,” “receive call,” “email,” etc., application of the device 10 is actuated. Additionally, the user may provide a preference for the arrangement of the particular region 113 that is to be deformed, for example, a QWERTY keyboard or a numeric keypad.


In a second example of the second variation, as shown in FIG. 8, the user interface may provide an interface on the device 10 that allows the user to provide a preference for the operation of the tactile interface layer 100. In other words, a user interface to retrieve a user preference for the operation of the tactile layer 100 (the first embodiment of user preference) may be provided on the device 10. This example of the second variation of the user interface may function similarly to the second and third example of the user interface of the first variation that provide a slider on the tactile interface layer 100.


In a third example of the second variation, as shown in FIG. 9, the user interface may provide an interface on the device 10 that allows the user to provide a preference for the operation of the device, for example, vibrating and/or producing a sound when a particular region 113 is deformed or when a particular application of the device is actuated. This is particularly applicable to retrieving a user preference for the operation of the device in Step S133.


In a fourth example of the second variation, the user interface may allow the user to select the desired location for a particular region. For example, in the variation where the device 10 includes an application which uses a keyboard, the user interface may prompt the user to select the desired location for each key in a keyboard instead of providing options to the user for the location of each key in the keyboard. The user may alternatively be asked to place the fingers of their hand in the most natural position onto the tactile interface layer 100. The location of each finger is detected and the cavity 125 and particular region of the surface 113 that is substantially adjacent to the location of the finger is then selected as the location of the keyboard key.
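The finger-placement approach reduces to assigning each detected finger the nearest unused cavity. A sketch with hypothetical names (`assign_keys_to_fingers`; planar coordinates for fingers and cavities are an assumption):

```python
def assign_keys_to_fingers(finger_positions, cavity_positions):
    """For each detected finger, pick the nearest available cavity.

    Both arguments are lists of (x, y) coordinates. Returns a list of
    cavity indices, one per finger, in finger order; each cavity is
    used at most once, greedily by squared distance.
    """
    available = set(range(len(cavity_positions)))
    assignment = []
    for fx, fy in finger_positions:
        best = min(available,
                   key=lambda i: (cavity_positions[i][0] - fx) ** 2
                               + (cavity_positions[i][1] - fy) ** 2)
        available.remove(best)
        assignment.append(best)
    return assignment
```

The cavity substantially adjacent each finger then becomes the location of the corresponding keyboard key; a production system might solve the assignment globally rather than greedily, but the greedy form suffices to illustrate the step.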


In a third variation of the user interface, the user interface may be provided on a device that is external to both the device 10 and the tactile interface layer 100. For example, the user interface may be provided as an application on the Internet, on a personal computer, or any other suitable medium.


The user interface of the preferred embodiments is preferably one of the variations described above, but may alternatively be a combination of the variations described above. For example, the user interface may provide a slider on the device 10 that functions to control the characteristic of an “example region” on the tactile interface layer 100, allowing the device 10 and the tactile interface layer 100 to cooperatively provide a user interface to the user. The device may also provide a visual indicator (for example, a numerical level setting) that indicates the level of a particular setting. This may facilitate in communicating setting options to the user. However, any other suitable user interface may be used.


As shown in FIGS. 1 and 2, a processing unit retrieves a user preference that is provided by the user on the user interface S130 and sets the user preferences to the operating conditions S140. The processing unit may actuate the manipulation of the volume of fluid based on the user preferences for the operation of the tactile interface layer S132, the operation of the device S133, and/or the interaction between the device and the tactile interface layer S134. In a first variation, the processing unit may be included in the tactile interface layer 100 and may also function to control the displacement device 130, sensor 140, and/or the display 150. The processing unit may communicate directly with the components of the tactile interface layer 100 (e.g., the displacement device 130), but may alternatively communicate with the components of the tactile interface layer 100 in any other suitable manner. The processing unit of this first variation may function to communicate with a processing unit of the device 10 to receive signals representing user selections.
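The retrieve-then-apply flow of Steps S130 and S140 can be sketched as a small processor object. The interface names (`expand`, `height_volume_ml`) and the default volume are illustrative assumptions, not values from the specification:

```python
class FakeDisplacementDevice:
    """Records expand calls in place of the real displacement device 130."""

    def __init__(self):
        self.calls = []

    def expand(self, region, volume_ml):
        self.calls.append((region, volume_ml))

class PreferenceProcessor:
    """Sketch of Steps S130/S140: retrieve a preference, then apply it."""

    def __init__(self, displacement_device):
        self.displacement_device = displacement_device
        self.preferences = {}

    def retrieve_preference(self, key, value):
        # Step S130: store a preference provided on the user interface.
        self.preferences[key] = value

    def apply_operating_conditions(self, region):
        # Step S140: actuate the displacement device per the stored
        # preference (default volume here is purely illustrative).
        volume = self.preferences.get("height_volume_ml", 0.05)
        self.displacement_device.expand(region, volume)
        return volume
```

Whether this logic lives in the tactile interface layer 100, the device 10, or an external computer is exactly the choice the three variations below describe; the object boundary is the same in each case.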


In a second variation, the processing unit may be included into the device 10 and may also function to control the applications of the device 10. The processing unit of this second variation may communicate directly with the components of the tactile interface layer 100 (e.g. the displacement device 130), but may alternatively communicate to the components of the tactile interface layer 100 in any other suitable manner. The processing unit of this second variation may communicate with the components of the tactile interface layer 100 through a wired communication protocol, a wireless communication protocol, or any other suitable kind of communication protocol.


In a third variation, the processing unit may be external to both the tactile interface layer 100 and the device 10, for example, a personal computer that is communicably coupled to the tactile interface layer 100 and/or the device 10. In this variation, when the user desires to provide and/or apply user preferences to operating conditions, the device and/or the tactile layer 100 may be connected to a personal computer that may include an interface that allows the user to provide a user preference.


The processing unit of the preferred embodiments is preferably one of the variations as described above, but may alternatively be any combination of the above variations. For example, the tactile interface layer 100 may include a processing unit that functions to control the tactile interface layer 100 and the device 10 may include a processing unit that functions to control the device 10. The processing units of the tactile interface layer 100 and the device 10 may function to communicate with each other to provide control for an operating condition. In this variation, the processing unit of the tactile interface layer 100 may communicate with the processing unit of the device 10 through a wired communication protocol, a wireless communication protocol, or any other suitable kind of communication protocol. However, any other suitable arrangement of the processing unit may be used.


As a person skilled in the art will recognize from the previous detailed description and from the figures and claims, modifications and changes can be made to the preferred embodiments of the invention without departing from the scope of this invention defined in the following claims.

Claims
  • 1. A method for adjusting a user interface of a computing device, comprising: displaying a range of parameter levels between a minimum parameter level and a maximum parameter level for a button rise time; receiving a parameter level selection from the range of parameter levels; receiving a retract time selection from a range of retract times between a minimum retract time and a maximum retract time; rendering an input graphic on a display of the computing device; correlating the parameter level selection with a rate of fluid displacement for the button rise time; displacing a volume of fluid into a cavity arranged over the input graphic on the display at the rate of fluid displacement to expand a deformable region corresponding to the input graphic from a retracted setting into an expanded setting, the deformable region flush with an adjacent peripheral region in the retracted setting and offset above the peripheral region in the expanded setting; displacing fluid, at a second flow rate, out of the cavity to retract the deformable region from the expanded setting into the retracted setting, wherein displacing fluid out of the cavity comprises displacing fluid out of the cavity at the second flow rate corresponding to the retract time selection; and sensing an input on the deformable region.
  • 2. The method of claim 1, further comprising receiving a button firmness level selection from a range of button firmness levels between a minimum button firmness level and a maximum button firmness level, and wherein displacing the volume of fluid into the cavity comprises displacing the volume of fluid corresponding to the button firmness level selection into the cavity.
  • 3. The method of claim 2, wherein displacing the volume of fluid into the cavity comprises displacing fluid into the cavity until a fluid pressure within the cavity reaches a target fluid pressure corresponding to the button firmness preference.
  • 4. The method of claim 1, further comprising receiving a button height selection from a range of button heights between a minimum button height and a maximum button height, and displacing the volume of fluid into the cavity to expand the deformable region according to the button height selection.
  • 5. The method of claim 4, wherein displacing the volume of fluid into the cavity comprises correlating the button height preference with a target fluid volume between 0.003 milliliters and 0.1 milliliters and displacing the target fluid volume into the cavity.
  • 6. The method of claim 1, further comprising receiving a button size selection from a range of button sizes between a minimum button size and a maximum button size; wherein displacing the volume of fluid into the cavity comprises displacing the volume of fluid corresponding to the button size selection into the cavity.
  • 7. The method of claim 1, further comprising displaying a second input graphic on the display, the second input graphic different from and adjacent the input graphic, and further comprising displacing a second volume of fluid into a second cavity arranged over the second input graphic on the display at a rate corresponding to the parameter level selection for the button rise time to expand a second deformable region corresponding to the second input graphic from a retracted setting into an expanded setting, the second deformable region flush with an adjacent peripheral region in the retracted setting and offset above the peripheral region in the expanded setting.
  • 8. The method of claim 7, wherein displacing the second volume of fluid into the second cavity comprises displacing the volume of fluid into the cavity and the second volume of fluid into the second cavity substantially simultaneously, the volume of fluid and the second volume of fluid substantially equivalent.
  • 9. The method of claim 7, wherein displaying the input graphic on the display and displaying the second input graphic on the display comprise displaying a virtual alphabetical keyboard on the display, the input graphic and the deformable region corresponding to a first alphabetical character and the second input graphic and the second deformable region corresponding to a second alphabetical character.
  • 10. The method of claim 1, wherein sensing the input on the deformable region comprises detecting a degree of change in capacitance of a capacitive touch sensor proximal the deformable region and correlating the degree of change in capacitance with an input on the deformable region.
  • 11. The method of claim 1, wherein displacing the volume of fluid into the cavity comprises actuating a pump fluidly coupled to the cavity for a duration of time corresponding to the parameter level selection for the button rise time.
  • 12. A method for adjusting a user interface of a computing device comprising: displaying a range of parameter levels between a minimum parameter level and a maximum parameter level for a button rise time; receiving a parameter level selection from the range of parameter levels; receiving a retract time selection from a range of retract times between a minimum retract time and a maximum retract time; displaying an input graphic on a display of the computing device; displacing fluid, at a flow rate corresponding to the parameter level selection for the button rise time, into a cavity arranged over the input graphic on the display to expand a deformable region corresponding to the input graphic from a retracted setting into an expanded setting, the deformable region flush with an adjacent peripheral region in the retracted setting and offset above the peripheral region in the expanded setting; displacing fluid, at a second flow rate, out of the cavity to retract the deformable region from the expanded setting into the retracted setting, wherein displacing fluid out of the cavity comprises displacing fluid out of the cavity at the second flow rate corresponding to the retract time selection; and sensing an input on the deformable region.
  • 13. The method of claim 12, further comprising receiving a button size selection from a range of button sizes between a minimum button size and a maximum button size; wherein displacing the volume of fluid into the cavity comprises displacing the volume of fluid corresponding to the button size selection into the cavity.
  • 14. A method for manipulating a tactile interface layer comprising a tactile layer, a substrate, and a displacement device, the tactile layer defining a deformable region and a peripheral region, the substrate defining a fluid channel and a cavity adjacent the deformable region and fluidly coupled to the fluid channel, the displacement device manipulating fluid through the fluid channel into the cavity, the method comprising: displaying a range of parameter levels between a minimum parameter level and a maximum parameter level for a button height; receiving a parameter level selection for a button height from the range of parameter levels; receiving a retract time selection from a range of retract times between a minimum retract time and a maximum retract time; rendering an input graphic on a display of the computing device substantially aligned with the deformable region; correlating the parameter level selection with a particular displacement volume of fluid; defining a retracted setting of the deformable region substantially planar with the peripheral region and a maximum expanded setting offset from the peripheral region and tactilely distinguishable from the peripheral region, the maximum expanded setting substantially corresponding to the maximum parameter level; displacing the particular displacement volume of fluid into the cavity to expand the deformable region corresponding to the input graphic from the retracted setting into a particular expanded setting corresponding to the parameter level selection; and displacing fluid, at a second flow rate, out of the cavity to retract the deformable region from the expanded setting into the retracted setting, wherein displacing fluid out of the cavity comprises displacing fluid out of the cavity at the second flow rate corresponding to the retract time selection.
  • 15. The method of claim 14, wherein displacing the particular displacement volume of fluid into the cavity comprises displacing fluid into the cavity until a fluid pressure within the cavity reaches a target fluid pressure corresponding to the parameter level selection.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of U.S. application Ser. No. 13/465,737, filed 7 May 2012, which is a continuation of U.S. application Ser. No. 12/830,426, filed on 5 Jul. 2010 now issued as U.S. Pat. No. 8,243,038, which claims priority to U.S. Provisional Application No. 61/223,003 filed 3 Jul. 2009 and U.S. Provisional Application No. 61/303,214, filed 10 Feb. 2010, all of which are incorporated in their entirety by this reference. This application is related to U.S. application Ser. No. 11/969,848, filed on 4 Jan. 2008, and U.S. application Ser. No. 12/319,334, filed on 5 Jan. 2009, which are both incorporated in their entirety by this reference.

7834853 Finney et al. Nov 2010 B2
7843424 Rosenberg et al. Nov 2010 B2
7864164 Cunningham et al. Jan 2011 B2
7869589 Tuovinen Jan 2011 B2
7890257 Fyke et al. Feb 2011 B2
7890863 Grant et al. Feb 2011 B2
7920131 Westerman Apr 2011 B2
7924145 Yuk et al. Apr 2011 B2
7944435 Rosenberg et al. May 2011 B2
7952498 Higa May 2011 B2
7956770 Klinghult et al. Jun 2011 B2
7973773 Pryor Jul 2011 B2
7978181 Westerman Jul 2011 B2
7978183 Rosenberg et al. Jul 2011 B2
7978186 Vassallo et al. Jul 2011 B2
7979797 Schena Jul 2011 B2
7982720 Rosenberg et al. Jul 2011 B2
7986303 Braun et al. Jul 2011 B2
7986306 Eich et al. Jul 2011 B2
7989181 Blattner et al. Aug 2011 B2
7999660 Cybart et al. Aug 2011 B2
8002089 Jasso et al. Aug 2011 B2
8004492 Kramer et al. Aug 2011 B2
8013843 Pryor Sep 2011 B2
8020095 Braun et al. Sep 2011 B2
8022933 Hardacker et al. Sep 2011 B2
8031181 Rosenberg et al. Oct 2011 B2
8044826 Yoo Oct 2011 B2
8047849 Ahn et al. Nov 2011 B2
8049734 Rosenberg et al. Nov 2011 B2
8059104 Shahoian et al. Nov 2011 B2
8059105 Rosenberg et al. Nov 2011 B2
8063892 Shahoian et al. Nov 2011 B2
8063893 Rosenberg et al. Nov 2011 B2
8068605 Holmberg Nov 2011 B2
8077154 Emig et al. Dec 2011 B2
8077440 Krabbenborg et al. Dec 2011 B2
8077941 Assmann Dec 2011 B2
8094121 Obermeyer et al. Jan 2012 B2
8094806 Levy Jan 2012 B2
8103472 Braun et al. Jan 2012 B2
8106787 Nurmi Jan 2012 B2
8115745 Gray Feb 2012 B2
8123660 Kruse et al. Feb 2012 B2
8125347 Fahn Feb 2012 B2
8125461 Weber et al. Feb 2012 B2
8130202 Levine et al. Mar 2012 B2
8144129 Hotelling et al. Mar 2012 B2
8144271 Han Mar 2012 B2
8154512 Olien et al. Apr 2012 B2
8154527 Ciesla Apr 2012 B2
8159461 Martin et al. Apr 2012 B2
8162009 Chaffee Apr 2012 B2
8164573 Dacosta et al. Apr 2012 B2
8166649 Moore May 2012 B2
8169306 Schmidt et al. May 2012 B2
8169402 Shahoian et al. May 2012 B2
8174372 Da Costa May 2012 B2
8174495 Takashima et al. May 2012 B2
8174508 Sinclair et al. May 2012 B2
8174511 Takenaka et al. May 2012 B2
8178808 Strittmatter May 2012 B2
8179375 Ciesla et al. May 2012 B2
8179377 Ciesla et al. May 2012 B2
8188989 Levin et al. May 2012 B2
8195243 Kim et al. Jun 2012 B2
8199107 Xu et al. Jun 2012 B2
8199124 Ciesla et al. Jun 2012 B2
8203094 Mittleman et al. Jun 2012 B2
8203537 Tanabe et al. Jun 2012 B2
8207950 Ciesla et al. Jun 2012 B2
8212772 Shahoian Jul 2012 B2
8217903 Ma et al. Jul 2012 B2
8217904 Kim Jul 2012 B2
8223278 Kim et al. Jul 2012 B2
8224392 Kim et al. Jul 2012 B2
8228305 Pryor Jul 2012 B2
8232976 Yun et al. Jul 2012 B2
8243038 Ciesla Aug 2012 B2
8253052 Chen Aug 2012 B2
8253703 Eldering Aug 2012 B2
8279172 Braun et al. Oct 2012 B2
8279193 Birnbaum et al. Oct 2012 B1
8310458 Faubert et al. Nov 2012 B2
8345013 Heubel et al. Jan 2013 B2
8350820 Deslippe et al. Jan 2013 B2
8362882 Heubel et al. Jan 2013 B2
8363008 Ryu et al. Jan 2013 B2
8367957 Strittmatter Feb 2013 B2
8368641 Tremblay et al. Feb 2013 B2
8378797 Pance et al. Feb 2013 B2
8384680 Paleczny et al. Feb 2013 B2
8390594 Modarres et al. Mar 2013 B2
8395587 Cauwels et al. Mar 2013 B2
8395591 Kruglick Mar 2013 B2
8400402 Son Mar 2013 B2
8400410 Taylor et al. Mar 2013 B2
8547339 Ciesla Oct 2013 B2
8587541 Ciesla Nov 2013 B2
8587548 Ciesla Nov 2013 B2
8749489 Ito et al. Jun 2014 B2
20010008396 Komata Jul 2001 A1
20010043189 Brisebois et al. Nov 2001 A1
20020063694 Keely, Jr. May 2002 A1
20020104691 Kent et al. Aug 2002 A1
20020106614 Prince Aug 2002 A1
20020110237 Krishnan Aug 2002 A1
20020149570 Knowles et al. Oct 2002 A1
20020180620 Gettemy et al. Dec 2002 A1
20030087698 Nishiumi et al. May 2003 A1
20030117371 Roberts et al. Jun 2003 A1
20030179190 Franzen Sep 2003 A1
20030206153 Murphy Nov 2003 A1
20030223799 Pihlaja Dec 2003 A1
20040001589 Mueller et al. Jan 2004 A1
20040056876 Nakajima Mar 2004 A1
20040056877 Nakajima Mar 2004 A1
20040106360 Farmer et al. Jun 2004 A1
20040114324 Kusaka et al. Jun 2004 A1
20040164968 Miyamoto Aug 2004 A1
20040178006 Cok Sep 2004 A1
20050007339 Sato Jan 2005 A1
20050007349 Vakil et al. Jan 2005 A1
20050020325 Enger et al. Jan 2005 A1
20050030292 Diederiks Feb 2005 A1
20050057528 Kleen Mar 2005 A1
20050073506 Durso Apr 2005 A1
20050088417 Mulligan Apr 2005 A1
20050110768 Marriott et al. May 2005 A1
20050162408 Martchovsky Jul 2005 A1
20050212773 Asbill Sep 2005 A1
20050231489 Ladouceur et al. Oct 2005 A1
20050253816 Himberg et al. Nov 2005 A1
20050270444 Miller et al. Dec 2005 A1
20050285846 Funaki Dec 2005 A1
20060026521 Hotelling et al. Feb 2006 A1
20060087479 Sakurai et al. Apr 2006 A1
20060097991 Hotelling et al. May 2006 A1
20060098148 Kobayashi et al. May 2006 A1
20060118610 Pihlaja et al. Jun 2006 A1
20060119586 Grant et al. Jun 2006 A1
20060152474 Saito et al. Jul 2006 A1
20060154216 Hafez et al. Jul 2006 A1
20060197753 Hotelling Sep 2006 A1
20060214923 Chiu et al. Sep 2006 A1
20060238495 Davis Oct 2006 A1
20060238510 Panotopoulos Oct 2006 A1
20060256075 Anastas et al. Nov 2006 A1
20060278444 Binstead Dec 2006 A1
20070013662 Fauth Jan 2007 A1
20070036492 Lee Feb 2007 A1
20070085837 Ricks et al. Apr 2007 A1
20070108032 Matsumoto et al. May 2007 A1
20070122314 Strand et al. May 2007 A1
20070130212 Peurach et al. Jun 2007 A1
20070152983 Mckillop et al. Jul 2007 A1
20070165004 Seelhammer et al. Jul 2007 A1
20070171210 Chaudhri et al. Jul 2007 A1
20070182718 Schoener et al. Aug 2007 A1
20070229233 Dort Oct 2007 A1
20070229464 Hotelling Oct 2007 A1
20070236466 Hotelling Oct 2007 A1
20070236469 Woolley et al. Oct 2007 A1
20070247429 Westerman Oct 2007 A1
20070254411 Uhland et al. Nov 2007 A1
20070257634 Leschin et al. Nov 2007 A1
20070273561 Philipp Nov 2007 A1
20070296702 Strawn et al. Dec 2007 A1
20070296709 Guanghai Dec 2007 A1
20080010593 Uusitalo et al. Jan 2008 A1
20080024459 Poupyrev et al. Jan 2008 A1
20080054875 Saito Mar 2008 A1
20080062151 Kent Mar 2008 A1
20080136791 Nissar Jun 2008 A1
20080138774 Ahn Jun 2008 A1
20080143693 Schena Jun 2008 A1
20080150911 Harrison Jun 2008 A1
20080165139 Hotelling et al. Jul 2008 A1
20080174570 Jobs et al. Jul 2008 A1
20080202251 Serban et al. Aug 2008 A1
20080238448 Moore et al. Oct 2008 A1
20080248836 Caine Oct 2008 A1
20080251368 Holmberg et al. Oct 2008 A1
20080252607 De Jong et al. Oct 2008 A1
20080266264 Lipponen et al. Oct 2008 A1
20080286447 Alden et al. Nov 2008 A1
20080291169 Brenner et al. Nov 2008 A1
20080297475 Woolf et al. Dec 2008 A1
20080303796 Fyke Dec 2008 A1
20090002140 Higa Jan 2009 A1
20090002205 Klinghult et al. Jan 2009 A1
20090002328 Ullrich Jan 2009 A1
20090002337 Chang Jan 2009 A1
20090009480 Heringslack Jan 2009 A1
20090015547 Franz Jan 2009 A1
20090028824 Chiang Jan 2009 A1
20090033617 Lindberg et al. Feb 2009 A1
20090066672 Tanabe et al. Mar 2009 A1
20090085878 Heubel et al. Apr 2009 A1
20090106655 Grant et al. Apr 2009 A1
20090115733 Ma et al. May 2009 A1
20090115734 Fredriksson et al. May 2009 A1
20090128376 Caine et al. May 2009 A1
20090128503 Grant May 2009 A1
20090129021 Dunn May 2009 A1
20090132093 Arneson et al. May 2009 A1
20090135145 Chen et al. May 2009 A1
20090140989 Ahlgren Jun 2009 A1
20090160813 Takashima et al. Jun 2009 A1
20090167508 Fadell et al. Jul 2009 A1
20090167509 Fadell et al. Jul 2009 A1
20090167567 Halperin et al. Jul 2009 A1
20090167677 Kruse et al. Jul 2009 A1
20090167704 Terlizzi et al. Jul 2009 A1
20090174673 Ciesla Jul 2009 A1
20090174687 Ciesla et al. Jul 2009 A1
20090181724 Pettersson Jul 2009 A1
20090182501 Fyke et al. Jul 2009 A1
20090195512 Pettersson Aug 2009 A1
20090207148 Sugimoto et al. Aug 2009 A1
20090215500 You et al. Aug 2009 A1
20090231305 Hotelling Sep 2009 A1
20090243998 Wang Oct 2009 A1
20090250267 Heubel Oct 2009 A1
20090289922 Henry Nov 2009 A1
20090303022 Griffin et al. Dec 2009 A1
20090309616 Klinghult Dec 2009 A1
20100043189 Fukano Feb 2010 A1
20100045613 Wu et al. Feb 2010 A1
20100073241 Ayala Vazquez et al. Mar 2010 A1
20100078231 Yeh Apr 2010 A1
20100079404 Degner et al. Apr 2010 A1
20100097323 Edwards Apr 2010 A1
20100103116 Leung et al. Apr 2010 A1
20100103137 Ciesla et al. Apr 2010 A1
20100109486 Polyakov et al. May 2010 A1
20100121928 Leonard May 2010 A1
20100141608 Huang et al. Jun 2010 A1
20100142516 Lawson et al. Jun 2010 A1
20100162109 Chatterjee et al. Jun 2010 A1
20100171719 Craig Jul 2010 A1
20100171720 Craig et al. Jul 2010 A1
20100177050 Heubel Jul 2010 A1
20100182245 Edwards Jul 2010 A1
20100232107 Dunn Sep 2010 A1
20100237043 Garlough Sep 2010 A1
20100295820 Kikin-Gil Nov 2010 A1
20100296248 Campbell et al. Nov 2010 A1
20100298032 Lee et al. Nov 2010 A1
20100302199 Taylor Dec 2010 A1
20100321335 Lim et al. Dec 2010 A1
20110001613 Ciesla Jan 2011 A1
20110011650 Klinghult Jan 2011 A1
20110012851 Ciesla et al. Jan 2011 A1
20110018813 Kruglick Jan 2011 A1
20110029862 Scott et al. Feb 2011 A1
20110043457 Oliver et al. Feb 2011 A1
20110060998 Schwartz et al. Mar 2011 A1
20110074691 Causey et al. Mar 2011 A1
20110120784 Osoinach et al. May 2011 A1
20110148793 Ciesla et al. Jun 2011 A1
20110148807 Fryer Jun 2011 A1
20110157056 Karpfinger Jun 2011 A1
20110157080 Ciesla et al. Jun 2011 A1
20110163978 Park et al. Jul 2011 A1
20110175838 Higa Jul 2011 A1
20110175844 Berggren Jul 2011 A1
20110181530 Park et al. Jul 2011 A1
20110193787 Morishige et al. Aug 2011 A1
20110241442 Mittleman et al. Oct 2011 A1
20110254672 Ciesla et al. Oct 2011 A1
20110254709 Ciesla et al. Oct 2011 A1
20110254789 Ciesla et al. Oct 2011 A1
20120032886 Ciesla et al. Feb 2012 A1
20120038583 Westhues et al. Feb 2012 A1
20120043191 Kessler et al. Feb 2012 A1
20120056846 Zaliva Mar 2012 A1
20120062483 Ciesla et al. Mar 2012 A1
20120080302 Kim et al. Apr 2012 A1
20120098789 Ciesla et al. Apr 2012 A1
20120105333 Maschmeyer et al. May 2012 A1
20120120357 Jiroku May 2012 A1
20120154324 Wright et al. Jun 2012 A1
20120193211 Ciesla et al. Aug 2012 A1
20120200528 Ciesla Aug 2012 A1
20120200529 Ciesla Aug 2012 A1
20120206364 Ciesla et al. Aug 2012 A1
20120218213 Ciesla Aug 2012 A1
20120218214 Ciesla et al. Aug 2012 A1
20120223914 Ciesla et al. Sep 2012 A1
20120235935 Ciesla Sep 2012 A1
20120242607 Ciesla et al. Sep 2012 A1
20120306787 Ciesla et al. Dec 2012 A1
20130019207 Rothkopf et al. Jan 2013 A1
20130127790 Wassvik May 2013 A1
20130141118 Guard Jun 2013 A1
20130215035 Guard Aug 2013 A1
20140043291 Ciesla Feb 2014 A1
20140160063 Yairi et al. Jun 2014 A1
20140160064 Yairi et al. Jun 2014 A1
Foreign Referenced Citations (25)
Number Date Country
1260525 Jul 2000 CN
1530818 Sep 2004 CN
1882460 Dec 2006 CN
H10255106 Sep 1998 JP
2006268068 Oct 2006 JP
2006285785 Oct 2006 JP
2009064357 Mar 2009 JP
20000010511 Feb 2000 KR
100677624 Jan 2007 KR
2004028955 Apr 2004 WO
2008037275 Apr 2008 WO
2009002605 Dec 2008 WO
2009044027 Apr 2009 WO
2009067572 May 2009 WO
2009088985 Jul 2009 WO
2010077382 Jul 2010 WO
2010078596 Jul 2010 WO
2010078597 Jul 2010 WO
2011003113 Jan 2011 WO
2011087816 Jul 2011 WO
2011087817 Jul 2011 WO
2011112984 Sep 2011 WO
2011133604 Oct 2011 WO
2011133605 Oct 2011 WO
Non-Patent Literature Citations (5)
Entry
“Sharp Develops and Will Mass Produce New System LCD with Embedded Optical Sensors to Provide Input Capabilities Including Touch Screen and Scanner Functions,” Sharp Press Release, Aug. 31, 2007, 3 pages, downloaded from the Internet at: http://sharp-world.com/corporate/news/070831.html.
Jeong et al., “Tunable Microdoublet Lens Array,” Optical Society of America, Optics Express, vol. 12, No. 11, May 31, 2004, 7 pages.
Preumont, A., Vibration Control of Active Structures: An Introduction, Jul. 2011.
Essilor. “Ophthalmic Optic Files Materials,” Essilor International, Ser 145 Paris France, Mar. 1997, pp. 1-29, [retrieved on Nov. 18, 2014]. Retrieved from the internet. URL: <http://www.essiloracademy.eu/sites/default/files/9.Materials.pdf>.
Lind. “Two Decades of Negative Thermal Expansion Research: Where Do We Stand?” Department of Chemistry, the University of Toledo, Materials 2012, 5, 1125-1154; doi:10.3390/ma5061125, Jun. 20, 2012 pp. 1125-1154, [retrieved on Nov. 18, 2014]. Retrieved from the internet. URL: <https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=materials-05-01125.pdf>.
Related Publications (1)
Number Date Country
20140043291 A1 Feb 2014 US
Provisional Applications (2)
Number Date Country
61223003 Jul 2009 US
61303214 Feb 2010 US
Continuations (2)
Number Date Country
Parent 13465737 May 2012 US
Child 14054527 US
Parent 12830426 Jul 2010 US
Child 13465737 US