Dynamic tactile interface

Information

  • Patent Grant
  • Patent Number
    9,557,915
  • Date Filed
    Thursday, September 3, 2015
  • Date Issued
    Tuesday, January 31, 2017
Abstract
A system that detects and transitions a configuration of a dynamic tactile interface coupled to a computing device incorporating a touchscreen. The dynamic tactile interface includes a tactile layer and a substrate, wherein the tactile layer defines a tactile surface, a deformable region, and a first region adjacent the deformable region and coupled to the substrate opposite the tactile surface. The deformable region cooperates with the substrate to form a variable volume fluidly coupled to a fluid channel and to a displacement device that transitions the deformable region from a retracted setting into an expanded setting in response to actuation of an input actuator coupled to the displacement device. A sensor detects a change in a position of the input actuator. In response to the change in the position, the configuration of the deformable regions of the dynamic tactile interface is correlated to a rendered graphical user interface on the touchscreen.
Description

This application is related to U.S. patent application Ser. No. 12/319,334, filed on 5 Jan. 2009; U.S. patent application Ser. No. 11/969,848, filed on 4 Jan. 2008; and U.S. patent application Ser. No. 13/414,602, filed on 7 Mar. 2012, all of which are incorporated in their entireties by this reference.


TECHNICAL FIELD

This invention relates generally to touch-sensitive displays, and more specifically to a new and useful dynamic tactile interface in the field of touch-sensitive displays.





BRIEF DESCRIPTION OF THE FIGURES


FIG. 1 is a flowchart representation of one embodiment of the invention;



FIG. 2 is a flowchart representation in accordance with one implementation of method S100;



FIGS. 3A-B are flowchart representations in accordance with one implementation of method S100;



FIGS. 3C-E are schematic representations in accordance with one implementation of method S100;



FIG. 4 is a schematic representation in accordance with one implementation of method S100;



FIG. 5 is a flowchart representation in accordance with one implementation of method S100;



FIG. 6 is a flowchart representation in accordance with one implementation of method S100;



FIGS. 7A-B are schematic representations in accordance with one implementation of method S100; and



FIGS. 8A-C are schematic representations in accordance with one implementation of method S100.





DESCRIPTION OF THE EMBODIMENTS

The following description of the embodiments of the invention is not intended to limit the invention to these embodiments, but rather to enable any person skilled in the art to make and use this invention.


1. Method and Applications


As shown in FIG. 1, a method S100 for detecting and transitioning a configuration of a dynamic tactile interface coupled to a computing device incorporating a touchscreen, wherein the dynamic tactile interface includes a tactile layer and a substrate, the tactile layer defines a tactile surface, a deformable region, and a first region adjacent the deformable region and coupled to the substrate opposite the tactile surface, the deformable region cooperates with the substrate to form a variable volume filled with a mass of fluid, the variable volume fluidly coupled to a fluid channel and a displacement device that transitions the deformable region from a retracted setting into an expanded setting in response to actuation of an input actuator coupled to the displacement device, the deformable region substantially flush with the first region in the retracted setting and tactilely distinguishable from the first region in the expanded setting. Method S100 includes: at a sensor, detecting a change in a position of the input actuator; in response to the change in the position, correlating the configuration of the deformable regions of the dynamic tactile interface to a rendered graphical user interface on the touchscreen; and, in response to the change in position, correlating the position to a sensitivity of the touchscreen.


Method S100 can function to control the configuration of the graphical user interface of the touchscreen and the sensitivity of the touchscreen based on the detected change in position of the input actuator and an interpreted configuration of deformable regions. Generally, method S100 functions to detect manual actuation of the displacement device and, in response to detected manual actuation of the displacement device, to manipulate the rendered images displayed on the computing device display and the sensitivity of the touchscreen to accommodate for the change in the configuration of the dynamic tactile interface.
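
The control flow of method S100 lends itself to a short sketch. The following Python fragment is illustrative only and is not the patented implementation: the threshold, gain values, and the stand-in calls render_gui and set_touch_sensitivity are assumptions introduced here for clarity.

# Minimal sketch of the method S100 flow; device-facing calls are print stand-ins.
EXPANDED_THRESHOLD = 0.5  # assumed normalized actuator travel treated as "expanded"

def render_gui(layout):
    print(f"render GUI: {layout}")                    # placeholder for the display driver

def set_touch_sensitivity(region, gain):
    print(f"touch sensitivity[{region}] = {gain}")    # placeholder for the touch controller

def on_actuator_change(position):
    """Blocks S110/S120/S130: correlate actuator position to GUI and sensitivity."""
    if position >= EXPANDED_THRESHOLD:
        render_gui("portrait keyboard")               # keys coincident with raised regions
        set_touch_sensitivity("deformable regions", 1.5)
        set_touch_sensitivity("first region", 0.8)
    else:
        render_gui("default")                         # regions retracted, flush surface
        set_touch_sensitivity("all", 1.0)

on_actuator_change(0.7)   # expanded: keyboard rendered, sensitivity raised over keys
on_actuator_change(0.1)   # retracted: default GUI, uniform sensitivity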


2. Dynamic Tactile Interface


The dynamic tactile interface can include and/or interface with a tactile layer and a substrate, the tactile layer including a deformable region and a first region, the first region adjacent the deformable region and coupled to the substrate opposite the tactile surface, and the deformable region cooperating with the substrate to form a variable volume filled with a mass of fluid. Generally, the tactile layer defines one or more deformable regions operable between expanded and retracted settings to intermittently define tactilely distinguishable formations over a surface, such as over a touch-sensitive digital display (e.g., to form a touchscreen), such as described in U.S. patent application Ser. No. 13/414,589.


Generally, the dynamic tactile interface can couple to a computing device (e.g., a smartphone), to provide intermittent tactile guidance to a user entering an input into an input region on the device, such as described in U.S. patent application Ser. No. 13/414,589. For example, the tactile layer can be integrated into or applied over a touchscreen of a mobile computing device to provide tactile guidance to a user interacting with the touchscreen to control the device. The tactile layer can include a deformable region, which can be planar or flush with the first region in the retracted setting and can be raised above (i.e., offset above) the first region to define a tactilely distinguishable feature on the tactile layer in the expanded setting. The deformable region can coincide with (i.e., be arranged over) an input key rendered with a graphical user interface on a touch-sensitive display of the device, the deformable region mimicking a (raised) physical hard key (or guide, etc.) in the expanded setting and thus tactilely guiding user entry of the corresponding input key into the device. The deformable region can then be retracted to yield a flush, smooth, and/or continuous surface with substantially minimal optical distortion across both the first region and the deformable region. For example, a user can manually actuate the displacement device just before providing an input on the touchscreen, and the displacement device can thus transition the deformable region into the expanded setting to provide tactile guidance to the user during entry of inputs onto the touchscreen. The user can then actuate the displacement device to transition the deformable region back into the retracted setting when the user no longer desires tactile guidance across the tactile layer or is no longer providing inputs to the touchscreen such that the deformable region returns to substantially flush with the first region, thereby yielding reduced optical distortion of an image output by the touch-sensitive display and transmitted through the tactile layer.


In particular, the dynamic tactile interface, as described in U.S. patent application Ser. Nos. 11/969,848, 13/414,589, 13/456,010, 13/456,031, 13/465,737, and 13/465,772, which are incorporated in their entireties by this reference, can also incorporate additional components that define the displacement device and cooperate with the dynamic tactile interface to displace fluid into and out of a bladder in order to expand and retract one or more deformable regions of the dynamic tactile interface. The bladder can be coupled to the fluid channel and the displacement device. Thus, the displacement device can displace fluid from the bladder into the fluid channel and the variable volume, thereby expanding the deformable region into the expanded setting. Additionally, the displacement device can displace fluid from the fluid channel into the bladder, thereby drawing the deformable region into the retracted setting.


The displacement device of the dynamic tactile interface can include a bladder and a displacement device actuator, which includes a platen that compresses the bladder in response to actuation (e.g., translation, rotation, depression, etc.) of the input actuator. The displacement device actuator can be a rotary actuator, a linear slide actuator, and/or any other actuator (e.g., pump) suitable for actuating the displacement device, as described in U.S. patent application Ser. No. 14/081,519, filed 15 Nov. 2013, which is incorporated in its entirety by this reference. Generally, the displacement device can function to displace fluid from the bladder into the variable volume, such as via a fluid channel, to transition the deformable region adjacent the variable volume from the retracted setting into the expanded setting. For example, the deformable region can be flush with an adjacent first region in the retracted setting and can be offset above and tactilely distinguishable from the first region in the expanded setting. The displacement device can also transition the deformable region from the expanded setting into the retracted setting. For example, the platen can expand (e.g., stretch) the bladder in response to actuation of the displacement device actuator in a second direction opposite the first direction to draw fluid from the variable volume back into the bladder via the fluid channel. The bladder of the displacement device can therefore be coupled to the variable volume of the tactile layer via a fluid channel.


The displacement device can include a sensor or a set of sensors coupled to (e.g., embedded in, bonded to, adjacent, etc.) the displacement device and/or the displacement device actuator. The sensor or set of sensors can include any one or combination of sensors that detect position data of the input actuator and/or the displacement device, such as optical sensors, capacitive sensors, magnetic field sensors, ultrasonic sensors, piezo-electric sensors, limit switches, encoders, inductive sensors, potentiometers, etc. Therefore, the dynamic tactile interface can also include a wireless or wired communication module (e.g., a physical data port or a Bluetooth module) that communicates a state (or other output) of the displacement device sensor into the computing device, such as over a wireless or wired communication protocol.


The dynamic tactile interface can be integrated into an aftermarket housing for a computing device, wherein the tactile layer can be applied over a touchscreen of the computing device. The dynamic tactile interface can include an input actuator that displaces fluid into and out of the fluid channel in order to transition the deformable regions between expanded and retracted settings. The input actuator can include a sensible marker, such as a magnet, remotely detectable by a sensor integrated into the computing device to remotely determine a position of the actuator and therefore a position of the deformable region. The input actuator can additionally or alternatively include a sensible finger (e.g., protrusion) that contacts the touchscreen and is detectable by sensors integrated into the touchscreen. For example, in a computing device with a capacitive touchscreen, the sensible finger can be a capacitive member that slides along an edge of the touchscreen, and the capacitive touchscreen can detect a change in position of the capacitive member when the actuator is actuated. Alternatively, the sensible finger can include a resistive element that interacts with a resistive touchscreen or can include an optically-detectable element that interacts with an optical touchscreen to communicate a position of the actuator into the computing device. The dynamic tactile interface can include one or more input actuators, each input actuator actuating a set of deformable regions independently from other sets of deformable regions, and each actuator can include a sensible marker detectable by the computing device without any wired or wireless connection with the aftermarket housing or the dynamic tactile interface. Alternatively, the input actuators can be detectable by the computing device through a wired connection between the dynamic tactile interface and the computing device.


In one example, the dynamic tactile interface can be integrated into an aftermarket housing for a computing device, such as a mobile phone, a tablet, a gaming controller, etc., wherein a tactile layer can be applied over a touchscreen of the computing device. In one example, the input actuator can be a rotary actuator. The rotary actuator can operate the dynamic tactile interface by rotating, thereby expanding or retracting the deformable region(s) of the tactile layer. The rotary actuator can include a disk that a user rotates to actuate the displacement device and a marker integrated into a location on the disk radially offset from the center of the disk. With the aftermarket housing assembled over the computing device, a remote sensor, integrated within the computing device, remotely detects when the marker rotates past the remote sensor and/or a position of the marker relative to the remote sensor. Thus, the sensor remotely detects an arcuate position of the marker. The computing device and/or dynamic tactile interface may include a processor, memory, a display, and other components that are typically found on computers such as desktop computers, laptop computers, computing devices in automobiles, tablet computers, mobile devices, and smart phones. The memory may include one or more programs or code which may be executed by the processor to perform tasks and operations including rendering or causing to be rendered a graphical user interface on the display and updating the graphical user interface based on signals and/or other data received from one or more sensors and input devices. The processor within the computing device can interpret data from the sensor to determine an arcuate position of the marker and, thus, a position of the deformable region of the dynamic tactile interface. The computing device can then modify a graphical user interface rendered on the display of the computing device and/or modify a function (e.g., sensitivity) of the touch sensor within the computing device according to the position of the input actuator and, thus, the configuration of the dynamic tactile layer.
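
As a non-authoritative illustration of the arcuate-position detection described above, the sketch below recovers a marker angle from a two-axis magnetometer reading after subtracting an assumed baseline field; the angle-to-configuration mapping is hypothetical and not taken from the specification.

import math

def arcuate_position(bx, by, bx0=0.0, by0=0.0):
    """Angle (degrees) of the marker's magnet, after removing the baseline field."""
    return math.degrees(math.atan2(by - by0, bx - bx0)) % 360.0

def configuration_for_angle(angle):
    # Assumed mapping from marker angle to deformable-region configuration.
    if angle < 90.0:
        return "retracted"
    if angle < 180.0:
        return "portrait keyboard"
    return "landscape keyboard"

angle = arcuate_position(bx=-12.0, by=20.0)
print(angle, configuration_for_angle(angle))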


In various examples, the dynamic tactile interface can be integrated into a case, peripheral, or aftermarket peripheral for a tablet, a smartphone, a laptop computer, a desktop computer, a personal data assistant (PDA), a personal music player (e.g., MP3 player), or other computing device. The dynamic tactile interface can also be incorporated into or arranged over an existing automotive dashboard touch-sensitive display or console, a television, a personal navigation device, a watch, a home stereo system interface, a lighting or thermostat control system, a machine tool controller, a computer mouse, a computer touchpad, a keyboard or keypad, a gaming controller or console, cooking equipment, or any other suitable electronic and/or digital computing device.


3. Applications


In one example application of method S100, the dynamic tactile interface is integrated into an aftermarket housing for a computing device, wherein the dynamic tactile interface is arranged over a touchscreen incorporated in the computing device. The displacement device of the dynamic tactile interface can include two input actuators: the first input actuator controlling a first set of deformable regions defining an alphanumeric keyboard in a portrait layout, the second input actuator actuating a second set of deformable regions defining an alphanumeric keyboard in a landscape layout. The first and second input actuators can each include a capacitive finger that slides along an edge of the capacitive touchscreen of the computing device. Thus, a capacitive touch sensor integrated into the computing device (e.g., a touchscreen) can detect changes in positions of the first and second input actuators. For example, the first input actuator can slide vertically along a vertical edge of the touchscreen and the second input actuator can slide horizontally along a horizontal edge of the touchscreen. The first and second input actuators are each coupled to separate displacement devices and independently actuate respective sets of deformable regions. Alternatively, the input actuators can control a same displacement device and actuate valves controlling which set of deformable regions is actuated due to actuation of the displacement device. In response to detection of the first input actuator in an expanded setting position, a processor within the computing device updates a graphical user interface rendered on the display of the computing device to depict a portrait alphanumeric keyboard and locks the graphical user interface in a portrait display mode. Method S100 can further increase the sensitivity of the capacitive touch sensor proximal or coincident the first set of deformable regions now elevated over keys of the portrait alphanumeric keyboard. Furthermore, in response to detection of the second input actuator in an expanded setting position, the processor can update the graphical user interface rendered on the touchscreen to depict a landscape alphanumeric keyboard and then lock the graphical user interface in a landscape display mode while the deformable regions are in the expanded setting. Method S100 can also increase the sensitivity of the capacitive touch sensor proximal or coincident the second set of deformable regions now elevated over keys of the landscape alphanumeric keyboard. Alternatively, the graphical user interface can change the color of an image rendered by the display at a location coincident the first set of deformable regions.


In one example application of method S100, the dynamic tactile interface is arranged over a touchscreen incorporated in the computing device. The displacement device includes a linear slide actuator as described in U.S. patent application Ser. No. 14/081,519, which is incorporated in its entirety by this reference. The linear slide actuator can include an integrated solenoid, which generates a magnetic field detectable by a magnetically coupled magnetic field sensor. The magnetic field sensor can be integrated into the computing device, wherein the magnetic field sensor remotely detects proximity of the solenoid and, thus, the linear slide actuator. Thus, if a track on which the linear slide actuator slides includes an array of magnetic field sensors, as the linear slide actuator travels along the track, the magnetic field sensors can detect (e.g., triangulate) the position of the linear slide actuator on the track. A processor integrated into the computing device and coupled to the magnetic field sensors can correlate the position of the linear slide actuator to a configuration of expanded deformable regions, such as a keyboard layout (e.g., a portrait alphanumeric keyboard). The processor integrated into the computing device can also remotely correlate the position of the linear slide actuator, which is detected remotely by a sensor integrated into the computing device, to the graphical user interface rendered by the computing device on the touchscreen, such that the graphical user interface corresponds to an image of an alphanumeric keyboard (e.g., a portrait QWERTY graphical keyboard) with the keys of the keyboard corresponding to the deformable regions of the configuration of expanded deformable regions. Likewise, another processor within the computing device can reconfigure one or more touch sensors integrated into the touchscreen of the computing device such that portions of the touch sensor corresponding to (i.e., arranged below) the deformable regions of the configuration of expanded deformable regions have increased sensitivity to contact on the tactile surface and/or such that portions of the touch sensor corresponding to first regions of the tactile layer adjacent the deformable regions exhibit decreased sensitivity to contact on the tactile surface. Thus, by reconfiguring input sensitivities across the touch sensor, method S100 can function to substantially ensure that inputs on the tactile surface can be detected by the touch sensor even over deformable regions in the expanded setting and/or to lessen detection of inadvertent or incidental inputs (e.g., typographical errors) into the computing device. Additionally, when the linear slide actuator translates from a first position to a second position, a Block of the method can cooperate with a sensor (e.g., a magnetic field sensor) within the computing device to detect this change, to correlate the second position with a particular configuration of expanded deformable regions across the tactile layer, and to modify or update a graphical user interface (e.g., a gaming interface, a landscape alphanumeric keyboard) rendered on the display of the computing device.
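
A minimal sketch of how the sensor-array position estimate might be computed, using a weighted centroid of field magnitudes as a stand-in for the triangulation mentioned above; the sensor spacing, thresholds, and keyboard mapping are assumptions, not the claimed implementation.

SENSOR_POSITIONS_MM = [0.0, 10.0, 20.0, 30.0, 40.0]   # assumed sensor spacing along the track

def estimate_slider_position(field_magnitudes):
    total = sum(field_magnitudes)
    if total == 0:
        return 0.0
    return sum(p * m for p, m in zip(SENSOR_POSITIONS_MM, field_magnitudes)) / total

def keyboard_for_position(pos_mm):
    if pos_mm < 5.0:
        return None                        # slider near rest: no keyboard configuration
    return "portrait QWERTY" if pos_mm < 25.0 else "landscape QWERTY"

readings = [0.1, 0.2, 1.8, 2.1, 0.3]       # strongest field near the 20-30 mm sensors
pos = estimate_slider_position(readings)
print(pos, keyboard_for_position(pos))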


4. Actuation Detection


Block S110 of method S100 includes, at a sensor, detecting a change in a position of the input actuator. Generally, Block S110 functions to detect a position and/or an extent of actuation of the displacement device. Thus, Block S110 can continuously or intermittently detect a displacement of the input actuator and/or a magnitude of displacement of the input actuator. For example, Block S110 can detect a magnitude of linear displacement of the input actuator of a linear slide displacement device, a magnitude of arcuate displacement of the input actuator of a rotary displacement device, and/or a magnitude of a mass and/or volume of fluid displaced by the displacement device.


In one implementation of Block S110, a sensor can detect an extent of actuation of the input actuator. The sensor can be integrated into the computing device and, therefore, remotely (i.e., without a wired or wireless connection over which digital data is transmitted) detect movement of the input actuator. Alternatively, the sensor can be coupled to the displacement device and/or the displacement device actuator and can detect the motion or change of position of the input actuator. In this implementation, the sensor can detect the magnitude of the change in position of the input actuator.


In one example, Block S110 includes detecting a magnetic field generated by a magnet coupled to (e.g., embedded in) the input actuator with a three-axis vector magnetometer integrated into the computing device (e.g., a mobile phone with a built-in magnetometer). The vector magnetometer can detect changes in a magnetic field resulting from the change in position of the magnet integrated into the input actuator. Block S110 can detect the change in position of the input actuator with any other magnetometer suitable to detect the change in position of the input actuator. Block S110 can interpret changes in the detected magnetic field as movement of the input actuator at a processor arranged within the computing device. Thus, the processor can determine a change in position and/or an absolute position of the input actuator based on a change in the detected magnetic field of the magnet integrated into the input actuator.


In another example, Block S110 includes detecting the change of position of the input actuator using sensors integrated into the touchscreen of the computing device. For example, for a capacitive touchscreen, the input actuator can include a capacitive finger that extends from the input actuator to an edge of the touchscreen. As the input actuator moves, the finger slides across the edge of the touchscreen. Thus, the capacitive sensors in the touchscreen can detect presence and movement of the capacitive finger along the edge of the touchscreen. Block S110 can interpret movement of the capacitive finger at the edge of the touchscreen as the change in position of the input actuator and differentiate capacitive inputs at the edge of the touchscreen by the capacitive finger from an input by a user at other locations on the touchscreen.
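
The following sketch illustrates one plausible way to separate actuator contacts from ordinary user inputs as described above, assuming the capacitive finger is confined to a narrow band along one edge of the touchscreen; the band width and coordinates are hypothetical.

EDGE_BAND_PX = 40   # assumed strip along the left edge traversed by the capacitive finger

def classify_contact(x, y):
    if x <= EDGE_BAND_PX:
        return "actuator"     # report to Block S110 as an actuator position update
    return "user input"       # forward to the graphical user interface as a touch event

for contact in [(12, 600), (500, 900), (30, 1500)]:
    print(contact, classify_contact(*contact))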


5. Actuation Response


As shown in FIG. 2, Block S120 of method S100 includes rendering a graphical user interface on the touchscreen, the graphical user interface corresponding to the configuration of the dynamic tactile interface, and/or modifying a sensitivity of a portion of the touch sensor that corresponds to the configuration of the deformable regions. Generally, Block S120 of method S100 functions to render a suitable graphical user interface and/or to modify the tactile sensitivity of touch sensors of the touchscreen of the computing device in response to the detected change in the position of the displacement device.


One implementation of Block S120 includes rendering a graphical user interface corresponding to the configuration of deformable regions of the tactile layer. For example, the graphical user interface can render images of input keys, such as on a virtual keyboard, such that the images of input keys are coincident with the deformable regions. Therefore, the dynamic tactile interface provides tactile guidance aiding selection of an input key through depression of the deformable region coincident with an image of the input key.


An example of the preceding implementation of Block S120 includes rendering a capitalized keyboard in response to transition of the deformable regions in a keyboard layout from an expanded setting into a more expanded setting, wherein the deformable regions are elevated higher above the first region in the more expanded setting than the deformable regions in the expanded setting. In this example, the deformable regions in the expanded setting correspond to a graphical user interface that depicts an alphanumeric keyboard of lowercase letters and numbers. In response to expansion of the deformable regions to the more expanded setting, the computing device renders on the touchscreen a graphical user interface yielding an uppercase keyboard of letters and numbers. Alternatively, the computing device can render a graphical user interface yielding a foreign keyboard layout (e.g., with German or Chinese characters).


Another example of the preceding implementation of Block S120 includes rendering a graphical user interface with brighter and more contrasted images of input keys. The modification of the input keys in the rendered graphical user interface may be provided in response to a signal received from a sensor and may cause the keys to be easier to perceive through the deformable region in the expanded setting. The modifications to the rendered graphical user interface may help to compensate for distortion of the images due to refraction of the images across the fluid and the tactile layer. As described in U.S. patent application Ser. No. 14/320,041, which is incorporated in its entirety by this reference, the graphical user interface can also modify a portion of the images by shifting a set of pixels to black or to a lower intensity and by shifting the location of the images on the touchscreen.


Another example of modifying a rendered graphical user interface, in response to a sensor signal, to be easier to perceive through a deformable region includes rendering a graphical user interface with space between the keys or other input buttons that is optimized for use with the fluid and tactile layer. For instance, the space between the keys could be provided as a particular color that makes it easier to discern the location of the dynamic keys while in a raised position, or as a change in the underlying graphical image around and beneath the raised dynamic key that reduces or compensates for distortion of images of the keys or buttons due to refraction of the key or button images, or refraction of the space between the keys or buttons, across the fluid or tactile layer. In particular, the space between keys, buttons, or other rendered images associated with a region to receive input from a user might be provided as black.


Another example of the preceding implementation of Block S120 includes rendering an image with input keys that are coincident with the deformable regions in the configuration of the deformable regions. For example, Block S120 can render a portrait alphanumeric keyboard in response to transition of the configuration of deformable regions into a portrait layout, as shown in FIG. 3A.


Another example of the preceding implementation of Block S120 includes rendering a series of screen images with input keys rendered in the same location in a display, where the rendered locations are coincident with deformable regions. For example, Block S120 can render a first portrait alphanumeric keyboard in a first rendered page in response to transition of the configuration of deformable regions into a portrait layout, as shown in FIG. 3A. Block S120 can subsequently render a second page with one or more buttons, or a second keyboard, the one or more buttons or second keyboard rendered such that the buttons or keys are positioned to correlate to deformable regions associated with the first rendered keyboard. In this example, separate and subsequent screens can be rendered with buttons, keyboard images, gaming interfaces, volume controls, camera shutter button images, etc. that correspond to a single set of deformable regions.


For example, Block S120 can render a portrait alphanumeric keyboard in response to transition of the configuration of deformable regions into a portrait layout, as shown in FIG. 3A. Block S120 can also render landscape alphanumeric keyboard images, gaming interfaces, volume controls, camera shutter button images, etc. coincident with deformable regions of the configuration of deformable regions in response to transition of the configuration of the deformable regions, as shown in FIG. 4.


Another example of the preceding implementation of Block S120 includes rendering an image with input keys that are coincident with the deformable regions in the configuration of the deformable regions, wherein the deformable regions may be configured when an actuator is moved out of a central position. FIG. 3B illustrates a slider that is initially positioned in the middle of a linear track. The slider may be moved in either direction along the track. The slider operates as an actuator to move a plunger or other object towards one of the two bladders, wherein a bladder may be located at each end of the track or at other locations along the track. Additionally, the bladder and/or any displacement device, as well as any plunger that manipulates the bladder, may be located somewhere other than at an end of the track or along the track.


In some instances, a signal from a sensor may indicate whether the slider is in a position that correlates to a particular rendered display. For example, when a slider is positioned in the middle 310 of track 320, a signal may be sent to indicate that no buttons, keyboards, or other graphical image should be rendered to correlate to a deformable region of the dynamic tactile layer. In some instances, however, when the slider is positioned in the middle 310 of the track 320, there may be no signal provided to the display device and, as a result, no buttons, keyboards, or other graphical image rendered in the display that correlates to a deformable region of the dynamic tactile layer. When input is received to manually displace the slider from the center position 310 to a first end 311 of the track, the slider may engage a bladder located at the location 311 to move fluid to a first set of deformable regions, thereby causing the set of deformable regions to enter an expanded state. Additionally, one or more sensors located at position 311 may provide a signal to the display device that results in rendering of one or more buttons, keyboards, or other graphical user interface elements that are selectable through the tactile interface deformable regions, for example a keyboard 331 in a portrait orientation. The deformable regions in the expanded state may be positioned to correspond with the rendered graphical user interface elements such that when the deformable regions receive a force applied by a user, either through the user's finger, a stylus, or other input, the input will be received in a location that corresponds to the rendered graphical user interface element. In some implementations, one or more sensors can be placed at various locations other than position 311, and can operate to detect motion of the slider into position 311.


When input is received to manually displace the slider from the center position 310, the first end 311, or any other location to a second end 312 of the track, the slider may engage a bladder located at the location 312 to move fluid to a second set of deformable regions, thereby causing the set of deformable regions to enter an expanded state. One or more sensors located at position 312 may provide a signal to the display device that results in rendering of one or more buttons, keyboards, or other graphical user interface elements that are selectable through the tactile interface deformable regions, for example a keyboard 332 in a landscape orientation. The rendered graphical user interface elements may be positioned to correspond with the deformable regions in the expanded state such that when the deformable regions receive a force applied by a user, either through the user's finger, a stylus, or other input, the input will be received in a location corresponding to the rendered graphical user interface element.


In some instances, when the slider is moved from a first position to a second position between an at-rest or off position and an end point along a track for moving the slider, the sensitivity of a touch sensor associated with a deformable region may be adjusted. The computing device may detect the location of the slider set by the user and infer the height of the deformable regions and the corresponding optimized sensitivity level. A graphical user interface may provide an indicator that illustrates a user preference for the sensitivity based on the detected location of the slider, such that the user may move the slider along the track and watch the indicator provide information indicating whether the slider is at the position associated with the user preference. The indicator may be a color, a numerical indicator, a rendered slide, a dial, or some other graphical icon rendered as part of the graphical user interface and dynamically updated as the user moves the slider. In some instances, input actuators other than a slider may be used to adjust the height of a deformable region and may be detected by sensors to infer the height of the deformable regions, such as for example a dial, one or more push buttons, and other input mechanisms.


In some implementations, one or more sensors may be utilized with a sliding or other actuator to provide signals to an underlying display device that vary with the degree of movement or other input of the actuator. For example, a slider actuator may be configured with sensors that extend along the track upon which the slider moves. As the slider moves along the track, the sensors may detect the different positions of the slider. When a slider actuator is half way between a rest position and an end point of the track, the deformable regions may expand half as much as they are configured to do when the slider actuator is positioned at the end point of the track. In particular, the slider may mechanically actuate a peristaltic pump that displaces fluid and causes the deformable regions to expand or retract incrementally as the slider is moved. Similarly, sensors may be used to adjust the graphical user interface rendered in the display based on a position of the slider actuator. For example, the graphical user interface may include an indicator that communicates the amount a slider actuator has been moved. This may be useful for confirming the accuracy of the sensors and which elements the graphical user interface is providing. In particular, when a slider actuator is moved half way from an at-rest position to a particular track end, a rendered graphical user interface may include a lower case keyboard and an indication that the slider is only moved half way towards the track end, and when the slider is moved completely to the end of the particular track the rendered graphical user interface may include an upper case keyboard along with an indication that the slider actuator is moved completely to the end of the track. In some instances, moving a slider part way may be used to change the color of certain keys of a keyboard, such as for example by changing the color of numeric keys when the slider is moved half way towards an end point.
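
For illustration only, the sketch below maps a slider's fractional travel to an expansion level, a keyboard case, and an on-screen indicator, loosely following the half-way and full-travel behavior in the example above; the track length and the mapping rules are assumptions.

TRACK_LENGTH_MM = 40.0   # assumed slider travel

def update_for_slider(position_mm):
    fraction = max(0.0, min(1.0, position_mm / TRACK_LENGTH_MM))
    return {
        "expansion": fraction,                                    # 0.5 -> regions raised half-way
        "keyboard": "uppercase" if fraction >= 1.0 else "lowercase",
        "indicator": f"slider at {int(fraction * 100)}% of travel",
    }

print(update_for_slider(20.0))   # half-way: lowercase keyboard, 50% indicator
print(update_for_slider(40.0))   # full travel: uppercase keyboard, 100% indicator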


In some implementations, as shown in FIGS. 3C, 3D, and 3E, more than one actuator such as a slider may be implemented to communicate with a display device. FIG. 3C illustrates a dynamic tactile interface having two sliders positioned next to each other. The two sliders may be used to configure different elements. For example, a first slider may be used to control the sensitivity for a set of deformable regions as well as the particular sets of deformable regions that are provided through a tactile interface, such as a keyboard in portrait configuration when the first slider is moved towards a first end of the track, or a keyboard in landscape configuration when the first slider is moved towards the other end of the track, the sensitivity of each keyboard depending on how far the slider is moved from the center of the track to the particular end of the track associated with the particular keyboard. The other slider may be used to provide input for configuring a rendered graphical user interface, such as for example providing a subset of highlighted keys in a keyboard, a brightness, contrast, or intensity of the graphical user interface, providing a particular language of content, and other graphical user interface configurations. When the sliders are positioned next to each other, they may be manipulated by the same finger or fingers from the same hand by the user. The sliders may also be positioned apart from each other, such as for example on opposite sides of the device. FIG. 3D illustrates a dynamic tactile interface having two sliders located on opposite sides of the front of a display device. The sliders may have different functions, such as configuring deformable region sensitivity and configuring a graphical user interface. FIG. 3E illustrates a single slider having a first track in a first direction and a second track in a second direction. The tracks in the dynamic tactile layer of the device of FIG. 3E are perpendicular to each other and meet at a particular point, forming an “L” shaped track. A single slider is configured to move along both tracks, in a first back and forth direction on the first track and in a second, perpendicular back and forth direction in the second track. Each track segment (each straight portion of the track) may have a different function, such as configuring deformable region sensitivity, configuring a graphical user interface, providing different sets of keys, and other functions.


Block S120 can also render landscape alphanumeric keyboard images, gaming interfaces, volume controls, camera shutter button images, etc. coincident with deformable regions of the configuration of deformable regions in response to transition of the configuration of the deformable regions, as shown in FIG. 4. The actuator of the dynamic tactile interface of FIG. 4 is implemented as a dial. The dial may include one or more plungers that, when travelling in a circular motion as the dial is rotated or turned, may compress one or more bladders positioned along a portion of the circular path traveled by the plunger as the dial is turned. In some implementations, as the dial is rotated or turned, one or more valves may be opened in addition to compressing one or more bladders. For example, when the dial is rotated or turned to engage a valve and bladder, the compressed bladder may force fluid through the open valve and into the deformable region, setting the deformable region to an expanded state. In some implementations, the dial may engage multiple bladders located near or away from the dial, as well as one or more valves located near or away from the dial. As the dial is turned, it may cause a single bladder to become increasingly compressed or it may cause one or more additional bladders to become compressed.


The different bladders placed in the circular path of the plunger may force fluid into different sets of deformable regions. For example, a first bladder in the circular path may cause a set of deformable regions associated with a set of audio playback control to enter an expanded state, a second bladder in the circular path may cause a set of deformable regions associated with website navigation to enter an expanded state, a third bladder in the circular path may cause a set of deformable regions associated with a keyboard in portrait mode to enter an expanded state, and a fourth bladder in the circular path may cause a set of deformable regions associated with a keyboard in landscape mode to enter an expanded state.


In some implementations, the actuator may be implemented as one or more buttons rather than a slider. Each button may be configured to open or close a valve, as well as to compress one or more bladders. When the button is depressed, fluid may flow from the compressed bladder through the open valve and into the deformable region, setting the deformable region to an expanded state. The button may stay in place until depressed again, for example utilizing a double-click closing mechanism similar to that of a clickable pen (with cams, springs, and other elements). In some implementations, pressing one or more of multiple buttons may open or close one or more valves while moving a slider or applying a force to another input actuator may cause fluid to be moved through the opened one or more valves.


Sensors within the dynamic tactile layer may detect the current location of the dial and communicate the location via one or more signals from the one or more sensors to the display device. The display device may provide an interface that correlates to the particular set of deformable regions, such as an audio playback control interface (shown in FIG. 4), website navigation interface, a keyboard interface in a portrait mode, a keyboard interface in a landscape mode, and other interfaces.
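
A hypothetical sketch of the dial-to-interface correlation described above; the angle ranges simply mirror the bladder sequence given earlier and are assumptions rather than values from the specification.

DIAL_INTERFACES = [
    (90.0,  "audio playback controls"),
    (180.0, "website navigation"),
    (270.0, "portrait keyboard"),
    (360.0, "landscape keyboard"),
]

def interface_for_dial(angle_deg):
    angle = angle_deg % 360.0
    for upper_bound, interface in DIAL_INTERFACES:
        if angle < upper_bound:
            return interface
    return DIAL_INTERFACES[-1][1]

print(interface_for_dial(45.0))    # audio playback controls
print(interface_for_dial(200.0))   # portrait keyboard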


One variation of the method S100 includes Block S130, which recites, in response to the change in position, correlating the change in position to a sensitivity of the touchscreen and/or rendering a graphical user interface corresponding to the configuration of the dynamic tactile interface. Generally, Block S130 can function to modify the sensitivity of touch sensors integrated into the touchscreen of the computing device to accommodate for changes in the configuration of the dynamic tactile interface.


In this variation, touch sensors coincident with the deformable region can increase in sensitivity to compensate for decreased sensation of contact due to the presence of a fluid layer and the tactile layer between a contacting object (e.g., a finger) and the touchscreen. For example, in a computing device with a capacitive touchscreen, Block S130 can increase sensitivity to distortions in the electrostatic field of the touchscreen by dynamically altering the sensor. Additionally or alternatively, Block S130 can alter sensitivity calibrations of the computing device, such that software executing on the computing device registers inputs more readily. For example, inputs that yield less electrostatic distortion of the capacitive touchscreen than can be detected typically by processors in the computing device can still be detected and processed by the computing device when the processors have been reconfigured to detect lesser electrostatic distortion. Additionally or alternatively, Block S130 can reduce sensitivity to inputs on the first region, thereby reducing the risk of incidental inputs (e.g., typographical errors) by lowering sensitivity of areas where inputs are undesirable.


In one example, Block S130 includes increasing the sensitivity of portions of the touchscreen of the computing device coincident the deformable regions and, thus, coincident images of keys of a keyboard. Thus, Block S130 functions to increase detection of inputs at desired input regions (e.g., keys of the keyboard). Block S130 can additionally or alternatively decrease sensitivity of portions of the touchscreen coincident the first regions (e.g., areas surrounding the keys).


In another example, Block S130 includes increasing sensitivity of portions of the touchscreen coincident deformable regions, the increase in sensitivity proportional to the change in position of the input actuator. The change in position of the input actuator can correspond to displacement of fluid from a fluid bladder into the deformable regions and, thus, to a height of the deformable regions relative to the first region. As shown in FIG. 5, a height of the tactile surface with respect to the touchscreen can be increased. Block S130 can increase sensitivity of areas of the touchscreen relative to an extent of actuation of the displacement device to accommodate for decreased sensitivity introduced by increased distance between the tactile surface of the tactile layer at the deformable regions and the touchscreen when the deformable regions are in the expanded setting.
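
The proportional-sensitivity behavior of Block S130 might be modeled, purely as a sketch, by a gain that grows linearly with the inferred height of the deformable regions; the constants below are assumptions and not values from the specification.

BASE_GAIN = 1.0      # assumed nominal touch-sensor gain
GAIN_PER_MM = 0.4    # assumed extra gain per millimeter of tactile-surface offset

def sensitivity_for_height(height_mm):
    """Higher deformable regions put more fluid and layer between finger and sensor,
    so the gain over those regions is raised proportionally."""
    return BASE_GAIN + GAIN_PER_MM * max(0.0, height_mm)

for height in (0.0, 0.5, 1.5):
    print(f"height {height} mm -> gain {sensitivity_for_height(height):.2f}")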


6. Variations


In another variation of the method S100, Block S110 includes detecting—with a sensor integrated into the displacement device—the change in position of the input actuator. FIG. 6 illustrates a dynamic tactile interface that includes a set of computing device sensors positioned beneath a tactile layer. In one example of this variation, Block S110 includes intermittently detecting presence or absence of the input actuator or displacement device platen adjacent a sensor in a set of sensors that lie, for example, along the track on which the input actuator of a linear slide displacement device slides. In this example, the sensors can be binary sensors that detect either the presence or the absence of an object (e.g., the displacement device and/or input actuator) in proximity to the binary sensors. When the input actuator contacts a binary sensor, the binary sensor indicates to the processor that the input actuator is contacting the binary sensor. The processor can store a known location of the binary sensor and can, thus, calculate a position of the input actuator from the known location of the binary sensor and the detected contact with the binary sensor.
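
A short sketch of the binary-sensor variation, assuming each sensor reports only presence or absence and the processor stores the sensors' known coordinates along the track; the names and positions are hypothetical.

BINARY_SENSOR_POSITIONS_MM = {"s0": 0.0, "s1": 10.0, "s2": 20.0, "s3": 30.0}

def actuator_position(triggered):
    """Return the known position of the triggered sensor, or None if none is triggered."""
    hits = [name for name, contact in triggered.items() if contact]
    return BINARY_SENSOR_POSITIONS_MM[hits[0]] if hits else None

print(actuator_position({"s0": False, "s1": False, "s2": True, "s3": False}))   # 20.0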


In another example, Block S110 includes detecting the displacement of the displacement device with a set of sensory markers that line (e.g., in series) an external and/or internal surface of the bladder such that, when the displacement device platen compresses the bladder or slides over the bladder, Block S110 can detect the number of sensory markers that have passed a sensor coupled to the displacement device platen in order to calculate the displacement of the displacement device platen. A processor coupled to the sensor can determine which marker of the set of sensory markers is adjacent the sensor. For example, a processor arranged within the computing device and executing Block S110 can determine a magnitude of a displacement of fluid into the tactile layer and thus the height of one or more deformable regions within the tactile layer by storing a number of sensory markers that pass or contact the sensor.


Another implementation of Block S110 includes detecting the mass or volume of fluid displaced from the bladder to the fluid channel(s) and the variable volume(s) as a result of actuating the displacement device. Thus, a mass flow sensor (e.g., a pressure-based, thermal, or optical mass flow meter) can be coupled to the fluid channel and can determine the mass of fluid displaced by detecting a mass flow rate of the fluid being displaced. The mass flow sensor can intermittently detect a set of mass flow rates over a specified period of time. A processor can determine the mass (or volume) of fluid displaced by averaging the set of mass flow rates and multiplying the averaged mass flow rate by the specified period of time. Alternatively, the mass flow sensor can continuously detect the mass flow rate. Block S110 can also detect the mass (or volume) of fluid displaced with a mechanical flow meter, such as a current meter, which detects average flow velocity through detected hydroelectric power output. Block S110 can rely on sensors coupled to the fluid channel, the bladder, and/or the intersection of the fluid channel and the bladder. The cross-sectional area of the flow area in which the sensor(s) detect the flow is known and can be used to detect the volume and/or mass of fluid displaced. A density of the displaced fluid can also be measured and/or assumed by the processor to calculate the volume and/or mass of the fluid displaced.
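
The averaging computation described above can be illustrated with a short sketch; the sample values, sampling window, and assumed fluid density are illustrative only.

def displaced_mass_g(flow_rates_g_per_s, period_s):
    """Average the sampled mass flow rates and multiply by the sampling window."""
    average_rate = sum(flow_rates_g_per_s) / len(flow_rates_g_per_s)
    return average_rate * period_s

def displaced_volume_ml(mass_g, density_g_per_ml=1.0):
    return mass_g / density_g_per_ml

samples = [0.10, 0.14, 0.12, 0.11]            # g/s sampled over a 2-second actuation
mass = displaced_mass_g(samples, period_s=2.0)
print(mass, displaced_volume_ml(mass))        # 0.235 g, 0.235 ml at the assumed density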


In some implementations, a rendered graphical user interface may be modified when used with a deformable region in the expanded state to reduce or eliminate the effects of refraction or distortion caused by the expanded deformable regions as well as the materials of the tactile layer, fluid and other materials used in the dynamic tactile layer. The substrate, the tactile layer, and the volume of fluid can each be substantially transparent such that images of a graphical user interface rendered on the digital display can be visible to a user through the substrate, tactile layer, and fluid arranged over the digital display. However, the substrate, the tactile layer, and the fluid can each exhibit a refractive index that differs from that of air such that expansion of one or more deformable regions into expanded settings yields variations in thickness across the dynamic tactile layer and thus non-uniform distortion (e.g., refraction) of light output from the digital display through the dynamic tactile layer. In particular, transition of a deformable region of the dynamic tactile layer from the retracted setting into the expanded setting can cause a user to visually detect optical distortion of an image rendered on the digital display. To compensate for the distortion, an image rendered on the digital display can be modified prior to transition of the deformable region into the expanded setting to reduce a user's perceived optical distortion of the image once the deformable region enters the expanded setting. The rendered image, in some implementations, can be systematically refreshed with modifications of the image to compensate for a dynamically changing profile of the dynamic tactile layer throughout transition of the deformable region from the retracted setting into the expanded, and vice versa. Details for modifying a graphical user interface when used with a deformable region are discussed in U.S. patent application Ser. No. 14/320,041, the entirety of which is incorporated herein by reference.


In particular, the method S100 can modify an image and/or refresh the digital display within the computing device to reduce or limit perceived light scattering effects, perceived internal reflection of regions of the image, perceived refraction and/or diffraction of the image, perceived directional or preferential light transmission or emission through the substrate, perceived chromatic dispersion of light transmitted through the dynamic tactile layer, and/or other perceived optical distortions or parallax effects of the displayed image. The image rendered on the digital display may be modified based on the predicted user viewing position and the current position of the deformable region to reduce and/or minimize optical distortion of the image output by the digital display as perceived by the user. In some embodiments, portions of the graphical user interface may be linearly stretched about a predicted point of focus of the user on the digital display, a subregion of the image adjacent a deformable region may be translated based on an angle and distance of the user to the deformable region or to the digital display, and a subregion of the image adjacent the deformable region may be scaled to offset preferential magnification of the subregions of the image by the adjacent deformable region in the expanded setting.
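
One of the compensations named above, scaling an image subregion to offset the preferential magnification of an expanded deformable region, might look like the following sketch under an assumed linear magnification model; the constants and region coordinates are hypothetical and not drawn from the referenced application.

def compensated_scale(deformable_height_mm, magnification_per_mm=0.08):
    """Scale factor applied to the underlying subregion so the magnified result
    appears at its intended size (assumed linear magnification with height)."""
    perceived_magnification = 1.0 + magnification_per_mm * deformable_height_mm
    return 1.0 / perceived_magnification

def compensate_subregion(region, height_mm):
    scale = compensated_scale(height_mm)
    return {**region, "w": region["w"] * scale, "h": region["h"] * scale}

key_image = {"x": 100, "y": 400, "w": 60, "h": 60}
print(compensate_subregion(key_image, height_mm=1.2))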


In some implementations, an image rendered on the digital display may include space between one or more keys of a keyboard, and the space between the keys may be a particular shade in contrast to the keys themselves, such as for example black shading. FIG. 7A illustrates a keyboard rendered on the display device without any image modification or compensation for refraction or distortion. In particular, the keys on the rendered keyboard 700 have no space between them. FIG. 7B illustrates a keyboard rendered on the display device after a slider 704 has been moved along a track to indicate the keyboard will be used with deformable regions. The keyboard 700 includes a modification for refraction and/or distortion. In particular, the keys on the rendered keyboard 700 have spacing 702 between them. In some implementations, black spacing on a graphical user interface may extend into the normal “button” graphical portion of the graphical user interface to reduce distortion. By providing spacing 702 between the keys on keyboard 700, the effects of distortion or refraction caused by the raised deformable region can be reduced.


In some implementations, a support structure may be used to support a portion of the tactile layer and prevent accidental depressions of the tactile layer into the cavity 122. As shown in FIGS. 8A-C, the substrate 120 may include a support member 115 that extends over the cavity 122 and under the particular region of the tactile layer 110. When the cavity 122 is expanded and the deformation is present at the deformable region 112, the support member 115 functions to prevent a user from "pressing too far" into the deformation below the plane of the tactile layer 110. When the cavity 122 is not expanded and the deformation is not present in the tactile layer 110, the support member 115 functions to reduce (or potentially eliminate) the user's perception of "divots" in the tactile layer 110 when swiping a finger across the tactile layer 110. As shown in FIG. 8C, the support member 115 can include holes or channels that allow for the expansion of the cavity 122 and the deformation of the deformable region 112. The support member 115 can be integrally formed with the substrate 120, but may alternatively be formed with the tactile layer 110 or may be separately formed and later attached to the substrate 120. Finally, the support member 115 may alternatively partially define the cavity 122. The substrate 120 is preferably rigid, but may alternatively be flexible in one or more directions. The substrate 120—if located above a display—is preferably optically transparent, but may—if located below the display or if bundled without a display—be translucent or opaque. The substrate 120 can be made from a material including polymers or glass, for example, elastomers, silicon-based organic polymers such as poly-dimethylsiloxane (PDMS), thermoset plastics such as polymethyl methacrylate (PMMA), and photocurable solvent-resistant elastomers such as perfluoropolyethers. The substrate 120 can be made of any suitable material that supports the tactile layer 110 and at least partially defines the cavity 122. In some implementations, the substrate 120 can be a single homogenous layer approximately 1 mm to 0.1 mm thick and can be manufactured using well-known techniques for micro-fluid arrays to create one or more cavities and/or micro channels. In alternative versions, the substrate 120 can be constructed using multiple layers from the same material or from different suitable materials.


As a person skilled in the art will recognize from the previous detailed description and from the figures and the claims, modifications and changes can be made in the foregoing embodiments of the invention without departing from the scope of this invention as defined in the following claims.

Claims
  • 1. A dynamic tactile interface comprising: a substrate; a tactile layer defining a first region coupled to the substrate and a deformable region adjacent the first region; the deformable region cooperating with the substrate to form a variable volume filled with a mass of fluid, the variable volume fluidly coupled to a fluid channel and transitioning from a retracted setting into an expanded setting in response to actuation of an input actuator, the deformable region substantially flush with the first region in the retracted setting and tactilely distinguishable from the first region in the expanded setting; a sensor detecting a change in a position of the input actuator and transmitting a signal in response to the change in position of the input actuator; and a processor in a display device coupled to the dynamic tactile interface, the processor processing the signal to generate an output.
  • 2. The dynamic tactile interface of claim 1, further comprising a displacement device coupled to an actuator, the displacement device transitioning the deformable region from a retracted setting into an expanded setting.
  • 3. The dynamic tactile interface of claim 2, wherein, in response to detecting the change in position of the input actuator, the volume of fluid is increased by the displacement device.
  • 4. The dynamic tactile interface of claim 1, wherein the processor renders a graphical user interface to correlate to the change in position of the input actuator based on the signal.
  • 5. The dynamic tactile interface of claim 1, wherein the processor infers the height of the deformable region based on the signal from the sensor.
  • 6. The dynamic tactile interface of claim 1, further including a first bladder fluidly coupled to the variable volume, the variable volume of fluid modified in response to the input actuator adjusting the fluidic volume of the first bladder.
  • 7. The dynamic tactile interface of claim 6, further including a second bladder, wherein the input actuator engages the first bladder at a first position and the second bladder at a second position.
  • 8. The dynamic tactile interface of claim 1, wherein the input actuator moves along a track.
  • 9. The dynamic tactile interface of claim 1, wherein the input actuator includes a dial.
  • 10. The dynamic tactile interface of claim 9, wherein the dial causes one or more bladders to become compressed when the dial is rotated.
  • 11. The dynamic tactile interface of claim 9, wherein the dial causes one or more valves to open or close when the dial is rotated.
  • 12. The dynamic tactile interface of claim 1, wherein the processor renders a first graphical user interface in correlation with a first position of the input actuator and a second graphical user interface in correlation with a second position of the input actuator.
  • 13. The dynamic tactile interface of claim 1, wherein the input actuator may travel in multiple directions.
  • 14. The dynamic tactile interface of claim 1, wherein the rendered graphical interface includes a keyboard.
  • 15. The dynamic tactile interface of claim 1, wherein the rendered graphical user interface is configured to reduce distortion caused by the deformable region in the expanded setting.
  • 16. The dynamic tactile interface of claim 1, wherein, in response to the signal, the rendered graphical user interface is modified to be easier to perceive through the deformable region in the expanded setting.
  • 17. The dynamic tactile interface of claim 15, wherein the rendered graphical user interface includes space between keys on a keyboard.
  • 18. The dynamic tactile interface of claim 1, wherein the processor renders a first graphical interface that correlates with one or more deformable regions and the processor subsequently renders a second graphical interface that correlates with the one or more deformable regions.
  • 19. The dynamic tactile interface of claim 1, wherein the dynamic tactile interface includes a second input actuator and a second deformable region, the second deformable region cooperating with the substrate to form a second variable volume filled with a second mass of fluid, the second variable volume fluidly coupled to a second fluid channel and transitioning from a retracted setting into an expanded setting in response to actuation of the second input actuator.
  • 20. The dynamic tactile interface of claim 1, further including a support member, the support member extending from the substrate into the cavity and flush with the upper surface of the substrate.
  • 21. The dynamic tactile interface of claim 20, wherein the support member forms one or more holes on the surface of the substrate.
  • 22. The dynamic tactile interface of claim 1, wherein the input actuator engages a valve to allow fluid motion to or from a bladder.
  • 23. The dynamic tactile interface of claim 22, wherein the input actuator includes a button that engages the valve.
  • 24. The dynamic tactile interface of claim 22, wherein the input actuator includes a dial that engages the valve.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims the benefit of U.S. Provisional Patent Application No. 62/045,145, filed on 3 Sep. 2014, which is incorporated in its entirety by this reference.

Related Publications (1)
Number Date Country
20160187982 A1 Jun 2016 US
Provisional Applications (1)
Number Date Country
62045145 Sep 2014 US