This invention relates generally to tactile user interfaces, and more specifically to new and useful mountable systems and methods for selectively raising portions of a surface of the user interface of a device.
The following description of the preferred embodiments of the invention is not intended to limit the invention to these preferred embodiments, but rather to enable any person skilled in the art to make and use this invention.
As shown in
The method S100 of the preferred embodiments is preferably applied to any suitable tactile interface layer that includes deformable regions. In particular, as shown in
The tactile interface layer 100 preferably functions to provide tactile guidance to a user when using a device that the tactile interface layer 100 is applied to. As shown in
The tactile interface layer 100 preferably includes a sensor that functions to detect the gesture of the user, for example, a capacitive sensor that functions to detect the motion of a finger of the user from the first location to the second location. Alternatively, in the variation of the tactile interface layer 100 as described above, a pressure sensor located within the fluid vessel may be used to detect changes in pressure within the fluid vessel to detect the motion of a finger of the user from the first location to the second location. Alternatively, the sensor may be a sensor included in the device to which the tactile interface layer 100 is applied; for example, the device may include a touch sensitive display onto which the tactile interface layer 100 is overlaid, and the gesture of the user may be detected using the sensing capabilities of the touch sensitive display. However, any other suitable gesture detection may be used.
Similarly, the tactile interface layer 100 preferably includes a processor that functions to interpret the detected gesture as a command. The processor preferably functions to discern between a gesture that is provided by the user as a command and a gesture that may be provided by the user but is not meant to be a command, for example, an accidental brush of the finger along the surface of the tactile interface layer 100. The processor may include a storage device that functions to store a plurality of gesture and command associations and/or user preferences for interpretations of gestures as commands. The processor may be any suitable type of processor and the storage device may be any suitable type of storage device, for example, a flash memory device, a hard drive, or any other suitable type. The processor and/or storage device may alternatively be a processor and/or storage device included in the device to which the tactile interface layer 100 is applied. However, any other suitable arrangement of the processor and/or storage device may be used.
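For illustration only, the following Python sketch shows one non-limiting way a processor might reject an accidental brush and map a deliberate motion to a stored command. The sample format, the thresholds, and the GESTURE_COMMANDS table are assumptions introduced here, not elements recited above, and the sketch deliberately labels any sufficiently long motion a "swipe".

```python
from math import hypot

# Hypothetical gesture-to-command associations of the kind the storage
# device might hold; the names are illustrative only.
GESTURE_COMMANDS = {
    "swipe": "expand_deformable_region",
    "pinch_close": "retract_deformable_region",
}

def interpret_touch_path(samples, min_travel=30.0, min_duration=0.05):
    """Interpret a list of (x, y, t) touch samples as a command, or return
    None for input judged to be an accidental brush."""
    if len(samples) < 2:
        return None
    (x0, y0, t0), (x1, y1, t1) = samples[0], samples[-1]
    travel = hypot(x1 - x0, y1 - y0)
    # Very short or very brief motion is treated as accidental contact.
    if travel < min_travel or (t1 - t0) < min_duration:
        return None
    return GESTURE_COMMANDS.get("swipe")
```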
As shown in
The gesture may be a single finger moving from the first location to the second location on the surface 115, as shown in
In a first variation of the gesture, as shown in Example A, the finger or fingers of a user move from a first location to a second location in a “swiping” motion. In a second variation, at least two of the fingers of the user move apart from each other in a “pinch open” motion, as shown in Example B. In other words, a first finger moves from a first location to a second location and a second finger moves from a third location to a fourth location, where the second and fourth locations are farther apart from each other than the first and third locations. A third variation of the gesture may be thought of as the opposite of the second variation, where at least two of the fingers of the user move together in a “pinch close” motion, as shown in Example C. In a fourth variation of the gesture, at least two fingers of the user may move in substantially the same direction in a “drag” motion, as shown in Example D. In other words, a first finger moves from a first location to a second location and a second finger moves from a third location substantially adjacent to the first location to a fourth location substantially adjacent to the second location. In this variation, the first and second fingers remain substantially equidistant from the beginning of the gesture to the end of the gesture. In a fifth variation, as shown in Example E, the first and second fingers also remain substantially equidistant from the beginning of the gesture to the end of the gesture. In this fifth variation, the first finger moves from a first location to a second location and the second finger moves from a third location to a fourth location along the surface by rotating about a point substantially midway between the first and third locations. In other words, the fingers of a user rotate about a center located substantially midway between the initial positions of the first and second fingers. While the gesture is preferably one of the variations as described above, the gesture may be any other suitable combination of the above variations and/or any other suitable type of gesture.
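As a non-limiting illustration of the two-finger variations described above, the following Python sketch classifies a gesture from the start and end positions of two fingers; the coordinate format and tolerance are assumptions introduced here.

```python
from math import hypot

def classify_two_finger_gesture(f1_start, f1_end, f2_start, f2_end, tol=0.15):
    """Classify a two-finger gesture from (x, y) start/end positions.
    Returns 'pinch_open', 'pinch_close', 'drag', or 'rotate'."""
    d_start = hypot(f2_start[0] - f1_start[0], f2_start[1] - f1_start[1])
    d_end = hypot(f2_end[0] - f1_end[0], f2_end[1] - f1_end[1])
    spread = (d_end - d_start) / max(d_start, 1e-6)
    if spread > tol:
        return "pinch_open"    # fingers move apart
    if spread < -tol:
        return "pinch_close"   # fingers move together
    # Finger spacing roughly constant: distinguish a parallel "drag" from a
    # rotation about the midpoint, where the fingers move in opposite senses.
    v1 = (f1_end[0] - f1_start[0], f1_end[1] - f1_start[1])
    v2 = (f2_end[0] - f2_start[0], f2_end[1] - f2_start[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return "drag" if dot > 0 else "rotate"
```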
As shown in
The command interpreted from the gesture along the surface 115 of the tactile interface layer is preferably one of the variations described above, but may alternatively be any suitable combination of the above variations or any other suitable type of command for the deformable region. In addition to a command for the deformable region, the gesture may also be interpreted as a command for the device, for example, when applied to a device that is a mobile phone, music player, or any other suitable device that outputs sound, the command may include a user command to change the volume of the sound output. Similarly, in a device that provides a visual output, the command may include a user command to change the brightness or any other suitable property of the visual output. However, any other suitable command for the device may be used.
Exemplary Interpretations of Gestures as Commands for the Deformable Region
The following include exemplary interpretations of particular gestures as commands for the deformable region and implementation of the command using the variation of the tactile interface layer 100 as described in U.S. application Ser. Nos. 11/969,848, 12/319,334, and 12/497,622, which are incorporated in their entireties by this reference.
In a first exemplary interpretation, as shown in
In a second exemplary interpretation, as shown in
In a third exemplary interpretation, as shown in
In a fourth exemplary interpretation, as shown in
In another aspect of the fourth exemplary interpretation, the user may dictate interaction between expanded cavities 125. For example, in the “dragging” example mentioned above, the user may “drag” an object along a path and particular regions 113 are expanded along the path. When an object is dragged over an existing deformed particular region 113, the object and the existing deformed particular region 113 may “react” based on actions of the user. For example, if the user pauses the dragging motion when the object is in the location of the existing deformed particular region 113, the deformed particular region 113 of the object and the existing deformed particular region 113 may seemingly “merge;” for example, the total surface area of the existing deformed particular region 113 may grow as if the deformed particular region 113 of the object were added into the existing deformed particular region 113, similar to the third exemplary interpretation. The user may then also drag the “merged” particular region 113 to a different location. Alternatively, the existing deformed particular region 113 and the deformed particular region 113 for the object may “repel” each other; for example, the object may represent a baseball bat and the existing deformed particular region 113 may represent a ball, and the user may “hit” the ball with the baseball bat, seemingly “repelling” the two deformed particular regions. Similarly, the user may perform a splitting motion on an existing deformed particular region 113 and the existing deformed particular region 113 may “split,” forming two distinct deformed particular regions 113. Each of the resulting two distinct deformed particular regions 113 is preferably of a smaller surface area than the original existing deformed particular region 113. An example of a splitting motion may be drawing two fingers apart substantially adjacent to the existing deformed particular region 113. However, any other suitable interaction between expanded cavities 125 may be implemented. While the active response to a command given by the user is preferably one of the examples described here, any other suitable active response may be used.
A fifth exemplary interpretation, as shown in
While the interpretation of the gesture as a command is preferably one of the variations described above, the active response may alternatively be a combination of the variations described above or any other suitable combination of gestures and commands.
As shown in
Generally, the second method S200 functions to predict a position of an upcoming input based on how a mobile computing device (e.g., a smartphone, a tablet, a PDA, personal music player, wearable device, watch, wristband, etc.) is held by a user and then to manipulate a dynamic tactile interface within the mobile computing device to yield a tactilely-distinguishable formation on the dynamic tactile interface proximal the predicted position of the upcoming input, a desired location of a button (i.e., input region), or shape of the dynamic tactile interface. Thus, the second method S200 can manipulate one or more deformable regions of a dynamic tactile interface within a mobile computing device to dynamically form tactilely-distinguishable formations on the mobile computing device, thereby improving convenience and ease of use of the mobile computing device.
In one example, while the mobile computing device is ‘locked,’ the second method S200 identifies that the mobile computing device is held in a portrait orientation in a user's left hand and thus transitions a deformable region over the top left quadrant (i.e., Quadrant II in Cartesian convention) of the display to define a physical “unlock” region adjacent a repositioned unlock slider rendered on the display. In this example, the second method S200 thus identifies how the mobile computing device is held and manipulates the dynamic tactile layer to place the physical unlock region in a position directly and naturally accessible by the user's left thumb, thus increasing the ease with which the user may unlock the mobile computing device. In this example, the second method S200 can also adjust the position of a key (e.g., graphic) rendered on the display to align with the physical unlock region. Furthermore, for an unlock region that defines an elongated ridge indicating a swipe input to unlock, the second method S200 can modify a required input swipe direction to accommodate the user's hand position over the mobile computing device. In this example, when the mobile computing device is held in a portrait orientation in the user's left hand, the second method S200 can set the swipe direction from right to left, whereas the second method S200 sets the swipe direction from left to right when the mobile computing device is held in a portrait orientation in the user's right hand.
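A minimal configuration sketch of this behavior follows (Python); the return values, quadrant labels, and portrait-only fallback are illustrative assumptions rather than limitations of the example above.

```python
def configure_unlock_region(hand, orientation):
    """Choose a display quadrant for the physical unlock region and a swipe
    direction based on how the device is held (Cartesian quadrant labels)."""
    if orientation != "portrait":
        return {"quadrant": "I", "swipe": "left_to_right"}  # assumed default
    if hand == "left":
        # Place the unlock ridge under the left thumb; swipe right to left.
        return {"quadrant": "II", "swipe": "right_to_left"}
    # Right hand: mirror the placement and swipe direction.
    return {"quadrant": "I", "swipe": "left_to_right"}
```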
In another example, while the mobile computing device is outputting audio (e.g., through headphones or through an internal speaker), the second method S200 identifies that the mobile computing device is held in a portrait orientation in a user's right hand and thus transitions a pair of deformable regions on the upper right region of the side of the mobile computing device into expanded settings to define a physical “volume up” key and a physical “volume down” key. In this example, the second method S200 thus identifies how the mobile computing device is held and manipulates the dynamic tactile layer to place physical volume adjustment regions in positions directly and naturally accessible by the user's right index finger, thus increasing the ease with which the user may adjust the volume output of the mobile computing device. In this example, the second method S200 can also render a “+” image key and a “−” image key near the perimeter of the display, proximal the physical “volume up” and “volume down” keys, to indicate control functions of the corresponding physical keys to the user.
In yet another example implementation, while the mobile computing device is in use (e.g., unlocked), the second method S200 determines the orientation of the mobile computing device relative to the horizon (e.g., portrait, landscape, 37° from horizontal) and transitions deformable regions within the dynamic tactile interface between expanded and retracted settings to maintain a physical “home” button proximal a current effective bottom center of the mobile computing device. Furthermore, in this example, the second method S200 can identify when the mobile computing device is rotated relative to the horizon and frequently update the position of the home button (e.g., a home button rendered on the display and a home button defined by a deformable region in the expanded setting), such as every five seconds or when the change in position of the mobile computing device exceeds a threshold position change while the mobile computing device is unlocked and in operation.
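For illustration, the following Python sketch updates a home-button position on the schedule described above; read_orientation and move_home_button are hypothetical callables standing in for the orientation sensor and the deformable-region/display drivers, and the polling period and angle threshold are assumed values.

```python
import time

def maintain_home_button(read_orientation, move_home_button,
                         poll=0.5, interval=5.0, threshold_deg=30.0):
    """Keep a physical home button proximal the current effective bottom
    center of the device, repositioning it every `interval` seconds or
    whenever the orientation change exceeds `threshold_deg` degrees."""
    last_angle = read_orientation()
    last_update = time.monotonic()
    move_home_button(last_angle)
    while True:
        time.sleep(poll)
        angle = read_orientation()
        now = time.monotonic()
        if now - last_update >= interval or abs(angle - last_angle) > threshold_deg:
            move_home_button(angle)
            last_angle, last_update = angle, now
```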
In another example implementation, once the mobile computing device is unlocked and a home screen with native applications is rendered on the display, the second method S200 accesses a user application history including frequency and duration of use of the native applications displayed on the home screen. The second method S200 subsequently manipulates a set of deformable regions, each adjacent a displayed native application key, with a deformable region adjacent a native application key corresponding to a highest-use native application transitioned to a highest expanded position and with a deformable region adjacent a native application key corresponding to a lowest-use native application transitioned to a lowest expanded position or retained in the retracted position. Thus, in this example, the second method S200 can adjust the height of various deformable regions adjacent native application keys displayed within a home screen on the mobile computing device according to a likelihood that the user will select each native application based on application selection history.
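One non-limiting way to map usage history to expansion heights is sketched below in Python; the maximum height and the linear scaling are assumptions introduced for illustration.

```python
def region_heights(usage_counts, max_height_mm=1.5):
    """Map native-application selection counts to expansion heights for the
    adjacent deformable regions: the most-used application key gets the
    tallest button, the least-used stays retracted (height 0)."""
    if not usage_counts:
        return {}
    lo, hi = min(usage_counts.values()), max(usage_counts.values())
    span = max(hi - lo, 1)
    return {app: max_height_mm * (count - lo) / span
            for app, count in usage_counts.items()}

# Example: counts {'mail': 40, 'camera': 25, 'notes': 10} yield heights
# {'mail': 1.5, 'camera': 0.75, 'notes': 0.0}.
```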
Block S210 of the second method S200 recites determining that the mobile computing device is held by the user. Furthermore, Block S220 of the second method S200 recites identifying a position of the mobile computing device in a hand of the user. Generally, Block S210 and Block S220 function to interface with one or more sensors on the mobile computing device to detect that the mobile computing device is being held and how the mobile computing device is being held. For example, Blocks S210 and/or S220 can interface with one or more capacitive, resistive, optical, or other touch sensors arranged about the mobile computing device, such as on and around the display, the side of the mobile computing device, and/or a back surface of the mobile computing device, to detect a finger or hand hovering over or in contact with the mobile computing device. Blocks S210 and/or S220 can additionally or alternatively interface with one or more heat sensors within the mobile computing device to detect a local temperature change across a surface of the device and to correlate the temperature change with a hand holding the mobile computing device, and/or interface with an accelerometer and/or a gyroscope to detect that the mobile computing device is being held, moved, and/or manipulated. For example, Block S210 can characterize accelerometer and/or gyroscope outputs as the mobile computing device being in a user's pocket while the user is walking, resting on a table or horizontal surface, or in a user's hand, etc. In another example, for the mobile computing device that is a wearable device (e.g., a smart wristband), Blocks S210 and S220 can interface with a heart rate sensor within the wearable device to detect the user's current heart rate, and the second method S200 can set a position of one or more deformable regions on the wearable device based on the user's current heart rate. Blocks S210 and S220 can similarly detect the user's current breathing rate or other vital sign, and the second method S200 can set a position of one or more deformable regions on the wearable device accordingly. Blocks S210 and S220 can additionally or alternatively interface with one or more bio-sensors integrated into the wearable device (or other computing device) to identify a user who is holding the wearable device based on a bio-signature output from the bio-sensor, and Blocks S210 and S220 can thus adjust a position of one or more deformable regions (e.g., a location, a height, a firmness, and/or a unique gesture definition related to a deformable region) according to a preference of the identified user.
Block S220 can thus compare sensed touch areas to a touch area model to characterize a touch sensor output as a left hand or a right hand holding the mobile computing device in a portrait, landscape, or other orientation. Block S220 can similarly compare sensed heat areas to a heat area model to characterize a temperature sensor output as a left or right hand holding the mobile computing device in a portrait, landscape, or other orientation. Block S220 can also determine how the mobile computing device is held, such as by one or both hands of the user, based on how text or other inputs are entered into the mobile computing device, and Block S220 can further verify such characterization of user inputs substantially in real-time based on accelerometer and/or gyroscope data collected by sensors in the mobile computing device.
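The Python sketch below is a deliberately coarse stand-in for the touch area model described above; a practical model would be fit to measured grip data, and the left/right heuristic and coordinate convention here are assumptions for illustration only.

```python
def classify_holding_hand(touch_points, device_width):
    """Coarse grip classifier: infer the holding hand from where edge and
    back-surface contact points cluster. touch_points are (x, y) tuples in
    device coordinates with x increasing to the right."""
    if not touch_points:
        return None
    left_count = sum(1 for x, _ in touch_points if x < device_width / 2)
    # Assumption: the palm produces most contact points on the side of the
    # holding hand, so a left-heavy cluster suggests a left-hand grip.
    return "left" if left_count > len(touch_points) / 2 else "right"
```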
Blocks S210 and S220 can additionally or alternatively implement machine vision and/or machine learning to identify a face, body, clothing feature, etc. in a field of view of a (forward-facing) camera within the mobile computing device and thus determine that the mobile computing device is held and how the mobile computing device is held based on the identified face, body, clothing feature, etc. For example, Block S210 can implement facial recognition to determine that the mobile computing device is currently held, and Block S220 can implement face tracking to predict which hand the user is using to hold the mobile computing device. Blocks S210 and S220 can additionally or alternatively interface with a rear-facing camera within the mobile computing device to identify a hand (e.g., left or right) holding the mobile computing device. Blocks S210 and S220 can similarly identify a hand shape or hand motion (i.e., gesture) in a field of view of a camera within the mobile computing device (and not touching the mobile computing device), and subsequent Blocks of the second method S200 can set a deformable region position according to the identified hand shape or gesture.
Blocks S210 and S220 can additionally or alternatively determine if the mobile computing device is worn, in use, in a particular location, or in an “ON” or “unlocked” state. For example, the second method S200 can selectively expand and retract one or more side, back, or on-screen deformable regions based on location data of the mobile computing device determined in Blocks S210 and S220 through a location (e.g., GPS) sensor within the mobile computing device. In this example, the second method S200 can thus selectively control the position of various deformable regions based on whether the user is at home or in his car, which application is running on the mobile computing device, etc.
However, Block S210 and Block S220 can function in any other way to determine that the mobile computing device is being held and to characterize how the mobile computing device is held.
Block S230 of the second method S200 recites predicting a location of a future input into the mobile computing device, the location proximal the deformable region. Generally, Block S230 functions to predict a location of an upcoming input based on how the mobile computing device is held (e.g., orientation of the mobile computing device, which hand(s) the user is using to hold the mobile computing device). In an example similar to that described above, when the mobile computing device is “locked” and Blocks S210 and S220 determine that the user has picked up the mobile computing device with his left hand and is holding the mobile computing device in a portrait configuration, Block S230 can predict an upcoming input to include an “unlock” gesture. In this example, Block S230 can also predict a convenient or preferred unlock input to move from Quadrant I of the display (the current top-right quadrant) to Quadrant II of the display (the current top-left quadrant) based on the holding hand and orientation determined in Blocks S210 and S220. Block S230 can thus predict the upcoming input and a preferred location for the upcoming input.
In another example similar to that described above, when the mobile computing device is outputting sound, such as through a headphone stereo jack or internal speaker, and Blocks S210 and S220 determine that the user is holding the mobile computing device in his right hand in a portrait configuration, Block S230 can predict an upcoming input to include either of a “volume up” gesture and a “volume down” gesture. In this example, Block S230 can also predict convenient or preferred “volume up” and “volume down” input regions to lie off the display on an upper left lateral side of the mobile computing device such that the user's right index finger falls substantially naturally on the “volume up” and “volume down” input regions. Block S230 can thus predict the upcoming input and a preferred or convenient location for the upcoming input based on the holding position of the mobile computing device determined in Blocks S210 and S220.
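As a non-limiting sketch of this prediction step in Python, the mapping below places off-display volume keys under the index finger of the holding hand; the side labels and the portrait-only handling are assumptions introduced here.

```python
def place_volume_keys(hand, orientation):
    """Choose an off-display location for physical volume keys so that they
    fall naturally under the index finger of the holding hand."""
    if orientation == "portrait" and hand == "right":
        # Right index finger wraps around the upper left lateral side.
        return {"side": "left", "position": "upper",
                "keys": ("volume_up", "volume_down")}
    if orientation == "portrait" and hand == "left":
        # Mirror the placement for a left-hand grip.
        return {"side": "right", "position": "upper",
                "keys": ("volume_up", "volume_down")}
    return None  # no prediction for other holding positions in this sketch
```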
Block S240 of the second method S200 recites transitioning a deformable region from the retracted setting to the expanded setting. Generally, Block S240 functions to control the displacement device to displace fluid through the fluid channel to transition the deformable region from the retracted setting to the expanded setting. Block S240 can control one or more valves and/or one or more displacement devices within the mobile computing device to selectively expand and/or retract a particular subset of deformable regions, as described above or as described in U.S. patent application Ser. No. 12/319,334, filed on 5 Jan. 2009, which is incorporated in its entirety by this reference.
Therefore, the second method S200 can function to predict a future input and/or a preferred or convenient location for a future input and manipulate a deformable region on the mobile computing device to define a tangible button accordingly. The second method S200 can manipulate one or more deformable regions over a display within the mobile computing device (i.e., an on-screen physical button) and/or one or more deformable regions remote from the display (i.e., an off-screen physical button). As described above, the second method S200 can therefore control one or more valves, displacement devices, etc. to form a physical volume up button, volume down button, lock button, unlock button, ringer or vibrator state button, home button, camera shutter button, and/or application selection button, etc. on the mobile computing device. The second method S200 can further manage outputs from a touch sensor to handle user inputs into selectively formed buttons, and the second method S200 can also interface with a display driver to render visual input region identifiers adjacent (i.e., under) on-screen buttons and/or to render visual input identifiers near or pointing to off-screen buttons. For example, the second method S200 can detect a first gesture, selectively adjust the position of a particular deformable region accordingly, detect a subsequent gesture, assign a particular output type to the particular deformable region, and then generate an output of the particular output type when the particular deformable region is subsequently selected by the user. However, the second method S200 can function in any other way to estimate how the mobile computing device is held, to predict a type and/or location of a future input, and to manipulate a vertical position of one or more deformable regions according to the predicted type and/or location of the future input.
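The following Python sketch strings the Blocks together into one non-limiting flow; every method on the `device` object is a hypothetical stand-in for the sensor, display-driver, and displacement-device interfaces described above, not an API recited by this disclosure.

```python
def s200_flow(device):
    """Sketch of the overall second method S200 flow on a hypothetical
    device object."""
    if not device.is_held():                              # Block S210
        return
    hand, orientation = device.holding_position()         # Block S220
    prediction = device.predict_input(hand, orientation)  # Block S230, e.g.
    #   {"location": (x, y), "label": "unlock", "output": "unlock_event"}
    region = device.select_region(prediction["location"])
    device.expand(region)                                 # Block S240
    device.render_key(prediction["label"], near=region)   # align the graphic
    if device.wait_for_press(region):                     # handle the input
        device.emit(prediction["output"])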
An example of method S200 includes detecting an ongoing phone call on a mobile phone with a touchscreen or other sound output through a speaker of the mobile phone. Method S200 can further detect the orientation of the phone by detecting the touchscreen proximal and/or contacting an ear of the user, such as when the user holds the mobile phone up to the ear during the ongoing phone call. In response, method S200 can select and expand a deformable region corresponding to the ear and the speaker such that the deformable region forms an earpiece. Thus, method S200 can expand the earpiece to conform to the ear and focus sound output from the speaker toward the ear for improved hearing.
As shown in
As shown in
Generally, method S300 functions to register an implicit event associated with an input, define a command for the dynamic tactile interface in response to the implicit event, and, in response to the command, modify the dynamic tactile interface according to an anticipated future input to the dynamic tactile interface. In particular, method S300 functions to correlate spatial orientation of the device and a native application executing on the device with a configuration of deformable regions of the dynamic tactile interface.
The dynamic tactile interface can further include a display coupled to the substrate opposite the tactile layer and displaying an image of a key substantially aligned with the deformable region and/or a touch sensor coupled to the substrate and outputting a signal corresponding to an input on a tactile surface of the tactile layer adjacent the deformable region. The dynamic tactile interface can also include a housing that transiently engages a (mobile) computing device and transiently retains the substrate over a digital display of the (mobile) computing device.
Generally, the dynamic tactile interface can be implemented within or in conjunction with a computing device to provide tactile guidance to a user entering input selections through a touchscreen or other illuminated surface of the computing device. In particular, the dynamic tactile interface defines one or more deformable regions of a tactile layer that can be selectively expanded and retracted to intermittently provide tactile guidance to a user interacting with the computing device. In one implementation, the dynamic tactile interface is integrated into or applied over a touchscreen of a mobile computing device, such as a smartphone or a tablet. For example, the dynamic tactile interface can include a set of round or rectangular deformable regions, wherein each deformable region is substantially aligned with a virtual key of a virtual keyboard rendered on a display integrated into the mobile computing device, and wherein each deformable region in the set mimics a physical hard key when in an expanded setting. However, in this example, when the virtual keyboard is not rendered on the display of the mobile computing device, the dynamic tactile interface can retract the set of deformable regions to yield a substantially uniform (e.g., flush) tactile surface yielding reduced optical distortion of an image rendered on the display. In another example, the dynamic tactile interface can include an elongated deformable region aligned with a virtual ‘swipe-to-unlock’ input region rendered on the display such that, when in the expanded setting, the elongated deformable region provides tactile guidance for a user entering an unlock gesture into the mobile computing device. Once the mobile computing device is unlocked responsive to the swipe gesture suitably aligned with the virtual input region, the dynamic tactile interface can transition the elongated deformable region back to the retracted setting to yield a uniform surface over the display.
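A minimal Python sketch of the keyboard example follows; expand_region and retract_region are hypothetical actuation callables standing in for the displacement-device control, and in practice transitions might be batched through a shared pump rather than issued per region.

```python
def sync_keyboard_regions(keyboard_visible, expand_region, retract_region,
                          key_regions):
    """Expand the deformable regions aligned with virtual keys while the
    virtual keyboard is rendered; retract them to a flush surface otherwise."""
    for region in key_regions:
        if keyboard_visible:
            expand_region(region)   # raise a hard-key-like formation
        else:
            retract_region(region)  # restore a uniform, low-distortion surface
```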
The dynamic tactile interface can alternatively embody an aftermarket device that adds tactile functionality to an existing computing device. For example, the dynamic tactile interface can include a housing that transiently engages an existing (mobile) computing device and transiently retains the substrate over a digital display of the computing device. The displacement device of the dynamic tactile interface can thus be manually or automatically actuated to transition the deformable region(s) of the tactile layer between expanded and retracted settings.
Generally, Block S310 detects an orientation of the device. In particular, Block S310 can interface with a sensor incorporated into the device (e.g., a touch sensor, an optical sensor, an accelerometer, a Global Positioning System receiver, etc.) to detect the orientation of the device relative to an external surface or body. For example, Block S310 can interface with an accelerometer built into the device to detect the orientation of a mobile phone relative to a horizontal surface. The mobile phone can be oriented in a portrait orientation, such that a minor axis of the device can be substantially parallel to the horizontal surface. Likewise, the device can be oriented in a landscape orientation, such that the major axis of the device can be substantially parallel to the horizontal surface. Alternatively, Block S310 can detect the device in any other orientation with any other sensor suitable for detecting orientation of the device. For example, Block S310 can detect, with an optical sensor, a display of the device resting on a horizontal surface. Block S310 can further detect the position of the device relative to an external surface and/or object. In another example, Block S310 can detect an input object (e.g., a finger) resting on a surface of the device. Block S310 can detect the input object with a sensor, such as a capacitive, resistive, and/or optical sensor.
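For illustration, the Python sketch below classifies portrait versus landscape from accelerometer components along the device's minor and major axes; the 45-degree decision boundary is an assumption introduced here.

```python
from math import atan2, degrees

def classify_orientation(ax, ay):
    """Classify device orientation from accelerometer readings along the
    device's minor axis (ax) and major axis (ay), in any consistent units.
    Gravity mostly along the major axis leaves the minor axis roughly
    horizontal (portrait); gravity mostly along the minor axis leaves the
    major axis roughly horizontal (landscape)."""
    angle = degrees(atan2(abs(ax), abs(ay)))  # 0 deg: gravity along major axis
    return "portrait" if angle < 45.0 else "landscape"
```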
Generally, Block S320 predicts a location of an upcoming input related to a native application executing on the device. In particular, Block S320 can predict a particular input at a particular location in response to execution of the native application, such as a contact with a surface of the device at the particular location. For example, Block S320 can identify a future input defined by a contact by an input object (e.g., a finger) on a portion of the touchscreen of the computing device corresponding to a virtual image rendered by the touchscreen.
Generally, Block S330 selects a particular deformable region from a set of deformable regions, the particular deformable region corresponding to the anticipated input and adjacent the input location. In particular, Block S330 can select the particular deformable region adjacent or arranged over the input location. Block S330 can select a particular deformable region with a shape substantially corresponding to the anticipated input. For example, if the anticipated input includes a slide gesture across the tactile surface, Block S330 can select a particular deformable region that forms an elongated and elevated button, such that the user can slide a finger across the expanded deformable region to enter the gesture into the device. Alternatively, Block S330 can select a set of particular deformable regions from the set of deformable regions, such that the set of particular deformable regions cooperatively correspond to the anticipated input.
Generally, Block S340 selectively transitions the particular deformable region from a retracted setting substantially flush with the peripheral region to an expanded setting tactilely distinguishable from the peripheral region. In particular, Block S340 can transition the particular deformable region(s) by displacing fluid from a fluid vessel into a cavity arranged under the deformable region. The dynamic tactile interface can include a tactile layer and a substrate, the tactile layer including a deformable region and a peripheral region adjacent the deformable region and coupled to the substrate opposite the tactile surface, the substrate defining a fluid channel and cooperating with the deformable region to define a cavity filled with fluid. A displacement device (e.g., a pump) fluidly coupled to the fluid channel can displace fluid between the cavity and a reservoir fluidly coupled to the displacement device, thereby transitioning the deformable region between an expanded setting substantially elevated above the peripheral region and a retracted setting substantially flush with the peripheral region. Generally, the tactile layer can define one or more deformable regions operable between the expanded and retracted settings to intermittently define tactilely distinguishable formations over a surface, such as over a touch-sensitive digital display (e.g., a touchscreen), as described in U.S. patent application Ser. No. 13/414,589. Thus, the displacement device can transition the deformable region into the expanded setting by displacing fluid from the fluid vessel into the cavity. Method S300 can additionally or alternatively transition the particular deformable region(s) using electromechanical actuation. For example, method S300 can be implemented with a “snap dome” deformable region.
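The Python sketch below illustrates Block S340-style actuation; `pump` and `valves` are hypothetical driver objects for the displacement device and per-cavity valves, and the displaced volume is an assumed figure rather than a value taken from this disclosure.

```python
def transition_region(pump, valves, region_id, target="expanded", volume_ul=50):
    """Open the valve for the selected cavity and run the displacement device
    to move fluid between the reservoir and the cavity, expanding or
    retracting the corresponding deformable region."""
    valves.open(region_id)
    if target == "expanded":
        pump.displace(volume_ul)    # reservoir -> cavity raises the region
    else:
        pump.displace(-volume_ul)   # cavity -> reservoir flattens the region
    valves.close(region_id)
```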
Generally, Block S350 detects an input, corresponding to the anticipated input, to the particular deformable region. In particular, Block S350 detects an input at a sensor, such as a touch sensor integrated in a touchscreen display of the mobile computing device (e.g., a capacitive, resistive, or optical touch sensor). Alternatively, Block S350 can detect the input at a pressure sensor by detecting a change in pressure of the fluid in the cavity. An increase in pressure of the fluid in the cavity corresponds to depression of the deformable region into the cavity and, thus, an input to the dynamic tactile interface.
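As a non-limiting illustration of the pressure-based detection described above, the following Python sketch treats a rise in cavity pressure over the expanded-setting baseline as a press; the threshold value is an assumption.

```python
def detect_press(pressure_now_kpa, baseline_kpa, threshold_kpa=2.0):
    """Return True if the cavity fluid pressure has risen far enough above
    the expanded-setting baseline to indicate depression of the deformable
    region, i.e., an input to the dynamic tactile interface."""
    return (pressure_now_kpa - baseline_kpa) >= threshold_kpa
```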
Generally, method S300 functions to register interaction with the dynamic tactile interface by detecting an orientation of the device in Block S310; identifying an anticipated input corresponding to a native application currently executing on the device, the anticipated input associated with an input location of the device, in Block S320; selecting a particular deformable region from a set of deformable regions, the particular deformable region corresponding to the anticipated input and adjacent the input location, in Block S330; selectively transitioning the particular deformable region from a retracted setting substantially flush with the peripheral region to an expanded setting tactilely distinguishable from the peripheral region in Block S340; and detecting an input, corresponding to the anticipated input, to the particular deformable region in Block S350.
One example of method S300 includes detecting a mobile phone held by a user in a landscape orientation in Block S310. Block S310 can detect the mobile phone held by two hands of the user, the mobile phone situated between a thumb and an index finger of each hand as shown in
In a similar example, method S300 can include detecting the orientation of the mobile phone (e.g., in a portrait orientation) in Block S310. Block S320 can detect a camera application executing on the mobile phone, the camera application capturing an image detected by a forward-facing camera built into a face of the mobile phone proximal the display. Block S320 can anticipate an input, such as selection of a virtual shutter button in order to capture the image with the forward-facing camera (i.e., a “selfie”) as shown in
In another example, method S300 can include detecting the orientation of the mobile computing device with an accelerometer or other orientation-detecting sensor. Block S310 of method S300 can detect the minor axis of the mobile computing device substantially parallel to a horizontal plane, thereby defining a portrait orientation. Block S310 of method S300 can also detect the major axis of the mobile phone substantially parallel to a horizontal plane, thereby defining a landscape orientation as shown in
In another example shown in
In another example, method S300 can detect an input object proximal a surface of the device, and, upon detection of the input object contacting the device, method S300 can expand the particular deformable region coincident the input object. Method S300 can identify an anticipated input corresponding to a command to wake a “sleeping” device (e.g., a device in a low energy mode). For example, method S300 can anticipate depression of a wake button on the “sleeping” device. The “sleeping” device can be powered on (e.g., consuming energy from a battery and executing programs) but a touchscreen of the device can be disabled until the command to wake the “sleeping” device enables the touchscreen. Method S300 can detect the input object proximal or coincident a surface of the device. For example, method S300 can detect a hand or finger resting on the device as would occur if one were to hold the device in the hand. Accordingly, method S300 can select the particular deformable region coincident or adjacent the input object and selectively expand the particular deformable region. Method S300 can detect depression of the particular deformable region and interpret depression of the particular deformable region as a command to wake the “sleeping” device accordingly.
An example of this variation includes expanding a deformable region corresponding to an icon indicating receipt of an incoming message as shown in
Another example of the variation includes expanding a particular deformable region in response to an incoming phone call, the particular deformable region corresponding to an anticipated input that answers the incoming phone call. In particular, method S300 can detect an incoming phone call and, thus, render a notification on the display to notify the user of the incoming phone call. For example, method S300 can render a virtual icon on a touchscreen of the device to prompt the user to answer the phone call. Additionally, method S300 can selectively expand a particular deformable region arranged over the virtual icon. Alternatively, method S300 can select and expand a particular deformable region corresponding to an anticipated input location, such as a surface of the device where an input object (e.g., the user's finger) is in contact with the device prior and up to the time of the incoming phone call. Thus, the method can raise a particular deformable region adjacent a surface of the device that the user is already touching, and the user can answer the phone call by depressing the particular deformable region thus raised under or adjacent the user's finger.
In another example, method S300 can detect an external surface, such as a surface on which the device rests, and selectively deform the particular deformable region(s) opposite the external surface. For example, a mobile phone can rest on a surface of a table with the touchscreen of the mobile phone contacting the surface of the table. Method S300 can detect the surface of the table proximal the touchscreen. In response to receipt of an incoming phone call, method S300 can identify a notification notifying the user of the phone call, a location of the notification corresponding to a surface of the mobile phone opposite the external surface (e.g., the back of the phone), and an anticipated input corresponding to answering the incoming phone call. Thus, method S300 can select the particular deformable region corresponding to the location of the notification opposite the external surface (e.g., the back of the phone) and transition the deformable region to an expanded setting, thereby indicating the incoming phone call and providing a tactile feature on which a user can apply the anticipated input.
Another example of the variation includes expanding the particular deformable region corresponding to an icon representing a local area wireless technology or short-range wireless communication rendered by the touchscreen of the mobile computing device in response to short-range wireless communication (e.g., Bluetooth) between the mobile computing device and a secondary device, as shown in
In another example of the variation, method S300 can retract the deformable region(s) and disable input(s) to the mobile computing device in response to receipt of a signal from a third party device indicating that the mobile computing device was lost or stolen. In particular, method S300 can detect a phone tracking application executing on the mobile computing device. Method S300 can detect a message from a third party device indicating that the owner of the mobile computing device no longer possesses the mobile computing device. Thus, with the phone tracking application, method S300 tracks location and can disable interactive features of the mobile computing device. Method S300 can disable inputs and outputs to the mobile computing device. Thus, method S300 can selectively transition expanded deformable regions to the retracted setting.
The systems and methods of the embodiments can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions. The instructions can be executed by computer-executable components integrated with the application, applet, host, server, network, website, communication service, communication interface, native application, frame, iframe, hardware/firmware/software elements of a user computer or mobile device, or any suitable combination thereof. Other systems and methods of the embodiments can be embodied and/or implemented at least in part as a machine configured to receive a computer-readable medium storing computer-readable instructions, the instructions executed by computer-executable components integrated with apparatuses and networks of the type described above. The computer-readable instructions can be stored on any suitable computer-readable media such as RAMs, ROMs, flash memory, EEPROMs, optical devices (CD or DVD), hard drives, floppy drives, or any suitable device. The computer-executable component can be a processor, though any suitable dedicated hardware device can (alternatively or additionally) execute the instructions.
As a person skilled in the art will recognize from the previous detailed description and from the figures and claims, modifications and changes can be made to the preferred embodiments of the invention without departing from the scope of this invention defined in the following claims.
This application claims the benefit of U.S. Provisional Application No. 61/871,264, filed on 28 Aug. 2013, which is incorporated in its entirety by this reference. This application is related to U.S. application Ser. No. 11/969,848, filed on 4 Jan. 2008; U.S. application Ser. No. 12/319,334, filed on 5 Jan. 2009; U.S. application Ser. No. 12/497,622, filed on 3 Jul. 2009, which are all incorporated in their entirety by this reference.
20090106655 | Grant et al. | Apr 2009 | A1 |
20090115733 | Ma et al. | May 2009 | A1 |
20090115734 | Fredriksson et al. | May 2009 | A1 |
20090128376 | Caine et al. | May 2009 | A1 |
20090128503 | Grant et al. | May 2009 | A1 |
20090129021 | Dunn | May 2009 | A1 |
20090132093 | Arneson et al. | May 2009 | A1 |
20090135145 | Chen et al. | May 2009 | A1 |
20090140989 | Ahlgren | Jun 2009 | A1 |
20090160813 | Takashima et al. | Jun 2009 | A1 |
20090167508 | Fadell et al. | Jul 2009 | A1 |
20090167509 | Fadell et al. | Jul 2009 | A1 |
20090167567 | Halperin et al. | Jul 2009 | A1 |
20090167677 | Kruse et al. | Jul 2009 | A1 |
20090167704 | Terlizzi et al. | Jul 2009 | A1 |
20090174673 | Ciesla | Jul 2009 | A1 |
20090174687 | Ciesla et al. | Jul 2009 | A1 |
20090181724 | Pettersson | Jul 2009 | A1 |
20090182501 | Fyke et al. | Jul 2009 | A1 |
20090195512 | Pettersson | Aug 2009 | A1 |
20090207148 | Sugimoto et al. | Aug 2009 | A1 |
20090215500 | You et al. | Aug 2009 | A1 |
20090231305 | Hotelling et al. | Sep 2009 | A1 |
20090243998 | Wang | Oct 2009 | A1 |
20090250267 | Heubel et al. | Oct 2009 | A1 |
20090289922 | Henry | Nov 2009 | A1 |
20090303022 | Griffin et al. | Dec 2009 | A1 |
20090309616 | Klinghult | Dec 2009 | A1 |
20100043189 | Fukano | Feb 2010 | A1 |
20100045613 | Wu et al. | Feb 2010 | A1 |
20100073241 | Ayala et al. | Mar 2010 | A1 |
20100078231 | Yeh et al. | Apr 2010 | A1 |
20100079404 | Degner et al. | Apr 2010 | A1 |
20100097323 | Edwards et al. | Apr 2010 | A1 |
20100103116 | Leung et al. | Apr 2010 | A1 |
20100103137 | Ciesla et al. | Apr 2010 | A1 |
20100109486 | Polyakov et al. | May 2010 | A1 |
20100121928 | Leonard | May 2010 | A1 |
20100141608 | Huang et al. | Jun 2010 | A1 |
20100142516 | Lawson et al. | Jun 2010 | A1 |
20100162109 | Chatterjee et al. | Jun 2010 | A1 |
20100171719 | Craig et al. | Jul 2010 | A1 |
20100171720 | Craig et al. | Jul 2010 | A1 |
20100177050 | Heubel et al. | Jul 2010 | A1 |
20100182245 | Edwards et al. | Jul 2010 | A1 |
20100232107 | Dunn | Sep 2010 | A1 |
20100237043 | Garlough | Sep 2010 | A1 |
20100295820 | Kikin-Gil | Nov 2010 | A1 |
20100296248 | Campbell et al. | Nov 2010 | A1 |
20100298032 | Lee et al. | Nov 2010 | A1 |
20100302199 | Taylor et al. | Dec 2010 | A1 |
20100321335 | Lim et al. | Dec 2010 | A1 |
20110001613 | Ciesla et al. | Jan 2011 | A1 |
20110011650 | Klinghult | Jan 2011 | A1 |
20110012851 | Ciesla et al. | Jan 2011 | A1 |
20110018813 | Kruglick | Jan 2011 | A1 |
20110029862 | Scott et al. | Feb 2011 | A1 |
20110043457 | Oliver et al. | Feb 2011 | A1 |
20110060998 | Schwartz et al. | Mar 2011 | A1 |
20110074691 | Causey et al. | Mar 2011 | A1 |
20110120784 | Osoinach et al. | May 2011 | A1 |
20110148793 | Ciesla et al. | Jun 2011 | A1 |
20110148807 | Fryer | Jun 2011 | A1 |
20110157056 | Karpfinger | Jun 2011 | A1 |
20110157080 | Ciesla et al. | Jun 2011 | A1 |
20110163978 | Park et al. | Jul 2011 | A1 |
20110175838 | Higa | Jul 2011 | A1 |
20110175844 | Berggren | Jul 2011 | A1 |
20110181530 | Park et al. | Jul 2011 | A1 |
20110193787 | Morishige et al. | Aug 2011 | A1 |
20110194230 | Hart et al. | Aug 2011 | A1 |
20110241442 | Mittleman et al. | Oct 2011 | A1 |
20110254672 | Ciesla et al. | Oct 2011 | A1 |
20110254709 | Ciesla et al. | Oct 2011 | A1 |
20110254789 | Ciesla et al. | Oct 2011 | A1 |
20120032886 | Ciesla et al. | Feb 2012 | A1 |
20120038583 | Westhues et al. | Feb 2012 | A1 |
20120043191 | Kessler et al. | Feb 2012 | A1 |
20120056846 | Zaliva | Mar 2012 | A1 |
20120062483 | Ciesla et al. | Mar 2012 | A1 |
20120080302 | Kim et al. | Apr 2012 | A1 |
20120098789 | Ciesla et al. | Apr 2012 | A1 |
20120105333 | Maschmeyer et al. | May 2012 | A1 |
20120120357 | Jiroku | May 2012 | A1 |
20120154324 | Wright et al. | Jun 2012 | A1 |
20120193211 | Ciesla et al. | Aug 2012 | A1 |
20120200528 | Ciesla et al. | Aug 2012 | A1 |
20120200529 | Ciesla et al. | Aug 2012 | A1 |
20120206364 | Ciesla et al. | Aug 2012 | A1 |
20120218213 | Ciesla et al. | Aug 2012 | A1 |
20120218214 | Ciesla et al. | Aug 2012 | A1 |
20120223914 | Ciesla et al. | Sep 2012 | A1 |
20120235935 | Ciesla et al. | Sep 2012 | A1 |
20120242607 | Ciesla et al. | Sep 2012 | A1 |
20120306787 | Ciesla et al. | Dec 2012 | A1 |
20130019207 | Rothkopf et al. | Jan 2013 | A1 |
20130127790 | Wassvik | May 2013 | A1 |
20130141118 | Guard | Jun 2013 | A1 |
20130215035 | Guard | Aug 2013 | A1 |
20140043291 | Ciesla et al. | Feb 2014 | A1 |
20140160063 | Yairi et al. | Jun 2014 | A1 |
20140160064 | Yairi et al. | Jun 2014 | A1 |
20140210789 | Ciesla | Jul 2014 | A1 |
Number | Date | Country
---|---|---
1260525 | Jul 2000 | CN |
1530818 | Sep 2004 | CN |
1882460 | Dec 2006 | CN |
H10255106 | Sep 1998 | JP
2006268068 | Oct 2006 | JP |
2006285785 | Oct 2006 | JP |
2009064357 | Mar 2009 | JP |
20000010511 | Feb 2000 | KR |
100677624 | Jan 2007 | KR |
2004028955 | Apr 2004 | WO |
2008037275 | Apr 2008 | WO |
2009002605 | Dec 2008 | WO |
2009044027 | Apr 2009 | WO |
2009067572 | May 2009 | WO |
2009088985 | Jul 2009 | WO |
2010077382 | Jul 2010 | WO |
2010078596 | Jul 2010 | WO |
2010078597 | Jul 2010 | WO |
2011003113 | Jan 2011 | WO |
2011087816 | Jul 2011 | WO |
2011087817 | Jul 2011 | WO |
2011112984 | Sep 2011 | WO |
2011133604 | Oct 2011 | WO |
2011133605 | Oct 2011 | WO |
Entry |
---|
“Sharp Develops and Will Mass Produce New System LCD with Embedded Optical Sensors to Provide Input Capabilities Including Touch Screen and Scanner Functions,” Sharp Press Release, Aug. 31, 2007, 3 pages, downloaded from the Internet at: http://sharp-world.com/corporate/news/070831.html. |
Jeong et al., “Tunable Microdoublet Lens Array,” Optical Society of America, Optics Express, vol. 12, No. 11, May 31, 2004, 7 pages. |
Preumont, A., Vibration Control of Active Structures: An Introduction, Jul. 2011. |
Essilor, “Ophthalmic Optic Files Materials,” Essilor International, Ser 145 Paris France, Mar. 1997, pp. 1-29, [retrieved on Nov. 18, 2014]. Retrieved from the Internet: <http://www.essiloracademy.eu/sites/default/files/9.Materials.pdf>. |
Lind, “Two Decades of Negative Thermal Expansion Research: Where Do We Stand?” Department of Chemistry, the University of Toledo, Materials 2012, 5, 1125-1154; doi:10.3390/ma5061125, Jun. 20, 2012, pp. 1125-1154, [retrieved on Nov. 18, 2014]. Retrieved from the Internet: <https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=materials-05-01125.pdf>. |
Number | Date | Country
---|---|---
20150077364 A1 | Mar 2015 | US |
Number | Date | Country
---|---|---
61871264 | Aug 2013 | US |