Wearable electronic devices such as smart watches, smart glasses, and wristbands may have displays and user interfaces. In use, the user typically moves some part of his body to bring the wearable device into a usable position. For example, a wrist watch is generally used with the user orienting his wrist so that the face of the watch comes into view. Such user movements may not be convenient.
A user interface may have a tactile, i.e., haptic, component, so that it may be perceived through the sense of touch. The simplest sort of haptic element is an ordinary protruding button made of a rigid material like metal or hard plastic. In this sense, the keys of a typewriter, with their typically deep range of travel, may be seen as having a tactile component. Such an approach may not be suitable for integration into modern displays, where a much lower and less obtrusive profile is called for, in addition to optical considerations. Even so, the tactile value of having protruding keys may be substantial.
The various advantages of the embodiments will become apparent to one skilled in the art by reading the following specification and appended claims, and by referencing the following drawings, in which:
As used herein, the term “tactile element” may refer to an element providing tactile feel when it is used, i.e., touched or depressed. In general terms, when a tactile element has been activated, it may assume a characteristic that enables a user to differentiate it from the surrounding area. In one embodiment, this may be accomplished by extending a portion of the element beyond its immediate neighboring area when activated. In one embodiment, the tactile element may include a diaphragm overlying a cavity that may be filled with a fluid, causing the diaphragm to bulge outwardly when the tactile element is energized, giving it a bubble-like “button” shape and feel to the touch. When the tactile element is de-activated and returned to its resting state, fluid flows out of the cavity, causing it to deflate, and the tactile element then has a feel that is largely the same as the surrounding area of the device.
Tactile elements may be grouped into rows, columns, arrays, concentric circles, or any other shape that is suitable for the embodiment in which they are used. They may range in size from sub-millimeter dimensions to a centimeter or more. In embodiments in which fluidic inflation is used, they are generally round and may herein be referred to as “buttons”. In other embodiments they may be rectangular, pin-like, or have any other shape.
Several embodiments disclosed herein are presented in the context of wearable devices. As used herein, the term “wearable device” (or simply a “wearable”) includes clothing and accessories incorporating computer or other such electronic technologies. Examples of a wearable device may also include apparatus containing electronic processors that are arranged to be worn by a person and integrated into a wearable structure such as a wristband, glove, ring, eyeglasses, belt-clip or belt, arm-band, shoe, hat, shirt, undergarment, outer garment, clothing generally, and additionally fashion accessories such as wallets, purses, umbrellas, etc. In embodiments, a wearable device may be implemented as including all or part of the functional capability of a smart phone, tablet computer, or gaming device capable of executing computer applications, as well as voice communications and/or data communications.
The term “smart” as an adjective before a noun, such as “smart watch”, “smart glasses”, “smart wrist band”, etc., includes devices that have one or more capabilities associated with smart phones, such as geo-location capability, the ability to communicate with another device, an interactive display, multi-sensing capabilities, or other features. The wearable may be a so-called smart device, in that it has access to one or more of the capabilities now common with smart phones, including geo-location, sensors, access to the internet via Wi-Fi (Wireless Fidelity, e.g., Institute of Electrical and Electronics Engineers/IEEE 802.11-2007, Wireless Local Area Network/LAN Medium Access Control (MAC) and Physical Layer (PHY) Specifications), near field communications, Bluetooth (e.g., IEEE 802.15.1-2005, Wireless Personal Area Networks), or other communication protocols. Such access may be direct or it may be via a Bluetooth connection with a nearby smart phone or a wearable device worn elsewhere on the user's person. A wearable device may have an interface with which a user interacts.
A control module 206 directs an actuator module 211 to activate, via control lines 212, one or more tactile elements 202. The particular nature of the actuator module will depend on the specific implementation of tactile element used. For example, if the tactile elements are implemented as fluidically activated buttons, then the actuator module may include at least one pump, which may be bidirectional, and may also include fluidic logic in the form of valves and fluid circuits to permit the selective activation and deactivation of individual or groups of tactile elements. It may also include a reservoir of hydraulic fluid. In such an example, the control lines 212 carry pressurized fluid. In another example, in which the tactile elements are based on solenoids, the actuator module may include electrical circuitry to selectively activate and deactivate desired tactile elements via control lines 212 that are electrically conductive. The control lines 212 in this example are electrically conducting wires. Whatever the specific form of tactile element used, the actuator module 211 may activate a specific physical pattern of tactile elements 202.
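By way of illustration only, the dispatch from a common control interface to these two actuator technologies might be organized as in the following sketch; the class and method names are assumptions made for the example and do not correspond to any specific implementation.

```python
# Illustrative sketch: a common activation interface over the two
# actuator technologies described above. All names are hypothetical.

class FluidicActuator:
    """Activates buttons by routing pressurized fluid (control lines 212)."""
    def activate(self, addresses):
        for addr in addresses:
            self._open_valve(addr)   # admit fluid to the cavity at addr

    def _open_valve(self, addr):
        raise NotImplementedError    # hardware-specific valve control


class SolenoidActuator:
    """Activates pin-like elements by energizing solenoids over wires 212."""
    def activate(self, addresses):
        for addr in addresses:
            self._energize(addr)     # drive current through solenoid addr

    def _energize(self, addr):
        raise NotImplementedError    # hardware-specific drive circuitry
```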
In this example, the physical pattern identifies specific tactile elements 202 for activation. It may be generated within the control module 206, which may include a memory module 207, a processor module 208, a sensor module 209, and a pattern generator module 210. The memory module 207 may store context information and rules, pre-set logical patterns of activations, applications and application data, and sensor data. The processor module 208 processes any or all of this information, resulting in the generation of a logical pattern of tactile element activations. This logical pattern is turned into (i.e., maps onto) a physical pattern of activations at the pattern generator module 210. Sensor module 209 may acquire and process device location and orientation data, such as may be provided by GPS sensors.
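As a minimal sketch of the mapping performed by the pattern generator module 210, a logical pattern expressed in abstract grid positions might be translated into a physical pattern of actuator addresses as follows; the grid layout, names, and addressing scheme are illustrative assumptions.

```python
# A logical pattern in abstract (row, column) positions is mapped onto
# a physical pattern of flat actuator addresses. Illustrative only.

LOGICAL_YES = {(0, 0), (0, 1)}   # two adjacent elements forming "YES"
LOGICAL_NO = {(0, 3), (0, 4)}    # two adjacent elements forming "NO"

def to_physical(logical_pattern, columns_per_row):
    """Map (row, col) grid positions to flat actuator addresses."""
    return {row * columns_per_row + col for row, col in logical_pattern}

# Example: a band with 16 tactile elements per row.
physical = to_physical(LOGICAL_YES | LOGICAL_NO, columns_per_row=16)
print(sorted(physical))          # addresses for the actuator module
```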
The control module may be linked via any wireless or wired protocol to network 218, which may be part of a Local Area Network (LAN), Wide Area Network (WAN), the cloud, the Internet, a cellular network, Wi-Fi, and so forth. Thus, the control module may be responsive to communications received from another device or network, near or far.
In use, the user applies pressure to the activated tactile elements 202 (which, in the examples of the solenoid or fluidics embodiments, would be protruding when activated), causing switching or other sensory input into the associated sensors or switches 203. These sensor or switch elements 203 then send signals conveying information via lines 216 to the control module 206, providing feedback and/or other data, although open loop control may be practiced in other examples. After a predetermined time interval, the tactile elements 202 are de-activated for later re-use, and they return to their initial configuration in which they feel generally flush with the surrounding area.
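The activate/wait/deactivate cycle just described might, assuming hypothetical driver objects standing in for actuator module 211 and switch elements 203, be sketched as follows.

```python
import time

def run_prompt(actuator, switches, pattern, timeout_s=10.0):
    """Activate a pattern, wait for a press or a timeout, then deactivate.

    `actuator` and `switches` are hypothetical driver objects standing
    in for actuator module 211 and sensor/switch elements 203.
    """
    actuator.activate(pattern)
    deadline = time.monotonic() + timeout_s
    pressed = None
    while time.monotonic() < deadline:
        pressed = switches.poll()    # a pressed element address, or None
        if pressed is not None:
            break
        time.sleep(0.01)
    actuator.deactivate(pattern)     # elements return flush with surface
    return pressed                   # None indicates the prompt timed out
```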
In alternative embodiments the memory module 207, processor module 208, sensor module 209 and pattern generator module 210 may be combined into fewer modules.
All of these components may be integrated into the wearable device, or some of them, such as the control module 206 or parts thereof, may be located in another device with which the wearable is in communication, such as a smart phone, another wearable, or the cloud.
Embodiments disclosed herein provide for a dynamically programmable array of tactile elements in which a physical pattern of tactile elements to be placed into an activated or a de-activated state may be based on a logical pattern of tactile elements. Each of these patterns may be varied per the needs of the user and/or the requirements of whichever application the user may be running at a given time, and over any portion of the wearable that a product designer may feel best suits the interface design. For example, an application may draw the user's attention by triggering an alarm indicating that he is to respond via the user interface by selecting one of two possibilities: “YES” or “NO” (
The flexibility provided by the programmability of the array permits a designer broad scope in crafting user interfaces. For example, a number of tactile elements may be activated together to form a larger button for the benefit of users with larger fingers. In another embodiment, tactile elements may be grouped in fanciful patterns, e.g., so as to form a smile for the user to touch (
Tactile elements such as buttons need not be located on the outer surface of the device but may, in other embodiments, be located on an inner surface or on both surfaces. Thus, a user may be signaled information based on a perceptible pattern of button activations felt against his person (at the wrist, in the case of a wristband or other wrist wearable), and may then respond to that message by touching the buttons on the outer surface. In such an embodiment, the inner and outer arrays may be independently controllable (i.e., independently addressable), and in some embodiments the inner array may be provided without any switches, as the user would not be able to reach them there.
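One way such independently addressable inner and outer arrays might be organized is sketched below; the interface is an illustrative assumption, not a required structure.

```python
class DualArrayInterface:
    """Independently addressable inner and outer arrays (illustrative)."""

    def __init__(self, inner_actuator, outer_actuator):
        self.inner = inner_actuator   # felt against the skin; no switches
        self.outer = outer_actuator   # touched by the user; has switches

    def signal_user(self, pattern):
        self.inner.activate(pattern)  # message felt at, e.g., the wrist

    def prompt_user(self, pattern):
        self.outer.activate(pattern)  # buttons for the user to press
```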
A plurality of wearable devices employing embodiments may be provided with wireless communications capability with one another either directly or via a network so that individual users may communicate with one another through the wearable devices. Such an approach may be of use in game play, where users communicate with one another via their wearable devices through the patterns of activated buttons formed thereon.
According to another embodiment, the inner surface may be provided with a sensor array that collects information about the state or context of the person, including temperature sensors that determine user body temperature, pressure sensors for blood pressure, sensors that measure pulse rate, and electrical sensors that pick up myoelectric activity. This information may then be used by the control module to activate patterns of tactile elements on the outer surface of the wearable. For example, an elevated pulse rate might indicate that the user is in a physical or psychological state in which the user is not as able to detect or notice raised tactile elements as when the user is in a resting, calm state. In such a situation, larger, and thus more readily perceivable, groups of tactile elements may be activated to compensate for the particular context of the user.
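For example, a group size for activated buttons might be scaled with pulse rate as in the following sketch; the thresholds are illustrative assumptions, not clinical values.

```python
def button_group_size(pulse_bpm, resting_bpm=60):
    """Choose how many tactile elements to gang into one button.

    Thresholds are illustrative assumptions, not clinical values.
    """
    if pulse_bpm > 1.5 * resting_bpm:
        return 4   # agitated or active user: larger, easier-to-feel buttons
    if pulse_bpm > 1.2 * resting_bpm:
        return 2
    return 1       # calm, resting user: single-element buttons suffice
```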
Embodiments may make use of GPS coordinates or other geo-location techniques to vary the nature or content of the pattern of tactile elements activated on a wearable.
In the wearable cap embodiment depicted in
In another embodiment, the cascade may be along a generally circular path, with clockwise patterns of activations indicating “go right” and counterclockwise patterns of activations indicating “go left.” The pattern of activated buttons may be along an inner surface that the user feels on his skin (e.g., for a wrist band, the user's wrist, or in the example of a hat, the user's forehead), or along an outer surface, or both, depending on how a given user interface is implemented. In addition to directional information, this embodiment may be used to convey other sorts of information to the user, including text.
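A circular cascade of this kind might be generated as follows, with the element addressing around the band assumed for illustration.

```python
def cascade(n_elements, clockwise=True, start=0):
    """Yield element addresses around a circular band, one step at a time.

    Clockwise order may signal "go right"; counterclockwise, "go left".
    The addressing convention around the band is assumed here.
    """
    step = 1 if clockwise else -1
    for i in range(n_elements):
        yield (start + i * step) % n_elements

# Example: pulse 12 buttons in clockwise order to signal a right turn.
for addr in cascade(12, clockwise=True):
    pass  # activate addr briefly, then deactivate before the next step
```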
Mechanisms and sensors generally provided for use with GPS systems may also be used to determine device orientation and, with it, the orientation of any interface linked to the device. Accelerometers, gyroscopes, and magnetometers are in wide use in smart phones for providing data for use with GPS systems, and may also be used to determine device orientation by various well-known techniques.
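For instance, the roll of a band about the wrist can be estimated from a static accelerometer reading of gravity, as in this sketch; the axis assignment depends on how the sensor is mounted in the band and is an assumption here.

```python
import math

def roll_about_wrist_deg(ax, ay, az):
    """Estimate the roll of a wrist band from gravity, in degrees.

    (ax, ay, az) is a static accelerometer sample; the axis assignment
    is an assumption that depends on sensor mounting in the band.
    """
    return math.degrees(math.atan2(ay, az))
```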
The operation of one embodiment is further addressed by way of reference to the flow chart 600 in
Again referring to flow chart 600 in
Such orientation awareness permits the desired pattern of tactile element actuation to be dynamically moved across the face of the tactile element array in dependence upon the orientation of the wearable device. Thus, in one embodiment, the activation of tactile elements corresponding to the YES/NO example set forth above may be dynamically shifted along the band so that these buttons are always arranged along the “face up” portion of the wearable, or in any other orientation that is most convenient for the user. The user would not have to contort himself into an inconvenient position in order to touch the array. Also, the user then need not look at the array, but may reach for the buttons of interest at a preferred orientation, selected either by himself or by the interface designer.
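A sketch of such dynamic shifting, under the assumption that the band's columns are numbered around its circumference, follows.

```python
def shift_to_face_up(pattern, roll_deg, n_columns):
    """Rotate a physical pattern around the band so it stays face up.

    `pattern` is a set of column indices around the circumference and
    `roll_deg` comes from the orientation sensors; both conventions are
    assumptions made for illustration.
    """
    offset = round(roll_deg / 360.0 * n_columns)
    return {(col + offset) % n_columns for col in pattern}
```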
A number of different technologies may be used in the implementation of the tactile elements used in these embodiments.
Another approach is to use fluidics to control and activate tactile elements. In one embodiment, the tactile elements are provided as an array of buttons formed of a substrate attached to a button membrane, thereby creating a set of round button cavities. The button cavities may be configured to be inflated and deflated by a pump coupled to a fluid reservoir. The cavities may be inflated/deflated together, in subsets, and/or individually. In some embodiments, the buttons may be sandwiched between a touch sensing layer and a display of a touch screen. In other embodiments, the button array may be located either above or below the touch screen.
Such an embodiment is shown in
Membrane 810 may be made from a suitable optically transparent and elastic material, including polymers such as polyethylene terephthalate (PET) or silicone-based elastomers such as polydimethylsiloxane (PDMS).
Enclosed cavities 820a, 820b, and 820c, formed between substrate 830 and membrane 810, are fluid tight and coupled via fluid channel 840 to one or more fluid pumps (not shown in this figure). The pump(s) may either be internal or external with respect to a touch screen assembly incorporating button array 800.
In embodiments in which fluidic buttons overlie a display screen, the refractive index of the button fluid should be substantially similar to that of substrate 830 and membrane 810 in order to minimize optical distortion. Depending on the application, suitable fluids include water and alcohols such as isopropanol or methanol.
When selected buttons of the button array 800 need to be activated, i.e., raised (inflated), fluid pressure inside specific cavities—here 820a and 820b—is increased, thereby causing the overlying membrane portions 810a and 810b to be raised. In this example, the third cavity, 820c, is not pressurized, and its overlying membrane 810c remains flat. In this example, which is suitable for a handheld device, cavities 820 may have a cavity diameter of approximately 5 mm and membrane 810 is approximately 100 microns thick. Conversely, when button array 800 needs to be deactivated, fluid pressure inside the cavities is decreased, thereby causing them to deflate and their corresponding overlying membrane portions (in this instance, 810a and 810b) to return to their original flat profile. It is contemplated that a button fluid pressure of approximately 0.2 psi and a button fluid displacement of about 0.03 ml should be sufficient to raise selected membrane (button) portions of 810 by about 1 mm.
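As a rough geometric check on these figures, the raised button may be approximated as a spherical cap over the 5 mm cavity; the cap volume for a 1 mm rise comes to roughly 0.01 ml, the same order of magnitude as the 0.03 ml displacement contemplated above (which may additionally allow for channel volume and membrane compliance).

```python
import math

def cap_volume_mm3(base_radius_mm, height_mm):
    """Spherical-cap volume: V = (pi * h / 6) * (3 * a^2 + h^2)."""
    a, h = base_radius_mm, height_mm
    return math.pi * h / 6.0 * (3.0 * a * a + h * h)

# A 5 mm diameter cavity (a = 2.5 mm) raised by h = 1 mm:
v = cap_volume_mm3(2.5, 1.0)
print(f"{v:.1f} mm^3 = {v / 1000.0:.3f} ml")   # ~10.3 mm^3 = ~0.010 ml
```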
A further feature of this embodiment is that it may be located atop a touch display screen 850 and may include a touch sensing layer 854. According to another embodiment, the touch display screen 850 may include sensors that provide input capability thereby eliminating the need for sensing layer 854.
An optional feature of this embodiment is the inclusion of an infrared sensor layer 858 to provide for finger proximity detection. The buttons are provided with infrared sensors in layer 858 so that they are able to sense the warmth of a finger approaching the buttons, enabling them to be inflated just moments before actual contact is made. This is advantageous in some circumstances because activating a button uses energy; limiting the activation time to the typically brief interval from when the user's fingers nearly touch the buttons until after they have left them reduces power consumption. According to an additional embodiment, there may be a vibration to alert the user to the presence of an incoming message or other information awaiting his response. Then, when the user approaches the buttons, select buttons are activated.
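Proximity-gated activation of a pending pattern might be sketched as follows; the threshold value and driver objects are illustrative assumptions.

```python
IR_THRESHOLD = 0.8   # normalized reading; illustrative value only

def on_ir_sample(readings, actuator, pending_pattern):
    """Inflate a pending pattern only when a warm finger is near.

    `readings` maps element address -> normalized IR level; `actuator`
    is a hypothetical stand-in for actuator module 211 driven by the
    infrared sensor layer 858.
    """
    near = {addr for addr, level in readings.items() if level > IR_THRESHOLD}
    if near & pending_pattern:       # finger approaching a pending button
        actuator.activate(pending_pattern)
```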
A vibration alert feature may be used in another embodiment to indicate the arrival of information. In these embodiments, the user first perceives the vibration, which he takes as a sign to touch the interface. Contacting the interface then causes the desired pattern of buttons to be activated. In an alternative embodiment, a bell may be sounded instead of a vibration to indicate the presence of information at the interface.
Embodiments may be utilized in gaming, such as by providing an interface for game play that is worn on the wrist, head, or other portion of the player's person. Such wearable devices may utilize either inner, outer, or both inner and outer arrays of activatable buttons.
In some embodiments, not every tactile element may be individually activatable; tactile elements may instead be activated in coarser-grained groups. For example, in the embodiment 910a shown in
In the embodiment of
In various embodiments, the wearable user interface may be implemented as a wireless system, a wired system, or a combination of both. When implemented as a wireless system, embodiments may include components and interfaces suitable for communicating over a wireless shared media, such as one or more antennas, transmitters, receivers, transceivers, amplifiers, filters, control logic, and so forth. An example of wireless shared media may include portions of a wireless spectrum, such as the RF spectrum and so forth. When implemented as a wired system, embodiments may include components and interfaces suitable for communicating over wired communications media, such as input/output (I/O) adapters, physical connectors to connect the I/O adapter with a corresponding wired communications medium, a network interface card (NIC), disc controller, video controller, audio controller, and so forth. Examples of wired communications media may include a wire, cable, metal leads, printed circuit board (PCB), backplane, switch fabric, semiconductor material, twisted-pair wire, co-axial cable, fiber optics, and so forth.
The processor core 2000 is shown including execution logic 2500 having a set of execution units 2550-1 through 2550-N. Some embodiments may include a number of execution units dedicated to specific functions or sets of functions. Other embodiments may include only one execution unit or one execution unit that may perform a particular function. The illustrated execution logic 2500 performs the operations specified by code instructions.
After completion of execution of the operations specified by the code instructions, back end logic 2600 retires the instructions of the code 2130. In one embodiment, the processor core 2000 allows out of order execution but requires in order retirement of instructions. Retirement logic 2650 may take a variety of forms as known to those of skill in the art (e.g., re-order buffers or the like). In this manner, the processor core 2000 is transformed during execution of the code 2130, at least in terms of the output generated by the decoder, the hardware registers and tables utilized by the register renaming logic 2250, and any registers (not shown) modified by the execution logic 2500.
Although not illustrated in
Referring now to
The system 1000 is illustrated as a point-to-point interconnect system, wherein the first processing element 1070 and the second processing element 1080 are coupled via a point-to-point interconnect 1050. It should be understood that any or all of the interconnects illustrated in
As shown in
Each processing element 1070, 1080 may include at least one shared cache 1896a, 1896b. The shared cache 1896a, 1896b may store data (e.g., instructions) that are utilized by one or more components of the processor, such as the cores 1074a, 1074b and 1084a, 1084b, respectively. For example, the shared cache 1896a, 1896b may locally cache data stored in a memory 1032, 1034 for faster access by components of the processor. In one or more embodiments, the shared cache 1896a, 1896b may include one or more mid-level caches, such as level 2 (L2), level 3 (L3), level 4 (L4), or other levels of cache, a last level cache (LLC), and/or combinations thereof.
While shown with only two processing elements 1070, 1080, it is to be understood that the scope of the embodiments is not so limited. In other embodiments, one or more additional processing elements may be present in a given processor. Alternatively, one or more of processing elements 1070, 1080 may be an element other than a processor, such as an accelerator or a field programmable gate array. For example, additional processing element(s) may include additional processor(s) that are the same as the first processor 1070, additional processor(s) that are heterogeneous or asymmetric to the first processor 1070, accelerators (such as, e.g., graphics accelerators or digital signal processing (DSP) units), field programmable gate arrays, or any other processing element. There may be a variety of differences between the processing elements 1070, 1080 in terms of a spectrum of metrics of merit including architectural, microarchitectural, thermal, and power consumption characteristics, and the like. These differences may effectively manifest themselves as asymmetry and heterogeneity amongst the processing elements 1070, 1080. For at least one embodiment, the various processing elements 1070, 1080 may reside in the same die package.
The first processing element 1070 may further include memory controller logic (MC) 1072 and point-to-point (P-P) interfaces 1076 and 1078. Similarly, the second processing element 1080 may include an MC 1082 and P-P interfaces 1086 and 1088. As shown in
The first processing element 1070 and the second processing element 1080 may be coupled to an I/O subsystem 1090 via P-P interconnects 1076 and 1086, respectively. As shown in
In turn, I/O subsystem 1090 may be coupled to a first bus 1016 via an interface 1096. In one embodiment, the first bus 1016 may be a Peripheral Component Interconnect (PCI) bus, or a bus such as a PCI Express bus or another third generation I/O interconnect bus, although the scope of the embodiments is not so limited.
As shown in
Note that other embodiments are contemplated. For example, instead of the point-to-point architecture of
Embodiments disclosed herein may establish one or more logical or physical channels to communicate information. The information may include media information and control information. Media information may refer to any data representing content meant for a user. Examples of content may include, for example, data from a voice conversation, videoconference, streaming video, electronic mail (“email”) message, voice mail message, alphanumeric symbols, graphics, image, video, text and so forth. Data from a voice conversation may be, for example, speech information, silence periods, background noise, comfort noise, tones and so forth. Control information may refer to any data representing commands, instructions or control words meant for an automated system. For example, control information may be used to route media information through a system, or instruct a node to process the media information in a predetermined manner.
Examples of a wearable device may also include apparatus containing electronic processors that are arranged to be worn by a person and integrated into a wearable structure such as a wristband, glove, ring, eyeglasses, belt-clip or belt, arm-band, shoe, hat, shirt, undergarment, outer garment, clothing generally, and additionally fashion accessories such as wallets, purses, umbrellas, etc. In embodiments, for example, a wearable device may be implemented as all or part of a smart phone capable of executing computer applications, as well as voice communications and/or data communications.
Example 1 may include a system to provide a user interface comprising a wearable device with which is integrated a plurality of tactile elements, each of said tactile elements having an active state and an inactive state, a pattern generator module to define a physical pattern of tactile elements to place in an active state, wherein the pattern generator module is to permit variation of the physical pattern, and sensors to determine at least one of an orientation of the wearable device and a location of the wearable device.
Example 2 may include the system of Example 1, further comprising a wireless communications link connecting the wearable device to a network.
Example 3 may include the system of Examples 1 or 2, further comprising a plurality of wearable devices in communication with one another.
Example 4 may include the system of Example 1, wherein the tactile elements each comprise a chamber configured to contain a quantity of pressurizable fluid, the chamber having an overlying flexible portion to bulge out when the fluid is pressurized, further comprising at least one fluid reservoir in fluidic communication with the tactile elements and at least one pump to pressurize the fluid.
Example 5 may include the system of any one of Examples 1, 2, or 4, wherein the physical pattern varies with information provided by at least one of an orientation of the wearable device and a location of the wearable device.
Example 6 may include the system of Example 1, further comprising a display over which some of the tactile elements are arrayed.
Example 7 may include a method of activating tactile elements on a wearable device user interface, comprising generating a logical pattern of tactile elements, and using the logical pattern to define a physical pattern of active tactile elements, wherein both the logical pattern and the physical pattern are variable.
Example 8 may include the method of Example 7, wherein the user interface comprises outer facing and inner facing arrays of tactile elements, wherein both arrays are independently addressable.
Example 9 may include the method of Examples 7 or 8, including determining the orientation of the user interface, and forming a physical pattern of active tactile elements in dependence upon said orientation.
Example 10 may include the method of Examples 7 or 8, including determining a geo-location of the user interface, and forming a physical pattern of active tactile elements in dependence upon said geo-location.
Example 11 may include the method of Example 7, wherein the logical pattern is determined based on information that is obtained remotely from the user interface.
Example 12 may include the method of Examples 7 or 8, including determining a context in which the wearable device is used, and activating groups of tactile elements based on the context.
Example 13 may include the method of Examples 7 or 8, wherein the tactile elements are activated in a pattern that forms a message.
Example 14 may include the method of Example 7, wherein the tactile elements are not activated unless the wearable device is touched by a user.
Example 15 may include at least one computer readable storage medium comprising a set of instructions which, when executed by a computing device, cause the computing device to generate a logical pattern of tactile elements in an array of tactile elements to be activated, and use the logical pattern to define a physical pattern of active tactile elements, wherein both the logical pattern and the physical pattern are variable.
Example 16 may include the at least one computer readable storage medium of Example 15, wherein the instructions, when executed, cause a computing device to address an outer facing array and an inner facing array of tactile elements.
Example 17 may include the at least one computer readable storage medium of Example 15, wherein the instructions, when executed, cause a computing device to determine an orientation of a user interface, and form a physical pattern of active tactile elements on said interface in dependence upon said orientation.
Example 18 may include the at least one computer readable storage medium of any one of Examples 15-17, wherein the instructions, when executed, cause a computing device to determine the geo-location of the interface, and form a physical pattern of active tactile elements on said interface in dependence upon said geo-location.
Example 19 may include the at least one computer readable storage medium of Example 15, wherein the instructions, when executed, cause a computing device to determine the logical pattern based on information determined remotely.
Example 20 may include an apparatus to provide a user interface comprising first and second layers of tactile elements in back-to-back proximity to one another, each of said tactile elements having an active state and an inactive state, and a control module to generate a physical pattern of active tactile elements for at least one of the layers.
Example 21 may include the apparatus of Example 20, wherein the control module is further able to generate a logical pattern of tactile element activations, wherein the physical pattern of activations is based on the logical pattern, and wherein both the physical pattern and the logical pattern are variable.
Example 22 may include the apparatus of Examples 20 or 21, comprising sensors associated with the plurality of tactile elements.
Example 23 may include the apparatus of Example 22, wherein the sensors are to detect infrared radiation and further comprising a sensor module to enable selected tactile elements to be placed into an active state when detected infrared radiation rises above a threshold.
Example 24 may include the apparatus of Example 20, wherein the tactile elements are capable of being individually addressed.
Example 25 may include the apparatus of Examples 20 or 21, further comprising a wearable device article to which the tactile elements are attached.
Example 26 may include the apparatus of Example 20, wherein the first layer of tactile elements has sensors that differ from sensors in the second layer of tactile elements.
Example 27 may include the apparatus of Examples 20 or 21, further comprising sensors to determine at least one of the orientation of the apparatus and the location of the apparatus.
Example 28 may include the apparatus of Examples 20 or 26, further comprising a display over which some of the tactile elements are arrayed.
Example 29 may include the apparatus of Example 20, wherein the tactile elements each comprise a chamber for containing a quantity of pressurizable fluid, the chamber having an overlying flexible portion that bulges out when the fluid is pressurized, further comprising at least one fluid reservoir in fluidic communication with the tactile elements and at least one pump to pressurize the fluid.
Example 30 may include the apparatus of Example 20, further comprising an actuator module to activate the tactile elements belonging to the physical pattern, and wherein the control module further comprises a memory module, a sensor module, a processor module to generate a logical pattern of tactile element activations, and a pattern generator module to map the logical pattern onto a physical pattern of tactile element activations.
Example 31 may include a wearable device comprising first and second layers of tactile elements in back-to-back proximity to one another, each of said tactile elements having an active state and an inactive state, means for generating a logical pattern of active tactile elements, means for generating a physical pattern of active tactile elements, means for determining the orientation or location of the device, and means for varying the physical pattern based on the orientation or location of the device.
Example 32 may include the wearable device of Example 31, further comprising means for wirelessly communicating with another device.
Example 33 may include the wearable device of Examples 31 or 32, further comprising sensors capable of detecting the proximity of a human finger, and means for activating the tactile elements when the finger is near the sensors.
Example 34 may include the wearable device of Example 31, wherein the means for generating a physical pattern of tactile elements does not do so unless a user first touches one of the layers of tactile elements.
Example 35 may include a method of activating tactile elements on a wearable device user interface, comprising generating a logical pattern of tactile elements, using the logical pattern to define a physical pattern of active tactile elements, and varying the physical pattern in dependence upon the orientation of the user interface.
Example 36 may include the method of Example 35, wherein the logical pattern is received by the wearable device user interface through a wireless channel.
Example 37 may include the method of Example 35, wherein a plurality of wearable devices are in communication with one another.
Example 38 may include the method of any one of Examples 35, 36 or 37, including using Global Positioning System coordinates to form the physical pattern.
Example 39 may include the method of Example 38, wherein the physical pattern conveys directional information.
Example 40 may include the method of Example 35, further including game play.
Example 41 may include the method of Example 35, wherein the physical pattern conveys a message.
Example 42 may include the method of Example 41, wherein the physical pattern is conveyed via a clockwise or counterclockwise activation of tactile elements.
Example 43 may include the method of Example 41, wherein the physical pattern is conveyed by activating a cascading series of tactile elements.
Example 44 may include the method of Example 35, further comprising determination of a user context, and wherein the shape or size of the physical pattern depends on the context.
Example 45 may include the method of Example 35, wherein accelerometer data is used to determine an activity state of a user, and wherein the number or pattern of tactile elements activated is selected in dependence on the activity state.
Example 46 may include the method of Examples 35 or 44, wherein tactile elements are activated in groups to provide a user with the tactile sensation of larger buttons.
Example 47 may include a method of navigating via a wearable device user interface that comprises an array of buttons, each of said buttons having an active state and an inactive state, comprising forming a pattern of buttons to activate, and activating buttons defined by the pattern, wherein the pattern is at least partly based on directions derived at least partially from Global Positioning System coordinates.
Example 48 may include the method of Example 47, wherein the wearable device user interface has an inner side and has an array of buttons on the inner side.
Example 49 may include a user interface comprising an array of addressable tactile elements having an active state and an inactive state, each of said tactile elements further having a position in the array, a sensor element in proximity to each tactile element, means for selecting a pattern of said tactile elements and selectively placing the tactile elements into an active state, means for placing at least one sensor element into an inactive state, and means for varying the pattern and the tactile elements that are in an active state.
Example 50 may include the user interface of Example 49, further comprising a wearable device that is in the form of a hat, shirt, undergarment, belt, wristband, watch, or glasses.
Example 51 may include the user interface of Example 49, wherein the user interface has an inner side and the array of addressable tactile elements is on the inner side.
Example 52 may include the user interface of Example 49, wherein the user interface has an outer side and the array of addressable tactile elements is on the outer side.
Example 53 may include the user interface of Example 49, comprising two arrays of addressable tactile elements, wherein the user interface has an inner side and an outer side, and wherein one said array is on the inner side, and one said array is on the outer side.
Example 54 may include the user interface of Example 53, wherein the two arrays of addressable tactile elements are independently addressable with respect to one another.
Various embodiments may be implemented using hardware elements, software elements, or a combination of both. Examples of hardware elements may include processors, microprocessors, circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), logic gates, registers, semiconductor device, chips, microchips, chipsets, and so forth. Examples of software may include software components, programs, applications, computer programs, application programs, system programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints.
As used herein, the term “fluidic” may encompass the term “microfluidic” and the field of microfluidics, depending on component size.
Example sizes/models/values/ranges may have been given, although embodiments are not limited to the same. As manufacturing techniques mature over time, it is expected that devices of smaller size and smaller tactile element size may be manufactured. In addition, well known electrical or fluidic components may or may not be shown within the figures, for simplicity of illustration and discussion, and so as not to obscure certain aspects of the embodiments. Further, arrangements may be shown in block diagram form in order to avoid obscuring embodiments, and also in view of the fact that specifics with respect to implementation of such block diagram arrangements are highly dependent upon the platform within which the embodiment is to be implemented, i.e., such specifics should be well within the purview of one skilled in the art. Where specific details (e.g., circuits) are set forth in order to describe example embodiments, it should be apparent to one skilled in the art that embodiments may be practiced without, or with variation of, these specific details. The description is thus to be regarded as illustrative instead of limiting.
Those skilled in the art will appreciate from the foregoing description that the broad techniques of the embodiments may be implemented in a variety of forms. Therefore, while the embodiments have been described in connection with particular examples thereof, the true scope of the embodiments should not be so limited since other modifications will become apparent to the skilled practitioner upon a study of the drawings, specification, and following claims.