This disclosure generally relates to a wearable electronic device.
Mobile electronic devices provide a user with access to computing capabilities even as the user moves about various locations. Examples of mobile electronic devices include mobile phones, media players, laptops, tablets, PDAs, or hybrid devices that include functionality of multiple devices of this type.
Mobile electronic devices may be part of a communication network such as a local area network, wide area network, cellular network, the Internet, or any other suitable network. A mobile electronic device may use a communication network to communicate with other electronic devices, for example, to access remotely-stored data, access remote processing power, access remote displays, provide locally-stored data, provide local processing power, or provide access to local displays. For example, networks may provide communication paths and links to servers, which may host applications, content, and services that may be accessed or utilized by users via mobile electronic devices. The content may include text, video data, audio data, user settings, or other types of data. Networks may use any suitable communication protocol or technology to facilitate communication between mobile electronic devices, such as, for example, BLUETOOTH, WI-FI (IEEE 802.11a/b/g/n/ac), or TCP/IP.
Particular embodiments of a wearable electronic device include a stack-up that allows some or all of the processing and display system to fit inside the body of the device, which may be encompassed by an element, such as an outer ring, that provides at least one way for the user to interact with the device. In addition or in the alternative, particular embodiments may include external components incorporated into the band for additional functionality, as described more fully herein.
Below the touch-sensitive layer 210 may be a circular display 215, which may be laminated or mechanically affixed to any of the preceding layers. In particular embodiments, lamination may reduce glare and improve display legibility by reducing internal reflections. As described more fully below, display 215 may have an outer inactive area that may be symmetric or asymmetric. Display 215 may be positioned such that it is axially centered relative to protective layer 205 for a visually symmetric presentation. Display 215 may be of any suitable type, such as for example light-emitting diode (LED), organic light-emitting diode (OLED), or liquid crystal display (LCD). In particular embodiments, display 215 may be flexible. In particular embodiments, display 215 may be partially transparent. In particular embodiments, display 215 may be translucent.
Below display 215 may be battery 220, which in particular embodiments may be positioned so that base 245 may be reduced in diameter without affecting the size of the battery. Battery 220 may be of any suitable type, such as for example lithium-ion based. Battery 220 may adopt the circular shape of the device, or may adopt any other suitable shape, such as a rectangular form, as illustrated. In particular embodiments, battery 220 may “float” in the device, e.g. may have space above, below, or around the battery to accommodate thermal expansion. In particular embodiments, taller components such as for example haptic actuators or other electronics may be positioned in the additional space beyond the edge of the battery for optimal packing of components. In particular embodiments, connectors from processor board 225 may be placed in this space to reduce the overall height of the device.
Below battery 220 may be processor board 225. Processor board 225 may include any suitable processing components, such as for example one or more processing units, drive units, sense units, caches, memory elements, or integrated circuits. Processor board 225 may include one or more heat sensors or cooling units (such as, e.g., fans) for monitoring and controlling the temperature of one or more processor board components. In particular embodiments, body 105 of the device may itself act as the heat sink.
Below the processor board may be an encoder 230, encircled by one or more outer rings 235. As described more fully below, encoder 230 may be of any suitable type, and may be part of outer ring 235 or may be a separate component, as illustrated in
The device body may conclude with a base 245. Base 245 may be stationary relative to the one or more rotatable components of the device, such as outer ring 235. In particular embodiments, base 245 connects to band 240, described more fully herein. Connections may be mechanical or electrical, such as for example part of the circuitry linking wired communication components in band 240 to processor board 225. In particular embodiments, connectors are positioned to avoid the encoder and the anchor points for the bands. In particular embodiments, band 240 may be detachable from base 245. As described more fully herein, band 240 may include one or more inner connectors 250, one or more optical sensing modules 255, or one or more other sensors. In particular embodiments, the interior of the device, or portions of that interior, may be sealed from the external environment.
While this disclosure describes specific examples of components in stack-up 200 of wearable electronic device 100 and of the shape, size, order, connections, and functionality of those components, this disclosure contemplates that a wearable device, such as device 100, may include any suitable components of any suitable shape, size, and order connected or communicating in any suitable way. As merely one example, battery 220 may be placed more toward the bottom of the stack up than is illustrated in
In particular embodiments, the display of the device has a circular or elliptical form and houses a circular display unit, such as for example an LCD or an OLED display. The display unit may be mounted such that the visible area is centrally located within the display module. Should the display unit have an offset design, one or more appropriate maskings may be used to obscure part of the display to produce a circular and correctly placed visual outline.
In particular embodiments, a display module has an outer ring that is part of the user interface of the device. The outer ring may rotate while the band holds the bottom and inside part of the device stable.
A display module may additionally incorporate one or more sensors on or near the same surface as the display. For example, the display module may include a camera or other optical sensor, microphone, or antenna. One or more sensors may be placed in an inactive area of the display. For example,
In particular embodiments, the packaging of a circular display includes an inactive area, as illustrated in
In particular embodiments, all processing and RF components are located within the body of the device, which may create a challenge in allowing RF signals to pass out of the device. The FPC board may additionally be attached to sides of the polygon where there is no connection to the display itself to allow the mounting of strip line, stub, ceramic, or other antennae (or other suitable sensors) in the same plane as the display, as illustrated in
In particular embodiments, a display may be shielded from electromagnetic interference with the main processor board using a metal shield. In particular embodiments, the metal shield may also be used as a heat sink for the battery, and thus may improve charge or discharge rates for the battery.
In particular embodiments, a wearable electronic device may include one or more outer elements (which may be of any suitable shape) about the device body.
In particular embodiments, detents or encoders (which may be used interchangeably, where suitable) of an outer element may provide a user with haptic feedback (e.g. a tactile click) provided by, for example, a detent that allows the user to determine when the element has been moved one “step” or “increment” (terms which may be used interchangeably herein). This click may be produced directly via a mechanical linkage (e.g. a spring mechanism) or may be produced electronically via a haptic actuator (e.g. a motor or piezo actuator). For example, a motor may provide resistance to motion of a ring, such as for example by being shorted to provide resistance and unshorted to provide less resistance, simulating the relative high and low torque provided by a mechanical detent system. As another example, magnetic systems may be used to provide the haptic feel of a detent. For example, a solenoid mechanism may be used to disengage the detent spring or escapement as needed. The spring or escapement provides the actual mechanical feedback. However, this arrangement allows the device to skip a number of detents as needed, while re-engaging the detent at exact intervals to create the sensation of detents that have, for example, changed size. As another example, a rotatable outer element (such as, for example, the outer ring) may be magnetized, such as by an electromagnet used to attract the ring at “detent” positions, increasing torque and simulating detent feedback. As another example, a rotatable outer element may have alternating north-south poles, which repel and attract corresponding magnetic poles in the device body. As another example, a permanent magnet may be used to lock the ring in place when the electromagnet is not in use, preventing freewheeling. As another example, instead of an electromagnet, an easily-magnetizable ferromagnetic alloy may be used within a solenoid. This allows the electromagnetic field of the solenoid to “reprogram” the magnetic orientation of the core, thus maintaining the effect of the magnetic actuation even when the solenoid itself is disengaged. While this disclosure provides specific examples of detents, detent-like systems, and encoders, this disclosure contemplates any suitable detents, detent-like systems, or encoders.
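By way of illustration only, the following Python sketch shows one way an electronically simulated detent of the kind described above might be driven; the constants (e.g. DETENT_SPACING_DEG) and the braking interface are hypothetical and are not part of this disclosure.

```python
# Hypothetical sketch of an electronically simulated detent.
# Assumes an encoder reporting the ring angle in degrees and a haptic driver
# whose braking level can be set between 0.0 (free) and 1.0 (fully shorted).

DETENT_SPACING_DEG = 15.0   # one "step" of the outer ring (assumed value)
ENGAGE_WINDOW_DEG = 2.0     # how close to a detent the brake engages

def nearest_detent(angle_deg: float) -> float:
    """Return the angle of the closest detent position."""
    return round(angle_deg / DETENT_SPACING_DEG) * DETENT_SPACING_DEG

def braking_level(angle_deg: float) -> float:
    """High resistance near a detent, light drag between detents,
    simulating the torque profile of a mechanical detent system."""
    offset = abs(angle_deg - nearest_detent(angle_deg))
    if offset <= ENGAGE_WINDOW_DEG:
        return 1.0 - (offset / ENGAGE_WINDOW_DEG)  # ramp up toward the detent
    return 0.1                                     # light drag between detents

# Example: sweep the ring and print the simulated resistance profile.
if __name__ == "__main__":
    for angle in range(0, 31, 3):
        print(f"{angle:3d} deg -> brake {braking_level(angle):.2f}")
```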
In particular embodiments, an encoder or detent may be used to determine the position of the outer ring relative to the device body. Particular embodiments utilize an encoder that is affixed to the device body, as illustrated by encoder 230 of
In particular embodiments, a retention ring connecting the outer ring to the body of the device may have strain gauges to detect pressure on the outer ring. As an example,
When strain is placed on a component containing strain gauges or any other suitable strain or pressure detection system, the detected strain may result in any suitable functionality. For example, when strain is placed on the outer ring, such as for example by a user squeezing the outer ring, feedback may be provided to the user. That feedback may take any suitable form, such as tactile feedback (e.g. vibration, shaking, or heating/cooling), auditory feedback such as beeping or playing a particular user-defined tone, visual feedback (e.g. by the display of the device), or any other suitable feedback or combination thereof. Functionality associated with squeezing a ring is described more fully herein, and this disclosure contemplates any suitable functionality resulting from strain or pressure placed on and detected by any suitable components.
A wearable electronic device may be attached to a band to affix the device to the user. Here, reference to a “band” may encompass any suitable apparatus for affixing a device to the user, such as for example a traditional band 1405 that can be worn around the arm, wrist, waist, or leg of the user, as illustrated by way of example in
In particular embodiments, sensors and corresponding electronics may be attached to a band, where appropriate. For example, the bands of
This disclosure contemplates any suitable structure for connecting a band as illustrated in
In particular embodiments, a band containing electrical components may also incorporate a traditional physical contact connector, as illustrated by connector 250 of
In particular embodiments, a band may be used to house flexible batteries (such as, e.g., lithium-based batteries) to increase the energy storage of the device. As energy storage capacity may be tied to total volume, batteries internal to the band increase the storage capacity for volume-limited wearable devices without impacting the total size of the device body.
As described more fully below, a wearable electronic device may include one or more sensors on or in the device. For example, a wearable electronic device may include one or more optical sensors or depth sensors. Optical sensors may be placed in any suitable location, such as for example on the face of the device, on a band facing outward from the user's body, on a band facing opposite the face, on a band facing toward the user's body, or any suitable combination thereof.
In particular embodiments, placement of an optical sensor on the band may be adjustable by the user within a predetermined range. In particular embodiments, placement of an optical sensor on the band may be optimized such that the sensor is conveniently aimable by the user. For example, as illustrated by
In particular embodiments, placement of an optical sensor may be such that the user may view the display of the device while the sensor is pointing outward from the user's body. Thus, the user may view content captured by the optical sensor and displayed by the device without blocking the user's view of the physical scene captured by the sensor, as illustrated by the viewing triangle in
In particular embodiments, an optical or depth sensor module (which may be used interchangeably, where appropriate) may communicate with a device via a simple extension of the bus the optical sensor would use if it were directly mounted on the main printed circuit board (PCB), as illustrated in
In one embodiment, the camera control integrated circuit may be mounted directly on a small circuit board at the optical module, as illustrated in
Sensors may internally produce sensor data, which may be simply filtered or reformatted by, for example, a detector or data conditioner. Raw data may be formatted to a uniform format by the data formatter for ingestion by the Application API. Recognizers may use numeric models (such as decision trees), heuristic models, pattern recognition, or any other suitable hardware, software, and techniques to detect sensor data, such as gesture input. Recognizers may be enabled or disabled by the API. In such cases, the associated sensors may also be disabled if the recognizer is not to receive data from the sensors or is incapable of recognizing the sensor data.
A device may incorporate a database of sensor outputs that allow the same detector to detect many different sensor outputs. Depending on the requests produced by the API, a sensor priority decoder may suppress or pass through sensor output based on criteria supplied. The criteria may be a function of the design of the API. In particular embodiments, recognizers may ingest the output of more than one sensor to detect sensor output.
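The following Python sketch is a hypothetical illustration of the pipeline described above — a data formatter, recognizers that may be enabled or disabled, and a sensor priority decoder; the class and method names are assumptions, not an API defined by this disclosure.

```python
# Hypothetical sketch of the sensor pipeline: raw samples are reformatted,
# optionally suppressed by a priority decoder, and passed to enabled recognizers.

from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Sample:
    sensor: str      # e.g. "accelerometer"
    values: tuple    # raw reading
    timestamp: float

class Recognizer:
    def __init__(self, name: str, model: Callable[[Sample], Optional[str]]):
        self.name = name
        self.model = model          # decision tree, heuristic, etc.
        self.enabled = True         # the API may enable or disable recognizers

    def process(self, sample: Sample) -> Optional[str]:
        return self.model(sample) if self.enabled else None

class SensorPriorityDecoder:
    """Suppresses or passes through sensor output based on API-supplied criteria."""
    def __init__(self, allowed_sensors: set):
        self.allowed_sensors = allowed_sensors

    def passes(self, sample: Sample) -> bool:
        return sample.sensor in self.allowed_sensors

def pipeline(sample: Sample, decoder: SensorPriorityDecoder, recognizers: list):
    formatted = Sample(sample.sensor, tuple(float(v) for v in sample.values),
                       sample.timestamp)          # uniform format for the API
    if not decoder.passes(formatted):
        return []
    events = [r.process(formatted) for r in recognizers]
    return [e for e in events if e is not None]   # e.g. detected gestures

# Example: a trivial "lift" recognizer on accelerometer data.
lift = Recognizer("lift", lambda s: "lift" if s.sensor == "accelerometer"
                  and s.values[2] > 9.0 else None)
decoder = SensorPriorityDecoder({"accelerometer"})
print(pipeline(Sample("accelerometer", (0, 0, 9.6), 0.0), decoder, [lift]))
```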
In particular embodiments, multiple sensors may be used to detect similar information. For example, both a normal and a depth sensing camera may be used to detect a finger, or both a gyroscope and a magnetometer may be used to detect orientation. When suitable, functionality that depends on or utilizes sensor information may substitute sensors or choose among them based on implementation and runtime considerations such as cost, energy use, or frequency of use.
Sensors may be of any suitable type, and as described herein, may be located in or on a device body, in or on a band, or a suitable combination thereof. In particular embodiments, sensors may include one or more depth or proximity sensors (terms which may be used interchangeably herein, where appropriate), such as for example infrared sensors, optical sensors, acoustic sensors, or any other suitable depth sensors or proximity sensors. For example, a depth sensor may be placed on or near a display of a device to detect when, e.g., the user's hand, finger, or face comes near the display. As another example, depth sensors may detect any object that a user's finger in the angle of view of the depth sensor is pointing to, as described more fully herein. Depth sensors also or in the alternative may be located on a band of the device, as described more fully herein. In particular embodiments, sensors may include one or more touch-sensitive areas on the device body, band, or both. Touch-sensitive areas may utilize any suitable touch-sensitive techniques, such as for example resistive, surface acoustic wave, capacitive (including mutual capacitive or self-capacitive), infrared, optical, dispersive, or any other suitable techniques. Touch-sensitive areas may detect any suitable contact, such as swipes, taps, contact at one or more particular points or with one or more particular areas, or multi-touch contact (such as, e.g., pinching two or more fingers on a display or rotating two or more fingers on a display). As described more fully herein, touch-sensitive areas may comprise at least a portion of a device's display, ring, or band. As with other sensors, in particular embodiments touch-sensitive areas may be activated or deactivated based, for example, on context, power considerations, or user settings. For example, a touch-sensitive portion of a ring may be activated when the ring is “locked” (e.g. does not rotate) and deactivated when the ring rotates freely. In particular embodiments, sensors may include one or more optical sensors, such as suitable cameras or optical depth sensors.
In particular embodiments, sensors may include one or more inertial sensors or orientation sensors, such as an accelerometer, a gyroscope, a magnetometer, a GPS chip, or a compass. In particular embodiments, output from inertial or orientation sensors may be used to activate or unlock a device, detect one or more gestures, interact with content on the device's display screen or a paired device's display screen, access particular data or activate particular functions of the device or of a paired device, initiate communications between a device body and band or a device and a paired device, or any other suitable functionality. In particular embodiments, sensors may include one or more microphones for detecting e.g. speech of a user, or ambient sounds to determine the context of the device. In addition, in particular embodiments a device may include one or more speakers on the device body or on the band.
In particular embodiments, sensors may include components for communicating with other devices, such as network devices (e.g. servers or routers), smartphones, computing devices, display devices (e.g. televisions or kiosks), audio systems, video systems, other wearable electronic devices, or between a band and a device body. Such sensors may include NFC readers/beacons, BLUETOOTH technology, or antennae for transmission or reception at any suitable frequency.
In particular embodiments, sensors may include sensors that receive or detect haptic input from a user of the device, such as for example piezoelectrics, pressure sensors, force sensors, inertial sensors (as described above), strain/stress sensors, or mechanical actuators. Such sensors may be located at any suitable location on the device. In particular embodiments, components of the device may also provide haptic feedback to the user. For example, one or more rings, surfaces, or bands may vibrate, produce light, or produce audio.
In particular embodiments, a wearable electronic device may include one or more sensors of the ambient environment, such as a temperature sensor, humidity sensor, or altimeter. In particular embodiments, a wearable electronic device may include one or more sensors for sensing a physical attribute of the user of the wearable device. Such sensors may be located in any suitable area, such as for example on a band of the device or on the base of the device contacting the user's skin. As an example, sensors may include acoustic sensors that detect vibrations of a user's skin, such as when the user rubs skin (or clothing covering skin) near the wearable device, taps the skin near the device, or moves the device up and down the user's arm. As additional examples, a sensor may include one or more body temperature sensors, a pulse oximeter, galvanic-skin-response sensors, capacitive imaging sensors, electromyography sensors, biometric data readers (e.g. fingerprint or eye), and any other suitable sensors. Such sensors may provide feedback to the user of the user's state, may be used to initiate predetermined functionality (e.g. an alert to take particular medication, such as insulin for a diabetic), or may communicate sensed information to a remote device (such as, for example, a terminal in a medical office).
A wearable electronic device may include one or more charging components for charging or powering the device. Charging components may utilize any suitable charging method, such as capacitive charging, electromagnetic charging, trickle charging, charging by direct electrical contact, solar, kinetic, inductive, or intelligent charging (for example, charging based on a condition or state of a battery, and modifying charging actions accordingly). Charging components may be located on any suitable portion of the device, such as in or on the body of the device or in or on the band of a device. For example,
Charger 2000 may be made of any suitable material, such as acrylic, and in particular embodiments may have a non-slip material as its backing, such as e.g. rubber. In particular embodiments, charger 2000 may be affixed or attached to a surface, for example may be attached to a wall as illustrated in
As another example of charging components in a wearable electronic device,
In particular embodiments, the band or device may implement an antenna for a wireless charging solution. Since wireless charging operates optimally in the absence of ferrous metals, this permits a wider choice of materials for the body of the device, while improving wireless charging transfer capacity by allowing the coil to be held between the poles of a charging driver (as described above) rather than being simply coplanar with the driver. As described above and illustrated in
In particular embodiments a charging unit with an internal charge reservoir may be associated with a wearable electronic device. When plugged into the wall, the charging unit can charge both an attached device and the charging unit's internal reservoir. When not plugged in, the charging unit can still charge an attached device from its reservoir of power until that reservoir is depleted. When only the charger is connected to a power source without a device, it still charges itself, so that it can provide additional power for the device at a later point. Thus, the charging unit described herein is useful with and without being plugged into a power source, as it also can power any partially-charged device for a while when a person is not able to connect to a power source, for example when travelling, on a plane, at a train station, outdoors, or anywhere a user might need charge for a device but does not have access to a power source. The device can be in standby or in use while the charger charges the device, and no modifications to the software or hardware of the target device are needed. Additional benefits of one or more embodiments of the invention may include reducing the number of items one must carry, providing the benefits of both a charger and a power pack, making the charger useful to carry when on the move, and reducing the number of cables and connectors one must carry to extend the battery life of one's devices. This disclosure contemplates that such a charging unit may be applied to any suitable electronic devices, including but not limited to a wearable electronic device.
As described above, a charging unit can charge a device from the charging unit's internal charging reservoir even when not connected to an external power source, and can charge itself, a connected device, or both when connected to an external power source. This disclosure contemplates any suitable scheme for allocating charge between the charging unit and device. Such an allocation scheme may depend on the amount of charge internal to the device, the amount of charge internal to the charging unit, the amount of power being consumed by the device, the charging capabilities of an external power source, or any suitable combination thereof. In addition or in the alternative, a charging threshold may determine which allocation scheme to use. For example, one charging scheme may be used when the device is near full charge and the charging unit has little charge left, and another may be used when the device has little charge left.
Continuing the example of
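By way of illustration only, the following Python sketch shows one possible allocation policy of the kind described above; the thresholds and rates are assumed values, not an allocation scheme required by this disclosure.

```python
# Hypothetical sketch of allocating charge between a charging unit's internal
# reservoir and an attached device; thresholds and rates are assumed values.

def allocate_charge(device_level: float, reservoir_level: float,
                    external_power: bool, total_rate: float = 1.0):
    """Return (rate_to_device, rate_to_reservoir) as fractions of total_rate."""
    if external_power:
        # Favor the device until it nears full charge, then top up the reservoir.
        if device_level < 0.9:
            return 0.8 * total_rate, 0.2 * total_rate
        return 0.2 * total_rate, 0.8 * total_rate
    # No external power: drain the reservoir into the device only while the
    # device actually needs charge and the reservoir is not depleted.
    if reservoir_level > 0.05 and device_level < 1.0:
        return total_rate, -total_rate   # reservoir discharges into the device
    return 0.0, 0.0

# Example: device nearly full, charging unit low, plugged into the wall.
print(allocate_charge(device_level=0.95, reservoir_level=0.1, external_power=True))
```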
In particular embodiments, functionality or components of the device (such as e.g. sensors) may be activated and deactivated, for example, to conserve power or reduce or eliminate unwanted functionality. For example, a locked state detector detects when the device is inactivated, and disables sensors as needed to conserve power, while monitoring the sensor data for a gesture or other suitable input that may reactivate the device. A device may have one or more power modes, such as sleep mode or fully active mode. As one example, in particular embodiments the device is arm-worn, and a touch surface of the device may come in contact with objects and persons while in regular use. To prevent accidental activation, an accelerometer or other inertial sensor in the body or band of the device can be used to gauge the approximate position of the device relative to the gravity of the Earth. If the gravity vector is detected towards the sides of the device (e.g. the device is determined to be at the user's side or the display is determined not to be pointed at the user), the touch screen can be locked and the display disabled to reduce energy use. When the gravity vector is determined to be pointing below the device (e.g. the device is roughly horizontal, resulting in a determination that the user is viewing or otherwise using the device), the system may power up the display and enable the touch screen for further interactions. In particular embodiments, in addition or in the alternative to the direction of the gravity vector waking or unlocking a device, a rate of change of the direction or magnitude of the gravity vector may be used to wake or unlock a device. For example, if the rate of change of the gravity vector is zero for a predetermined amount of time (in other words, the device has been held in a particular position for the predetermined amount of time), the device may be woken or unlocked. As another example, one or more inertial sensors in the device may detect a specific gesture or sequence of gestures for activating a display or other suitable component or application. In particular embodiments, the encoder of the device is robust to accidental activation, and thus can be left active so that the user may change between selections while bringing the device up to their angle of view. In other embodiments the encoder may be deactivated based on context or user input.
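The following Python sketch is a hypothetical illustration of gravity-based locking as described above; the axis convention, thresholds, and function names are assumptions.

```python
# Hypothetical sketch of gravity-based display locking: estimate the gravity
# vector from the accelerometer and enable the display only when the device
# face is roughly horizontal (likely pointed toward the user).

import math

def gravity_direction(ax: float, ay: float, az: float) -> str:
    """Classify the dominant direction of the gravity vector in device axes.
    Axis convention is assumed: +z points out of the display face."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g < 1e-6:
        return "unknown"
    if abs(az) / g > 0.8:
        return "below_device"    # device roughly horizontal, display facing up/down
    return "side_of_device"      # device likely at the user's side

def update_power_state(ax: float, ay: float, az: float,
                       held_still_s: float, still_threshold_s: float = 1.0) -> str:
    direction = gravity_direction(ax, ay, az)
    if direction == "below_device" and held_still_s >= still_threshold_s:
        return "display_on"      # user likely viewing the device
    return "display_locked"      # conserve energy, ignore accidental touches

# Example: wrist raised and held still for 1.2 s.
print(update_power_state(0.3, 0.2, 9.7, held_still_s=1.2))
```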
In addition or in the alternative to power conservation, particular embodiments may lock one or more sensors, particular functionality, or particular applications to provide security for one or more users. Appropriate sensors may detect activation or unlocking of the secure aspects of the device or of another device paired with or communicating with the wearable device. For example, a specific gesture performed with the device or on a touch-sensitive area of the device may unlock one or more secure aspects of the device. As another example, a particular rotation or sequence of rotations of a rotatable ring of the device may unlock one or more secure aspects of the device, on its own or in combination with other user input. For example, a user may turn a rotatable ring to a unique sequence of symbols, such as numbers or pictures. In response to receiving the sequence of rotational inputs used to turn the rotatable ring, the display may display the specific symbol(s) corresponding to each rotational input, as described more fully herein. In particular embodiments, the symbols used may be user-specific (such as, e.g., user pictures stored on or accessible by the device or symbols pre-selected by the user). In particular embodiments, different symbols may be presented to the user after a predetermined number of unlockings or after a predetermined amount of time. The example inputs described above may also or in the alternative be used to activate/deactivate aspects of the device, particular applications, or access to particular data. While this disclosure describes specific examples of user input unlocking secure aspects of a device, this disclosure contemplates any suitable input or combination of inputs for unlocking any secure aspect of the device. This disclosure contemplates that input or other suitable parameters for unlocking secure aspects of a device or activating/deactivating components of the device may be user-customizable.
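By way of illustration only, the following Python sketch shows one way a sequence of rotational inputs might be compared against a stored symbol sequence; the symbol set and the mapping from ring position to symbol are assumed.

```python
# Hypothetical sketch of unlocking via a sequence of rotational inputs on the
# outer ring; the symbol set and sequence length are illustrative assumptions.

SYMBOLS = ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"]  # could be user pictures

def symbol_for_position(ring_position: int) -> str:
    """Map the ring's rotational position to the symbol shown on the display."""
    return SYMBOLS[ring_position % len(SYMBOLS)]

def check_unlock(entered_positions: list, secret_symbols: list) -> bool:
    """Compare the symbols selected by successive ring rotations against the
    user's stored secret sequence."""
    entered_symbols = [symbol_for_position(p) for p in entered_positions]
    return entered_symbols == secret_symbols

# Example: the user dials 3, then 7, then 1 with the outer ring.
secret = ["3", "7", "1"]
print(check_unlock([3, 7, 11], secret))   # 11 % 10 == 1, so this unlocks
```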
In particular embodiments, a wearable electronic device may detect one or more gestures performed with or on the device. Gestures may be of any suitable type, may be detected by any suitable sensors (e.g. inertial sensors, touch sensors, cameras, or depth sensors), and may be associated with any suitable functionality. For example, one or more depth sensors may be used in conjunction with one or more cameras to capture a gesture. In particular embodiments, several depth sensors or cameras may be used to enhance the accuracy of detecting a gesture or the background associated with a gesture. When appropriate, sensors used to detect gestures (or processing used to initiate functionality associated with a gesture) may be activated or deactivated to conserve power or provide security, as described more fully above. As shown above,
In particular embodiments, gestures may include gestures that involve at least one hand of the user and an appendage on which the device is worn, such as e.g. the other wrist of the user. For example, in particular embodiments, a user may use the hand/arm on which the device is worn to appropriately aim an optical sensor of the device (e.g. a camera or depth sensor) and may move or position the other arm/hand/fingers to perform a particular gesture. As described herein and illustrated in
As examples of functionality associated with this gesture, a camera may focus on the object, the object detected and pointed at may then appear on the display, information about that object may appear on the display, and displayed content may be transferred to another device's display (e.g. when the object is another device).
In particular embodiments, a gesture may include a motion of the wearable device, such as, for example, by the arm wearing the device. The motion may be detected by any suitable sensors, such as inertial sensors, orientation sensors, or any suitable combination thereof.
Like for
In particular embodiments, a gesture may optionally include detecting some non-motion or non-orientation input. For example
In particular embodiments, a gesture may include interacting directly with the body or band of a wearable device. For example
In particular embodiments, a gesture may include contact with skin near the device.
In particular embodiments, gestures may involve detecting metaphoric gestures made by the hand not wearing the device. For example, such gestures may be detected by, e.g., any suitable front-facing sensor on or near the display of the device, oriented such that the hand not wearing the device is in the angle of view of the sensor.
In particular embodiments, a gesture may involve the entire appendage on which a device is affixed or worn. For example,
In particular embodiments, a user may interact with the device via a variety of input mechanisms or types including, for example, the outer ring, touch-sensitive interfaces (e.g. the touch-sensitive layer), gestures performed by the user (described herein), or a speech interface (e.g. including voice input and speech recognition for applications including text input, communication, or searching). Additionally, in particular embodiments, a user may interact with a graphical user interface presented on a circular display of the device via any of the input mechanisms or types.
A user of the wearable electronic device may interact with the device (including, e.g., a graphical user interface presented on the circular display) by using the outer ring. In particular embodiments, the outer ring may be touch-sensitive, such that a user's touch on one or more portions of the ring may be detected as an input to the device and interpreted, causing one or more actions to be taken by the device (e.g. within a graphical user interface of the device). As an example, a touch-sensitive outer ring may be a capacitive ring or inductive ring, and a user of the device may perform any suitable touch gesture on the touch-sensitive ring to provide input to the device. The input may, for example, include swiping the ring with one finger, swiping the ring with two or more fingers, performing a rotational gesture with one or more fingers, or squeezing the ring. In particular embodiments, the outer ring may be rotatable, such that a physical rotation of the ring may serve as an input to the device. Additionally, in particular embodiments, the outer ring may be clicked (e.g. pressed down) or squeezed. Any of the embodiments of the outer ring may be combined, as suitable, such that the ring may be one or more of touch-sensitive, rotatable, clickable (or pressable), or squeezable. Inputs from the different modalities of the outer ring (e.g. touch, rotation, clicking or pressing, or squeezing) may be interpreted differently depending, for example, on the combination of the modalities of input provided by a user. As an example, a rotation of the outer ring may indicate a different input than a rotation in combination with a clicking or pressing of the ring. Additionally, feedback may be provided to the user when the user provides input via the outer ring, including haptic feedback, audio feedback, or visual feedback, described herein.
In particular embodiments, a touch-sensitive interface of the device (e.g. the touch-sensitive layer) may accept user touch input and allow the device to determine the x-y coordinates of a user's touch, identify multiple points of touch contact (e.g. at different areas of the touch-sensitive layer), and distinguish between different temporal lengths of touch interaction (e.g. differentiate gestures including swiping, single tapping, or double tapping). Touch gestures (described herein) may include multi-directional swiping or dragging, pinching, double-tapping, pressing or pushing on the display (which may cause a physical movement of the display in an upward or downward direction), long pressing, multi-touch (e.g. the use of multiple fingers or implements for touch or gesturing anywhere on the touch-sensitive interface), or rotational touch gestures.
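The following Python sketch is a hypothetical illustration of differentiating touch interactions by duration and displacement, as described above; the thresholds are assumed values.

```python
# Hypothetical sketch of distinguishing touch gestures on the touch-sensitive
# layer by duration and displacement; thresholds are assumed values.

from dataclasses import dataclass

@dataclass
class TouchEvent:
    x0: float; y0: float        # touch-down coordinates
    x1: float; y1: float        # touch-up coordinates
    duration_s: float           # time between touch-down and touch-up
    taps: int = 1               # number of taps within the double-tap window

SWIPE_MIN_PX = 20.0
LONG_PRESS_MIN_S = 0.8

def classify(event: TouchEvent) -> str:
    dx, dy = event.x1 - event.x0, event.y1 - event.y0
    if (dx * dx + dy * dy) ** 0.5 >= SWIPE_MIN_PX:
        if abs(dx) >= abs(dy):
            return "swipe_right" if dx > 0 else "swipe_left"
        return "swipe_down" if dy > 0 else "swipe_up"
    if event.duration_s >= LONG_PRESS_MIN_S:
        return "long_press"
    return "double_tap" if event.taps >= 2 else "single_tap"

# Example: a quick horizontal drag of 40 pixels is classified as a swipe.
print(classify(TouchEvent(10, 50, 50, 52, 0.15)))
```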
In particular embodiments, a graphical user interface of the device may operate according to an interaction and transition model. The model may, for example, determine how modes including applications, functions, sub-modes, confirmations, content, controls, active icons, actions, or other features or elements may be organized (e.g. in a hierarchy) within a graphical user interface of the device.
In one embodiment, the graphical user interface (GUI) includes multiple top-level screens that each correspond to a different mode or application (or sub-mode, function, confirmation, content, or any other feature) of the device. Each of these applications may be on the same level of the hierarchy of the interaction and transition model of the GUI.
In one embodiment, the model may include operability for differentiation of the “left” and “right” sides in relation to the home screen. As an example, one or more of the top-level screens may be associated with modes or applications (or other features) in the hierarchy of the interaction and transition model of the GUI that are fixed (e.g. always available to the user) or contextual or dynamic (e.g. available depending on context). The contextual screens may, for example, reflect the modes, applications, or functions most recently used by the user, the modes, applications, or functions most recently added (e.g. downloaded) by the user, ad-hoc registered devices (that may, for example, enter or exit the communication range of the device as it is used), modes, applications, or functions that are “favorites” of the user (e.g. explicitly designated by the user), or modes, applications, or functions that are suggested for the user (e.g. based on the user's prior activity or current context).
In particular embodiments, the top level of the hierarchy of the interaction and transition model of the GUI may include only “faces,” and the next level of the hierarchy may include applications (or any other features). As an example, the top level of the hierarchy may include a home screen (e.g. the clock), and one or more faces, each face corresponding to a different type of background, mode, or activity such as a wallpaper (e.g. customizable by the user), weather information, a calendar, or daily activity information. Each of the faces may show the time in addition to any other information displayed. Additionally, the face currently displayed may be selected by the user (e.g. via any suitable input mechanism or type) or automatically change based on context (e.g. the activity of the user). The faces to the left of the home screen may be contextual, and the faces to the right of the home screen may be fixed.
In particular embodiments, an input from a user of the device or an input from another input source (e.g. via any of the variety of input mechanisms or types including the outer ring, touch-sensitive interfaces, gestures, speech, or sensors), or a context of use of the device may cause a transition within the GUI from a screen at one level of the hierarchy of the interaction and transition model of the GUI to a screen at another level of the hierarchy. For example, a selection event or input by the user (e.g. a touch or tap of the display, voice input, eye gazing, clicking or pressing of the outer ring, squeezing of the outer ring, any suitable gestures, internal muscular motion detected by sensors, or other sensor input) may cause a transition within the GUI from a top-level screen to a screen nested one level deeper in the hierarchy. If, for example, the current screen is a top-level screen associated with an application, a selection event (e.g. pressing the ring) selects the application and causes the GUI to transition to a screen nested one layer deeper. This second screen may, for example, allow for interaction with a feature of the selected application and may, in particular embodiments, correspond to a main function of the selected application. There may be multiple screens at this second, nested layer, and each of these screens may correspond to different functions or features of the selected application. Similarly, a “back” selection input or event by the user (e.g. a double pressing of the outer ring or a touch gesture in a particular part of the display) may cause a transition within the GUI from one screen (e.g. a feature of a particular application) to another screen that is one level higher in the hierarchy (e.g. the top-level application screen).
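By way of illustration only, the following Python sketch models a screen hierarchy in which a selection event descends one level and a “back” event ascends one level, as described above; the class names are hypothetical.

```python
# Hypothetical sketch of the interaction and transition model as a screen
# hierarchy: a selection event descends one level, a "back" event ascends.

class Screen:
    def __init__(self, name: str, children=None):
        self.name = name
        self.children = children or []

class GUI:
    def __init__(self, top_level_screens):
        self.home = Screen("home (clock)", top_level_screens)
        self.path = [self.home]          # current position in the hierarchy

    @property
    def current(self) -> Screen:
        return self.path[-1]

    def select(self, index: int = 0):
        """Selection event (e.g. pressing the ring): go one level deeper."""
        if self.current.children:
            self.path.append(self.current.children[index])

    def back(self):
        """Back event (e.g. double-pressing the ring): go one level higher."""
        if len(self.path) > 1:
            self.path.pop()

# Example: home -> music application -> "now playing" feature, then back.
music = Screen("music", [Screen("now playing"), Screen("playlists")])
gui = GUI([music, Screen("weather")])
gui.select(0); gui.select(0)
print([s.name for s in gui.path])   # ['home (clock)', 'music', 'now playing']
gui.back()
print(gui.current.name)             # 'music'
```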
In particular embodiments, an interaction layout may structure an interaction and transition model of a GUI of the device. An interaction layout may be applied to any suitable interaction model and need not be dependent on any specific type of motion or animation within a GUI of the device, for example. Although specific examples of interaction layouts are discussed below, any suitable interaction layout may be used to structure an interaction and transition model.
As one example, a panning linear interaction layout may structure an interaction and transition model of a GUI of the device. In a panning-linear-type GUI, elements or features within a layer may be arranged to the left and right of the currently displayed element or feature. User input such as a rotation of the outer ring in a clockwise or counterclockwise direction navigates within a single layer of the model hierarchy. As an example, a rotation of the outer ring clockwise one rotational increment may display the element or feature to the right (e.g. the next element), and a rotation counterclockwise one rotational increment may display the element or feature to the left (e.g. the previous element). In particular embodiments, a fast rotation clockwise or counterclockwise may cause the GUI to perform accelerated browsing. In such an embodiment, a single turn may cause the GUI to transition through multiple elements or features, rather than a single element or feature, as described herein. Different user input may navigate between layers (e.g. either deeper layers or higher layers) in the model hierarchy. As an example, if the user touches or taps the touch-sensitive layer of the display, the GUI may transition one layer deeper in the model hierarchy (e.g. confirming the user's selection or providing options related to the selection). Any suitable input by the user may cause the GUI to transition between layers in the model hierarchy, either in place of or in addition to touch- or tap-based input.
As another example, if the user presses a particular region of the touch-sensitive layer of the display (e.g. designated as a “back” button), or if the user double-taps the touch-sensitive layer of the display, the GUI may transition one layer higher in the model hierarchy (e.g. to the previous layer). If, for example, the user performs a long press of the display or screen, the GUI may transition back to the home screen (e.g. a clock). Without additional user input, the GUI may also transition back to the home screen after a pre-determined period of time (e.g. a timeout period). As described herein, as a user begins, for example, to rotate the outer ring in a clockwise or counterclockwise fashion, the GUI transitions within the same layer, and the next user interface element or feature (e.g. a breadcrumb icon in the same layer) to the right or left, respectively, may begin to appear while the current user interface element or feature may begin to disappear.
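The following Python sketch is a hypothetical illustration of panning-linear navigation with accelerated browsing and a home-screen timeout; all thresholds are assumed values.

```python
# Hypothetical sketch of panning-linear navigation: rotation moves within a
# layer, a fast rotation skips multiple elements (accelerated browsing), and a
# timeout returns to the home screen. Thresholds are assumed values.

FAST_ROTATION_DEG_PER_S = 180.0
ACCELERATED_STEP = 3
TIMEOUT_S = 30.0

def pan(index: int, elements: list, increments: int, speed_deg_per_s: float) -> int:
    """Rotate clockwise (positive increments) or counterclockwise (negative)."""
    step = ACCELERATED_STEP if speed_deg_per_s >= FAST_ROTATION_DEG_PER_S else 1
    new_index = index + step * increments
    return max(0, min(len(elements) - 1, new_index))   # clamp within the layer

def maybe_timeout(idle_s: float, index: int, home_index: int = 0) -> int:
    """Without user input for the timeout period, return to the home screen."""
    return home_index if idle_s >= TIMEOUT_S else index

# Example: a slow increment moves one element; a fast spin skips three.
apps = ["clock", "messages", "music", "weather", "fitness"]
i = pan(0, apps, increments=1, speed_deg_per_s=40.0)    # -> "messages"
i = pan(i, apps, increments=1, speed_deg_per_s=300.0)   # -> "fitness"
print(apps[i], apps[maybe_timeout(45.0, i)])
```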
As another example, a panning radial (or panning circular) interaction layout may structure an interaction and transition model of a GUI of the device. In a panning-radial-type GUI, elements or features in a layer may be arranged above and below the currently displayed element or feature. User input such as a rotation of the outer ring in a clockwise or counterclockwise direction navigates between layers of the model hierarchy. As an example, a rotation of the outer ring clockwise one increment may cause the GUI to transition one layer deeper in the model hierarchy (e.g. entering a particular application's layer or confirming selection of the application), and a rotation counterclockwise one increment may cause the GUI to transition one layer higher in the model hierarchy (e.g. exiting a particular application's layer to the previous layer). In particular embodiments, a fast rotation clockwise or counterclockwise may cause the GUI to perform accelerated browsing, as described herein. In such an embodiment, a single rotational increment may cause the GUI to transition through multiple layers of the hierarchy, rather than a single layer. Different user input may navigate within a single layer in the model hierarchy. As an example, if the user touches or taps the touch-sensitive layer of the display, the GUI may transition to the next element or feature (e.g. the element below the currently displayed element). As another example, if the user presses a particular region of the touch-sensitive layer of the display (e.g. designated as a “back” button), or if the user double-taps the touch-sensitive layer of the display, the GUI may transition to a previous element or feature (e.g. the element above the currently displayed element). If, for example, the user performs a long press of the display or screen, the GUI may transition back to the home screen (e.g. a clock). Without additional user input, the GUI may also transition back to the home screen after a pre-determined period of time (e.g. a timeout period). As described herein, as a user begins, for example, to rotate the outer ring in a clockwise or counterclockwise fashion, the GUI transitions to a different layer, and the next user interface element or feature (e.g. in a different layer) may begin to appear while the current user interface element or feature may begin to disappear.
As yet another example, an accordion-type interaction layout may structure an interaction and transition model of a GUI of the device. In an accordion-type GUI, elements or features of multiple layers may be arranged in a circular list structure. For example, rotating within the list structure (e.g. by rotating the outer ring) in a first direction past a screen associated with the last element or feature in that direction (e.g. the last fixed application of the device) may cause the GUI to transition to a screen associated with the last element or feature in a second direction (e.g. the least-recently used contextual application of the device). Continuing to rotate in the first direction may cause the GUI to transition through screens associated with contextual applications in “reverse” order (e.g. from least-recently used to most-recently used). Similarly, rotating in the second direction past the screen of the least-recently used contextual application may cause the GUI to transition to the screen associated with the last fixed application, and continuing to rotate in the second direction may cause the GUI to transition through the screens of the fixed applications in reverse order (e.g. from the last fixed application to the first, adjacent to the home screen). In an accordion-type GUI, the element or feature currently displayed may be “expanded” (e.g. if selected by the user) such that its sub-elements or sub-features may become part of the single-layer list structure. In particular embodiments, an element or feature with sub-elements may indicate (when displayed) that it has sub-elements through, for example, visible edges of the sub-elements. User input such as a rotation of the outer ring in a clockwise or counterclockwise direction navigates within a single layer of the model, which may include elements or features, as well as sub-elements or sub-features of a selected element or feature. As an example, a rotation of the outer ring clockwise one increment may display the element or feature to the right (e.g. the next element), and a rotation counterclockwise one increment may display the element or feature to the left (e.g. the previous element). In particular embodiments, a fast rotation clockwise or counterclockwise may cause the GUI to perform accelerated browsing. In such an embodiment, a single rotational increment may cause the GUI to transition through multiple elements or features, rather than a single element or feature. Different user input may cause the selection and expansion of an element or feature in the model. As an example, if the user touches or taps the touch-sensitive layer of the display, the GUI may expand the displayed feature or element within the existing layer and transition to a sub-element or sub-feature. As another example, if the user presses a particular region of the touch-sensitive layer of the display (e.g. designated as a “back” button), or if the user double-taps the touch-sensitive layer of the display, the GUI may collapse the expanded sub-elements or sub-features and transition to an element or feature in the list. If, for example, the user performs a long press of the display or screen, the GUI may transition back to the home screen (e.g. a clock). Without additional user input, the GUI may also transition back to the home screen after a pre-determined period of time (e.g. a timeout period). 
As described herein, as a user begins, for example, to rotate the outer ring in a clockwise or counterclockwise fashion, the GUI transitions within the same layer, and the next user interface element or feature (e.g. a breadcrumb icon in the same layer) to the right or left, respectively, may begin to appear while the current user interface element or feature may begin to disappear.
In particular embodiments, the GUI may navigate to a home screen based on input received by a user of the device. The user input may include, for example, pressing and holding (e.g. a long press) the touch-sensitive layer, pressing and holding the display, pressing (e.g. clicking) and holding the outer ring, squeezing and holding the outer ring, covering the face (e.g. the display) of the device, covering a particular sensor of the device, turning the face of the device in a downward direction, pressing a software button (discussed herein), pressing a hardware button on the device, or shaking the device (or any other suitable gesture). Any of these inputs or any variation of these inputs (including, for example, shorter durations) may be used as user inputs to go “back” within an interaction and transition model.
In particular embodiments, the GUI of the device may display particular types of content including, for example, lists.
In particular embodiments, the GUI of the device may display vertically or horizontally continuous (or substantially continuous) content including, for example, charts or text. In particular embodiments, an input from the user (e.g. any suitable input mechanism or type) may cause a selection indicator of the GUI to move through the continuous content. In other embodiments, an input from the user may cause the content to move into and out of the display in a horizontal direction, vertical direction, or any other direction mapped to the user's input (and the selection indicator, if present, may remain in a constant position). In the example of
In particular embodiments, the GUI may display content that is of a size larger than the display. In such embodiments, the GUI may scale or crop (or otherwise shrink or fit) the content so that all of the content may be displayed within the display at one time. In other embodiments, the GUI does not alter the size of the content, and instead provides the ability for the user to pan through the content one portion at a time, for example using scrolling (described herein).
In particular embodiments, the device includes the circular display, and the GUI includes circular navigation and menu layouts. This disclosure contemplates any shape for the display, however, and any suitable navigation or menu layout for the GUI. The menu layout may provide a user a visual indication of where the user is located within an interaction and transition model hierarchy of the GUI, for example. The menu layout may also provide visual indicators that allow the user to differentiate between different types of menu items, as well as show an overall view of menu options. Additionally, the menu may be displayed over any suitable background or content of the device.
In particular embodiments, the GUI may display both an item of reference or background content as well as an indication of an available action or function to be performed with respect to the reference or background content.
In particular embodiments, icons displayed in the GUI of the device may optimize the energy or battery usage of the device. As an example, an icon may include a primarily black background with the icon itself composed of thin white strokes. This may allow the amount of white color on the display screen to be very low, allowing for reduced energy consumption of the display while the GUI is used. The icons displayed in the GUI may also include real-time notifications. For example, a mobile phone icon may include a notification with the number of new voicemails, an e-mail icon may include a notification with the number of new e-mails, a chat icon may include a notification with the number of new chat messages, and a telephone icon may include a notification with the number of missed calls. In particular embodiments, the GUI of the device only displays colors other than black and white for user-generated content (e.g. pictures, files, contacts, notifications, or schedules). Other information, including menu items, may be displayed in black and white.
In particular embodiments, as the GUI transitions from one element (e.g. feature, content item, or icon) to another (e.g. upon receiving input from a user), the GUI may display visual transition effects. These transition effects may depend, for example, on the type of input received from a user of the device. As an example, a single touch on the display may trigger particular transition effects, while a rotation of the outer ring may trigger a different (potentially overlapping) set of transition effects.
In particular embodiments, a user's touch input on the touch-sensitive layer may trigger transition effects including center-oriented expansion, directional sliding, and scaling in or out.
In particular embodiments, a user's rotation of the outer ring may trigger visual transition effects including zooming, directional sliding, blurring, masking, page folding, rotational movement, and accelerated motion.
In particular embodiments, the GUI of the device may include a physical model that takes into account motion of the user and produces visual feedback reflecting the user's movements. As an example, once there is activation input (e.g. in the form of a particular gesture) by the user, the user's motion may be continuously tracked through input from one or more of the sensors of the device. The visual feedback may reflect the user's motion in the user interface, while the underlying content stays still, so that gestures may be registered and parallax may be used to distinguish between UI features or controls and underlying content. In particular embodiments, the physical model may include a generalized spring model with damping. In such a model, items may be arranged in layers. Deeper layers may have a “stiffer” spring in the physical model holding items in place. This may cause bottom layers of the user interface to move slightly when the device is moved, while top layers may move more, creating a sense of parallax. Additionally, the spring model may include damping, which causes motion to lag, creating a more fluid, smooth motion.
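By way of illustration only, the following Python sketch integrates a damped spring per layer, with a stiffer spring for deeper layers so that they displace less than top layers when the device moves; the constants and layer names are assumed values.

```python
# Hypothetical sketch of the damped spring model used for parallax: deeper UI
# layers get stiffer springs, so they track their rest position more tightly,
# while top layers lag and displace further. Constants are assumed values.

def step_layer(offset: float, velocity: float, device_accel: float,
               stiffness: float, damping: float, dt: float = 1.0 / 60.0):
    """One integration step of a damped spring holding a layer near its rest
    position while device motion applies an inertial force to it."""
    accel = -device_accel - stiffness * offset - damping * velocity
    velocity += accel * dt
    offset += velocity * dt
    return offset, velocity

layers = {
    "deep layer": {"offset": 0.0, "vel": 0.0, "k": 60.0, "c": 12.0},  # stiffer spring
    "top layer":  {"offset": 0.0, "vel": 0.0, "k": 15.0, "c": 4.0},   # looser spring
}
for _ in range(30):   # half a second at 60 Hz while the device accelerates
    for layer in layers.values():
        layer["offset"], layer["vel"] = step_layer(
            layer["offset"], layer["vel"], device_accel=50.0,
            stiffness=layer["k"], damping=layer["c"])
print({name: round(l["offset"], 2) for name, l in layers.items()})
# The looser top layer displaces further than the stiff deep layer (parallax).
```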
In particular embodiments, the GUI of the device may include faces as default screens or wallpapers for the device, and these faces may be part of an interaction and transition model hierarchy (e.g. in the top layer of the hierarchy or as a home screen). As described herein, these faces may be changeable applications or modes that may automatically respond contextually to a user's activity. As an example, the faces may change depending on the user's environment, needs, taste, location, activity, sensor data, gestures, or schedule. The availability of a face (or the transition in the GUI from one face to another) may be determined based on contextual information. As an example, if the user has an upcoming event scheduled in her calendar, the face of the device may change to a calendar face that displays the upcoming event information to the user. As another example, if the user is determined to be in the vicinity of her home (e.g. based on GPS data), the face of the device may change to a face associated with a home-automation application. As yet another example, if the user is determined (e.g. based on various biometric sensors such as heart rate or arousal sensors, or based on accelerometers) to be moving vigorously, the face of the device may change to a fitness mode, showing the user's measured pulse, calories burned, time elapsed since the activity (e.g. a run) began, and the time. Any suitable sensor data (e.g. from sensors including biometric sensors, focus sensors, or sensors which may determine a user's hand position while driving a car) may be used to determine a context and appropriate face to display to the user. The user's historical usage of the device (e.g. a particular time of day when the user has used a fitness application, such as in a fitness class) may also determine which face is displayed on the device. As an example, the device may anticipate the user's need for the fitness mode at the particular time of day when the user tends to exercise. Contextual faces may also be associated with the suppression of notifications (e.g. if the user is determined to be driving or if the device is not being worn) or a change in how notifications are expressed (e.g. visually, or audibly). In particular embodiments, the faces of the device need not be associated with any application on the device and may be wallpapers or backgrounds on the display of the device. Faces may be dedicated to specific channels of information (e.g. calendar feeds, health or activity feeds, notifications, weather feeds, or news). As an example, a severe weather notification or alert (received, e.g., from a weather feed) may cause the weather face to be displayed on the display along with the notification. Faces may display the time (e.g. in analog or digital format) regardless of the type of face. The faces may be customizable by the user. The user's customizations or tastes may be input explicitly by the user (e.g. to management software on the device or a paired device) or learned directly by the device (e.g. using sensor and usage data to create a model over time).
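The following Python sketch is a hypothetical illustration of choosing a contextual face from simple signals of the kind described above; the rules and field names are assumptions and are not an exhaustive policy.

```python
# Hypothetical sketch of selecting a contextual face; the rules and sensor
# fields are illustrative assumptions, not an exhaustive policy.

def choose_face(context: dict) -> str:
    """Pick a face from simple contextual signals; the time is always shown."""
    if context.get("severe_weather_alert"):
        return "weather"
    if context.get("minutes_to_next_event", float("inf")) <= 30:
        return "calendar"
    if context.get("heart_rate_bpm", 0) > 120 and context.get("moving"):
        return "fitness"
    if context.get("near_home"):
        return "home_automation"
    return "clock"   # default home face / wallpaper

# Example: the user is running with an elevated heart rate.
print(choose_face({"heart_rate_bpm": 145, "moving": True}))   # -> "fitness"
```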
In particular embodiments, the device may be worn on a limb of a user (without obscuring the user's face and without requiring the user to hold the device) and may include augmented reality (AR) functionality. This AR functionality may be based on the use of body motion for aiming a camera of the device, which may allow for aiming with higher accuracy due to a user's sense of proprioception. This type of system may allow the user of the device to view an object in the real world at the same time that the user views a version of the object (e.g. captured by a camera of the device) on the display. An example of this AR capability is illustrated in
In particular embodiments, if the device does not have the capability to calculate features of interest itself, the device may capture an image, transfer the image to a communicatively coupled device (e.g. a nearby device such as a phone or personal computer) or to an Internet-based service, where the features of interest may be calculated remotely. Once the features of interest are determined, an Internet-based service or local data catalog may be consulted for additional information about a recognized object. If information is found, the relevant data may be displayed to the user on the device along with the recognized feature.
The device may, in particular embodiments, have a small form factor and be constrained in terms of available memory, processing, and energy. A delegation model may allow the device to delegate portions of one or more processing tasks (e.g. tasks related to AR functionality) to nearby devices (e.g. phone or personal computer) or to network- or Internet-based services, for example. As an example, for delegable tasks, the application requiring the task provides the system (e.g. a kernel of an operating system of the device) with characteristics or a profile of the task, including the task's latency sensitivity, processing requirements, and network payload size. This may be done for each delegable subtask of the overall delegable task. Since tasks are often pipelined, contiguous chunks of the task pipeline may be delegated. The system may, in particular embodiments, take measurements of or build a model of one or more characteristics of the device. Characteristics of the device may include static properties of the device, e.g. properties of hardware components of the device including total memory installed, maximum CPU speed, maximum battery energy, or maximum bandwidth of a network interface. Characteristics of the device may also include dynamic properties of the device, e.g. operating properties of the device including available memory, current CPU capacity, available energy, current network connectivity, availability of network-based services, a tally of average user behavior among one or more users, or a predicted or expected processing time of a task (e.g. given a particular usage scenario). In particular embodiments, the device may have a model that incorporates previous and current measurements of device characteristics to aid in determining future device behavior. Based on the task characteristics or profile and these measurements or models, as well as based on whether the task may be executed on the device, the system may delegate (or not delegate) one or more portions of the task or task pipeline. For example, if the available memory on the device cannot support the processing of a task (e.g. playing a video), one or more portions of the task may be delegated. As another example, if the CPU capacity of the device cannot support processing a task (e.g. if the CPU is running at capacity due to its existing load), one or more portions of the task may be delegated. As another example, if a battery level of the device is low and the battery is not expected to provide energy to the device for as long as the expected processing time of the task, one or more portions of the task may be delegated. As another example, if the network connectivity of the device is low or non-existent, one or more portions of the task may not be delegated (e.g. if the device also has enough available memory, CPU capacity, and energy). As another example, if one or more network-based services are available to the device (e.g. cloud-based services for processing) and the device has suitable network connectivity (e.g. good available bandwidth), one or more portions of the task may be delegated. As another example, if a user of the device typically (e.g. historically) delegates the playing of videos, one or more portions of the task of playing a video may be delegated. As another example, if a predicted processing time of the task (e.g. predicted based on a model incorporating previous and current measurements of device characteristics) is beyond a certain threshold (e.g. several minutes), the task may be delegated. 
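By way of example and not by way of limitation, the following sketch combines a hypothetical task profile with measured device characteristics along the lines of the examples above; the field names, thresholds, and ordering of checks are assumptions, not a required delegation policy. Pipelined tasks could apply the same check to each contiguous chunk of the pipeline rather than to the task as a whole.

```python
# Illustrative delegation decision combining a task profile with device
# characteristics. All fields and thresholds are hypothetical examples.
from dataclasses import dataclass

@dataclass
class TaskProfile:
    latency_sensitive: bool
    required_memory_mb: int
    expected_runtime_s: float
    network_payload_kb: int

@dataclass
class DeviceState:
    available_memory_mb: int
    cpu_headroom: float          # fraction of CPU capacity currently free
    battery_runtime_s: float     # expected remaining runtime at current load
    bandwidth_kbps: float        # 0 means no connectivity
    user_historically_delegates: bool = False

def should_delegate(task: TaskProfile, dev: DeviceState) -> bool:
    if dev.bandwidth_kbps <= 0:
        return False                                  # no connectivity: run locally
    if task.latency_sensitive and dev.bandwidth_kbps < 8 * task.network_payload_kb:
        return False                                  # link too slow for the payload
    if dev.available_memory_mb < task.required_memory_mb:
        return True                                   # cannot hold the task locally
    if dev.cpu_headroom < 0.1:
        return True                                   # CPU already at capacity
    if dev.battery_runtime_s < task.expected_runtime_s:
        return True                                   # battery would not last the task
    if task.expected_runtime_s > 300:
        return True                                   # long tasks (e.g. minutes) delegate
    return dev.user_historically_delegates            # fall back to learned behavior

task = TaskProfile(latency_sensitive=False, required_memory_mb=256,
                   expected_runtime_s=600, network_payload_kb=2048)
dev = DeviceState(available_memory_mb=128, cpu_headroom=0.4,
                  battery_runtime_s=1200, bandwidth_kbps=5000)
print(should_delegate(task, dev))  # -> True (not enough memory for the task)
```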
Any suitable characteristics of the device (e.g. static or dynamic properties) in any suitable combination may be used to determine whether to delegate a task. Furthermore, any suitable characteristics of a task of the device (e.g. a task profile including latency sensitivity, processing requirements, or network payload size) may be used to determine whether to delegate a task, either alone or in conjunction with device characteristics. Additionally, any model of the device (e.g. of device behavior) may be used, either alone or in conjunction with device or task characteristics, to determine whether to delegate a task. In particular embodiments, devices paired with the device may also include a delegation model, such that the paired device (e.g. a phone) performs the same steps, delegating tasks based on its own models of energy, connectivity, runtime requirements, and feasibility. The delegated task may be processed or run to completion on the paired device (e.g. phone), and the results of processing the delegated task may be returned to the device. In particular embodiments, the device may operate in standalone mode (e.g. without delegating any processing tasks) when it does not have any network connectivity or when no paired devices are in range of the device. Once the device regains connectivity, or when a device is paired with the device, delegation of tasks may resume.
An example algorithm of a delegation model of the device is illustrated in
It is contemplated by this disclosure that a delegation model for a particular device (or for a family or range of devices) may be dynamic or contextual. As an example, a delegation model may take into account available memory, CPU capacity, and available energy of a particular device (or a family of devices), factors which may all change over time. The delegation model may also take into account the availability of network- or cloud-based services (and the capacity of each), as well as network connectivity (e.g. bandwidth and latency), which may also change over time. For example, with reference to
The device may choose to delegate functionality to a paired processing-rich device (e.g. a phone, computer, tablet, television, set-top box, refrigerator, washer, or dryer) or to the Internet based on the energy reserves of, or connectivity bandwidth to, each of these locations. For example, a device with a powerful processor may delegate to the paired device when low on energy, or it may choose to delegate to an Internet service when the paired device does not have sufficient power reserves. Likewise, the system of the device may choose to process data locally if the connection to the Internet shows higher latency, thereby reducing the size of the data transfer.
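For illustration only, a simplified sketch of choosing among local processing, a paired device, and an Internet service under such constraints follows; the cutoff values and the ordering of the checks are assumptions.

```python
# Illustrative choice of delegation target based on energy reserves, paired-device
# power, and connection quality. Thresholds are hypothetical.
def choose_target(device_energy: float, paired_energy: float,
                  internet_latency_ms: float, payload_kb: float) -> str:
    if internet_latency_ms > 200 and payload_kb > 512:
        # High-latency link and a large transfer: prefer to stay off the Internet.
        return "paired_device" if paired_energy > 0.2 else "local"
    if device_energy < 0.2:
        # Device low on energy: offload to the paired device if it can afford it,
        # otherwise fall back to the Internet service.
        return "paired_device" if paired_energy > 0.5 else "internet_service"
    return "local"

# Example: low device battery and a well-charged phone -> delegate to the phone.
print(choose_target(device_energy=0.1, paired_energy=0.8,
                    internet_latency_ms=60, payload_kb=128))
```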
In particular embodiments, an entire application or a portion of an application may be delegated by a user of the device to a paired device or vice versa. This may occur on a per-application basis. When the application on a target device (e.g. a television) is to be delegated to the device, the target device may send a request over the paired connection (possibly via an intermediary device, such as a smartphone or personal computer) to load the application on the device. The device may then act as a client to a server running on the paired device (e.g. television). Similarly, an application running on the device may be delegated to the paired device (e.g. a video playing on the device may be delegated to playing on a paired television). For example, if the device is running a first application, and a user of the device wants to interact with a second application, the device may automatically delegate a task of the first application to be processed by another device (e.g. a paired television).
In particular embodiments, a camera or other optical sensor of the device may be used to recognize any gestures performed by the user (e.g. in the space between the camera and a target in the real world). These gestures may, for example, be used to act upon the data presented (e.g. the real world target, such as a sign including text) or may be used to point to particular items upon which augmented reality functions may be performed. For example, the user may point to a word on a sign, causing the device to translate it and display the translation to the user.
In particular embodiments, objects or images may be recognized by the device when they are within the frame of view of a camera of the device. As described herein, there may be multiple ways for the device to recognize an object. As one example, a gesture performed by the user (e.g. a pointing gesture indicating a particular object) may enable AR functionality on the device and cause the device to recognize the object. As another example, automatic object recognition may occur when, for example, the user positions the camera for a certain amount of time on a particular object (e.g. a section of text). As a third example, object recognition or AR functionality may be enabled explicitly by the user when, for example, the user taps or touches the display (or, e.g., clicks the outer ring) when the camera of the device has captured an object of interest. Global object recognition may, in some instances, be computationally intensive and error-prone. As such, in particular embodiments, a limiting set (e.g. the pages of a magazine or catalog, or a catalog of a particular type of object such as plant leaves or book covers) may be applied to improve accuracy. There exist a number of choices for calculating feature vectors from images, from which the designer of the system for the device may select. In some instances, converting feature vectors between different approaches may be computationally expensive, so the database of possible matches corresponding to the chosen approach may be replicated on the device. The calculation of feature vectors may be delegated, as described herein.
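By way of illustration and not by way of limitation, the following sketch matches a query feature vector against a small, pre-selected catalog using nearest-neighbor cosine similarity; the feature extractor is omitted, and the catalog contents, threshold, and names are placeholders.

```python
# Illustrative nearest-neighbor match of an image's feature vector against a
# small, pre-selected catalog (e.g. pages of a particular magazine). A real
# system would use a specific descriptor and store the catalog in that
# descriptor's format, as noted above.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def recognize(query_vec: list[float], catalog: dict[str, list[float]],
              threshold: float = 0.9) -> str | None:
    """Return the best-matching catalog entry, or None if below the threshold."""
    best_name, best_score = None, 0.0
    for name, vec in catalog.items():
        score = cosine_similarity(query_vec, vec)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

catalog = {"magazine_page_12": [0.1, 0.9, 0.3], "book_cover_7": [0.8, 0.2, 0.5]}
print(recognize([0.12, 0.88, 0.28], catalog))  # -> "magazine_page_12"
```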
In particular embodiments, barcodes of various types may be recognized by the device. These barcodes may be used to query Internet-based services for additional data, as well as options to purchase, review, or bookmark the barcoded item for later. While two-dimensional barcodes may generally be read directly, the system of the device may offer an additional close-focus mode for particularly small or one-dimensional barcodes to improve recognition rate. Should the system lack the ability to decode the barcode, it may simply focus the camera, take a picture, and delegate recognition to a remote service, as described herein.
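As an illustrative, non-limiting sketch, the local-decode-with-remote-fallback behavior might look like the following; the decoder functions are hypothetical stand-ins, not actual APIs of the device.

```python
# Illustrative barcode handling: attempt a local decode first and fall back to
# delegating recognition of the captured image to a remote service.
def local_decode(image: bytes) -> str | None:
    # Placeholder: a real decoder would run on-device, possibly after switching
    # to a close-focus camera mode for small or one-dimensional codes.
    return None

def remote_decode(image: bytes) -> str | None:
    # Placeholder: the image would be transferred to a paired device or an
    # Internet-based service for recognition, as described herein.
    return "0123456789012"

def handle_barcode(image: bytes) -> str | None:
    code = local_decode(image)
    if code is None:
        code = remote_decode(image)   # delegate when local decoding fails
    return code

print(handle_barcode(b"captured-frame"))  # -> "0123456789012"
```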
In particular embodiments, the device may perform translation. Translation functionality may be divided into two portions: optical character recognition (OCR), and translation of recognized characters, words, or phrases. OCR may be completed on the device or delegated (e.g. to a paired processing device) to reduce the amount of data to be translated by the device. Simple word translations may be performed on the device or delegated (e.g. to a paired processing device). As with other functionality described herein, part or all of the recognition or translation process may be delegated as needed. The user may optionally use a gesture to indicate the word to be translated, as shown in
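For illustration only, the following sketch splits the work into OCR and translation stages, either of which could be delegated; the stage bodies are toy placeholders, not real OCR or translation engines.

```python
# Illustrative two-stage translation pipeline: OCR followed by translation.
# Either stage may run on-device or be delegated (e.g. to a paired processing
# device).
def ocr(image: bytes) -> str:
    # Placeholder OCR result. Performing OCR before translation reduces the
    # data to be transferred onward: text is far smaller than the image.
    return "hola"

def translate(text: str, target_lang: str = "en") -> str:
    # A small on-device dictionary may handle simple word translations;
    # anything it cannot handle could be delegated, as described herein.
    local_dictionary = {"hola": {"en": "hello"}}
    entry = local_dictionary.get(text)
    if entry and target_lang in entry:
        return entry[target_lang]
    return f"<delegated translation of '{text}'>"

def translate_sign(image: bytes) -> str:
    return translate(ocr(image))

print(translate_sign(b"captured-sign-image"))  # -> "hello"
```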
In particular embodiments, a pairing and control model for the device may include the following characteristics. The device may function as the host for an application that interacts with or controls one or more functions of a remote device (e.g. an appcessory such as a controllable thermostat). A smartphone (or other locally-paired device), which may have previously been the host for the application, may now function merely as a local target device to which the device may delegate certain functions related to the interaction or control of the remote device (e.g. longer-range wireless connectivity to the remote device, sending commands to the remote device, receiving data from the remote device, or processing tasks). Control of the remote appcessory device may be done by the device using any suitable means including, for example, visual means (e.g. using the camera) or motion-based gestures. In other embodiments, the locally-paired smartphone may continue to function as the host for the application that interacts with the remote appcessory, but the device may provide some or all of the user interface for data input and output to and from the application (e.g. a “light” version of the application hosted by the smartphone). For example, the user may control the application using the device, but the smartphone may still function as the host of the application.
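By way of a non-limiting sketch, the host/target split might be expressed as follows, with the device hosting the appcessory application and delegating longer-range transport to the locally-paired phone; all class and method names are hypothetical.

```python
# Illustrative host/target split: the wearable hosts the appcessory application
# and delegates longer-range wireless transport to a locally-paired phone.
class PairedPhoneTransport:
    """Stands in for a phone that relays commands to a remote appcessory."""
    def send(self, appcessory_id: str, command: dict) -> dict:
        # In practice this would travel over the pairing link and then a
        # longer-range radio; here we simply echo an acknowledgement.
        return {"appcessory": appcessory_id, "ack": command}

class ThermostatApp:
    """Application hosted on the device; the phone is only a delegate target."""
    def __init__(self, transport: PairedPhoneTransport, appcessory_id: str):
        self.transport = transport
        self.appcessory_id = appcessory_id

    def set_temperature(self, celsius: float) -> dict:
        return self.transport.send(self.appcessory_id,
                                   {"op": "set_temperature", "value": celsius})

app = ThermostatApp(PairedPhoneTransport(), "thermostat-livingroom")
print(app.set_temperature(21.5))
```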
In particular embodiments, the device may be operable with one or more services. These services may fall in categories including security, energy, home automation and control, content sharing, healthcare, sports and entertainment, commerce, vehicles, and social applications.
Example security applications include the following. The device may authenticate a user (who is wearing the unlocked device) to another device near the user (e.g. paired with the device). The device may be unlocked with a code entered by the user using any suitable input including, for example, rotating the outer ring of the device. As an example, while a user rotates (or presses or clicks) the outer ring, the display may show alphanumeric or symbolic data corresponding to the rotation (or press or click) by the user. If, for example, the user rotates the outer ring one rotational increment in a clockwise direction (or, e.g., clicks or presses the outer ring once), the display may show the user a “1,” and if the user rotates the outer ring two rotational increments (e.g. within a certain period of time, such as a millisecond) in a clockwise direction (or, e.g., clicks or presses the outer ring twice), the display may show the user a “2.” In particular embodiments, the display of alphanumeric or symbolic data corresponding to a rotation (or press or click) by the user may allow the user to unlock the device using the metaphor of a combination lock. The device may also be unlocked using biometric data (e.g. by skin or bone signatures of the user).
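By way of illustration and not by way of limitation, the following sketch groups outer-ring rotation increments (or clicks) into digits of a combination, in the manner of the combination-lock metaphor above; the grouping window and comparison logic are assumptions.

```python
# Illustrative combination-lock entry from outer-ring input. Rotational
# increments that occur close together in time are grouped into one digit.
def digits_from_increments(timestamps_s: list[float], window_s: float = 1.0) -> list[int]:
    """Group clockwise increment timestamps into digits (count per group)."""
    digits, count, last = [], 0, None
    for t in timestamps_s:
        if last is not None and t - last > window_s:
            digits.append(count)
            count = 0
        count += 1
        last = t
    if count:
        digits.append(count)
    return digits

def unlock(timestamps_s: list[float], stored_code: list[int]) -> bool:
    return digits_from_increments(timestamps_s) == stored_code

# One increment, a pause, then two quick increments enters the code [1, 2].
print(unlock([0.0, 2.5, 2.9], stored_code=[1, 2]))  # -> True
```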
In an example energy application, the device may automatically display information about the energy consumption of the room or other location in which the user is located. The device may also be able to display information about the energy consumption of other paired devices and update all of this information dynamically as the user changes location.
In an example home control application, the user may select and directly control paired home-control devices using, for example, rotation of the outer ring or a gesture input.
The user may use gestures to control the sharing or transfer of content to or from the device (e.g. transferring video playing on the device to a paired television, as described herein). Additionally, auxiliary information (e.g. movie subtitles) may be provided on the device for content shown on another, larger device (e.g. television screen playing the movie).
The device may automatically determine a healthcare context (e.g. if the user is exercising or sleeping). When it determines this context, the device may open applications corresponding to the healthcare context (e.g. for recording heart rate during exercise, movement during exercise, duration of exercise, pulse oximetry during exercise, sleep patterns, duration of sleep, or galvanic skin response). The device may, for example, measure a user's health-related data (e.g. heart rate, movement, or pulse oximetry) and send some or all of this data to a paired device or a server. Although illustrated in the healthcare context, the determination of a relevant context (e.g. based on a user's behavior), opening of corresponding applications, recording of data, or transmission of this data may be applicable in any suitable context.
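As a non-limiting illustration, a minimal sketch of determining a healthcare context from sensor readings and selecting corresponding recording applications follows; the sensor names, thresholds, and application identifiers are assumptions.

```python
# Illustrative healthcare-context detection from sensor readings. Thresholds,
# sensor names, and application identifiers are hypothetical examples only.
def detect_context(heart_rate_bpm: float, movement_g: float, ambient_lux: float) -> str:
    if heart_rate_bpm > 110 and movement_g > 0.5:
        return "exercise"
    if heart_rate_bpm < 65 and movement_g < 0.05 and ambient_lux < 5:
        return "sleep"
    return "idle"

def apps_for_context(context: str) -> list[str]:
    # Applications that record data relevant to the detected context.
    return {
        "exercise": ["heart_rate_log", "movement_log", "pulse_oximetry_log"],
        "sleep": ["sleep_pattern_log", "sleep_duration_log"],
    }.get(context, [])

ctx = detect_context(heart_rate_bpm=132, movement_g=1.2, ambient_lux=300)
print(ctx, apps_for_context(ctx))  # -> exercise ['heart_rate_log', ...]
```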
The device may assist in sports-related applications such as, for example, automatically assessing a golf swing of the user and suggesting corrections.
In a commercial setting, the device may automatically identify a product (e.g. using RFID, NFC, barcode recognition, or object recognition) when the user picks up the product and may provide information about the product (e.g. nutrition information, source information, or reviews) or the option to purchase the product. Payment for the product may, for example, be accomplished using visual barcode technology on the device. In particular embodiments, the device may be used to pay for a product using NFC, RFID, or any other suitable form of short-distance communication. During payment, the user's information may, for example, be authenticated by the device, which may detect the user's biometric information (e.g. bone structure or skin signature). The device may also automatically provide an indication to the user (e.g. a vibration) when the user is near a product on her shopping list (e.g. stored in the device) or another list (e.g. a wish list of the user's friend).
The device may function as a key for unlocking or turning on one or more vehicles. The user may, for example, enter a code using the outer ring to unlock or turn on the vehicle (e.g. using NFC technology), as described earlier. In particular embodiments, both user biometric information and a code entered by the user may be required to unlock the car, allowing for enhanced security for a car-based application. Additionally, the device may include profiles for one or more users, each profile containing vehicle settings (e.g. temperature or seat position). As another example, biometric information of a particular user may be used not only to unlock the device, but also to determine which user profile to load during the car's operation. The proximity of the device to the vehicle may automatically cause the vehicle to implement the vehicle settings of the profile of the user. The device may also be operable for GPS navigation (either directly on the device or when paired with and controlling a phone, for example).
The device may access and operate in conjunction with a service that provides support for mixed-reality games or massively multi-player reality-based games. This functionality may, for example, include registration, management of user data (e.g. user profiles and game-related data such as levels completed or inventories of supplies), and management of accomplishment lists. The functionality of the device and the service may also include management of connectivity (e.g. concentrator functionality) that handles fragile wireless communication channels and provides a unified API to third party game servers.
The device may access and operate in conjunction with a service that allows a user of the device to publish locations, check-ins, or other location-based data that allows various services to access a consistent reservoir of the most current information regarding the position and status of the user. As an example, the user of the device may find friends using similar devices. The service and device together may handle status updates, profile management, application access permissions, blacklists, or user-to-user access permissions. The service may be a trusted and centralized touchpoint for private data. By combining access to a unified location service, energy and battery life may, in particular embodiments, be conserved. In particular embodiments, certain functionality tokens may be made available based on the position of the user. An application may, for example, check on the device to see if this token is available and act accordingly. On the server side, APIs may allow developers to see use of the tokens or allow for redemption. In particular embodiments, information may be distributed by the device to other users (e.g. a single other user, or in broadcast mode to multiple users).
The device may access and operate in conjunction with a service that provides a unified polling interface that allows devices to receive and send polls. The device and service together may manage distribution lists, scoring criteria, and poll availability frames (both temporal and geographic, for example). This service may be exposed on the device and on a server such that third parties may use APIs to write applications and receive results back via online APIs.
In particular embodiments, the device may access and operate in conjunction with a service that provides optimizations for the presentation of text, images, or other information on a circular display of the device. As an example, a web site may be rendered or formatted for display on a computer monitor, but a service may customize the rendering and formatting for a smaller, circular display by emphasizing images and truncating text. The customized rendering and formatting may, for example, be a task delegable among the device and one or more servers or locally-paired devices. This service may also include news or advertising services.
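By way of example and not by way of limitation, the following sketch reformats content for a small circular display by emphasizing an image and truncating text; the character budget and layout fields are assumptions, and the formatting task itself could be delegated as described herein.

```python
# Illustrative reformatting of web content for a small circular display:
# keep the lead image prominent and truncate the text to fit.
def format_for_round_display(title: str, body: str, image_url: str | None,
                             char_budget: int = 120) -> dict:
    truncated = body if len(body) <= char_budget else body[:char_budget - 1] + "\u2026"
    return {
        "hero_image": image_url,         # emphasized when available
        "title": title,
        "text": truncated,               # truncated to the small display
    }

article = {"title": "Example headline",
           "body": "A long article body that would not fit on a small display " * 5,
           "image_url": "https://example.com/lead.jpg"}
print(format_for_round_display(article["title"], article["body"], article["image_url"]))
```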
This disclosure contemplates any suitable number of computer systems 13700. This disclosure contemplates computer system 13700 taking any suitable physical form. As example and not by way of limitation, computer system 13700 may be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC) (such as, for example, a computer-on-module (COM) or system-on-module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a personal digital assistant (PDA), a server, a tablet computer system, or a combination of two or more of these. Where appropriate, computer system 13700 may include one or more computer systems 13700; be unitary or distributed; span multiple locations; span multiple machines; span multiple data centers; or reside in a cloud, which may include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 13700 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein. As an example and not by way of limitation, one or more computer systems 13700 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 13700 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
In particular embodiments, computer system 13700 includes a processor 13702, memory 13704, storage 13706, an input/output (I/O) interface 13708, a communication interface 13710, and a bus 13712. Although this disclosure describes and illustrates a particular computer system having a particular number of particular components in a particular arrangement, this disclosure contemplates any suitable computer system having any suitable number of any suitable components in any suitable arrangement.
In particular embodiments, processor 13702 includes hardware for executing instructions, such as those making up a computer program. As an example and not by way of limitation, to execute instructions, processor 13702 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 13704, or storage 13706; decode and execute them; and then write one or more results to an internal register, an internal cache, memory 13704, or storage 13706. In particular embodiments, processor 13702 may include one or more internal caches for data, instructions, or addresses. This disclosure contemplates processor 13702 including any suitable number of any suitable internal caches, where appropriate. As an example and not by way of limitation, processor 13702 may include one or more instruction caches, one or more data caches, and one or more translation lookaside buffers (TLBs). Instructions in the instruction caches may be copies of instructions in memory 13704 or storage 13706, and the instruction caches may speed up retrieval of those instructions by processor 13702. Data in the data caches may be copies of data in memory 13704 or storage 13706 for instructions executing at processor 13702 to operate on; the results of previous instructions executed at processor 13702 for access by subsequent instructions executing at processor 13702 or for writing to memory 13704 or storage 13706; or other suitable data. The data caches may speed up read or write operations by processor 13702. The TLBs may speed up virtual-address translation for processor 13702. In particular embodiments, processor 13702 may include one or more internal registers for data, instructions, or addresses. This disclosure contemplates processor 13702 including any suitable number of any suitable internal registers, where appropriate. Where appropriate, processor 13702 may include one or more arithmetic logic units (ALUs); be a multi-core processor; or include one or more processors 13702. Although this disclosure describes and illustrates a particular processor, this disclosure contemplates any suitable processor.
In particular embodiments, memory 13704 includes main memory for storing instructions for processor 13702 to execute or data for processor 13702 to operate on. As an example and not by way of limitation, computer system 13700 may load instructions from storage 13706 or another source (such as, for example, another computer system 13700) to memory 13704. Processor 13702 may then load the instructions from memory 13704 to an internal register or internal cache. To execute the instructions, processor 13702 may retrieve the instructions from the internal register or internal cache and decode them. During or after execution of the instructions, processor 13702 may write one or more results (which may be intermediate or final results) to the internal register or internal cache. Processor 13702 may then write one or more of those results to memory 13704. In particular embodiments, processor 13702 executes only instructions in one or more internal registers or internal caches or in memory 13704 (as opposed to storage 13706 or elsewhere) and operates only on data in one or more internal registers or internal caches or in memory 13704 (as opposed to storage 13706 or elsewhere). One or more memory buses (which may each include an address bus and a data bus) may couple processor 13702 to memory 13704. Bus 13712 may include one or more memory buses, as described below. In particular embodiments, one or more memory management units (MMUs) reside between processor 13702 and memory 13704 and facilitate accesses to memory 13704 requested by processor 13702. In particular embodiments, memory 13704 includes random access memory (RAM). This RAM may be volatile memory, where appropriate, and this RAM may be dynamic RAM (DRAM) or static RAM (SRAM), where appropriate. Moreover, where appropriate, this RAM may be single-ported or multi-ported RAM. This disclosure contemplates any suitable RAM. Memory 13704 may include one or more memories 13704, where appropriate. Although this disclosure describes and illustrates particular memory, this disclosure contemplates any suitable memory.
In particular embodiments, storage 13706 includes mass storage for data or instructions. As an example and not by way of limitation, storage 13706 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. Storage 13706 may include removable or non-removable (or fixed) media, where appropriate. Storage 13706 may be internal or external to computer system 13700, where appropriate. In particular embodiments, storage 13706 is non-volatile, solid-state memory. In particular embodiments, storage 13706 includes read-only memory (ROM). Where appropriate, this ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory or a combination of two or more of these. This disclosure contemplates mass storage 13706 taking any suitable physical form. Storage 13706 may include one or more storage control units facilitating communication between processor 13702 and storage 13706, where appropriate. Where appropriate, storage 13706 may include one or more storages 13706. Although this disclosure describes and illustrates particular storage, this disclosure contemplates any suitable storage.
In particular embodiments, I/O interface 13708 includes hardware, software, or both, providing one or more interfaces for communication between computer system 13700 and one or more I/O devices. Computer system 13700 may include one or more of these I/O devices, where appropriate. One or more of these I/O devices may enable communication between a person and computer system 13700. As an example and not by way of limitation, an I/O device may include a keyboard, keypad, microphone, monitor, mouse, printer, scanner, speaker, still camera, stylus, tablet, touch screen, trackball, video camera, another suitable I/O device or a combination of two or more of these. An I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 13708 for them. Where appropriate, I/O interface 13708 may include one or more device or software drivers enabling processor 13702 to drive one or more of these I/O devices. I/O interface 13708 may include one or more I/O interfaces 13708, where appropriate. Although this disclosure describes and illustrates a particular I/O interface, this disclosure contemplates any suitable I/O interface.
In particular embodiments, communication interface 13710 includes hardware, software, or both providing one or more interfaces for communication (such as, for example, packet-based communication) between computer system 13700 and one or more other computer systems 13700 or one or more networks. As an example and not by way of limitation, communication interface 13710 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network. This disclosure contemplates any suitable network and any suitable communication interface 13710 for it. As an example and not by way of limitation, computer system 13700 may communicate with an ad hoc network, a personal area network (PAN), a local area network (LAN), a wide area network (WAN), a metropolitan area network (MAN), body area network (BAN), or one or more portions of the Internet or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. As an example, computer system 13700 may communicate with a wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination of two or more of these. Computer system 13700 may include any suitable communication interface 13710 for any of these networks, where appropriate. Communication interface 13710 may include one or more communication interfaces 13710, where appropriate. Although this disclosure describes and illustrates a particular communication interface, this disclosure contemplates any suitable communication interface.
In particular embodiments, bus 13712 includes hardware, software, or both coupling components of computer system 13700 to each other. As an example and not by way of limitation, bus 13712 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a front-side bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination of two or more of these. Bus 13712 may include one or more buses 13712, where appropriate. Although this disclosure describes and illustrates a particular bus, this disclosure contemplates any suitable bus or interconnect.
Herein, a computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other integrated circuits (ICs) (such as, for example, field-programmable gate arrays (FPGAs) or application-specific ICs (ASICs)), hard disk drives (HDDs), hybrid hard drives (HHDs), optical discs, optical disc drives (ODDs), magneto-optical discs, magneto-optical drives, floppy diskettes, floppy disk drives (FDDs), magnetic tapes, solid-state drives (SSDs), RAM-drives, SECURE DIGITAL cards or drives, any other suitable computer-readable non-transitory storage media, or any suitable combination of two or more of these, where appropriate. A computer-readable non-transitory storage medium may be volatile, non-volatile, or a combination of volatile and non-volatile, where appropriate.
Herein, “or” is inclusive and not exclusive, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A or B” means “A, B, or both,” unless expressly indicated otherwise or indicated otherwise by context. Moreover, “and” is both joint and several, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, “A and B” means “A and B, jointly or severally,” unless expressly indicated otherwise or indicated otherwise by context.
The scope of this disclosure encompasses all changes, substitutions, variations, alterations, and modifications to the example embodiments described or illustrated herein that a person having ordinary skill in the art would comprehend. The scope of this disclosure is not limited to the example embodiments described or illustrated herein. Moreover, although this disclosure describes and illustrates respective embodiments herein as including particular components, elements, features, functions, operations, or steps, any of these embodiments may include any combination or permutation of any of the components, elements, features, functions, operations, or steps described or illustrated anywhere herein that a person having ordinary skill in the art would comprehend. Furthermore, reference in the appended claims to an apparatus or system or a component of an apparatus or system being adapted to, arranged to, capable of, configured to, enabled to, operable to, or operative to perform a particular function encompasses that apparatus, system, or component, whether or not it or that particular function is activated, turned on, or unlocked, as long as that apparatus, system, or component is so adapted, arranged, capable, configured, enabled, operable, or operative.
While this disclosure describes particular structures, features, interactions, and functionality in the context of a wearable device, this disclosure contemplates that those structures, features, interactions, or functionality may be applied to, used for, or used in any other suitable electronic device (such as, for example, a smart phone, tablet, camera, or personal computing device), where appropriate.
This application claims the benefit, under 35 U.S.C. §119(e), of U.S. Provisional Patent Application No. 61/728,765, filed 20 Nov. 2012, U.S. Provisional Patent Application No. 61/728,770, filed 20 Nov. 2012, U.S. Provisional Patent Application No. 61/773,803, filed 6 Mar. 2013, U.S. Provisional Patent Application No. 61/728,773, filed 20 Nov. 2012, U.S. Provisional Patent Application No. 61/773,813, filed 7 Mar. 2013, U.S. Provisional Patent Application No. 61/773,815, filed 7 Mar. 2013, U.S. Provisional Patent Application No. 61/773,817, filed 7 Mar. 2013, U.S. Provisional Patent Application No. 61/775,688, filed 11 Mar. 2013, U.S. Provisional Patent Application No. 61/775,687, filed 11 Mar. 2013, and U.S. Provisional Patent Application No. 61/775,686, filed 11 Mar. 2013, which are all incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
2473226 | Sheldon | Jun 1949 | A |
3062369 | Guy | Nov 1962 | A |
3477285 | Krafft | Nov 1969 | A |
D221081 | Kahn | Jul 1971 | S |
D224913 | Maroni | Oct 1972 | S |
3915534 | Yonkers | Oct 1975 | A |
D249874 | Lawrence | Oct 1978 | S |
4427303 | Matthias | Jan 1984 | A |
D282914 | Maron | Mar 1986 | S |
D284949 | Kong | Aug 1986 | S |
4636047 | Green | Jan 1987 | A |
4757456 | Benghiat | Jul 1988 | A |
D297121 | Porsche | Aug 1988 | S |
D299028 | Hermann | Dec 1988 | S |
D300678 | Barrault | Apr 1989 | S |
D302664 | Dawson, Jr. | Aug 1989 | S |
4906207 | Panning | Mar 1990 | A |
D335263 | Willis | May 1993 | S |
D347589 | LaBate | Jun 1994 | S |
D351558 | Moon | Oct 1994 | S |
5361169 | Deal | Nov 1994 | A |
D355132 | Williams | Feb 1995 | S |
D356960 | Courteault | Apr 1995 | S |
5418760 | Kawashima | May 1995 | A |
D362396 | Chester | Sep 1995 | S |
D365550 | Houlihan | Dec 1995 | S |
D366036 | Houlihan | Jan 1996 | S |
D372878 | Finnegan | Aug 1996 | S |
D383985 | Davenport | Sep 1997 | S |
D384661 | Burrell | Oct 1997 | S |
D386696 | Walch | Nov 1997 | S |
D401515 | Yida | Nov 1998 | S |
5832296 | Wang | Nov 1998 | A |
D404317 | Cameron | Jan 1999 | S |
D410854 | Cheung | Jun 1999 | S |
5915580 | Melk | Jun 1999 | A |
D413817 | Ando | Sep 1999 | S |
D416814 | Welsh | Nov 1999 | S |
6031525 | Perlin | Feb 2000 | A |
D422513 | Bodino | Apr 2000 | S |
D433949 | Helleu | Nov 2000 | S |
D434675 | Ando | Dec 2000 | S |
6285757 | Carroll | Sep 2001 | B1 |
D453005 | Baumer | Jan 2002 | S |
6336126 | Bjorklund et al. | Jan 2002 | B1 |
6359837 | Tsukamoto | Mar 2002 | B1 |
D455356 | Saffer | Apr 2002 | S |
D459352 | Giovanniello | Jun 2002 | S |
6400996 | Hoffberg | Jun 2002 | B1 |
6407379 | Shinbo | Jun 2002 | B1 |
D460430 | Wada | Jul 2002 | S |
6424743 | Ebrahimi | Jul 2002 | B1 |
D463296 | Sanomi | Sep 2002 | S |
6477117 | Narayanaswami et al. | Nov 2002 | B1 |
D466487 | Wada | Dec 2002 | S |
D466488 | Wada | Dec 2002 | S |
6535461 | Karhu | Mar 2003 | B1 |
6556222 | Narayanaswami | Apr 2003 | B1 |
D474982 | Wilson | May 2003 | S |
6573883 | Bartlett | Jun 2003 | B1 |
6597345 | Hirshberg | Jul 2003 | B2 |
6636635 | Matsugu | Oct 2003 | B2 |
6703918 | Kita | Mar 2004 | B1 |
6744423 | Kraft | Jun 2004 | B2 |
6744427 | Maglio | Jun 2004 | B2 |
6747680 | Igarashi | Jun 2004 | B1 |
6809774 | Yamazaki et al. | Oct 2004 | B1 |
6873575 | Yamazaki | Mar 2005 | B2 |
6948135 | Ruthfield | Sep 2005 | B1 |
6968508 | Lucaci | Nov 2005 | B2 |
D519858 | Migliorati | May 2006 | S |
7081905 | Raghunath | Jul 2006 | B1 |
D526215 | Calvani | Aug 2006 | S |
D526912 | Calvani | Aug 2006 | S |
D526973 | Gates | Aug 2006 | S |
7096048 | Sanders | Aug 2006 | B2 |
D528928 | Burton | Sep 2006 | S |
D529402 | Burton | Oct 2006 | S |
D537371 | Belamich | Feb 2007 | S |
D537409 | Suzuki | Feb 2007 | S |
D537738 | Ohki | Mar 2007 | S |
7187908 | Fujisawa | Mar 2007 | B2 |
D543122 | Lafever | May 2007 | S |
D545305 | Mulcahy | Jun 2007 | S |
7229385 | Freeman | Jun 2007 | B2 |
D545697 | Martin | Jul 2007 | S |
D547211 | Dias | Jul 2007 | S |
D550105 | Oberrieder | Sep 2007 | S |
D550614 | Fee | Sep 2007 | S |
7286063 | Gauthey | Oct 2007 | B2 |
D554636 | Tom | Nov 2007 | S |
D558207 | Ikeda | Dec 2007 | S |
D558208 | Ikeda | Dec 2007 | S |
D558209 | Ikeda | Dec 2007 | S |
D564367 | Molyneux | Mar 2008 | S |
7382691 | Capozzi | Jun 2008 | B2 |
7385592 | Collins | Jun 2008 | B2 |
7398151 | Burrell | Jul 2008 | B1 |
7404146 | Bennetts | Jul 2008 | B2 |
D574262 | Martinez | Aug 2008 | S |
D574263 | Loth | Aug 2008 | S |
D575289 | Kuo | Aug 2008 | S |
7477890 | Narayanaswami | Jan 2009 | B1 |
D585898 | Skurdal | Feb 2009 | S |
7487467 | Kawahara | Feb 2009 | B1 |
7506269 | Lang | Mar 2009 | B2 |
D590277 | Wei | Apr 2009 | S |
D590727 | Wei | Apr 2009 | S |
7539532 | Tran | May 2009 | B2 |
D596509 | Kume | Jul 2009 | S |
D596610 | Hou | Jul 2009 | S |
D596959 | Rousseau | Jul 2009 | S |
7567239 | Seni | Jul 2009 | B2 |
D600144 | Diltoer | Sep 2009 | S |
D602858 | Ellis | Oct 2009 | S |
D604643 | Ahlstrom | Nov 2009 | S |
D610472 | Nashimoto | Feb 2010 | S |
7667657 | Koshiji | Feb 2010 | B2 |
D612269 | Leung | Mar 2010 | S |
D615955 | Oh | May 2010 | S |
7778118 | Lyons | Aug 2010 | B2 |
D626550 | Julien | Nov 2010 | S |
D627718 | Houghton | Nov 2010 | S |
7854684 | Freeman | Dec 2010 | B1 |
D631373 | Mikkelsen | Jan 2011 | S |
D635472 | Dalla Libera | Apr 2011 | S |
D636686 | Cobbett | Apr 2011 | S |
D636687 | Morrison | Apr 2011 | S |
7925986 | Aravamudan | Apr 2011 | B2 |
7932893 | Berthaud | Apr 2011 | B1 |
D640576 | Nashimoto | Jun 2011 | S |
D640936 | Teixeia | Jul 2011 | S |
D640948 | Quigley | Jul 2011 | S |
8001488 | Lam | Aug 2011 | B1 |
D645360 | Kiser | Sep 2011 | S |
D650706 | Zanella | Dec 2011 | S |
D651099 | Ruffieux | Dec 2011 | S |
D654431 | Stephanchick | Feb 2012 | S |
8117540 | Assadollahi | Feb 2012 | B2 |
D659093 | Schmid | May 2012 | S |
8176432 | Klein | May 2012 | B2 |
8179563 | King et al. | May 2012 | B2 |
8179604 | Gomez et al. | May 2012 | B1 |
8184070 | Taubman | May 2012 | B1 |
8184983 | Ho et al. | May 2012 | B1 |
D661206 | Register | Jun 2012 | S |
D661275 | Roka | Jun 2012 | S |
8194036 | Braun | Jun 2012 | B1 |
8212781 | Wong | Jul 2012 | B2 |
8228315 | Starner | Jul 2012 | B1 |
D664880 | Cobbett | Aug 2012 | S |
D664881 | Cobbett | Aug 2012 | S |
D669371 | Rasumowsky | Oct 2012 | S |
8279716 | Gossweiler, III | Oct 2012 | B1 |
8289162 | Mooring | Oct 2012 | B2 |
8295879 | Alameh | Oct 2012 | B2 |
8313353 | Purdy | Nov 2012 | B2 |
D671858 | Cobbett | Dec 2012 | S |
D672255 | Zanella | Dec 2012 | S |
8335993 | Tan | Dec 2012 | B1 |
D678081 | Cabiddu | Mar 2013 | S |
D680541 | Lee | Apr 2013 | S |
D687736 | Wycoff | Aug 2013 | S |
8506158 | Keung | Aug 2013 | B2 |
8511890 | Andren | Aug 2013 | B2 |
8572175 | Lamb | Oct 2013 | B2 |
8639214 | Fujisaki | Jan 2014 | B1 |
8655027 | Olthoff | Feb 2014 | B1 |
8676123 | Hinkle | Mar 2014 | B1 |
8676273 | Fujisaki | Mar 2014 | B1 |
8949847 | Kim | Feb 2015 | B2 |
8963806 | Starner | Feb 2015 | B1 |
8994827 | Mistry | Mar 2015 | B2 |
9007302 | Bandt-Horn | Apr 2015 | B1 |
9022291 | van der Merwe | May 2015 | B1 |
9022292 | van der Merwe | May 2015 | B1 |
9030446 | Mistry | May 2015 | B2 |
9239740 | Zhao | Jan 2016 | B2 |
9373230 | Argue | Jun 2016 | B2 |
10357322 | Olson | Jul 2019 | B2 |
10416712 | Brawer | Sep 2019 | B2 |
10423214 | Mistry | Sep 2019 | B2 |
20010017663 | Yamaguchi | Aug 2001 | A1 |
20010043514 | Kita | Nov 2001 | A1 |
20020044691 | Matsugu | Apr 2002 | A1 |
20020068600 | Chihara et al. | Jun 2002 | A1 |
20020101457 | Lang | Aug 2002 | A1 |
20020115478 | Fujisawa | Aug 2002 | A1 |
20020118603 | Tamagawa | Aug 2002 | A1 |
20020122031 | Maglio | Sep 2002 | A1 |
20020135615 | Lang | Sep 2002 | A1 |
20020180586 | Kitson | Dec 2002 | A1 |
20030025603 | Smith | Feb 2003 | A1 |
20030025670 | Barnett | Feb 2003 | A1 |
20030030595 | Radley-Smith | Feb 2003 | A1 |
20030046228 | Berney | Mar 2003 | A1 |
20030070106 | Kosuda | Apr 2003 | A1 |
20030123328 | Guanter | Jul 2003 | A1 |
20030197740 | Reponen | Oct 2003 | A1 |
20030204132 | Suzuki | Oct 2003 | A1 |
20030229900 | Reisman | Dec 2003 | A1 |
20040044721 | Song | Mar 2004 | A1 |
20040104896 | Suraqui | Jun 2004 | A1 |
20040130581 | Howard | Jul 2004 | A1 |
20040164957 | Yamaguchi | Aug 2004 | A1 |
20040199918 | Skovira | Oct 2004 | A1 |
20040209642 | Kim | Oct 2004 | A1 |
20040209657 | Ghassabian | Oct 2004 | A1 |
20040210479 | Perkowski | Oct 2004 | A1 |
20040218474 | Yamazaki | Nov 2004 | A1 |
20040261031 | Tuomainen | Dec 2004 | A1 |
20040263473 | Cho | Dec 2004 | A1 |
20050001821 | Low | Jan 2005 | A1 |
20050007337 | Sellen | Jan 2005 | A1 |
20050030112 | Kosuda | Feb 2005 | A1 |
20050037814 | Yasui | Feb 2005 | A1 |
20050038653 | Roth | Feb 2005 | A1 |
20050114796 | Bast | May 2005 | A1 |
20050137470 | Rosenthal | Jun 2005 | A1 |
20050162523 | Darrell | Jul 2005 | A1 |
20050210402 | Gunn | Sep 2005 | A1 |
20050212767 | Marvit | Sep 2005 | A1 |
20050212911 | Marvit | Sep 2005 | A1 |
20060026288 | Acharya et al. | Feb 2006 | A1 |
20060026535 | Hotelling | Feb 2006 | A1 |
20060092177 | Blasko | May 2006 | A1 |
20060139320 | Lang | Jun 2006 | A1 |
20060149652 | Fellenstein et al. | Jul 2006 | A1 |
20060170649 | Kosugi | Aug 2006 | A1 |
20060197835 | Anderson | Sep 2006 | A1 |
20060209218 | Lee | Sep 2006 | A1 |
20060224766 | Malackowski | Oct 2006 | A1 |
20060253010 | Brady et al. | Nov 2006 | A1 |
20060271867 | Wang | Nov 2006 | A1 |
20060288233 | Kozlay | Dec 2006 | A1 |
20070004969 | Kong | Jan 2007 | A1 |
20070018591 | Noguchi | Jan 2007 | A1 |
20070074131 | Assadollahi | Mar 2007 | A1 |
20070100244 | Lin | May 2007 | A1 |
20070124949 | Burns, Jr. et al. | Jun 2007 | A1 |
20070155434 | Jobs | Jul 2007 | A1 |
20070156891 | Abbott | Jul 2007 | A1 |
20070211042 | Kim | Sep 2007 | A1 |
20070025083 | Kobayashi | Oct 2007 | A1 |
20070236381 | Ouchi | Oct 2007 | A1 |
20070236475 | Wherry | Oct 2007 | A1 |
20070271528 | Park | Nov 2007 | A1 |
20070276270 | Tran | Nov 2007 | A1 |
20070279852 | Daniel et al. | Dec 2007 | A1 |
20070298785 | Lee | Dec 2007 | A1 |
20070299796 | Macbeth | Dec 2007 | A1 |
20080043824 | Jacobs et al. | Feb 2008 | A1 |
20080062127 | Brodersen | Mar 2008 | A1 |
20080062141 | Chandri | Mar 2008 | A1 |
20080086704 | Aravamudan | Apr 2008 | A1 |
20080126933 | Gupta | May 2008 | A1 |
20080129621 | Koshiji | Jun 2008 | A1 |
20080162834 | Brokenshire | Jul 2008 | A1 |
20080171572 | Choi | Jul 2008 | A1 |
20080184360 | Kornilovsky | Jul 2008 | A1 |
20080273755 | Hildreth | Nov 2008 | A1 |
20080276168 | Mansfield | Nov 2008 | A1 |
20080319353 | Howell | Dec 2008 | A1 |
20090022118 | Behzad | Jan 2009 | A1 |
20090049412 | Lee | Feb 2009 | A1 |
20090058809 | Vuong | Mar 2009 | A1 |
20090059730 | Lyons et al. | Mar 2009 | A1 |
20090083847 | Fadell | Mar 2009 | A1 |
20090096746 | Kruse | Apr 2009 | A1 |
20090128487 | Langereis | May 2009 | A1 |
20090134838 | Raghuprasad | May 2009 | A1 |
20090167487 | Shah | Jul 2009 | A1 |
20090169018 | Deisher | Jul 2009 | A1 |
20090196124 | Mooring | Aug 2009 | A1 |
20090199092 | Ghassabian | Aug 2009 | A1 |
20090217211 | Hildreth | Aug 2009 | A1 |
20090228707 | Linsky | Sep 2009 | A1 |
20090234967 | Yu et al. | Sep 2009 | A1 |
20090249475 | Ohaka | Oct 2009 | A1 |
20090254904 | Howard | Oct 2009 | A1 |
20090265669 | Kida | Oct 2009 | A1 |
20090280861 | Khan | Nov 2009 | A1 |
20090318779 | Tran | Dec 2009 | A1 |
20090320023 | Barsness | Dec 2009 | A1 |
20090327724 | Shah | Dec 2009 | A1 |
20100020033 | Nwosu | Jan 2010 | A1 |
20100029327 | Jee | Feb 2010 | A1 |
20100039393 | Pratt | Feb 2010 | A1 |
20100070913 | Murrett | Mar 2010 | A1 |
20100082444 | Lin | Apr 2010 | A1 |
20100082485 | Lin | Apr 2010 | A1 |
20100095240 | Shiplacoff | Apr 2010 | A1 |
20100122214 | Sengoku | May 2010 | A1 |
20100123792 | Nagumo | May 2010 | A1 |
20100124949 | Demuynck | May 2010 | A1 |
20100125847 | Hayashi | May 2010 | A1 |
20100164879 | Doktorova | Jul 2010 | A1 |
20100167646 | Alameh | Jul 2010 | A1 |
20100169781 | Graumann | Jul 2010 | A1 |
20100188249 | Shin | Jul 2010 | A1 |
20100188428 | Shin | Jul 2010 | A1 |
20100199232 | Mistry | Aug 2010 | A1 |
20100219943 | Vanska | Sep 2010 | A1 |
20100225592 | Jo | Sep 2010 | A1 |
20100245078 | Nadkarni et al. | Sep 2010 | A1 |
20100250789 | Collopy | Sep 2010 | A1 |
20100272258 | Sadovsky | Oct 2010 | A1 |
20100289740 | Kim | Nov 2010 | A1 |
20100289760 | Jonoshita | Nov 2010 | A1 |
20100291911 | Kraft | Nov 2010 | A1 |
20100295773 | Alameh | Nov 2010 | A1 |
20100300771 | Miyazaki | Dec 2010 | A1 |
20100304673 | Yoshizawa | Dec 2010 | A1 |
20100318999 | Zhao | Dec 2010 | A1 |
20110004574 | Sangoh | Jan 2011 | A1 |
20110016405 | Grob | Jan 2011 | A1 |
20110035751 | Krishnakumar et al. | Feb 2011 | A1 |
20110055317 | Vonog | Mar 2011 | A1 |
20110072263 | Bishop | Mar 2011 | A1 |
20110080339 | Sun | Apr 2011 | A1 |
20110102161 | Heubel | May 2011 | A1 |
20110128824 | Downey | Jun 2011 | A1 |
20110144543 | Tsuzuki | Jun 2011 | A1 |
20110149101 | Kim | Jun 2011 | A1 |
20110157046 | Lee | Jun 2011 | A1 |
20110161974 | Kurabayashi et al. | Jun 2011 | A1 |
20110173622 | Shin | Jul 2011 | A1 |
20110185306 | Aravamudan | Jul 2011 | A1 |
20110189981 | Faith | Aug 2011 | A1 |
20110190053 | Kawamoto | Aug 2011 | A1 |
20110205067 | Konishi | Aug 2011 | A1 |
20110214126 | Sadovsky | Sep 2011 | A1 |
20110214158 | Pasquero | Sep 2011 | A1 |
20110216060 | Weising | Sep 2011 | A1 |
20110219427 | Hito | Sep 2011 | A1 |
20110221666 | Newton | Sep 2011 | A1 |
20110221685 | Lee | Sep 2011 | A1 |
20110221688 | Byun | Sep 2011 | A1 |
20110231872 | Gharachorloo | Sep 2011 | A1 |
20110271076 | Koning | Nov 2011 | A1 |
20110275391 | Lee | Nov 2011 | A1 |
20110287757 | Nykoluk | Nov 2011 | A1 |
20110289519 | Frost | Nov 2011 | A1 |
20110296351 | Ewing | Dec 2011 | A1 |
20110300851 | Krishnaswamy | Dec 2011 | A1 |
20110320520 | Jain | Dec 2011 | A1 |
20120062465 | Spetalnick | Mar 2012 | A1 |
20120062571 | Malek | Mar 2012 | A1 |
20120065814 | Seok | Mar 2012 | A1 |
20120069027 | Yamazaki | Mar 2012 | A1 |
20120072481 | Nandlall | Mar 2012 | A1 |
20120075204 | Murray | Mar 2012 | A1 |
20120081852 | Maravilla | Apr 2012 | A1 |
20120083237 | Fish | Apr 2012 | A1 |
20120092383 | Jorg | Apr 2012 | A1 |
20120096345 | Ho et al. | Apr 2012 | A1 |
20120102504 | Iyer | Apr 2012 | A1 |
20120105312 | Helmes | May 2012 | A1 |
20120106789 | Zhou | May 2012 | A1 |
20120119997 | Gutowitz | May 2012 | A1 |
20120127074 | Nakamura | May 2012 | A1 |
20120130547 | Fadell | May 2012 | A1 |
20120140451 | Araujo | Jun 2012 | A1 |
20120172126 | Padovani | Jul 2012 | A1 |
20120190299 | Takatsuka | Jul 2012 | A1 |
20120194551 | Osterhout | Aug 2012 | A1 |
20120200494 | Perski | Aug 2012 | A1 |
20120210266 | Jiang | Aug 2012 | A1 |
20120221968 | Patterson et al. | Aug 2012 | A1 |
20120223889 | Medlock | Sep 2012 | A1 |
20120002623 | King | Oct 2012 | A1 |
20120249409 | Toney | Oct 2012 | A1 |
20120249590 | Maciocci | Oct 2012 | A1 |
20120256959 | Ye | Oct 2012 | A1 |
20120262370 | Ko | Oct 2012 | A1 |
20120268376 | Bi | Oct 2012 | A1 |
20120281149 | Mori | Nov 2012 | A1 |
20120283855 | Hoffman | Nov 2012 | A1 |
20120299847 | Kwon | Nov 2012 | A1 |
20120306740 | Hoda | Dec 2012 | A1 |
20120311430 | Seo | Dec 2012 | A1 |
20120316456 | Rahman | Dec 2012 | A1 |
20120317024 | Rahman | Dec 2012 | A1 |
20120319950 | Marks | Dec 2012 | A1 |
20120322430 | Fish | Dec 2012 | A1 |
20130002538 | Mooring | Jan 2013 | A1 |
20130007289 | Seo | Jan 2013 | A1 |
20130007842 | Park | Jan 2013 | A1 |
20130016070 | Starner et al. | Jan 2013 | A1 |
20130018659 | Chi | Jan 2013 | A1 |
20130021374 | Miao et al. | Jan 2013 | A1 |
20130031550 | Choudhury | Jan 2013 | A1 |
20130033447 | Cho | Feb 2013 | A1 |
20130036382 | Yuan | Feb 2013 | A1 |
20130045037 | Schaffer | Feb 2013 | A1 |
20130046544 | Kay | Feb 2013 | A1 |
20130047155 | Caspole | Feb 2013 | A1 |
20130050096 | Fallah | Feb 2013 | A1 |
20130054843 | Jan | Feb 2013 | A1 |
20130057774 | Yoshida | Mar 2013 | A1 |
20130069985 | Wong et al. | Mar 2013 | A1 |
20130073403 | Tuchman | Mar 2013 | A1 |
20130080143 | Reeves | Mar 2013 | A1 |
20130080525 | Aoki | Mar 2013 | A1 |
20130080890 | Krishnamurthi | Mar 2013 | A1 |
20130082962 | Jo | Apr 2013 | A1 |
20130089848 | Exeter | Apr 2013 | A1 |
20130095814 | Mottes | Apr 2013 | A1 |
20130104079 | Yasui | Apr 2013 | A1 |
20130107674 | Gossweiler, III | May 2013 | A1 |
20130111009 | Sng | May 2013 | A1 |
20130111395 | Ying et al. | May 2013 | A1 |
20130117834 | Ishioka | May 2013 | A1 |
20130120106 | Cauwels | May 2013 | A1 |
20130120405 | Maloney | May 2013 | A1 |
20130125224 | Kaufman | May 2013 | A1 |
20130132848 | Bhatt | May 2013 | A1 |
20130141325 | Bailey | Jun 2013 | A1 |
20130142016 | Pozzo Di Borgo | Jun 2013 | A1 |
20130145322 | Hendricks | Jun 2013 | A1 |
20130165180 | Fukuda Kelley | Jun 2013 | A1 |
20130169571 | Gai | Jul 2013 | A1 |
20130176228 | Griffin | Jul 2013 | A1 |
20130187857 | Griffin | Jul 2013 | A1 |
20130191232 | Calman | Jul 2013 | A1 |
20130194294 | Zhao et al. | Aug 2013 | A1 |
20130197680 | Cobbett et al. | Aug 2013 | A1 |
20130197681 | Alberth | Aug 2013 | A1 |
20130198056 | Aldrey | Aug 2013 | A1 |
20130211843 | Clarkson | Aug 2013 | A1 |
20130217978 | Ma | Aug 2013 | A1 |
20130222270 | Winkler | Aug 2013 | A1 |
20130227460 | Jawerth | Aug 2013 | A1 |
20130232095 | Tan | Sep 2013 | A1 |
20130250182 | Yuan | Sep 2013 | A1 |
20130254705 | Mooring | Sep 2013 | A1 |
20130262305 | Jones | Oct 2013 | A1 |
20130278504 | Tong et al. | Oct 2013 | A1 |
20130290911 | Praphul | Oct 2013 | A1 |
20130300719 | Wang | Nov 2013 | A1 |
20130326398 | Zuverink | Dec 2013 | A1 |
20140049477 | Dai | Feb 2014 | A1 |
20140059558 | Davis | Feb 2014 | A1 |
20140068520 | Missig | Mar 2014 | A1 |
20140075328 | Hansen | Mar 2014 | A1 |
20140078065 | Akkok | Mar 2014 | A1 |
20140095695 | Wang | Apr 2014 | A1 |
20140139422 | Mistry | May 2014 | A1 |
20140139454 | Mistry | May 2014 | A1 |
20140143678 | Mistry | May 2014 | A1 |
20140143737 | Mistry | May 2014 | A1 |
20140143785 | Mistry | May 2014 | A1 |
20140165187 | Daesung | Jun 2014 | A1 |
20140181758 | Pasquero | Jun 2014 | A1 |
20140189115 | Eggert | Jul 2014 | A1 |
20140197232 | Birkler | Jul 2014 | A1 |
20140282211 | Ady | Sep 2014 | A1 |
20140282274 | Everitt | Sep 2014 | A1 |
20140306898 | Cueto | Oct 2014 | A1 |
20140330900 | Libin | Nov 2014 | A1 |
20140337037 | Chi | Nov 2014 | A1 |
20150053007 | Decoux | Feb 2015 | A1 |
20150089593 | Herman | Mar 2015 | A1 |
20150098308 | Herman | Apr 2015 | A1 |
20160132233 | Ghassabian | May 2016 | A1 |
20160134419 | Smith | May 2016 | A1 |
Number | Date | Country |
---|---|---|
1324030 | Nov 2001 | CN |
1330303 | Jan 2002 | CN |
1404676 | Mar 2003 | CN |
1538721 | Oct 2004 | CN |
1176417 | Nov 2004 | CN |
1766824 | May 2006 | CN |
101035148 | Aug 2007 | CN |
101052939 | Oct 2007 | CN |
101185050 | May 2008 | CN |
101329600 | Dec 2008 | CN |
101404928 | Apr 2009 | CN |
101739203 | Jun 2010 | CN |
101739206 | Jun 2010 | CN |
101739206 | Jun 2010 | CN |
101815326 | Aug 2010 | CN |
102117178 | Jul 2011 | CN |
101329600 | Oct 2011 | CN |
102368200 | Mar 2012 | CN |
102446081 | May 2012 | CN |
102681786 | Sep 2012 | CN |
102713794 | Oct 2012 | CN |
102779002 | Nov 2012 | CN |
103038728 | Apr 2013 | CN |
103488420 | Jan 2014 | CN |
103534676 | Jan 2014 | CN |
102349073 | Feb 2017 | CN |
10 2008 027 746 | Dec 2009 | DE |
102008027746 | Dec 2009 | DE |
1213896 | Jun 2002 | EP |
1760573 | Mar 2007 | EP |
1832969 | Sep 2007 | EP |
2037579 | Mar 2009 | EP |
2220997 | Aug 2010 | EP |
2256592 | Dec 2010 | EP |
2341420 | Jul 2011 | EP |
2474950 | Jul 2012 | EP |
2866103 | Apr 2015 | EP |
2411337 | Aug 2005 | GB |
2411552 | Aug 2005 | GB |
H 09-85983 | Mar 1997 | JP |
A-10-177449 | Jun 1998 | JP |
H10177449 | Jun 1998 | JP |
1999-298362 | Oct 1999 | JP |
2000-050133 | Feb 2000 | JP |
2000-152060 | May 2000 | JP |
2000-267797 | Sep 2000 | JP |
2002-040175 | Feb 2002 | JP |
2002-041235 | Feb 2002 | JP |
2002-099476 | Apr 2002 | JP |
2002-528811 | Sep 2002 | JP |
2002259347 | Sep 2002 | JP |
2002350573 | Dec 2002 | JP |
2003-018923 | Jan 2003 | JP |
2003-131785 | May 2003 | JP |
2004062267 | Feb 2004 | JP |
2004-072450 | Mar 2004 | JP |
2004-184396 | Jul 2004 | JP |
2004184396 | Jul 2004 | JP |
2004-288172 | Oct 2004 | JP |
2005-174356 | Jun 2005 | JP |
2005-244676 | Sep 2005 | JP |
2005-352739 | Dec 2005 | JP |
2006-502457 | Jan 2006 | JP |
2006-279137 | Oct 2006 | JP |
2007-014471 | Feb 2007 | JP |
2007-064758 | Mar 2007 | JP |
3948260 | Apr 2007 | JP |
2007-116270 | May 2007 | JP |
2008089039 | Nov 2008 | JP |
2008-299866 | Dec 2008 | JP |
2009-005320 | Jan 2009 | JP |
2009-071726 | Apr 2009 | JP |
2009081309 | Apr 2009 | JP |
2009-301485 | Dec 2009 | JP |
2010-124181 | Jun 2010 | JP |
2010-527057 | Aug 2010 | JP |
2010244922 | Oct 2010 | JP |
2010-267220 | Nov 2010 | JP |
2010-277198 | Dec 2010 | JP |
2011-049897 | Mar 2011 | JP |
2011-237953 | Nov 2011 | JP |
4899108 | Jan 2012 | JP |
2012-073830 | Apr 2012 | JP |
2012-098771 | May 2012 | JP |
2012-108771 | Jun 2012 | JP |
2012-110037 | Jun 2012 | JP |
2012-113525 | Jun 2012 | JP |
2012147432 | Aug 2012 | JP |
WO 2012124250 | Jul 2014 | JP |
10-0408009 | Dec 2003 | KR |
10-2005-0065197 | Jun 2005 | KR |
20060134119 | Dec 2006 | KR |
0761262 | Sep 2007 | KR |
2010-0028465 | Mar 2010 | KR |
10-1078100 | Oct 2011 | KR |
2012-0028314 | Mar 2012 | KR |
10-1199454 | Nov 2012 | KR |
10-2012-0137032 | Dec 2012 | KR |
2319997 | Mar 2008 | RU |
2324970 | May 2008 | RU |
74259 | Jun 2008 | RU |
2383915 | Mar 2010 | RU |
2421774 | Jun 2011 | RU |
2455676 | Jul 2012 | RU |
WO 0025193 | May 2000 | WO |
2004-012178 | Feb 2004 | WO |
2004-023289 | Mar 2004 | WO |
2005-065404 | Jul 2005 | WO |
WO 2005103863 | Nov 2005 | WO |
WO 2006120211 | Nov 2006 | WO |
WO 2007074905 | Jul 2007 | WO |
2008-045379 | Apr 2008 | WO |
WO 2011011751 | Jan 2011 | WO |
2012-009335 | Jan 2012 | WO |
WO 2012030477 | Mar 2012 | WO |
WO 2012030653 | Mar 2012 | WO |
2012-075292 | Jun 2012 | WO |
2012-099584 | Jul 2012 | WO |
WO 2012092247 | Jul 2012 | WO |
WO 2012154620 | Nov 2012 | WO |
WO 2013012603 | Jan 2013 | WO
Entry |
---|
Non-Final Office Action for U.S. Appl. No. 14/015,909, dated Feb. 14, 2015. |
Non-Final Office Action for U.S. Appl. No. 14/015,940, dated Feb. 12, 2015. |
International Search Report for International Application No. PCT/KR2013/010538, dated Feb. 28, 2014. |
International Search Report for International Application No. PCT/KR2013/010539, dated Feb. 26, 2014. |
International Search Report for International Application No. PCT/KR2013/010544, dated Feb. 24, 2014. |
International Search Report for International Application No. PCT/KR2013/010546, dated Feb. 27, 2014. |
International Search Report for International Application No. PCT/KR2013/010547, dated Mar. 7, 2014. |
International Search Report for International Application No. PCT/KR2013/010554, dated Feb. 28, 2014. |
International Search Report for International Application No. PCT/KR2013/010555, dated Feb. 25, 2014. |
International Search Report for International Application No. PCT/KR2013/010566, dated Feb. 28, 2014. |
Elliot, Amy-Mae, Zypad WL 1000, Retrieved from Internet, http://www.pocket-lint.com/news/81360-zvpad-w1000-futuristic-wearable-computer, Jun. 29, 2007. |
Slap band Watches, Retrieved from Internet, http://www.whatsondeals.com.au/deal/perth/894/stop-the-clock-twice-2-kids-Funm-and-funky-slap-band-watches-this-summer-for-only-17-free-delivery-to-your-door-normall.html, Dec. 18, 2011. |
Bohn, Dieter, I'm Watch hands-on pictures and video, Retrieved from Internet, http://www.theverge.com/2012/1/10/2697043/im-watch-pictures-video-release-date-price, Jan. 10, 2012. |
Honig, Zach, Pebble Smartwatch Review, Retrieved from Internet, http://www.engadget.com/2013/01/25/pebble-smartwatch-review/, Jan. 25, 2013. |
Campbell, Mikey, “Apple patent filing points directly to ‘iWatch’ Concept with Flexible Touchscreen Display,” Retrieved from Internet, http://appleinsider.com/articles/13/02/21/apple-patent-filing-points-directly-to-iwatch-concept-with-flexible-touchscreen-display, Feb. 21, 2013. |
Mentor, Microsoft SPOT Watch Smartwatch. Retrieved from Internet, http://www.smartwatchnews.org/2004-microsoft-spot-watch-smartwatch/, Mar. 11, 2013. |
Keene, J., “Sony SmartWatch review,” Retrieved from Internet, http://www.theverge.com/2012/3/19/2876341/sony-smartwatch-review, Mar. 19, 2013. |
Hughes, N., “New Samsung smart watch will be company's third stab at wrist accessory,” Retrieved from Internet, http://appleinsider.com/articles/13/03/21/new-samsung-smart-watch-will-be-companys-third-stab-at-wrist-accessory, Mar. 21, 2013. |
Houghton, S., “Smartwatches—Fad or the future,” Retrieved from Internet, http://www.trustedreviews.com/opinions/smart-watches-fad-or-the-future, Mar. 25, 2013. |
Mogg, T., “Microsoft smartwatch? Computer Giant Reportedly Gathering Components for High-Tech Wristwatch,” Retrieved from Internet, http://www.digitaltrends.com/mobile/microsoft-reportedly-gathering-components-for-smartwatch/, Apr. 15, 2013. |
Plunkett, L., “Report: Xbox Staff Working on Microsoft Smartwatch”, Retrieved from Internet, http://kotaku.com/report-xbox-staff-working-on-microsoft-smartwatch-475322574, Apr. 17, 2013. |
Bennett, B., “Sonostar enters the smartwatch arena (hands-on),” Retrieved from Internet, http://reviews.cnet.com/cell-phone-and-smart/sonostar-smartwatch/4505-6448_7-35768972.html, May 21, 2013. |
Campbell, M., “Samsung to reportedly debut ‘Galaxy Gear’ smart watch on Sep. 4,” Retrieved from Internet, http://appleinsider.com/articles/13/08/14/samsung-to-reportedly-debut-galaxy-gear-smart-watch-on-sept-4, Aug. 14, 2013. |
Volpe, J., “Sony SmartWatch 2 unveiled: a water resistant ‘second screen’ for Android devices,” Retrieved from Internet, http://www.engadget.com/2013/06/25/sony-smartwatch-2-water-resistant-android/, pp. 1-18, Jun. 25, 2013. |
Office Action for JP Patent Application No. 2013-240458, dated Oct. 7, 2014. |
Non-Final Office Action for U.S. Appl. No. 14/015,795, dated Jan. 13, 2014. |
Non-Final Office Action for U.S. Appl. No. 14/015,795, dated Jul. 16, 2014. |
Final Office Action for U.S. Appl. No. 14/015,940, dated Aug. 1, 2014. |
Non-Final Office Action for U.S. Appl. No. 14/015,873, dated Dec. 3, 2014. |
Office Action for AU Patent Application No. 2013260687, dated Apr. 21, 2015. |
Notice of Allowance for JP Patent Application 2013-240458 (with English abstract only), dated Feb. 9, 2015. |
Office Action for U.S. Appl. No. 14/015,909, dated Sep. 4, 2015. |
Office Action for U.S. Appl. No. 14/015,940, dated Jul. 30, 2015. |
Office Action for U.S. Appl. No. 14/015,926, dated Jun. 17, 2015. |
Office Action for U.S. Appl. No. 14/015,873, dated May 29, 2015. |
Office Action for U.S. Appl. No. 14/015,890, dated May 22, 2015. |
Australian Government Patent Examination Report No. 3 for AU Patent Application No. 2013260687, dated Dec. 16, 2015. |
Australian Government Patent Examination Report No. 2 for AU Patent Application No. 2013260687, dated Sep. 24, 2015. |
Kim, David et al. “Digits: Freehand 3D Interactions Anywhere Using a Wrist-Worn Gloveless Sensor”, dated Oct. 7, 2012. |
European Search Report for EP Application No. 13193702.1, dated May 3, 2016. |
European Search Report for EP Application No. 13193638.7, dated May 4, 2016. |
European Search Report for EP Application No. 13193666.8, dated Apr. 29, 2016. |
Extended European Search Report for EP Application No. 13193674.2, dated Apr. 29, 2016. |
Extended European Search Report for EP Application No. 13193561.1, dated May 9, 2016. |
Extended European Search Report for EP Application No. 13193672.6, dated May 9, 2016. |
AU Patent Office—Notice of Acceptance for AU Application No. 2013260687, dated Apr. 12, 2016. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,795, dated Apr. 11, 2014. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,795, dated Oct. 16, 2014. |
Supplemental Response to Non-Final Office Action for U.S. Appl. No. 14/015,795, dated Nov. 11, 2014. |
Notice of Allowance for U.S. Appl. No. 14/015,795, dated Nov. 26, 2014. |
1.312 Amendment After Allowance for U.S. Appl. No. 14/015,795, dated Feb. 2, 2015. |
Non-Final Office Action for U.S. Appl. No. 14/015,830, dated Nov. 15, 2013. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,830, dated Feb. 18, 2014. |
Final Office Action for U.S. Appl. No. 14/015,830, dated Apr. 15, 2014. |
Response to Final Office Action for U.S. Appl. No. 14/015,830, dated Jul. 15, 2014. |
Notice of Allowance for U.S. Appl. No. 14/015,830, dated Oct. 9, 2014. |
Notice of Allowance for U.S. Appl. No. 14/015,830, dated Jan. 14, 2015. |
Non-Final Office Action for U.S. Appl. No. 14/015,873, dated Dec. 19, 2013. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,873, dated Mar. 19, 2014. |
Final Office Action for U.S. Appl. No. 14/015,873, dated May 23, 2014. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,873, dated Mar. 3, 2015. |
Final Office Action for U.S. Appl. No. 14/015,873, dated May 29, 2015. |
Appeal Brief for U.S. Appl. No. 14/015,873, dated Feb. 11, 2016. |
Non-Final Office Action for U.S. Appl. No. 14/015,890, dated May 22, 2015. |
Non-Final Office Action for U.S. Appl. No. 14/015,909, dated Feb. 24, 2015. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,909, dated Jun. 24, 2015. |
Final Office Action for U.S. Appl. No. 14/015,909, dated Sep. 4, 2015. |
Response to Final Office Action for U.S. Appl. No. 14/015,909, dated Dec. 4, 2015. |
Non-Final Office Action for U.S. Appl. No. 14/015,909, dated Jan. 12, 2016. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,926, dated Apr. 29, 2015. |
Final Office Action for U.S. Appl. No. 14/015,926, dated Jun. 17, 2015. |
Response to Final Office Action for U.S. Appl. No. 14/015,926, dated Sep. 17, 2015. |
Non-Final Office Action for U.S. Appl. No. 14/015,940, dated Dec. 30, 2013. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,940, dated Mar. 31, 2014. |
Response to Final Office Action for U.S. Appl. No. 14/015,940, dated Jan. 2, 2015. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,940, dated Jun. 12, 2015. |
Final Office Action for U.S. Appl. No. 14/015,940, dated Jul. 30, 2015. |
Response to Final Office Action for U.S. Appl. No. 14/015,940, dated Dec. 16, 2015. |
Non-Final Office Action for U.S. Appl. No. 14/015,940, dated Feb. 19, 2016. |
Non-Final Office Action for U.S. Appl. No. 14/015,926, dated Jan. 29, 2015. |
Final Office Action for U.S. Appl. No. 14/015,947, dated May 11, 2015. |
Non-Final Office Action for U.S. Appl. No. 14/015,909, dated Jul. 12, 2016. |
Alan Messer, et al. “Towards a Distributed Platform for Resource-Constrained Devices;” Proceedings of the 22nd International Conference on Distributed Computing Systems, Jul. 2, 2002. |
Niroshinie Fernando, et al. “Mobile cloud computing: A Survey;” Future Generation Computer Systems 29 (2013), Jun. 6, 2012. |
European Search Report for Application No. EP 13193666.8, dated Jul. 26, 2016. |
European Search Report for Application No. EP 13193558.7, dated Jul. 21, 2016. |
European Search Report for Application No. EP 13193702.1, dated Aug. 9, 2016. |
European Search Report for Application No. EP 13193638.7, dated Jul. 26, 2016. |
European Search Report for Application No. EP 13193670.0, dated Jul. 21, 2016. |
Supplementary European Search Report for Publication No. EP 1213896, dated Apr. 8, 2004. |
MX—OA Memo Concerning the Official Action Reported in the Covering Letter, Mexican Patent Application No. MX/A/2015/006361, with English translation, dated Jul. 6, 2016. |
MX—OA Memo Concerning the Official Action Reported in the Covering Letter, Mexican Patent Application No. MX/A/2015/006359, with English translation, dated Jul. 6, 2016. |
International Search Report and Written Opinion from International Application No. PCT/KR2015/001992, dated May 14, 2015. |
Non-Final OA for U.S. Appl. No. 14/633,673, dated Feb. 9, 2017. |
Final OA for U.S. Appl. No. 14/015,890, dated Nov. 4, 2016. |
Non-Final OA for U.S. Appl. No. 14/015,940, dated Nov. 17, 2016. |
Non-Final OA for U.S. Appl. No. 14/813,002, dated Dec. 30, 2016. |
EP Communication under Rule 71(3) for Application No. 13 193 674.2-1972, dated Jun. 9, 2017. |
Notice of Appeal and Pre-Appeal Brief for U.S. Appl. No. 14/015,873, Sep. 29, 2015. |
Notice of Panel Decision for U.S. Appl. No. 14/015,873, dated Dec. 11, 2015. |
Response to Final Office Action for U.S. Appl. No. 14/015,926, dated Apr. 8, 2016. |
Response to Non-Final OA for U.S. Appl. No. 14/813,002, dated Mar. 20, 2017. |
Decision on Appeal for U.S. Appl. No. 14/015,873, Jun. 2, 2017. |
Examiner's Answer to Appeal Brief for U.S. Appl. No. 14/015,926, dated Jun. 2, 2017. |
Notice of Panel Decision for U.S. Appl. No. 14/015,940, dated Jul. 10, 2017. |
Final Office Action for U.S. Appl. No. 14/813,002, dated Jul. 13, 2017. |
Non-Final OA for U.S. Appl. No. 14/632,065, dated Mar. 10, 2017. |
Final OA for U.S. Appl. No. 14/015,940, dated Mar. 20, 2017. |
RU Decision on Grant for Application No. 2015124021/08 (with English translation), dated Mar. 14, 2017. |
CN Office Action for Application No. 201380070267.1 (with English translation), dated May 22, 2017. |
CN—First OA for Application No. 201380070241.7 (with English translation), dated Feb. 3, 2017. |
Decision on Grant for RU Application No. 2015124004/08(037489) (with English translation), dated Mar. 22, 2017. |
RU—OA for Application No. 2015124029/08(037515) (with English translation), dated Mar. 24, 2017. |
CN—First OA for Application No. 201380070244.0 (with English translation), dated Apr. 14, 2017. |
MX Office Action for Application No. MX/a/2017/013843, dated Jan. 4, 2019. |
KR Office Action for Application No. 10-2013-0141777, dated Sep. 10, 2019. |
AU Notice of Acceptance for Application No. 2013260681, dated Jul. 26, 2019. |
AU Examination Report for Application No. 2013260685, dated Aug. 15, 2019. |
EP Search Report for Application No. 19162170.5, dated Jul. 10, 2019. |
CN Office Action for Application No. 201580021006.X (with English translation), dated Jan. 21, 2019. |
CN Office Action for Application No. 201380070246.X (with English translation), dated Aug. 3, 2017. |
JP Non-Final Office Action for Application No. 2013-240448 (with English translation), dated Aug. 14, 2017. |
RU Decision on Grant for Application No. 2015124029/08 (with English translation), dated Aug. 25, 2017. |
CN Office Action for Application No. 201380070247.4 (with English translation), dated Aug. 29, 2017. |
JP Non-Final Office Action for Application No. 2013-240462 (with English translation), dated Sep. 11, 2017. |
CN Office Action for Application No. 201380070264.8 (with English translation), dated Sep. 14, 2014. |
JP Non-Final Office Action for Application No. 2013-240459 (with English translation), dated Sep. 27, 2017. |
JP Non-Final Office Action for Application No. 2013-240447 (with English translation), dated Sep. 27, 2017. |
JP Non-Final Office Action for Application No. 2013-240460 (with English translation), dated Sep. 28, 2017. |
CN Office Action for Application No. 201380070260.X (with English translation), dated Oct. 10, 2017. |
AU Examination Report for Application No. 2013260685, dated Sep. 24, 2019. |
MX Office Action for Application No. MX/a/2017/013843, dated Nov. 12, 2019. |
KR Office Action for Application No. 10-2013-0141775, dated Dec. 2, 2019. |
EP Office Action for Application No. 13193638.7, dated Dec. 10, 2019. |
CN Office Action for Application No. 201380070267.1 (no English translation), dated Feb. 5, 2018. |
JP Notice of Allowance for Application No. 2013-240460 (with English translation), dated Mar. 19, 2018. |
JP Office Action for Application No. 2013-240462 (with English translation), dated Apr. 2, 2018. |
JP Notice of Allowance for Application No. 2013-240459 (with English translation), dated Apr. 2, 2018. |
CN Office Action for Application No. 201380070246.X (no English translation), dated Apr. 4, 2018. |
Communication Pursuant to Article 94(3) for EP Application No. 13193672.6-1216, dated Apr. 6, 2018. |
JP Office Action for Application No. 2013-240448 (with English translation), dated Apr. 9, 2018. |
Wolski, Rich et al., “Using Bandwidth Data to Make Computation Offloading Decisions”, IEEE International Symposium on Parallel & Distributed Processing, Apr. 14, 2008. |
CN Office Action for Application No. 201380070244.0 (with English translation), dated Dec. 13, 2017. |
CN Office Action for Application No. 201380070243.6 (with English translation), dated Dec. 19, 2017. |
EP Communication Pursuant to Article 94(3) for Application No. 13 193 638.7-1972, dated Nov. 13, 2017. |
EP Communication Pursuant to Article 94(3) for Application No. 13 193 666.8-1972, dated Nov. 13, 2017. |
EP Communication Pursuant to Article 94(3) for Application No. 13 193 702.1-1972, dated Nov. 13, 2017. |
EP Communication Pursuant to Article 94(3) for Application No. 13 193 558.7-1879, dated Dec. 1, 2017. |
EP Communication Pursuant to Article 94(3) for Application No. 13 193 670.0-1879, dated Dec. 5, 2017. |
JP Office Action for Application No. 2013-240449 (with English translation), dated Jan. 15, 2018. |
JP Office Action for Application No. 2013-240457 (with English translation), dated Jan. 15, 2018. |
ID Office Action for Application No. P00201503667 (with English translation), dated Oct. 8, 2018. |
JP Notice of Allowance for Application No. 2013-240448 (no English translation), dated Oct. 15, 2018. |
ID Office Action for Application No. P00201503666 (with English translation), dated Nov. 8, 2018. |
Examination Report No. 2 for AU Application No. 2013260686, dated Nov. 12, 2018. |
MY Office Action for Application No. PI 2015701582, dated Nov. 15, 2018. |
IN Office Action for Application No. 3654/MUM/2013 (with English translation), dated Nov. 27, 2018. |
JP Notice of Allowance for Application No. 2013-240447 (no English translation), dated Dec. 3, 2018. |
CN 2nd Office Action for Chinese Patent Application No. 201580021006.X (with English Translation), 14 pages, dated Aug. 2, 2019. |
EP Examination Report for Application No. 15754441.2, 8 pgs, dated Sep. 3, 2019. |
AU Notice of Allowance for Application No. 2013260686, dated Apr. 16, 2019. |
AU Notice of Allowance for Application No. 2013260682, dated May 7, 2019. |
CN 4th Office Action for Application No. 201380070244.0 (no English translation), dated Jun. 10, 2019. |
JP Notice of Allowance for Application No. 2013-240449 (no English translation), dated Jun. 24, 2019. |
Wilbert O. Galitz, “The Essential Guide to User Interface Design”, second edition, 2002. |
Computer User's Dictionary, fifth edition, 2002. |
Narayanaswami et al., “Application design for a smart watch with a high-resolution display”, IEEE, 2002. |
Website Architecture, “Web Site Information Architecture Models”, Retrieved from Internet: <http://webdesignfromscratch.com/>, 2007. |
AU Examination Report No. 1 for Application No. 2013260681, dated Aug. 22, 2018. |
AU Examination Report No. 1 for Application No. 2013260682, dated Sep. 18, 2018. |
AU Examination Report No. 1 for Application No. 2013260683, dated Sep. 19, 2018. |
AU Examination Report No. 1 for Application No. 2013260684, dated Sep. 27, 2018. |
AU Examination Report No. 1 for Application No. 2013260685, dated Sep. 25, 2018. |
AU Examination Report No. 1 for Application No. 2013260686, dated Aug. 6, 2018. |
AU Examination Report No. 1 for Application No. 2013260688, dated Oct. 2, 2018. |
CN Office Action for Application No. 201380070246.X (No English translation), dated Sep. 28, 2018. |
CN Office Action for Application No. 201380070267.1 (No English translation), dated Aug. 17, 2018. |
EP Communication Pursuant to Article 94(3) for Application No. 16181520.4-1216, dated Jul. 26, 2018. |
JP Notice of Allowance for Application No. 2013-240457 (No English translation), dated Sep. 21, 2018. |
JP Office Action for Application No. 2013-240447 (No English translation), dated Aug. 13, 2018. |
JP Office Action for Application No. 2013-240462 (No English translation), dated Sep. 10, 2018. |
MY Office Action for Application No. PI 2015701575, dated Aug. 15, 2018. |
MY Office Action for Application No. PI 2015701577, dated Aug. 30, 2018. |
MY Office Action for Application No. PI 2015701579, dated Aug. 30, 2018. |
CN Decision of Rejection for Application No. 201580021006.X, 10 pages, dated Jan. 3, 2020. |
IN 1st Office Action for Application No. 201627027445, 6 pages, dated Feb. 10, 2020. |
Malaysian NOA for Application No. PI2015701579, 3 pages, dated Mar. 2, 2020. |
Indian Office Action for Application No. 3642/MUM/2013, 6 pages, dated Jan. 15, 2020. |
Malaysian OA for Application No. PI2015701577, 2 pages, dated Jan. 30, 2020. |
Korean NOA for Application No. 10-2013-0141777, (no translation), 3 pages, dated Mar. 2, 2020. |
Response to Final Office Action for U.S. Appl. No. 14/015,873, dated Oct. 23, 2014. |
Examiner's Answer to Appeal Brief for U.S. Appl. No. 14/015,873, dated Jul. 21, 2016. |
Reply Brief for U.S. Appl. No. 14/015,873, Sep. 21, 2016. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,940, dated Jun. 20, 2016. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,890, dated Sep. 22, 2015. |
Non-Final Office Action for U.S. Appl. No. 14/015,926, dated Oct. 7, 2015. |
Response to Non-Final Office Action for U.S. Appl. No. 14/015,926, dated Feb. 8, 2016. |
Final Office Action for U.S. Appl. No. 14/015,890, dated Nov. 17, 2015. |
Response to Final Office Action for U.S. Appl. No. 14/015,890, dated Feb. 17, 2016. |
Final OA for U.S. Appl. No. 14/015,926, dated Apr. 8, 2016. |
Notice of Appeal and Pre-Appeal Brief for U.S. Appl. No. 14/015,926, Aug. 8, 2016. |
Appeal Brief for U.S. Appl. No. 14/015,926, Jan. 13, 2017. |
Notice of Defective Appeal Brief for U.S. Appl. No. 14/015,926, Feb. 14, 2017. |
Supplemental Appeal Brief for U.S. Appl. No. 14/015,926, Mar. 9, 2017. |
Non-Final OA for U.S. Appl. No. 14/015,890, dated Jul. 12, 2016. |
Response to Non-Final OA for U.S. Appl. No. 14/015,890, dated Oct. 12, 2016. |
Final OA for U.S. Appl. No. 14/015,940, dated Jul. 28, 2016. |
Response to Final OA for U.S. Appl. No. 14/015,940, dated Oct. 28, 2016. |
Response to Final OA for U.S. Appl. No. 14/015,890, dated Feb. 6, 2017. |
Response to Non-Final OA for U.S. Appl. No. 14/015,940, dated Feb. 17, 2017. |
Response to Non-Final OA for U.S. Appl. No. 14/813,002, dated Mar. 30, 2017. |
Response to Non-Final OA for U.S. Appl. No. 14/633,673, dated Jul. 10, 2017. |
Response to Non-Final OA for U.S. Appl. No. 14/632,065, dated Jun. 12, 2017. |
Notice of Appeal and Pre-Appeal Brief for U.S. Appl. No. 14/015,940, Jun. 20, 2017. |
Submission with R.C.E. after Decision of the Patent Trial and Appeal Board for U.S. Appl. No. 14/015,873, Aug. 2, 2017. |
Supplemental Response for U.S. Appl. No. 14/015,873, dated Aug. 21, 2017. |
Non-Final OA for U.S. Appl. No. 14/015,890, dated Aug. 24, 2017. |
Extended Search Report for EP Application No. 15754441.2-1879, dated Sep. 22, 2017. |
AU Examination Report No. 2 for Application No. 2013260681, dated Jan. 4, 2019. |
AU Notice of Acceptance for Application No. 2013260684, dated Jan. 17, 2019. |
CN Office Action for Application No. 201380070244.0 (No English translation), dated Jan. 18, 2019. |
JP Office Action for Application No. 2013-240449 (No English translation), dated Jan. 21, 2019. |
AU Notice of Acceptance for Application No. 2013260683, dated Jan. 21, 2019. |
EP Decision to Refuse for Application No. 13 193 670.0-14, dated Jan. 22, 2019. |
AU Notice of Acceptance for Application No. 2013260688, dated Feb. 11, 2019. |
AU Examination Report No. 3 for Application No. 2013260686, dated Feb. 11, 2019. |
AU Examination Report No. 2 for Application No. 2013260682, dated Feb. 14, 2019. |
CN Office Action for Application No. 201380070246.X (No English translation), dated Mar. 1, 2019. |
JP Notice of Allowance for Application No. 2013-240462 (No English translation), dated Mar. 18, 2019. |
CN Office Action for Application No. 201380070267.1 (No English translation), dated Mar. 19, 2019. |
MX Office Action for Application No. MX/a/2015/006361 (with English translation), dated Jun. 15, 2017. |
CN Office Action for Application No. 201380070260.X (with English translation), dated Jul. 4, 2017. |
EP Communication Pursuant to Article 94(3) for Application No. 13 193 672.6-1972, dated Jul. 27, 2017. |
Summons to Attend Oral Proceedings pursuant to Rule 115(1) for EP Application No. 13193558.7-1221, May 9, 2018. |
Summons to Attend Oral Proceedings pursuant to Rule 115(1) for EP Application No. 13193670.0-1221, May 17, 2018. |
CN Office Action for Application No. 201380070244.0 (no English translation), dated Jun. 13, 2018. |
JP Office Action for Application No. 2013-240449 (with English translation), dated Jun. 18, 2018. |
Communication Pursuant to Article 94(3) for EP Application No. 13193638.7-1216, dated Jun. 29, 2018. |
Notice of Allowance for European Application No. 13193638.7, dated Feb. 18, 2021. |
Office Action for Australian Application No. 2019236672, dated Dec. 10, 2020. |
Office Action for European Application No. 19162170.5, dated Feb. 4, 2021. |
Kovachev, et al., Augmenting Pervasive Environments with an XMPP-Based Mobile Cloud Middleware, International Conference on Mobile Computing, Applications, and Services, Oct. 2010, 13 pages. |
Number | Date | Country | |
---|---|---|---|
20140143784 A1 | May 2014 | US |
Number | Date | Country | |
---|---|---|---|
61728765 | Nov 2012 | US | |
61728770 | Nov 2012 | US | |
61773803 | Mar 2013 | US | |
61728773 | Nov 2012 | US | |
61773813 | Mar 2013 | US | |
61773815 | Mar 2013 | US | |
61773817 | Mar 2013 | US | |
61775688 | Mar 2013 | US | |
61775687 | Mar 2013 | US | |
61775686 | Mar 2013 | US |