Crown for an electronic watch

Information

  • Patent Grant
  • Patent Number
    11,906,937
  • Date Filed
    Monday, January 9, 2023
  • Date Issued
    Tuesday, February 20, 2024
Abstract
An electronic watch includes a housing defining a side surface of the electronic watch, a transparent cover coupled to the housing and defining a front surface of the electronic watch, an image-sensing element, and a crown extending from the side of the housing and defining an imaging surface. The crown may include a light-directing feature configured to direct, onto the image-sensing element, an image of an object in contact with the imaging surface.
Description
FIELD

The described embodiments relate generally to electronic devices, and more particularly to a crown for a wearable electronic device.


BACKGROUND

Electronic devices frequently use physical input devices to facilitate user interaction. For example, buttons, keys, dials, and the like can be physically manipulated by users to control operations of the device. Physical input devices may use various types of sensing mechanisms to translate the physical manipulation to signals usable by the electronic device. For example, buttons and keys may use collapsible dome switches to detect presses, while dials and other rotating input devices may use encoders or resolvers to detect rotational movements.


SUMMARY

An electronic watch includes a housing defining a side surface of the electronic watch, a transparent cover coupled to the housing and defining a front surface of the electronic watch, an image-sensing element, and a crown extending from the side of the housing and defining an imaging surface. The crown may include a light-directing feature configured to direct, onto the image-sensing element, an image of an object in contact with the imaging surface.


The electronic watch may further include a display positioned at least partially within the housing, and a touch sensor positioned below the transparent cover and configured to detect touch inputs applied to the transparent cover. The crown may include a head having a light-transmissive portion defining the imaging surface, and a light-transmissive shaft configured to receive light from the light-directing feature. The light-transmissive portion of the head may be transparent to infrared radiation and opaque to visible light. The light-transmissive shaft may be configured to guide the light to the image-sensing element. The head and the light-transmissive shaft may be portions of a monolithic light-transmissive member.


The light-directing feature may include an interface between a first material and a second material, the interface configured to reflect incident light. The first material may be a light-transmissive solid, and the second material may be air. The interface between the first material and the second material may be at least partially defined by an angled surface, and the angled surface may cause the incident light to be reflected towards the image-sensing element.


A wearable electronic device may include a housing, a display positioned at least partially within the housing, a crown at least partially external to the housing and defining an imaging surface along a peripheral portion of the crown, and an image-sensing element within the housing and configured to receive an image of an object in contact with the imaging surface. The imaging surface may be defined by a semi-transparent mirror coating.


The crown may include a light-transmissive member defining an angled surface configured to direct light from the imaging surface to the image-sensing element. The angled surface may have an angle configured to produce total internal reflection of the light. The wearable electronic device may further include a reflective material applied to the angled surface.


The light-transmissive member may at least partially define a head of the crown and a shaft of the crown. The wearable electronic device may further include a light source at least partially within the housing and configured to illuminate the object.


An electronic watch may include a housing, a display positioned at least partially within the housing, an image sensor at least partially within the housing and comprising an image-sensing element, and a crown. The crown may include a head portion defining an imaging surface external to the housing, a shaft portion extending at least partially into the housing, and a reflective feature directing light from the imaging surface through the shaft portion and towards the image-sensing element. The electronic watch may further include a transparent cover covering the display and a sensor configured to detect touch events applied to the transparent cover.


The reflective feature may include a curved surface configured to magnify an image of an object in contact with the imaging surface. The curved surface may define an interface between the head portion and air.


The crown may further include a cover member coupled to an end of the head portion. The head portion may define a cylindrical peripheral surface, and a peripheral surface of the cover member may be flush with the cylindrical peripheral surface of the head portion.





BRIEF DESCRIPTION OF THE DRAWINGS

The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements, and in which:



FIGS. 1A-1B depict an example wearable electronic device;



FIGS. 2A-2B depict an example wearable electronic device being used;



FIG. 3 is a partial cross-sectional view of an example wearable electronic device having a crown with a light-directing feature;



FIG. 4A depicts an example wearable electronic device in a state of use;



FIG. 4B depicts an example image-sensing element corresponding to the state of use in FIG. 4A;



FIG. 4C depicts an example wearable electronic device in another state of use;



FIG. 4D depicts an example image-sensing element corresponding to the state of use in FIG. 4C;



FIG. 5A depicts an example wearable electronic device in a state of use;



FIG. 5B depicts an example image-sensing element corresponding to the state of use in FIG. 5A;



FIG. 5C depicts an example wearable electronic device in another state of use;



FIG. 5D depicts an example image-sensing element corresponding to the state of use in FIG. 5C;



FIG. 6A is a partial cross-sectional view of an example wearable electronic device having a crown with a light-directing feature;



FIG. 6B depicts an example image-sensing element of the wearable electronic device of FIG. 6A;



FIG. 7 is a partial cross-sectional view of an example wearable electronic device having a crown with a curved light-directing feature;



FIG. 8 is a partial cross-sectional view of an example wearable electronic device having a crown that does not include a shaft;



FIG. 9 is a partial cross-sectional view of an example wearable electronic device having a crown with a cover member;



FIG. 10 is a partial cross-sectional view of an example wearable electronic device having a crown with a light-directing feature and a force sensor;



FIGS. 11A-11B are side views of example wearable electronic devices with crowns having imaging surfaces; and



FIG. 12 depicts example components of a wearable electronic device.





DETAILED DESCRIPTION

Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following description is not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.


The embodiments herein are generally directed to a crown of a wearable electronic device, such as an electronic watch (also referred to as a smart watch), and more particularly to a crown that includes an optical sensing system to detect user interactions with the crown. For example, users may interact with a crown by rotating the crown or, in the case of a crown that is rotationally constrained, sliding a finger over a surface of the crown. In order to sense the motion of the user's finger, a crown as described herein may include a window on the part of the crown that the user touches when interacting with the crown. The window may allow an image of the user's finger to be captured by an image sensor that is positioned within the housing. The image sensor, which may include an image-sensing element (e.g., a charge-coupled device (CCD)) in conjunction with associated processors and other components within the electronic watch, may determine how the user's finger has moved and control the operation of the electronic watch accordingly. For example, the watch may determine a speed and a direction of motion of the user's finger (or other suitable parameter), and cause a graphical output that is displayed on the watch's display to move at a speed and along a direction that is indicated by the detected motion.


As used herein, an image may refer to an optical representation of an object, which may be produced, transmitted, or propagated by lenses, mirrors, or the like. An image may be captured and stored, by an image sensor, as a multidimensional array having pixels that represent a small portion of the image formed on the image sensor. The multidimensional array may be stored as a single frame (e.g., a photograph) or a series of frames (e.g., a video). In order to detect motion using the image (e.g., the stored image), the image sensor may analyze multiple frames to determine, for example, a speed and direction of one or more features in the image. The features that are analyzed may include features of a user's skin (e.g., fingerprints, hair follicles), or any other optically detectable feature, texture, surface irregularity, image, or the like, of any object. In this way, the device may be responsive to skin (e.g., the skin of a user's finger or hand), a stylus, a gloved finger, or any other suitable object with optically detectable features. As used herein, analysis of an image by an image sensor and/or other components of an electronic device may refer to an analysis of a stored image (e.g., the stored multidimensional array, which may be a digital photograph or a video).
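
As a concrete illustration of this frame-based analysis, the following sketch (not part of the patent; the function name and threshold are hypothetical) compares two stored grayscale frames to decide whether anything on the imaging surface has moved:

```python
import numpy as np

def motion_present(frame_a, frame_b, threshold=8.0):
    """Return True if the mean absolute per-pixel change between two
    consecutive grayscale frames exceeds a fixed threshold."""
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    return diff.mean() > threshold

# Demo: two 64x64 8-bit frames, the second shifted by one pixel to
# simulate a finger sliding across the imaging surface.
rng = np.random.default_rng(0)
frame_a = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
frame_b = np.roll(frame_a, shift=1, axis=1)
print(motion_present(frame_a, frame_b))  # -> True
```

A full implementation would go on to estimate the speed and direction of the detected motion, as sketched later in this description.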


To facilitate capturing an image of the user's finger (or other implement used to interact with the crown), the crown may include light-transmissive materials and light-directing features that direct light (corresponding to an image of the user's finger) through the crown and onto an image-sensing element of an image sensor. For example, a peripheral surface of the crown (e.g., the cylindrical peripheral surface that a user touches to rotate the crown) may be light-transmissive, and the crown may have a reflecting feature to direct light from the peripheral surface onto the image-sensing element. In one such example, the crown may include a monolithic light-transmissive member that defines both a head portion and a shaft portion of the crown, as well as an angled surface that causes light (e.g., the light corresponding to an image of the user's finger) to be reflected at an approximately 90 degree angle so that the light may be directed through the shaft and into an enclosure of the watch. More particularly, when a user places a finger on the peripheral surface of the head, the light corresponding to an image of the user's finger may be initially directed into the head along a radial direction (relative to the cylindrical peripheral surface of the head). However, an image-sensing element may be within the housing along an axial direction of the head. Accordingly, the head may have an angled surface formed into the material that effectively changes the direction of the light (e.g., reflects or otherwise redirects the light) so that the light is directed along the axis of the crown and into the watch. In this way, the image of the user's finger (or other implement or object) is directed onto an image-sensing element within the watch, allowing the watch to analyze the image and control the operation of the device accordingly.


The optical crown system described herein may be used with both freely rotatable crowns (e.g., crowns that can rotate about an axis an indefinite number of turns) as well as rotationally constrained crowns. As used herein, a “rotationally constrained” crown or component refers to a component that is not free to rotate more than a full revolution under normal use conditions (e.g., when manipulated by the hands of a person). Thus, rotationally constrained components include both rotationally fixed components and partially rotatable components.


In the case of a rotationally constrained crown, if a user attempts to rotate the crown to operate the device, the crown may not physically rotate. Instead, the user's fingers may slide along a surface of the crown while the crown remains stationary. As used herein, a finger or object “sliding” along a surface may refer to the finger (or other object) moving along the surface of the crown while the finger (or other object) is in contact with the surface. In the case of a rotating crown, as the user's finger moves forward to rotate the crown, the part of the user's finger that is touching the crown changes. In either case, as the user's finger moves, the image that is projected or incident on the image-sensing element may include a moving image of the surface of the user's finger. The image sensor may analyze the movement of the image to determine how to manipulate the graphical output or other property of the watch. The image sensor may use features of the user's skin, such as the ridges of the user's skin (or any other texture or optically detectable feature), to determine the speed and direction of motion.


Advantageously, the crown described herein can detect inputs even under conditions where other types of touch sensors may fail. For example, some touch-sensing technologies use capacitive sensors to detect touch events or inputs. The effectiveness of capacitive sensors may be reduced, however, by gloves, clothing, overly wet or dry skin, lotions, or the like. By detecting motion via images, the optical sensing structures described herein may avoid such negative effects, operate effectively over a broader range of conditions, and sense movement of objects other than bare skin.



FIGS. 1A-1B depict an electronic device 100. The electronic device 100 is depicted as an electronic watch, though this is merely one example embodiment of an electronic device and the concepts discussed herein may apply equally or by analogy to other electronic devices, including mobile phones (e.g., smartphones), tablet computers, notebook computers, head-mounted displays, digital media players (e.g., mp3 players), or the like.


The electronic device 100 includes a housing 102 and a band 104 coupled to the housing. The band 104 may be configured to attach the electronic device 100 to a user, such as to the user's arm or wrist.


The electronic device 100 also includes a transparent cover 108 coupled to the housing 102. The cover 108 may define a front face of the electronic device 100. For example, in some cases, the cover 108 defines substantially the entire front face and/or front surface of the electronic device. The cover 108 may also define an input surface of the device 100. For example, as described herein, the device 100 may include touch and/or force sensors that detect inputs applied to the cover 108. The cover 108 may be formed from or include glass, sapphire, a polymer, a dielectric, or any other suitable material.


The cover 108 may cover at least part of a display 109 that is positioned at least partially within the housing 102. The display 109 may define an output region in which graphical outputs are displayed. Graphical outputs may include graphical user interfaces, user interface elements (e.g., buttons, sliders, etc.), text, lists, photographs, videos, or the like. The display 109 may include a liquid-crystal display (LCD), organic light emitting diode display (OLED), or any other suitable components or display technology.


The display 109 may include or be associated with touch sensors and/or force sensors that extend along the output region of the display and which may use any suitable sensing elements and/or sensing techniques. Using touch sensors, the device 100 may detect touch inputs applied to the cover 108, including detecting locations of touch inputs, motions of touch inputs (e.g., the speed, direction, or other parameters of a gesture applied to the cover 108), or the like. Using force sensors, the device 100 may detect amounts or magnitudes of force associated with touch events applied to the cover 108. The touch and/or force sensors may detect various types of user inputs to control or modify the operation of the device, including taps, swipes, multi-finger inputs, single- or multi-finger touch gestures, presses, and the like. Touch and/or force sensors usable with wearable electronic devices, such as the device 100, are described herein with respect to FIG. 12.


The electronic device 100 also includes a crown 112 having a head, protruding portion, or component(s) or feature(s) positioned along a side surface of the housing 102. At least a portion of the crown 112 may protrude from the housing 102, and may define a generally circular shape or a circular exterior surface. The exterior surface of the crown 112 may be textured, knurled, grooved, or may otherwise have features that may improve the tactile feel of the crown 112 and/or facilitate rotation sensing. The exterior surface of the crown 112 may also have one or more light-transmissive areas, such as a light-transmissive window, that define one or more imaging surfaces. An imaging surface may refer to a surface that allows an image to be captured of an object (e.g., a finger) that is touching that surface. For example, a light-transmissive window may allow an image to be captured, through the window, of a finger that is in contact with the window. Accordingly, the light-transmissive window may define an imaging surface.


The imaging surface or surfaces may enable the device 100 to capture an image of the user's finger while the finger is interacting with the crown 112 (e.g., in order to determine input parameters such as a speed and direction of an input, as described herein). The light-transmissive areas may be transparent to some wavelengths of light and substantially opaque to others. For example, if the device 100 uses infrared imaging to capture an image of the user's finger, the light-transmissive areas may be transparent to infrared wavelengths while being opaque to visible light.


The crown 112 may afford a variety of potential user interactions. For example, the crown 112 may be rotationally constrained (e.g., rotationally fixed or partially rotatable), and may include or be associated with sensors that detect when a user slides one or more fingers along a surface of the crown 112 in a movement that resembles rotating the crown 112 (or that would result in rotation of a freely rotating crown). More particularly, where the crown 112 is rotationally fixed or rotationally constrained, a user input that resembles a twisting or rotating motion may not actually result in any substantial physical rotation that can be detected for the purposes of registering an input. Rather, the user's fingers (or other object) will move in a manner that resembles twisting, turning, or rotating, but does not actually continuously rotate the crown 112. As another example, a user's attempt to rotate a rotationally fixed crown by applying a substantially tangential force to a surface of the crown 112 (as shown in FIGS. 2A-2B, for example) may also result in a sliding gesture along a surface of the crown 112. Thus, in the case of a rotationally fixed or constrained crown 112, an image sensor within the device 100 may detect inputs that result from a gesture that has the same motion as (and thus may feel and look the same as or similar to) rotating a rotatable crown.


In some cases, the crown 112 may be rotationally free or may include a rotationally free member that is free to rotate relative to the housing 102. More particularly, the rotationally free member may have no rotational constraints, and thus may be capable of being rotated indefinitely (or a sufficiently large number of turns that a user does not typically reach a hard-stop under normal use conditions). Even where the crown 112 or a portion thereof can rotate, the crown 112 may be configured so that light reflected from the user's finger is directed from an imaging surface of the crown 112 onto an image-sensing element within the housing 102.


Thus, both rotationally constrained and rotationally free crowns may detect gestures resembling a twisting, turning, or rotating motion, regardless of whether the crown rotates or not. As used herein, a twisting, turning, or rotating motion applied to a crown may be referred to as a gesture input or a rotational input (even if the crown itself does not physically rotate).


In cases where the crown 112, or a member or component of the crown 112, is capable of some rotation, it may rotate about a rotation axis (e.g., it may rotate as indicated by arrow 103 in FIG. 1A). The crown 112, or a member or component of the crown 112, may also be translatable relative to the housing 102 to accept axial inputs. For example, the crown 112 may be movable or translatable along the rotation axis, towards and/or away from the housing 102 (as indicated by arrow 105 in FIG. 1A). The crown 112 may therefore be manipulated by pushing and/or pulling on the crown 112.


The crown 112 may be able to translate any suitable distance. For example, a crown 112 may include a dome switch to register axial inputs, and the crown 112 may move a sufficient distance to facilitate physical actuation of the dome switch. In other cases, such as where a force sensor is used to detect axial inputs, the crown 112 may move a sufficient distance to facilitate force sensing. The distance that the crown 112 can translate or move may be any suitable distance, such as about 1 mm, 0.5 mm, 0.2 mm, 0.1 mm, 0.05 mm or any other suitable distance.


Alternatively, the crown 112 may be fixed or otherwise substantially non-translatable. In such cases, axial inputs applied to the crown 112 may be detected in other ways. For example, the crown 112 may include or be part of a contact sensor (described more fully below), such as a capacitive or resistive touch sensor, that determines when and optionally where a user's finger is in contact with the crown 112. The crown 112 may also use an optical sensing scheme to detect axial inputs. For example, as noted above, the crown 112 or a portion of the crown may be light-transmissive to allow light (corresponding to an image of the user's finger) to be directed from a peripheral surface of a head onto an image-sensing element. To facilitate axial input sensing, the crown 112 may also define an optical path from an end 113 of the crown 112 to an image-sensing element so that the image sensor can determine when a user's finger (or other object) is in contact with the end 113 of the crown 112.


The device 100 may include a force sensor to detect axial forces that are applied to the crown 112. The force sensor may include or use any suitable force sensing components and may use any suitable technique for sensing force inputs. For example, a force sensor may include a strain sensor, capacitive gap sensor, or other force sensitive structure that is configured to produce an electrical response that corresponds to an amount of force (e.g., axial force) applied to the crown 112. The electrical response may increase continuously as the amount of applied force increases, and as such may provide non-binary force sensing. Accordingly, the force sensor may determine, based on the electrical response of the force sensing components, one or more properties of the applied force associated with a touch input (e.g., a magnitude of the applied axial force).
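
To make the non-binary force sensing concrete, here is a minimal sketch assuming a hypothetical linear strain-gauge response; the baseline, scale constant, and thresholds are illustrative, not from the patent:

```python
def force_from_adc(adc_counts, counts_per_newton=820.0, baseline=2048):
    """Convert a raw strain-gauge ADC reading into an estimated axial
    force, assuming a linear response about a no-load baseline."""
    return max(adc_counts - baseline, 0) / counts_per_newton

def classify_press(force_newtons, light_threshold=0.5, firm_threshold=2.0):
    """Map a continuous (non-binary) force value onto input levels."""
    if force_newtons >= firm_threshold:
        return "firm press"
    if force_newtons >= light_threshold:
        return "light press"
    return "no press"

print(classify_press(force_from_adc(3800)))  # -> "firm press"
```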


As described herein, gesture inputs (e.g., rotational-style inputs applied to a rotationally free or rotationally constrained crown) and axial inputs (e.g., translations or axial forces) may control various operations and user interfaces of the electronic device 100. In particular, inputs to the crown 112 may modify the graphical output of the display 109. For example, a rotational movement of the crown 112 or a gesture applied to the crown 112 may zoom, scroll, or rotate a user interface or other object displayed on the display 109 (among other possible functions), while translational movements or axial inputs may select highlighted objects or icons, cause a user interface to return to a previous menu or display, or activate or deactivate functions (among other possible functions).
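
The mapping from crown inputs to user-interface actions described above might be dispatched as in the following sketch; the Display stand-in, event format, and sensitivity constant are all hypothetical:

```python
SCROLL_SENSITIVITY = 3.0  # hypothetical scale from finger velocity to lines

class Display:
    """Stand-in for the watch UI; only the calls used below are modeled."""
    def scroll(self, lines):
        print(f"scroll {lines:+.1f} lines")
    def select_highlighted(self):
        print("select highlighted item")

def handle_crown_event(event, ui):
    """Route a crown input to a UI action: gestures scroll the graphical
    output, axial presses select the highlighted object."""
    if event["kind"] == "gesture":
        ui.scroll(event["velocity"] * SCROLL_SENSITIVITY)
    elif event["kind"] == "axial":
        ui.select_highlighted()

ui = Display()
handle_crown_event({"kind": "gesture", "velocity": -1.2}, ui)
handle_crown_event({"kind": "axial"}, ui)
```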


The crown 112 may also be associated with or include a contact sensor that is configured to detect contact between a user and the crown 112 (e.g., touch inputs or touch events applied to the crown 112). The contact sensor may detect even non-moving contacts between the user and the crown 112 (e.g., when the user touches the crown 112 but does not rotate the crown or apply a sliding gesture to the crown 112). Contact sensing functionality may be provided by the same optical sensing system that also detects gestures (e.g., a finger sliding along a surface of a crown or the housing), or it may be provided by a separate sensor. The contact sensor may include or use any suitable type of sensor(s), including capacitive sensors, resistive sensors, magnetic sensors, inductive sensors, optical sensors, or the like. In some cases, the crown 112 itself, or components of the crown, may be conductive and may define a conductive path between the user (e.g., the user's finger) and a contact sensor. For example, the crown may be formed from or include metal, and may itself act as an electrode for conductively coupling a capacitive sensor to the user.


The device 100 may also include one or more haptic actuators that are configured to produce a tactile output through the crown 112. For example, the haptic actuator may be coupled to the crown 112 and may be configured to impart a force to the crown 112. The force may cause the crown 112 to move (e.g., to oscillate or vibrate translationally and/or rotationally, or to otherwise move to produce a tactile output), which may be detectable by a user when the user is contacting the crown 112. The haptic actuator may produce tactile output by moving the crown 112 in any suitable way. For example, the crown 112 (or a component thereof) may be rotated (e.g., rotated in a single direction, rotationally oscillated, or the like), translated (e.g., moved along a single axis), or pivoted (e.g., rocked about a pivot point). In other cases, the haptic actuator may produce tactile outputs using other techniques, such as by imparting a force to the housing 102 (e.g., to produce an oscillation, vibration, impulse, or other motion), which may be perceptible to a user through the crown 112 and/or through other surfaces of the device 100, such as the cover 108, the housing 102, or the like. Any suitable type of haptic actuator and/or technique for producing tactile output may be used to produce these or other types of tactile outputs, including electrostatics, piezoelectric actuators, oscillating or rotating masses, ultrasonic actuators, reluctance force actuators, voice coil motors, Lorentz force actuators, or the like. In some cases, haptic outputs may be produced by collapsible domes, springs, or other mechanical components.


Tactile outputs may be used for various purposes. For example, tactile outputs may be produced when a user presses the crown 112 (e.g., applies an axial force to the crown 112) to indicate that the device 100 has registered the press as an input to the device 100. As another example, tactile outputs may be used to provide feedback when the device 100 detects a rotation of the crown 112 or a gesture being applied to the crown 112. For example, a tactile output may produce a repetitive “click” sensation as the user rotates the crown 112 or applies a gesture to the crown 112. Tactile outputs may be used for other purposes as well.


The electronic device 100 may also include other inputs, switches, buttons, or the like. For example, the electronic device 100 includes a button 110. The button 110 may be a movable button (as depicted) or a touch-sensitive region of the housing 102. The button 110 may control various aspects of the electronic device 100. For example, the button 110 may be used to select icons, items, or other objects displayed on the display 109, to activate or deactivate functions (e.g., to silence an alarm or alert), or the like.



FIGS. 2A-2B show a front and side view, respectively, of a device 200 during one example use condition. The device 200 may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100. Accordingly, details of the device 100 described above may apply to the device 200, and for brevity will not be repeated here.


In the example shown in FIGS. 2A-2B, the wearable device 200 includes a crown 212 that a user may contact to provide input through the crown 212. The crown 212 may define an imaging surface 216 that is positioned in a location where a user is likely to touch the crown 212 when interacting with and/or providing input to the crown 212. In some cases, the imaging surface 216 may extend around the entire peripheral portion or surface of the crown 212. Thus, the crown 212 and an associated image sensor may detect inputs applied to any part of the periphery of the crown 212. In other cases, the imaging surface 216 may extend along less than the entire periphery of the crown 212.


The imaging surface 216, which may be defined by a light-transmissive portion of the crown 212, may be in optical communication with an image-sensing element within the device 200 to facilitate the image sensor capturing and analyzing an image of whatever object is in contact with the imaging surface 216 (e.g., a bare finger, a gloved finger). For example, the crown 212 may include a light-directing feature, such as an angled surface that changes the direction of the light that corresponds to an image of the object so that it is incident on the image-sensing element. In some cases, the crown 212 includes a monolithic light-transmissive member that defines the imaging surface 216 and a shaft of the crown 212, as well as a light-directing feature to redirect light down the shaft of the crown and towards the image-sensing element.



FIGS. 2A-2B show a user interacting with the crown 212 to provide an input to the device 200. In the case of a rotationally constrained crown, the crown will not continuously rotate in response to the force applied by the finger 201 moving along the direction indicated by arrow 217 (while the finger is in contact with the crown 212). Rather, the finger 201 will slide along a surface of the crown 212. In the case of a rotationally free crown, the force applied to the crown 212 by the user's finger 201 causes the crown 212 (or a head or other component of the crown 212) to rotate relative to the housing 202. In either case, an image of the user's finger 201 is projected or otherwise incident on the image-sensing element within the device 200, and the image sensor detects the movement of the finger 201 sliding along the imaging surface 216 and causes the device 200 to take an action in response to the detected movement. For example, as shown in FIG. 2A, upon detection of the motion of the finger 201, the device 200 may cause a graphical output 207 on a display 209 to move in accordance with the movement of the finger 201. A movement of the finger 201 in the direction indicated by arrow 217 may result in the graphical output 207 moving in the direction indicated by arrow 215. A movement of the finger 201 in the opposite direction may result in the graphical output 207 moving in the opposite direction. Rotating the crown 212 or sliding a finger along a surface of the crown 212 may change other operational properties of the device 200 in addition to or instead of scrolling a graphical output 207. For example, sliding a finger along the surface of the crown 212 may change parameters or settings of the device, control a zoom level of a graphical output, rotate a displayed graphical output, translate a displayed graphical output, change a brightness level of a graphical output, change a time setting, scroll a list of displayed items (e.g., numbers, letters, words, images, icons, or other graphical output), or the like.


In some cases, the graphical output 207 may also be responsive to inputs applied to a touch-sensitive display 208. The touch-sensitive display 208 may include or be associated with one or more touch and/or force sensors that extend along an output region of a display and which may use any suitable sensing elements and/or sensing techniques to detect touch and/or force inputs applied to the touch-sensitive display 208. The same or similar graphical output 207 manipulations that are produced in response to inputs applied to the crown 212 may also be produced in response to inputs applied to the touch-sensitive display 208. For example, a swipe gesture applied to the touch-sensitive display 208 may cause the graphical output 207 to move along the direction indicated by the arrow 215 (FIG. 2A). As another example, a tap gesture applied to the touch-sensitive display 208 may cause an affordance to be selected or activated. In this way, a user may have multiple different ways to interact with and control an electronic watch, and in particular the graphical output 207 of an electronic watch. Further, while the crown 212 may provide overlapping functionality with the touch-sensitive display 208, using the crown allows for the graphical output of the display to be visible (without being blocked by the finger that is providing the touch input).



FIG. 3 is a partial cross section of an electronic device 300, corresponding to a view along line A-A in FIG. 1B. The device 300 may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100 (or any other wearable device described herein). Accordingly, details of the wearable device 100 described above may apply to the device 300, and for brevity will not be repeated here.


The device 300 includes a crown 312 positioned along a side of a housing 302. The crown 312 may include a head portion or head 313 and a shaft portion or shaft 315. The head 313 and the shaft 315 may be a single monolithic component, as shown, or they may be separate components joined together (e.g., via adhesive, a threaded interface, fasteners, clips, rivets, fusion bonding, or the like). Where the head 313 and shaft 315 are separate components, index matching fluid or material may be used to occupy voids or gaps between the head 313 and shaft 315 (or between any other components of a crown 312).


The head 313 and the shaft 315 may be formed of or include a light-transmissive material, which may define an imaging surface 316. For example, the head 313 and/or the shaft 315 may be formed from a light-transmissive solid such as an acrylic, glass, transparent ceramic, sapphire, polycarbonate, quartz, or another suitable material. The solid light-transmissive material may be optically transparent (e.g., clear and uncolored), or it may be transparent to some wavelengths of light and opaque (or substantially opaque) to others. For example, if the device uses infrared imaging to capture an image of the user's finger, the light-transmissive areas may be transparent to infrared wavelengths while being opaque to visible light. In some cases, the light-transmissive material of the crown 312 may be coated or otherwise treated so that it is visually opaque. For example, a semi-transparent mirror coating may be applied to an otherwise transparent or light-transmissive material. This may allow the crown 312 to appear, to the wearer, to be opaque and/or metallic, while still allowing light reflected by the portion of the user's finger 301 that is in contact with the crown 312 to enter the crown 312 through the imaging surface 316. As another example, a visibly opaque but infrared-transparent paint, film, ink, or other coating or material may be applied to the crown 312 (and in particular the imaging surface 316) to provide a visually opaque appearance while still facilitating optical imaging of the user's finger 301. As used herein, light-transmissive may be used to refer to something that is transparent or otherwise allows light and/or images to propagate therethrough. In some cases, transparent or light-transmissive materials or components may introduce some diffusion, lensing effects, filtering effects (e.g., color filtering), distortions, attenuation, or the like (e.g., due to surface textures) while still allowing objects or images to be seen or transmitted through the materials or components, and such deviations are understood to be within the scope of the meaning of transparent or light-transmissive. Moreover, components such as a crown, shaft, or head that are formed of or include light-transmissive materials and that function as described herein may be referred to as, for example, a light-transmissive crown, a light-transmissive shaft, and a light-transmissive head.


The device 300 also includes an image-sensing element 324 within the housing 302 and positioned adjacent an end of the shaft 315. An image of the user's finger 301 may be directed onto the image-sensing element 324 so that the image can be captured and analyzed to determine input parameters of an input gesture applied to the crown 312. The image-sensing element 324 may be positioned on a support 322 as shown in FIG. 3, though other mounting and/or support structures are also contemplated.


The image-sensing element 324 is one example of an optical sensing element that may be used. In particular, the image-sensing element 324 may be an optical sensing element with multiple pixels or other sensing regions that capture individual portions of an image, and allow the image sensor (or other optical sensor) to detect and/or store photographs, videos, and the like. In other cases, however, one or more other optical sensing elements may be used, such as photodiodes, single-pixel sensors, photovoltaic cells, or the like. In such cases, the optical sensing element(s) may detect changes in light caused by the presence and/or motion of an object relative to the imaging surface 316 of the crown 312, and may cause the device 300 to take actions based on the detected inputs (including but not limited to controlling any user interface animations or other device functions described herein).


As noted above, the imaging surface 316 is not directly in line with the image-sensing element 324. Accordingly, the crown 312 may include a light-directing feature 326 that directs light corresponding to the image of the user's finger from the imaging surface onto the image-sensing element 324. As shown in FIG. 3, the light-directing feature 326 includes an angled surface of the light-transmissive material of the crown 312. The angled surface defines an interface between materials having different optical indices (e.g., the light-transmissive material and a different material, such as air). When the light corresponding to an image of the user's finger 301 is incident on the interface defined by the angled surface (as represented by arrow 319), all or part of the light may be reflected at an angle (e.g., a 90 degree angle) towards the image-sensing element 324. For example, the angled surface may direct the light (e.g., by causing the light to be reflected) through the shaft 315 of the crown, which may also be light-transmissive. Thus, the shaft 315 guides the reflected light towards the image-sensing element 324 (as indicated by arrow 321). In cases where the light-directing feature 326 reflects light towards the image-sensing element 324, it may be referred to as a reflective feature.


The particular angle of the angled surface may be selected based on various factors, such as the optical properties of the light-transmissive material, the position of the image-sensing element 324 relative to the imaging surface 316, the degree of reflectance desired, and the like. In some cases, the angled surface has an angle configured to produce total internal reflection of the light. For example, the angled surface may be configured so that the angle of incidence 327 of the light from the object on the imaging surface 316 is equal to or greater than the critical angle for the crown/air interface defined by the angled surface. This configuration may produce total internal reflection of the incident light. More particularly, all of the light reflected by the user's finger 301 towards the light-directing feature 326 may be reflected by the crown/air interface defined by the angled surface.


The angle and/or shape of the light-directing feature 326 that results in total internal reflection may depend, for example, on the geometry and material(s) of the crown 312, the relative angles of the imaging surface 316 and the axis of the shaft 315, and the material that is in contact with the angled surface (air, as shown in FIG. 3). In the case where the crown 312, or the optically operative portions of the crown 312, is acrylic, the critical angle may be about 41 degrees. Accordingly, the light-directing feature 326 may be configured so that the angle of incidence 327 is equal to or greater than about 41 degrees (e.g., about 41 degrees, about 45 degrees, about 50 degrees). In some cases, the angle of incidence on the light-directing feature 326 may be less than the critical angle, in which case some of the light that forms the image may be transmitted through the angled surface, rather than being reflected along the shaft 315. This may be acceptable, as it may not be necessary to achieve total internal reflection. In particular, the light that is reflected by the light-directing feature 326 and incident on the image-sensing element 324 (even though it is less than all of the light being reflected by the user's finger) may be sufficient to allow the image sensor to analyze the image and ultimately detect the input.
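
For reference, the critical angle quoted above follows from Snell's law applied at the crown/air interface; as a quick check, taking n ≈ 1.50 for acrylic and n ≈ 1.00 for air (illustrative values, not stated in the patent):

\[
n_1 \sin\theta_c = n_2 \sin 90^\circ
\quad\Longrightarrow\quad
\theta_c = \arcsin\!\left(\frac{n_2}{n_1}\right)
= \arcsin\!\left(\frac{1.00}{1.50}\right) \approx 41.8^\circ,
\]

which is consistent with the "about 41 degrees" figure for an acrylic crown.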


In some cases, the exterior surface of the light-directing feature 326 includes a coating, film, or other material or treatment that increases the reflectivity of the light-directing feature. For example, a mirror coating (or other reflective material) may be applied to the exterior surface of the light-directing feature 326. In such cases, it may not be necessary for the angle of incidence to be at or greater than a critical angle, as the mirror coating may ensure that all or substantially all of the light forming the image is reflected along the shaft or otherwise towards an image-sensing element.


As described herein, the imaging surface 316 of the crown 312 may extend around the entire periphery of the head 313. In such cases, the light-directing feature 326 may have a generally circular configuration so that the light-directing feature 326 directs the light corresponding to the image of the user's finger (or other object or implement) towards the image-sensing element 324 no matter where the finger contacts the periphery of the head 313. For example, the light-directing feature 326 may resemble a conical recess in the end of the head 313. Arrows 323 and 325 in FIG. 3 show an example light path for an object in contact with a bottom portion of the head 313. Due to the radial symmetry of the head 313 and the light-directing feature 326, these arrows may be generally representative of a light path for an object in contact with any portion of the peripheral surface of the head 313. In cases where the imaging surface 316 does not extend around the entire periphery of the head 313, the light-directing feature 326 need not be circular, though it may still have a circular configuration.


Where the imaging surface 316 extends around the entire periphery of the head 313, an image of a user's wrist (which may be proximate to the bottom portion of the peripheral surface of the head 313) may be directed onto the image-sensing element 324 when the device is being worn. To avoid the image of the user's wrist causing false inputs to the device, the image sensor may ignore images received via the bottom surface of the head 313 under certain conditions. For example, if the only received or detected image is received via the bottom surface of the head 313, the device 300 may ignore the image, as users are unlikely to turn the crown 312 by touching only the bottom surface (e.g., in the small area between the crown and the wearer's wrist). As another example, skin patterns or characteristics may differ between a wrist and a finger, and the device 300 may ignore any image that does not contain characteristics of the skin of a finger.
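
A minimal sketch of the first rejection heuristic, assuming (per the FIG. 4 discussion) that the bottom band of the sensor corresponds to the wrist-facing portion of the head; the band size and variance floor are hypothetical:

```python
import numpy as np

VARIANCE_FLOOR = 50.0  # hypothetical "something is imaged here" threshold

def should_process(image, wrist_band_fraction=0.2):
    """Implement the heuristic above: if the only detectable content lies
    in the bottom band of the sensor (the region that faces the wearer's
    wrist), treat the frame as a likely false input and ignore it."""
    split = int(image.shape[0] * (1 - wrist_band_fraction))
    upper_active = image[:split, :].var() > VARIANCE_FLOOR
    # Process only when something other than the wrist region is imaged.
    return upper_active
```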


In order to illuminate the finger 301 while it is in contact with the imaging surface 316, the device 300 may include a light source 330 within the housing 302. The light source 330 may direct light through the shaft 315 towards the light-directing feature 326, which in turn directs the light towards the imaging surface 316, thereby illuminating the user's finger and allowing the image-sensing element 324 to capture an image of the user's finger 301. The light source 330 may be configured to emit light of any suitable wavelength(s), based on the particular type of light and/or electromagnetic radiation that the image-sensing element 324 is configured to detect. For example, the light source 330 may emit infrared radiation. The light source 330 (or any other suitable light source) may be integrated with any of the crowns or devices described herein.


In some cases, the device 300 uses dark field illumination components and techniques for illuminating and imaging an object that is in contact with the imaging surface 316. Such components may be integrated with the crown 312 or otherwise positioned within the crown 312 or the housing 302 as appropriate to provide suitable illumination and optical detection via the crown 312.


As noted above, the crown 312 may be rotationally constrained or rotationally free, and may also be axially fixed, or it may be translatable along its axis to accept axial inputs. The crown 312 may also be coupled to or otherwise integrated with the housing 302 in any suitable way. For example, the shaft 315 may extend into the housing 302 through an opening 317. A sealing member 320, such as an elastomeric member or other material or component(s), may form a seal between the shaft 315 and the housing 302 to prevent ingress of liquids, debris, or other contaminants. The sealing member 320 may seal the opening 317 while also allowing the crown 312 to move relative to the housing 302, if the crown 312 is configured to rotate and/or translate. In cases where the shaft 315 is rotationally constrained (e.g., rotationally fixed or partially rotatable), it may still be able to translate axially. As such, the sealing member 320 may seal the opening while allowing the shaft 315 to move axially within the opening 317. In other cases, the shaft 315 may be fixed to the housing 302, such as with adhesive, welds, fusion bonds, or the like. In such cases the sealing member 320 may be omitted.


As described with respect to FIG. 3, the crown 312 may be configured to redirect the light corresponding to an image of a user's finger from an imaging surface onto an image-sensing element that is offset or otherwise not in line with the imaging surface. FIGS. 4A-4D illustrate how the image of the wearer's finger may appear on an image-sensing element and how motion of the user's finger along an imaging surface may be detected by an image sensor.



FIG. 4A shows an example electronic device 400 receiving an input from a finger 401. The device 400 may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100 (or any other wearable device described herein). Accordingly, details of the wearable device 100 described above may apply to the device 400, and for brevity will not be repeated here. The device 400 may include a crown 412 (which may be an embodiment of the crown 312 or any other crown described herein). As shown in FIG. 4A, which is a side view of the device 400, a portion 419 of the finger 401 is in contact with an imaging surface 416 of the crown 412. FIG. 4B shows an example image-sensing element 424 (which may be an embodiment of the image-sensing element 324, FIG. 3) with an image 426 of the portion 419 of the finger 401. The image 426 (or light that corresponds to the image) may be directed onto the image-sensing element 424 by the crown 412 along a path as indicated by the arrows 319, 321 in FIG. 3. As shown in FIGS. 4B and 4D (and elsewhere in the figures), the image-sensing element 424 is a square, though any other suitable shape may be used, including circular, rectangular, oval, or the like.


In the configuration shown in FIGS. 4A-4D, the crown 412 may have an imaging surface that extends around the entire periphery of the crown 412. Accordingly, the image 426 is shown positioned along a top portion of the image-sensing element 424, resulting from the fact that the user's finger 401 is in contact with a top-facing surface of the crown 412. Were the user's finger 401 to be in contact with a bottom surface of the crown 412, for example, the image would be incident on a bottom portion of the image-sensing element 424.



FIGS. 4C and 4D show the electronic device 400 and image-sensing element 424, respectively, after the finger 401 has moved forward (e.g., in the direction of arrow 417). Notably, the portion 421 of the user's finger 401 that is in contact with the imaging surface 416 has changed, though the area of the image-sensing element 424 that receives the image 426 has not changed. However, because the user's finger 401 has moved, the features of the user's finger that are visible to the image-sensing element 424 (e.g., the ridges or other features of the skin of the finger 401) have moved. For example, feature 430, which may correspond to a ridge of a fingerprint, a texture of a fabric material of a glove, or the like, has moved along the image-sensing element 424. A processor or other component of an image sensor may analyze the image 426 and use features such as the feature 430 to determine a speed and/or direction of motion of the user's finger, and the device 400 may use that information to change or control an operation of the device 400 (e.g., to move or scroll a graphical output on a display).
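
The following sketch illustrates one way such feature motion could be measured, using exhaustive block matching between consecutive grayscale frames; this is a simplification, and the patent does not specify a particular algorithm:

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=5):
    """Estimate the (dy, dx) displacement of skin features between two
    grayscale frames by trying every small shift and keeping the one with
    the lowest mean absolute difference (block matching)."""
    prev = prev.astype(np.int32)
    curr = curr.astype(np.int32)
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(curr, -dy, axis=0), -dx, axis=1)
            err = np.abs(shifted - prev).mean()
            if err < best_err:
                best, best_err = (dy, dx), err
    return best  # multiply by the frame rate for a velocity estimate

# Demo: a frame and a copy shifted 2 px right and 1 px down.
rng = np.random.default_rng(1)
prev = rng.integers(0, 256, size=(48, 48), dtype=np.uint8)
curr = np.roll(np.roll(prev, 1, axis=0), 2, axis=1)
print(estimate_shift(prev, curr))  # -> (1, 2)
```

The sign of the estimated displacement gives the direction of the gesture, and its magnitude over time gives the speed used to drive the graphical output.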


While FIGS. 4A-4D show a bare finger and a fingerprint, any other object or implement may be used in place of a bare finger. For example, if a gloved finger is used to provide a gesture input to the crown 412, the image sensor may use a feature of the material of the glove to determine a speed and/or direction of motion. Indeed, the image sensor may determine motion of any suitable object by analyzing a texture, surface irregularity, or some other optically detectable feature of the object.


A crown, and more particularly, a head of a crown, may also be shaped or configured to act as a lens to aid in the image sensing functionality of a device. FIGS. 5A-5D illustrate how a lensing feature on a crown may enable a device to reject or otherwise ignore images that do not correspond to a finger in contact with the crown. For example, a lensing feature may be configured so that only objects in direct contact with the crown are in sharp focus.



FIG. 5A shows an example electronic device 500 receiving an input from a finger 501. The device 500 may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100 (or any other wearable device described herein). Accordingly, details of the wearable device 100 described above may apply to the device 500, and for brevity will not be repeated here. The device 500 may include a crown 512 (which may be an embodiment of the crown 312 or any other crown described herein). As shown in FIG. 5A, which is a side view of the device 500, a portion of the finger 501 is in contact with an imaging surface 516 of the crown 512. FIG. 5B shows an example image-sensing element 524 (which may be an embodiment of the image-sensing element 324, FIG. 3) with an image 526 of the finger 501. Under these conditions, the image 526 is in focus so that the features of the user's finger 501 are clearly defined and the image sensor can easily analyze the motion (if any) of the finger 501.



FIG. 5C shows the electronic device 500 with the finger 501 elevated a distance off of the imaging surface 516 (e.g., moved in a direction indicated by the arrow 517). The crown 512 may have a shape or feature that acts as a lens so that objects that are not in direct contact with (or within a threshold distance of) the crown 512 are out of focus. The threshold distance may be about 0.1 mm, 0.5 mm, 1.0 mm, 2.0 mm, or any other suitable distance. FIG. 5D shows a representation of an out-of-focus image 526. The image sensor may be able to ignore the motion of images that are not in focus, thereby preventing images from the surrounding environment from triggering inputs to the device. In some cases, instead of or in addition to a lensing feature, the device 500 may use a contact sensor to determine when an object is in contact with the crown 512, and ignore image motion when there is nothing contacting the crown 512.
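
One common focus measure that could implement this rejection is the variance of a discrete Laplacian; the sketch below (with a hypothetical threshold) flags only sharply focused frames as indicating contact:

```python
import numpy as np

def sharpness(image):
    """Variance of a discrete Laplacian: a standard focus measure.
    Low values indicate an out-of-focus image. Edges wrap for brevity,
    which is acceptable in a sketch."""
    img = image.astype(np.float64)
    lap = (-4 * img
           + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
           + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1))
    return lap.var()

FOCUS_THRESHOLD = 100.0  # hypothetical; tuned per sensor and optics

def in_contact(image):
    """Treat only sharply focused frames as an object touching the crown."""
    return sharpness(image) > FOCUS_THRESHOLD
```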


As noted above, in some cases an optical crown as described herein may be configured to detect axial inputs, such as a tap or press on an end surface of the crown. FIG. 6A illustrates an example electronic device 600 with an optical crown that incorporates both optical gesture sensing (as described above), as well as axial touch sensing.


The device 600 may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100 (or any other wearable device described herein). Accordingly, details of the wearable device 100 described above may apply to the device 600, and for brevity will not be repeated here. The device 600 may include a crown 612 (which may be an embodiment of the crown 312 or any other crown described herein). The crown 612 may include a head 613 and a shaft 615. The crown 612 may include a light-directing feature 626, which operates substantially the same as the light-directing feature 326 of FIG. 3, and directs an image from an imaging surface 616 towards an image-sensing element 624. The device may include a sealing member 620 between the shaft 615 and a housing 602, and the image-sensing element 624 may be positioned on a support 622.


The crown 612 also includes an axial imaging surface 628. The axial imaging surface 628 may be configured to allow light corresponding to an image of an object in contact with an end surface of the crown 612 to pass through the shaft 615 and onto the image-sensing element 624. The image sensor may be configured to determine, based on the image received on the image-sensing element 624, whether a user's finger (or other object) is in contact with the end surface of the head 613. In some cases, the axial imaging surface 628 may have a curvature or lensing feature that operates similarly to that described with respect to FIGS. 5A-5D, so that only a finger or object that is actually in contact with the end surface of the head 613 causes the image sensor to positively identify an axial touch input.



FIG. 6B shows a representation of the image-sensing element 624, showing example images 632, 634 corresponding to a gesture applied to a peripheral surface of the crown (image 632), and an axial touch input (image 634). As shown, the image 634 is positioned on a center of the image-sensing element 624. Accordingly, the image sensor may be configured to analyze the center of the image-sensing element 624 to determine when an axial touch input has been applied (e.g., by detecting an in-focus image of a user's finger), and to analyze the peripheral portion of the image-sensing element 624 to detect a gesture input applied to the peripheral surface of the crown 612. The device 600 may take any suitable action in response to detecting an axial touch input, such as turning on or off the device (or a display of the device), selecting a user interface element being displayed on the display, activating or deactivating a device function, or the like.
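
A sketch of this region-based analysis, partitioning the sensor array into a center window (fed by the axial imaging surface) and a peripheral ring (fed by the peripheral imaging surface); the region fractions and activity floor are hypothetical:

```python
import numpy as np

def classify_touch(image, center_fraction=0.4, activity_floor=50.0):
    """Report which sensor region shows content: the center window
    (axial touch input) and/or the peripheral ring (gesture input)."""
    h, w = image.shape
    ch, cw = int(h * center_fraction), int(w * center_fraction)
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    mask = np.zeros((h, w), dtype=bool)
    mask[y0:y0 + ch, x0:x0 + cw] = True
    axial = image[mask].var() > activity_floor
    gesture = image[~mask].var() > activity_floor
    return {"axial_touch": axial, "gesture": gesture}
```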


In some cases, a light-directing feature may define a curved surface that not only directs light from an imaging surface onto an image-sensing element, but also adjusts, focuses, or otherwise modifies the image. FIG. 7 illustrates an example electronic device 700 with an optical crown having a light-directing feature with a curved surface to both redirect the light and modify the image of the user's finger.


The device 700 may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100 (or any other wearable device described herein). Accordingly, details of the wearable device 100 described above may apply to the device 700, and for brevity will not be repeated here. The device 700 may include a crown 712 (which may be an embodiment of the crown 312 or any other crown described herein). The crown 712 may include a head 713 and a shaft 715. The crown 712 may include a light-directing feature 726 that directs light from an imaging surface 716 (e.g., reflected by a user's finger) to an image-sensing element 724, as described above. The device 700 may include a sealing member 720 between the shaft 715 and a housing 702, and the image-sensing element 724 may be positioned on a support 722.


The light-directing feature 726 includes a curved surface 728. The curved surface 728 may be configured so that an angle of incidence of the light from an object (e.g., a user's finger) is equal to or greater than a critical angle of the interface defined by the curved surface 728, thus producing total internal reflection of the light. Further, the curved surface 728 may be configured to magnify the image so that the image occupies a greater area of the image-sensing element 724. FIG. 7 illustrates the magnification of the image by showing the rays 730, 732, 734, and 736 diverging through the shaft 715.
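
For reference, the total-internal-reflection condition follows from Snell's law. Assuming, for illustration, a light-transmissive member with refractive index n1 of about 1.5 (typical of glass or many plastics) against air (n2 of about 1.0):

    % Snell's law: n_1 \sin\theta_1 = n_2 \sin\theta_2.
    % Total internal reflection occurs when the angle of incidence
    % \theta_1 meets or exceeds the critical angle \theta_c, at which
    % the refracted ray would graze the interface (\theta_2 = 90^\circ):
    \theta_c = \arcsin\left(\frac{n_2}{n_1}\right)
             \approx \arcsin\left(\frac{1.0}{1.5}\right)
             \approx 41.8^\circ

Under these assumed indices, the curved surface 728 would be shaped so that rays arriving from the imaging surface 716 meet it at roughly 42 degrees or more from the surface normal.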


As shown, the curved surface 728 is concave (e.g., as viewed from the exterior of the crown 712), though the curved surface 728 may have other configurations and/or shapes, such as convex shapes, aspherical shapes, combination concave and convex shapes (e.g., having concave and convex regions), or the like. In some cases, the surface is not only curved, but also includes flat regions, stepped regions, or other complex geometries.


The crown 712 shown in FIG. 7 has an imaging surface along the entire periphery of the head 713, and as such the curved surface 728 may be configured so that a magnified image does not extend past a midpoint of the image-sensing element 724. In examples where the imaging surface extends along less than the entire periphery, the image may be magnified to a greater extent. Thus, for example, a finger applied to the top portion of the head 713 may result in an image that occupies the entire image-sensing element 724 (rather than less than half of the image-sensing element, which may occur with a full-periphery imaging surface).


The crowns shown in FIGS. 3-7 each have a shaft extending through an opening in a housing. In some cases, the shaft may be omitted, or the crown may otherwise have a different configuration and/or integration with the housing. FIG. 8 illustrates an example electronic device 800 with an optical crown 812 without a shaft. This configuration may have advantages such as easier or faster manufacturing, better environmental sealing, or the like.


The device 800 may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100 (or any other wearable device described herein). Accordingly, details of the wearable device 100 described above may apply to the device 800, and for brevity will not be repeated here. The device 800 may include a crown 812 (which may be an embodiment of the crown 312 or any other crown described herein). The crown 812 may include a head 813. The crown 812 may include a light-directing feature 826 that directs light from an imaging surface 816 to an image-sensing element 824, as described above.


Instead of an internally mounted image-sensing element, the crown 812 includes an image-sensing element 824 mounted on or otherwise incorporated with the head 813. For example, the image-sensing element 824 may be adhered to, encapsulated within, or mechanically fastened to the head 813. In some cases, the image-sensing element 824 may be set into a recess in the head 813. The light-directing feature 826 may still be configured to direct light from the imaging surface 816 onto the image-sensing element 824, though the image will not need to pass through a shaft.


Conductors 821 (e.g., wires, a flex connector, a conductive trace, or another conductive component) may be coupled to the image-sensing element 824 and may extend through an opening 817 in the housing 802, where they connect to a processor 820 or other component of an image sensor that analyzes the images received and/or detected by the image-sensing element 824.


In the device 800, both the head 813 and the image-sensing element 824 are positioned outside of the interior volume of the device housing 802. This may simplify manufacture of the device 800. For example, because there is no shaft, the need for precise optical alignment between the shaft and the image-sensing element may be eliminated. Moreover, the mating of the head 813 to the housing 802, as well as the smaller opening 817 that need only accommodate the conductors 821 (rather than a crown shaft), may allow for better sealing against liquid or other debris.


The head 813 may be attached to the housing 802 using adhesive, welding, mechanical fasteners, or the like. Where the head 813 is configured to rotate and/or translate relative to the housing 802, the head 813 may be attached to the housing 802 with bearings, bushings, guides, or other suitable components.



FIG. 9 illustrates an example electronic device 900 with an optical crown 912 with a cover member over an end surface of the crown. The device 900 may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100 (or any other wearable device described herein). Accordingly, details of the wearable device 100 described above may apply to the device 900, and for brevity will not be repeated here. The device 900 may include a crown 912 (which may be an embodiment of the crown 312 or any other crown described herein). The crown 912 may include a head 913 and a shaft 915. The crown 912 may include a light-directing feature 926 that directs light from an imaging surface 916 to an image-sensing element 924, as described above. The device 900 may include a sealing member 920 between the shaft 915 and a housing 902, and the image-sensing element 924 may be positioned on a support 922.


In FIG. 9, the crown 912 also includes a cover member 928 attached to the end of the head 913. The cover member 928, which may also resemble and/or be referred to as a cap, may cover and protect the light-directing feature 926, and may provide a cosmetic exterior surface to the crown 912. The cover member 928 may be formed of any suitable material, such as plastic, sapphire, metal, gemstones, ceramic, or the like. In cases where the crown 912 includes an axial touch sensing system, as described above, the cover member 928 or a portion thereof may be light-transmissive to allow the axial touch sensing system to operate. The cover member 928 may be attached to the crown 912 in any suitable manner, including welding, adhesives, mechanical fasteners and/or interlocks, soldering, brazing, or the like. The cover member 928, and more particularly a peripheral surface of the cover member 928, may be substantially flush with the cylindrical peripheral surface of the head 913.



FIG. 10 illustrates an example electronic device 1000 with an optical crown 1012. The device 1000 may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100 (or any other wearable device described herein). Accordingly, details of the wearable device 100 described above may apply to the device 1000, and for brevity will not be repeated here. The device 1000 may include a crown 1012 (which may be an embodiment of the crown 312 or any other crown described herein). The crown 1012 may include a head 1013 and a shaft 1015. The crown 1012 may include a light-directing feature 1026 that directs light from an imaging surface 1016 to an image-sensing element 1024, as described above. The device 1000 may include a sealing member 1020 between the shaft 1015 and a housing 1002, and the image-sensing element 1024 may be positioned on a support 1022.


The device 1000 also includes a force sensing component 1028. The force sensing component 1028 may be configured to detect axial and/or translational inputs applied to the crown 1012, as described above. As shown, the force sensing component 1028 is a dome switch, which may provide both an input detection and a tactile output function. For example, when an axial force exceeding a collapse threshold of the dome switch is applied to the crown 1012, the dome switch may abruptly collapse, which both closes an electrical contact (thereby allowing the device to register the input), and produces a tactile “click” or other tactile output that may be felt by the user. In other cases, the force sensing component 1028 may be a force sensor that is configured to produce an electrical response that corresponds to an amount of force (e.g., axial force) applied to the crown 1012. The electrical response may increase continuously as the amount of applied force increases, and as such may provide non-binary force sensing.
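
The two sensing behaviors could be modeled as follows, with a hypothetical collapse threshold and scaling factor (a sketch of the binary-versus-continuous distinction, not actual device firmware):

    COLLAPSE_THRESHOLD_N = 2.5  # hypothetical dome-collapse force, in newtons

    def dome_switch_state(applied_force_n: float) -> bool:
        # A dome switch is binary: it registers an input only once the
        # applied force exceeds its collapse threshold, at which point
        # the abrupt collapse also produces the tactile "click".
        return applied_force_n >= COLLAPSE_THRESHOLD_N

    def force_sensor_response(applied_force_n: float, sensitivity: float = 0.1) -> float:
        # A continuous force sensor instead produces an electrical
        # response (modeled here as linear, e.g., volts per newton)
        # that increases with the applied force, enabling non-binary
        # force sensing.
        return applied_force_n * sensitivity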


As described above, an imaging surface may be defined along an entire peripheral surface of a crown, or along only a portion of a peripheral surface. FIGS. 11A and 11B illustrate example crowns in which the imaging surface extends along less than the entire peripheral surface. FIG. 11A, for example, shows an example device 1100 (which may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100) with a crown 1102. An imaging surface 1104 may extend along less than the entire peripheral surface of the crown 1102. As shown, the portion of the crown 1102 that faces or is nearest a user's wrist when the device is being worn may not be part of the imaging surface 1104. This may help prevent false inputs from being detected due to movement of the device 1100 relative to the user's wrist (e.g., as may occur during jostling or other normal use conditions).



FIG. 11B shows an example device 1110 (which may be an embodiment of the device 100, and may include the same or similar components and may provide the same or similar functions as the device 100) with a crown 1112. The crown 1112 defines several imaging surfaces, which may also be described as an imaging surface having several discrete segments. For example, an imaging surface 1114 may extend along a top portion of the crown 1112, while imaging surfaces 1116 and 1118 extend along side portions of the crown 1112. The crown 1112 may not capture images in the areas between adjacent imaging surfaces.


In FIGS. 11A-11B, the imaging surfaces may be defined by light-transmissive windows set into an otherwise opaque head. For example, windows of glass, plastic, ceramic, sapphire, or other materials may be inset into a head of metal, plastic, or other opaque material.



FIG. 12 depicts an example schematic diagram of an electronic device 1200. By way of example, the device 1200 of FIG. 12 may correspond to the wearable electronic device 100 shown in FIGS. 1A-1B (or any other wearable electronic device described herein). To the extent that multiple functionalities, operations, and structures are disclosed as being part of, incorporated into, or performed by the device 1200, it should be understood that various embodiments may omit any or all such described functionalities, operations, and structures. Thus, different embodiments of the device 1200 may have some, none, or all of the various capabilities, apparatuses, physical features, modes, and operating parameters discussed herein.


As shown in FIG. 12, a device 1200 includes a processing unit 1202 operatively connected to computer memory 1204 and/or computer-readable media 1206. The processing unit 1202 may be operatively connected to the memory 1204 and computer-readable media 1206 components via an electronic bus or bridge. The processing unit 1202 may include one or more computer processors or microcontrollers that are configured to perform operations in response to computer-readable instructions. The processing unit 1202 may include the central processing unit (CPU) of the device. Additionally or alternatively, the processing unit 1202 may include other processors within the device, including application-specific integrated circuits (ASICs) and other microcontroller devices.


The memory 1204 may include a variety of types of non-transitory computer-readable storage media, including, for example, random-access memory (RAM), read-only memory (ROM), erasable programmable memory (e.g., EPROM and EEPROM), or flash memory. The memory 1204 is configured to store computer-readable instructions, sensor values, and other persistent software elements. Computer-readable media 1206 also includes a variety of types of non-transitory computer-readable storage media, including, for example, a hard-drive storage device, a solid-state storage device, a portable magnetic storage device, or other similar device. The computer-readable media 1206 may also be configured to store computer-readable instructions, sensor values, and other persistent software elements.


In this example, the processing unit 1202 is operable to read computer-readable instructions stored on the memory 1204 and/or computer-readable media 1206. The computer-readable instructions may adapt the processing unit 1202 to perform the operations or functions described above with respect to FIGS. 1A-11B. In particular, the processing unit 1202, the memory 1204, and/or the computer-readable media 1206 may be configured to cooperate with a sensor 1224 (e.g., an image sensor that detects input gestures applied to an imaging surface of a crown) to control the operation of a device in response to an input applied to a crown of a device (e.g., the crown 112). The computer-readable instructions may be provided as a computer-program product, software application, or the like.


As shown in FIG. 12, the device 1200 also includes a display 1208. The display 1208 may include a liquid-crystal display (LCD), organic light emitting diode (OLED) display, light emitting diode (LED) display, or the like. If the display 1208 is an LCD, the display 1208 may also include a backlight component that can be controlled to provide variable levels of display brightness. If the display 1208 is an OLED or LED type display, the brightness of the display 1208 may be controlled by modifying the electrical signals that are provided to display elements. The display 1208 may correspond to any of the displays shown or described herein.


The device 1200 may also include a battery 1209 that is configured to provide electrical power to the components of the device 1200. The battery 1209 may include one or more power storage cells that are linked together to provide an internal supply of electrical power. The battery 1209 may be operatively coupled to power management circuitry that is configured to provide appropriate voltage and power levels for individual components or groups of components within the device 1200. The battery 1209, via power management circuitry, may be configured to receive power from an external source, such as an AC power outlet. The battery 1209 may store received power so that the device 1200 may operate without connection to an external power source for an extended period of time, which may range from several hours to several days.


In some embodiments, the device 1200 includes one or more input devices 1210. An input device 1210 is a device that is configured to receive user input. The one or more input devices 1210 may include, for example, a push button, a touch-activated button, a keyboard, a key pad, or the like (including any combination of these or other components). In some embodiments, the input device 1210 may provide a dedicated or primary function, including, for example, a power button, volume buttons, home buttons, scroll wheels, and camera buttons. Generally, a touch sensor or a force sensor may also be classified as an input device. However, for purposes of this illustrative example, the touch sensor 1220 and a force sensor 1222 are depicted as distinct components within the device 1200.


The device 1200 may also include a sensor 1224 that detects inputs provided by a user to a crown of the device (e.g., the crown 112). As described above, the sensor 1224 may include sensing circuitry and other sensing elements that facilitate sensing of gesture inputs applied to an imaging surface of a crown, as well as other types of inputs applied to the crown (e.g., rotational inputs, translational or axial inputs, axial touches, or the like). The sensor 1224 may include an optical sensing element, such as a charge-coupled device (CCD), complementary metal-oxide-semiconductor (CMOS), or the like. The sensor 1224 may correspond to any sensors described herein or that may be used to provide the sensing functions described herein.
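
As a rough illustration of how successive frames from such an optical sensing element could be translated into a gesture input, the sketch below estimates frame-to-frame displacement by phase correlation; all names, values, and the scroll mapping are hypothetical:

    import numpy as np

    def estimate_shift(prev: np.ndarray, curr: np.ndarray):
        # Phase correlation: the peak of the inverse FFT of the
        # normalized cross-power spectrum locates the translation
        # between two frames of the finger's surface texture.
        cross = np.fft.fft2(prev) * np.conj(np.fft.fft2(curr))
        cross /= np.abs(cross) + 1e-9
        corr = np.fft.ifft2(cross).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Map wrap-around peak indices to signed shifts.
        h, w = prev.shape
        dy = dy - h if dy > h // 2 else dy
        dx = dx - w if dx > w // 2 else dx
        return dy, dx

    def frames_to_scroll(prev: np.ndarray, curr: np.ndarray, gain: float = 1.0) -> float:
        # Convert image motion along one axis into a scroll amount
        # that the device could apply to displayed content.
        dy, _ = estimate_shift(prev, curr)
        return gain * dy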


The device 1200 may also include a touch sensor 1220 that is configured to determine a location of a touch on a touch-sensitive surface of the device 1200 (e.g., an input surface defined by the portion of a cover 108 over a display 109). The touch sensor 1220 may use or include capacitive sensors, resistive sensors, surface acoustic wave sensors, piezoelectric sensors, strain gauges, or the like. In some cases, the touch sensor 1220 associated with a touch-sensitive surface of the device 1200 may include a capacitive array of electrodes or nodes that operate in accordance with a mutual-capacitance or self-capacitance scheme. The touch sensor 1220 may be integrated with one or more layers of a display stack (e.g., the display 109) to provide the touch-sensing functionality of a touchscreen. Moreover, the touch sensor 1220, or a portion thereof, may be used to sense motion of a user's finger as it slides along a surface of a crown, as described herein.


The device 1200 may also include a force sensor 1222 that is configured to receive and/or detect force inputs applied to a user input surface of the device 1200 (e.g., the display 109). The force sensor 1222 may use or include capacitive sensors, resistive sensors, surface acoustic wave sensors, piezoelectric sensors, strain gauges, or the like. In some cases, the force sensor 1222 may include or be coupled to capacitive sensing elements that facilitate the detection of changes in relative positions of the components of the force sensor (e.g., deflections caused by a force input). The force sensor 1222 may be integrated with one or more layers of a display stack (e.g., the display 109) to provide force-sensing functionality of a touchscreen.


The device 1200 may also include a communication port 1228 that is configured to transmit and/or receive signals or electrical communication from an external or separate device. The communication port 1228 may be configured to couple to an external device via a cable, adaptor, or other type of electrical connector. In some embodiments, the communication port 1228 may be used to couple the device 1200 to an accessory, including a dock or case, a stylus or other input device, smart cover, smart stand, keyboard, or other device configured to send and/or receive electrical signals.


As described above, one aspect of the present technology is the gathering and use of data available from various sources to facilitate the detection of inputs to an electronic device. The present disclosure contemplates that in some instances, this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include biometric data (e.g., fingerprints), demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, or any other identifying or personal information.


The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the US, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA); whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.


Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, users can deactivate input detection using imaging techniques that may capture or use biometric information (e.g., fingerprints, skin features, etc.). In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an app that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.


Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.


Therefore, although the present disclosure broadly covers use of personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, gesture inputs applied to a crown with an object other than a finger, or with a gloved finger, may be operable to control the operation of an electronic device. Further, the functionality described herein may be provided without requiring any persistent storage of personal information. For example, images of a user's finger that are captured by an image sensor may be analyzed and then discarded immediately after the necessary motion information has been determined.


The foregoing description, for purposes of explanation, used specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings. Also, when used herein to refer to positions of components, the terms above and below, or their synonyms, do not necessarily refer to an absolute position relative to an external reference, but instead refer to the relative position of components with reference to the figures.

Claims
  • 1. An electronic watch comprising: a housing defining a side surface of the electronic watch; a display positioned at least partially within the housing and configured to display a graphical output; a transparent cover coupled to the housing and defining a front surface of the electronic watch; an optical sensing element; a crown extending from the side surface of the electronic watch and defining a peripheral surface along an exterior portion of the crown, wherein the electronic watch is configured to: detect, using the optical sensing element, movement of an object along the peripheral surface of the crown; and modify the graphical output in accordance with the movement of the object along the peripheral surface of the crown.
  • 2. The electronic watch of claim 1, wherein: the electronic watch further comprises a touch sensor positioned below the transparent cover and configured to detect touch inputs applied to the transparent cover; and the crown comprises a light-directing feature configured to direct, onto the optical sensing element, an image of an object in contact with the peripheral surface of the crown.
  • 3. The electronic watch of claim 2, wherein the light-directing feature comprises an interface between a first material and a second material, the interface configured to reflect incident light.
  • 4. The electronic watch of claim 1, wherein the optical sensing element is positioned within the housing.
  • 5. The electronic watch of claim 1, wherein the optical sensing element is positioned within the crown.
  • 6. The electronic watch of claim 1, wherein the crown comprises: an opaque portion; and a light-transmissive portion defining a portion of the peripheral surface.
  • 7. The electronic watch of claim 6, wherein the optical sensing element receives light through the light-transmissive portion of the crown.
  • 8. A wearable electronic device comprising: a housing; a display positioned at least partially within the housing and configured to display a graphical output; a crown at least partially external to the housing and defining a light-transmissive window; and an optical sensing element within the housing and configured to receive an image of an object in contact with the light-transmissive window, wherein the wearable electronic device is configured to: detect, using the optical sensing element, movement of the object along the light-transmissive window; and modify the graphical output in accordance with the movement of the object along the light-transmissive window.
  • 9. The wearable electronic device of claim 8, wherein the crown comprises a light-directing feature configured to direct, onto the optical sensing element, an image of an object in contact with the light-transmissive window.
  • 10. The wearable electronic device of claim 9, wherein the light-directing feature defines an angled surface configured to direct light from the light-transmissive window to the optical sensing element.
  • 11. The wearable electronic device of claim 10, further comprising a reflective material on the angled surface.
  • 12. The wearable electronic device of claim 8, wherein the crown comprises: an opaque member; and a light-transmissive member positioned in the opaque member and defining the light-transmissive window.
  • 13. The wearable electronic device of claim 8, further comprising a light source at least partially within the housing and configured to illuminate the object.
  • 14. The wearable electronic device of claim 8, wherein the light-transmissive window includes a semi-transparent mirror coating.
  • 15. An electronic watch comprising: a housing; a display positioned at least partially within the housing; an optical sensor at least partially within the housing and comprising an optical sensing element; and a crown comprising: a head portion defining an optical sensing region along a peripheral surface of the head portion; a shaft portion extending at least partially into the housing; and a light-directing feature directing light from the optical sensing region towards the optical sensing element.
  • 16. The electronic watch of claim 15, further comprising a contact sensor configured to detect contact between an object and the crown.
  • 17. The electronic watch of claim 15, wherein the head portion comprises: a metal portion; and a light-transmissive window coupled to the metal portion and at least partially defining the optical sensing region.
  • 18. The electronic watch of claim 15, wherein the crown comprises: a transparent member defining the head portion and the shaft portion; and an opaque coating positioned on the transparent member.
  • 19. The electronic watch of claim 18, wherein the optical sensing region is defined by an opening in the opaque coating.
  • 20. The electronic watch of claim 18, wherein the optical sensor includes an image-sensing element.
CROSS-REFERENCE TO RELATED APPLICATION(S)

This application is a continuation patent application of U.S. patent application Ser. No. 16/191,349, filed Nov. 14, 2018, and titled “Crown for an Electronic Watch” and claims the benefit of U.S. Provisional Patent Application No. 62/713,745, filed Aug. 2, 2018 and titled “Crown for an Electronic Watch,” the disclosures of which are hereby incorporated herein by reference in their entirety.

Related Publications (1)
Number Date Country
20230161299 A1 May 2023 US
Provisional Applications (1)
Number Date Country
62713745 Aug 2018 US
Continuations (1)
Number Date Country
Parent 16191349 Nov 2018 US
Child 18094930 US