Personal information technology has rapidly evolved with the introduction of smartphones. Such devices are nearly ubiquitous. It is, however, increasingly challenging to conveniently access and carry smartphones due to expanding sizes and form factors. They can also be distracting to the user and those nearby. Wearable devices with smaller form factors have more recently been used to provide users with activity information, notifications and other functionality in a manner that is more user-friendly and less distracting.
There are different types of wearable devices. One type that is becoming increasingly popular is the smartwatch. In addition to telling time, smartwatches may run various apps and/or perform in a manner similar to a smartphone. Thus, smartwatches can address the smartphone size issue, and may provide relevant information to a user in a more discreet manner than a smartphone.
Hybrid smartwatches incorporate digital technology with an analog timepiece in a wristwatch form factor. It is possible to treat the graphical display of the digital technology and the mechanical hands of the analog display as separate display surfaces. However, aspects of the disclosure employ symbiotic and synchronized use of both display surfaces to provide new types of information to the user and to otherwise enhance existing applications. This is done in a way that leverages the strengths and efficiencies of the analog and digital components, while conserving power and extending battery life.
Aspects of the technology involve a hybrid smartwatch configured to provide mechanical expressivity to a user. The hybrid smartwatch comprises a user interface subsystem, a mechanical movement control subsystem and one or more processors. The user interface subsystem includes a digital graphical display and a mechanical movement having one or more watch hands. The one or more watch hands are arranged along a face of the hybrid smartwatch. The mechanical movement control subsystem is operatively coupled to the one or more watch hands, and is configured to adjust the one or more watch hands in one or both of clockwise and counterclockwise directions. The one or more processors are operatively coupled to the digital graphical display and the mechanical movement control subsystem. The one or more processors are configured to select an expressive visualization to be presented to a user using the one or more watch hands. The expressive visualization provides a predetermined adjustment of one or more of the watch hands. The one or more processors are also configured to determine whether to concurrently present visual information on the digital graphical display along with the adjustment of the one or more watch hands and to instruct the mechanical movement control subsystem to adjust the one or more watch hands according to the selected expressive visualization. Upon a determination to concurrently present the visual information on the digital graphical display, the one or more processors are configured to cause the digital graphical display to present the visual information contemporaneously with the adjustment of the one or more watch hands.
In one example, the one or more processors are configured to select the expressive visualization based on one or more identified items of information to be provided to the user. In another example, the mechanical movement control subsystem includes a plurality of actuators, each actuator configured to rotate a given one of the watch hands. The digital graphical display may comprise a non-emissive display.
In one scenario, the expressive visualization is a buzzing visualization. Here, the mechanical movement control subsystem is configured to adjust the one or more watch hands to provide the buzzing visualization by oscillating one or more of the watch hands at a selected oscillation rate for between two and five repetitions.
In another scenario, the expressive visualization is an anthropomorphic behavior. Here, the mechanical movement control subsystem is configured to adjust the one or more watch hands to provide the anthropomorphic behavior by rotating a pair of the watch hands towards and away from one another by either a same amount a plurality of times or by a different amount a plurality of times.
In a further scenario, the expressive visualization is a facial visualization. Here, the mechanical movement control subsystem is configured to align a first one of the watch hands at approximately 9 o'clock on the watch face and align a second one of the watch hands at approximately 3 o'clock on the watch face, and to provide the facial visualization by simultaneously adjusting the first and second watch hands clockwise and counterclockwise by between 2° and 15°. The one or more processors are configured to cause the digital graphical display to present the visual information along with the adjusting of the first and second watch hands. The visual information includes one or more facial features.
In yet another scenario, the expressive visualization is an information hiding visualization and the visual information is a notification to the user. Here, the mechanical movement control subsystem is configured to adjust the one or more watch hands to provide the information hiding visualization by arranging a first one of the watch hands at a particular location along the watch face, and adjusting a second one of the watch hands to appear to tap down on the notification multiple times by moving towards and away from the first watch hand. In this case, with each tap the notification is reduced in size.
In another scenario, the expressive visualization is an information revealing visualization and the visual information is a notification to the user. Here, the mechanical movement control subsystem is configured to adjust the one or more watch hands to provide the information revealing visualization by arranging a first one of the watch hands at a particular location along the watch face, and adjusting a second one of the watch hands to appear to open up the notification multiple times. In this case, with each adjustment of the second watch hand the notification increases in size.
In a further scenario, the expressive visualization is a physics simulation and the visual information is a selected object. Here, the mechanical movement control subsystem is configured to adjust one or more of the watch hands to provide the physics simulation by adjusting the one or more watch hands in selected directions by between 1° and 180°. In this case, with each adjustment the selected object is either apparently moved by a given one of the watch hands, or a given one of the watch hands is apparently moved by the selected object.
In accordance with other aspects of the disclosure, a method of providing mechanical expressivity to a user with a hybrid smartwatch is provided. The hybrid smartwatch includes a digital graphical display and one or more physical watch hands arranged along a face of the hybrid smartwatch. The method includes selecting, by one or more processors, an expressive visualization to be presented to a user using the one or more watch hands. The expressive visualization provides a predetermined adjustment of one or more of the watch hands. The method also includes determining, by the one or more processors, whether to concurrently present visual information on the digital graphical display along with the adjustment of the one or more watch hands; instructing, by the one or more processors, a mechanical movement control subsystem of the hybrid smartwatch to adjust the one or more watch hands according to the selected expressive visualization; and upon a determination to concurrently present the visual information on the digital graphical display, the one or more processors causing the digital graphical display to present the visual information contemporaneously with the adjustment of the one or more watch hands.
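By way of a non-limiting illustration only, the following sketch outlines the above method in Python. The ExpressiveVisualization structure and the movement_control and display interfaces (with rotate_hand and render methods) are hypothetical placeholders, not any actual implementation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class ExpressiveVisualization:
    name: str
    hand_adjustments: List[Tuple[str, float]]   # (hand id, degrees to rotate)
    visual_info: Optional[dict] = None          # graphical content to present, if any

def present_visualization(viz: ExpressiveVisualization, movement_control, display) -> None:
    """Adjust the watch hands per the selected visualization and, if required,
    present visual information on the graphical display contemporaneously."""
    show_visual = viz.visual_info is not None        # determine whether to present visual info

    for hand_id, degrees in viz.hand_adjustments:    # instruct the movement control subsystem
        movement_control.rotate_hand(hand_id, degrees)

    if show_visual:                                  # contemporaneous display update
        display.render(viz.visual_info)
```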
In one example, the expressive visualization is selected based on one or more identified items of information to be provided to the user. In another example, the expressive visualization is a buzzing visualization. Here, the buzzing visualization is provided by oscillating one or more of the watch hands at a selected oscillation rate for between two and five repetitions. In this case, the one or more watch hands may oscillate at a rate of between 1 and 6 Hz.
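As a rough, non-limiting illustration of the buzzing timing, the sketch below oscillates a single hand at a chosen rate for a chosen number of repetitions. The rotate_hand interface and the amplitude value are assumptions made only for this example.

```python
import time

def buzz(movement_control, hand_id="minute", amplitude_deg=5.0, rate_hz=3.0, repetitions=3):
    """Buzzing visualization: swing a hand back and forth.

    rate_hz is assumed to fall in the 1-6 Hz range and repetitions in the
    two-to-five range discussed above; amplitude_deg is an arbitrary example.
    """
    period = 1.0 / rate_hz                                      # one full back-and-forth per period
    for _ in range(repetitions):
        movement_control.rotate_hand(hand_id, +amplitude_deg)   # clockwise swing
        time.sleep(period / 2)
        movement_control.rotate_hand(hand_id, -amplitude_deg)   # counterclockwise swing
        time.sleep(period / 2)
```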
In a further example, the expressive visualization is an anthropomorphic behavior. Here, the one or more watch hands are adjusted to provide the anthropomorphic behavior by rotating a pair of the watch hands towards and away from one another by either a same amount a plurality of times or by a different amount a plurality of times. In this case, the different amount may include a first one of the watch hands appearing to clap against a stationary second one of the watch hands.
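A minimal sketch of such a clapping motion is shown below, assuming hypothetical move_hand_to and rotate_hand interfaces; the angles and number of claps are illustrative only.

```python
def clap(movement_control, claps=3, swing_deg=25.0):
    """Anthropomorphic clapping: one hand swings toward and away from a stationary hand."""
    movement_control.move_hand_to("hour", 180.0)              # stationary hand near 6 o'clock
    movement_control.move_hand_to("minute", 180.0 - swing_deg)
    for _ in range(claps):
        movement_control.rotate_hand("minute", +swing_deg)    # swing toward the hour hand
        movement_control.rotate_hand("minute", -swing_deg)    # swing back away
```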
In yet another example, the expressive visualization is a facial visualization. Here, a first one of the watch hands is aligned at approximately 9 o'clock on the watch face and a second one of the watch hands is aligned at approximately 3 o'clock on the watch face, and providing the facial visualization is performed by simultaneously adjusting the first and second watch hands clockwise and counterclockwise by between 2° and 15°. The one or more processors cause the digital graphical display to present the visual information along with the adjusting of the first and second watch hands. The visual information includes one or more facial features.
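The following non-limiting sketch illustrates one way such a facial visualization might be sequenced, again assuming hypothetical move_hand_to, rotate_hand and render interfaces; the specific angles and facial features are illustrative.

```python
def facial_visualization(movement_control, display, wiggle_deg=10.0, repetitions=2):
    """Facial visualization: park the hands at roughly 9 and 3 o'clock, show facial
    features on the graphical display, and wiggle both hands a few degrees."""
    movement_control.move_hand_to("minute", 270.0)     # ~9 o'clock, measured clockwise from 12
    movement_control.move_hand_to("hour", 90.0)        # ~3 o'clock
    display.render({"features": ["eyes", "mouth"]})    # facial features on the graphical display

    for _ in range(repetitions):
        # simultaneous adjustment of both hands, within the 2-15 degree range noted above
        movement_control.rotate_hand("minute", +wiggle_deg)
        movement_control.rotate_hand("hour", -wiggle_deg)
        movement_control.rotate_hand("minute", -wiggle_deg)
        movement_control.rotate_hand("hour", +wiggle_deg)
```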
In a further example, the expressive visualization is an information hiding visualization and the visual information is a notification to the user. Here, the one or more watch hands are adjusted to provide the information hiding visualization by arranging a first one of the watch hands at a particular location along the watch face, and adjusting a second one of the watch hands to appear to tap down on the notification multiple times by moving towards and away from the first watch hand. With each tap the notification is reduced in size.
In yet another example, the expressive visualization is an information revealing visualization and the visual information is a notification to the user. Here, the one or more watch hands are adjusted to provide the information revealing visualization by arranging a first one of the watch hands at a particular location along the watch face, and adjusting a second one of the watch hands to appear to open up the notification multiple times. With each adjustment of the second watch hand the notification increases in size.
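The sketches below illustrate, purely as examples, how the information hiding and information revealing visualizations might be sequenced. The hand positions, tap angles, scale factors and the notification/display interface are all assumptions made for illustration.

```python
def tap_to_hide(movement_control, display, notification, taps=3, shrink=0.7):
    """Information-hiding visualization: one hand taps the notification smaller."""
    movement_control.move_hand_to("hour", 210.0)        # anchor hand near the notification
    for _ in range(taps):
        movement_control.rotate_hand("minute", -20.0)   # move toward the anchor hand (tap down)
        notification["scale"] *= shrink                 # each tap reduces the notification size
        display.render(notification)
        movement_control.rotate_hand("minute", +20.0)   # move back away

def open_to_reveal(movement_control, display, notification, steps=3, grow=1.3):
    """Information-revealing visualization: the hand appears to open the notification."""
    movement_control.move_hand_to("hour", 210.0)
    for _ in range(steps):
        movement_control.rotate_hand("minute", +20.0)   # each adjustment "opens" the notification further
        notification["scale"] *= grow                   # the notification increases in size
        display.render(notification)
```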
In yet another example, the expressive visualization is a physics simulation and the visual information is a selected object. Here, the physics simulation is provided by adjusting the one or more watch hands in selected directions by between 1° and 180°. With each adjustment, the selected object is either apparently moved by a given one of the watch hands, or a given one of the watch hands is apparently moved by the selected object.
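As one further non-limiting illustration, a physics simulation might be sequenced as follows, with a hand appearing to push a displayed object; the object representation, step size and interfaces are assumptions for this sketch only.

```python
def physics_simulation(movement_control, display, ball, nudges=4, step_deg=15.0):
    """Physics simulation: the minute hand appears to push a displayed object along."""
    for _ in range(nudges):
        movement_control.rotate_hand("minute", step_deg)   # successive sweeps, within 1-180 degrees
        ball["angle"] += step_deg                          # the object appears pushed by the hand
        display.render(ball)
```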
The analog and digital display elements in a hybrid smartwatch as discussed herein provide a rich graphical interface in a wearable form factor. Programmable materials are utilized in conjunction with electromechanical control of the watch hands. The programmable materials may include electronic ink (E-ink) pigments or other non-emissive arrangements that are capable of displaying dynamic patterns. A mechanical movement control manages positioning of the watch hands. For instance, micro-stepper motors provide control, positioning and mechanical expressivity via resulting hand movement. Because these servo-controlled hands are overlaid on a graphical display, the system coordinates the analog and digital displays so that they share responsibilities for the user interface.
As shown in
The memory 114 stores information accessible by the one or more processors 112, including instructions 116 and data 118 that may be executed or otherwise used by each processor 112. The memory 114 may be, e.g., a solid state memory or other type of non-transitory memory capable of storing information accessible by the processor(s), including write-capable and/or read-only memories.
The instructions 116 may be any set of instructions to be executed directly (such as machine code) or indirectly (such as scripts) by the processor. For example, the instructions may be stored as computing device code on the computing device-readable medium. In that regard, the terms “instructions” and “programs” may be used interchangeably herein. The instructions may be stored in object code format for direct processing by the processor, or in any other computing device language including scripts or collections of independent source code modules that are interpreted on demand or compiled in advance. Functions, methods and routines of the instructions are explained in detail below.
The data 118 may be retrieved, stored or modified by processor 112 in accordance with the instructions 116. As an example, data 118 of memory 114 may store predefined scenarios. A given scenario may identify a set of scenario requirements including visual effect types, content to be presented and pre-defined interactions between the watch hands and the graphical display. For instance, particular movements of the watch hands in combination with selected notification types may be included in the predefined scenarios.
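Purely for illustration, the predefined scenarios in data 118 might be organized along the following lines; the scenario names and field names are assumptions rather than an actual stored schema.

```python
# Illustrative sketch of predefined scenarios: each entry identifies a visual effect
# type, content to be presented, and a pre-defined hand/display interaction.
PREDEFINED_SCENARIOS = {
    "incoming_call": {
        "visual_effect": "buzzing",
        "content": {"icon": "phone", "text": "Incoming call"},
        "hand_interaction": {"hands": ["minute"], "rate_hz": 3, "repetitions": 3},
    },
    "calendar_reminder": {
        "visual_effect": "information_revealing",
        "content": {"icon": "calendar", "text": "Meeting in 10 minutes"},
        "hand_interaction": {"hands": ["hour", "minute"], "steps": 3},
    },
}
```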
User interface 120 includes various I/O elements. For instance, one or more user inputs 122 such as mechanical actuators 124 and/or soft actuators 126 are provided. The mechanical actuators 124 may include a crown, buttons, switches and other components. The soft actuators 126 may be incorporated into a touchscreen cover, e.g., a resistive or capacitive touch screen.
As noted above, one aspect of the technology is the use of analog watch elements enhanced with digital capabilities and connectivity. Thus, both a digital graphical display 128 and a mechanical movement (analog display) 130 are provided in the user interface 120 of the hybrid watch 100. The graphical display 128 may be an E-ink or other type of electrophoretic display. Alternatively, other non-emissive arrangements or even emissive displays may be employed. The mechanical movement 130 includes hour and minute hands. A seconds hand and/or other hand indicators may also be employed.
An example watch configuration 200 with such a user interface 120 is shown in
Returning to
The user interface 120 may also include one or more speakers, transducers or other audio outputs 138. A haptic interface or other tactile feedback 140 is used to provide non-visual and non-audible information to the wearer. And one or more cameras 142 can be included on the housing or band, or incorporated into the display.
The hybrid smartwatch 100 also includes a position determination module 144, which may include a GPS chipset 146 or other positioning system components. Information from the accelerometer 134, gyroscope 136 and/or from data received or determined from remote devices (e.g., wireless base stations or wireless access points), can be employed by the position determination module 144 to calculate or otherwise estimate the physical location of the smartwatch 100.
In order to obtain information from and send information to remote devices, the smartwatch 100 may include a communication subsystem 150 having a wireless network connection module 152, a wireless ad hoc connection module 154, and/or a wired connection module 156. While not shown, the communication subsystem 150 has a baseband section for processing data and a transceiver section for transmitting data to and receiving data from the remote devices. The transceiver may operate at RF frequencies via one or more antennae. The wireless network connection module 152 may be configured to support communication via cellular, LTE, 4G and other networked architectures. The wireless ad hoc connection module 154 may be configured to support Bluetooth®, Bluetooth LE, near field communications, and other non-networked wireless arrangements. And the wired connection 156 may include a USB, micro USB, USB type C or other connector, for example to receive data and/or power from a laptop, tablet, smartphone or other device.
Returning to
As noted above, the micro-stepper motors or other actuation mechanism(s) 412 are configured to provide control, positioning and mechanical expressivity via resulting hand movement, for instance by causing the one or more hands to rotate or otherwise adjust in a predetermined manner. The micro-stepper motors enable unidirectional or bidirectional rotation of the hands (clockwise and/or counterclockwise) through electrical pulses that may be controlled by the one or more processors 112 of
According to one scenario, the electrical pulses have a pulse width on the order of 2 ms, for instance between about 1.75-2.25 ms. Here, the minute and hour hands may have on the order of 120 steps per revolution, although the number of steps for each hand may vary. In other examples, the pulse widths and steps per revolution may vary, e.g., by +/−10%, or more or less. In some scenarios, the steps are related to the application. For instance, time-related apps may have a 60 step resolution, while other apps may employ a higher (or lower) number of steps. And the pulse width may vary based on motor characteristics of the actuator(s). The timing and duration of the pulses and steps are controlled, for example, by the one or more processors 112 of
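As a back-of-the-envelope check of the figures above (and not a description of any particular implementation), 120 steps per revolution corresponds to 3° of hand travel per step, and a nominal 2 ms pulse per step bounds how quickly a hand can sweep a given angle:

```python
STEPS_PER_REV = 120            # example step count for the minute and hour hands
PULSE_WIDTH_S = 0.002          # nominal 2 ms pulse width (about 1.75-2.25 ms)

DEG_PER_STEP = 360 / STEPS_PER_REV     # 3.0 degrees of hand travel per step

def min_sweep_time(degrees: float) -> float:
    """Lower bound on the time to rotate a hand by `degrees`, one pulse per step,
    ignoring motor settling and inter-pulse spacing."""
    steps = round(abs(degrees) / DEG_PER_STEP)
    return steps * PULSE_WIDTH_S

print(min_sweep_time(180))     # 60 steps x 2 ms = 0.12 s for a half revolution
```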
The graphical display 404 includes, in this scenario, a non-emissive display. The non-emissive display is bi-stable, meaning that it does not require power to maintain the displayed information. The non-emissive display may be arranged as a circle or other shape depending on the overall appearance of the smartwatch. Regardless of its shape, the display includes a central opening adapted to receive the mechanical movement component 406 of
The control and interplay of the pixels of the display and the positioning of the hands are performed cooperatively to create optimal user interfaces for different scenarios. For example, the user interfaces may be optimized according to predetermined criteria, which can vary with different interactions, applications and user preferences.
Aspects of the technology employ physical motion of the watch hands as a means of expressivity. Here, the hands may be used for visual mechatronic effects as a complement or alternative to the information presented on the digital display. For instance, the hybrid smartwatch is able to attract the user's attention with motion of the hands when illumination or sound is inappropriate or insufficient. Various scenarios include buzzing, clapping, stylizing visual features, hiding or minimizing information, revealing information, and the influence of displayed objects on the physical hands and vice versa. These scenarios are described with reference to the drawings.
Conversely,
In contrast,
The examples of
At block 1106, the processors determine whether to concurrently present visual information on the graphical display along with the adjustment of the one or more watch hands. Not every expressive visualization necessarily includes the presentation of corresponding visual information on the graphical display. At block 1108, the processors instruct or otherwise manage the mechanical movement control to adjust the hand(s), in accordance with the selected expressive visualization. This may include sending control signals to the mechanical movement subsystem or electrical pulses directly to micro-stepper motors to achieve the intended hand motion.
At block 1110, when it is determined that visual information will also be presented on the graphical display, the one or more processors cause the graphical display to generate the graphical element(s) thereon. This is done in conjunction with the expressive visualization of the hand adjustment. According to one aspect, the visual information of the graphical element(s) is synced with the mechanical adjustment of the hand(s), such as shown in
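A non-limiting sketch of blocks 1106-1110 is shown below; the per-frame iterator and field names are hypothetical and merely illustrate how a display update could be synced with each hand adjustment.

```python
def run_visualization(movement_control, display, viz):
    """Sketch of blocks 1106-1110: decide whether visual information accompanies the
    hand motion, then drive the hands frame by frame with synced display updates."""
    concurrent = viz.visual_info is not None                 # block 1106: concurrent display?

    for frame in viz.frames():                               # block 1108: step the hand motion
        for hand_id, degrees in frame.hand_adjustments:
            movement_control.rotate_hand(hand_id, degrees)
        if concurrent and frame.visual_info is not None:     # block 1110: synced graphics
            display.render(frame.visual_info)
```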
It should be understood that these operations do not have to be performed in the precise order described. Rather, various steps can be handled in a different order or simultaneously, and steps may also be added or omitted.
Depending on the specific arrangement, an emissive display, such as an OLED screen, may be employed instead of a non-emissive display.
Unless otherwise stated, the foregoing alternative examples are not mutually exclusive, but may be implemented in various combinations to achieve unique advantages. As these and other variations and combinations of the features discussed above can be utilized without departing from the subject matter defined by the claims, the foregoing description of the embodiments should be taken by way of illustration rather than by way of limitation of the subject matter defined by the claims. In addition, the provision of the examples described herein, as well as clauses phrased as “such as,” “including” and the like, should not be interpreted as limiting the subject matter of the claims to the specific examples; rather, the examples are intended to illustrate only one of many possible embodiments. Further, the same reference numbers in different drawings can identify the same or similar elements.
The present application is a continuation of U.S. patent application Ser. No. 15/960,808, filed Apr. 24, 2018, which is related to U.S. Provisional Application No. 62/661,769, filed Apr. 24, 2018, the entire disclosures of which are incorporated by reference herein.