This relates generally to electronic devices, and, more particularly, to electronic devices with visual output devices.
Electronic devices often include visual output devices. For example, a wrist-watch may include a display for providing visual output to a user. Touch-sensitive displays may be used in gathering user input.
It can be challenging to incorporate visual output devices into electronic devices. In some scenarios, displays may be too bulky to incorporate into wearable devices without sacrificing comfort. In other scenarios, displays may convey too much information and may be a distraction for the user.
An electronic device may have an edge illuminator that forms a one-dimensional display. The edge illuminator may form a closed loop, may be straight, curved, segmented, or continuous, may be integrated into fabric, may be integrated into a speaker housing, may be integrated into a wearable band, a wearable device, a piece of clothing, a pair of shoes, a yoga mat or other fitness equipment, and/or an accessory such as a purse or bag, an electronic device case or cover, a pair of headphones, or other accessory, and/or may form part of any other suitable item. The edge illuminator may be mounted on a front edge of a device, a side edge of a device, a rear edge of a device, and/or any other suitable edge of a device. The edge illuminator may include a one-dimensional array of pixels formed from light-emitting diodes or other light-emitting devices. The edge illuminator may provide illumination to the border area of a device. The illumination may serve as low-resolution visual output for conveying certain information to a user without providing so much information that the user is distracted or otherwise interrupted from a present task.
The edge illuminator may form a secondary display that is coordinated with a primary display. The primary and secondary displays may form part of the same electronic device (e.g., may share a common housing) or may form part of two different electronic devices. The primary display may include a two-dimensional array of pixels. The visual content on the primary display may be synchronized with visual content on the secondary display.
The electronic device may be a wrist band that includes a strip of fabric that wraps around a longitudinal axis. Light sources such as a one-dimensional array of light-emitting diodes may be embedded in the strip of fabric and may emit light parallel to the longitudinal axis to produce edge illumination along an edge of the strip of fabric. The strip of fabric may include first and second fabric portions and the light sources may be interposed between the first and second fabric portions. The light sources may be located in the center of the strip of fabric, may be located in side pockets at the edge of the fabric, and/or may be in any other suitable location in the fabric. Arrangements in which the light sources form all or part of the edge of the fabric may also be used.
Optical structures such as light-diffusing strands, light-diffusing polymer structures, and/or light guides may be located between the light sources and the edge of the strip of fabric to help diffuse light and/or guide light to the edge of the strip of fabric. Light-reflecting layers may be formed on inner surfaces of the first and second fabric portions.
Control circuitry may control the light sources to produce the edge illumination in response to an event occurring on an external electronic device such as a notification on an external display. The colors of the edge illumination may match colors associated with the notification on the external display.
As described above, an electronic device may have an edge illuminator that forms a one-dimensional display for providing low-resolution visual output along one or more edges of the device. The edge illuminator may include light-blocking layers, light-reflecting layers, light-diffusing layers, light-guiding layers, and/or other layers to produce illumination in the desired location with the desired set of characteristics.
An illustrative system that may include one or more electronic devices with an edge illuminator is shown in
Electronic device 10 may be a computing device such as a laptop computer, a computer monitor containing an embedded computer (e.g., a desktop computer formed from a display with a desktop stand that has computer components embedded in the same housing as the display), a tablet computer, a cellular telephone, a media player, or other handheld or portable electronic device, a smaller device such as a wrist-watch device, a pendant device, a headphone or earpiece device, a device embedded in eyeglasses or other equipment worn on a user's head, or other wearable or miniature device, a television, a computer display that does not contain an embedded computer, a gaming device, a navigation device, a tower computer, an embedded system such as a system in which electronic equipment with a display is mounted in a kiosk or automobile, equipment that implements the functionality of two or more of these devices, or other electronic equipment.
As shown in
Electronic device 10 may include fabric 14. Fabric 14 may form all or part of a housing wall or other layer, may form an outermost layer, may form one or more inner covering layers, may form internal structures, and/or may form other fabric-based structures. Electronic device 10 may be soft (e.g., electronic device 10 may have a fabric surface that yields to a light touch), may have a rigid feel (e.g., the surface of electronic device 10 may be formed from a stiff fabric), may be coarse, may be smooth, may have ribs or other patterned textures, and/or may be formed as part of a device that has portions formed from non-fabric structures of plastic, metal, glass, crystalline materials, ceramics, or other materials. In an illustrative configuration, fabric 14 forms a strip that can be worn around a user's wrist or other body part. In another illustrative configuration, fabric 14 serves as a cosmetic cover for electronic device 10 that overlaps audio components (microphones and/or speakers).
Electronic device 10 may have control circuitry 16. Control circuitry 16 may include storage and processing circuitry for supporting the operation of device 10. The storage and processing circuitry may include storage such as hard disk drive storage, nonvolatile memory (e.g., flash memory or other electrically-programmable-read-only memory configured to form a solid state drive), volatile memory (e.g., static or dynamic random-access-memory), etc. Processing circuitry in control circuitry 16 may be used to control the operation of device 10. The processing circuitry may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors, power management units, audio chips, application specific integrated circuits, etc. Control circuitry 16 may include wired and/or wireless communications circuitry (e.g., antennas and associated radio-frequency transceiver circuitry such as cellular telephone communications circuitry, wireless local area network communications circuitry, etc.). The communications circuitry of control circuitry 16 may allow device 10 to communicate with keyboards, computer mice, remote controls, speakers, accessory displays, accessory cameras, and/or other electronic devices that serve as accessories for device 10.
Input-output circuitry in device 10 such as input-output devices 18 may be used to allow data to be supplied to device 10 and to allow data to be provided from device 10 to external devices. Input-output devices 18 may include input devices that gather user input and other input and may include output devices that supply visual output, audible output, or other output. These devices may include buttons, joysticks, scrolling wheels, touch pads, key pads, keyboards, microphones, speakers, tone generators, vibrators and other haptic output devices, light-emitting diodes and other status indicators, data ports, etc.
Input-output devices 18 may include one or more illumination structures such as edge illuminator 20. Edge illumination structures such as edge illuminator 20 may be formed in suitable locations within device 10 such as along one or more portions of the peripheral edge of housing 12 and/or fabric 14. Edge illuminator 20 may include one or more light-emitting diodes (e.g., organic light-emitting diodes and/or crystalline semiconductor light-emitting diodes), one or more lasers (e.g., laser diodes), one or more individually adjustable components that adjust light transmission and/or reflection and that are illuminated by a backlight or by ambient light (e.g., a liquid crystal display structure such as one or more liquid crystal light modulator cells or an array of such cells, an electronic ink display structure sometimes referred to as an electrophoretic display structure such as one or more electrophoretic cells or an array of such cells), and/or other electrically adjustable light-emitting and/or light modulating components. Illustrative configurations in which edge illuminator 20 includes one or more light-emitting components such as one or more light-emitting diodes may sometimes be described herein as an example. In general, any suitable light-emitting and/or light modulating components may be formed in edge illuminator 20, if desired. Edge illuminators may have ring shapes that extend around the entire border of a strip of fabric or housing structure. If desired, there may be two or more edge illuminators 20 in device 10 (e.g., a first illuminator 20 on a first edge of device 10 and a separate second illuminator on a second edge of device 10, etc.). Such illuminators may be controlled in coordination with each other and/or may be individually controlled.
Edge illuminator 20 may include a single light-emitting component, two light-emitting components, 2-100 light-emitting components, at least 20 light-emitting components, fewer than 2000 light-emitting components, fewer than 400 light-emitting components, fewer than 100 light-emitting components, fewer than 40 light-emitting components, fewer than 25 light-emitting components, fewer than ten light-emitting components, or other suitable number of individually or collectively controllable light-emitting components. These light-emitting components may be used to emit light of any suitable color (white, a non-neutral color such as red, green, blue, etc.). In some arrangements, edge illuminator 20 may include one or more light guides, diffusers, reflectors, light-absorbing layers, and/or other structures for distributing light from one or more light-emitting components to desired locations (e.g., locations along the border of fabric 14). If desired, edge illuminator 20 may include a one-dimensional array or a two-dimensional array of light-emitting components (e.g., light-emitting diode pixels). Edge illuminator 20 may include a large number of light-emitting diodes or may contain only a single light-emitting diode or other small number of light-emitting diodes or other light-emitting components (e.g., so that edge illuminator 20 may emit diffuse illumination of a single color and/or may emit other illumination from a single component or other small number of light-emitting components). Light sources in edge illuminator 20 such as light-emitting diodes may form discrete regions of illumination separated by gaps and/or may form a continuous strip of illumination without visible gaps. If desired, the light emitted by illuminator 20 may be used to display visual content (e.g., using colors, time variations, and/or spatial patterns).
If desired, device 10 may include other illumination devices such as a display (e.g., an organic light-emitting diode display with an array of thin-film organic light-emitting diode pixels, a liquid crystal display with an array of liquid crystal display pixels and an optional backlight unit, a display having an array of pixels formed from respective crystalline light-emitting diodes each of which has a respective crystalline semiconductor light-emitting diode die, and/or other display). Arrangements in which device 10 includes edge illuminator 20 without including a display are sometimes described herein as an example.
If desired, edge illuminator 20 may provide light that matches or complements the color of content on the display of a separate, external electronic device (e.g., a cellular telephone, laptop computer, tablet computer, wrist watch device, head-mounted device, and/or other electronic device belonging to the user wearing device 10), may provide light that has different colors, intensities, etc. in different areas, may provide static light, may provide light that flashes on and off and/or that otherwise varies in intensity and/or color as a function of time, may provide solid regions of color, may include text, graphics, icons, moving images, and other visual output. In some arrangements, visual output from edge illuminator 20 may be associated with image content (e.g., virtual reality content, augmented reality content, and/or mixed reality content) being displayed on a head-mounted device that is also worn by the user wearing device 10. The edge illumination may be used to convey information to people who are near the user wearing the head-mounted device (e.g., information about what content the user is watching on the head-mounted device and/or other information). To conserve power and reduce complexity and to incorporate edge illuminator 20 into portions of device 10 with curved surfaces, edge illuminator 20 may, if desired, be provided with less spatial resolution than the pixels of a display on a separate electronic device.
Edge illuminator 20 may be used to display content such as diffuse light, patterns of diffuse light, text, icons, still and/or moving images, low-resolution images, and/or other content. Diffuse light and/or other lower-resolution content may be displayed in coordination with the content on the display of a separate external electronic device such as a cellular telephone, laptop computer, tablet computer, wrist watch device, head-mounted device, television, and/or other suitable electronic device belonging to the user wearing device 10. For example, in an arrangement in which edge illuminator 20 has a single solid color that extends in an unbroken ring around device 10, this color may be selected to match the average color of the image on an external display, may be selected to match the average intensity of the image on an external display, may have a different color and/or intensity that coordinates with the attributes of the image on an external display, and/or may have a fixed intensity and/or color that is displayed whenever content is displayed on an external display. In arrangements in which edge illuminator 20 has multiple individually adjustable areas, these areas may be adjusted to match the intensities, colors, and/or patterns of images on an external display and/or may be adjusted to create text, icons, and/or other patterns of visual content.
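The average-color matching mentioned above can be sketched as a simple mean over the pixels of a frame captured from the external display. This is an illustrative sketch only, not part of the specification: the frame format (rows of (R, G, B) tuples) and the function name are assumptions.

```python
def average_display_color(frame):
    """Return the mean (R, G, B) of a frame given as rows of (R, G, B) pixels."""
    pixels = [px for row in frame for px in row]
    n = len(pixels)
    return tuple(sum(px[c] for px in pixels) // n for c in range(3))

# Example: a 2 x 2 frame that is half red, half blue averages to purple,
# so a single-color edge illuminator tracking this frame would glow purple.
frame = [[(255, 0, 0), (0, 0, 255)],
         [(255, 0, 0), (0, 0, 255)]]
print(average_display_color(frame))  # (127, 0, 127)
```

The same computation could be restricted to subregions of the frame when the illuminator has multiple individually adjustable areas, with each area tracking the region of the external display nearest to it.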
If desired, the colors, intensity, and pattern of illumination from edge illuminator 20 may be selected based on a notification that is displayed on an external display. For example, notifications on an external display may be associated with certain applications such as messaging applications, timer applications, email applications, music applications, voice-controlled assistant notifications, social media applications, etc. Each application and/or each notification may be represented using an icon on the external display. When such a notification is received on the external electronic device, edge illuminator 20 may provide illumination having colors and/or patterns that match the icon associated with the notification (e.g., edge illuminator 20 may display green and white colors when a messaging notification with a green and white icon is displayed on an external display).
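The notification color-matching behavior described above amounts to a lookup from a notification's source application to the colors of its icon. In the sketch below, the application identifiers and RGB values are purely illustrative assumptions; a real device would derive the colors from the icon actually shown on the external display.

```python
# Hypothetical app-to-icon-color table (illustrative values only).
ICON_COLORS = {
    "messaging": [(52, 199, 89), (255, 255, 255)],  # green and white
    "mail": [(0, 122, 255), (255, 255, 255)],       # blue and white
}

def illumination_colors(app_id, default=(255, 255, 255)):
    """Return the colors the edge illuminator should show for a notification."""
    return ICON_COLORS.get(app_id, [default])
```

A messaging notification would thus drive the illuminator with green and white, mirroring the green-and-white icon example above, while an unrecognized application would fall back to a neutral default.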
Edge illuminator 20 may also produce illumination that may or may not be associated with content on an external display, such as illumination associated with fitness tracking, heart rate monitoring, sleep tracking, etc.
Input-output devices 18 may also include sensors 22. Sensors 22 may include force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors such as microphones, touch and/or proximity sensors such as capacitive sensors (e.g., a one-dimensional or two-dimensional capacitive touch sensor integrated into edge illuminator 20 and/or a touch sensor that forms a button, trackpad, or other input device not associated with edge illuminator 20), and other sensors. If desired, sensors 22 may include optical sensors such as optical sensors that emit and detect light, ultrasonic sensors, optical touch sensors, optical proximity sensors, and/or other touch sensors and/or proximity sensors, monochromatic and color ambient light sensors, image sensors, fingerprint sensors, temperature sensors, sensors for measuring three-dimensional non-contact gestures (“air gestures”), pressure sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or inertial measurement units that contain some or all of these sensors), health sensors, radio-frequency sensors (e.g., sensors that gather position information, three-dimensional radio-frequency images, and/or other information using radar principles or other radio-frequency sensing), depth sensors (e.g., structured light sensors and/or depth sensors based on stereo imaging devices), optical sensors such as self-mixing sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, gaze tracking sensors, three-dimensional sensors (e.g., pairs of two-dimensional image sensors that gather three-dimensional images using binocular vision, three-dimensional structured light sensors that emit an array of infrared light beams or other structured light using arrays of lasers or other light emitters and associated optical components and that capture
images of the spots created as the beams illuminate target objects, and/or other three-dimensional image sensors), facial recognition sensors based on three-dimensional image sensors, and/or other sensors. In some arrangements, device 10 may use sensors 22 and/or other input-output devices to gather user input (e.g., buttons may be used to gather button press input, touch sensors overlapping displays can be used for gathering user touch screen input, touch pads may be used in gathering touch input, microphones may be used for gathering audio input, etc.). If desired, sensors in device 10 such as motion sensors (e.g., accelerometers, gyroscopes, etc.), optical heart rate sensors, and/or other sensors may be used to gather user input by detecting hand gestures such as hand clenching, double-clenching (e.g., two fist clenching gestures spaced closely in time), hand waving, finger movements, etc.
If desired, sensors 22 may include tension sensors (e.g., strain gauges, load cells, and/or other tension sensors) that can detect an amount of tension in device 10 (e.g., an amount of tension in fabric 14). Tension sensors in device 10 may be used to detect when device 10 is being worn. For example, if tension in fabric 14 is above a given threshold, control circuitry 16 may determine that device 10 is being worn around the user's wrist or other body part. When tension in fabric 14 is below the given threshold, control circuitry 16 may determine that device 10 is not being worn. Control circuitry 16 may operate device 10 in a first mode when the tension sensor indicates that device 10 is being worn and may operate device 10 in a second mode when the tension sensor indicates that device 10 is not being worn. For example, control circuitry 16 may use different types of output to provide notifications to the user based on whether device 10 is being worn or not (e.g., haptic and visual output when device 10 is being worn versus only visual output when device 10 is not being worn).
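The worn/not-worn mode selection described above reduces to a threshold test on the measured fabric tension. In this sketch the threshold value and mode names are placeholders chosen for illustration, not values from the specification:

```python
WORN_TENSION_THRESHOLD = 2.0  # newtons; illustrative placeholder value

def select_output_modes(tension_n):
    """Pick notification output types based on measured fabric tension.

    Above the threshold the band is taken to be worn (haptic + visual
    output); at or below it, only visual output is used.
    """
    if tension_n > WORN_TENSION_THRESHOLD:
        return {"haptic", "visual"}
    return {"visual"}
```

In practice some hysteresis around the threshold would likely be needed so that small tension fluctuations do not cause the device to toggle rapidly between modes.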
If desired, electronic device 10 may include additional components such as other devices in input-output devices 18. The additional components may include haptic output devices 24 (e.g., piezoelectric haptic actuators, haptic actuators based on electroactive polymer devices, electromechanical actuators, and/or other haptic output devices that provide a user with tactile output such as vibrations, impulses, etc.), audio output devices such as a speaker, audio input devices such as a microphone, light sources such as light-emitting diodes (e.g., crystalline semiconductor light-emitting diodes for status indicators and/or components), other optical output devices, and/or other circuitry for gathering input and/or providing output. Device 10 may also include an optional battery or other energy storage device, connector ports for supporting wired communications with ancillary equipment and for receiving wired power, and other circuitry. Device 10 may be operated in systems that include wired and/or wireless accessories (e.g., keyboards, computer mice, remote controls, trackpads, etc.).
If desired, a user may wear multiple devices 10. Each device 10 may have different functionality than the other devices 10 (e.g., one device 10 may be used for fitness applications and another device 10 may be used for gaming applications), or devices 10 may have the same functionality. Multiple devices 10 may be worn on the same wrist, may be worn on each of the user's wrists, may be worn on one or more fingers, may be worn on the wrist and the fingers of one arm, and/or may be worn on other parts of the body. The devices 10 may be paired with one another or may operate unpaired. If desired, visual and/or haptic output from multiple devices 10 may be coordinated for a combined effect.
Additional electronic devices in system 8 such as devices 80 may include devices such as a laptop computer, a computer monitor containing an embedded computer, a tablet computer, a desktop computer, a cellular telephone, a media player, or other handheld or portable electronic device, a smaller device such as a wristwatch device, a pendant device, a headphone or earpiece device, a head-mounted device such as glasses, goggles, a helmet, or other equipment worn on a user's head, or other wearable or miniature device, a television, a computer display that does not contain an embedded computer, a gaming device, a remote control, a navigation device, an embedded system such as a system in which equipment is mounted in a kiosk, in an automobile, airplane, or other vehicle, a removable external case for electronic equipment, a strap, a wrist band or head band, a removable cover for a device, a case or bag that has straps or that has other structures to receive and carry electronic equipment and other items, a necklace or arm band, a wallet, sleeve, pocket, or other structure into which electronic equipment or other items may be inserted, part of a chair, sofa, or other seating (e.g., cushions or other seating structures), part of an item of clothing or other wearable item (e.g., a hat, belt, wrist band, headband, sock, glove, shirt, pants, etc.), or equipment that implements the functionality of two or more of these devices. If desired, device 80 and device 10 may share a common housing.
Electronic device 80 of system 8 may include control circuitry 82. Control circuitry 82 may include storage and processing circuitry for supporting the operation of device 80 and/or system 8. The storage and processing circuitry may include storage such as nonvolatile memory (e.g., flash memory or other electrically-programmable-read-only memory configured to form a solid state drive), volatile memory (e.g., static or dynamic random-access-memory), etc. Processing circuitry in control circuitry 82 may be used to gather input from sensors and other input devices and may be used to control output devices. The processing circuitry may be based on one or more microprocessors, microcontrollers, digital signal processors, baseband processors and other wireless communications circuits, power management units, audio chips, application specific integrated circuits, etc.
Electronic device 80 may include input-output devices 86. Input-output devices 86 may be used in gathering user input, in gathering information on the environment surrounding device 80, and/or in providing a user with output. Devices 86 may include sensors 88. Sensors 88 may include force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors such as microphones, touch and/or proximity sensors such as capacitive sensors, optical sensors such as optical sensors that emit and detect light, ultrasonic sensors, and/or other touch sensors and/or proximity sensors, monochromatic and color ambient light sensors, image sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or inertial measurement units that contain some or all of these sensors), muscle activity sensors (EMG), radio-frequency sensors, depth sensors (e.g., structured light sensors and/or depth sensors based on stereo imaging devices), optical sensors such as self-mixing sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, and/or other sensors. In some arrangements, device 80 may use sensors 88 and/or other input-output devices 86 to gather user input (e.g., buttons may be used to gather button press input, touch sensors overlapping displays can be used for gathering user touch screen input, touch pads may be used in gathering touch input, microphones may be used for gathering audio input, accelerometers may be used in monitoring when a finger contacts an input surface and may therefore be used to gather finger press input, etc.).
Device 80 may include haptic output devices 94. Haptic output devices 94 may include actuators such as electromagnetic actuators, motors, piezoelectric actuators, electroactive polymer actuators, vibrators, linear actuators, rotational actuators, actuators that bend bendable members, actuator devices that create and/or control repulsive and/or attractive forces between devices 10 and/or 80 (e.g., components for creating electrostatic repulsion and/or attraction such as electrodes, components for producing ultrasonic output such as ultrasonic transducers, components for producing magnetic interactions such as electromagnets for producing direct-current and/or alternating-current magnetic fields, permanent magnets, magnetic materials such as iron or ferrite, and/or other circuitry for producing repulsive and/or attractive forces between devices 10 and/or 80). In some situations, actuators for creating forces in device 80 may be used in producing tactile output (e.g., vibrations on the user's skin or that can be heard from a distance). In other situations, these components may be used to interact with each other (e.g., by creating a dynamically adjustable electromagnetic repulsion and/or attraction force between a pair of devices 80 and/or between devices 10 and device 80 using electromagnets).
If desired, input-output devices 86 of device 80 may include other devices such as display 90 (e.g., to display images for a user), status indicator lights (e.g., a light-emitting diode that serves as a power indicator, and other light-based output devices), speakers 92 and other audio output devices, electromagnets, permanent magnets, structures formed from magnetic material (e.g., iron bars or other ferromagnetic members that are attracted to magnets such as electromagnets and/or permanent magnets), batteries, etc. Device 80 may also include power transmitting and/or receiving circuits configured to transmit and/or receive wired and/or wireless power signals.
To support communications between devices 10 and device 80 and/or to support communications between equipment in system 8 and external electronic equipment, control circuitry 82 may communicate using communications circuitry 84. Circuitry 84 may include antennas, radio-frequency transceiver circuitry, and other wireless communications circuitry and/or wired communications circuitry. Circuitry 84, which may sometimes be referred to as control circuitry and/or control and communications circuitry, may, for example, support bidirectional wireless communications between devices 10 and 80 using wireless signals 96 (e.g., wireless local area network signals, near-field communications signals, Bluetooth® signals, 60 GHz signals or other millimeter wave signals, ultra-wideband communications signals, etc.). Device 80 may also include power circuits for transmitting and/or receiving wired and/or wireless power and may include batteries. In configurations in which wireless power transfer is supported between devices 10 and device 80, in-band wireless communications may be supported using inductive power transfer coils (as an example).
Wireless signals 96 may be used to convey information such as location and orientation information. For example, control circuitry 82 in device 80 may determine the location of device 10 using wireless signals 96 and/or control circuitry 16 in device 10 may determine the location of device 80 using wireless signals 96. In one illustrative arrangement, device 10 may include a low-power transmitter (e.g., a Bluetooth® Low Energy transmitter, an ultra-wideband radio frequency signal transmitter, an RFID transmitter, a near-field communications transmitter, and/or other transmitter). Device 80 may have a corresponding receiver that detects the transmitted signals 96 from device 10 and determines the location of device 10 based on the received signals.
Device 80 may track the location (e.g., the indoor or outdoor location) of device 10 using signal strength measurement schemes (e.g., measuring the signal strength of radio signals from device 10) and/or using time-based measurement schemes such as time-of-flight measurement techniques, time difference of arrival measurement techniques, angle of arrival measurement techniques, triangulation methods, using a crowdsourced location database, and other suitable measurement techniques. This type of location tracking may be achieved using ultra-wideband signals, Bluetooth® signals, WiFi® signals, millimeter wave signals, and/or other suitable signals. This is merely illustrative, however. If desired, control circuitry 82 of device 80 may determine the location of device 10 using Global Positioning System receiver circuitry, using proximity sensors (e.g., infrared proximity sensors or other proximity sensors), depth sensors (e.g., structured light depth sensors that emit beams of light in a grid, a random dot array, or other pattern, and that have image sensors that generate depth maps based on the resulting spots of light produced on target objects), sensors that gather three-dimensional depth information using a pair of stereoscopic image sensors, lidar (light detection and ranging) sensors, radar sensors, using image data from a camera, using motion sensor data, and/or using other circuitry in device 80.
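Of the signal-strength schemes mentioned above, the simplest is a log-distance path-loss estimate: received power falls off predictably with distance, so a calibrated reference power yields a range estimate. The calibration constants below (expected RSSI at 1 m, free-space path-loss exponent) are illustrative assumptions, not values from the specification:

```python
def distance_from_rssi(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Estimate distance (meters) from received signal strength.

    Log-distance path-loss model: rssi = tx_power - 10 * n * log10(d),
    where tx_power is the expected RSSI at 1 m and n is the path-loss
    exponent (2.0 in free space). Solving for d gives the estimate below.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

print(distance_from_rssi(-59.0))  # 1.0 (at the calibration point)
print(distance_from_rssi(-79.0))  # 10.0 (free-space falloff)
```

Indoors the path-loss exponent is typically larger than 2.0 and the estimate is noisy, which is why such schemes are often combined with the time-based and angle-of-arrival techniques listed above.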
If desired, angle of arrival measurement techniques may be employed by control circuitry 16 of device 10 and/or control circuitry 82 of device 80 to determine the relative orientation of device 10 and device 80. For example, control circuitry 82 may determine the orientation of device 80 relative to device 10 by determining a phase difference associated with signals 96 received by antennas in device 80 from device 10. The phase difference may be used to determine an angle of arrival of signals 96 from device 10 received by device 80. Similarly, control circuitry 16 of device 10 may, if desired, determine the orientation of device 10 relative to device 80 by determining a phase difference associated with signals 96 received by antennas in device 10 from device 80. The phase difference may be used to determine an angle of arrival of signals 96 from device 80 received by device 10.
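The phase-difference-to-angle relationship described above can be sketched numerically for a two-antenna receiver: a plane wave arriving at angle θ from broadside produces a phase difference of 2πd·sin(θ)/λ between antennas spaced d apart, which can be inverted to recover θ. The function and its clamping behavior are illustrative assumptions, not details of the disclosure.

```python
import math

def angle_of_arrival(phase_diff_rad, antenna_spacing_m, wavelength_m):
    """Estimate the angle of arrival (radians from broadside) from the phase
    difference measured between two antennas.

    Unambiguous only when the spacing is at most half a wavelength.
    """
    s = phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m)
    s = max(-1.0, min(1.0, s))  # clamp measurement noise into asin's domain
    return math.asin(s)
```

For half-wavelength spacing, a phase difference of π radians corresponds to a signal arriving along the antenna axis (±90° from broadside), and zero phase difference corresponds to broadside arrival.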
To keep device 10 relatively small, lightweight, and low-power, device 10 may include a low-power signal transmitter (e.g., a Bluetooth® Low Energy transmitter, an ultra-wideband radio frequency signal transmitter, an RFID transmitter, a near-field communications transmitter, and/or other transmitter), but may not include a display or other electronics that might require a significant amount of space or power. This is merely illustrative, however. If desired, device 10 may include a display. Arrangements in which device 10 is a display-free wearable device that does not include a display are sometimes described herein as illustrative examples.
The one or more electronic devices 80 that communicate with devices 10 may sometimes be referred to as host devices or primary devices (devices 10 may sometimes be referred to as secondary devices). The host devices may run software that is used to track the location of devices 10, send control signals to devices 10, receive data from devices 10, and/or perform other functions related to the operation of devices 10.
Strands 32 may be formed from polymer, metal, glass, graphite, ceramic, natural materials such as cotton or bamboo, or other organic and/or inorganic materials and combinations of these materials. Conductive coatings such as metal coatings may be formed on non-conductive material. For example, plastic strands in fabric 14 may be coated with metal to make them conductive. Reflective coatings such as metal coatings may be applied to make strands reflective. Strands formed from white polymer (e.g., light-scattering particles in polymer) and/or that are coated with white polymer may help reflect light in some configurations. If desired, strands may be formed from bare metal wires or metal wire intertwined with insulating monofilaments (as examples). Bare metal strands and strands of polymer covered with conductive coatings may be provided with insulating polymer jackets. In some configurations, strands 32 may include optical fibers (e.g., lossy optical fibers with surface roughening or other features that allow the strands to guide light while emitting a portion of the guided light outwardly). Optical waveguide strands (e.g., lossy optical fibers formed from glass, transparent polymer, etc.) can be provided with light from light sources such as light-emitting diodes to display information (e.g., desired patterns of light). In some cases, it may be desirable for lossy fiber to appear dark or colored in reflection when illuminated by external light, so that the lossy fiber may match the appearance of other fibers. In these cases, the lossy fiber can include regions that are colored on the outside of the fiber and that leak little or no light, and other regions that emit light due to roughening of the fiber surface or localized adjustments to the cladding of the fiber in that region (e.g., localized cladding thinning).
In the example of
Housing 12 may be formed of plastic, glass, ceramics, fiber composites, metal (e.g., stainless steel, aluminum, metal links, interlinked chain links, etc.), elastomeric polymer (e.g., silicone and/or other stretchable plastics), fabric (e.g., fabric 14 of
In the example of
Housing 12 may have any suitable cross-sectional shape. As examples, housing 12 may have a circular cross-sectional shape, a rectangular cross-sectional shape, a rectangular cross-sectional shape with rounded corners, an oval cross-sectional shape, or other suitable cross-sectional shape. Housing 12 may have curved outer surfaces, may have planar outer surfaces, may be flat, or may have a combination of these shapes.
Housing 12 may be formed from a single unitary piece of fabric 14 (e.g., a continuous loop without seams or a string having ends that attach to each other or that attach to another structure) or may be formed with first and second ends that can be joined by a clasp (e.g., a magnetic clasp, an electrical clasp, a mechanical clasp, etc.) or other attachment structure.
One or more edge portions of device 10 may be provided with light from an edge illuminator such as edge illuminator 20. In the example of
Housing 12 may be formed using a unibody configuration in which some or all of housing 12 is machined or molded as a single structure or may be formed using multiple structures (e.g., an internal frame structure, one or more structures that form exterior housing surfaces, etc.).
In the example of
One or more edge portions of device 10 may be provided with light from an edge illuminator such as edge illuminator 20. In the example of
Light sources 44 may emit light in a single direction or in multiple directions. In the illustrative configuration of
If desired, light sources 44 may emit light through an optical structure such as optical structure 52. In one illustrative arrangement, optical structure 52 may be a light guide that helps guide light from light source 44 to edge 14E-1 or 14E-2. The light guide may be formed from a transparent polymer structure (e.g., a rigid polymer plate or a thin flexible light guide film), an optical fiber, a light guiding strand in fabric 14, and/or any other light guiding structure that guides light 38 to edges 14E-1 and 14E-2 in accordance with the principle of total internal reflection. The light guide may have curved surfaces and/or other desired three-dimensional shapes. Light-scattering features such as protrusions, recesses, and/or light-scattering particles in the light guide forming optical structure 52 may be used to help extract light 38 to form edge illumination. Optional reflective structures may be used to help recycle light that has scattered from the light guide in a direction away from the desired direction of light emission.
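Total internal reflection, on which the light guide described above relies, occurs when light inside a higher-index core strikes the boundary with a lower-index cladding at an angle (measured from the surface normal) exceeding the critical angle arcsin(n_cladding/n_core). The function below is a minimal sketch of that relationship; the example index values are illustrative, not materials specified in the disclosure.

```python
import math

def critical_angle_deg(n_core, n_cladding):
    """Return the critical angle in degrees (from the surface normal).

    Rays striking the core-cladding boundary at steeper-than-critical angles
    are totally internally reflected and remain guided. Requires the core
    index to exceed the cladding index.
    """
    if n_core <= n_cladding:
        raise ValueError("total internal reflection requires n_core > n_cladding")
    return math.degrees(math.asin(n_cladding / n_core))
```

For example, a polymer guide with index roughly 1.5 surrounded by air (index 1.0) has a critical angle near 42°, so light travelling at glancing angles along the guide stays confined until a scattering feature redirects it out.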
In another illustrative arrangement, optical structure 52 may be a diffuser that helps diffuse light 38 so that the illumination from edge illuminator 20 is continuous. Diffusers that may be used for optical structures 52 may include transparent members such as polymer or other transparent material with embedded light-scattering particles, fabric (e.g., strands 32 in fabric 14 that help diffuse light 38 as it travels towards edges 14E-1 and 14E-2), material with perforations, etc. In arrangements where strands 32 are used to form diffusers, strands 32 may be more closely interlaced in the light-diffusing regions to form optical structures 52. The emitted light 38 from light-emitting diodes 44 passes through member 52 to form edge illumination. If desired, an additional diffusing structure such as edge diffuser 50 may be formed along each of edges 14E-1 and 14E-2 to help further diffuse and blend the light from light sources 44. Diffuser 50 may be a strand of material such as a silicone fiber that extends along the length L of each edge (e.g., perpendicular to the direction in which light 38 is emitted) or may be any other suitable diffusing structure.
Light-emitting diodes 44 are located in the middle of the strip of fabric 14 and extend along the length of the strip of fabric 14 in a one-dimensional array in the example of
If desired, one or more attributes of the components and structures in edge illuminator 20 may be varied as a function of location along length L. For example, strand density (e.g., how closely strands 32 are interlaced), strand material, color of light 38, pitch of light sources 44, light-output capabilities of light sources 44, attributes of optical structures 52 and/or 50, and/or attributes of other structures in edge illuminator 20 may vary as a function of location along length L.
Interlacing equipment such as weaving equipment, knitting equipment, braiding equipment, or other suitable interlacing equipment may be used to form fabric 14. Fabric 14 may include pocket 60 (sometimes referred to as a gap, space, cavity, void, position, location, etc.) for receiving electrical components. Regions in fabric 14 that receive electrical components such as pocket 60 may be formed by creating a space or gap between portions of fabric 14 such as fabric portion 14-1 and fabric portion 14-2. The term “pocket” may be used to refer to a void between fabric portions and/or may be used to refer to a position or location between fabric portions (e.g., a position between strands of material in fabric 14, with or without an actual void).
Electrical components such as light sources 44 may be inserted into pocket 60 during the formation of fabric 14 or may be inserted into pocket 60 after fabric 14 is formed. If desired, light sources 44 may be electrically and/or mechanically connected to one or more conductive strands in pocket 60 (e.g., in arrangements where signal path 46 is formed from one or more conductive strands 32 in fabric 14).
As shown in
To give the appearance of continuous illumination, light sources 44 may emit light 38 through diffusing structures such as strands 32 having a higher density than the strands in other portions of fabric 14. Other diffusing structures such as transparent polymer or other transparent material with embedded light-scattering particles, material with perforations, and/or other diffusers may be used in addition to or instead of strands 32 in pockets 54.
Low-resolution displays such as edge illuminator 20 may be useful for providing notifications to a user without distracting the user from a present task. In particular, the color, pattern, and/or intensity of light from edge illuminator 20 may convey a sufficient amount of information so that the user understands the nature or source of the notification without providing so much information that the user is distracted by specific content within the notification. Control circuitry 16 may determine characteristics of visual output from edge illuminator 20 and/or of haptic output from haptic output devices 24 based on the importance (e.g., time-sensitivity) of the notification and/or based on the context of device 10 (e.g., based on data from sensors 22 and/or based on other contextual information such as the date, time of day, etc.). This type of context-adaptive visual output from edge illuminator 20 may be used to provide notifications that may or may not be associated with a primary display (e.g., a primary display 90 that shares a housing with edge illuminator 20) or primary device (e.g., a separate external electronic device such as device 80 having a display 90 with a two-dimensional array of pixels and/or having a head-mounted display configured to display augmented reality, virtual reality, and/or mixed reality image content).
In response to an event on a primary device 80 (e.g., a messaging notification, an incoming call notification, a social media notification, a calendar notification, a timer notification, a reminder notification, etc.), control circuitry 16 of device 10 and/or control circuitry 82 of device 80 may determine an importance of the event, as shown in block 100. This may include, for example, determining how time-sensitive the event is (e.g., whether the event is associated with a due date, timer, calendar, etc.), determining a source of the notification (e.g., which application the event is associated with such as a messaging application, calendar application, social media application, etc.), determining the identity of the person associated with the event (e.g., whether the sender of a message or email is a family member, colleague, unknown individual, etc.), determining whether any existing user preferences are associated with the event, and/or determining other information that indicates a level of importance of the event.
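The importance determination of block 100 can be sketched as a simple scoring function over the factors listed above. The factor names, weights, and event representation below are purely hypothetical assumptions for illustration; the disclosure does not specify a particular scoring scheme.

```python
# Hypothetical importance factors and weights (illustrative only).
IMPORTANCE_WEIGHTS = {
    "time_sensitive": 3,  # associated with a due date, timer, or calendar event
    "known_sender": 2,    # sender is a family member or colleague, not unknown
    "user_flagged": 2,    # matches an existing user preference for the event
}

def event_importance(event):
    """Return a coarse importance score for a notification event.

    `event` is a dict of boolean factors; each factor that is present and
    truthy contributes its weight to the total.
    """
    return sum(w for key, w in IMPORTANCE_WEIGHTS.items() if event.get(key))
```

A timer notification flagged by the user would score higher than an email from an unknown sender, which could then drive a brighter or more distinctive illumination pattern.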
During the operations of block 102, control circuitry 16 of device 10 and/or control circuitry 82 of device 80 may determine the context of device 10 (e.g., the secondary device in this example). For example, control circuitry 16 and/or control circuitry 82 may determine the location of device 10 (e.g., home, work, gym, car, etc.), the movements of device 10, whether device 10 is being used during exercise, work, school, bedtime, sleep, driving, etc., and/or may determine other information about the context in which device 10 is operating. If desired, sensors in device 10 such as a motion sensor may detect hand motions and/or finger movements being made by the hand wearing device 10.
During the operations of block 104, control circuitry 16 of device 10 and/or control circuitry 82 of device 80 may determine characteristics of visual output from edge illuminator 20 and/or haptic output from haptic output devices 24 based on the importance of the event (determined during the operations of block 100) and based on the context of device 10 (determined during the operations of block 102). This may include, for example, determining the color(s), intensity, pattern, and/or other characteristics of illumination from edge illuminator 20 and the intensity and pattern of haptic feedback from haptic output devices 24.
During the operations of block 106, control circuitry 16 of device 10 and/or control circuitry 82 of device 80 may use edge illuminator 20 to provide the selected type of visual output and may use haptic output devices 24 to provide the selected type of haptic output. If desired, device 10 and/or device 80 may monitor for user input during and/or immediately after providing visual output and/or haptic output. For example, sensors in device 10 and/or device 80 may monitor hand movements and/or finger gestures that may be used to interact with the visual or haptic output. A hand waving gesture or other user input may be used to silence or turn off the output, a pinch-to-zoom hand gesture or other user input may be used to obtain more information about the notification associated with the output, and/or other user input may be used to otherwise interact with the visual output and/or haptic output on device 10.
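The overall flow of blocks 100 through 106 can be sketched as a single policy function that maps an event's importance and the device context to visual and haptic output characteristics. The thresholds, color/pattern names, and context keys below are hypothetical placeholders chosen for this sketch, not values from the disclosure.

```python
def notify(event, context):
    """Map event importance and device context to edge-illuminator and
    haptic output settings (illustrative policy for blocks 100-106)."""
    importance = event.get("importance", 0)   # block 100: importance of event
    if context.get("sleeping"):               # block 102: context of device 10
        return {"light": None, "haptic": None}  # suppress output at bedtime
    if importance >= 5:                       # block 104: select characteristics
        output = {"light": ("red", "pulse"), "haptic": "strong"}
    elif importance >= 2:
        output = {"light": ("amber", "steady"), "haptic": "tap"}
    else:
        output = {"light": ("white", "dim"), "haptic": None}
    return output                             # block 106: drive output devices
```

An urgent calendar event might pulse red with strong haptic feedback, while a low-importance social media notification might only dim the edge illuminator white, and all output is suppressed when the context indicates sleep.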
As described above, one aspect of the present technology is the gathering and use of information such as sensor information. The present disclosure contemplates that in some instances, data may be gathered that includes personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, username, password, biometric information, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables users to have calculated control of the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In another example, users can select not to provide certain types of user data. In yet another example, users can select to limit the length of time user-specific data is maintained. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application (“app”) that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of information that may include personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of provisional patent application No. 63/292,913, filed Dec. 22, 2021, which is hereby incorporated by reference herein in its entirety.
Number | Date | Country |
---|---|---|
63292913 | Dec 2021 | US |