Embodiments relate generally to determining whether electronic devices are in a sensing state to measure physiological parameters. More particularly, the described embodiments relate to methods and systems for detecting and analyzing one or more sensing signals corresponding to light detected by the device to determine whether the device is in a sensing state, and using the sensing signals to determine physiological parameters if the device is in the sensing state.
Wearable electronic devices are increasingly used for biological signal measurements. Optical measurement of biological signals is prone to errors when light reflects from a user's skin instead of traveling through the skin. Since most measurements are performed automatically and without supervision, ensuring the reliability of measurements is particularly important. Some devices include dedicated sensors for detecting a proximity of the device to a user. However, these dedicated sensors consume precious space and power, often requiring sacrifices in device design or performance.
Embodiments of the systems, devices, methods, and apparatuses described in the present disclosure are directed to determining whether electronic devices are in a sensing state to measure physiological parameters. More particularly, the described embodiments relate to methods and systems for detecting and analyzing one or more sensing signals corresponding to light detected by the device to determine whether the device is in a sensing state, and using the sensing signals to determine physiological parameters if the device is in the sensing state.
One embodiment may take the form of a wearable electronic device that includes a device housing defining a rear surface, an optical sensing assembly, and a processing unit. The optical sensing assembly may include a light emitter adapted to emit light toward a user and a light detector adapted to detect light that has interacted with the user and output a sensing signal corresponding to the detected light. The processing unit may be operably coupled to the optical sensing assembly. The processing unit may be adapted to determine, at least partially based on the sensing signal, whether the wearable electronic device is in a sensing state in which a separation distance from the rear surface to the user is less than or equal to a maximum sensing distance, and, in response to determining that the wearable electronic device is in the sensing state, determine, based at least partially on the sensing signal, a physiological parameter of the user.
Another embodiment may take the form of a method that includes the steps of detecting, by an optical sensing assembly of a wearable electronic device, light that has interacted with a user, outputting a sensing signal corresponding to the detected light, determining, at least partially based on the sensing signal, whether the wearable electronic device is in a sensing state in which a rear surface of the wearable electronic device is contacting the user, and, in response to determining that the wearable electronic device is in the sensing state, determining a physiological parameter of the user based at least partially on the sensing signal.
Another embodiment may take the form of a method that includes the steps of performing an optical measurement including emitting light toward a user, detecting light that has interacted with the user, and determining, based on the detected light, whether a separation distance between the wearable electronic device and the user is less than or equal to a maximum sensing distance. The method further includes, in response to determining that the separation distance is less than or equal to the maximum sensing distance, determining, based at least partially on the detected light, at least one of a heart rate, a blood-oxygen saturation value, or a total hemoglobin value. The method further includes, in response to determining that the separation distance is greater than the maximum sensing distance, repeating the optical measurement.
In addition to the example aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the drawings and by study of the following description.
The disclosure will be readily understood by the following detailed description in conjunction with the accompanying drawings, wherein like reference numerals designate like structural elements.
The use of cross-hatching or shading in the accompanying figures is generally provided to clarify the boundaries between adjacent elements and also to facilitate legibility of the figures. Accordingly, neither the presence nor the absence of cross-hatching or shading conveys or indicates any preference or requirement for particular materials, material properties, element proportions, element dimensions, commonalities of similarly illustrated elements, or any other characteristic, attribute, or property for any element illustrated in the accompanying figures.
Additionally, it should be understood that the proportions and dimensions (either relative or absolute) of the various features and elements (and collections and groupings thereof) and the boundaries, separations, and positional relationships presented therebetween, are provided in the accompanying figures merely to facilitate an understanding of the various embodiments described herein and, accordingly, may not necessarily be presented or illustrated to scale, and are not intended to indicate any preference or requirement for an illustrated embodiment to the exclusion of embodiments described with reference thereto.
Reference will now be made in detail to representative embodiments illustrated in the accompanying drawings. It should be understood that the following description is not intended to limit the embodiments to one preferred embodiment. To the contrary, it is intended to cover alternatives, modifications, and equivalents as can be included within the spirit and scope of the described embodiments as defined by the appended claims.
The following disclosure relates to determining one or more physiological parameters using a wearable electronic device. A wearable electronic device (e.g., an electronic watch) may detect and analyze one or more sensing signals corresponding to light detected by the device to determine whether the device is in a sensing state (e.g., the device is within a maximum sensing distance of a user). If the wearable electronic device is in the sensing state, the device may determine one or more physiological parameters from the same signals used to determine whether the device is in the sensing state. Using the same light emitters, light detectors, and sensing signals both to determine whether the device is in a sensing state and to determine physiological parameters improves device performance by consuming less power, increases manufacturing efficiency, and reduces device size by requiring fewer device components.
As used herein, the term “maximum sensing distance” may refer to the distance between the wearable electronic device and a user that is required for a reliable measurement of one or more physiological parameters. For many physiological parameters, reliable measurement requires that substantially all or a significant portion of the light emitted by light emitter(s) of the device travels through the user's skin before it is returned and sensed by light detector(s) of the device. When a light emitter of a wearable electronic device is within a maximum sensing distance of the user, substantially all or a significant portion of the light emitted by the light emitter travels through the user's skin before it is returned and sensed by the light detector(s) of the device. As a result, reliable measurements may be taken. In contrast, when a light emitter is not within the maximum sensing distance of the user, substantially all or a significant portion of the light emitted by the light emitter is reflected back to the light detector(s) without traveling through the user's skin. Accordingly, it is more difficult or impossible to take reliable measurements.
As noted above, the wearable electronic devices herein may detect and analyze one or more signals corresponding to light detected by the device to determine whether the device is within a maximum sensing distance of the user (e.g., whether a separation distance between the device and the user is less than or equal to the maximum sensing distance). At certain wavelengths, the light traveling through the user's skin is significantly attenuated compared to the light that is reflected from the skin. As a result, the device may determine whether it is within the maximum sensing distance of the user based on a signal level (e.g., an amplitude, intensity, signal strength, etc.) of the detected signal. For example, in some cases, the device may determine that it is within the maximum sensing distance of the user if the signal level is below a predetermined threshold.
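Purely as an illustration of the threshold comparison described above, the following Python sketch checks whether a detected signal level falls at or below a predetermined threshold before treating the device as within the maximum sensing distance. The threshold value, the normalized signal units, and the function name are illustrative assumptions, not part of the described embodiments.

```python
# Minimal sketch of the signal-level threshold check described above.
# The threshold value and signal units are illustrative assumptions only.

SENSING_THRESHOLD = 0.40  # hypothetical normalized signal level


def within_max_sensing_distance(signal_level: float,
                                threshold: float = SENSING_THRESHOLD) -> bool:
    """Return True if the detected signal level suggests the device is
    within the maximum sensing distance (light attenuated by the skin)."""
    # Light that travels through the skin is strongly attenuated, so a low
    # signal level suggests the emitted light propagated through tissue
    # rather than reflecting directly back to the detector.
    return signal_level <= threshold


if __name__ == "__main__":
    print(within_max_sensing_distance(0.25))  # True: attenuated, likely through skin
    print(within_max_sensing_distance(0.90))  # False: strong reflection, device likely lifted
```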
In some cases, determining that the device is in the sensing state includes determining that multiple locations or regions of the exterior surface of the device (e.g., a rear exterior surface of the device) are within the maximum sensing distance of the user. The wearable electronic device may include multiple light emitters and/or light detectors positioned at different locations beneath an exterior surface of the device. As a result, multiple light emitters and/or light detectors may be used to generate multiple sensing signals that may be analyzed to determine whether multiple different locations or regions of the exterior surface of the device are within the maximum sensing distance of the user. This may provide more reliable physiological measurements by ensuring that the device is not tilted or otherwise in a position in which emitted light does not sufficiently propagate through the user's skin.
Additionally or alternatively, determining that the device is in the sensing state may include emitting and detecting light at multiple different wavelengths. The wearable electronic device may analyze multiple sensing signals corresponding to each of the different wavelengths to achieve a more reliable determination that one or more locations or regions of the exterior surface of the device are within the maximum sensing distance of the user. For example, in some cases, the device may determine that it is within the maximum sensing distance of the user if the signal levels of multiple sensing signals satisfy a boundary condition. This may avoid or reduce false positives from single sensing signals.
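One way such a boundary condition over multiple wavelengths might be expressed is sketched below in Python. The wavelength names, the per-wavelength thresholds, and the rule that every channel must satisfy its threshold are illustrative assumptions rather than a definition drawn from the embodiments.

```python
# Illustrative boundary condition over multiple wavelength channels.
# Wavelength names, thresholds, and the "all channels must pass" rule are assumptions.

from typing import Dict

WAVELENGTH_THRESHOLDS = {
    "green": 0.35,      # hypothetical normalized thresholds per wavelength
    "red": 0.45,
    "infrared": 0.50,
}


def satisfies_boundary_condition(signal_levels: Dict[str, float]) -> bool:
    """Return True only if every wavelength's signal level is at or below its
    threshold, reducing false positives from any single sensing signal."""
    return all(
        signal_levels.get(name, float("inf")) <= limit
        for name, limit in WAVELENGTH_THRESHOLDS.items()
    )


if __name__ == "__main__":
    print(satisfies_boundary_condition({"green": 0.2, "red": 0.3, "infrared": 0.4}))  # True
    print(satisfies_boundary_condition({"green": 0.2, "red": 0.6, "infrared": 0.4}))  # False
```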
As used herein, the term “light emitter” may refer to a spatially located source of light. A light emitter may include one or more light sources, including light-emitting diodes (LEDs), laser diodes, and the like. A light emitter may emit light in response to a signal, such as a control signal from a measurement engine or a processing unit or a current applied to the light emitter. In some cases, the wavelength of light emitted by a light emitter is not controllable, and the light emitter is used to emit light at a particular wavelength. Alternatively, the wavelength of light emitted by a light emitter may be controllable. As used herein, the term “wavelength” may refer to a single wavelength value or a relatively narrow range of wavelengths (e.g., a 2 nm, 5 nm, or 15 nm range) in which the light has substantially the same optical properties, such as color.
The term “physically coupled,” as used herein, may refer to two or more elements, structures, objects, components, parts or the like that are physically attached to one another. As used herein, “operably coupled” or “electrically coupled” may refer to two or more devices that operate with one another, communicate with one another, are in electrical connection with one another, and/or otherwise interact with one another in any suitable manner for operation and/or communication, including wired, wirelessly, or some combination thereof.
These and other embodiments are discussed below with reference to the accompanying figures.
The wearable electronic device 100 may further include one or more input devices (e.g., a crown 103, a button, etc.), one or more output devices (e.g., a display 104, a speaker, etc.), and a processing unit 105. The wearable electronic device 100 may include an enclosure 102 that defines an interior volume 108. The input device(s), the output device(s), the processing unit 105, the measurement engine 106, and the optical sensing assembly 101 may be positioned at least partially within the interior volume 108 of the enclosure 102.
Broadly, the light emitters 110 emit light and the light detectors 120 detect light. The processing unit 105 analyzes one or more sensing signals corresponding to the detected light to determine whether the wearable electronic device 100 is in a sensing state, and if so, determines one or more physiological parameters from the sensing signals. As noted above, the wearable electronic device 100 may be in a sensing state when one or more locations or regions of an exterior surface of the wearable electronic device (e.g., a rear exterior surface 100b of the wearable electronic device) are within a maximum sensing distance of the user (e.g., whether a separation distance between the device and the user is less than or equal to the maximum sensing distance). As used herein, the term “maximum sensing distance” may refer to the distance between the wearable electronic device and a user that is required for a reliable measurement of one or more physiological parameters.
The maximum sensing distance may vary for different wearable electronic devices, users, and/or physiological parameters being detected. In some cases, the maximum sensing distance is zero (e.g., the device must be contacting the user to perform a reliable measurement). In some cases, the maximum sensing distance may be 0.1 mm, 0.5 mm, 1 mm, 2 mm, or more. The maximum sensing distance may be determined based on an amount that the wearable electronic device moves relative to the user during normal use. For example, the maximum sensing distance may be equal to or slightly greater than (e.g., within 5%, 10%, 25%, or 50% of) the amount by which the distance between the wearable electronic device and the user changes during normal use.
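As a worked example of this sizing rule, the sketch below derives a candidate maximum sensing distance from an observed amount of device-to-skin movement. The 1 mm movement figure, the 25% margin, and the function name are illustrative assumptions used only to make the arithmetic concrete.

```python
# Illustrative sizing of the maximum sensing distance from observed movement.
# The observed-movement and margin values are assumptions for the example.

def maximum_sensing_distance(observed_movement_mm: float,
                             margin_fraction: float = 0.25) -> float:
    """Return a maximum sensing distance equal to or slightly greater than the
    amount the device-to-user distance changes during normal use."""
    return observed_movement_mm * (1.0 + margin_fraction)


if __name__ == "__main__":
    # If the watch lifts up to ~1 mm off the wrist during normal wear,
    # a 25% margin yields a maximum sensing distance of 1.25 mm.
    print(maximum_sensing_distance(1.0))        # 1.25
    print(maximum_sensing_distance(0.5, 0.10))  # 0.55
```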
As noted herein, the processing unit 105 may analyze sensing signal(s) received from the light detector(s) 120 to determine whether the wearable electronic device 100 is in a sensing state. The processing unit may determine whether the wearable electronic device 100 is within the maximum sensing distance of the user based on a signal level (e.g., an amplitude, intensity, signal strength, etc.) of the detected signal. For example, in some cases, the device may determine that it is within the maximum sensing distance of the user if the signal level is below a predetermined threshold.
At operation 202, a processing unit (e.g., processing unit 105) initiates an optical measurement at a wearable electronic device. In an example optical measurement, one or more light emitters (e.g., light emitter(s) 110) emit light (e.g., light 150) toward a user (e.g., a user 140). The light interacts with the user, which may include a portion of the light being absorbed by the user's tissue (e.g., skin, blood vessels, muscles, and the like) and/or a portion of the light being returned (e.g., reflected, scattered, etc.) from the user.
At operation 204, the wearable electronic device (e.g., one or more light detectors 120) detects the light that has interacted with the user. As a result of the light interacting with the user, a returned portion (e.g., a returned portion 150a) of the light travels from the user to the wearable electronic device, where it is detected by the light detector. The light detector may output a sensing signal to the processing unit in response to detecting the returned portion of the light. The sensing signal may represent a waveform of the returned portion of the light.
The light detector(s) may be capable of outputting multiple signals, each corresponding to light emitted by a different light emitter. In some cases, the processing unit uses a multiplexing technique in which emission and/or sensing of the light from each light emitter occurs at different times. In some cases, the processing unit may cause the light detector(s) to sense light from multiple emitters at the same time and use signal processing techniques to separate the signals or otherwise extract relevant information. In some cases, the optical sensing assembly 101 may include one or more physical components that allow the light detector(s) 120 to sense light from multiple emitters, including filters and the like.
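A simple way to picture the time-division multiplexing described above is the Python sketch below, in which each emitter fires in its own time slot so the shared detector's reading can be attributed to a single emitter. The emitter identifiers, slot duration, and callable interfaces are hypothetical stand-ins, not the device's actual driver interface.

```python
# Sketch of time-division multiplexing: each emitter fires in its own slot so
# the shared detector reading can be attributed to a single emitter.
# Emitter names, slot duration, and callables are illustrative assumptions.

import time
from typing import Callable, Dict


def multiplexed_measurement(emitters: Dict[str, Callable[[], None]],
                            read_detector: Callable[[], float],
                            slot_seconds: float = 0.001) -> Dict[str, float]:
    """Drive each emitter in turn and record the detector reading taken while
    only that emitter is intended to be active."""
    readings: Dict[str, float] = {}
    for name, fire in emitters.items():
        fire()                     # stand-in that would enable only this emitter
        time.sleep(slot_seconds)   # allow the detector to settle (illustrative)
        readings[name] = read_detector()
    return readings


if __name__ == "__main__":
    # Stand-in hardware hooks for the example.
    fake_levels = iter([0.21, 0.34])
    readings = multiplexed_measurement(
        emitters={"emitter_a": lambda: None, "emitter_b": lambda: None},
        read_detector=lambda: next(fake_levels),
    )
    print(readings)  # {'emitter_a': 0.21, 'emitter_b': 0.34}
```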
At operation 206, the wearable electronic device (e.g., the processing unit 105) determines, based on the detected light, whether the wearable electronic device is in a sensing state. In various embodiments, a signal level (e.g., an amplitude, a power, or an intensity) of the sensing signal may correspond to a separation distance between the wearable electronic device and the user. The wearable electronic device may determine that it is in the sensing state by determining that it is within the maximum sensing distance of the user (e.g., determining that a separation distance between the wearable electronic device and the user is less than or equal to the maximum sensing distance).
The maximum sensing distance Z may vary for different wearable electronic devices, users, and/or physiological parameters being detected. In some cases, the maximum sensing distance is zero (e.g., the device must be contacting the user to perform a reliable measurement). In some cases, the maximum sensing distance may be 0.1 mm, 0.5 mm, 1 mm, 2 mm, or more. The maximum sensing distance may be determined based on an amount that the wearable electronic device moves relative to the user during normal use. For example, the maximum sensing distance may be equal to or slightly greater than (e.g., within 5%, 10%, 25%, or 50% of) the amount by which the distance between the wearable electronic device and the user changes during normal use.
As noted herein, in some cases, determining that the wearable electronic device is in the sensing state includes determining that multiple locations or regions of the exterior surface of the device (e.g., a rear exterior surface of the device) are within the maximum sensing distance of the user. The wearable electronic device may include multiple light emitters and/or light detectors positioned at different locations beneath an exterior surface of the device. As a result, multiple light emitters and/or light detectors may be used to generate multiple sensing signals that may be analyzed to determine whether multiple different locations or regions of the exterior surface of the device are within the maximum sensing distance of the user. This may provide more reliable physiological measurements by ensuring that the device is not tilted or otherwise in a position in which emitted light does not sufficiently propagate through the user's skin.
Additionally or alternatively, determining that the device is in the sensing state may include emitting and detecting light at multiple different wavelengths. The wearable electronic device may analyze multiple sensing signals corresponding to each of the different wavelengths to achieve a more reliable determination that one or more locations or regions of the exterior surface of the device are within the maximum sensing distance of the user. For example, in some cases, the device may determine that it is within the maximum sensing distance of the user if the signal levels of multiple sensing signals satisfy a boundary condition. This may avoid or reduce false positives from single sensing signals.
At operation 208, if the wearable electronic device is in the sensing state, the method 200 proceeds to operation 210. If the wearable electronic device is not in the sensing state, the method 200 proceeds to operation 212. In some cases, operation 212 is optional, and the method 200 may return to operation 202 if the wearable electronic device is not in the sensing state.
At operation 210, the processing unit determines, based on the detected light, at least one physiological parameter of the user. The processing unit may analyze the light detected during operation 204 (the same light that is used to determine whether the wearable electronic device is in the sensing state) to determine the physiological parameter(s). In some cases, the light emitter(s) emit additional light and/or the light detector(s) detect additional light for use in determining the physiological parameter(s). Example physiological parameters include, but are not limited to, a heart rate, a blood-oxygen saturation value, a blood glucose value, a total hemoglobin value, or the like. As noted herein, the detected light may originate from multiple light emitters and/or be emitted at multiple different wavelengths. In some cases, the processing unit may analyze detected light from multiple emitters and/or detectors to determine the physiological parameter(s). In some cases, some or all of operation 210 may be performed by a device that is operably coupled to the wearable electronic device, such as a connected smartphone, a server, or another connected computing device.
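Purely as an illustration of how a physiological parameter such as heart rate might be derived from a sampled sensing signal, the sketch below estimates beats per minute from the spacing of peaks in a waveform. This is a generic peak-interval approach applied to assumed synthetic data; it is not the method of the described embodiments, and the sampling rate and test waveform are assumptions.

```python
# Generic peak-interval heart-rate estimate from a sampled sensing signal.
# The sampling rate, synthetic waveform, and peak criterion are assumptions.

import math
from typing import List


def estimate_heart_rate_bpm(samples: List[float], sample_rate_hz: float) -> float:
    """Estimate heart rate from the average spacing between local maxima."""
    peaks = [
        i for i in range(1, len(samples) - 1)
        if samples[i] > samples[i - 1] and samples[i] >= samples[i + 1]
    ]
    if len(peaks) < 2:
        return 0.0
    intervals = [(b - a) / sample_rate_hz for a, b in zip(peaks, peaks[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval


if __name__ == "__main__":
    # Synthetic 1.2 Hz (72 bpm) pulse-like waveform sampled at 50 Hz for 10 s.
    fs = 50.0
    signal = [math.sin(2 * math.pi * 1.2 * (n / fs)) for n in range(int(fs * 10))]
    print(round(estimate_heart_rate_bpm(signal, fs)))  # approximately 72
```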
At operation 212, the wearable electronic device notifies the user that the device is not in the sensing state. The notification may be provided in a graphical user interface of the wearable electronic device or another device that is operably coupled to the wearable electronic device (e.g., a smartphone). Additionally or alternatively, the wearable electronic device may take other actions besides notifying the user, including tightening a band of the wearable electronic device, adjusting a position of the wearable electronic device, changing an operating state of the wearable electronic device, or the like. Following operation 212 (or if operation 212 is omitted, following operation 208), the method 200 may return to operation 202 to initiate a subsequent optical measurement. In some cases, the wearable electronic device may wait a time period before initiating the subsequent optical measurement.
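The control flow of operations 202 through 212 can be summarized by the following Python sketch. The hardware hooks, the threshold, the retry delay, and the attempt limit are hypothetical placeholders used only to make the flow concrete; they are not drawn from the described embodiments.

```python
# Illustrative control flow for the optical measurement method (operations 202-212).
# Hardware hooks, the threshold, and the retry delay are placeholder assumptions.

import time
from typing import Callable, Optional


def run_measurement_cycle(measure: Callable[[], float],
                          in_sensing_state: Callable[[float], bool],
                          compute_parameter: Callable[[float], float],
                          notify_user: Callable[[], None],
                          retry_delay_s: float = 1.0,
                          max_attempts: int = 3) -> Optional[float]:
    """Repeat the optical measurement until the device is in the sensing state,
    then derive a physiological parameter from the same signal."""
    for _ in range(max_attempts):
        signal_level = measure()                    # operations 202-204: emit and detect light
        if in_sensing_state(signal_level):          # operations 206-208: sensing-state check
            return compute_parameter(signal_level)  # operation 210
        notify_user()                               # operation 212 (optional in some cases)
        time.sleep(retry_delay_s)                   # wait before the next measurement
    return None


if __name__ == "__main__":
    levels = iter([0.9, 0.3])  # first attempt fails, second succeeds
    result = run_measurement_cycle(
        measure=lambda: next(levels),
        in_sensing_state=lambda level: level <= 0.4,
        compute_parameter=lambda level: 72.0,   # stand-in physiological parameter
        notify_user=lambda: print("Device not in sensing state"),
        retry_delay_s=0.0,
    )
    print(result)  # 72.0
```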
The watch body 406 may include an enclosure 402. The enclosure 402 may include a front side enclosure member that faces away from a user's skin when the watch 400 is worn by a user, and a back-side enclosure member that faces toward the user's skin. Alternatively, the enclosure 402 may include a singular enclosure member, or more than two enclosure members. The one or more enclosure members may be metallic, plastic, ceramic, glass, or other types of enclosure members (or combinations of such materials).
The enclosure 402 may include a cover 402a mounted to a front side of the watch body 406 (i.e., facing away from a user's skin) and may protect a display 404 mounted within the enclosure 402. The display 404 may produce graphical output that may be viewable by a user through the cover 402a. In some cases, the cover 402a may be part of a display stack, which may include a touch sensing or force sensing capability. The display may be configured to depict a graphical output of the watch 400, and a user may interact with the graphical output (e.g., using a finger, stylus, or other pointer). As one example, the user may select (or otherwise interact with) a graphic, icon, or the like presented on the display by touching or pressing (e.g., providing touch input) on the cover 402a at the location of the graphic. As used herein, the term “cover” may be used to refer to any transparent, semi-transparent, or translucent surface made out of glass, a crystalline material (such as sapphire or zirconia), plastic, or the like. Thus, it should be appreciated that the term “cover,” as used herein, encompasses amorphous solids as well as crystalline solids. The cover 402a may form a part of the enclosure 402. In some examples, the cover 402a may be a sapphire cover. The cover 402a may also be formed of glass, plastic, or other materials.
The watch body 406 may include at least one input device or selection device, such as a button, crown, scroll wheel, knob, dial, or the like, which input device may be operated by a user of the watch 400.
The watch 400 may include one or more input devices (e.g., a crown 403, a button 409, a scroll wheel, a knob, a dial, or the like). The input devices may be used to provide inputs to the watch 400. The crown 403 and/or button 409 may be positioned along a portion of the enclosure 402, for example along a sidewall of the enclosure as shown in the accompanying figures.
The crown 403 may be user-rotatable, and may be manipulated (e.g., rotated, pressed) by a user. The crown 403 and/or button 409 may be mechanically, electrically, magnetically, and/or optically coupled to components within the enclosure 402, as one example. A user's manipulation of the crown 403 and/or button 409 may be used, in turn, to manipulate or select various elements displayed on the display, to adjust a volume of a speaker, to turn the watch 400 on or off, and so on.
In some embodiments, the button 409, the crown 403, scroll wheel, knob, dial, or the like may be touch sensitive, conductive, and/or have a conductive surface, and a signal route may be provided between the conductive portion and a circuit within the watch body 406, such as a processing unit.
The enclosure 402 may include structures for attaching the watch band 407 to the watch body 406. In some cases, the structures may include elongate recesses or openings through which ends of the watch band 407 may be inserted and attached to the watch body 406. In other cases (not shown), the structures may include indents (e.g., dimples or depressions) in the enclosure 402, which indents may receive ends of spring pins that are attached to or threaded through ends of a watch band to attach the watch band to the watch body. The watch band 407 may be used to secure the watch 400 to a user, another device, a retaining mechanism, and so on. In some cases, the watch 400 includes one or more components (e.g., motors, shape-memory alloys, etc.) for automatically or mechanically changing a tightness of the watch band 407, for example, to change a separation distance between the watch 400 and the user.
In some examples, the watch 400 may lack any or all of the cover 402a, the display 404, the button 409, or the crown 403. For example, the watch 400 may include an audio input or output interface, a touch input interface, a force input or haptic output interface, or other input or output interface that does not require the display 404, the button 409, or the crown 403. The watch 400 may also include the aforementioned input or output interfaces in addition to the display 404, the button 409, or the crown 403. When the watch 400 lacks the display, the front side of the watch 400 may be covered by the cover 402a, or by a metallic or other type of enclosure member.
The optical sensing assembly 501 may include an optical sensing assembly housing 576. The optical sensing assembly housing 576 may define one or more cavities (e.g., cavities 572a, 572b) that are defined by one or more walls 574a, 574b, 574c (e.g., light blocking walls) of the optical sensing assembly housing. One or more of the walls (e.g., wall 574a) may separate the cavities 572a, 572b. One or more of the walls (e.g., walls 574b, 574c) may at least partially surround the light emitter 510 and/or the light detector 520. The optical sensing assembly 501 may be attached to an interior surface of the cover 502b. For example, the walls 574a-c may be attached to an interior surface of the cover 502b using adhesive or any other suitable mechanism for joining the optical sensing assembly housing 576 to the cover 502b. In some cases, the walls 574a-c extend to an exterior surface of the cover 502b and define the windows 570a, 570b in the cover 502b. Additionally or alternatively, the windows 570a, 570b may be defined by masking or other treatments or techniques of the cover 502b.
In some cases, the optical sensing assembly 501 may include one or more optical elements (e.g., lenses, light films, and the like) for directing light emitted and/or detected by the optical sensing assembly.
As noted herein, in some cases, the wearable electronic devices described herein may analyze multiple sensing signals to determine whether the device is in the sensing state. The wearable electronic devices may include multiple light emitters and/or light detectors positioned at different locations beneath an exterior surface of the device.
The multiple light emitters 610a-d and/or light detectors 620a-d may be used to determine whether different locations or regions of the exterior surface of the watch 400 are within the maximum sensing distance of the user. This may provide more reliable physiological measurements by ensuring that the watch 400 is not tilted or otherwise in a position in which emitted light does not sufficiently propagate through the user's skin.
The second embodiment of the watch 400 may include components and/or functionality similar to those of the other devices described herein. As shown in the accompanying figures, the watch 400 may include an optical sensing assembly 601 that includes the light emitters 610a-d and the light detectors 620a-d.
The optical sensing assembly 601 may include an optical sensing assembly housing 676. The optical sensing assembly housing 676 may define one or more cavities (e.g., cavities 672e-g) beneath each window 670a-h that are defined by one or more walls (e.g., walls 674a-d) of the optical sensing assembly housing. In some cases, the optical sensing assembly 601 may include one or more optical elements (e.g., lenses, light films, and the like) for directing light emitted and/or detected by the optical sensing assembly.
In various embodiments, a light emitter 610a-d and a light detector 620a-d may define a sensor pair. The light emitter 610a-d of the sensor pair emits light that is detected by the light detector 620a-d of the sensor pair. The light emitters 610a-d and light detectors 620a-d may define multiple sensor pairs. A light emitter 610a-d may be a member of multiple sensor pairs in that the light emitted by the light emitter may be detected by multiple light detectors 620a-d. Similarly, a light detector 620a-d may be a member of multiple sensor pairs in that the light detector may detect light from multiple light emitters 610a-d. The respective light emitters 610a-d and/or light detectors 620a-d of each sensor pair may be positioned at different locations beneath an exterior surface of the watch 400. As a result, different sensor pairs may be used to determine whether different locations or regions of the exterior surface of the watch 400 are within the maximum sensing distance of the user. Similarly, different sensor pairs may provide different physiological data based on the different locations of the respective light emitters and light detectors of the sensor pairs, and the resulting different light paths between the respective light emitters and light detectors.
In some cases, determining that the watch 400 is in the sensing state includes determining that each sensor pair of the optical sensing assembly 601 indicates that the region of the rear exterior surface corresponding to the sensor pair is within the maximum sensing distance. Alternatively, determining that the watch 400 is in the sensing state may include determining that a majority or another threshold number of the sensor pairs of the optical sensing assembly 601 indicate that the region of the rear exterior surface corresponding to the sensor pair is within the maximum sensing distance. In some cases, the watch 400 may reject signals detected by sensor pairs that indicate the region of the watch is not within the maximum sensing distance and may use non-rejected signals to determine one or more physiological parameters. Alternatively, the watch 400 may not perform any determination of physiological parameters until all of the sensor pairs indicate that the regions corresponding to each sensor pair are within the maximum sensing distance of the user 640.
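The per-sensor-pair logic described above might be expressed as in the sketch below, where each pair reports whether its region of the rear exterior surface is within the maximum sensing distance. The pair identifiers, the "all" and "majority" policies, and the signal-rejection helper are illustrative assumptions used only to make the aggregation concrete.

```python
# Illustrative aggregation of per-sensor-pair results into a sensing-state decision.
# Pair names, the "all" vs. "majority" policies, and the rejection rule are assumptions.

from typing import Dict


def sensing_state_all(pair_within_distance: Dict[str, bool]) -> bool:
    """Sensing state only if every sensor pair reports that its region is
    within the maximum sensing distance."""
    return all(pair_within_distance.values())


def sensing_state_majority(pair_within_distance: Dict[str, bool]) -> bool:
    """Sensing state if a majority of sensor pairs report that their regions
    are within the maximum sensing distance."""
    votes = list(pair_within_distance.values())
    return sum(votes) > len(votes) / 2


def usable_signals(pair_signals: Dict[str, float],
                   pair_within_distance: Dict[str, bool]) -> Dict[str, float]:
    """Reject signals from pairs whose regions are out of range and keep the
    rest for determining physiological parameters."""
    return {name: level for name, level in pair_signals.items()
            if pair_within_distance.get(name, False)}


if __name__ == "__main__":
    status = {"pair_ab": True, "pair_bc": True, "pair_cd": False}
    print(sensing_state_all(status))       # False
    print(sensing_state_majority(status))  # True
    print(usable_signals({"pair_ab": 0.2, "pair_bc": 0.3, "pair_cd": 0.8}, status))
```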
Additionally or alternatively, determining that the device is in the sensing state may include emitting and detecting light at multiple different wavelengths. For example, the light emitters 610a-d may include two or more light sources for emitting light at different wavelengths. One or more of the light detectors 620a-d may detect light from different light emitters 610a-d and/or having different wavelengths. The watch 400 may analyze multiple sensing signals corresponding to each of the different wavelengths to achieve a more reliable determination that one or more locations or regions of the exterior surface of the watch are within the maximum sensing distance of the user 640. For example, in some cases, the watch 400 may determine that it is within the maximum sensing distance of the user 640 if the signal levels of multiple sensing signals satisfy a boundary condition. This may avoid or reduce false positives from single sensing signals.
The processing unit 702 can control some or all of the operations of the electronic device 700. The processing unit 702 can communicate, either directly or indirectly, with some or all of the components of the electronic device 700. For example, a system bus or other communication mechanism 716 can provide communication between the processing unit 702, the power source 714, the memory 704, the input device(s) 706, and the output device(s) 710.
The processing unit 702 can be implemented as any electronic device capable of processing, receiving, or transmitting data or instructions. For example, the processing unit 702 can be a microprocessor, a central processing unit (CPU), an application-specific integrated circuit (ASIC), a digital signal processor (DSP), or combinations of such devices. As described herein, the term “processing unit” is meant to encompass a single processor or processing unit, multiple processors, multiple processing units, or other suitably configured computing element or elements.
It should be noted that the components of the electronic device 700 can be controlled by multiple processing units. For example, select components of the electronic device 700 (e.g., an input device 706) may be controlled by a first processing unit and other components of the electronic device 700 (e.g., the display 712) may be controlled by a second processing unit, where the first and second processing units may or may not be in communication with each other. In some cases, the processing unit 702 may determine a biological parameter of a user of the electronic device, such as an ECG for the user.
The power source 714 can be implemented with any device capable of providing energy to the electronic device 700. For example, the power source 714 may be one or more batteries or rechargeable batteries. Additionally or alternatively, the power source 714 can be a power connector or power cord that connects the electronic device 700 to another power source, such as a wall outlet.
The memory 704 can store electronic data that can be used by the electronic device 700. For example, the memory 704 can store electrical data or content such as, for example, audio and video files, documents and applications, device settings and user preferences, timing signals, control signals, and data structures or databases. The memory 704 can be configured as any type of memory. By way of example only, the memory 704 can be implemented as random access memory, read-only memory, Flash memory, removable memory, other types of storage elements, or combinations of such devices.
In various embodiments, the display 712 provides a graphical output, for example associated with an operating system, user interface, and/or applications of the electronic device 700. In one embodiment, the display 712 includes one or more sensors and is configured as a touch-sensitive (e.g., single-touch, multi-touch) and/or force-sensitive display to receive inputs from a user. For example, the display 712 may be integrated with a touch sensor (e.g., a capacitive touch sensor) and/or a force sensor to provide a touch- and/or force-sensitive display. The display 712 is operably coupled to the processing unit 702 of the electronic device 700.
The display 712 can be implemented with any suitable technology, including, but not limited to, liquid crystal display (LCD) technology, light emitting diode (LED) technology, organic light-emitting display (OLED) technology, organic electroluminescence (OEL) technology, or another type of display technology. In some cases, the display 712 is positioned beneath and viewable through a cover that forms at least a portion of an enclosure of the electronic device 700. In various embodiments, graphical outputs of the display 712 may be responsive to estimated physiological parameters determined by the device 700. For example, the processing unit 702 may cause the display 712 to display a notification or other graphical object(s) related to the physiological parameters.
In various embodiments, the input devices 706 may include any suitable components for detecting inputs. Examples of input devices 706 include light sensors, temperature sensors, audio sensors (e.g., microphones), optical or visual sensors (e.g., cameras, visible light sensors, or invisible light sensors), proximity sensors, touch sensors, force sensors, mechanical devices (e.g., crowns, switches, buttons, or keys), vibration sensors, orientation sensors, motion sensors (e.g., accelerometers or velocity sensors), location sensors (e.g., global positioning system (GPS) devices), thermal sensors, communication devices (e.g., wired or wireless communication devices), resistive sensors, magnetic sensors, electroactive polymers (EAPs), strain gauges, electrodes, and so on, or some combination thereof. Each input device 706 may be configured to detect one or more particular types of input and provide a signal (e.g., an input signal) corresponding to the detected input. The signal may be provided, for example, to the processing unit 702.
As discussed above, in some cases, the input device(s) 706 include a touch sensor (e.g., a capacitive touch sensor) integrated with the display 712 to provide a touch-sensitive display. Similarly, in some cases, the input device(s) 706 include a force sensor (e.g., a capacitive force sensor) integrated with the display 712 to provide a force-sensitive display.
The output devices 710 may include any suitable components for providing outputs. Examples of output devices 710 include light emitters, audio output devices (e.g., speakers), visual output devices (e.g., lights or displays), tactile output devices (e.g., haptic output devices), communication devices (e.g., wired or wireless communication devices), and so on, or some combination thereof. Each output device 710 may be configured to receive one or more signals (e.g., an output signal provided by the processing unit 702) and provide an output corresponding to the signal.
In some cases, input devices 706 and output devices 710 are implemented together as a single device. For example, an input/output device or port can transmit electronic signals via a communications network, such as a wireless and/or wired network connection. Examples of wireless and wired network connections include, but are not limited to, cellular, Wi-Fi, Bluetooth, IR, and Ethernet connections.
The processing unit 702 may be operably coupled to the input devices 706 and the output devices 710. The processing unit 702 may be adapted to exchange signals with the input devices 706 and the output devices 710. For example, the processing unit 702 may receive an input signal from an input device 706 that corresponds to an input detected by the input device 706. The processing unit 702 may interpret the received input signal to determine whether to provide and/or change one or more outputs in response to the input signal. The processing unit 702 may then send an output signal to one or more of the output devices 710, to provide and/or change outputs as appropriate.
The foregoing description, for purposes of explanation, uses specific nomenclature to provide a thorough understanding of the described embodiments. However, it will be apparent to one skilled in the art that the specific details are not required in order to practice the described embodiments. Thus, the foregoing descriptions of the specific embodiments described herein are presented for purposes of illustration and description. They are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. It will be apparent to one of ordinary skill in the art that many modifications and variations are possible in view of the above teachings.
Although the disclosure above is described in terms of various exemplary embodiments and implementations, it should be understood that the various features, aspects, and functionality described in one or more of the individual embodiments are not limited in their applicability to the particular embodiment with which they are described, but instead can be applied, alone or in various combinations, to one or more of the other embodiments of the invention, whether or not such embodiments are described and whether or not such features are presented as being a part of a described embodiment. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments but is instead defined by the claims herein presented.
One may appreciate that although many embodiments are disclosed above, the operations and steps presented with respect to methods and techniques described herein are meant as exemplary and accordingly are not exhaustive. One may further appreciate that alternate step order or fewer or additional operations may be required or desired for particular embodiments.
As used herein, the phrase “at least one of” preceding a series of items, with the term “and” or “or” to separate any of the items, modifies the list as a whole, rather than each member of the list. The phrase “at least one of” does not require selection of at least one of each item listed; rather, the phrase allows a meaning that includes at a minimum one of any of the items, and/or at a minimum one of any combination of the items, and/or at a minimum one of each of the items. By way of example, the phrases “at least one of A, B, and C” or “at least one of A, B, or C” each refer to only A, only B, or only C; any combination of A, B, and C; and/or one or more of each of A, B, and C. Similarly, it may be appreciated that an order of elements presented for a conjunctive or disjunctive list provided herein should not be construed as limiting the disclosure to only that order provided.
As described above, one aspect of the present technology is determining physiological parameters, and the like. The present disclosure contemplates that in some instances this gathered data may include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, twitter IDs (or other social media aliases or handles), home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information data, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to provide haptic or audiovisual outputs that are tailored to the user. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the US, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (“HIPAA”); whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, in the case of determining physiological parameters, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an app that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing such personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data. For example, haptic outputs may be provided based on non-personal information data or a bare minimum amount of personal information, such as events or states at the device associated with a user, other non-personal information, or publicly available information.
This application is a nonprovisional of and claims the benefit under 35 U.S.C. 119(e) of U.S. Provisional Patent Application No. 63/077,457, filed Sep. 11, 2020, the contents of which are incorporated herein by reference as if fully disclosed herein.