This application is based on and claims priority under 35 U.S.C. § 119 to Korean Patent Application No. 10-2020-0015262, filed on Feb. 7, 2020, in the Korean Intellectual Property Office, and to Korean Patent Application No. 10-2020-0068451 filed on Jun. 5, 2020 in the Korean Intellectual Property Office, the disclosures of which are incorporated by reference herein in their entireties.
The following description relates to an electronic device and bio-information estimation technology of the electronic device.
Generally, methods of non-invasively measuring blood pressure without damaging a human body include cuff-based methods, which measure blood pressure using a pressurized cuff, and cuffless methods, which estimate blood pressure from a measured pulse wave. The Korotkoff-sound method is one cuff-based blood pressure measurement method, in which the pressure in a cuff wound around an upper arm is increased and blood pressure is measured by listening through a stethoscope for the sounds generated in the blood vessel while the pressure is decreased. Another cuff-based blood pressure measurement method is the oscillometric method using an automated machine, in which a cuff is wound around an upper arm, the cuff pressure is increased and then continuously measured while being gradually decreased, and blood pressure is determined based on the point at which a change in the pressure signal is large. Cuffless blood pressure measurement methods generally include a method of measuring blood pressure by calculating a pulse transit time (PTT) and a pulse wave analysis (PWA) method of estimating blood pressure by analyzing the shape of a pulse wave.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Additional aspects will be set forth in part in the description which follows and, in part, will be apparent from the description, or may be learned by practice of the presented embodiments.
According to an aspect of an example embodiment, an electronic device may include a main body; a sensor part disposed on a side of the main body; and a processor configured to control a display to display a first graphical object related to a contact state of a finger based on information related to the contact state of the finger received from the sensor part before measurement of a bio-signal; and control the display to display a second graphical object related to a contact force of the finger based on information related to the contact force of the finger received from the sensor part during measurement of the bio-signal.
The sensor part may include an optical sensor comprising a light source configured to emit light to the finger in contact with a finger contact surface and a photodetector configured to detect light scattered or reflected from the finger, and a force sensor configured to measure the contact force based on the finger contacting the finger contact surface.
In response to receiving a request for estimating bio-information, the processor is further configured to control the display to display the first graphical object representing an appearance of the main body and the sensor part, and a third graphical object representing an appearance of the finger that is in normal contact with the sensor part in the first graphical object.
The processor is further configured to control the display to display the first graphical object, and to repeatedly display or eliminate the third graphical object one or more times after a predetermined period of time.
The processor is further configured to determine whether the contact state of the finger is normal based on the information related to the contact state received from the sensor part.
The processor is further configured to control the display to display a third graphical object to induce a user to exert a force with the finger toward the sensor part based on the contact state being normal.
The third graphical object represents an appearance of the finger that repeatedly moves from a predetermined position spaced apart from the sensor part of the main body to a position of the sensor part.
The third graphical object includes an arrow directed toward the sensor part.
The processor is further configured to control the display to display a fourth graphical object indicating that the contact state is normal based on the contact state being normal.
The processor is further configured to display at least one of a fifth graphical object indicating that the contact state is abnormal, a sixth graphical object visually displaying a reason for the contact state being abnormal, and text describing a reason for the contact state being abnormal based on the contact state being abnormal.
The processor is further configured to display a seventh graphical object representing a range of a reference contact force that the finger is to exert on the sensor part based on the contact state being normal.
In response to receiving the contact force from the sensor part, the processor is further configured to control the display to display an eighth graphical object representing the contact force.
The processor is further configured to control the display to display the seventh graphical object, and display a gamified screen in which the eighth graphical object moves along the seventh graphical object in response to the contact force being received in real-time from the sensor part.
The processor is further configured to control a speaker to output a warning sound or control the display to display a ninth graphical object to warn a user based on the contact force being outside of the range of the reference contact force.
The processor is further configured to extract a feature based on the bio-signal measured by the sensor part, and estimate bio-information based on at least one of the extracted feature and the contact force.
According to an aspect of an example embodiment, a method of estimating bio-information which is performed by an electronic device comprising a sensor part disposed on a side of a main body and a processor may include acquiring, by the sensor part, a contact state of a finger before measurement of a bio-signal; controlling a display, by the processor, to display a first graphical object related to the contact state of the finger; acquiring, by the sensor part, a contact force of the finger during the measurement of the bio-signal; and controlling the display, by the processor, to display a second graphical object related to the contact force of the finger.
The method may include receiving a request for estimating bio-information; and controlling the display to display the first graphical object representing an appearance of the main body including the sensor part and a third graphical object representing an appearance of the finger that is in normal contact with the sensor part in the first graphical object.
The controlling the display to display the first graphical object related to the contact state comprises determining whether the contact state of the finger is normal based on the contact state.
The controlling the display to display the first graphical object related to the contact state comprises controlling the display to display a third graphical object to induce a user to exert a force with the finger toward the sensor part based on the contact state being normal.
The controlling the display to display the first graphical object related to the contact state comprises controlling the display to display at least one of a fifth graphical object indicating that the contact state is abnormal, a sixth graphical object visually displaying a reason for the contact state being abnormal, and text describing a reason for the contact state being abnormal based on the contact state being abnormal.
The controlling the display to display the second graphical object related to the contact force comprises displaying a seventh graphical object representing a range of a reference contact force that the finger is to exert on the sensor part based on the contact state being normal.
The controlling the display to display the second graphical object related to the contact force comprises, in response to receiving the contact force from the sensor part, controlling the display to display an eighth graphical object representing the contact force.
The method may include acquiring a bio-signal by the sensor part; and estimating bio-information based on the bio-signal and the contact force.
An electronic device may include a main body; a sensor part disposed on a side of the main body; and a processor provided inside the main body, electrically connected to the sensor part, and configured to control the sensor part, and process data received from the sensor part, wherein the sensor part comprises a housing disposed to be partially externally exposed from the side of the main body, a finger contact surface formed on an exposed surface of the housing to allow a finger to contact the finger contact surface, and an optical sensor disposed inside the housing.
The finger contact surface includes a convexly curved shape along a direction parallel to a length direction of the finger that is placed on and in contact with the finger contact surface.
The finger contact surface includes a convexly curved shape along a direction perpendicular to a length direction of the finger, or includes a shape having a flat top and curved surfaces on both sides.
The finger contact surface comprises a first light transmissive region and second light transmissive region that are formed on sides of the finger contact surface, and a third light transmissive region formed between the first light transmissive region and the second light transmissive region.
The optical sensor comprises a first light source, a second light source, and a photodetector provided between the first light source and the second light source.
The housing comprises a first light path configured to direct light emitted from the first light source toward the finger through the first light transmissive region, a second light path configured to direct light emitted from the second light source toward the finger through the second light transmissive region, and a third light path configured to direct light scattered or reflected from the finger toward the photodetector through the third light transmissive region.
The housing further comprises partition walls formed between the first light path and the second light path, and between the second light path and the third light path.
The electronic device further comprises a force sensor provided in the housing, and configured to measure a contact force applied by the finger to the finger contact surface.
The processor is configured to control the sensor part to operate in a general mode or in a bio-information estimation mode; and, based on the sensor part being in the general mode, switch a mode of the sensor part to the bio-information estimation mode in response to a user manipulation of the sensor part or an input of a request for estimating bio-information through a display mounted in the main body.
Other features and aspects will be apparent from the following detailed description, the drawings, and the claims.
The above and other aspects, features, and advantages of certain embodiments of the present disclosure will be more apparent from the following description taken in conjunction with the accompanying drawings, in which:
Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements, features, and structures may be exaggerated for clarity, illustration, and convenience.
Details of example embodiments are provided in the following detailed description with reference to the accompanying drawings. The disclosure may be understood more readily by reference to the following detailed description of example embodiments and the accompanying drawings. The disclosure may, however, be embodied in many different forms and should not be construed as being limited to the embodiments set forth herein. Rather, these embodiments are provided so that the disclosure will be thorough and complete and will fully convey the concept of the invention to those skilled in the art, and the disclosure will only be defined by the appended claims. Like reference numerals refer to like elements throughout the specification.
It will be understood that, although the terms “first,” “second,” etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. Also, the singular forms of terms are intended to include the plural forms of the terms as well, unless the context clearly indicates otherwise. In the specification, unless explicitly described to the contrary, the word “comprise,” and variations such as “comprises” or “comprising,” will be understood to imply the inclusion of stated elements but not the exclusion of any other elements. Terms such as “unit” and “module” denote units that process at least one function or operation, and they may be implemented by using hardware, software, or a combination of hardware and software.
The electronic device 100 according to example embodiments may be a smart watch or a smart band-type wearable device. However, the electronic device 100 is not limited thereto, and may be a mobile device, such as a smartphone or a tablet personal computer (PC).
Referring to
The main body MB may include modules for performing general functions of the electronic device 100 and a sensor part 10 for estimating bio-information. A battery may be embedded in the main body MB or the strap ST to supply power to various modules. The strap ST may be connected to the main body MB. The strap ST may be flexible so as to be bent around a user's wrist. The strap ST may include a first strap and a second strap that is separated from the first strap. Respective ends of the first strap and the second strap may be connected to each end of the main body MB, and the first strap and the second strap may be fastened to each other using fastening means formed on the other sides thereof. In this case, the fastening means may be formed as Velcro fastening, pin fastening, or the like, but is not limited thereto. In addition, the strap ST may be formed as one integrated piece, such as a band, which is not separated into pieces.
A display DP may be disposed on a top surface of the main body MB to visually display various types of information. The display DP may include a touch screen panel capable of receiving a touch input of a user.
The sensor part 10 may be disposed in the form of a button on a side of the main body MB. The sensor part 10 may operate in a bio-information estimation mode and in a general mode under the control of a processor. When operating in the bio-information estimation mode, the sensor part 10 may acquire force information applied by an object to the sensor part 10 when the object is in contact with the sensor part 10. Also, when the object is in contact with the sensor part 10, the sensor part 10 may acquire light information reflected or scattered from the object. When operating in the general mode, the sensor part 10 may perform a user interface function for controlling general functions of the electronic device 100, for example, selection/execution of an application, adjustment of a graphical user interface (GUI) of the display DP, and the like.
Referring to
The housing HS may have a part in the form of a button, which is externally exposed through the side of the main body MB. For example, a supporter SP inside the main body MB may support the housing HS from at least one of the periphery and the lower end of the housing HS. In the embodiment of
The housing HS may include a finger contact surface 11 which contacts a finger that is placed on and in contact with the finger contact surface 11.
A first-axis (A-A′) cross-section of the finger contact surface 11 may be convexly curved in an outward direction of the main body MB. For example, the first-axis cross-section (A-A′ cross-section) of the finger contact surface 11 may have a shape in which the height of the cross-section gradually decreases as the distance from the center of the finger contact surface 11 increases, as shown in (1) of
In another example, the first-axis cross-section of the finger contact surface 11 may have a shape in which the cross-section is horizontal in a given region around the center of its upper portion and the height gradually decreases thereafter as the distance from the center increases. For example, the first-axis cross-section may have a shape in which the height gradually decreases in the form of a curve, as shown in (2) of
In another example, the first-axis cross-section of the finger contact surface 11 may be a plane. For example, as shown in (4) of
The examples of the shape of the first-axis cross-section of the finger contact surface 11 may also be applied as examples of the shape of a second-axis cross-section (B-B′ cross-section). The first-axis cross-section and the second-axis cross-section may have the same shape or different shapes. For example, the first-axis cross-section and the second-axis cross-section may both have the same shape as (2) of
When the finger contact surface 11 forms a curved surface, pressing the finger with the same force produces a deeper deformation of the finger than a flat surface would. Accordingly, the user can produce the same deformation of the finger by applying less force to the finger contact surface 11.
Referring back to
The optical sensor 20 may be disposed inside the housing HS. However, the embodiment is not limited thereto, and the optical sensor 20 may be disposed at a lower end outside the housing HS.
The optical sensor 20 may include light sources 21a and 21b that irradiate a finger when the finger is placed on and in contact with the finger contact surface 11, and a photodetector 22 that detects light scattered or reflected by the tissue on the surface or inside of the finger that is irradiated by the light sources 21a and 21b.
The light sources 21a and 21b may include a first light source 21a and a second light source 21b disposed on both sides of a substrate of the optical sensor 20 as illustrated. However, the number of light sources is not limited. In this case, the light sources 21a and 21b may include at least one of a light-emitting diode (LED), a laser diode, and a phosphor, but are not limited thereto.
The first light source 21a and the second light source 21b may be configured to emit light of the same wavelength or of different wavelengths from each other. For example, both of the first light source 21a and the second light source 21b may emit light of an infrared (IR) wavelength band or a green wavelength band. Alternatively, one of the first light source 21a and the second light source 21b may emit light of an infrared wavelength and the other may emit light of a green wavelength. In addition, each of the light sources 21a and 21b may include a plurality of LEDs, and the plurality of LEDs may all be configured to emit light of the same wavelength, or some of the plurality of LEDs may be configured to emit light of different wavelengths. For example, the light source 21a may include an IR LED which emits light of an infrared wavelength and a green LED which emits light of a green wavelength, and the light source 21b may also include an IR LED and a green LED.
The photodetector 22 may be interposed between the first light source 21a and the second light source 21b on the substrate of the optical sensor 20. The photodetector 22 may be a complementary metal-oxide semiconductor (CMOS) image sensor, but is not limited thereto; for example, the photodetector 22 may include a photodiode, a phototransistor (PTr), a charge-coupled device (CCD) image sensor, and the like. When the light scattered or reflected by the finger is detected, the photodetector 22 may convert the intensity of the light into electrical digital light signal data and transmit the digital light signal data to a processor.
In addition, the force sensor 30 may be disposed inside of the housing HS or at the bottom outside of the housing HS. The force sensor 30 may be laminated on the bottom or the top of the optical sensor 20. The force sensor 30 may measure a pressing force of a finger in contact with the finger contact surface 11. For example, the force sensor 30 may include a strain gauge, and measure the magnitude of the force at which the user presses the sensor part 10.
Also, referring to
A processor 40 may be embedded in the main body MB of the electronic device 100. The processor 40 may be electrically connected to the sensor part 10. The processor 40 may control the optical sensor 20 and the force sensor 30, receive measured data from the optical sensor 20 and the force sensor 30, and process the received data.
The processor 40 may control the sensor part 10 to operate in a bio-information estimation mode or in a general mode.
For example, in the general mode, the processor 40 may receive a command that the user inputs by manipulating the sensor part 10 and process the received command. For example, when the user manipulates the sensor part 10 in the general mode or requests execution of a bio-information estimation application through the display DP capable of receiving touch input, the processor 40 may execute the bio-information estimation application and output an interface related to the application to the display DP.
When a request for estimating bio-information is received according to manipulation of the sensor part 10 in the general mode or manipulation of the display DP, the processor 40 may switch the mode of the sensor part 10 to the bio-information estimation mode and control the electrically connected light sources 21a and 21b, photodetector 22, and force sensor 30. For example, the processor may control the intensity of light, duration of light, and on/off statuses of the light sources 21a and 21b and power supply to the force sensor 30.
When the processor 40 receives light signal data from the photodetector 22 and contact force data from the force sensor 30 in the bio-information estimation mode, the processor 40 may process the light signal data and the contact force data by executing, for example, a predefined bio-information estimation algorithm. For example, the processor 40 may monitor an environment for measuring a bio-signal by using the light signal data and/or the contact force data, thereby guiding the user to maintain a normal measurement environment, and may estimate bio-information using a measured bio-signal.
The processor 40 may output a data processing result using various output modules of the electronic device 100, for example, the display DP, a speaker, and the like. The processor 40 may visually display various graphical objects that guide the environment for measuring a bio-signal on the display DP, in which case the various graphical objects may be provided as a gamified screen or in the form of various graphs, so as to intuitively arouse a user's interest.
Referring to
The sensor part 610 may include an optical sensor 611 and a force sensor 612.
The optical sensor 611 may include one or more light sources configured to emit light to a finger when the finger comes in contact with a finger contact surface, and a photodetector configured to detect light scattered or reflected from the surface and/or internal tissue of the finger irradiated by the light sources. The one or more light sources may emit light of different wavelengths from each other.
The force sensor 612 may measure a contact force applied by the finger in contact with the finger contact surface to the finger contact surface.
The processor 620 may include a sensor controller 621 and a data processor 622.
The sensor controller 621 may control the sensor part 610 to operate in a general mode or in a bio-information estimation mode. For example, when the electronic device 600 is driven, the sensor controller 621 may control the sensor part 610 in the general mode. When the user manipulates a button of the sensor part 610 in the general mode or requests estimation of bio-information by performing an action, such as touch/drag of a display, the mode of the sensor part 610 may be switched to the bio-information estimation mode. In addition, when the user inputs a predefined gesture through a camera module mounted in the electronic device 600 or inputs a voice command through a microphone mounted in the electronic device 600, the mode of the sensor part 610 may be switched to the bio-information estimation mode.
When a request for estimating bio-information is received and accordingly the mode of the sensor part 610 is switched to the bio-information estimation mode, the sensor controller 621 may, for example, drive the light sources of the optical sensor 611 and control the intensity of electric current, duration, or the like, of the light sources. Also, the sensor controller 621 may supply power to various modules including the force sensor 612.
When the sensor part 610 measures a bio-signal, the data processor 622 may monitor the measurement and guide the user so that the bio-signal is measured normally.
For example, when the sensor controller 621 switches the mode of the sensor part 610 to the bio-information estimation mode as described above, the data processor 622 may display a graphical object to induce the user to bring his/her finger properly into contact with the finger contact surface. Here, the graphical object may include, but is not limited to, text, characters, icons, images, figures, and the like.
Referring to
For example, the data processor 622 may simultaneously display the first graphical object 811 and the second graphical object 812 on the display by generating one graphical image including the first graphical object 811 and the second graphical object 812. In another example, the second graphical object 812 and/or the text 813 may repeatedly appear and disappear once or more at specific time intervals after a predetermined period of time while the first graphical object 811 is being displayed.
In addition, when the user touches the finger contact surface of the sensor part 610 with the finger according to the guidance and accordingly contact state information is received from the optical sensor 611, the data processor 622 may determine whether the contact state of the finger is normal. For example, based on the intensity of a light signal received from the optical sensor 611, image data, fingerprint data, and the like, the data processor 622 may determine the contact state, such as whether the finger is in contact, the contact position, an initial contact force of the finger received from the force sensor 612, or the like.
For example, when a predefined criterion is satisfied, for example, when the force sensor 612 measures a contact force that is greater than or equal to a predefined threshold value, measures a contact force for a period greater than or equal to a threshold period, or measures a contact force greater than or equal to the predefined threshold value for a period greater than or equal to a threshold period, the data processor 622 may determine that the contact state is normal. However, the embodiment is not limited thereto.
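The combined criterion above (a contact force that is at or above a threshold value for at least a threshold period) can be sketched as follows. This is a minimal illustration, not taken from the disclosure: the function name, the threshold value, and the use of a consecutive-sample count as a stand-in for the threshold period are all assumptions.

```python
def is_contact_normal(force_samples, force_threshold=0.5, min_samples=10):
    """Return True when the contact force stays at or above a predefined
    threshold value for at least a threshold period, approximated here as
    a minimum number of consecutive samples.

    force_threshold and min_samples are illustrative assumptions.
    """
    consecutive = 0
    for force in force_samples:
        if force >= force_threshold:
            consecutive += 1
            if consecutive >= min_samples:
                return True
        else:
            # A drop below the threshold interrupts the required period.
            consecutive = 0
    return False
```

A device might feed this function a short sliding window of force-sensor readings and display the fourth or fifth graphical object depending on the result.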
When it is determined that the contact state is normal, the data processor 622 may display a graphical object on the display to induce the measurement of a bio-signal. For example, as shown in
In addition, a third graphical object 823 may be displayed to induce the user to exert a force with the finger toward the sensor part SB. In this case, the third graphical object 823 may include an arrow superimposed on the second graphical object 822 as illustrated. Moreover, in another example, the third graphical object 823 may be modified from the second graphical object 822 such that an index finger repeatedly moves from a predetermined position spaced apart from the sensor part SB to the sensor part SB.
In addition, as shown in
When it is determined that the contact state is abnormal, the data processor 622 may display a graphical object to induce the user to bring his/her finger again into contact with the finger contact surface. For example, as shown in
When it is determined that the contact state is normal, the data processor 622 may guide a contact force so that the user presses the sensor part 610 with his/her finger at an appropriate force while a bio-signal is being measured.
For example,
Referring to
Referring to
Referring to
Referring to
For example, the eighth graphical object 1220 may move upward in the game screen as the actual contact force received from the force sensor 612 increases, and may move downward as the actual contact force decreases. In this case, the seventh graphical object may include a game item 1230 that provides the user with a benefit, such as a game score, when the actual contact force received from the force sensor 612 is maintained normally for a predetermined period of time. In addition, when the actual contact force received from the force sensor 612 falls outside of the range of the reference contact force, the data processor 622 may display a ninth graphical object 1240 on the display to give a visual warning as shown in
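The mapping from measured contact force to the vertical position of the moving object, and the out-of-range warning check, might be sketched as below. The function names, the screen coordinate convention (y increases downward), and the clamping behavior are illustrative assumptions.

```python
def force_to_screen_y(force, ref_min, ref_max, screen_height=100):
    """Map a contact force to a vertical screen coordinate for the moving
    graphical object: a stronger force moves the object up (smaller y).
    Forces are clamped to the reference range for drawing purposes."""
    clamped = max(ref_min, min(ref_max, force))
    ratio = (clamped - ref_min) / (ref_max - ref_min)
    return round(screen_height * (1.0 - ratio))

def needs_warning(force, ref_min, ref_max):
    """A warning indication is triggered when the measured contact force
    falls outside of the range of the reference contact force."""
    return force < ref_min or force > ref_max
```

For instance, with a hypothetical reference range of 1.0 to 3.0 units, a mid-range force of 2.0 would place the object at mid-screen, while a force of 0.5 would trigger the warning.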
In the foregoing description, the examples in which the data processor 622 sequentially displays the graphical objects of
For example, when a request for estimating bio-information is received, the graphical objects of
The data processor 622 may receive a bio-signal from the optical sensor 611 and preprocess the received bio-signal. In this case, the bio-signal may include a photoplethysmogram (PPG) signal, an impedance plethysmogram (IPG) signal, a pressure wave signal, a video plethysmogram (VPG) signal, and the like. For example, when the bio-signal is received, the data processor 622 may remove noise by performing, for example, band-pass filtering at 0.4-10 Hz. Alternatively, bio-signal correction may be performed through fast Fourier transform-based reconstruction of the bio-signal. However, the embodiment is not limited thereto.
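As a sketch of the band-pass preprocessing step, the 0.4-10 Hz filtering could be implemented as follows. The Butterworth design, the filter order, and the zero-phase filtering are common signal-processing choices assumed here; the description does not mandate a particular filter design.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_bio_signal(signal, fs, low=0.4, high=10.0, order=2):
    """Band-pass filter a raw bio-signal (e.g., a PPG signal) at 0.4-10 Hz
    to remove baseline drift and high-frequency noise.

    fs is the sampling rate in Hz. Cutoffs are normalized to the Nyquist
    frequency as scipy.signal.butter expects."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    # Zero-phase filtering avoids shifting the timing of pulse-wave features.
    return filtfilt(b, a, signal)
```

Applied to a raw signal containing a DC offset and slow drift, the output retains the pulsatile component in the passband while the offset and drift are removed.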
In addition, the data processor 622 may estimate bio-information on the basis of data received from the optical sensor 611 and the force sensor 612 by performing an algorithm for estimating bio-information. In this case, the bio-information may include, but is not limited to, mean blood pressure, systolic blood pressure, diastolic blood pressure, a vascular age, arterial stiffness, an aortic artery pressure waveform, a vascular elasticity, a stress index, and a fatigue level.
For example, the data processor 622 may extract features including a peak amplitude value, a time of a peak amplitude point, pulse waveform components, the area of a predetermined section of a bio-signal, and the like, from the bio-signal, and estimate the bio-information by combining one or more extracted features. In this case, the bio-information may be estimated by using a predefined bio-information estimation model. The bio-information estimation model may be defined as various linear or non-linear combination functions, such as addition, subtraction, division, multiplication, logarithmic value, regression equation, and the like, with no specific limitation.
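By way of illustration, the feature extraction and a linear estimation model might look like the sketch below. The particular features chosen, and the weights and bias of the model, are hypothetical stand-ins for calibration values the disclosure leaves unspecified.

```python
import numpy as np

def extract_features(pulse, fs):
    """Extract example features from one pulse waveform: the peak amplitude
    value, the time of the peak amplitude point, and the area of the section
    of the signal up to the peak (rectangle-rule approximation)."""
    peak_idx = int(np.argmax(pulse))
    peak_amp = float(pulse[peak_idx])
    peak_time = peak_idx / fs
    area_to_peak = float(np.sum(pulse[:peak_idx + 1]) / fs)
    return [peak_amp, peak_time, area_to_peak]

def estimate_bio_information(features, weights, bias):
    """Combine the extracted features with a predefined estimation model;
    here a linear combination, one of the function forms named above.
    weights and bias are hypothetical calibration constants."""
    return float(np.dot(features, weights) + bias)
```

A nonlinear model (e.g., one including logarithmic terms or a regression equation) could be substituted for the linear combination without changing the feature-extraction step.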
In another example, the data processor 622 may acquire a contact pressure on the basis of the contact force received from the force sensor and the area of the finger contact surface of the sensor part 610, and estimate a blood pressure on the basis of oscillometry based on a maximum peak point of the bio-signal and the contact pressure.
The data processor 622 may acquire features for estimating blood pressure from the acquired oscillometric envelope OW. For example, the features for estimating blood pressure may include an amplitude value MA at a maximum peak point, a contact pressure value MP at the maximum peak point, and contact pressure values SP and DP at points, on the left and right sides of the maximum peak point, whose amplitudes correspond to predetermined proportions (e.g., 0.5 to 0.7) of the amplitude value MA at the maximum peak point. When the features are acquired, the data processor 622 may estimate blood pressure by applying the features to a predefined blood pressure estimation model.
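A sketch of extracting these envelope features is shown below. The specific ratios, the assignment of the higher-pressure side to SP and the lower-pressure side to DP, and the function names are illustrative assumptions in the spirit of conventional oscillometry, not values fixed by the disclosure.

```python
def oscillometric_features(pressures, amplitudes, sys_ratio=0.55, dia_ratio=0.70):
    """Extract MA/MP/SP/DP from an oscillometric envelope given as parallel
    lists of contact pressure and oscillation amplitude (pressure increasing)."""
    peak = max(range(len(amplitudes)), key=lambda i: amplitudes[i])
    ma, mp = amplitudes[peak], pressures[peak]

    def first_below(index_range, target):
        # First pressure, scanning away from the peak, whose amplitude
        # has dropped to the target fraction of MA.
        for i in index_range:
            if amplitudes[i] <= target:
                return pressures[i]
        return None

    dp = first_below(range(peak, -1, -1), dia_ratio * ma)           # lower-pressure side
    sp = first_below(range(peak, len(amplitudes)), sys_ratio * ma)  # higher-pressure side
    return {"MA": ma, "MP": mp, "SP": sp, "DP": dp}
```

The returned dictionary can then be fed into whatever blood pressure estimation model is predefined.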
Referring to
The data processor 622 may display graphical objects related to a contact state, a contact force, and the like, through a display as described above. In addition, when a bio-information estimate value is obtained, the data processor 622 may visually display the obtained bio-information estimate value through the display. In this case, when a bio-information estimation result falls out of a normal range, the data processor 622 may visually output alarm/warning information. Alternatively, the data processor 622 may non-visually output warning information related to a contact state, a contact force, and a bio-information estimate value by voice or through a non-visual output means, such as a haptic device.
The storage 710 may store reference information for estimating bio-information and a processing result of the sensor part 610 and/or the processor 620. In this case, the reference information may include user information, such as a user's age, gender, health condition, and the like, a normal contact state, such as a contact position of a finger or the like, a condition for driving a light source, a reference contact force, a bio-information estimation model, and the like. However, the reference information is not limited to these examples.
The storage 710 may include at least one type of storage medium of a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, secure digital (SD) or extreme digital (XD) memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, and an optical disk, but is not limited thereto.
The communication interface 720 may communicate with an external device under the control of the processor 620 and transmit and receive various types of data related to bio-information estimation. For example, the communication interface 720 may transmit the processing result of the processor 620 to the external device, so that the external device performs management of bio-information history for the user, monitoring of the user's health condition, output of bio-information history and the health condition monitoring result, and the like. In this case, the external device may include, but is not limited to, a smartphone, a tablet PC, a desktop computer, a notebook computer, and devices of a medical institution including a cuff-based blood pressure measurement device.
In another example, the communication interface 720 may receive a bio-information estimation model required for estimating bio-information, characteristic information of a user, and the like from the external device. The received information may be stored in the storage 710.
The communication interface 720 may communicate with the external device by using Bluetooth communication, Bluetooth low energy (BLE) communication, near field communication (NFC), wireless local area network (WLAN) communication, ZigBee communication, infrared data association (IrDA) communication, wireless fidelity (Wi-Fi) Direct (WFD) communication, ultra-wideband (UWB) communication, Ant+ communication, Wi-Fi communication, radio frequency identification (RFID) communication, 3G communication, 4G communication, and/or 5G communication. However, these are merely examples, and the embodiment is not limited thereto.
The method of
First, when a finger is in contact with the sensor part, the electronic device may acquire a contact state of the finger through the sensor part (operation 1310). Upon receiving a request for estimating bio-information, the electronic device may display a graphical object on a display to guide the finger to be in contact normally. In this case, the graphical object may include a graphical object representing the appearance of the main body and a graphical object representing the appearance of the finger to guide the finger to be in normal contact with the sensor part.
Then, based on information on the contact state of the finger acquired in operation 1310, it may be determined whether or not the contact state is normal (operation 1320). For example, whether the contact state is normal or abnormal may be determined by analyzing whether a correct measurement site of the finger is in contact with the sensor part, whether an initial contact force is within a preset threshold range, whether a contact force is measured for a threshold period of time, or the like.
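The normal/abnormal decision of operation 1320 might be expressed as the conjunction of the checks listed above; the threshold values and the function name below are hypothetical placeholders, not thresholds stated in the disclosure.

```python
def contact_state_ok(site_ok, initial_force_n, force_duration_s,
                     force_range=(0.3, 1.0), min_duration_s=0.5):
    """Return True only when every contact check passes: a correct
    measurement site, an initial force inside the preset threshold range,
    and a force sustained for at least the threshold period of time.
    The numeric defaults are illustrative assumptions."""
    return (site_ok
            and force_range[0] <= initial_force_n <= force_range[1]
            and force_duration_s >= min_duration_s)
```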
Then, when it is determined that the contact state is abnormal (operation 1320—NO), a graphical object indicating that the contact state is abnormal may be displayed in order to guide the contact state to the normal state (operation 1330).
When it is determined that the contact state is normal (operation 1320—YES), a graphical object may be displayed to induce the user to apply pressure to the sensor part with a finger for measuring a bio-signal (operation 1340). For example, a graphical object representing a finger may repeatedly blink at a position of the sensor part, or an arrow may be displayed, so that the user may easily recognize that the finger is to be pressed against the sensor part.
Then, while a contact force of the finger is acquired (operation 1350), a graphical object related to the contact force may be displayed at the same time to induce the user to press the sensor part with a predefined normal force (operation 1360). For example, a graphical object related to a reference contact force with which the finger should press the sensor part and a graphical object representing an actual contact force measured through the force sensor may be displayed. In this case, the graphical objects may be presented as a gamified screen to induce the user to maintain the contact force while arousing the user's interest.
In addition, a bio-signal may be obtained through the sensor part while the user changes the contact force with the finger according to the guidance for the contact force (operation 1370).
Then, bio-information may be estimated based on the bio-signal and the contact force (operation 1380). For example, as described above, a feature of a maximum peak point may be extracted from the bio-signal, and blood pressure may be estimated using the extracted feature and a bio-information estimation model, or a contact pressure may be obtained based on the contact force and blood pressure may be estimated based on oscillometry.
Then, a bio-information estimation result may be output (operation 1390). The bio-information estimation result may be visually output to the display, or non-visually output by using a speaker, a haptic module, or the like.
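The flow of operations 1310 through 1390 can be sketched end to end as follows; the `device` object and all of its method names are hypothetical stand-ins for the sensor part, display, and data processor described above, not an API from the disclosure.

```python
def run_measurement(device):
    """High-level sketch of operations 1310-1390."""
    state = device.read_contact_state()            # 1310: acquire contact state
    while not device.contact_is_normal(state):     # 1320: normal contact?
        device.show_contact_guide()                # 1330: guide to normal state
        state = device.read_contact_state()
    device.show_press_guide()                      # 1340: induce finger pressure
    forces, signal = [], []
    while device.measuring():
        f = device.read_contact_force()            # 1350: acquire contact force
        device.show_force_feedback(f)              # 1360: display force guidance
        forces.append(f)
        signal.append(device.read_bio_signal())    # 1370: obtain bio-signal
    result = device.estimate(signal, forces)       # 1380: estimate bio-information
    device.show_result(result)                     # 1390: output the result
    return result
```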
The example embodiments can be implemented as computer-readable code stored in a non-transitory computer-readable medium. Code and code segments constituting the computer program can be inferred by a computer programmer skilled in the art. The non-transitory computer-readable medium includes all types of record media in which computer-readable data are stored.
Examples of the non-transitory computer-readable medium include a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device. Further, the record medium may be implemented in the form of a carrier wave, such as in Internet transmission. In addition, the non-transitory computer-readable medium may be distributed over network-connected computer systems, so that computer-readable code may be stored and executed in a distributed manner.
A number of examples have been described above. Nevertheless, it will be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.
Number | Date | Country | Kind
---|---|---|---
10-2020-0015262 | Feb. 7, 2020 | KR | national
10-2020-0068451 | Jun. 5, 2020 | KR | national