This nonprovisional application is based on Japanese Patent Application No. 2022-136122 filed on Aug. 29, 2022, and Japanese Patent Application No. 2022-199705 filed on Dec. 14, 2022, with the Japan Patent Office, the entire contents of which are hereby incorporated by reference.
The present disclosure relates to an emotion estimating device, an emotion estimating system, and an emotion estimating method.
In recent years, not only the health of the body but also the health of the heart has drawn attention. However, while the health of the body can be quantified by a health examination or the like, it is difficult to quantify the health of the heart, and means and methods for visualizing the state (emotion) of the heart are studied every day. In addition, the state of the body (physical condition) becomes apparent as symptoms and is easily perceived by the person in question, and people around the person also easily notice it. However, because the state of the heart does not become apparent as symptoms unlike the physical condition, people around the person hardly notice it, and even the person in question does not easily notice it. For this reason, the state of the heart may remain in an unsatisfactory state, and its detection may be delayed until the state becomes serious and leads to a mental disease such as depression; keeping the state of the heart healthy has thus become a social problem.
Accordingly, there is a great demand for visualizing the state (emotion) of the heart, and the demand is even greater for a simple method of estimating the state of the heart. For example, Japanese Patent Laying-Open No. 2020-120908 discloses a mental state estimating system that extracts an expression index of a subject from an image obtained by capturing an expression of the subject and estimates a mental state of the subject based on the extracted expression index. Japanese Patent Laying-Open No. 2019-017946 discloses a mood estimating system that estimates a variation amount of a mood of a subject based on biological information in a resting state and biological information in a non-resting state. Furthermore, Japanese Patent No. 6388824 discloses an emotion information estimating device that stores biological information of a subject together with emotion information and a physical state of the user corresponding to the biological information, learns a relationship between the biological information and the emotion information, and estimates the emotion information from the biological information for each physical state. Japanese Patent Laying-Open No. 2017-144222 discloses an emotion estimating device that acquires physiological data and non-physiological data of a subject, calculates an awakening degree and a comfort degree of the subject, and estimates the emotion of the subject from the calculated values.
In these disclosed devices and methods, estimating the emotion of the subject requires an image obtained by capturing the expression of the subject, or the biological information and physiological data of the subject. For this reason, work of capturing the expression or acquiring the biological information and the physiological data must be performed, and the emotion cannot be said to be estimated by a simple method. In particular, a device and a method that estimate the emotion of the subject based on information obtained from daily activities, without requiring the subject to perform special work, are desired.
The present disclosure has been made to solve such a problem, and an object of the present disclosure is to provide an emotion estimating device, an emotion estimating system, and an emotion estimating method capable of estimating the emotion of the subject based on the information obtained from daily activities.
An emotion estimating device according to one aspect of the present disclosure is an emotion estimating device that estimates an emotion of a subject. The emotion estimating device includes: an interface that receives input of walking data of the subject measured by a measurement device and emotion data obtained by quantifying the emotion of the subject; a storage that stores the walking data and the emotion data received by the interface; and a computer that obtains corresponding data in which a plurality of walking parameters included in the walking data stored in the storage are associated with the emotion data. When the input of the walking data is newly received in the interface, the computer estimates the emotion of the subject from the plurality of walking parameters included in the newly received walking data based on the corresponding data, and outputs information indicating the estimated emotion of the subject from the interface.
An emotion estimating system according to one aspect of the present disclosure includes a measurement device that measures the walking data of the subject, and the above emotion estimating device.
An emotion estimating method according to one aspect of the present disclosure is an emotion estimating method for estimating an emotion of a subject. The emotion estimating method includes: receiving input of walking data of the subject measured by a measurement device and emotion data obtained by quantifying the emotion of the subject; storing the input walking data and the input emotion data in a storage; obtaining corresponding data in which a plurality of walking parameters included in the walking data stored in the storage are associated with the emotion data; when the input of the walking data is newly received, estimating the emotion of the subject from the plurality of walking parameters included in the newly received walking data based on the corresponding data; and outputting information indicating the estimated emotion of the subject.
The foregoing and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.
The present disclosure provides an emotion estimating device, an emotion estimating system, and an emotion estimating method that focus on a relationship between a state of a heart and a gait and specify the gait of a subject from walking data to estimate an emotion of the subject. Therefore, in the emotion estimating device, the emotion estimating system, and the emotion estimating method according to the present disclosure, the emotion can be estimated from a simple motion such as walking in daily activities. Hereinafter, embodiments will be described with reference to the drawings. In the following description, the same configuration is denoted by the same reference numeral. Names and functions of such components are also the same. Therefore, no redundant detailed description will be given of such components.
[Configuration of Emotion Estimating System]
Although a method for obtaining the corresponding data will be described later, the walking data of subject P needs to be measured by measurement device 2 in order to obtain the corresponding data. In emotion estimating system 100 of
Although not illustrated, sensor module 21 includes an acceleration sensor, an angular velocity sensor, an arithmetic circuit that computes walking parameters from measurement values of these sensors, and a communication circuit that wirelessly transmits the walking parameters computed by the arithmetic circuit and the measurement values to emotion estimating device 1. For example, the acceleration sensor can measure accelerations along the three axes X, Y, and Z, and the angular velocity sensor can measure angular velocities around the three axes X, Y, and Z. Consequently, in addition to the measurement value of the acceleration sensor and the measurement value of the angular velocity sensor, sensor module 21 can obtain, as the walking data of subject P, the walking parameters of a stride, a pitch, a walking speed, time required for one step, a stance phase period, a swing phase period, a toe upward angle at time of landing, a heel upward angle at time of leaving, a pronation, a maximum foot upward height, and a maximum value of acceleration in a vertical direction at time of landing. Here, the maximum value of the acceleration in the vertical direction at the time of landing is an example of a walking parameter that evaluates impact (landing impact) applied to the foot at the time of landing. Other methods for evaluating the landing impact include evaluating the vertical movement of the center of gravity of the head or the body, and directly measuring floor reaction force with a force plate, a foot pressure mat, or the like.
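For illustration only, the following is a minimal sketch of how a few of these walking parameters could be derived from the vertical acceleration signal; it is not the implementation of sensor module 21, and the sampling rate, peak-detection thresholds, and function name are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def walking_parameters(acc_z, fs=100.0, walking_speed=None):
    """Sketch: derive a few walking parameters from vertical acceleration.

    acc_z: vertical acceleration samples (m/s^2), gravity removed.
    fs:    sampling rate of the sensor (Hz); 100 Hz is an assumption.
    """
    # Treat sharp peaks in vertical acceleration as landings
    # (threshold and minimum step spacing are placeholders).
    peaks, _ = find_peaks(acc_z, height=1.0, distance=int(0.3 * fs))
    if len(peaks) < 2:
        return None
    step_times = np.diff(peaks) / fs  # time required for one step (s)
    params = {
        "time_per_step_s": float(step_times.mean()),
        "pitch_steps_per_min": 60.0 / float(step_times.mean()),
        "landing_impact_max": float(acc_z[peaks].max()),  # max vertical acceleration at landing
    }
    if walking_speed is not None:
        # A stride covers two steps, so stride = speed * (2 * step time).
        params["stride_m"] = walking_speed * float(step_times.mean()) * 2
    return params
```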
Although it has been described that transfer of data such as the walking parameters from sensor module 21 to emotion estimating device 1 is performed by wireless communication, the present disclosure is not limited thereto. For example, the transfer of data such as the walking parameters from sensor module 21 to emotion estimating device 1 may be performed by wired communication or by a recording medium (for example, a memory chip or a USB memory).
In addition, although it has been described that sensor module 21 computes the walking parameters from the measurement value of the acceleration sensor and the measurement value of the angular velocity sensor, the present disclosure is not limited thereto. For example, sensor module 21 may only transmit the measurement value of the acceleration sensor and the measurement value of the angular velocity sensor to emotion estimating device 1, and emotion estimating device 1 may compute the walking parameters from the measurement value of the acceleration sensor and the measurement value of the angular velocity sensor acquired from sensor module 21.
In the present disclosure, measurement device 2 is described as the smart shoe. However, measurement device 2 that measures the walking data of subject P is not limited to the smart shoe. For example, measurement device 2 may be a three-dimensional posture recognition camera (for example, Kinect (registered trademark)) capable of recognizing the movement of the entire body including the feet of subject P, or a portable device such as a smartphone or a smart watch having an acceleration sensor and an angular velocity sensor.
In addition, in order to obtain corresponding data as illustrated in
In the present disclosure, it is described that emotion estimating system 100 including emotion estimating device 1 and measurement device 2 estimates the emotion of subject P from the walking data, but the configuration of the system is not limited thereto. For example, measurement device 2 may be integrated with emotion estimating device 1, or emotion estimating device 1 may be integrated with measurement device 2. Specifically, a system in which the walking data of subject P is measured by a smartphone and the emotion of subject P is estimated from the walking data measured by the smartphone is conceivable as the system in which measurement device 2 is integrated with emotion estimating device 1. Furthermore, a system that measures the walking data of subject P with the smart shoes and estimates the emotion of subject P from the walking data measured with the smart shoes is conceivable as the system in which emotion estimating device 1 is integrated with measurement device 2.
[Configuration of Emotion Estimating Device]
Processor 11 is an example of the "computer". Processor 11 is a computer that reads a program (for example, an operating system (OS) 130 and an estimation program 131) stored in storage 13, develops the read program in memory 12, and executes the program. For example, processor 11 includes a central processing unit (CPU), a field programmable gate array (FPGA), a graphics processing unit (GPU), or a micro processing unit (MPU). Processor 11 may be implemented by processing circuitry.
Memory 12 includes a volatile memory such as a dynamic random access memory (DRAM) or a static random access memory (SRAM), or a nonvolatile memory such as a read only memory (ROM) or a flash memory.
Storage 13 is an example of the "storage". Storage 13 is configured of a nonvolatile storage device such as a solid state drive (SSD) or a hard disk drive (HDD). In addition to OS 130 and estimation program 131, storage 13 stores walking data 132, emotion data 133, corresponding data 134, and the like.
Estimation program 131 is a program that executes processing for estimating the emotion of subject P from walking data 132 (estimation processing II in
Walking data 132 includes the acceleration and angular velocity measured by measurement device 2 and the walking parameter calculated from these measured values. The input of walking data 132 is received from measurement device 2 through communication interface 16, and stored in storage 13.
Emotion data 133 includes quantified data obtained by evaluating the emotion of subject P with a seven-point scale questionnaire. The input of emotion data 133 is received through interface 14 and stored in storage 13.
Corresponding data 134 is data in which walking data 132 and emotion data 133 are associated with each other, and includes, for example, data of an analysis result obtained by performing principal component analysis, a multiple regression equation obtained by performing multiple regression analysis, and data of coefficients of explanatory variables. That is, corresponding data 134 may include the data required for estimating the emotion of subject P from newly received walking data 132.
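As a concrete illustration of what corresponding data 134 could hold, the following sketch shows one possible container pairing the explanatory walking parameters with regression coefficients; the structure and all names are hypothetical, not the format actually used in storage 13.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class CorrespondingData:
    """Hypothetical container for corresponding data 134."""
    explanatory_params: List[str]      # e.g. ["stride", "pitch", "walking_speed"]
    coefficients: List[float]          # multiple regression coefficients
    intercept: float                   # multiple regression intercept
    pca_loadings: List[List[float]] = field(default_factory=list)  # optional PCA result

    def estimate_score(self, walking_params: Dict[str, float]) -> float:
        """Apply the stored multiple regression equation to new walking data."""
        return self.intercept + sum(
            coef * walking_params[name]
            for coef, name in zip(self.coefficients, self.explanatory_params)
        )
```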
Interface 14 is an example of the "interface", the "input circuitry", and the "output circuitry". When subject P operates a keyboard, a mouse, a touch device, or the like, interface 14 receives the input of emotion data 133 answered in the questionnaire by subject P. Interface 14 also outputs information indicating the estimated emotion of subject P to a display, a speaker, or the like.
Media reading device 15 receives a storage medium such as a removable disk 18, a memory chip, or a USB memory, and acquires data stored in removable disk 18, the memory chip, the USB memory, or the like.
Communication interface 16 is an example of the “interface”, the “input circuitry”, and the “output circuitry”. Communication interface 16 transmits and receives the data to and from measurement device 2 or another device by performing wired communication or wireless communication. For example, communication interface 16 communicates with measurement device 2 to receive the input of walking data 132 measured by measurement device 2. Communication interface 16 may output information indicating the estimated emotion of subject P to another device by communicating with the other device.
Although it has been described that emotion estimating device 1 obtains corresponding data 134 from walking data 132 and emotion data 133, the corresponding data may instead be received from a server or the like through communication interface 16, provided that generalized corresponding data 134 has been prepared in advance. Emotion estimating device 1 may also read corresponding data 134 stored in removable disk 18 or the like with media reading device 15.
[Emotion Estimation Processing]
With reference to a flowchart of the emotion estimation processing executed by emotion estimating device 1, processing for estimating the emotion of subject P from walking data 132 will be described.
In order to estimate the emotion of subject P from walking data 132, emotion estimating device 1 needs to previously obtain corresponding data 134 in which a plurality of walking parameters included in walking data 132 are associated with emotion data 133. For this reason, as preprocessing for performing processing for estimating the emotion of subject P from walking data 132, emotion estimating device 1 executes preparation processing I for obtaining corresponding data 134 in which walking data 132 and emotion data 133 are associated with each other as illustrated in
In preparation processing I, in order to obtain corresponding data 134, emotion induction that induces a specific emotion in subject P is performed, subject P is then caused to walk, and walking data 132 is measured by measurement device 2. For example, human emotions can be classified using Russell's circumplex model. FIG. is a schematic diagram in which emotions are classified using Russell's circumplex model. As illustrated in
In preparation processing I, before the emotion induction is performed on subject P, processing a for causing subject P to walk in a normal state (control state) in which the emotion induction is not performed and measuring walking data 132 by measurement device 2 is performed. In processing a in
In preparation processing I, after processing a for measuring walking data 132 in the normal state in which the emotion induction is not performed, processing b for measuring walking data 132 in the emotion induction state is performed. In the present embodiment, an emotion induction problem that evokes past experiences is set for each of the three emotions to be induced: "delighted", "depressed", and "irritated".
An emotion induction problem Q2 is a problem that evokes past experiences in order to induce the "depressed" emotion, and contains the sentence "Please recall a very depressed event. Please actually think what you thought about the event and feel the same emotion."
An emotion induction problem Q3 is a problem that evokes past experiences in order to induce the "irritated" emotion, and contains the sentence "Please recall a very irritated event. Please actually think what you thought about the event and feel the same emotion." Although the example in which emotion induction problems Q1 to Q3 are sentences read by subject P has been described, the present disclosure is not limited thereto, and emotion induction problems Q1 to Q3 may be voices, images, moving images, or the like.
After the emotion is induced in subject P by reading one of the sentences of emotion induction problems Q1 to Q3, subject P answers the questionnaire that evaluates the emotion on the seven-point scale in order to check how strongly the emotion has been induced.
In processing b in
Emotion estimating device 1 obtains the corresponding data in which walking data 132 and emotion data 133 acquired in processing a and processing b in
From data R1, it can be seen that there is a tendency in the change of the walking parameters due to the induced emotion. For example, in the case where the emotion is "irritated", a significant difference can be found in the values of many walking parameters as compared with the control state. Specifically, in the case where the emotion is "irritated", the "pitch" increases, the "walking speed" increases, the "time required for one step" decreases, the "stance phase period" decreases, and the "swing phase period" decreases. For these walking parameters, it is determined that there is a significant difference in a statistical hypothesis test with a significance level of 5%. In data R1, values marked with two asterisks (**) are values determined to have a significant difference in the statistical hypothesis test with the significance level of 5%.
Furthermore, in the case where the emotion is "delighted", the "walking speed" increases. When the emotion is "depressed", the "stride" decreases, the "heel upward angle at time of leaving" decreases, and the "pronation" increases. For these walking parameters, it is determined that there is a significant difference in the statistical hypothesis test with a significance level of 10%. In data R1, values marked with one asterisk (*) are values determined to have a significant difference in the statistical hypothesis test with the significance level of 10%.
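For illustration, such a significance determination could be carried out with a paired hypothesis test, as in the sketch below; the sample values are fabricated placeholders, and the choice of a paired t-test is an assumption since the embodiment does not name the test used.

```python
import numpy as np
from scipy import stats

# Hypothetical paired samples of one walking parameter (pitch, steps/min)
# for the same subjects in the control state and the "irritated" state.
control = np.array([112.0, 108.5, 115.2, 110.1, 109.8])
irritated = np.array([118.3, 111.0, 119.5, 114.2, 113.6])

t, p = stats.ttest_rel(irritated, control)  # paired test across subjects
if p < 0.05:
    print(f"** significant at the 5% level (p={p:.3f})")
elif p < 0.10:
    print(f"*  significant at the 10% level (p={p:.3f})")
```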
The walking motion is not a motion in which each walking parameter changes independently, but a motion in which the walking parameters mutually affect one another as they change. Accordingly, principal component analysis is applied to the plurality of walking parameters, and the tendency exerted on the plurality of walking parameters for each emotion is analyzed. Here, principal component analysis is a statistical method that synthesizes, from a large number of correlated variables, a small number of uncorrelated variables (principal components) that represent most of the variation of the whole, thereby compressing the dimensionality of the data.
In data R2, the factor loading of each walking parameter on the three principal components PC1, PC2, and PC3 is illustrated. Here, the factor loading is the correlation coefficient between each walking parameter and principal components PC1, PC2, and PC3. Focusing on principal component PC2 in data R2, the absolute value of the correlation coefficient is as high as 0.4 or more for the three walking parameters of "stride", "walking speed", and "toe upward angle at time of landing". Because all three walking parameters show a negative correlation, it can be seen that the smaller the "stride", the slower the "walking speed", and the smaller the "toe upward angle at time of landing", the larger the value of principal component PC2. From these relationships, newly synthesized principal component PC2 can be interpreted as a variable representing "walking with small motion".
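A minimal sketch of how the principal component analysis and the factor loadings of data R2 could be computed follows; the file name, array layout, and use of scikit-learn are assumptions rather than the embodiment's actual tooling.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# X: one row per walking trial, one column per walking parameter
# (stride, pitch, walking speed, stance phase period, ...); file is hypothetical.
X = np.loadtxt("walking_parameters.csv", delimiter=",")

# The parameters have different units, so standardize before PCA.
X_std = StandardScaler().fit_transform(X)

pca = PCA(n_components=3)
scores = pca.fit_transform(X_std)  # PC1..PC3 value for each trial

# Factor loadings: correlation between each parameter and each component.
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
print(loadings)  # inspect, e.g., which parameters load strongly on PC2
```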
Although a broken line is drawn at the position where the value of principal component PC2 is 0 (zero), many pieces of emotion data 133 of the "delighted" emotion are plotted around the broken line. Furthermore, many pieces of emotion data 133 of the "depressed" emotion are plotted in the positive range of the value of principal component PC2, and many pieces of emotion data 133 of the "irritated" emotion are plotted in the negative range of the value of principal component PC2. Therefore, from the graph in
Because the emotion changes among "irritated", "delighted", and "depressed" as the value of principal component PC2 changes, it is considered that scoring can be performed on the axis of "comfort-discomfort" of the Russell's circumplex model in
The score on the axis of the objective variable "delighted"−"depressed" is obtained by combining the score of the "delighted" emotion and the score of the "depressed" emotion. However, the score of the "depressed" emotion is evaluated on the seven-point scale as illustrated in
Specifically, in the case of the most delighted in life, the score of the "delighted" emotion is 7 points, and the score of the "depressed" emotion is 1 point. Therefore, the objective variable = 7 points + (8 points − 1 point) = 14 points. In the case where the user is the most depressed in life, the score of the "delighted" emotion is 1 point, and the score of the "depressed" emotion is 7 points. Therefore, the objective variable = 1 point + (8 points − 7 points) = 2 points. In the normal case, because the score of the "delighted" emotion is 4 points and the score of the "depressed" emotion is 4 points, the objective variable = 4 points + (8 points − 4 points) = 8 points.
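Stated generally, the three worked examples above all follow a single rule, restated here in the style of the later equations:
Objective variable = score of "delighted" emotion + (8 points − score of "depressed" emotion)
Because each score ranges from 1 point to 7 points, the objective variable ranges from 2 points (most depressed) to 14 points (most delighted), with 8 points as the neutral midpoint.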
In data R3, the coefficients of the explanatory variables with respect to the objective variable are illustrated. Among the explanatory variables indicated in data R3, the absolute values of the coefficients of the three walking parameters of "stride", "pitch", and "walking speed" are as high as 3 or more. That is, the three walking parameters of "stride", "pitch", and "walking speed" contribute strongly to the objective variable. Therefore, emotion estimating device 1 can estimate the emotion in the range of "delighted"−"depressed" from walking data 132 by adopting, as corresponding data 134, a multiple regression equation having the three walking parameters of "stride", "pitch", and "walking speed" as explanatory variables.
Since the degree of contribution to the objective variable decreases in the order of "walking speed", "stride", and "pitch", emotion estimating device 1 may estimate the emotion from only the top two walking parameters. Furthermore, emotion estimating device 1 may estimate the emotion from at least one walking parameter among the three walking parameters of "stride", "pitch", and "walking speed". Furthermore, emotion estimating device 1 may estimate the emotion by adding walking parameters other than the three walking parameters of "stride", "pitch", and "walking speed", and can enhance the estimation accuracy by increasing the number of walking parameters used to estimate the emotion.
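A minimal sketch of how such a multiple regression equation could be obtained and then applied to newly received walking data follows; the file names, example parameter values, and use of scikit-learn are assumptions.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# X: one row per trial with [stride, pitch, walking_speed];
# y: objective variable (2 to 14 points) from the questionnaire.
# Both files are hypothetical placeholders.
X = np.loadtxt("explanatory.csv", delimiter=",")
y = np.loadtxt("objective.csv", delimiter=",")

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)  # coefficients of the explanatory variables

# Estimation step: substitute newly received walking parameters.
new_walk = np.array([[0.65, 118.0, 1.25]])  # illustrative stride (m), pitch, speed (m/s)
estimated = model.predict(new_walk)[0]      # position on the "delighted"-"depressed" axis
```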
Returning to
When new walking data 132 is received (YES in step S106), emotion estimating device 1 estimates the emotion of subject P based on newly received walking data 132 and corresponding data 134 (step S107). Specifically, emotion estimating device 1 substitutes the values of the three walking parameters “stride”, “pitch”, and “walking speed” from newly received walking data 132 into the multiple regression equation in
Emotion estimating device 1 outputs the information indicating the emotion of subject P estimated in step S107 from interface 14 (step S108). Here, the information indicating the emotion includes not only simple information such as characters or voices of "delighted", "depressed", and "irritated" but also information such as icons, sounds, images, and moving images corresponding to the emotions of "delighted", "depressed", and "irritated". Specifically, emotion estimating device 1 displays the characters "delighted", "depressed", or "irritated" on the display, or outputs the voice "delighted", "depressed", or "irritated" from the speaker, in accordance with the estimated emotion. In addition, as the information indicating the emotion, the emotion may be scored and displayed, with "delighted" as 100 points and "depressed" as 0 points, or the estimated emotion may be plotted and displayed on a two-dimensional coordinate plane of the Russell's circumplex model. Furthermore, emotion estimating device 1 is not limited to a method by which the information indicating the emotion can be recognized visually or audibly, and may output the information by a method by which the information can be recognized by smell, touch, or the like.
In emotion estimating system 100 of the first embodiment, the estimation of the emotion of subject P from walking data 132 has been described. However, when it is known that there is a certain correspondence relationship between the emotion and the gait, there is a possibility that the emotion can be changed by changing the gait. Therefore, in an emotion estimating system according to a second embodiment, a configuration that outputs walking advice for changing the emotion estimated from the walking data to a different emotion will be described. The emotion estimating system of the second embodiment has the same hardware configuration as emotion estimating system 100 of the first embodiment, including the hardware configuration of emotion estimating device 1. Accordingly, the hardware configuration of the emotion estimating system of the second embodiment will not be described in detail.
Emotion estimating device 1 determines whether the emotion of subject P estimated in step S107 is an emotion classified as discomfort in the Russell's circumplex model illustrated in
When it is determined that the emotion of subject P is the comfort emotion (NO in step S107a), emotion estimating device 1 outputs information indicating the emotion of subject P estimated in step S107 from interface 14 (step S108).
On the other hand, when the emotion of subject P is determined to be the discomfort emotion (YES in step S107a), emotion estimating device 1 obtains walking data 132 associated with the comfort emotion from corresponding data 134, and outputs, from interface 14, walking advice that brings walking data 132 estimated to be the discomfort emotion closer to walking data 132 associated with the comfort emotion (step S109). Specifically, when the emotion of subject P is estimated as "depressed" and determined to be the discomfort emotion, emotion estimating device 1 obtains the values of the walking parameters for which the value of the objective variable estimated as the "delighted" emotion becomes 14 points from the multiple regression equation in
When subject P improves walking by receiving the walking advice, the emotion of subject P may be changed from the discomfort emotion to the comfort emotion. In the flowchart of
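For illustration, the comparison behind such walking advice could look like the sketch below, which contrasts the current walking parameters with target values derived from the regression; the function, tolerance, and example values are hypothetical.

```python
from typing import Dict, List

def walking_advice(current: Dict[str, float], target: Dict[str, float],
                   tolerance: float = 0.05) -> List[str]:
    """Suggest parameter-by-parameter changes that move the current gait
    toward the gait associated with the comfort ("delighted") emotion."""
    advice = []
    for name, goal in target.items():
        value = current.get(name)
        if value is None:
            continue
        # Only advise on parameters that differ meaningfully from the target.
        if value < goal * (1 - tolerance):
            advice.append(f"Try increasing your {name} toward {goal:.2f}.")
        elif value > goal * (1 + tolerance):
            advice.append(f"Try decreasing your {name} toward {goal:.2f}.")
    return advice

# Example: target values obtained by solving the regression for an
# objective variable of 14 points (illustrative numbers).
tips = walking_advice({"stride": 0.55, "pitch": 105.0},
                      {"stride": 0.68, "pitch": 120.0})
```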
In emotion estimating system 100 of the first embodiment, it has been described that emotion estimating device 1 needs to previously obtain the corresponding data in which the plurality of walking parameters included in the walking data are associated with the emotion data in order to estimate the emotion of subject P from the walking data. However, in the association processing based on the example described in the first embodiment, it has been found that at least one walking parameter among the three walking parameters of “stride”, “pitch”, and “walking speed” has a high degree of contribution to the emotion of subject P. Accordingly, in a third embodiment, an emotion estimating device that scores the emotion of subject P from these walking parameters to easily estimate the emotion and an emotion estimating system including the emotion estimating device will be described.
Emotion estimating device 1A needs to perform comparison with previously prepared population data in order to score the emotion of subject P from the walking data. In the present disclosure, an example of scoring the emotion using generalized population data including the walking data of subjects with various attributes (for example, gender, age, and race) will be described. However, the population data used by emotion estimating device 1A may be population data including walking data collected for each attribute (for example, gender, age, and race), population data including walking data collected for each individual (for example, accumulated over the last 10 days), or the like. Furthermore, in the present disclosure, it is described that the population data is stored in emotion estimating device 1A, but the population data may be stored in a place other than emotion estimating device 1A, such as a cloud.
In emotion estimating system 200 of
Processor 11A is typically a computer such as a CPU or an MPU. Processor 11A functions as a control unit that controls the operation of each unit of emotion estimating device 1A by reading and executing a program stored in memory 12A. Processor 11A executes the program to implement the scoring processing of the emotion of subject P in emotion estimating device 1A.
Memory 12A is implemented by a RAM, a ROM, a flash memory, and the like. Memory 12A stores the program executed by processor 11A, data used by processor 11A, or the like. Microphone 13A receives a voice input to emotion estimating device 1A and provides a voice signal corresponding to the voice input to processor 11A.
Input device 14A receives the operation input to emotion estimating device 1A. Typically, input device 14A is implemented by a touch panel. The touch panel is provided on display 17A having a function as a display unit, and for example, is a capacitive type. The touch panel detects a touch operation on the touch panel by an external object every predetermined time, and inputs touch coordinates to processor 11A. However, input device 14A may include a button or the like.
Memory interface 15A reads data from external storage medium 150. Processor 11A reads the data stored in storage medium 150 through memory interface 15A, and stores the data in memory 12A. Processor 11A reads the data from memory 12A and stores the data in external storage medium 150 through memory interface 15A.
Storage medium 150 includes a medium that stores a program in a nonvolatile manner, such as a compact disc (CD), a digital versatile disk (DVD), a Blu-ray (registered trademark) disc (BD), a universal serial bus (USB) memory, or a secure digital (SD) memory card.
Communication interface (I/F) 16A is a communication interface that exchanges various data between emotion estimating device 1A and measurement device 2, and is implemented by an adapter, a connector, or the like. For example, a wireless communication method using Bluetooth (registered trademark) low energy (BLE) or a wireless LAN is adopted as the communication method.
Speaker 18A converts the voice signal provided from processor 11A into the voice and outputs the voice to the outside of emotion estimating device 1A.
Wireless communication unit 19A is connected to a mobile communication network through a communication antenna 190 and transmits and receives the signal for wireless communication. Thus, emotion estimating device 1A can communicate with another communication device through the mobile communication network such as long term evolution (LTE) or 5G.
Sensor 20A is an acceleration sensor, and can measure the motion of subject P carrying emotion estimating device 1A. Accordingly, sensor 20A can detect an immobility time during which subject P is sitting or sleeping.
[Mind Score Calculation Processing]
Processing in which emotion estimating device 1A obtains the mind score by scoring the emotion of subject P on the axis of "lively"−"depressed" of the Russell's circumplex model will be described below. Here, the mind score is an evaluation value obtained by scoring the emotion of subject P with "lively" as 100 points and "depressed" as 1 point.
Emotion estimating device 1A receives the input of the walking data measured by measurement device 2 every predetermined period (for example, every minute). The case where emotion estimating device 1A is always connected to measurement device 2 will be described below, but emotion estimating device 1A may not be always connected to measurement device 2. When emotion estimating device 1A is not always connected to measurement device 2, emotion estimating device 1A receives the input of the walking data at timing when emotion estimating device 1A is connected to measurement device 2. For example, in the case where emotion estimating device 1A is a smartphone and measurement device 2 is sensor module 21 of the smart shoe, the smartphone receives the input of the walking data at timing when subject P wears the smart shoe and starts communication between the smartphone and sensor module 21. When the smartphone and sensor module 21 are in a communicable state, thereafter, the smartphone may receive the input of the walking data from sensor module 21 every predetermined period (for example, every minute).
First, emotion estimating device 1A determines whether it is timing to accept the input of the walking data (step S301). When it is determined that it is the timing to receive the input of the walking data (YES in step S301), emotion estimating device 1A receives the input of the walking data measured by measurement device 2 (step S302). The walking data received by emotion estimating device 1A includes parameters of the walking speed, the stride, and the ground contact angle (the toe upward angle at time of landing or the heel upward angle at time of leaving).
Emotion estimating device 1A only needs to receive the walking data including at least one parameter of the walking speed, the stride, and the pitch in order to obtain the mind score. Furthermore, the following description will be given assuming that the walking data is data measured by measurement device 2, but walking parameters such as the walking speed, the stride, and the pitch may be measured by sensor 20A (the acceleration sensor of the smartphone) of emotion estimating device 1A or may be measured by another wearable device.
When it is determined that it is not the timing to receive the input of the walking data (NO in step S301), emotion estimating device 1A skips the processing of step S302. Emotion estimating device 1A receives the input of immobility time data every predetermined period (for example, every minute). Emotion estimating device 1A determines whether it is the timing to receive the input of the immobility time data (step S303). When it is determined that it is the timing to receive the input of the immobility time data (YES in step S303), emotion estimating device 1A receives the input of the immobility time data (step S304).
At this point, the immobility time data is the time during which subject P is not walking, for example, the time during which subject P is sitting or sleeping. The immobility time data is measured by sensor 20A (the acceleration sensor of the smartphone) of emotion estimating device 1A. Alternatively, the immobility time data may be calculated from the walking parameter obtained from the smart shoes as measurement device 2.
Emotion estimating device 1A can also score the emotion of subject P at times other than walking by correcting the mind score using the immobility time data in addition to the walking data. Specifically, emotion estimating device 1A subtracts points from the mind score according to the immobility time data. This is based on the finding of a study that verified the relationship between sitting time and mental health (Yuko Kai et al., "Relationship between Sitting Behavior and Mental Health in Japanese Workers", Bulletin of the Physical Fitness Research Institute, No. 114, pp. 1-10, April 2016): mental health deteriorates when the sitting time is long. Consequently, emotion estimating device 1A can score the emotion of subject P other than at the time of walking by subtracting the score corresponding to the immobility time during which subject P is not walking from the mind score obtained at the time of walking.
When it is determined that it is not the timing to receive the input of the immobility time data (NO in step S303), emotion estimating device 1A skips the processing of step S304. Subsequently, emotion estimating device 1A receives the input of the emotion data at predetermined timing (for example, once every morning). Emotion estimating device 1A determines whether it is the timing to receive the input of the emotion data (step S305). When it is determined that it is the timing to receive the input of the emotion data (YES in step S305), emotion estimating device 1A receives the input of the emotion data (step S306). Here, the emotion data is a subjective evaluation value of subject P obtained by subject P answering the questionnaire displayed on display 17A of emotion estimating device 1A.
Emotion estimating device 1A can improve the accuracy of the mind score by correcting the mind score using the emotion data in addition to the walking data. The mind score obtained from the walking data fluctuates greatly according to the mood of subject P. Accordingly, by having the emotion data input once every morning, emotion estimating device 1A can adjust the mind score according to the mood of the day, and the accuracy of the mind score is improved. In particular, the input of the emotion data once every morning serves as an initial-value adjustment performed when subject P starts the day's activity after sleeping for a long time, before scoring of the emotion from the walking data begins. However, the timing of inputting the emotion data is not limited to once every morning, and the emotion data may be input every several hours. The number of times the emotion data is input is desirably set such that subject P does not find the input operation bothersome. Furthermore, the content of the emotion data is not limited to the mood of subject P this morning, and may be subjective evaluation values of subject P such as the current mood, physical fatigue, and sleep quality according to the questionnaire displayed on input screen 171.
Returning to
Specifically, how to obtain the mind score will be described.
The three walking parameters of the walking speed, the stride, and the ground contact angle can be estimated to indicate a more "lively" emotion as their values increase. Accordingly, emotion estimating device 1A calculates the walking score based on the value of each walking parameter and the population data according to the following Equation 1. Equation 1 is an example, and the walking score may be calculated by another equation.
Walking score=A*walking speed score+B*stride score+C*ground contact angle score (Equation 1)
At this point, the walking speed score, the stride score, and the ground contact angle score are calculated as follows.
Walking speed score=D1+E1*(walking speed−average value of population data of walking speed)/standard deviation of population data of walking speed.
Stride score=D2+E2*(stride−average value of population data of stride)/standard deviation of population data of stride.
Ground contact angle score=D3+E3*(ground contact angle−average value of population data of ground contact angles)/standard deviation of population data of ground contact angles.
A to C are weighting coefficients for the respective scores. The values of A to C can be set, for example, in a range of 0.1 to 0.9, under the condition that A+B+C=1 is satisfied. In addition, D1 to D3 are initial values of the respective scores, and can be set, for example, to values in the range of 40 to 80; they may be all the same value or different values. E1 to E3 are addition coefficients of the respective scores, and can be set, for example, to values in the range of 5 to 20; they may be all the same value or different values. The walking score is a value from 1 point to 100 points, and is set to 100 points when the calculated value is greater than or equal to 100 points.
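A minimal sketch of Equation 1 and its sub-scores follows, with illustrative choices of A to C, D1 to D3, and E1 to E3 inside the stated ranges; the function names and the population-data format are assumptions.

```python
import numpy as np

def sub_score(value, population, d=60.0, e=10.0):
    """Z-score one walking parameter against its population, then scale
    (the D + E * (value - mean) / std form of the sub-score equations)."""
    return d + e * (value - np.mean(population)) / np.std(population)

def walking_score(speed, stride, angle, pop, a=0.4, b=0.3, c=0.3):
    """Equation 1: weighted sum of the three sub-scores (A + B + C = 1)."""
    score = (a * sub_score(speed, pop["speed"])
             + b * sub_score(stride, pop["stride"])
             + c * sub_score(angle, pop["angle"]))
    return float(np.clip(score, 1.0, 100.0))  # keep within 1 to 100 points
```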
It can be estimated that the longer the immobility time included in the immobility time data, the closer the emotion is to the "depressed" emotion. Accordingly, emotion estimating device 1A calculates the immobility time score according to the following Equation 2. Equation 2 is an example, and the immobility time score may be calculated by another equation.
Immobility time score=F−G*(immobility time−60 min) (Equation 2)
F is an initial value of the immobility time score, and can be set, for example, to a value in the range of 40 to 80. G is a subtraction coefficient of the immobility time score, and can be set, for example, to a value in the range of 1 to 5. The immobility time score is a value from 1 point to 100 points, and is set to 1 point when the calculated value is less than or equal to 1 point. In addition, in a time period in which the walking data can be acquired, the immobility time score may be increased by 1 point every minute.
For example, the emotion score is scored as −2 points for 1 star, −1 point for 2 stars, 0 points for 3 stars, 1 point for 4 stars, and 2 points for 5 stars based on the emotion data input on input screen 171 in
As illustrated in
Mind score=((walking score+immobility time score)/2)+emotion score*10 (Equation 3)
For example, emotion estimating device 1A calculates the mind score using the most recently calculated walking score for a period in which the walking data cannot be obtained from measurement device 2, so that the mind score can be calculated over 24 hours. Furthermore, emotion estimating device 1A can calculate the mind score as the value of the immobility time score + the emotion score * 10, so that the mind score can be calculated even when the most recently calculated walking score is unavailable (for example, in the case where subject P does not walk throughout the day).
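A minimal sketch combining Equations 2 and 3 with the star-to-point mapping described above follows; the choices of F and G inside the stated ranges and the function names are assumptions.

```python
import numpy as np

def immobility_time_score(immobility_min, f=60.0, g=2.0):
    """Equation 2: subtract points as the immobility time grows past 60 minutes."""
    score = f - g * (immobility_min - 60.0)
    return float(np.clip(score, 1.0, 100.0))  # keep within 1 to 100 points

# Emotion score from the questionnaire stars (input screen 171).
STAR_TO_POINTS = {1: -2, 2: -1, 3: 0, 4: 1, 5: 2}

def mind_score(walking, immobility_min, stars):
    """Equation 3: combine the walking, immobility time, and emotion scores."""
    emotion = STAR_TO_POINTS[stars]
    return (walking + immobility_time_score(immobility_min)) / 2 + emotion * 10
```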
Returning to
In addition to the mind score, walking advice 176 corresponding to the current number of steps and the mind score is also output on output screen 174. In walking advice 176, because the mind score has fallen, the message "Why don't you take a short walk to change your mood?" is displayed to urge subject P to walk. Although the walking score, the immobility time score, and the emotion score are not displayed on output screen 174, they may be displayed.
Emotion estimating device 1A may output the temporal change of each walking parameter in addition to the output of the mind score. Specifically, when a “next” button 177 on output screen 174 in
Returning to
In the method for obtaining the mind score described with reference to
In
Furthermore, in the third embodiment, it has been described that emotion estimating device 1A scores the emotion of subject P from the walking data on the axis of "lively"−"depressed" of the Russell's circumplex model, but the axis may be further expanded to the range of "lively"−"gloom". For example, emotion estimating device 1A scores the emotion of subject P by setting the mind score to 100 points for "lively", 1 point for "depressed", and −20 points for "gloom". The "gloom" state is considered to be a state in which the mind is lowered even further than in the "depressed" state. There is also a research result in the literature (Tsutomu Murata et al., "Characteristics of walking of elderly person having depression tendency", Japanese Journal of Health Promotion and Physical Therapy, Vol. 7, No. 3, pp. 127-131, 2017) that "As characteristics of the gait of an elderly person who tends to be depressed, a decrease in stride and step length accompanying a decrease in walking speed and an increase in stance time and double-leg support time were recognized.", and it is considered that the score in this state is even lower than the mind score when "depressed".
[Modification]
The present disclosure is not limited to the above-described embodiments, and various modifications and applications are possible. In particular, the analysis and method for associating walking data 132 and emotion data 133 described above are merely examples. For example, by scoring not only the axis of "comfort-discomfort" but also the axis of "awakening-calming" of the Russell's circumplex model in
In the case where measurement device 2 is the smart shoe, the walking parameters that can be measured as the walking data include the stride, the pitch, the walking speed, the time required for one step, the stance phase period, the swing phase period, the toe upward angle at time of landing, the heel upward angle at time of leaving, the pronation, the maximum foot upward height, and the maximum value of acceleration in the vertical direction at the time of landing. Consequently, walking data 132 whose input emotion estimating device 1 can receive can include at least three parameters among the stride, the pitch, the walking speed, the time required for one step, the stance phase period, the swing phase period, the toe upward angle at time of landing, the heel upward angle at time of leaving, the pronation, the maximum foot upward height, and the maximum value of acceleration in the vertical direction at the time of landing. The walking parameters described above are examples of walking parameters that can be measured with the smart shoe; other walking parameters may be obtained by calculation, or another sensor may be provided so that other walking parameters can be measured.
In the case where measurement device 2 is the smartphone, depending on the model, the walking parameters that can be measured as the walking data may not include the toe upward angle at time of landing, the heel upward angle at time of leaving, the pronation, and the maximum foot upward height. In this case, walking data 132 whose input emotion estimating device 1 can receive can include at least two parameters among the stride, the pitch, the walking speed, the time required for one step, the stance phase period, and the swing phase period. Even in the case where measurement device 2 is the smartphone, the walking parameters of the toe upward angle at time of landing, the heel upward angle at time of leaving, the pronation, the maximum foot upward height, and the maximum value of the acceleration in the vertical direction at the time of landing can be acquired from information from another device such as a three-dimensional posture recognition camera.
In the above-described embodiment, walking data 132 is measured by measurement device 2 in the state where the emotion induction is performed to impart the specific emotion to subject P. However, the method for causing subject P to have the specific emotion is not limited to the emotion induction, and subject P may be made aware of the specific emotion to cause subject P to have the specific emotion. In addition, subject P may be caused to walk at the timing when subject P has the specific emotion, and walking data 132 may be measured by measurement device 2.
Thus, the emotion estimating device of the present disclosure estimates the emotion of the subject from the plurality of walking parameters included in the newly received walking data based on the corresponding data, so that the emotion estimating device can estimate the emotion of the subject based on the walking data obtained from daily activities.
Although the embodiments of the present invention have been described, it should be considered that the disclosed embodiment is an example in all respects and not restrictive. The scope of the present invention is indicated by the claims, and it is intended that all modifications within the meaning and scope of the claims are included in the present invention.