The present disclosure relates to an information processing apparatus, an information processing method, and a computer-readable storage medium.
A technology for detecting movement of a user and identifying a behavior of the user by using a wearable device that is worn on the user is known.
For example, Japanese Laid-open Patent Publication No. 2003-46630 describes a mobile phone device that detects acceleration information on a user and controls an operation mode by using the detected acceleration information. Japanese Laid-open Patent Publication No. 2004-184351 describes an operation information measurement system that measures operation information on a desired part of a user and recognizes a movement state of the whole body.
Here, there is a need to identify a behavior pattern of a user based on information on a behavior state of the user.
An information processing apparatus according to an aspect of the present disclosure includes: a behavior state detection sensor configured to detect behavior state information on a behavior state of a user; a behavior pattern information generation unit configured to generate behavior pattern information in a multidimensional space formed of coordinate axes based on the behavior state information to generate a group of spaces in each of which density of a behavior pattern information group as a collection of pieces of the behavior pattern information exceeds predetermined density, the coordinate axes representing at least parameters of a date and time, a place, and a duration of detection of the behavior state; a behavior score calculation unit configured to calculate, as a behavior score, information on a size of the space including the behavior pattern information group; and a behavior pattern identification unit configured to identify, as a behavior pattern of the user, the behavior pattern information group in the space for which a value of the behavior score is equal to or larger than a predetermined value.
An information processing apparatus according to an aspect of the present disclosure includes: a behavior state sensor configured to detect behavior state information on a behavior state of a user; a biological sensor configured to detect biological information on the user; an autonomic nerve activity level calculation unit configured to calculate an autonomic nerve activity level of the user based on the biological information; and an output control unit configured to change intensity of output from an output unit in accordance with intensity of the autonomic nerve activity level.
An information processing apparatus according to an aspect of the present disclosure includes: a behavior state sensor configured to detect behavior state information on a behavior state of a user; a biological sensor configured to detect biological information on the user; an autonomic nerve activity level calculation unit configured to calculate an autonomic nerve activity level of the user based on the biological information; and an autonomic nerve activity level correction unit configured to correct the autonomic nerve activity level based on one of a country and a region in which a behavior pattern of the user is identified.
An information processing method according to an aspect of the present disclosure includes: detecting behavior state information on a behavior state of a user; generating behavior pattern information in a multidimensional space formed of coordinate axes based on the behavior state information to generate a group of spaces in each of which density of a behavior pattern information group as a collection of pieces of the behavior pattern information exceeds predetermined density, the coordinate axes representing at least parameters of a date and time, a place, and a duration of detection of the behavior state; calculating, as a behavior score, information on a size of the space including the behavior pattern information group; and identifying, as a behavior pattern of the user, the behavior pattern information group in the space for which a value of the behavior score is equal to or larger than a predetermined value.
An information processing method according to an aspect of the present disclosure includes: detecting behavior state information on a behavior state of a user; detecting biological information on the user; calculating an autonomic nerve activity level of the user based on the biological information; and changing intensity of output from an output unit in accordance with intensity of the autonomic nerve activity level.
An information processing method according to an aspect of the present disclosure includes: detecting behavior state information on a behavior state of a user; detecting biological information on the user; calculating an autonomic nerve activity level of the user based on the biological information; and correcting the autonomic nerve activity level based on one of a country and a region in which a behavior pattern of the user is identified.
A non-transitory computer-readable storage medium according to an aspect of the present disclosure stores a program causing a computer to execute: detecting behavior state information on a behavior state of a user; generating behavior pattern information in a multidimensional space formed of coordinate axes based on the behavior state information to generate a group of spaces in each of which density of a behavior pattern information group as a collection of pieces of the behavior pattern information exceeds predetermined density, the coordinate axes representing at least parameters of a date and time, a place, and a duration of detection of the behavior state; calculating, as a behavior score, information on a size of the space including the behavior pattern information group; and identifying, as a behavior pattern of the user, the behavior pattern information group in the space for which a value of the behavior score is equal to or larger than a predetermined value.
A non-transitory computer-readable storage medium according to an aspect of the present disclosure stores a program causing a computer to execute: detecting behavior state information on a behavior state of a user; detecting biological information on the user; calculating an autonomic nerve activity level of the user based on the biological information; and changing intensity of output from an output unit in accordance with intensity of the autonomic nerve activity level.
A non-transitory computer-readable storage medium according to an aspect of the present disclosure stores a program causing a computer to execute: detecting behavior state information on a behavior state of a user; detecting biological information on the user; calculating an autonomic nerve activity level of the user based on the biological information; and correcting the autonomic nerve activity level based on one of a country and a region in which a behavior pattern of the user is identified.
Embodiments of the present disclosure will be described in detail below with reference to the accompanying drawings. The present disclosure is not limited by the embodiments below, and if a plurality of embodiments are provided, the present disclosure includes configurations that are made by combinations of the embodiments. In addition, in the embodiments below, the same components are denoted by the same reference symbols, and therefore, repeated explanation will be omitted.
The behavior state sensor 20 is a sensor that detects behavior state information on a behavior state of the user U who is wearing the information processing apparatus 10. The behavior state information on the user U may include various kinds of information on a behavior of the user U. The behavior state information on the user U may include information on at least physical body movement of the user U, a date and time at which the behavior is performed, a place where the behavior is performed, and a duration during which the behavior is performed.
The behavior state sensor 20 includes a camera 20A, a microphone 20B, a GNSS receiver 20C, an acceleration sensor 20D, a gyro sensor 20E, a light sensor 20F, a temperature sensor 20G, and a humidity sensor 20H. However, the behavior state sensor 20 may include an arbitrary sensor that detects the behavior state information; for example, the behavior state sensor 20 may include at least one of the camera 20A, the microphone 20B, the GNSS receiver 20C, the acceleration sensor 20D, the gyro sensor 20E, the light sensor 20F, the temperature sensor 20G, and the humidity sensor 20H or may include a different sensor.
The camera 20A is an image capturing apparatus and detects, as the behavior state information, visible light around the information processing apparatus 10 (the user U) and captures an image around the information processing apparatus 10. The camera 20A may be a video camera that captures an image at a predetermined frame rate. A position and an orientation of the camera 20A arranged in the information processing apparatus 10 may be set arbitrarily; for example, the camera 20A may be arranged in the device 10A illustrated in
The microphone 20B is a microphone that detects, as the behavior state information, sound (sound wave information) around the information processing apparatus 10. A position, an orientation, and the number of the microphones 20B arranged in the information processing apparatus 10 may be set arbitrarily. Meanwhile, if a plurality of microphones 20B are provided, information on directions in which the microphones 20B are oriented is also acquired.
The GNSS receiver 20C is a device that detects, as the behavior state information, positional information on the information processing apparatus 10 (the user U). The positional information in this example is earth coordinates. In the present embodiment, the GNSS receiver 20C is what is called a global navigation satellite system (GNSS) module that receives radio waves from satellites and detects the positional information on the information processing apparatus 10 (the user U).
The acceleration sensor 20D is a sensor that detects, as the behavior state information, acceleration of the information processing apparatus 10 (the user U), and detects, for example, gravity, vibration, and shock.
The gyro sensor 20E is a sensor that detects, as the behavior state information, rotation or an orientation of the information processing apparatus 10 (the user U), and performs detection by using the principle of the Coriolis force, the Euler force, the centrifugal force, or the like.
The light sensor 20F is a sensor that detects, as the behavior state information, light intensity around the information processing apparatus 10 (the user U). The light sensor 20F is able to detect intensity of visible light, infrared light, or ultraviolet light.
The temperature sensor 20G is a sensor that detects, as the behavior state information, temperature around the information processing apparatus 10 (the user U).
The humidity sensor 20H is a sensor that detects, as the behavior state information, humidity around the information processing apparatus 10 (the user U).
The input unit 22 is a device that receives operation performed by the user, and may be, for example, a touch panel or the like.
The output unit 24 outputs an output result obtained by the information processing apparatus 10. The output unit 24 includes, for example, a display unit 24A that displays a video and a sound output unit 24B that outputs sound. In the present embodiment, the display unit 24A is, for example, what is called a head mounted display (HMD). The sound output unit 24B is a speaker that outputs sound.
The communication unit 26 is a module that performs communication with an external apparatus or the like, and may include, for example, an antenna or the like. A communication system of the communication unit 26 is wireless communication in the present embodiment, but an arbitrary communication system is applicable.
The storage unit 28 is a memory for storing various kinds of information, such as contents of calculation performed by the control unit 30 and a program, and includes at least one of a main storage device, such as a random access memory (RAM) or a read only memory (ROM), and an external storage device, such as a hard disk drive (HDD), for example.
The storage unit 28 stores therein a learning model 28A and map data 28B. The learning model 28A is an artificial intelligence (AI) model that is used to identify an environment in which the user U is present, on the basis of environmental information. The map data 28B is data including positional information on an existent building or a natural object, and is data in which earth coordinates are associated with the existent building or the natural object. A process using the learning model 28A, the map data 28B, and the like will be described later. Meanwhile, the learning model 28A, the map data 28B, and the program that is stored in the storage unit 28 for the control unit 30 may be stored in a recording medium that is readable by the information processing apparatus 10. Further, the program that is stored in the storage unit 28 for the control unit 30, the learning model 28A, and the map data 28B need not always be stored in the storage unit 28 in advance, but may be acquired by the information processing apparatus 10 from an external apparatus through communication when the data is to be used.
The control unit 30 controls operation of each of the units of the information processing apparatus 10. The control unit 30 is implemented by, for example, causing a central processing unit (CPU), a micro processing unit (MPU), or the like to execute a program that is stored in a storage unit (not illustrated) by using a RAM or the like as a work area. The control unit 30 may be implemented by, for example, an integrated circuit, such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA). The control unit 30 may be implemented by a combination of hardware and software.
The control unit 30 includes a behavior state information acquisition unit 40, a behavior pattern information generation unit 42, a behavior score calculation unit 44, a behavior pattern identification unit 46, a storage control unit 48, and an output control unit 50.
The behavior state information acquisition unit 40 controls the behavior state sensor 20 and causes the behavior state sensor 20 to output the behavior state information on the user U. The behavior state information acquisition unit 40 acquires the behavior state information that is detected by the behavior state sensor 20.
The behavior pattern information generation unit 42 generates behavior pattern information on the basis of the behavior state information that is acquired by the behavior state information acquisition unit 40. The behavior pattern information generation unit 42 generates the behavior pattern information in a multidimensional space, in which at least parameters of a date and time, a place, and a duration of detection of the behavior state of the user U are adopted as coordinate axes, on the basis of the behavior state information, for example.
The behavior score calculation unit 44 calculates a behavior score on the basis of the behavior pattern information that is generated by the behavior pattern information generation unit 42. The behavior score calculation unit 44 generates a group of spaces in each of which density of a behavior pattern information group as a collection of pieces of the behavior pattern information exceeds predetermined density, for example. The behavior score calculation unit 44 calculates the behavior score of the behavior pattern information group on the basis of the grouped space that includes the behavior pattern information group. Specifically, the behavior score calculation unit 44 calculates, as the behavior score, information on a size of the space that includes the behavior pattern information group, for example.
The behavior pattern identification unit 46 identifies the behavior pattern of the user U on the basis of the behavior score that is calculated by the behavior score calculation unit 44. The behavior pattern identification unit 46 determines, as the behavior pattern of the user U, a behavior that corresponds to the behavior pattern information group for which a value of the behavior score is equal to or smaller than a predetermined threshold. The behavior pattern identification unit 46 identifies a type of the behavior performed by the user U on the basis of image data, sound data, positional information, acceleration information, posture information, intensity information on infrared light and ultraviolet light, temperature information, humidity information, or the like that is acquired by the behavior state information acquisition unit 40, for example.
The storage control unit 48 causes the storage unit 28 to perform storing. The storage unit 28 stores therein information on the behavior pattern of the user U that is identified by the behavior pattern identification unit 46. The storage control unit 48 stores the information on the behavior pattern of the user U identified by the behavior pattern identification unit 46 in the storage unit 28 in a predetermined format. The predetermined format will be described later.
The output control unit 50 causes the output unit 24 to perform output. The output control unit 50 causes the display unit 24A to display the information on the behavior pattern, for example. The output control unit 50 causes the sound output unit 24B to output the information on the behavior pattern by sound, for example.
Details of Process
Details of a process performed by the information processing apparatus 10 according to the first embodiment will be described below with reference to
The control unit 30 acquires the behavior state information on the behavior of the user U from the behavior state sensor 20 (Step S10). Specifically, the behavior state information acquisition unit 40 acquires, from the camera 20A, image data that is obtained by capturing an image around the information processing apparatus 10 (the user U). The behavior state information acquisition unit 40 acquires, from the microphone 20B, sound data that is obtained by collecting sound around the information processing apparatus 10 (the user U). The behavior state information acquisition unit 40 acquires, from the GNSS receiver 20C, positional information on the information processing apparatus 10 (the user U). The behavior state information acquisition unit 40 acquires, from the acceleration sensor 20D, acceleration information on the information processing apparatus 10 (the user U). The behavior state information acquisition unit 40 acquires, from the gyro sensor 20E, posture information on the information processing apparatus 10 (the user U). The behavior state information acquisition unit 40 acquires, from the light sensor 20F, intensity information on infrared light and ultraviolet light around the information processing apparatus 10 (the user U). The behavior state information acquisition unit 40 acquires, from the temperature sensor 20G, information on temperature around the information processing apparatus 10 (the user U). The behavior state information acquisition unit 40 acquires, from the humidity sensor 20H, information on humidity around the information processing apparatus 10 (the user U). The behavior state information acquisition unit 40 sequentially acquires these pieces of information every predetermined period. The behavior state information acquisition unit 40 may acquire the pieces of behavior state information at the same timing or at different timings. Further, the predetermined period until acquisition of a next piece of behavior state information may be set arbitrarily, and the predetermined period may be the same or different for each piece of behavior state information.
The behavior of the user U may be characterized by physical body movement of the user U together with three elements: a date and time at which the behavior is performed, a place where the behavior is performed, and a duration during which the behavior is performed in the place. The behavior of the user U may include an action, such as "play golf", "watch movie", or "go shopping", in addition to the body movement of the user U. Even when the body movement of the user U acquired by the behavior state information acquisition unit 40 is the same, the action may be different if the positional information is different.
The control unit 30 generates the behavior pattern information on the user U (Step S11). Specifically, the behavior pattern information generation unit 42 generates the behavior pattern information on the user U on the basis of the behavior state information on the user U that is acquired by the behavior state information acquisition unit 40. The control unit 30 generates a group of behavior pattern information groups (Step S12). Specifically, the behavior pattern information generation unit 42 generates a group of spaces in each of which the density of the behavior pattern information group as a collection of pieces of the behavior pattern information exceeds predetermined density. The control unit 30 calculates the behavior score (Step S13). Specifically, the behavior score calculation unit 44 calculates, as the behavior score, a distance from a center to an end portion of the space that includes the behavior pattern information group. In other words, the behavior score calculation unit 44 calculates, as the behavior score, a size of the space that includes the behavior pattern information group.
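The embodiments do not prescribe a particular grouping algorithm, so the following is only a minimal sketch of Steps S11 through S13 in which density-based clustering with DBSCAN from scikit-learn stands in for the grouping of the behavior pattern information; the axis normalization, the eps and min_samples parameters, and the threshold value are illustrative assumptions.

    import numpy as np
    from sklearn.cluster import DBSCAN

    def identify_behavior_patterns(points, eps=0.5, min_samples=5, threshold=1.0):
        # points: (N, 3) array of normalized (date and time, place, duration)
        # coordinates, one row per plotted piece of behavior pattern information.
        labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points)
        patterns = []
        for label in set(labels) - {-1}:  # -1 marks points outside any dense space
            group = points[labels == label]
            center = group.mean(axis=0)
            # Behavior score: distance from the center to the end portion of
            # the space that includes the behavior pattern information group.
            score = float(np.linalg.norm(group - center, axis=1).max())
            if score <= threshold:  # a smaller score indicates a denser, more regular pattern
                patterns.append((label, group, score))
        return patterns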
The behavior pattern information generation unit 42 generates the behavior pattern information by plotting points P at predetermined time intervals in the three-dimensional space illustrated in
The behavior pattern information generation unit 42 generates, as an identical behavior pattern, a group of spaces in each of which the density of the behavior pattern information group, that is, a collection of the points P generated as the behavior pattern information, exceeds the predetermined density in the three-dimensional space illustrated in
Here, the behavior of the user U may include purchases of things that are not frequently purchased, such as "go to a car dealership" for "purchase of a vehicle" and "go to a home improvement store" for "purchase of a chair", for example. The behaviors as described above may be plotted randomly at different places, for different durations, and at different times in the three-dimensional space illustrated in
In the example illustrated in
Meanwhile, the duration is set as a duration from start to end of the behavior, but the present disclosure is not limited to this example. For example, for a behavior that is an intermittent event, such as "bat swing is performed 200 times" or "walk (or run) a certain number of steps", a frequency may be used instead of the duration. For example, when the user U has a habit of regularly performing exercises, replacing the duration with a frequency makes it possible to treat all such behavior patterns as parameters of "exercise" as "movement", which increases the possibility of displaying data that may interest the user U.
Referring back to
If it is determined as Yes at Step S14, the control unit 30 identifies the behavior pattern (Step S15). Specifically, the behavior pattern identification unit 46 identifies, as the behavior pattern of the user U, a behavior that corresponds to the behavior pattern information group for which the behavior score is equal to or smaller than the predetermined threshold.
The control unit 30 identifies a type of a behavior state of the identified behavior pattern (Step S16). Specifically, the behavior pattern identification unit 46 may identify the type of the behavior state performed by the user U, on the basis of the behavior state information that is acquired by the behavior state information acquisition unit 40.
More specifically, the behavior pattern identification unit 46 may identify the behavior state of the user U by using, for example, the learning model 28A. The learning model 28A is an artificial intelligence (AI) model that is constructed by adopting a detection result of the behavior state sensor 20 and information on the type of the behavior state indicated by the detection result of the behavior state sensor 20 as a single data set, and performing learning by using a plurality of data sets as teacher data. The behavior pattern identification unit 46 inputs the detection result of the behavior state sensor 20 to the learned learning model 28A, acquires information indicating the type of the behavior state that is indicated by the detection result, and identifies the type of the behavior state of the user U. The behavior pattern identification unit 46 identifies that, for example, the user U is playing golf, going shopping, or staying in a movie theater by using the learned learning model 28A, for example.
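The specification treats the learning model 28A abstractly; as a hedged illustration only, the sketch below uses a scikit-learn classifier as a stand-in, and the feature encoding of the sensor detection results and the example labels are assumptions rather than the actual model of the disclosure.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    def train_type_model(feature_vectors, type_labels):
        # Each data set pairs a detection result of the behavior state sensor
        # (flattened into one feature vector) with the type of behavior state
        # it indicates; many such sets serve as teacher data.
        model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000)
        model.fit(np.asarray(feature_vectors), type_labels)
        return model

    def identify_behavior_type(model, detection_result):
        # Inputs one detection result to the learned model and returns the
        # indicated type, e.g., "play golf" or "stay in a movie theater".
        return model.predict([detection_result])[0]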
The control unit 30 stores the behavior pattern in the storage unit 28 (Step S17). Specifically, the storage control unit 48 records the behavior pattern identified at Step S16 in a predetermined format.
As illustrated in
In the region D1, a numbering value of the identified behavior pattern is stored. The region D1 is configured with, for example, 3 bytes. In the region D2, the number of dimensions of the space in which the behavior pattern information is plotted as a group is stored. The region D2 is configured with, for example, 1 byte. In this case, a 255-dimensional space may be adopted at maximum. In the region D3, a behavior score R of the behavior pattern information group that is determined as the behavior pattern of the user U is stored. The behavior score R may vary due to a determination error of the behavior pattern, and a smaller value of the behavior score R indicates higher reliability of the behavior pattern. The region D4 is a reserved region. In the region D5, which is the last bit of the last byte of the reserved region, an identifier that indicates whether the behavior pattern is an ordinary pattern or an extraordinary pattern is stored. In the region D5, 0 is written for the behavior pattern of the ordinary behavior as will be described later, and 1 is written for the behavior pattern of the extraordinary behavior. The reserved region may be used to add incidental information when each of the behavior patterns occurs. The reserved region may have a size of 6 bytes or more, for example, and may be used to write numerical information corresponding to each of N dimensions (N is an arbitrary integer).
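A minimal sketch of this record layout follows; the width of the region D3 is not fixed by the embodiment, so a 2-byte unsigned score is assumed here, and the 6-byte reserved region is likewise only the example size given above.

    import struct

    def pack_behavior_record(pattern_id, n_dims, behavior_score, extraordinary):
        d1 = pattern_id.to_bytes(3, "big")      # D1: numbering value, 3 bytes
        d2 = n_dims.to_bytes(1, "big")          # D2: number of dimensions, 1 byte
        d3 = struct.pack(">H", behavior_score)  # D3: behavior score R (2 bytes assumed)
        reserved = bytearray(6)                 # D4: reserved region (6-byte example)
        if extraordinary:
            reserved[-1] |= 0x01                # D5: last bit of the last byte
        return d1 + d2 + d3 + bytes(reserved)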
Referring back to
The control unit 30 determines whether the process is to be terminated (Step S19). Specifically, if the control unit 30 receives operation of terminating the process, operation of turning off a power supply, or the like, it is determined that the process is to be terminated. If it is determined that the process is not to be terminated (Step S19; No), the process goes to Step S10. If it is determined that the process is to be terminated (Step S19; Yes), the process in
As described above, the information processing apparatus 10 according to the first embodiment detects the behavior state of the user U and generates the behavior pattern information group corresponding to the behavior state in the multidimensional space. Therefore, the information processing apparatus 10 according to the first embodiment is able to identify the behavior pattern of the user U on the basis of the behavior pattern information group.
A second embodiment will be described below.
In the second embodiment, the information processing apparatus 10 determines whether the identified behavior pattern of the user U is an ordinary behavior that is ordinarily performed or an extraordinary behavior that is extraordinarily performed.
In
In the second embodiment, the behavior pattern information included in the space SB is a behavior pattern for which the behavior score is smaller than a first threshold. In the second embodiment, the behavior pattern for which the behavior score is smaller than the first threshold is determined as the behavior pattern of the ordinary behavior. In the example illustrated in
In the second embodiment, a behavior pattern in a space between the space SA and the space SB is a behavior pattern for which the behavior score is equal to or larger than the first threshold and smaller than a second threshold. In the second embodiment, the behavior pattern for which the behavior score is equal to or larger than the first threshold and smaller than the second threshold is determined as the behavior pattern of the extraordinary behavior. In the example illustrated in
In the second embodiment, a behavior pattern included in an outer space of the space SA is a behavior pattern for which the behavior score is equal to or larger than the second threshold. In the second embodiment, the behavior pattern for which the behavior score is equal to or larger than the second threshold is excluded from targets and not included as the behavior pattern of the user.
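The two-threshold rule described above can be summarized in a short sketch; the threshold values themselves are left open by the embodiment.

    def classify_behavior(score, first_threshold, second_threshold):
        # Smaller behavior scores indicate higher reliability of the pattern.
        if score < first_threshold:
            return "ordinary"          # behavior pattern of the ordinary behavior
        if score < second_threshold:
            return "extraordinary"     # behavior pattern of the extraordinary behavior
        return None                    # excluded from the user's behavior patterns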
Referring back to
The control unit 30 determines whether the behavior score is smaller than the first threshold (Step S24). Specifically, the behavior pattern identification unit 46 determines whether the behavior score that is calculated by the behavior score calculation unit 44 at Step S23 is smaller than the predetermined first threshold. If it is determined that the behavior score is smaller than the first threshold (Step S24; Yes), the process goes to Step S25. If it is determined that the behavior score is not smaller than the first threshold (Step S24; No), the process goes to Step S28.
If it is determined as Yes at Step S24, the control unit 30 identifies the behavior pattern of the ordinary behavior (Step S25). Specifically, the behavior pattern identification unit 46 identifies, as the behavior pattern of the ordinary behavior of the user U, a behavior that corresponds to the behavior pattern information group for which the behavior score is smaller than the predetermined first threshold.
The control unit 30 identifies a type of the behavior state of the identified behavior pattern of the ordinary behavior (Step S26). Specifically, the behavior pattern identification unit 46 may identify the type of the behavior state of the ordinary behavior that is performed by the user U, on the basis of the behavior state information that is acquired by the behavior state information acquisition unit 40.
The control unit 30 stores the behavior pattern of the ordinary behavior in the storage unit 28 (Step S27). Specifically, the storage control unit 48 stores the behavior pattern of the ordinary behavior that is identified at Step S25 in a predetermined format.
If it is determined as No at Step S24, the control unit 30 determines whether the behavior score is equal to or larger than the first threshold and smaller than the second threshold (Step S28). Specifically, the behavior pattern identification unit 46 determines whether the behavior score that is calculated by the behavior score calculation unit 44 at Step S23 is equal to or larger than the predetermined first threshold and smaller than the second threshold. If it is determined that the behavior score is equal to or larger than the first threshold and smaller than the second threshold (Step S28; Yes), the process goes to Step S29. If it is determined that the behavior score is not equal to or larger than the first threshold and smaller than the second threshold (Step S28; No), the process goes to Step S32.
If it is determined as Yes at Step S28, the control unit 30 identifies the behavior pattern of the extraordinary behavior (Step S29). Specifically, the behavior pattern identification unit 46 identifies, as the behavior pattern of the extraordinary behavior, a behavior that corresponds to the behavior pattern information group for which the behavior score is equal to or larger than the predetermined first threshold and smaller than the second threshold.
The control unit 30 identifies a type of the behavior state of the identified behavior pattern of the extraordinary behavior (Step S30). Specifically, the behavior pattern identification unit 46 may identify the type of the behavior state of the extraordinary behavior that is performed by the user U, on the basis of the behavior state information that is acquired by the behavior state information acquisition unit 40.
The control unit 30 stores the behavior pattern of the extraordinary behavior in the storage unit 28 (Step S31). Specifically, the storage control unit 48 stores the behavior pattern of the extraordinary behavior that is identified at Step S29 in a predetermined format.
Processes at Step S32 and Step S33 are the same as the processes at Step S18 and Step S19 illustrated in
As described above, the information processing apparatus 10 according to the second embodiment determines whether the behavior pattern is a behavior pattern of the ordinary behavior or the extraordinary behavior on the basis of the behavior score. With this configuration, the information processing apparatus 10 according to the second embodiment is able to determine whether the same behavior is an ordinary routine or a new behavior.
Specifically, the identification of the ordinary behavior and the extraordinary behavior in the second embodiment may be adopted to determine whether a user has an interest in a behavior, in particular, whether a behavior pattern that occurs during a commuting time on a weekday is a routine behavior or an intentionally performed behavior. For example, when a user commutes from a certain station to a different station at a fixed time every day, a behavior of walking to the station at the same time can be determined to be a routine rather than an active behavior performed with interest, and therefore, calculation can be performed while statistically eliminating data of this behavior pattern.
An information processing apparatus according to a third embodiment will be described below with reference to
As illustrated in
The behavior of the user U includes biological information, such as a degree of excitement, in addition to the physical movement. Therefore, when the behavior pattern of the user U is to be identified, it is preferable to take into account a psychological situation of the user U at the time of the behavior. The information processing apparatus 10a calculates an activity level score that indicates a degree at which the user U enjoys the behavior.
The biological sensor 32 is a sensor that detects biological information on the user U. The biological sensor 32 may be arranged at an arbitrary position as long as it is possible to detect the biological information on the user U. It is preferable that the biological information used here is information for which a value changes depending on the state of the user U, instead of stable information, such as a fingerprint. Furthermore, it is more preferable that the biological information used here is information on an autonomic nerve of the user U, that is, information for which a value changes regardless of the intention of the user U. Specifically, the biological sensor 32 includes a pulse wave sensor 32A and detects, as the biological information, a pulse wave of the user U. The biological sensor 32 may also include a brain wave sensor that detects a brain wave of the user U.
The pulse wave sensor 32A is a sensor that detects a pulse wave of the user U. The pulse wave sensor 32A may be a transmissive photoelectric sensor that includes a light emitting unit and a light receiving unit, for example. In this case, the pulse wave sensor 32A may be configured such that the light emitting unit and the light receiving unit face each other across a fingertip of the user U and the light receiving unit receives light that has been transmitted through the fingertip, and may measure a waveform of a pulse by using the fact that a blood flow increases with an increase in pulse wave pressure. However, the pulse wave sensor 32A is not limited to the example described above; any type of sensor that is able to detect a pulse wave is applicable.
The biological information acquisition unit 52 controls the biological sensor 32 and causes the biological sensor 32 to detect the biological information. The biological information acquisition unit 52 acquires the biological information that is detected by the biological sensor 32.
The activity level score calculation unit 54 calculates an autonomic nerve activity level on the basis of the biological information that is acquired by the biological information acquisition unit 52. A method of calculating the autonomic nerve activity level will be described later. The activity level score calculation unit 54 also calculates an activity level score on the basis of the behavior score that is calculated by the behavior score calculation unit 44, the behavior pattern that is identified by the behavior pattern identification unit 46, and the autonomic nerve activity level.
Details of Process
Details of a process performed by the information processing apparatus 10a according to the third embodiment will be described below with reference to
A process at Step S40 is the same as the process at Step S10 illustrated in
The control unit 30a acquires the biological information on the user U (Step S41). Specifically, the biological information acquisition unit 52 controls the pulse wave sensor 32A of the biological sensor 32 and acquires pulse wave information on the user U. In the present embodiment, as will be described later, the autonomic nerve activity level, which is an index that indicates a degree of stress, a degree of relaxation, a degree of interest, and a degree of concentration in a psychological state of the user U, is calculated by using the pulse wave information on the user U.
Processes from Step S42 to Step S47 are the same as the processes from Step S12 to Step S17 illustrated in
The control unit 30a calculates the activity level score (Step S48). Specifically, the activity level score calculation unit 54 calculates the activity level score of the user U on the basis of the pulse wave information that is acquired at Step S41.
A pulse wave will be described below with reference to
Fluctuation of the duration of the R-R interval includes some characteristic components. One is fluctuation of a low-frequency component that appears at around 0.1 Hz, which is caused by variation in the activity of the sympathetic nervous system due to feedback regulation of blood pressure in the blood vessels. Another is fluctuation of a high-frequency component that reflects variation synchronized with respiration, that is, respiratory sinus arrhythmia. The high-frequency component reflects the direct action of the respiratory center on the vagus nerve, stretch receptors in the lungs, and the baroreceptor reflex due to changes in blood pressure caused by respiration, and is mainly used as an index of the parasympathetic nerve activity that affects the heart. In other words, among the waveform components obtained by measuring fluctuation of the R-R interval of the pulse wave, the power spectrum of the low-frequency component represents the degree of activity of the sympathetic nerve, and the power spectrum of the high-frequency component represents the degree of activity of the parasympathetic nerve.
The fluctuation of the input pulse wave is obtained from differential values of the R-R intervals. In this case, if the differential values of the R-R intervals of the pulse wave are not data at temporally equal intervals, the activity level score calculation unit 54 converts the data to chronological data at equal intervals by using cubic spline interpolation or the like. The activity level score calculation unit 54 performs orthogonal transformation on the differential values of the R-R intervals by using the fast Fourier transform or the like. Accordingly, the activity level score calculation unit 54 calculates the power spectrums of the high-frequency components and the low-frequency components of the differential values of the R-R intervals of the pulse wave. The activity level score calculation unit 54 calculates a total sum of the power spectrums of the high-frequency components as RRHF. The activity level score calculation unit 54 calculates a total sum of the power spectrums of the low-frequency components as RRLF. The activity level score calculation unit 54 calculates the autonomic nerve activity level by using Expression (1) below. The activity level score calculation unit 54 may also be referred to as an autonomic nerve activity level calculation unit.
In Expression (1), AN represents the autonomic nerve activity level, RRHF represents the total sum of the power spectrums of the high-frequency components, and RRLF represents the total sum of the power spectrums of the low-frequency components. C1 and C2 are fixed values that are defined to prevent divergence of solution.
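This calculation can be sketched as follows. The resampling rate and the low-frequency (0.04 to 0.15 Hz) and high-frequency (0.15 to 0.40 Hz) bands are conventional heart-rate-variability choices rather than values fixed by the embodiment, and because Expression (1) itself is not reproduced here, the final line assumes the form AN = (RRHF + C1) / (RRLF + C2) purely for illustration.

    import numpy as np
    from scipy.interpolate import CubicSpline

    def autonomic_nerve_activity_level(beat_times, rr_intervals, fs=4.0, c1=1.0, c2=1.0):
        # Differential values of the R-R intervals, resampled to a temporally
        # equal interval by cubic spline interpolation.
        diffs = np.diff(rr_intervals)
        spline = CubicSpline(beat_times[1:], diffs)
        t = np.arange(beat_times[1], beat_times[-1], 1.0 / fs)
        resampled = spline(t)
        # Orthogonal transformation (FFT) and power spectrum.
        power = np.abs(np.fft.rfft(resampled)) ** 2
        freqs = np.fft.rfftfreq(len(resampled), d=1.0 / fs)
        rr_lf = power[(freqs >= 0.04) & (freqs < 0.15)].sum()  # sympathetic index
        rr_hf = power[(freqs >= 0.15) & (freqs < 0.40)].sum()  # parasympathetic index
        return (rr_hf + c1) / (rr_lf + c2)  # assumed form of Expression (1)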
The activity level score calculation unit 54 calculates the activity level score by using Expression (2) below.
NS = f(AP, R, AN)   (2)
In Expression (2), NS represents the activity level score, AP represents the behavior pattern, R represents the behavior score, and AN represents the autonomic nerve activity level. In other words, the activity level score calculation unit 54 may calculate the activity level score by using a function that includes, as parameters, the behavior pattern, the behavior score, and the autonomic nerve activity level.
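Expression (2) leaves the function f unspecified; the sketch below is one purely illustrative monotone choice, with hypothetical per-pattern weights, that rewards a high autonomic nerve activity level and discounts a large (less reliable) behavior score.

    # Hypothetical weights per behavior pattern AP; not part of the disclosure.
    PATTERN_WEIGHTS = {"play golf": 1.2, "watch movie": 1.0, "go shopping": 0.9}

    def activity_level_score(ap, r, an):
        # NS = f(AP, R, AN): scale the autonomic nerve activity level AN by a
        # per-pattern weight and discount it by the behavior score R.
        return PATTERN_WEIGHTS.get(ap, 1.0) * an / (1.0 + r)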
Furthermore, the activity level score calculation unit 54 may calculate the activity level score by using, for example, a learning model. The learning model is an AI model that is constructed by adopting the behavior pattern, the behavior score, the autonomic nerve activity level, and the activity level score as a single data set, and performing learning by using a plurality of data sets as teacher data. In this case, the activity level score calculation unit 54 inputs the behavior pattern, the behavior score, and the autonomic nerve activity level to the learned learning model, acquires information indicating the activity level score, and thereby calculates the activity level score.
The control unit 30a provides the activity level score to the user U (Step S49). Specifically, the output control unit 50 controls at least one of the display unit 24A and the sound output unit 24B, and provides the activity level score to the user U.
The control unit 30a stores the behavior pattern and the activity level score in the storage unit 28 (Step S50). Specifically, the storage control unit 48 stores the behavior pattern identified at Step S46 and the activity level score in a predetermined format.
In the region D1a, a numbering value of the identified behavior pattern is stored. The region D1a is configured with, for example, 3 bytes. In the region D2a, the number of dimensions of the space in which the behavior pattern information is plotted as a group is stored. The region D2a is configured with, for example, 1 byte. In the region D3a, the behavior score R of the behavior pattern information group that is determined as the behavior pattern of the user U is stored. In the region D4a, the autonomic nerve activity level is stored. The region D4a is configured with, for example, 2 bytes. In the region D5a, the activity level score is stored. The region D5a is configured with, for example, 2 bytes. The region D6a is a reserved region.
Processes at Step S51 and Step S52 are the same as the processes at Step S18 and Step S19 illustrated in
As described above, the information processing apparatus 10a according to the third embodiment is able to calculate the activity level score that indicates a degree at which the user enjoys the behavior when the user U is performing the behavior that is identified as the behavior pattern of the user U. With this configuration, the information processing apparatus 10a according to the third embodiment is able to more appropriately identify the behavior pattern of the user U.
An information processing apparatus according to a fourth embodiment will be described below with reference to
As illustrated in
In a different country, a region, or the like, even the same behavior pattern may be perceived in a different manner due to a difference in sensibility. The information processing apparatus 10b corrects the activity level score depending on a country or a region.
The correction data 28C is data that is used when the activity level score correction unit 56 corrects the activity level score. The correction data 28C is, for example, data in which the behavior pattern is associated with a correction coefficient by which the activity level score is multiplied in accordance with a country or a region.
As illustrated in
The activity level score correction unit 56 corrects the activity level score that is calculated by the activity level score calculation unit 54. Specifically, the activity level score correction unit 56 corrects the activity level score by using the correction data 28C, on the basis of a country or a region in which the behavior pattern is identified.
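As a minimal sketch, the correction data 28C can be modeled as a lookup table keyed by the behavior pattern and the country or region; the entries below are hypothetical. The same table-driven multiplication also fits the modification described later, in which the autonomic nerve activity level itself is multiplied by a per-region coefficient such as 0.5 for the area A1.

    # Hypothetical correction data 28C: (behavior pattern, country/region) -> coefficient.
    CORRECTION_DATA = {
        ("play golf", "JP"): 1.2,
        ("play golf", "US"): 0.8,
    }

    def correct_activity_level_score(score, behavior_pattern, region):
        # Multiply the activity level score by the coefficient associated with
        # the behavior pattern and the region in which it was identified;
        # default to 1.0 when no entry exists.
        return score * CORRECTION_DATA.get((behavior_pattern, region), 1.0)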
Details of Process
Details of a process performed by the information processing apparatus 10b according to the fourth embodiment will be described below with reference to
Processes at Step S60 to Step S68 are the same as the processes at Step S40 to Step S48 illustrated in
The control unit 30b corrects the activity level score (Step S69). Specifically, the activity level score correction unit 56 corrects the activity level score that is calculated by the activity level score calculation unit 54 by using the correction data 28C on the basis of the positional information that is identified by the behavior pattern identification unit 46.
The control unit 30b provides the corrected activity level score to the user U (Step S70). Specifically, the output control unit 50 controls at least one of the display unit 24A and the sound output unit 24B and provides the corrected activity level score to the user U.
The control unit 30b stores the behavior pattern and the corrected activity level score in the storage unit 28 (Step S71). Specifically, the storage control unit 48 records the behavior pattern identified at Step S66 and the corrected activity level score in a predetermined format.
Processes at Step S72 and Step S73 are the same as the processes at Step S51 and Step S52 illustrated in
As described above, the information processing apparatus 10b according to the fourth embodiment multiplies the activity level score by the correction coefficient in accordance with a country or a region, and corrects the activity level score. With this configuration, the information processing apparatus 10b according to the fourth embodiment is able to more appropriately correct the activity level score in accordance with a country or a region.
A modification of the fourth embodiment will be described below. In the fourth embodiment, as illustrated in
For example, if it is determined that the region in which the autonomic nerve activity level of the user U is calculated is the area A1, the activity level score correction unit 56 corrects the autonomic nerve activity level by multiplying the calculated autonomic nerve activity level by 0.5. With this configuration, in the modification of the fourth embodiment, it is possible to more appropriately calculate the autonomic nerve activity level depending on each region.
A fifth embodiment will be described below.
In the fifth embodiment, the information processing apparatus 10b separately calculates the activity level score for the identified behavior pattern of the ordinary behavior and the identified behavior pattern of the extraordinary behavior of the user U. Further, in the fifth embodiment, the information processing apparatus 10b separately corrects the calculated activity level score of the behavior pattern of the ordinary behavior and the calculated activity level score of the behavior pattern of the extraordinary behavior of the user U.
Processes at Step S80 to Step S84 are the same as the processes at Step S40 to Step S44 illustrated in
Processes at Step S85 to Step S87 are the same as the processes at Step S24 to Step S26 illustrated in
The control unit 30b calculates the activity level score of the behavior pattern of the ordinary behavior (Step S88). Specifically, the activity level score calculation unit 54 calculates the activity level score of the behavior pattern of the ordinary behavior of the user U on the basis of the pulse wave information acquired at Step S81.
The control unit 30b corrects the activity level score of the behavior pattern of the ordinary behavior (Step S89). Specifically, the activity level score correction unit 56 corrects the activity level score of the behavior pattern of the ordinary behavior that is calculated by the activity level score calculation unit 54, by using the correction data 28C on the basis of the positional information that is identified by the behavior pattern identification unit 46.
The control unit 30b provides the corrected activity level score of the behavior pattern of the ordinary behavior to the user U (Step S90). Specifically, the output control unit 50 controls at least one of the display unit 24A and the sound output unit 24B, and provides the corrected activity level score to the user U.
The control unit 30b stores the corrected activity level score of the behavior pattern of the ordinary behavior in the storage unit 28 (Step S91). Specifically, the storage control unit 48 records the behavior pattern of the ordinary behavior that is identified at Step S86 and the corrected activity level score in a predetermined format.
If it is determined as No at Step S85, the process goes to Step S92. Processes at Step S92 to Step S94 are the same as the processes at Step S28 to Step S30 illustrated in
The control unit 30b calculates the activity level score of the behavior pattern of the extraordinary behavior (Step S95). Specifically, the activity level score calculation unit 54 calculates the activity level score of the behavior pattern of the extraordinary behavior of the user U on the basis of the pulse wave information that is acquired at Step S81.
The control unit 30b corrects the activity level score of the behavior pattern of the extraordinary behavior (Step S96). Specifically, the activity level score correction unit 56 corrects the activity level score of the behavior pattern of the extraordinary behavior that is calculated by the activity level score calculation unit 54, by using the correction data 28C on the basis of the positional information that is identified by the behavior pattern identification unit 46.
The control unit 30b provides the corrected activity level score of the behavior pattern of the extraordinary behavior to the user U (Step S97). Specifically, the output control unit 50 controls at least one of the display unit 24A and the sound output unit 24B, and provides the corrected activity level score to the user U.
The control unit 30b stores the behavior pattern of the extraordinary behavior and the corrected activity level score in the storage unit 28 (Step S98). Specifically, the storage control unit 48 stores the behavior pattern of the extraordinary behavior that is identified at Step S93 and the corrected activity level score in a predetermined format.
Processes at Step S99 and Step S100 are the same as the processes at Step S51 and Step S52 illustrated in
As described above, the information processing apparatus 10b according to the fifth embodiment is able to separately calculate the activity level score in each of a case where a behavior that is identified as the behavior pattern of the ordinary behavior is performed and a case where a behavior that is identified as the behavior pattern of the extraordinary behavior is performed. With this configuration, the information processing apparatus 10b according to the fifth embodiment is able to more appropriately identify the behavior pattern of the user U.
Furthermore, the information processing apparatus 10b according to the fifth embodiment multiplies the activity level score by the correction coefficient in accordance with a country or a region and corrects the activity level score of the behavior pattern of the ordinary behavior and the activity level score of the behavior pattern of the extraordinary behavior. With this configuration, the information processing apparatus 10b according to the fifth embodiment is able to more appropriately correct the activity level score in accordance with a country or a region.
An information processing apparatus according to a sixth embodiment will be described below with reference to
As illustrated in
Users may have different hobbies and interests, so that even when the users are performing the same behaviors, a value of the activity level score may vary for each of the users. The information processing apparatus 10c according to the sixth embodiment calculates the activity level score by using a learned model that is customized for each of the users.
The history data 28D is data related to a history of activity level scores. The history data 28D may include information on ranks of the activity level scores in a predetermined period for each user. Specifically, the history data 28D may include information on the behavior pattern for which the activity level score is higher than a predetermined rank in the predetermined period. The predetermined period is, for example, three months, but not limited thereto.
As illustrated in
As illustrated in
As illustrated in
As illustrated in
The history data acquisition unit 58 acquires the history data 28D from a storage unit 28c. Specifically, the history data acquisition unit 58 acquires the history data 28D of a certain user for whom the activity level score is to be calculated.
The learning unit 60 generates a learned model for calculating the activity level score of the user through learning by machine learning based on learning data. In the present embodiment, the learning unit 60 generates a learned model for calculating the activity level score on the basis of the history data 28D that is acquired by the history data acquisition unit 58, for example. The learning unit 60 performs learning on a weight of a deep neural network (DNN) as the learned model for calculating the activity level score, for example. The learning unit 60 may perform learning by using a well-known machine learning method, such as deep learning, for example. The learning unit 60 may update the learned model every time the learning data is updated, for example.
A learning process according to the sixth embodiment will be described below with reference to
The control unit 30c acquires the learning data (Step S110). Specifically, the history data acquisition unit 58 acquires, from the storage unit 28c, the history data 28D corresponding to the predetermined period for the user for whom the activity level score is to be calculated. The history data 28D acquired by the history data acquisition unit 58 may include information on at least the rank, the behavior pattern, the behavior score, and the activity level score. The history data acquisition unit 58 acquires the history data 28D including the first rank to the 1000-th rank in the past three months, for example.
The control unit 30c performs the learning process (Step S111). Specifically, the learning unit 60 generates, through machine learning, the learned model for calculating the activity level score of the user, by using the history data 28D that is acquired by the history data acquisition unit 58. More specifically, the learning unit 60 generates the learned model by adopting the behavior pattern, the behavior score, the autonomic nerve activity level, and the activity level score as a single data set, and performing learning using a plurality of (for example, 1000) data sets as teacher data. The learning unit 60 generates the learned model for each user for whom the activity level score is to be calculated, for example. In other words, in the present embodiment, a learned model that is customized for each of the users is generated.
The control unit 30c stores the learned model (Step S112). Specifically, the learning unit 60 stores the generated learned model in the storage unit 28c.
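As a hedged sketch of this per-user learning, a scikit-learn regressor below stands in for the DNN: each teacher data set pairs (behavior pattern, behavior score, autonomic nerve activity level) with the recorded activity level score, and one model is fitted per user. The numeric encoding of the behavior pattern and the network shape are assumptions.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def train_activity_score_model(history_data):
        # history_data: the user's ranked records, e.g., the first to the
        # 1000-th rank of the past three months from the history data 28D.
        X = np.array([[h["pattern_id"], h["behavior_score"], h["an_level"]]
                      for h in history_data])
        y = np.array([h["activity_score"] for h in history_data])
        model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000)
        model.fit(X, y)
        return model  # one customized learned model per user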
Details of Process
Details of a process performed by the information processing apparatus 10c according to the sixth embodiment will be described below.
Processes from Step S120 to Step S127 are the same as the processes from Step S40 to Step S47 described above.
The control unit 30c calculates the activity level score on the basis of the learned model corresponding to the user (Step S128). Specifically, the activity level score calculation unit 54 calculates the activity level score of the user by using the learned model that is customized for the user.
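As a usage illustration, a per-user model stored as in the sketch above could be loaded and applied as follows; the file name, feature layout, and network shape are the same hypothetical assumptions as before.

```python
import torch
from torch import nn

# Rebuild the same (hypothetical) architecture and load the stored weights.
model = nn.Sequential(
    nn.Linear(10, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 1),
)
model.load_state_dict(torch.load("user_U1_model.pt"))
model.eval()

features = torch.randn(1, 10)  # current behavior pattern, behavior score,
with torch.no_grad():          # and autonomic nerve activity level
    score = model(features).item()
print(f"activity level score: {score:.2f}")
```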
Processes from Step S129 to Step S132 are the same as the processes from Step S49 to Step S52 described above.
As described above, the information processing apparatus 10c according to the sixth embodiment calculates the activity level scores of the users, such as the user U1 to the user U3, by using the learned model that is generated and customized in accordance with a history of the activity level scores for each of the users. With this configuration, the information processing apparatus 10c according to the sixth embodiment is able to more appropriately calculate the activity level score in accordance with the sensibility of the user.
An information processing system according to a seventh embodiment will be described below.
The information processing system according to the seventh embodiment includes a plurality of the information processing apparatuses 10c and a server apparatus 100 that are communicably connected to one another via a network N.
In the sixth embodiment as described above, the activity level score of the user is calculated by using the learned model that is generated and customized in accordance with the history of the activity level scores for each user. In the seventh embodiment, the server apparatus 100 stores pieces of history data of a plurality of users as shared data, a learned model is generated on the basis of pieces of history data of a plurality of users who have similar activity level scores and tendencies of behavior patterns, and the activity level score of a user is calculated by using the learned model.
A configuration of the server apparatus according to the seventh embodiment will be described below.
The server apparatus 100 includes a communication unit 110, a control unit 120, and a storage unit 130.
The communication unit 110 is implemented by, for example, a network interface card (NIC), a communication circuit, or the like. The communication unit 110 is connected to the network N in a wired or wireless manner, and transmits and receives information to and from the information processing apparatuses 10c.
The control unit 120 controls operation of each of the units of the server apparatus 100. The control unit 120 is implemented by, for example, causing a CPU, an MPU, or the like to execute a program that is stored in a storage unit (not illustrated) by using a RAM or the like as a work area. The control unit 120 may be implemented by, for example, an integrated circuit, such as an ASIC or an FPGA. The control unit 120 may be implemented by a combination of hardware and software. The control unit 120 includes an acquisition unit 122, a determination unit 124, a requesting unit 126, and a providing unit 128.
The acquisition unit 122 acquires, via the communication unit 110, the history data related to a history of the activity level scores of each of the users who are wearing the information processing apparatuses 10c, for example. The acquisition unit 122 acquires, for example, the history data 28D1 to the history data 28D3 of the user U1 to the user U3.
The determination unit 124 determines a tendency of the history data in the shared data 132. The determination unit 124 determines whether users with similar tendencies of the history data are present, for example.
If a plurality of users with similar tendencies of the history data are present, the requesting unit 126 asks each of the users whether the subject history data may be used by a different user.
If use of the history data is approved, the providing unit 128 provides the history data to the user with the similar tendency of the history data.
The storage unit 130 is a memory for storing various kinds of information, such as contents of calculation performed by the control unit 120 and a program, and includes at least one of a main storage device, such as a RAM or a ROM, and an external storage device, such as an HDD, for example.
The storage unit 130 stores therein the shared data 132. The shared data 132 may include pieces of the history data related to the activity level scores of the plurality of users who are wearing the information processing apparatuses 10c. The shared data 132 may include the history data 28D1 to the history data 28D3 of the user U1 to the user U3, for example.
Process Performed by Server Apparatus
A process performed by the server apparatus according to the seventh embodiment will be described below.
The control unit 120 refers to the shared data 132 and determines whether users with similar history data are present (Step S140). For example, it is assumed that the shared data 132 includes the history data 28D1 to the history data 28D3 of the user U1 to the user U3.
In the present embodiment, the case has been described in which the determination unit 124 selects two from the three users, that is, the user U1 to the user U3; however, this is described by way of example, and the number of users in an actual population is not specifically limited. The determination unit 124 may determine that three or more users have similar history data. The determination unit 124 may determine whether pieces of the history data are similar to each other by a method other than the method that is described in the present embodiment. The determination unit 124 may determine whether the pieces of history data are similar to each other in accordance with a predetermined conditional expression that is mathematically defined, for example.
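One hypothetical example of such a conditional expression is to summarize each user's history as a vector (for example, a mean activity level score per behavior pattern) and to regard two users as similar when the cosine similarity of their vectors exceeds a threshold; the representation and the threshold below are assumptions for illustration only.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity of two tendency vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def similar_pairs(tendencies: dict[str, np.ndarray], threshold: float = 0.9):
    """Return all user pairs whose history tendencies are similar."""
    users = sorted(tendencies)
    return [(u, v) for i, u in enumerate(users) for v in users[i + 1:]
            if cosine_similarity(tendencies[u], tendencies[v]) >= threshold]

tendencies = {
    "U1": np.array([0.8, 0.1, 0.6, 0.9]),  # mean score per behavior pattern
    "U2": np.array([0.7, 0.2, 0.5, 0.8]),
    "U3": np.array([0.1, 0.9, 0.2, 0.1]),
}
print(similar_pairs(tendencies))  # e.g., [('U1', 'U2')]
```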
If it is determined that the similar users are present (Step S140; Yes), the process goes to Step S141. If it is determined that the similar users are not present (Step S140; No), the process ends.
If it is determined as Yes at Step S140, the control unit 120 requests the users with the similar history data to approve sharing (Step S141). Specifically, the requesting unit 126 transmits a notice for requesting approval for sharing of the history data to the user U1 and the user U2 via the communication unit 110.
The control unit 120 determines whether the request for sharing is approved (Step S142). Specifically, the requesting unit 126 determines whether a reply indicating approval of sharing of the history data is provided from the user U1 or the user U2 in response to the request for sharing the history data, which is transmitted at Step S141. If it is determined that the request for sharing is approved (Step S142; Yes), the process goes to Step S143. If it is determined that the request for sharing is not approved (Step S142; No), the process ends.
If it is determined as Yes at Step S142, the control unit 120 shares the history data (Step S143). Specifically, for example, if the user U2 has approved sharing of the history data, the providing unit 128 transmits the history data 28D2 of the user U2 to the information processing apparatus 10c that is worn on the user U1 via the communication unit 110. Then, the process ends.
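The flow from Step S140 to Step S143 could be sketched as follows; the callback names stand in for the determination unit 124, the requesting unit 126, and the providing unit 128, and the approval mechanism is a hypothetical simplification.

```python
def share_history(shared_data, find_similar_pairs, request_approval, provide):
    """find_similar_pairs() -> list of (user, user) pairs with similar data,
    request_approval(user) -> bool, provide(history_data, to_user) -> None."""
    pairs = find_similar_pairs()        # Step S140: look for similar users
    if not pairs:                       # Step S140; No: end the process
        return
    for u, v in pairs:                  # Step S141: request approval
        if request_approval(v):         # Step S142: v approves sharing
            provide(shared_data[v], u)  # Step S143: v's data goes to u
        if request_approval(u):
            provide(shared_data[u], v)

# Usage illustration with stub callbacks:
shared = {"U1": ["28D1 records"], "U2": ["28D2 records"]}
share_history(
    shared,
    find_similar_pairs=lambda: [("U1", "U2")],
    request_approval=lambda user: True,
    provide=lambda data, to: print(f"providing {len(data)} record(s) to {to}"),
)
```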
The learning process according to the seventh embodiment will be described below.
The control unit 30c acquires the history data (Step S150). Specifically, the history data acquisition unit 58 acquires, from the storage unit 28c, the history data 28D1 corresponding to the predetermined period for the user for whom the activity level score is to be calculated. The history data 28D1 that is acquired by the history data acquisition unit 58 may include information on at least the rank, the behavior pattern, the behavior score, and the activity level score. The history data acquisition unit 58 acquires the history data 28D1 including the first rank to the 1000-th rank in the past three months, for example. In other words, the history data acquisition unit 58 acquires 1000 data sets as teacher data.
The control unit 30c acquires, from the server apparatus 100, the history data that is similar to the history data of the user U1 (Step S151). Specifically, the history data acquisition unit 58 acquires the history data 28D2 of the user U2 from the server apparatus 100 via the communication unit 26, for example. Here, the 1000 data sets of the history data 28D1 may include only a small number of behavior patterns, whereas, in some cases, 6000 to 8000 data sets or more may be needed to generate the learned model. In the present embodiment, the history data acquisition unit 58 acquires the history data 28D2 that is similar to the history data 28D1 from the server apparatus 100, so that it is possible to compensate for the number of the data sets and generate a more appropriate learned model.
The control unit 30c performs the learning process (Step S152). Specifically, the learning unit 60 generates, through machine learning, the learned model for calculating the activity level score of the user by using the history data 28D1 and the history data 28D2 that are acquired by the history data acquisition unit 58.
The control unit 30c stores the learned model (Step S153). Specifically, the learning unit 60 stores the generated learned model in the storage unit 28c. Then, the process ends.
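Continuing the hypothetical PyTorch sketch from the sixth embodiment, compensating for a small per-user data set could look as follows; the data set sizes follow the figures mentioned above, and train_user_model refers to the earlier illustrative function.

```python
# Sketch of Steps S150 to S152: merge the user's own teacher data (28D1)
# with the approved similar user's data (28D2) before training.
# train_user_model is the hypothetical function from the earlier sketch.
import torch

x_own, y_own = torch.randn(1000, 10), torch.randn(1000)        # from 28D1
x_shared, y_shared = torch.randn(6000, 10), torch.randn(6000)  # from 28D2

features = torch.cat([x_own, x_shared])  # 7000 data sets in total
targets = torch.cat([y_own, y_shared])
model = train_user_model(features, targets)
torch.save(model.state_dict(), "user_U1_shared_model.pt")
```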
Details of Process
Details of a process performed by the information processing apparatus 10c will be described below.
Processes from Step S160 to Step S167 are the same as the processes from Step S120 to Step S127 described above.
The control unit 30c calculates the activity level score on the basis of the learned model that is generated by using the shared data (Step S168). Specifically, the activity level score calculation unit 54 calculates the activity level score of the user U1 by using the learned model that is generated by using the history data 28D1 and the history data 28D2.
Processes from Step S169 to Step S172 are the same as the processes from Step S129 to Step S132 described above.
As described above, the information processing apparatus 10c according to the seventh embodiment generates the learned model by using the history data 28D1 of the user U1 and the history data 28D2 of the user U2 that is similar to the history data 28D1, and calculates the activity level score by using the learned model. With this configuration, the information processing apparatus 10c according to the seventh embodiment is able to more appropriately calculate the activity level score.
An information processing apparatus according to an eighth embodiment will be described below.
The information processing apparatus 10d according to the eighth embodiment includes a tactile stimulation output unit 26C.
The biological information on the user changes from moment to moment, and therefore, the activity level score of the user may change depending on a change of the biological information. The information processing apparatus 10d presents a temporal change of the activity level score of the user in a comparative manner, and thereby provides the change of the activity level score in an easily understandable manner.
The tactile stimulation output unit 26C is a device that outputs tactile stimulation to the user U. For example, the tactile stimulation output unit 26C outputs the tactile stimulation to the user by physical operation, such as vibration; however, the type of the tactile stimulation is not limited to vibration, and may be any type of stimulation.
The output control unit 50 of a control unit 30d of the information processing apparatus 10d according to the eighth embodiment causes the output unit 24a to output information indicating a temporal change of the activity level score that is calculated by the activity level score calculation unit 54. In other words, the output control unit 50 causes the output unit 24a to output information indicating a relative change of the activity level score.
A method of displaying a temporal change of the activity level score in a comparative manner will be described below.
The output control unit 50 may cause the sound output unit 24B or the tactile stimulation output unit 26C to output information indicating a temporal change of the activity level score, for example.
A method of outputting the information indicating the temporal change of the activity level score by the sound output unit 24B or the tactile stimulation output unit 26C will be described below.
Here, at a time t1, it is assumed that the activity level score is changed to the activity level score NS2 that is larger than the activity level score NS1. In this case, the output control unit 50 causes the sound output unit 24B to output a set of sound corresponding to the activity level score NS1 and sound corresponding to the activity level score NS2, for example. The output control unit 50 causes the tactile stimulation output unit 26C to output a set of stimulation corresponding to the activity level score NS1 and stimulation corresponding to the activity level score NS2, for example. The sound corresponding to the activity level score NS2 is louder than the sound corresponding to the activity level score NS1. The stimulation corresponding to the activity level score NS2 is stronger than the stimulation corresponding to the activity level score NS1. It is preferable to change the volume of the sound corresponding to the activity level score NS2 in accordance with a ratio of the activity level score NS2 to the activity level score NS1, for example. It is preferable to change the magnitude of the stimulation corresponding to the activity level score NS2 in accordance with the ratio of the activity level score NS2 to the activity level score NS1, for example. With this configuration, the user is able to recognize the magnitude of the activity level score NS2 relative to the activity level score NS1.
Here, at a time t2, it is assumed that the activity level score is changed to an activity level score NS3 that is smaller than the activity level score NS1. In this case, the output control unit 50 causes the sound output unit 24B to output a set of sound corresponding to the activity level score NS1 and sound corresponding to the activity level score NS3, for example. The output control unit 50 causes the tactile stimulation output unit 26C to output a set of stimulation corresponding to the activity level score NS1 and stimulation corresponding to the activity level score NS3, for example. The sound corresponding to the activity level score NS3 is softer than the sound corresponding to the activity level score NS1. The stimulation corresponding to the activity level score NS3 is weaker than the stimulation corresponding to the activity level score NS1. It is preferable to change the volume of the sound corresponding to the activity level score NS3 in accordance with a ratio of the activity level score NS3 to the activity level score NS1, for example. It is preferable to change the magnitude of the stimulation corresponding to the activity level score NS3 in accordance with the ratio of the activity level score NS3 to the activity level score NS1, for example. With this configuration, the user is able to recognize the magnitude of the activity level score NS3 relative to the activity level score NS1.
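A minimal sketch of this ratio-based scaling is shown below, assuming a normalized intensity range of 0.0 to 1.0 and an arbitrary base intensity; the clamping range and default values are illustrative assumptions.

```python
def scaled_intensity(base: float, ns_ref: float, ns_new: float,
                     lo: float = 0.0, hi: float = 1.0) -> float:
    """Output intensity proportional to NS_new / NS_ref, clamped to [lo, hi]."""
    ratio = ns_new / ns_ref if ns_ref else 1.0
    return max(lo, min(hi, base * ratio))

base_volume = 0.4
print(scaled_intensity(base_volume, ns_ref=50, ns_new=80))  # louder: 0.64
print(scaled_intensity(base_volume, ns_ref=50, ns_new=30))  # softer: 0.24
```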
As described above, the information processing apparatus 10d according to the eighth embodiment provides a temporal change of the activity level score of the user in a comparative manner. With this configuration, the information processing apparatus 10d according to the eighth embodiment makes it possible to easily recognize a temporal change of the activity level score.
A program for performing the information processing method described above may be provided by being stored in a non-transitory computer-readable storage medium, or may be provided via a network such as the Internet. Examples of the computer-readable storage medium include optical discs such as a digital versatile disc (DVD) and a compact disc (CD), and other types of storage devices such as a hard disk and a semiconductor memory.
According to the present disclosure, it is possible to identify a behavior pattern of a user based on information on a behavior state of the user.
The information processing apparatus, the information processing method, and the program according to the present disclosure may be applied to a technique for analyzing a behavior of a user.
Although the present disclosure has been described with respect to specific embodiments for a complete and clear disclosure, the appended claims are not to be thus limited but are to be construed as embodying all modifications and alternative constructions that may occur to one skilled in the art that fairly fall within the basic teaching herein set forth.
Number | Date | Country | Kind
---|---|---|---
2020-160161 | Sep 2020 | JP | national
2020-160244 | Sep 2020 | JP | national
2020-160245 | Sep 2020 | JP | national
2020-160246 | Sep 2020 | JP | national
This application is a Continuation of International Application No. PCT/JP2021/033904, filed Sep. 15, 2021, which designates the United States, incorporated herein by reference, and which claims the benefit of priority from Japanese Patent Applications No. 2020-160161, No. 2020-160244, No. 2020-160245, and No. 2020-160246, each filed Sep. 24, 2020, all incorporated herein by reference.
 | Number | Date | Country
---|---|---|---
Parent | PCT/JP2021/033904 | Sep 2021 | US
Child | 18187816 | | US