The present disclosure relates to an information processing system, an information processing method, and a recording medium.
Conventionally, an emotion of a user in a specific situation has been measured and a target of the emotion has been estimated in order to provide feedback on the emotion.
For example, Patent Literature 1 below discloses that an estimated emotion of a user and an object causing the user to have the emotion are acquired, and that, when the estimated emotion of the user is positive, presentation information for maintaining the emotion is presented to the user, whereas when the estimated emotion of the user is negative, presentation information for removing the object is presented to the user.
Further, Patent Literature 2 below relates to a technique for detecting a positive emotion and a negative emotion from the content of a conversation during the conversation. Further, Patent Literature 3 below discloses a system that determines whether a mental state is positive or negative based on a facial expression. Further, Patent Literature 4 below discloses an action control system that properly learns the appropriateness of an action of a robot by changing suitability associated with a selected action of the robot based on an emotion change of a user.
Further, Patent Literature 5 below discloses that, in a dialogue system, a negative emotion and a positive emotion are estimated from a user's utterance or motion, and a response corresponding to the emotion is made to cause the user to feel that the system sympathizes with the user, thereby building trust between the user and the system.
Patent Literature 1: JP 2017-201499 A
Patent Literature 2: JP 2017-091570 A
Patent Literature 3: JP 2016-147006 A
Patent Literature 4: JP 2016-012340 A
Patent Literature 5: JP 2006-178063 A
However, as for the response to an emotion of the user, only a response that is in sympathy with the emotion of the user (that is, a response showing empathy) or a response causing the user to maintain a positive emotion has been made, and a response for making a state of the user better has not been considered.
Further, Patent Literature 1 described above discloses that, when the user has a negative emotion, information for removing an object causing the user to have the emotion is presented; however, in this case, identification of the object is essential, and in a case where the object cannot be removed, it is difficult to reduce the negative emotion of the user.
Therefore, the present disclosure proposes an information processing system, an information processing method, and a recording medium capable of making a state of a user better according to an emotion of the user and improving quality of life.
According to the present disclosure, an information processing system is provided that includes: a control unit that estimates whether a user is positive or negative, and has any one of a function of promoting an action of the user when it is estimated that the user is positive, a function of suppressing an action of the user when it is estimated that the user is negative, or a function of presenting a positive interpretation of a situation or action of the user when it is estimated that the user is negative.
According to the present disclosure, an information processing method, by a processor, is provided that includes: estimating whether a user is positive or negative; and performing any one of a function of promoting an action of the user when it is estimated that the user is positive, a function of suppressing an action of the user when it is estimated that the user is negative, or a function of presenting a positive interpretation of a situation or action of the user when it is estimated that the user is negative.
According to the present disclosure, a recording medium recording a program is provided that causes a computer to function as a control unit that estimates whether a user is positive or negative, and has any one of a function of promoting an action of the user when it is estimated that the user is positive, a function of suppressing an action of the user when it is estimated that the user is negative, or a function of presenting a positive interpretation of a situation or action of the user when it is estimated that the user is negative.
As described above, according to the present disclosure, it is possible to make a state of a user better according to an emotion of the user and improve quality of life.
Note that the above-described effect is not necessarily restrictive, and any one of the effects described in the present specification or other effects that can be understood from the present specification may be exhibited in addition to or in place of the above-described effect.
Hereinafter, a preferred embodiment of the present disclosure will be described in detail with reference to the appended drawings. Note that, in the present specification and the drawings, components having substantially the same functional configuration are provided with the same reference signs, so that repeated description of these components is omitted.
Further, the description will be made in the following order.
1. Overview of Information Processing System According to Embodiment of Present Disclosure
2. Exemplary Embodiments
2-1. First Exemplary Embodiment (Promotion and Suppression of Action)
(2-1-1. Configuration Example)
(2-1-2. Operation Processing)
2-2. Second Exemplary Embodiment (Learning in Database)
(2-2-1. Configuration Example)
(2-2-2. Operation Processing)
2-3. Third Exemplary Embodiment (Brain Stimulation)
(2-3-1. Configuration Example)
(2-3-2. Operation Processing)
2-4. Fourth Exemplary Embodiment (Ethical and Legal Considerations)
(2-4-1. Configuration Example)
(2-4-2. Operation Processing)
2-5. Fifth Exemplary Embodiment (Reframing)
(2-5-1. Configuration Example)
(2-5-2. Operation Processing)
(2-5-3. Adding Response Showing Empathy)
(2-5-4. Automatic Generation of Reframing)
2-6. Sixth Exemplary Embodiment (Example of Application to Small Communities)
(2-6-1. Configuration Example)
(2-6-2. Operation Processing)
3. Conclusion
An information processing system according to the present embodiment is an agent system that estimates an emotion of a user based on an action of the user or a situation, and gives feedback corresponding to the estimated emotion, thereby further improving the life of the user, that is, improving the quality of life (QOL) of the user. It is necessary to have a negative emotion such as anger or sadness toward an environment, a situation, or people in order to detect and avoid danger, but an excessive negative emotion leads to stress and may adversely affect the immune system or the like. On the other hand, having a positive emotion has a good influence on the immune system or the like, and can be said to be a more favorable state.
(Background)
In recent years, a voice agent that recognizes an uttered voice of the user and directly responds to a user's question or request in one short-term session (completed with a request and a response) has become popular. Such a voice agent is installed, for example, as a home agent for domestic use, in a dedicated speaker device (so-called home agent device) placed in a kitchen, a dining room, or the like.
Further, in the above-described existing techniques, a voice agent that makes a response corresponding to an estimated emotion of the user is also proposed.
However, in all of these techniques, the system expresses the same emotion as that of the user, thereby showing empathy to the user, and improvement of a state of the user regardless of whether the emotion is positive or negative (including long-term state improvement such as improvement of living conditions, lifestyle, or the like) has not been sufficiently considered.
In this regard, according to the present embodiment, it is possible to make a state of a user better and improve quality of life by providing feedback corresponding to an emotion of the user.
Specifically, the information processing system (agent system) according to the present embodiment recognizes an action of the user in a specific situation, estimates an emotion at that time, and gives feedback for promoting (increasing) or suppressing (reducing) the action at that time according to the estimated emotion, thereby further improving the life of the user. For example, in a case where the emotion of the user is positive, the information processing system according to the present embodiment gives positive feedback (a positive reinforcer based on behavior analysis) for promoting the action at that time to increase the action. Further, in a case where the emotion of the user is negative, the information processing system according to the present embodiment gives negative feedback (a negative reinforcer based on behavior analysis) for suppressing the action at that time to reduce the action. Further, the information processing system according to the present embodiment can also turn the emotion of the user positive by performing reframing, that is, presenting a positive interpretation of a situation or action that the user perceives as negative.
Here,
The recognition of the situation or action of the user, and the emotion estimation can be performed by using various sensing data, information (schedule information, a mail, a posted message on a social network, or the like) input by the user, information (date, map, weather, surveillance footage, or the like) acquired from the outside, or machine learning. A specific example of the sensing data will be described later. For example, information sensed by a sensor such as a camera, a microphone, an acceleration sensor, a gyro sensor, a location information acquisition unit, a biological sensor (for example, body temperature, pulse, heartbeat, sweating, blink, or brain wave), a gaze sensor, or an environmental sensor (for example, temperature, humidity, illuminance, or wind pressure) can be assumed. These sensors may be provided in, for example, the information processing device 10, or may be implemented by another wearable device, a smartphone, or the like that is communicatively connected to the information processing device 10.
Further, in a case where the user forgot to eat a dessert before going out and has a negative emotion as assumed in
Whether or not the user has a negative emotion may be identified based on, for example, a database prepared in advance in which emotions (negative/positive) are associated with situations and actions. Further, a timing for the reframing may be when the user takes an action such as “going out”, when the user's stomach growls (rumbling), when the user mumbles something about forgetting to eat the dessert, when the user posts on a social network, when the user sighs while looking at a food shop, a food sign, or the like, when the user walks with his/her head down (it can be estimated that the user is depressed), when it is estimated from an eye movement, brain waves, or the like that the user is thinking about something, or when it is estimated from a blood sugar level, brain waves, and the like that the user is hungry or is thinking about food.
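The reframing triggers enumerated above can be summarized, purely for illustration, as a simple predicate over sensed signals. The signal names below are hypothetical assumptions, not part of the present disclosure; an actual system would derive them from the sensing data described later.

```python
# Hypothetical sketch of a reframing-timing check based on the trigger
# conditions listed above. The signal names are illustrative assumptions.
def is_reframing_timing(signals: dict) -> bool:
    """Return True if any of the listed trigger conditions is observed."""
    triggers = (
        signals.get("action") == "going_out",        # user takes an action such as going out
        signals.get("stomach_rumbling", False),      # the user's stomach growls
        signals.get("mumbles_about_regret", False),  # mumbles about forgetting the dessert
        signals.get("posted_on_social_network", False),
        signals.get("sighs_at_food_shop", False),    # sighs while looking at a food shop or sign
        signals.get("head_down_walking", False),     # walking with the head down (depressed)
        signals.get("estimated_hungry", False),      # estimated from blood sugar level or brain waves
    )
    return any(triggers)
```

In this sketch, any single trigger suffices; a real system could instead weight and combine the signals.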
The information processing system according to the embodiment of the present disclosure has been described hereinabove. The information processing system (also referred to as an agent system) according to the present embodiment is implemented by various devices. For example, the agent system according to the present embodiment can be implemented by an output device such as a smartphone, a tablet terminal, a mobile phone, a dedicated terminal such as a home agent (speaker device), a projector, or a wearable device such as a head mounted display (HMD), a smart band, a smart watch, or a smart earphone mounted on the ear. The HMD may be, for example, a spectacle-type HMD that includes earphones and a see-through display unit and can be worn on a daily basis (see
Further, the agent system according to the present embodiment is an application executed on these output devices or a server, and may be implemented by a plurality of devices. Further, in the agent system according to the present embodiment, an arbitrary output device may provide feedback as appropriate. For example, while the user is at home, a speaker device or display device (a TV receiver, a projector, a large display device installed in a room, or the like) in the room mainly provides the feedback, but while the user is outside the home, a smartphone, a smart band, a smart earphone, or the like mainly provides the feedback. Further, the feedback may be provided by a plurality of devices at substantially the same time.
Further, the feedback (promotion, suppression, and reframing) to the user can be provided in a form of sound output, image display, text display, tactile stimulation on the body, brain stimulation, or the like.
Subsequently, the information processing system according to the present embodiment will be described in detail by using a plurality of exemplary embodiments.
The control unit 100a functions as an arithmetic processing device and a control device, and controls an overall operation in the information processing device 10a according to various programs. The control unit 100a is implemented by an electronic circuit such as a central processing unit (CPU) or a microprocessor, for example. Further, the control unit 100a may include a read only memory (ROM) that stores programs, arithmetic operation parameters, and the like to be used, and a random access memory (RAM) that temporarily stores parameters varying as appropriate, and the like.
Further, the control unit 100a according to the present embodiment also functions as a situation recognition unit 101, an action recognition unit 102, an identification unit 103, and a feedback selection unit 104.
The situation recognition unit 101 recognizes an environment in which the user is placed as a situation. Specifically, the situation recognition unit 101 recognizes a user situation based on sensing data (sound, camera video, biological information, motion information, or the like) sensed by a sensor unit 122. For example, the situation recognition unit 101 recognizes a vehicle (train, bus, car, bicycle, or the like) used by the user based on a location of the user, a moving speed, and acceleration sensor data. Further, for the situation recognition, the situation recognition unit 101 may label an environment with an AND condition of language labels (for example, “train & crowded” (crowded train) or “room & child” (child's room)).
Further, the situation recognition unit 101 may also recognize a situation that the user actually perceives (experiences). For example, the situation recognition unit 101 can recognize the situation that the user actually perceives by using only a captured image (for example, an image from a camera provided in the HMD and having an image capturing view angle corresponding to the visual field of the user) obtained by capturing an image of an area in the gaze direction of the user, or sound data collected by a microphone positioned near the ear of the user. Alternatively, in order to limit the situation to only a situation to which the user pays attention, the situation recognition unit 101 may exclude data obtained while the eyes are closed for a long time by sensing opening/closing of the eyes, or exclude data other than data obtained while the user is paying attention (concentrating) by sensing a brain activity such as brain waves.
Further, the situation recognition unit 101 may recognize the situation by referring to information (schedule information, a mail content, a content posted on a social network, or the like) input by the user, information (a date and time, weather, traffic information, a user's purchase history, sensing data acquired from a sensor device (surveillance camera, surveillance microphone, or the like) installed in the vicinity, or the like) acquired by the information processing device 10a, or the like, in addition to the sensing data sensed by the sensor unit 122.
In addition, the situation recognition unit 101 may also recognize a situation (for example, interaction in a music group in which the user participates, browsing, or downloading) on a social network or a situation in a virtual reality (VR) world, in addition to a situation in the real world.
Further, the situation recognition unit 101 may use, as a situation recognition method, a neural network trained by deep learning using sensing data and language labels as training data.
The action recognition unit 102 recognizes the action of the user. Specifically, the action recognition unit 102 recognizes a user action based on sensing data (sound, camera video, biological information, motion information, or the like) sensed by the sensor unit 122. For example, the action recognition unit 102 recognizes an action such as “stand, sit, walk, run, lie, fall, or talk” in real time based on sensing data sensed by an acceleration sensor, a gyro sensor, a microphone, or the like.
The identification unit 103 identifies a user emotion based on the user situation recognized by the situation recognition unit 101 and the user action recognized by the action recognition unit 102. The “emotion” may be expressed as a basic emotion such as joy or anger, but here, as an example, the emotion is expressed as positive/negative (hereinafter, also referred to as P/N). The positive/negative emotion corresponds to, for example, the “pleasure/misery” axis in Russell's circumplex model, which organizes human emotions on two axes: “arousal” and “pleasure/misery (valence)”. Examples of the positive emotion include joy, happiness, excitement, relaxation, and satisfaction. Examples of the negative emotion include anxiety, anger, dissatisfaction, irritation, discomfort, sadness, depression, and boredom. Note that a level of the positive or negative emotion may be expressed by valence normalized from −1 to 1. An emotion having a valence of “−1” is a negative emotion, an emotion having a valence of “0” is a neutral emotion, and an emotion having a valence of “1” is a positive emotion.
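As a non-limiting sketch, the mapping from a valence normalized to the range of −1 to 1 to a P/N label described above may be written as follows; the width of the neutral band is an illustrative assumption, not a value specified in the present embodiment.

```python
# Minimal sketch: map a normalized valence in [-1, 1] to a P/N label.
# The neutral_band threshold is an assumption for illustration only.
def pn_label(valence: float, neutral_band: float = 0.33) -> str:
    if valence <= -neutral_band:
        return "negative"
    if valence >= neutral_band:
        return "positive"
    return "neutral"
```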
The identification unit 103 according to the present embodiment may identify the user emotion (P/N) based on the user action and the user situation by using an identification database (DB) 141 for P/N identification, generated in advance. Here,
Referring to the example illustrated in
Note that in the example illustrated in
The feedback selection unit 104 has a function of selecting feedback to the user based on an identification result of the identification unit 103. Specifically, the feedback selection unit 104 selects feedback (hereinafter, also referred to as “FB”) to the user by using a feedback DB 142 based on the identification result of the identification unit 103. Here,
For a positive action, FB that is pleasant to the user is provided in order to reinforce the action, and for a negative action, FB that is unpleasant to the user is provided in order to suppress the action. Only one FB or a plurality of FBs may be provided. When FB is selected from among a plurality of FB candidates, the selection may be made according to the priority illustrated in
In the example illustrated in
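The priority-based selection by the feedback selection unit 104 described above can be sketched as follows. The record fields (pn, priority, content) and the sample records are hypothetical assumptions modeled on the feedback DB 142, not its actual contents.

```python
# Hypothetical feedback DB records; "pn" is 1 for positive, -1 for negative.
FEEDBACK_DB = [
    {"id": "R2", "pn": 1,  "priority": 1, "content": "favorite character praises the user"},
    {"id": "R4", "pn": 1,  "priority": 2, "content": "play favorite music"},
    {"id": "R7", "pn": -1, "priority": 1, "content": "play unpleasant sound"},
]

def select_feedback(pn: int, db=FEEDBACK_DB, max_fb: int = 2) -> list:
    """Select up to max_fb FB contents matching the P/N result, ordered by priority."""
    candidates = [fb for fb in db if fb["pn"] == pn]
    candidates.sort(key=lambda fb: fb["priority"])
    return [fb["content"] for fb in candidates[:max_fb]]
```

The cap of two FBs here illustrates that only one FB or a plurality of FBs may be provided.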
(Communication Unit 110)
The communication unit 110 is connected to an external device in a wired or wireless manner, and transmits and receives data to and from the external device. The communication unit 110 is communicatively connected to the external device through a network, for example, a wired/wireless local area network (LAN), Wi-Fi (registered trademark), Bluetooth (registered trademark), a mobile communication network (long term evolution (LTE) or 3rd generation mobile communication system (3G)), or the like.
(Input Unit 120)
The input unit 120 acquires input information for the information processing device 10a and outputs the input information to the control unit 100a. The input unit 120 includes, for example, an operation input unit 121 and the sensor unit 122.
The operation input unit 121 detects information on an operation input by the user with respect to the information processing device 10a. The operation input unit 121 may be, for example, a touch sensor, a pressure sensor, or a proximity sensor. Alternatively, the operation input unit 121 may be a physical component such as a button, a switch, or a lever.
The sensor unit 122 is a sensor that senses various types of sensing data for recognizing a user situation or a user action. The sensor unit 122 is assumed to be, for example, a camera (stereo camera, visible light camera, infrared camera, depth camera, or the like), a microphone, a gyro sensor, an acceleration sensor, a geomagnetic sensor, a biological sensor (heartbeat, body temperature, sweating, blood pressure, pulse, respiration, gaze, blink, eye movement, gaze duration, brain waves, body movement, body position, skin temperature, skin electrical resistance, microvibration (MV), myogenic potential, SpO2 (blood oxygen saturation), or the like), a location information acquisition unit of a global navigation satellite system (GNSS) or the like, an environment sensor (illuminance sensor, atmospheric pressure sensor, temperature (air temperature) sensor, humidity sensor, or altitude sensor), an ultrasonic sensor, or an infrared sensor. In addition to the GNSS, the location information acquisition unit may sense a location by using Wi-Fi (registered trademark), Bluetooth (registered trademark), performing transmission and reception with a mobile phone, a personal handyphone system (PHS), a smartphone, or the like, or using near-field communication, for example. The number of respective sensors may be plural. Further, the microphone may be a directional microphone.
(Output Unit 130)
The output unit 130 has a function of outputting feedback to the user under the control of the control unit 100a. The output unit 130 includes, for example, a display unit 131 and a sound output unit 132. The display unit 131 is a display device that displays an image (still image or moving image) or text. The display unit 131 is implemented by, for example, a display device such as a liquid-crystal display (LCD) or an organic electroluminescence (EL) display. The output unit 130 may be, for example, a see-through display unit provided in a spectacle-type HMD. Here, display information such as a feedback image is displayed in augmented reality (AR) by being superimposed on a real space. Further, the sound output unit 132 outputs an agent voice, music, a melody, a sound effect, and the like. The number of display units 131 and the number of sound output units 132 may be plural. Further, the sound output unit 132 may be a directional speaker.
(Storage Unit 140)
The storage unit 140 is implemented by a read only memory (ROM) that stores programs, arithmetic operation parameters, or the like to be used in processing performed by the control unit 100a, and a random access memory (RAM) that temporarily stores parameters varying as appropriate, or the like.
Further, the storage unit 140 stores the identification DB 141 and the feedback DB 142. The identification DB 141 includes data for identifying whether the user emotion is negative or positive based on a situation recognition result and an action recognition result as described with reference to
The configuration of the information processing device 10a according to the present embodiment has been specifically described above. Note that the configuration of the information processing device 10a is not limited to the example illustrated in
Further,
Next, FB processing of the information processing system according to the present embodiment will be specifically described with reference to
As illustrated in
Next, the information processing device 10a refers to the identification DB 141 to perform P/N identification based on a situation recognition result and an action recognition result (Step S109).
Next, the information processing device 10a refers to the feedback DB 142 to select feedback based on a P/N identification result (Step S112).
Then, the information processing device 10a provides the selected feedback (Step S115).
The above-described processing in Steps S103 to S115 is repeated until the system is terminated (for example, by an explicit termination instruction from the user) (Step S118).
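The repeated processing described above can be outlined structurally as follows. All of the functions passed in are placeholders (assumptions), not the disclosed implementation; only the steps explicitly numbered in the text (S109, S112, S115, S118) are labeled.

```python
# Structural sketch of the repeated FB processing; function arguments are
# placeholders standing in for the recognition/identification/selection units.
def run_agent_loop(sense, recognize_situation, recognize_action,
                   identify_pn, select_fb, provide_fb, is_terminated):
    while not is_terminated():                 # Step S118: repeat until termination
        data = sense()                         # acquire sensing data
        situation = recognize_situation(data)  # situation recognition
        action = recognize_action(data)        # action recognition
        pn = identify_pn(situation, action)    # Step S109: P/N identification
        fb = select_fb(pn)                     # Step S112: feedback selection
        provide_fb(fb)                         # Step S115: provide feedback
```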
Next, a flow of P/N identification processing and P/N feedback selection processing will be described with reference to
As illustrated in
Next, in a case where the identification unit 103 identifies that the user emotion is positive (Step S126/Yes), the feedback selection unit 104 selects positive reinforcer FB (Step S129). For example, in a case where the user U1 takes a “greeting” action in a situation of “meeting person” and it is identified that the user emotion is positive, feedback corresponding to R2 (ID) and feedback corresponding to R4 (ID) among FBs illustrated in
On the other hand, in a case where it is identified that the user emotion is negative (Step S132/Yes), the feedback selection unit 104 selects negative reinforcer FB (Step S135). For example, a case where a user U2 is walking with his/her head down will be described. The situation recognition unit 101 recognizes, from location information of the user U2, a situation of “commuting” in which the user U2 is moving between his/her home and a work place, and the action recognition unit 102 recognizes from an acceleration sensor that the user U2 is walking and recognizes from an acceleration sensor on the head that the user U2 is facing downward, thereby recognizing that the user U2 is walking with his/her head down.
The identification unit 103 identifies whether the action of walking with his/her head down during commuting is positive or negative. Specifically, the identification DB 141 is referred to. Since it is easy to feel depressed when walking with his/her head down, as indicated in D5 (ID) of
As such, as the negative reinforcer FB, an unpleasant sound is played, an unpleasant vibration is applied, a video of a disliked animal is played, or a favorite character says a negative word, such that the action of “walking with his/her head down” of the user U2 can be weakened, that is, suppressed. In addition, when the user walks while keeping his/her head up, reinforcement such as playing favorite music is performed as feedback, so that the action of walking while keeping the head up is promoted and the action of walking with the head down can be further suppressed.
Note that, since there is a case where the emotion is neutral, no feedback is provided in a case where the identification result does not correspond to either positive or negative (Step S132/No).
Next, a second exemplary embodiment will be described with reference to
More specifically, in the information processing system according to the present exemplary embodiment, a user emotion is recognized based on sensing data of the user, and an emotion recognition result is recorded in a database together with a situation and an action at that time, such that data learning in the identification DB 141 and the feedback DB 142 is performed. Hereinafter, a configuration and operation processing of an information processing device 10b according to the present exemplary embodiment will be specifically described.
The control unit 100b functions as an arithmetic processing device and a control device, and controls an overall operation in the information processing device 10b according to various programs. Further, the control unit 100b also functions as a situation recognition unit 101, an action recognition unit 102, an identification unit 103, a feedback selection unit 104, an emotion recognition unit 105, and a learning unit 106.
The storage unit 140b stores an identification DB 141, a feedback DB 142, and a learning DB 143.
The emotion recognition unit 105 recognizes a user emotion based on sensing data acquired by a sensor unit 122 and the like. For example, the emotion recognition unit 105 may recognize the emotion by analyzing a facial expression in a captured image obtained by capturing an image of a face of the user, or may recognize the emotion by analyzing biological sensor data such as heartbeat or pulse. An algorithm of the emotion recognition is not particularly limited. Further, in the emotion recognition, the emotion may be expressed as a basic emotion such as joy or anger. However, similarly to the first exemplary embodiment described above, the emotion may also be expressed by valence as an example and normalized from −1 to 1. An emotion having valence of “−1” is the negative emotion, an emotion having valence of “0” is a neutral emotion, and an emotion having valence of “1” is the positive emotion. For example, the emotion recognition unit 105 compares a value of the biological sensor data with a predetermined threshold value to calculate the valence (pleasure/misery), thereby recognizing the emotion. Note that, here, a case where emotions are represented by discrete values such as “−1”, “0”, and “1”, respectively, has been described by way of example, but the present embodiment is not limited thereto, and the emotions may be represented by continuous values. For example, sensing data (analog value) such as the biological sensor data may be accumulated and quantified as a value of the emotion.
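The threshold comparison described above can be illustrated with a minimal sketch. The use of heart rate alone and the specific threshold values are assumptions for illustration only; as stated above, the emotion recognition algorithm is not particularly limited in the present embodiment.

```python
# Illustrative sketch: derive a discrete valence (-1, 0, 1) by comparing a
# biological sensor value (here, heart rate in bpm) with assumed thresholds.
def estimate_valence(heart_rate: float,
                     calm_hr: float = 70.0, stressed_hr: float = 100.0) -> int:
    """Return 1 (positive), 0 (neutral), or -1 (negative)."""
    if heart_rate >= stressed_hr:
        return -1   # strongly elevated heart rate -> treated as negative here
    if heart_rate <= calm_hr:
        return 1    # calm state -> treated as positive here
    return 0        # in between -> neutral
```

As noted in the text, continuous values may be used instead of the discrete values shown here.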
An emotion recognition result obtained by the emotion recognition unit 105 is accumulated in the learning DB 143 together with a situation recognition result obtained by the situation recognition unit 101 and an action recognition result obtained by the action recognition unit 102 at the same time.
In the learning DB 143, the situation recognition result obtained by the situation recognition unit 101, the action recognition result obtained by the action recognition unit 102, and the emotion recognition result obtained by the emotion recognition unit 105 are accumulated together with time. Here, an example of a data structure of the learning DB 143 is illustrated in
The learning unit 106 can perform the data learning in the identification DB 141 and the data learning in the feedback DB 142 by using the data accumulated in the learning DB 143. Details will be described in a description of a flowchart illustrated in
First, a specific example of the P/N identification data learning in the identification DB 141 will be described using the table illustrated in
Further, in a case where P/N identification data [U2, neighborhood, greeting, 1] indicating that the emotion of another user, for example, the user U2, also becomes positive when greeting in the neighborhood already exists when such P/N identification data of the user U1 is added, the learning unit 106 may generalize these two P/N identification data, generate P/N identification data [*, neighborhood, greeting, 1] indicating that the emotion of everyone becomes positive when greeting in the neighborhood, and perform data organization.
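The generalization described above (merging per-user entries that agree for the same situation and action into a wildcard “*” entry) can be sketched as follows; the list layout [user, situation, action, P/N] follows the notation used in the text, and the two-user threshold is an assumption.

```python
# Hypothetical sketch of P/N identification data generalization: entries from
# two or more users that share the same (situation, action, P/N) are merged
# into a single wildcard ("*") entry; single-user entries are kept as-is.
def generalize(entries):
    grouped = {}
    for user, situation, action, pn in entries:
        grouped.setdefault((situation, action, pn), []).append(user)
    result = []
    for (situation, action, pn), users in grouped.items():
        if len(users) >= 2:
            result.append(["*", situation, action, pn])   # generalized entry
        else:
            result.append([users[0], situation, action, pn])
    return result
```

Entries whose P/N values disagree fall under different keys and are therefore never merged, consistent with the text.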
Note that P/N identification data based on general knowledge or P/N identification data based on a questionnaire result regarding preference such as likes and dislikes of the user may be stored in the identification DB 141 in an initial stage for a case where the learning is not performed.
Next, a specific example of the P/N feedback data learning in the feedback DB 142 will be described using the table illustrated in
Further, in a case where feedback data [user ID: U2, situation: *, action: *, P/N: 1, FB content: dog] indicating that “dog” can also be used as positive feedback for another user, for example, the user U2 already exists when such P/N feedback data of the user U1 is added, the learning unit 106 may generalize these two feedback data, generate positive reinforcer feedback data [user ID: *, situation: *, action: *, P/N: 1, FB content: dog] indicating that “dog” can be used as positive feedback for everyone, and perform data organization.
Note that feedback data based on general knowledge or feedback data based on a questionnaire result regarding preference such as likes and dislikes of the user may be stored in the feedback DB 142 in an initial stage for a case where the learning is not performed.
The learning in the database has been described above as the second exemplary embodiment. Note that the identification unit 103 may perform the P/N identification based on the emotion recognition result obtained by the emotion recognition unit 105.
Feedback for promoting or suppressing an action according to the present embodiment is not limited to a positive or negative reinforcer based on behavior analysis. For example, it is also possible to promote or suppress the action by directly stimulating the brain, as in transcranial direct current stimulation (tDCS). With tDCS, it is possible to promote or suppress perception or action by applying a weak direct current to the head. Specifically, it is known that anodal stimulation promotes a motor function such as jumping, and cathodal stimulation suppresses a perception such as an itch.
Therefore, in the present exemplary embodiment, it is possible to provide feedback for promoting or suppressing an action of the user by applying anodal stimulation or cathodal stimulation to the brain of the user.
The brain stimulation unit 133 can provide feedback for promoting an action of the user by applying anodal stimulation to the brain of the user, and provide feedback for suppressing an action of the user by applying cathodal stimulation to the brain of the user. The brain stimulation unit 133 is implemented by, for example, an electrode. The brain stimulation unit 133 may be provided on, for example, a surface of a headband (a band that surrounds an entire circumference of the head, or a band that passes through the temporal region and/or the parietal region) put on the head of the user, the surface coming into contact with a portion of the head between the ears. A plurality of brain stimulation units 133 (electrodes) are arranged so as to come into contact with the sensorimotor areas on both sides of the head of the user when the headband is put on, for example. Further, the information processing device 10c may be implemented by an HMD including such a headband. Note that the shape of the headband or the HMD is not particularly limited.
Other configurations are similar to those of the first exemplary embodiment. Further, the second exemplary embodiment (database learning function) may be applied to the present exemplary embodiment. Further, the identification unit 103 may perform P/N identification based on the emotion recognition result obtained by the emotion recognition unit 105 described in the second exemplary embodiment.
Next, the information processing device 10c refers to the identification DB 141 to perform P/N identification based on a situation recognition result and an action recognition result (Step S309).
Next, the information processing device 10c provides brain stimulation feedback according to a P/N identification result (Step S312). Specifically, in the information processing device 10c, the brain stimulation unit 133 provides brain anodal stimulation feedback to promote the action in a case where it is identified that the emotion is positive, and provides brain cathodal stimulation feedback to suppress the action in a case where it is identified that the emotion is negative. For example, when the information processing device 10c recognizes that a certain user pleasurably greets a person in the neighborhood (recognition of a situation, an action, and a positive emotion), it can provide the brain anodal stimulation feedback at the same time to further promote the greeting action. Further, in the information processing device 10c, the brain stimulation unit 133 can apply the cathodal stimulation to suppress a negative emotion when, for example, a certain user sees a cat that has been run over by a car at an intersection and no longer moves, and thus becomes sad. The action recognition unit 102 can determine a gaze or fixation point of the user and a gaze target by using an outward-facing camera or a gaze sensor (such as an infrared sensor) provided in the HMD, and recognize that the user is seeing (perceiving) an animal that has been run over by a car and no longer moves.
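The polarity selection in Step S312 amounts to a simple mapping, sketched below. The function name and string labels are assumptions for illustration; the actual brain stimulation unit 133 would drive electrodes rather than return a label.

```python
ANODAL, CATHODAL = "anodal", "cathodal"

def select_stimulation(pn_result):
    """Map a P/N identification result to a tDCS polarity:
    positive -> anodal stimulation (promote the action),
    negative -> cathodal stimulation (suppress the action/perception)."""
    if pn_result == "positive":
        return ANODAL
    if pn_result == "negative":
        return CATHODAL
    return None  # no stimulation when the identification is inconclusive
```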
The above-described processing in Steps S303 to S312 is repeated until the system is terminated (for example, by an explicit termination instruction from the user) (Step S315).
Next, a fourth exemplary embodiment according to the present embodiment will be described with reference to
Therefore, in the present exemplary embodiment, in addition to the identification of a positive/negative emotion, whether or not the action is ethically or legally negative is checked, and feedback for promoting the action is provided only in a case where no concern is identified.
A configuration of an information processing device 10d according to the present exemplary embodiment may be any one of those of the information processing devices 10a, 10b, and 10c according to the first to third exemplary embodiments described above. That is, the present exemplary embodiment can be combined with any one of the first exemplary embodiment, the second exemplary embodiment, or the third exemplary embodiment. Further, the identification unit 103 may perform P/N identification based on the emotion recognition result obtained by the emotion recognition unit 105 described in the second exemplary embodiment.
In the information processing device 10d according to the present exemplary embodiment, the identification unit 103 performs emotional P/N identification, and also performs ethical P/N identification and legal P/N identification. Identification data for the ethical P/N identification and the legal P/N identification are stored in the identification DB 141, for example.
Here,
As illustrated in
Next, in a case where it is identified that the emotion is positive (Step S406/Yes), the identification unit 103 refers to the identification DB 141 (for example, the table illustrated in
Next, in a case where it is identified that the action is not legally negative (Step S412/No), the identification unit 103 refers to the identification DB 141 (for example, the table illustrated in
The legal P/N identification (Step S409) and the ethical P/N identification (Step S415) described above need only be performed in a case where it is identified that the emotion is positive in the emotional P/N identification (Step S403). For example, even in a case where the action of shouting or the action of hitting makes the user have a positive emotion, these actions are ethically or legally negative actions and thus should not be promoted. On the other hand, in a case where the action of shouting or hitting makes the user have a negative emotion, FB for suppressing the action is provided without performing the ethical P/N identification or the legal P/N identification. Therefore, in the flowchart illustrated in
Then, in a case where it is identified that the emotion is negative (Step S421/Yes), in a case where it is identified that the emotion is positive (Step S406/Yes) but the action is legally negative (Step S412/Yes), or in a case where it is identified that the action is ethically negative (Step S418/Yes), the feedback selection unit 104 selects negative reinforcer FB in order to suppress the action (Step S424).
On the other hand, in a case where it is identified that the emotion is positive (Step S406/Yes) and that the action is neither legally nor ethically negative (Step S412/No and Step S418/No), the feedback selection unit 104 selects positive reinforcer FB as FB for the action (Step S427).
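The selection logic in Steps S406 to S427 can be summarized as a single predicate. This is a minimal sketch assuming boolean results from the legal and ethical P/N identifications; the function name and return labels are illustrative, not the actual interface of the feedback selection unit 104.

```python
def select_feedback(emotion_pn, legally_negative, ethically_negative):
    """Return positive reinforcer FB only when the emotion is positive
    AND the action is neither legally nor ethically negative; in every
    other case return negative reinforcer FB to suppress the action."""
    if emotion_pn == "positive" and not legally_negative and not ethically_negative:
        return "positive_reinforcer_FB"
    return "negative_reinforcer_FB"
```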
As a result, in the information processing system according to the present exemplary embodiment, even in a case where, for example, the user U2 is a person who feels pleasure in violence such as hitting a person, and the user U2 involuntarily hits a person who has bumped into him/her on the train and feels pleasure, such an action is not an ethically or legally favorable action; thus, the action is not promoted, and it is possible to suppress the action with the negative reinforcer FB. Note that recognition of an action such as hitting a person is performed by, for example, analyzing sensing data of an acceleration sensor (an example of the sensor unit 122) provided in an HMD (information processing device 10d) worn by the user U2 and a video of a camera (an example of the sensor unit 122) provided in the HMD. Specifically, for example, the action recognition unit 102 can recognize the action in which the user hits another person in a case where it finds, in the sensing data of the acceleration sensor, a change in acceleration peculiar to the action of hitting a person, and analyzes from the camera video that an arm extending from the front side (user side) comes into contact with the other person.
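The two-cue recognition described above could be sketched as follows, assuming a series of acceleration magnitudes and a boolean camera-analysis result. The spike threshold is an assumed value for illustration, not one given in the present disclosure.

```python
def detect_hitting(accel_magnitudes, arm_contacts_person):
    """Toy fusion of the two cues: a sharp spike in the acceleration
    magnitude (assumed threshold) combined with the camera-based
    judgement that the user's arm contacts another person."""
    spike = max(accel_magnitudes) - min(accel_magnitudes) > 30.0  # m/s^2, assumed
    return spike and arm_contacts_person
```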
Next, a fifth exemplary embodiment according to the present embodiment will be described. In the present exemplary embodiment, reframing that presents a positive interpretation with respect to an action that makes the user have a negative emotion is performed, such that it is possible to reduce the negative emotion of the user, regard the action as an action in a positive situation, and promote the action.
A specific example of a reframing effect will be described below. For example, a certain user luckily got a seat on a train on the way home after a long walk, and when the user stood up, thinking that he/she had arrived at the nearest station, the user realized that there was still one more stop. When the user looked back at his/her seat to sit down again, another person was already sitting in the seat, and the user suddenly felt tired and a little depressed. The user had no choice but to remain standing, and at this time, the user heard a voice of the agent saying, "you stood up by mistake, but you did a good thing because you gave up your seat to another person", from an earphone speaker of an HMD or the like. From what the agent said, the user can think of things in a positive way, for example, "maybe I did a good thing", and an effect that the feeling of depression disappears and the mood becomes lighter can be expected.
Here, the system recognizes a situation where the user is on a train based on, for example, location information of a GPS of the HMD (information processing device 10e) worn by the user and acceleration sensor information. Further, the information processing device 10e analyzes the acceleration sensor information and recognizes (performs action recognition) that the user transitions from a sitting state to a standing state. Next, the information processing device 10e recognizes that the user does not get off the train, but remains standing. Further, the information processing device 10e recognizes that an emotion of the user is in a negative state based on a facial expression of the user acquired by a camera, pulse data, and the like. Then, the information processing device 10e refers to the feedback DB 142 based on the situation where the user was sitting in the seat on the train, the action of standing up from the seat but not getting off at the station, and the identification result indicating that the emotion is negative, and provides, in a case where there is corresponding reframing FB (for example, the phrase as described above), the reframing FB (for example, outputting a voice of the agent from the sound output unit 132).
A configuration of the information processing device 10e according to the present exemplary embodiment may be any one of those of the information processing devices 10a, 10b, 10c, and 10d according to the first to fourth exemplary embodiments described above. That is, the present exemplary embodiment can be combined with any one of the first exemplary embodiment, the second exemplary embodiment, the third exemplary embodiment, or the fourth exemplary embodiment. Further, the identification unit 103 may perform P/N identification based on the emotion recognition result obtained by the emotion recognition unit 105 described in the second exemplary embodiment.
In the information processing device 10e according to the present exemplary embodiment, the feedback selection unit 104 refers to the feedback DB 142 to select feedback (action promoting FB, action suppressing FB, or reframing FB) to the user based on an emotional P/N identification result. Here,
Among feedback data illustrated in
For example, when the train is delayed and the user may be late for work, and the emotion of the user thus becomes negative (frustrated, worried, or the like), the information processing device 10e may use the feedback DB 142 to present an altruistic positive interpretation like "I'm sorry that the train is delayed, but it is fortunate that no one was injured" in a case where information indicating that the delay is not caused by an injury accident can be acquired from train delay information. Alternatively, when someone bumps into the user or steps on the user on the train and the user thus has a negative emotion, the information processing device 10e uses the feedback DB 142 to present an altruistic positive interpretation like "it is fortunate that he/she did not fall".
In order to make the altruistic positive interpretation more acceptable to the user, the information processing device 10e may present the altruistic positive interpretation only when the other person is a vulnerable person such as an elderly person, a child, an injured person, or a pregnant woman.
In addition, when the user has a negative emotion such as "it is still dirty" after looking at a place cleaned by another person, the information processing device 10e may perform reframing that changes a subjective evaluation into an objective evaluation, for example, by presenting an objective evaluation such as a cleanliness level, or by promoting a comparison between before and after the cleaning rather than a comparison with a result of cleaning performed by the user himself/herself.
The information processing device 10e according to the present embodiment may extract, based on the feedback data registered in advance in the feedback DB 142 as described above, reframing FB whose conditions of a situation, an action, and an emotion match, and present the reframing FB to the user. Alternatively, the information processing device 10e can also automatically generate reframing depending on the situation, or automatically add (learn) the data in the feedback DB 142. Such automatic generation of the reframing will be described later. Note that the information processing device 10e may have at least the reframing FB function among the action promoting FB function, the action suppressing FB function, and the reframing FB function.
As illustrated in
Next, in a case where it is identified that the emotion is positive (Step S506/Yes), the feedback selection unit 104 refers to the feedback DB 142 and selects corresponding positive reinforcer FB (Step S509). For example, R2 (ID), R4 (ID), or the like in the table illustrated in
On the other hand, in a case where it is identified that the emotion is negative (Step S512/Yes), the feedback selection unit 104 refers to the feedback DB 142 and determines whether or not there is reframing FB corresponding to the recognized situation, action, and emotion (Step S515).
Next, in a case where there is reframing FB (Step S515/Yes), the feedback selection unit 104 selects the corresponding reframing FB (Step S521). For example, R6 (ID), R7 (ID), or the like in the table illustrated in
On the other hand, in a case where there is no reframing FB (Step S515/No), the feedback selection unit 104 selects a corresponding negative reinforcer FB (Step S518). For example, R1 (ID), R3 (ID), R5 (ID), or the like in the table illustrated in
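The flow of Steps S506 to S521 can be sketched as a lookup, assuming the feedback DB is keyed by (situation, action); the data shapes and labels are assumptions for illustration, not the actual schema of the feedback DB 142.

```python
def select_fb(emotion_pn, situation, action, feedback_db):
    """Steps S506-S521 in outline: a positive emotion selects positive
    reinforcer FB; a negative emotion selects reframing FB when a
    matching entry exists, and negative reinforcer FB otherwise."""
    if emotion_pn == "positive":
        return ("positive_reinforcer", None)
    reframing = feedback_db.get((situation, action))
    if reframing is not None:
        return ("reframing", reframing)
    return ("negative_reinforcer", None)
```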
In the example described above, the system performs the reframing without mentioning the negative emotion of the user. However, a highly sympathetic response to the negative emotion may be added to more effectively suppress the negative emotion and achieve a change to a positive interpretation. For example, a condition of the reframing FB is further limited, and a response showing empathy to the negative emotion is added to a content of the corresponding reframing FB.
More specifically, for example, in the table illustrated in
The information processing device 10e according to the present embodiment can apply the configuration of the second exemplary embodiment described above, and perform learning in the feedback DB 142 including the reframing FB by using, for example, the learning DB 143.
Change to Altruistic Interpretation
For example, the information processing device 10e can automatically generate the reframing FB in a case where corresponding action pairs such as "dropping/leaving-picking up/using", "standing-sitting", and "using-not using" are specified in advance as a rule, and records in which the corresponding actions are taken with opposite emotions in the same situation (train, conference room, home, company, or the like) are stored in the learning DB 143.
For example, among the data accumulated in the learning DB 143 illustrated in
Further, among the data accumulated in the learning DB 143 illustrated in
Alternatively, the information processing device 10e can generate a content (text) of the reframing FB by extracting, from social media on which texts or voices are posted by many users, posts including opposite emotions and interpretations for the same situation, and entering the posts in a database. For example, in a case where a post including "I couldn't sit on a train (situation)-I wanted to sit down because I've had a hard day (interpretation)-it's annoying (emotion)" and a post including "I couldn't sit on a train (situation)-but some people seemed to sit down (interpretation)-I'm happy for them (emotion)" are extracted from social media, the information processing device 10e presents, to a user who has a negative emotion, an interpretation of another user who has a positive emotion in a similar situation. Specifically, for example, the information processing device 10e can present, based on the posts collected from the social media, a positive interpretation like "but some people seemed to sit down" when the user has a negative emotion because he/she cannot sit on the train.
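The extraction-and-presentation idea above can be sketched as follows, assuming posts have already been parsed into (situation, interpretation, emotion) triples; the parsing of raw social-media text is outside this sketch, and the function names are illustrative.

```python
def index_positive_interpretations(posts):
    """Index positive interpretations by situation so that one can later
    be presented to a user who has a negative emotion in that situation.
    posts: iterable of (situation, interpretation, emotion) triples."""
    index = {}
    for situation, interpretation, emotion in posts:
        if emotion == "positive":
            index.setdefault(situation, []).append(interpretation)
    return index

def reframe(situation, index):
    """Return one stored positive interpretation for the situation, if any."""
    candidates = index.get(situation)
    return candidates[0] if candidates else None
```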
In addition, the information processing device 10e may acquire a user evaluation (explicit evaluation, emotion recognition result (whether or not the emotion is actually turned into positive), or the like) with respect to the reframing FB to learn effective reframing.
Presentation of Relative Evaluation
Further, in a case where keywords related to "comparison", such as "strong, high, large, and clean", are stored in advance as a knowledge database, the information processing device 10e can perform the reframing by presenting a relative evaluation when the user utters one of these keywords and has a negative emotion. A reframing content can be generated by extracting, for example, from social media, posts that include opposite emotions and evaluations (interpretations) for similar situations including a keyword related to "comparison". For example, in a case where a post including "the child's clothing is dirty (situation [evaluation target])-washing becomes difficult (evaluation)-angry (emotion)" and a post including "the child's clothing is dirty (situation [evaluation target])-the child is having fun (evaluation)-happy (emotion)" are extracted, it is possible to present, to a user who has a negative emotion for a similar relative evaluation, a positive interpretation in which the evaluation standard is changed. Specifically, for example, the information processing device 10e can present, based on the posts collected from the social media, a positive interpretation like "but the child is having fun!" when the user has a negative emotion because the child's clothing is dirty.
The information processing system according to the present embodiment can also provide a positive action promoting FB, a negative action suppressing FB, and a reframing FB to a small community such as a family, a company, a department, a school class, or a neighborhood association. In a case of the small community, members and places are limited, which enables more accurate situation recognition and action recognition.
As an example, assuming a situation where a child is cleaning a room, a cleaning action is promoted, and an action such as playing during the cleaning is suppressed. In addition, when the child does not properly clean the room, the parent often scolds the child, which rather suppresses the cleaning action of the child. Therefore, the reframing is performed to turn such a negative emotion of the parent around.
A configuration of an information processing device 10f according to the present exemplary embodiment may be any one of those of the information processing devices 10a to 10e according to the first to fifth exemplary embodiments described above. That is, the present exemplary embodiment can be combined with any one of the first exemplary embodiment, the second exemplary embodiment, the third exemplary embodiment, the fourth exemplary embodiment, or the fifth exemplary embodiment. Further, the identification unit 103 may perform P/N identification based on the emotion recognition result obtained by the emotion recognition unit 105 described in the second exemplary embodiment.
In the information processing device 10f according to the present exemplary embodiment, the feedback selection unit 104 refers to the feedback DB 142 to select feedback (action promoting FB, action suppressing FB, or reframing FB) to the user (child or parent) based on an emotional P/N identification result. Further, the feedback selection unit 104 may select feedback to the user (child or parent) in consideration of the ethical P/N identification or legal P/N identification described in the fourth exemplary embodiment.
Here,
Next, specific operation processing according to the present exemplary embodiment will be described. In the present exemplary embodiment, for example, the operation processing illustrated in
For example, in the information processing device 10f, the situation recognition unit 101 recognizes that a parent and a child are in a child's room from a video of a camera installed in each room of a house. Furthermore, in the information processing device 10f, the action recognition unit 102 recognizes that the child is cleaning from the video of the camera.
Next, the identification unit 103 of the information processing device 10f refers to the identification DB 141 to perform P/N identification with respect to the cleaning of the child's room by the child (Step S503 illustrated in
Next, since it is identified that P/N of the action in which the child cleans the child's room is (ethically) positive (S506/Yes illustrated in
On the other hand, when it is recognized that the child has stopped cleaning and started playing with a toy, the identification unit 103 of the information processing device 10f determines that playing during cleaning is ethically negative according to D3 (ID) in
In this case, since it is identified that P/N of the action in which the child plays during cleaning is (ethically) negative (S512/Yes illustrated in
Next, presentation of reframing in a case where the parent feels that the room is still messy after the child finishes cleaning and scolds the child by saying "it's not clean at all!" will be described. The situation recognition unit 101 of the information processing device 10f can analyze a video of a camera provided in the child's room to recognize a degree of messiness of the child's room before cleaning (such as a state where things are scattered) and a state of the room after cleaning. The situation recognition unit 101 can compare the videos before and after the cleaning by the child and calculate a degree of achievement of the cleaning (for example, a decrease in the number of scattered things, an increase rate of a floor area on which nothing is placed, or the like).
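The degree-of-achievement calculation could, for instance, combine the two measures mentioned above into one score; the equal weighting, the 0..1 value ranges, and the dictionary keys are assumptions for illustration.

```python
def cleaning_achievement(before, after):
    """Combine the decrease in the number of scattered things with the
    increase in the clear-floor ratio (both derived from before/after
    room videos) into a single 0..1 achievement score."""
    object_decrease = max(0.0, (before["objects"] - after["objects"]) / max(before["objects"], 1))
    floor_increase = max(0.0, after["clear_floor"] - before["clear_floor"])
    return min(1.0, 0.5 * object_decrease + 0.5 * floor_increase)
```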
Further, the action recognition unit 102 of the information processing device 10f can analyze an uttered voice of the parent collected from a microphone and recognize that the parent is scolding the child. Further, the emotion recognition unit 105 of the information processing device 10f can identify that the emotion of the parent is negative based on a facial expression of the parent analyzed from a video of a camera installed in the room, a pulse rate acquired from a smart band worn by the parent, or a voice recognition result of uttered voice data acquired from a microphone installed in the room or a microphone of an HMD worn by the parent.
As such, in a case where the parent takes an action of scolding the child and the emotion of the parent becomes negative in a situation where the child has cleaned, the feedback selection unit 104 of the information processing device 10f refers to the feedback DB 142 and provides, in a case where there is reframing FB, the reframing FB (Step S515/Yes and S521 illustrated in
As described above, in the information processing system according to the embodiment of the present disclosure, it is possible to make a state of a user better according to an emotion of the user and improve quality of life. The information processing system according to the present embodiment can perform at least one of the action promoting FB function, the action suppressing FB function, or the reframing FB function described above.
Although the preferred embodiment of the present disclosure has been described above in detail with reference to the appended drawings, the present technology is not limited to such an example. It is obvious that a person with an ordinary skill in a technological field of the present disclosure could conceive of various modifications or corrections within the scope of the technical ideas described in the appended claims, and it should be understood that such modifications or corrections also fall within the technical scope of the present disclosure.
For example, a computer program can also be created for causing hardware such as a CPU, a ROM, and a RAM built in the information processing device 10 to function as the information processing device 10. Further, a computer-readable storage medium storing the computer program is also provided.
Furthermore, the effects described in the present specification are merely illustrative or exemplary and are not restrictive. That is, the technology according to the present disclosure can exhibit other effects obvious to those skilled in the art from the description of the present specification in addition to or in place of the above-described effects.
Note that the present technology can also have the following configurations.
(1)
An information processing system comprising:
a control unit that
estimates whether a user is positive or negative, and
has any one of a function of promoting an action of the user when it is estimated that the user is positive,
a function of suppressing an action of the user when it is estimated that the user is negative, or
a function of presenting a positive interpretation on a situation or action of the user when it is estimated that the user is negative.
(2)
The information processing system according to (1), wherein
the control unit
estimates whether the user is positive or negative due to any one of a situation or an action of the user based on data indicating a relationship between any one of a situation or an action and a positive or negative emotional state for the situation or action, the data being learned in advance.
(3)
The information processing system according to (1), wherein
the control unit
estimates an emotion of the user to estimate whether the user is positive or negative.
(4)
The information processing system according to (3), wherein the control unit estimates whether the emotion of the user is positive or negative based on sensing data of the user.
(5)
The information processing system according to any one of (1) to (4), wherein
the control unit
controls an agent that interacts with the user, and
the agent provides positive feedback to the user, as the function of promoting an action of the user when it is estimated that the user is positive.
(6)
The information processing system according to (5), wherein
the control unit
presents, to the user, at least one of a predetermined comfortable sound, image, or vibration as the positive feedback.
(7)
The information processing system according to any one of (1) to (6), wherein
the control unit
applies anodal stimulation to a brain of the user, as the function of promoting an action of the user when it is estimated that the user is positive.
(8)
The information processing system according to any one of (1) to (7), wherein
the control unit
controls an agent that interacts with the user, and
the agent provides negative feedback to the user, as the function of suppressing an action of the user when it is estimated that the user is negative.
(9)
The information processing system according to (8), wherein
the control unit
presents, to the user, at least one of a predetermined uncomfortable sound, image, or vibration as the negative feedback.
(10)
The information processing system according to any one of (1) to (9), wherein
the control unit
applies cathodal stimulation to a brain of the user, as the function of suppressing an action of the user when it is estimated that the user is negative.
(11)
The information processing system according to any one of (1) to (10), wherein
the control unit
provides, in a case where an action of the user is legally and ethically unproblematic when it is estimated that the user is positive, positive feedback for promoting the action.
(12)
The information processing system according to any one of (1) to (11), wherein
the control unit
controls an agent that interacts with the user, and
performs, when it is estimated that the user is negative, a control to cause the agent to present, in a case where a situation or action of the user and a text representing a positive interpretation on the situation or action are stored, the text representing the positive interpretation.
(13)
The information processing system according to (12), wherein
the control unit
performs a control to generate the text representing the positive interpretation based on information indicating opposite emotions for similar or corresponding actions in similar situations and present the generated text to the user.
(14)
An information processing method, by a processor, comprising:
estimating whether a user is positive or negative; and
performing any one of a function of promoting an action of the user when it is estimated that the user is positive,
a function of suppressing an action of the user when it is estimated that the user is negative, or
a function of presenting a positive interpretation on a situation or action of the user when it is estimated that the user is negative.
(15)
A recording medium recording a program for causing a computer to function as
a control unit that
estimates whether a user is positive or negative, and
has any one of a function of promoting an action of the user when it is estimated that the user is positive,
a function of suppressing an action of the user when it is estimated that the user is negative, or
a function of presenting a positive interpretation on a situation or action of the user when it is estimated that the user is negative.
Number | Date | Country | Kind |
---|---|---|---|
2018-083548 | Apr 2018 | JP | national |
Filing Document | Filing Date | Country | Kind |
---|---|---|---|
PCT/JP2019/003924 | 2/4/2019 | WO | 00 |