This application is based on and claims priority under 35 USC 119 from Japanese Patent Application No. 2016-210313 filed Oct. 27, 2016.
The present invention relates to a conversation control system.
According to an aspect of the invention, there is provided a conversation control system including a conversation device, an acquisition unit that acquires personality information of a user that is registered in advance, a detection unit that detects biological information of the user, an estimation unit that estimates a mental state of the user from the acquired personality information and the detected biological information, and a changing unit that changes a personality of the conversation device in accordance with the estimated mental state of the user.
Exemplary embodiment(s) of the present invention will be described in detail based on the following figures, wherein:
A conversation control system 10 according to an exemplary embodiment of the present invention will be described with reference to
The biological sensor 70 detects biological information indicating a physical symptom of the current emotion of the user 60. The biological information includes, for example, at least one of the skin potential, the heart rate, and data regarding volume pulse waves of peripheral blood vessels of the user 60. Information regarding the skin potential includes, in addition to the value of the current skin potential, the displacement and distribution of the skin potential at normal times and the variation in the skin potential per unit time. Similarly, information regarding the heart rate includes, in addition to the current heart rate, the displacement of the heart rate at normal times and the variation in the heart rate per unit time. In addition, the data regarding the volume pulse waves of the peripheral blood vessels includes data regarding the current contraction and expansion of the blood vessels.
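As a minimal sketch of how such biological information might be represented in software (the field names, units, and the variation calculation are assumptions for illustration, not part of the embodiment):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BiologicalInfo:
    """Hypothetical container for the biological information of the user 60."""
    skin_potential: float           # current skin potential
    skin_potential_baseline: float  # level at normal times
    heart_rate: float               # current heart rate (bpm)
    heart_rate_baseline: float      # heart rate at normal times
    pulse_wave: List[float]         # volume pulse wave samples of peripheral blood vessels

def variation_per_unit_time(samples: List[float], dt: float) -> float:
    """Average change of a sampled signal per unit time: one way to obtain
    the 'variation per unit time' mentioned in the text."""
    if len(samples) < 2:
        return 0.0
    return (samples[-1] - samples[0]) / (dt * (len(samples) - 1))
```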
First, the conversation type robot 20 of this exemplary embodiment will be described with reference to
The control microprocessor 201 controls the overall operation of the components of the conversation type robot 20 on the basis of a control program stored in the storage device 203. The memory 202 temporarily stores the sounds and contents of conversations between the conversation type robot 20 and the user 60, as well as a photo of the face and images of the facial expression, behavior, and physical state of the user 60 captured by the camera 205. The storage device 203 stores the control program for controlling each unit of the conversation type robot 20. The communication interface 204 performs communication control for causing the conversation type robot 20 to communicate with the control server 40 through the access point 50.
The camera 205 captures the facial image, facial expression, behavior, changes in the physical state of the user, and the like, and stores the captured images in the memory 202. The microphone 206 detects the user's voice during a conversation and records it in the memory 202. Instead of directly recording the sound, the memory 202 may store the conversation contents after analysis of the sound contents, the pitch of the voice, and the speed of words. The speaker 207 outputs a sound generated by a conversation controller, to be described later, of the conversation type robot 20. The motor 208 moves the conversation type robot 20 to a predetermined position on the basis of movement control information generated by a movement controller to be described later. The current position detection device 209, which includes an acceleration sensor, a GPS signal reception device, a positional information signal reception device, and the like, specifies the current position of the conversation type robot 20 and temporarily stores it in the memory 202.
The sensor information transmission unit 211 transmits, to the control server 40, the photo of the face of the user 60 captured by the camera 205 and external information of the user 60 detected by the camera 205 and the microphone 206. The external information includes data regarding the facial expression and behavior of the user 60 captured by the camera 205, and data regarding the pitch of the voice and the speed of words of the user 60 detected by the microphone 206. A portion of the external information, for example, the angles of the mouth and eyebrows of the user 60, the number of blinks, body temperature information obtained by analyzing an RGB image of the user 60 captured by the camera, and the pitch of the voice, can also be handled as biological information; either way, both the external information and the biological information are transmitted to the control server 40 by the sensor information transmission unit 211.
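Purely as an illustration of what such a transmission might look like (the field names, values, and JSON encoding are assumptions; the embodiment does not specify a wire format):

```python
import json

# Hypothetical payload of the sensor information transmission unit 211.
# Every field name and value below is illustrative only.
external_info = {
    "face_photo": "face_user60.jpg",   # photo of the face captured by camera 205
    "mouth_angle_deg": 4.2,            # angles of the mouth and eyebrows
    "eyebrow_angle_deg": -1.5,
    "blinks_per_minute": 18,           # number of blinks
    "estimated_body_temp_c": 36.6,     # from analysis of the RGB image
    "voice_pitch_hz": 210.0,           # pitch of the voice
    "speech_rate_wpm": 135.0,          # speed of words
}

payload = json.dumps(external_info).encode("utf-8")  # sent to the control server 40
```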
The robot personality information reception unit 212 receives information regarding a personality to be taken by the conversation type robot 20, which is transmitted from a robot personality information transmission unit of the control server 40 to be described later, and temporarily stores the received information in the memory 202.
The conversation controller 213 controls the conversation performed between the conversation type robot 20 and the user 60. Specifically, with reference to the robot personality information database 215 to be described later, the conversation controller 213 generates a response message in accordance with the conversation method and conversation contents of the personality to be taken by the robot, which is received by the robot personality information reception unit 212, and outputs the generated response message to the speaker 207, or drives the motor 208 to change the posture or behavior of the conversation type robot 20.
The movement controller 214 controls the movement of the conversation type robot 20. The movement controller 214 generates movement control information regarding movement from the current position to a target location in a case where an instruction for movement is given from the control server 40, controls the operation of the motor 208 while referring to information regarding the current position detected by the current position detection device 209, and moves the conversation type robot 20.
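A minimal sketch of the kind of movement control information the movement controller 214 might generate (the heading-and-duration representation and the constant speed are assumptions):

```python
import math

def generate_movement_control(current: tuple, target: tuple, speed: float) -> dict:
    """Hypothetical movement controller 214 step: derive a heading and travel
    time from the current position (from detection device 209) to the target
    location given in the instruction from the control server 40."""
    dx, dy = target[0] - current[0], target[1] - current[1]
    return {
        "heading_rad": math.atan2(dy, dx),         # direction in which to drive motor 208
        "duration_s": math.hypot(dx, dy) / speed,  # time to run at the assumed speed
    }
```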
The robot personality information database 215 stores a conversation method and response contents of the conversation type robot 20 for each personality to be taken by the conversation type robot 20.
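One plausible organization of the robot personality information database 215, and the kind of lookup the conversation controller 213 might perform against it (the personality labels, conversation methods, and phrasings are all illustrative assumptions):

```python
# Hypothetical contents of the robot personality information database 215:
# a conversation method and response contents per personality.
ROBOT_PERSONALITY_DB = {
    "A": {"method": "calm, short sentences",   "greeting": "Hello. How are you feeling today?"},
    "B": {"method": "cheerful, lively tone",   "greeting": "Hi there! Great to see you!"},
    "C": {"method": "formal, polite phrasing", "greeting": "Good day. May I be of assistance?"},
    "D": {"method": "gentle, reassuring tone", "greeting": "Hello. Please take your time."},
}

def generate_response(personality: str, situation: str = "greeting") -> str:
    """What the conversation controller 213 might do with the personality
    received by the robot personality information reception unit 212."""
    return ROBOT_PERSONALITY_DB[personality][situation]
```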
Next, the control server 40 of this exemplary embodiment will be described with reference to
The CPU 401 controls the overall operation of the components of the control server 40 on the basis of a control program stored in the storage device 403. The memory 402 stores the positional information of the conversation type robot 20, the photo of the face, the external information, and the biological information of the user 60 transmitted from the conversation type robot 20, as well as the biological information of the user 60 transmitted from the biological sensor 70 attached to the user 60.
The storage device 403 is a hard disk drive (HDD), a solid state drive (SSD), or the like, and stores a control program for controlling the control server 40. Further, as will be described later, the storage device 403 stores a user personality database and a machine learning model which is used when the control server 40 estimates the current mental state of the user 60.
The communication interface 404 performs communication control for the control server 40 to transmit and receive various data to and from the conversation type robot 20 and the biological sensor 70 attached to the user 60 through the access point 50. The user interface 405 is constituted by a display device such as a liquid crystal display and an input device such as a keyboard or a mouse, and is used by a manager to manage the control program stored in the storage device 403.
The user specification unit 411 specifies who the user 60, the conversation party of the conversation type robot 20, is on the basis of the photo of the face of the user 60 transmitted from the sensor information transmission unit 211 of the conversation type robot 20. The user 60 may also be specified by voiceprint authentication, which analyzes sound data, in addition to the method using the photo of the face.
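As a rough sketch of such a specification step (a real system would use a face recognition or voiceprint model; the plain feature-vector comparison and all names below are assumptions):

```python
import math

# Hypothetical pre-registered face feature vectors for known users.
REGISTERED_USERS = {
    "user_60": [0.12, 0.80, 0.33],
    "user_61": [0.90, 0.10, 0.45],
}

def specify_user(photo_features: list) -> str:
    """User specification unit 411 stand-in: return the registered user whose
    stored features are closest to those extracted from the received photo."""
    return min(REGISTERED_USERS,
               key=lambda u: math.dist(REGISTERED_USERS[u], photo_features))
```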
The user personality acquisition unit 412 acquires, from the user personality database 417, personality information at normal times representing the mental tendency at normal times of the user 60 specified by the user specification unit 411. The personality information of each user at normal times may be stored in the user personality database 417 by causing the user personality acquisition unit 412 to analyze the results of a personality diagnosis test or a questionnaire performed on each user in advance. Alternatively, the user personality acquisition unit 412 may perform a personality diagnosis test on the user 60 in advance through the conversation type robot 20, analyze the result to generate the personality information of the user 60 at normal times, and store the generated personality information in the user personality database 417.
The sensor information acquisition unit 413 receives external information and biological information of the user which are transmitted from the sensor information transmission unit 211 of the conversation type robot 20 and biological information transmitted from the biological sensor 70, and stores the received information in the memory 402.
The mental state estimation unit 414 inputs the personality information of the user 60 at normal times which is acquired by the user personality acquisition unit 412 and the external information and the biological information of the user 60 which are acquired by the sensor information acquisition unit 413 to a machine learning model stored in the learning model memory 418 to be described later, and obtains the mental state of the user 60 as an output, thereby estimating the current mental state.
The robot personality determination unit 415 determines the personality to be taken by the conversation type robot 20 in accordance with the current mental state of the user 60 estimated by the mental state estimation unit 414. A correspondence table (not shown) associating various current mental states of the user 60 with personalities to be taken by the conversation type robot 20 is generated in advance, and the robot personality determination unit 415 determines the personality to be taken by the conversation type robot 20 with reference to this table. For example, when the current mental state of the user 60 is “introvert and stable”, the personality to be taken by the conversation type robot 20 is set to “introvert and stable”. The correspondence table may be created manually by a manager, or may be generated by machine learning. In the latter case, users 60 having various personalities (mental states) are caused to have conversations with the conversation type robot 20 while it exhibits various personalities, the biological information detected by the biological sensor 70 attached to each user 60 or by the camera 205 of the conversation type robot 20 is analyzed, and the personality of the conversation type robot 20 that is estimated to make a user 60 of each personality feel comfortable is registered in the correspondence table.
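In code, the correspondence table and lookup could be as simple as the following (only the “introvert and stable” pairing is taken from the text; the other rows and the personality labels A to D are illustrative placeholders):

```python
# Hypothetical correspondence table: estimated mental state of the user 60
# -> personality to be taken by the conversation type robot 20.
CORRESPONDENCE_TABLE = {
    "introvert and stable":   "A",  # pairing given in the text
    "introvert and unstable": "B",  # illustrative rows
    "extrovert and stable":   "C",
    "extrovert and unstable": "D",
}

def determine_robot_personality(mental_state: str):
    """Robot personality determination unit 415 stand-in: a table lookup that
    returns None when no mental state determined in advance matches."""
    return CORRESPONDENCE_TABLE.get(mental_state)
```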
The robot personality information transmission unit 416 transmits the personality to be taken by the conversation type robot 20 which is determined by the robot personality determination unit 415 to the conversation type robot 20.
The user personality database 417 stores, for each user, personality information at normal times representing the mental tendency at normal times. For example, the personality information of the user at normal times is represented by an extroversion scale, a neuroticism scale, and a psychoticism scale, and is stored as a numerical value for each user. The personality information of the user at normal times is not limited to these scales, and may be represented by other scales such as a mental stability scale, a social adaptation scale, or an impulsiveness scale.
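A row of the user personality database 417 could then be as simple as one numerical value per scale (the user names and numbers below are illustrative only):

```python
# Hypothetical rows of the user personality database 417: one numerical
# value per scale for each registered user.
USER_PERSONALITY_DB = {
    "user_60": {"extroversion": 3.0, "neuroticism": 6.0, "psychoticism": 2.0},
    "user_61": {"extroversion": 7.0, "neuroticism": 2.0, "psychoticism": 4.0},
}
```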
The learning model memory 418 stores a machine learning model. The machine learning model outputs the current mental state of the user in a case where personality information of the user at normal times indicating a mental tendency at normal times and the current biological information of the user are input.
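A toy version of such a model, sketched with scikit-learn under heavy assumptions (the feature layout, the two training rows, and the classifier choice are all illustrative; the embodiment does not specify the model type):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Assumed feature layout: [e, s, p, skin potential, heart rate, pulse amplitude].
# The two training rows and their labels are placeholders for illustration.
X = np.array([
    [3.0, 6.0, 2.0, 0.10, 62.0, 0.8],
    [7.0, 2.0, 4.0, 0.35, 95.0, 1.4],
])
y = ["introvert and stable", "extrovert and unstable"]

model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)

def estimate_mental_state(e, s, p, skin, heart, pulse) -> str:
    """Mental state estimation unit 414 stand-in: personality at normal times
    plus current biological information in, current mental state out."""
    return model.predict([[e, s, p, skin, heart, pulse]])[0]
```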
Next, a flow of a conversation control process in the conversation control system 10 will be described with reference to
In step S703 performed in parallel with step S701, the sensor information acquisition unit 413 of the control server 40 acquires data H(t) regarding the heart rate and data B(t) regarding volume pulse waves of peripheral blood vessels of the user 60 from the biological sensor 70, and stores the acquired data in the memory 402. In the next step S704, the mental state estimation unit 414 calculates a degree of emotion V(t) at the present point in time on the basis of the data H(t) regarding the heart rate and the data B(t) of the volume pulse waves of the peripheral blood vessels, and proceeds to step S705.
In step S705, the mental state estimation unit 414 of the control server 40 estimates the mental state of the user 60 at the present point in time. Specifically, the user personality acquisition unit 412 acquires personality information at normal times representing the mental tendency of the user 60 at normal times with reference to the user personality database 417. Further, the mental state estimation unit 414 calculates the degree of displacement of the mental state of the user 60 at the present point in time from the personality information at normal times P0 on the basis of the degree of excitement A(t) and the degree of emotion V(t) calculated in steps S702 and S704, respectively. More specifically, the mental state f(t) is calculated by the following expression.
f(t) = P0 × g(A(t), V(t))
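A worked sketch of this calculation follows (the concrete formulas for A(t), V(t), and g are assumptions; the embodiment gives only the product form above, and P0 is treated here as a scalar summary of the personality information at normal times):

```python
def degree_of_excitement(skin_potential_samples: list, dt: float) -> float:
    """A(t): assumed here to follow the variation of the skin potential
    per unit time (steps S701 and S702; the exact formula is not given)."""
    diffs = [abs(b - a) for a, b in zip(skin_potential_samples, skin_potential_samples[1:])]
    return sum(diffs) / (dt * max(len(diffs), 1))

def degree_of_emotion(heart_rate: float, baseline_hr: float, pulse_amplitude: float) -> float:
    """V(t): assumed combination of the heart rate H(t) and the volume pulse
    wave data B(t) (step S704; the exact formula is not given)."""
    return (heart_rate - baseline_hr) / baseline_hr + pulse_amplitude

def mental_state(p0: float, a_t: float, v_t: float) -> float:
    """f(t) = P0 * g(A(t), V(t)), with g taken here as a simple product."""
    return p0 * (a_t * v_t)
```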
In step S706, the robot personality determination unit 415 of the control server 40 determines whether or not the estimated mental state of the user 60 at the present point in time corresponds to any mental state determined in advance. In a case where the mental state of the user 60 is a first mental state (for example, introvert and stable), the process proceeds to step S707, in which the robot personality determination unit 415 determines the personality to be taken by the conversation type robot 20 to be a personality A (for example, introvert and stable, similar to the user 60) and generates robot personality information with reference to the above-described correspondence table. The robot personality information transmission unit 416 transmits the generated robot personality information to the conversation type robot 20, and the process is terminated.
In a case where it is determined in step S706 that the mental state of the user 60 is one of second to fourth mental states, the process proceeds to the processing of steps S708 to S710 in accordance with the determined mental state, and the robot personality determination unit 415 determines the personality to be taken by the conversation type robot 20 to be a personality B, C, or D, respectively, and generates the robot personality information corresponding to that personality with reference to the above-described correspondence table. The robot personality information transmission unit 416 transmits the robot personality information to the conversation type robot 20, and the process is terminated.
In a case where it is determined in step S706 that the mental state of the user 60 does not correspond to any mental state determined in advance, the process is terminated.
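Putting steps S706 to S710 together, the branch amounts to something like the following (the state labels and the transmission stub are placeholders):

```python
def transmit_robot_personality(info: dict) -> None:
    """Stand-in for the robot personality information transmission unit 416."""
    print("transmitting to conversation type robot 20:", info)

def conversation_control_branch(mental_state: str) -> None:
    """Sketch of steps S706 to S710: choose personality A to D for the first
    to fourth predetermined mental states, otherwise terminate the process."""
    predetermined = {
        "first mental state": "A",
        "second mental state": "B",
        "third mental state": "C",
        "fourth mental state": "D",
    }
    personality = predetermined.get(mental_state)
    if personality is None:
        return  # no mental state determined in advance matched
    transmit_robot_personality({"personality": personality})
```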
Next, another method of the conversation control process in the conversation control system 10 will be described with reference to
In step S1003, performed in parallel with steps S1001 and S1002, the user personality acquisition unit 412 acquires personality information at normal times P0 (extroversion scale e, neuroticism scale s, and psychoticism scale p) representing the mental tendency of the user 60 at normal times with reference to the user personality database 417, and the process proceeds to step S1004.
In step S1004, the mental state estimation unit 414 inputs the data E(t) regarding the skin potential, the data H(t) regarding the heart rate, the data B(t) regarding the volume pulse waves of the peripheral blood vessels, and the personality information at normal times (e, s, p) of the user 60 which are acquired in steps S1001 to S1003 to the machine learning model stored in the learning model memory 418, and obtains a current mental state f(t) of the user 60 as an output, thereby estimating the current mental state.
In step S1005, the robot personality determination unit 415 of the control server 40 determines whether or not the estimated mental state of the user 60 at the present point in time corresponds to any mental state determined in advance. In a case where the mental state of the user 60 is a first mental state (for example, introvert and stable), the process proceeds to the processing of step S1006. The robot personality determination unit 415 determines the personality to be taken by the conversation type robot 20 to be a personality A (for example, introvert and stable, similar to the user 60) and generates robot personality information with reference to the above-described correspondence table. The generated robot personality information is transmitted to the conversation type robot 20 by the robot personality information transmission unit 416, and the process is terminated. Meanwhile, the robot personality information reception unit 212 of the conversation type robot 20 receives the robot personality information transmitted from the control server 40, and the conversation controller 213 has a conversation with the user 60 using the determined robot personality while referring to the robot personality information database 215 on the basis of the received robot personality information.
In a case where it is determined in step S1005 that the mental state of the user 60 is one of second to fourth mental states, the process proceeds to the processing of steps S1007 to S1009 in accordance with the determined mental state, and the robot personality determination unit 415 determines the personality to be taken by the conversation type robot 20 to be a personality B, C, or D, respectively, and generates the robot personality information corresponding to that personality with reference to the above-described correspondence table. The robot personality information transmission unit 416 transmits the robot personality information to the conversation type robot 20, and the process is terminated.
In a case where it is determined in step S1005 that the mental state of the user 60 does not correspond to any mental state determined in advance, the process is terminated.
In the above-described exemplary embodiment, a case where the conversation type robot 20 is used as the conversation device has been described. However, in the present invention, the conversation device is not limited to the conversation type robot 20 and may be any device having a conversation function, for example, a portable terminal device having a conversation function.
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Number | Date | Country | Kind |
---|---|---|---
2016-210313 | Oct 2016 | JP | national |