This application claims the benefit of Japanese Patent Application No. 2022-150384, filed on Sep. 21, 2022, the entire disclosure of which is incorporated by reference herein.
The present disclosure relates to an apparatus, a control method for the apparatus, and a recording medium.
Technology is known for controlling the actions of apparatuses such as robots and the like so as to make the apparatuses more similar to familiar beings such as friends or pets. For example, Unexamined Japanese Patent Application Publication No. 2001-334482 describes technology related to action determination of a robot that acts like an animal.
One aspect of the present disclosure is an apparatus including:
A more complete understanding of this application can be obtained when the following detailed description is considered in conjunction with the following drawings, in which:
Hereinafter, embodiments are described while referencing the drawings. Note that, in the drawings, identical or corresponding components are denoted with the same reference numerals.
An embodiment in which an apparatus control device according to Embodiment 1 is applied to a robot 200 illustrated in
As illustrated in
The coupler 205 couples the torso 206 and the head 204 so as to enable rotation (by the twist motor 221) around a first rotational axis that passes through the coupler 205 and extends in a front-back direction of the torso 206. The twist motor 221 rotates the head 204, with respect to the torso 206, clockwise (right rotation) within a forward rotation angle range around the first rotational axis (forward rotation), counter-clockwise (left rotation) within a reverse rotation angle range around the first rotational axis (reverse rotation), and the like. Note that, in this description, the term “clockwise” refers to clockwise as viewed from the torso 206 toward the head 204. Additionally, herein, clockwise rotation is also referred to as “twist rotation to the right”, and counter-clockwise rotation is also referred to as “twist rotation to the left.” A maximum value of the angle of twist rotation to the right (right rotation) or to the left (left rotation) can be set as desired, and the angle of the head 204 in a state, as illustrated in
The coupler 205 couples the torso 206 and the head 204 so as to enable rotation (by the vertical motor 222) around a second rotational axis that passes through the coupler 205 and extends in a width direction of the torso 206. The vertical motor 222 rotates the head 204 upward (forward rotation) within a forward rotation angle range around the second rotational axis, downward (reverse rotation) within a reverse rotation angle range around the second rotational axis, and the like. A maximum value of the angle of rotation upward or downward can be set as desired, and the angle of the head 204 in a state, as illustrated in
When the head 204 is rotated to the vertical reference angle or upward from the vertical reference angle by vertical rotation around the second rotational axis, the head 204 can contact, via the exterior 201, the placement surface such as the floor or the table on which the robot 200 is placed. Note that, in
As illustrated in
The robot 200 includes an acceleration sensor 212 on the torso 206. The acceleration sensor 212 can detect an attitude (orientation) of the robot 200, and can detect being picked up, the orientation being changed, being thrown, and the like by the user. The robot 200 includes a gyrosensor 214 on the torso 206. The gyrosensor 214 can detect rolling, rotating, and the like of the robot 200.
The robot 200 includes a microphone 213 on the torso 206. The microphone 213 can detect external sounds. Furthermore, the robot 200 includes a speaker 231 on the torso 206. The speaker 231 can be used to emit a sound (sound effect) of the robot 200.
Note that, in the present embodiment, the acceleration sensor 212, the gyrosensor 214, the microphone 213, and the speaker 231 are provided on the torso 206, but a configuration is possible in which all or a portion of these components are provided on the head 204. Note that a configuration is possible in which, in addition to the acceleration sensor 212, the gyrosensor 214, the microphone 213, and the speaker 231 provided on the torso 206, all or a portion of these components are also provided on the head 204. The touch sensor 211 is provided on each of the head 204 and the torso 206, but a configuration is possible in which the touch sensor 211 is provided on only one of the head 204 and the torso 206. Moreover, a configuration is possible in which a plurality of any of these components is provided.
Next, the functional configuration of the robot 200 is described. As illustrated in
A configuration is possible in which the apparatus control device 100 is connected to the external stimulus detector 210, the driver 220, the sound outputter 230, and the operation inputter 240 by a wired interface such as a universal serial bus (USB) cable or the like, or by a wireless interface such as Bluetooth (registered trademark) or the like. Additionally, a configuration is possible in which the controller 110 and the storage 120 are connected via the bus line BL.
The apparatus control device 100 controls, by the controller 110 and the storage 120, actions of the robot 200.
In one example, the controller 110 is configured from a central processing unit (CPU) or the like, and executes various processings (robot control processing and the like), described later, using programs stored in the storage 120. Note that the controller 110 is compatible with multithreading functionality in which a plurality of processings are executed in parallel. As such, the controller 110 can execute the various processings (robot control processing, sound effect playback thread, motion playback thread, and the like), described later, in parallel. Additionally, the controller 110 is provided with a clock function and a timer function, and can measure the date and time, and the like.
The storage 120 is configured from read-only memory (ROM), flash memory, random access memory (RAM), or the like. Programs to be executed by the CPU of the controller 110, and data needed in advance to execute these programs are stored in the ROM. The flash memory is writable non-volatile memory, and stores data that is desired to be retained even after the power is turned OFF. Data that is created or modified during the execution of the programs is stored in the RAM.
The external stimulus detector 210 includes the touch sensor 211, the acceleration sensor 212, the gyrosensor 214, and the microphone 213 described above. The controller 110 acquires, as a signal expressing an external stimulus acting on the robot 200, detection values (external stimulus data) detected by the various sensors of the external stimulus detector 210. Note that a configuration is possible in which the external stimulus detector 210 includes sensors other than the touch sensor 211, the acceleration sensor 212, the gyrosensor 214, and the microphone 213. The types of external stimuli acquirable by the controller 110 can be increased by increasing the types of sensors of the external stimulus detector 210.
The touch sensor 211 detects contact by some sort of object. The touch sensor 211 is configured from a pressure sensor or a capacitance sensor, for example. A detection value detected by the touch sensor 211 expresses the strength of contact. Additionally, the touch sensor 211 is capable of directional contact detection, and detects the strength of contact in three axial directions, namely contact from the front-back direction (the X-axis direction), contact from a width (left-right) direction (Y-axis direction), and contact from a vertical direction (Z-axis direction) of the torso 206 of the robot 200. Therefore, the detection value of the touch sensor 211 is three-dimensional data constituted by values of the strength of contact from the X-axis direction, the strength of contact from the Y-axis direction, and the strength of contact from the Z-axis direction. The controller 110 can, on the basis of the detection value from the touch sensor 211, detect that the robot 200 is being petted, is being struck, and the like by the user.
The acceleration sensor 212 detects acceleration in three axial directions, namely the front-back direction (X-axis direction), the width (left-right) direction (Y-axis direction), and the vertical direction (Z-axis direction) of the torso 206 of the robot 200. Therefore, the acceleration value detected by the acceleration sensor 212 is three-dimensional data constituted by values of X-axis direction acceleration, Y-axis direction acceleration, and Z-axis direction acceleration. The acceleration sensor 212 detects gravitational acceleration when the robot 200 is stopped and, as such, the controller 110 can detect a current attitude of the robot 200 on the basis of the gravitational acceleration detected by the acceleration sensor 212. Additionally, when, for example, the user picks up or throws the robot 200, the acceleration sensor 212 detects, in addition to the gravitational acceleration, acceleration caused by the movement of the robot 200. Accordingly, the controller 110 can detect the movement of the robot 200 by removing the gravitational acceleration component from the detection value detected by the acceleration sensor 212.
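For reference, the following is a minimal sketch, in Python, of how the gravity component might be separated from the detected acceleration to obtain the movement component; the simple low-pass filter, its constant, and all names are illustrative assumptions and are not part of the configuration described above.

```python
# Minimal sketch: separate the gravity component (attitude) from the movement
# component in a three-axis acceleration value. The filter constant and the
# initial gravity estimate are assumed values for illustration only.
class MotionEstimator:
    def __init__(self, alpha: float = 0.9):
        self.alpha = alpha              # low-pass filter constant (assumed)
        self.gravity = [0.0, 0.0, 1.0]  # running gravity estimate, in g

    def update(self, accel):
        # Slowly track the gravity component, which reflects the attitude.
        self.gravity = [self.alpha * g + (1.0 - self.alpha) * a
                        for g, a in zip(self.gravity, accel)]
        # Removing the gravity component leaves acceleration caused by movement.
        motion = [a - g for a, g in zip(accel, self.gravity)]
        return self.gravity, motion

# Example: one sample taken while the robot is at rest and upright.
estimator = MotionEstimator()
attitude, movement = estimator.update([0.0, 0.0, 1.0])
```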
The gyrosensor 214 detects an angular velocity from when rotation is applied to the torso 206 of the robot 200. Specifically, the gyrosensor 214 detects the angular velocity on three axes of rotation, namely rotation around the front-back direction (the X-axis direction), rotation around the width (left-right) direction (the Y-axis direction), and rotation around the vertical direction (the Z-axis direction) of the torso 206. Therefore, an angular velocity value detected by the gyrosensor 214 is three-dimensional data constituted by the values of X-axis rotation angular velocity, Y-axis rotation angular velocity, and Z-axis rotation angular velocity. The controller 110 can more accurately detect the movement of the robot 200 by combining the detection value detected by the acceleration sensor 212 and the detection value detected by the gyrosensor 214.
Note that the touch sensor 211, the acceleration sensor 212, and the gyrosensor 214 are synchronized, detect each of the strength of contact, the acceleration, and the angular velocity at the same timing, and output the detection values to the controller 110. Specifically, the touch sensor 211, the acceleration sensor 212, and the gyrosensor 214 detect the strength of contact, the acceleration, and the angular velocity at the same timing every 0.25 seconds, for example.
The microphone 213 detects ambient sound of the robot 200. The controller 110 can, for example, detect, on the basis of a component of the sound detected by the microphone 213, that the user is speaking to the robot 200, that the user is clapping their hands, and the like.
The driver 220 includes the twist motor 221 and the vertical motor 222. The driver 220 is driven by the controller 110. As a result, the robot 200 can express actions such as, for example, lifting the head 204 up (rotating upward around the second rotational axis), twisting the head 204 sideways (twisting/rotating to the right or to the left around the first rotational axis), and the like. Motion data for driving the driver 220 in order to express these actions is recorded in a control content table 124, described later.
The sound outputter 230 includes the speaker 231, and sound is output from the speaker 231 as a result of sound data being input into the sound outputter 230 by the controller 110. For example, the robot 200 emits a pseudo-animal sound as a result of the controller 110 inputting animal sound data of the robot 200 into the sound outputter 230. This animal sound data is also recorded as sound effect data in the control content table 124.
In one example, the operation inputter 240 is configured from an operation button, a volume knob, or the like. The operation inputter 240 is an interface for receiving user operations such as, for example, turning the power ON/OFF, adjusting the volume of the output sound, and the like.
Next, of the data stored in the storage 120 of the apparatus control device 100, the data unique to the present embodiment, namely, emotion data 121, emotion change data 122, growth days count data 123, and the control content table 124 are described in order.
The emotion data 121 is data for imparting pseudo-emotions to the robot 200, and is data (X, Y) that represents coordinates on an emotion map 300. As illustrated in
The emotion data 121 has two values, namely the X value (degree of relaxation, degree of worry) and the Y value (degree of excitement, degree of disinterest) that express a plurality (in the present embodiment, four) of mutually different pseudo-emotions, and points on the emotion map 300 represented by the X value and the Y value represent the pseudo-emotions of the robot 200. An initial value of the emotion data 121 is (0, 0). The emotion data 121 is a parameter expressing a pseudo-emotion of the robot 200 and, as such, is also called an “emotion parameter.” Note that, in
In the present embodiment, regarding the size of the emotion map 300 as the initial value, as illustrated by frame 301 of
The emotion change data 122 is data that sets an amount of change that each of an X value and a Y value of the emotion data 121 is increased or decreased. In the present embodiment, as emotion change data 122 corresponding to the X of the emotion data 121, DXP that increases the X value and DXM that decreases the X value are provided and, as emotion change data 122 corresponding to the Y value of the emotion data 121, DYP that increases the Y value and DYM that decreases the Y value are provided. Specifically, the emotion change data 122 includes the following four variables. These variables are parameters that change the pseudo-emotion of the robot 200 and, as such, are also called “emotion change parameters.”
DXP: Tendency to relax (tendency to change in the positive value direction of the X value on the emotion map)
DXM: Tendency to worry (tendency to change in the negative value direction of the X value on the emotion map)
DYP: Tendency to be excited (tendency to change in the positive value direction of the Y value on the emotion map)
DYM: Tendency to be disinterested (tendency to change in the negative value direction of the Y value on the emotion map)
In the present embodiment, an example is described in which the initial value of each of these variables is set to 10 and, during robot control processing, described below, the value increases to a maximum of 20 by processing for learning emotion change data. Due to this learning processing, the emotion change data 122, that is, the degree of change of emotion changes and, as such, the robot 200 assumes various personalities in accordance with the manner in which the user interacts with the robot 200. That is, the personality of each individual robot 200 is formed differently on the basis of the manner in which the user interacts with the robot 200.
In the present embodiment, each piece of personality data (personality value) is derived by subtracting 10 from each piece of emotion change data 122. Specifically, a value obtained by subtracting 10 from DXP that expresses a tendency to be relaxed is set as a personality value (chirpy), a value obtained by subtracting 10 from DXM that expresses a tendency to be worried is set as a personality value (shy), a value obtained by subtracting 10 from DYP that expresses a tendency to be excited is set as a personality value (active), and a value obtained by subtracting 10 from DYM that expresses a tendency to be disinterested is set as a personality value (spoiled). As a result, for example, as illustrated in
Furthermore, in the present embodiment, the emotion change data 122, that is, the degree of change of emotion also changes due to a degree of familiarity (value expressing a degree of familiarity indicating how familiar the external stimulus is to the robot 200) acquired during the robot control processing described below. As such, the robot 200 can perform actions that take the manner in which the user has interacted with the robot 200 in the past into consideration.
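For reference, the derivation of the personality values described above can be written as the following minimal sketch; the function name and the example values are illustrative assumptions.

```python
# Minimal sketch: derive the four personality values from the emotion change
# data 122 by subtracting the initial value 10 from each variable.
def personality_values(dxp: int, dxm: int, dyp: int, dym: int) -> dict:
    return {
        "chirpy": dxp - 10,   # from DXP, the tendency to relax
        "shy": dxm - 10,      # from DXM, the tendency to worry
        "active": dyp - 10,   # from DYP, the tendency to be excited
        "spoiled": dym - 10,  # from DYM, the tendency to be disinterested
    }

# Example: emotion change data learned up to DXP=14, DXM=10, DYP=20, DYM=12.
print(personality_values(14, 10, 20, 12))
# {'chirpy': 4, 'shy': 0, 'active': 10, 'spoiled': 2}
```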
The growth days count data 123 has an initial value of 1, and 1 is added for each passing day. The growth days count data 123 represents a pseudo growth days count (number of days from a pseudo birth) of the robot 200. Here, a period of the growth days count expressed by the growth days count data 123 is called a “second period.”
As illustrated in
As illustrated in
Regarding the sound effect data, to facilitate ease of understanding, text describing each piece of the sound effect data is included in
Note that, in the control content table 124 illustrated in
Next, the robot control processing executed by the controller 110 of the apparatus control device 100 is described while referencing the flowchart illustrated in
Firstly, the controller 110 initializes the various types of data such as the emotion data 121, the emotion change data 122, the growth days count data 123, and the like (step S101). Note that, a configuration is possible in which, for the second and subsequent startups of the robot 200, the various values from when the power of the robot 200 was last turned OFF are set in step S101. This can be realized by the controller 110 storing the various data values in nonvolatile memory (flash memory or the like) of the storage 120 when an operation for turning the power OFF is performed the last time and, when the power is thereafter turned ON, setting the stored values as the various data values.
Next, the controller 110 acquires an external stimulus detected by the external stimulus detector 210 (step S102). Then, the controller 110 determines whether there is a control condition, among the control conditions defined in the control content table 124, that is satisfied by the external stimulus acquired in step S102 (step S103).
When any of the control conditions defined in the control content table 124 is satisfied by the acquired external stimulus (step S103; Yes), the controller 110 references the control content table 124 and acquires the control data corresponding to the control condition that is satisfied by the acquired external stimulus (step S104).
Then, the controller 110 acquires the degree of familiarity on the basis of the external stimulus acquired in step S102 and history information about external stimuli that have been acquired in the past (step S105). The degree of familiarity is a parameter that is used to generate a phenomenon whereby, when the robot 200 is repeatedly subjected to the same external stimulus, the robot 200 gets used to that stimulus and the emotion does not significantly change. In the present embodiment, the degree of familiarity is a value from 1 to 10. Any method can be used to acquire the degree of familiarity. For example, the controller 110 can acquire the degree of familiarity by the method described in Unexamined Japanese Patent Application Publication No. 2021-153680.
Next, the controller 110 acquires the emotion change data 122 in accordance with the external stimulus acquired in step S102, and corrects the emotion change data 122 on the basis of the degree of familiarity (step S106). Specifically, when, for example, petting of the head 204 is detected by the touch sensor 211 of the head 204 as the external stimulus, the robot 200 obtains a pseudo sense of relaxation and, as such, the controller 110 acquires DXP as the emotion change data 122 to be added to the X value of the emotion data 121. Then, the controller 110 divides DXP of the emotion change data 122 by the value of the degree of familiarity. Due to this, as the value of the degree of familiarity increases, the value of the emotion change data 122 decreases and the pseudo-emotion is less likely to change.
Moreover, the controller 110 sets the emotion data 121 in accordance with the emotion change data 122 acquired (and corrected) in step S106 (step S107). Specifically, when, for example, DXP is acquired as the emotion change data 122 in step S106, the controller 110 adds the corrected DXP of the emotion change data 122 to the X value of the emotion data 121.
In steps S106 and S107, any settings are possible for which type of the emotion change data 122 is acquired (and corrected) and how the emotion data 121 is set for each individual external stimulus. Examples are described below.
The head 204 is petted (relax): X=X+DXP/degree of familiarity
The head 204 is struck (worry): X=X−DXM/degree of familiarity
(these external stimuli can be detected by the touch sensor 211 of the head 204)
The torso 206 is petted (excite): Y=Y+DYP/degree of familiarity
The torso 206 is struck (disinterest): Y=Y−DYM/degree of familiarity
(these external stimuli can be detected by the touch sensor 211 of the torso 206)
Held with head upward (happy): X=X+DXP/degree of familiarity, and Y=Y+DYP/degree of familiarity
Suspended with head downward (sad): X=X−DXM/degree of familiarity, and Y=Y−DYM/degree of familiarity
(these external stimuli can be detected by the touch sensor 211, the acceleration sensor 212, and the gyrosensor 214)
Spoken to in kind voice (peaceful): X=X+DXP/degree of familiarity, and Y=Y−DYM/degree of familiarity
Yelled out in loud voice (upset): X=X−DXM/degree of familiarity, and Y=Y+DYP/degree of familiarity
(these external stimuli can be detected by the microphone 213)
However, in a case in which a value (X value, Y value) of the emotion data 121 exceeds the maximum value of the emotion map 300 when adding the emotion change data 122, that value of the emotion data 121 is set to the maximum value of the emotion map 300. In addition, in a case in which a value of the emotion data 121 is less than the minimum value of the emotion map 300 when subtracting the emotion change data 122, that value of the emotion data 121 is set to the minimum value of the emotion map 300.
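For reference, the update rules of steps S106 and S107 can be summarized in the following minimal sketch; the function names and the emotion map bounds used in the example are illustrative assumptions.

```python
# Minimal sketch of steps S106 and S107: the emotion change data 122 is divided
# by the degree of familiarity, added to (or subtracted from) the emotion data
# 121, and the result is clamped to the current range of the emotion map 300.
def clamp(value, minimum, maximum):
    return max(minimum, min(maximum, value))

def apply_stimulus(x, y, dx, dy, familiarity, map_min, map_max):
    """dx and dy are signed amounts such as +DXP or -DXM for one stimulus."""
    x = clamp(x + dx / familiarity, map_min, map_max)
    y = clamp(y + dy / familiarity, map_min, map_max)
    return x, y

# Example: the head 204 is petted (relax), DXP = 12, degree of familiarity = 3,
# with assumed emotion map bounds of -100 to 100.
x, y = apply_stimulus(x=0, y=0, dx=12, dy=0, familiarity=3,
                      map_min=-100, map_max=100)
```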
Moreover, the controller 110 executes control data change/playback processing with the control data acquired in step S104 and the emotion data 121 set in step S107 as arguments (step S108), and executes step S111. The control data change/playback processing is processing in which the control data acquired in step S104 is adjusted (changed) in accordance with the emotion data 121 set in step S107, and the robot 200 is controlled. This control data change/playback processing is described in detail later. Note that, when the emotion change data 122 is corrected in step S106, after step S108 ends, the controller 110 returns the emotion change data 122 to the uncorrected state.
Meanwhile, when, in step S103, none of the control conditions defined in the control content table 124 are satisfied by the acquired external stimulus (step S103; No), the controller 110 determines whether to perform a spontaneous action such as a breathing action or the like (step S109). Any method may be used as the method for determining whether to perform the spontaneous action but, in the present embodiment, it is assumed that the determination of step S109 is Yes and the breathing action is performed every breathing cycle (for example, two seconds).
When not performing the spontaneous action (step S109; No), the controller 110 executes step S111. When performing the spontaneous action (step S109; Yes), the controller 110 executes the spontaneous action (for example, a breathing action) (step S110), and executes step S111.
The control data of this spontaneous action is also stored in the control content table 124 (as illustrated in, for example, “breathing cycle elapsed” of the “control conditions” of
In step S111, the controller 110 uses the clock function to determine whether a date has changed. When the date has not changed (step S111; No), the controller 110 executes step S102.
When the date has changed (step S111; Yes), the controller 110 determines whether it is in a first period (step S112). When the first period is, for example, a period 50 days from the pseudo birth (for example, the first startup by the user after purchase) of the robot 200, the controller 110 determines that it is in the first period when the growth days count data 123 is 50 or less. When it is not in the first period (step S112; No), the controller 110 executes step S115.
When it is in the first period (step S112; Yes), the controller 110 performs learning of the emotion change data 122 (step S113). Specifically, in the learning of the emotion change data 122, 1 is added to DXP when the X value of the emotion data 121 has been set to the maximum value of the emotion map 300 even once in step S107 of that day; 1 is added to DYP when the Y value has been set to the maximum value even once; 1 is added to DXM when the X value has been set to the minimum value even once; and 1 is added to DYM when the Y value has been set to the minimum value even once. The emotion change data 122 is learned and updated as a result of the addition processing described above.
Note that, when the various values of the emotion change data 122 become exceedingly large, the amount by which the emotion data 121 changes at one time becomes exceedingly large and, as such, the maximum value of the various values of the emotion change data 122 is set to 20, for example, and the various values are limited to that maximum value or less. Here, 1 is added to each piece of the emotion change data 122, but the value to be added is not limited to 1. For example, a configuration is possible in which the number of times the various values of the emotion data 121 are set to the maximum value or the minimum value of the emotion map 300 is counted and, when that number of times is great, the numerical value to be added to the emotion change data 122 is increased.
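For reference, the learning of step S113, including the upper limit of 20, can be written as the following minimal sketch; the function name and the flag arguments are illustrative assumptions.

```python
# Minimal sketch of step S113: add 1 to each emotion change variable whose
# corresponding emotion value reached the edge of the emotion map 300 at least
# once that day, and limit every variable to the maximum value of 20.
MAX_CHANGE = 20  # upper limit of the emotion change data 122

def learn_emotion_change(change, hit_max_x, hit_min_x, hit_max_y, hit_min_y):
    """change is a dict with keys 'DXP', 'DXM', 'DYP', and 'DYM'."""
    if hit_max_x:
        change["DXP"] = min(change["DXP"] + 1, MAX_CHANGE)
    if hit_min_x:
        change["DXM"] = min(change["DXM"] + 1, MAX_CHANGE)
    if hit_max_y:
        change["DYP"] = min(change["DYP"] + 1, MAX_CHANGE)
    if hit_min_y:
        change["DYM"] = min(change["DYM"] + 1, MAX_CHANGE)
    return change

# Example: the X value reached the maximum of the emotion map at least once.
change = learn_emotion_change({"DXP": 10, "DXM": 10, "DYP": 10, "DYM": 10},
                              hit_max_x=True, hit_min_x=False,
                              hit_max_y=False, hit_min_y=False)
```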
Returning to
In
Returning to
Next, the control data change/playback processing in which, in step S108 of the robot control processing described above, the control data and the emotion data 121 are called as arguments is described while referencing
Firstly, the controller 110 determines whether the sound effect data is included in the control data (step S201). When the sound effect data is not included (step S201; No), step S205 is executed.
When the sound effect data is included (step S201; Yes), the controller 110 sets a frequency change degree and a desinence change degree on the basis of the emotion data 121 (step S202). Specifically, the frequency change degree is set to a value obtained by dividing the X value of the emotion data 121 by 10, and the desinence change degree is set to a value obtained by dividing the Y value of the emotion data 121 by 10. That is, the frequency change degree and the desinence change degree are both set to values from −30 to 30.
Next, the controller 110 acquires the desinence position from the sound effect data (step S203). As illustrated in
Then, the controller 110 starts up a sound effect playback thread, described later, with the sound effect data, the desinence position, the frequency change degree, and the desinence change degree as arguments (step S204), and executes step S205. The sound effect playback thread is described later in detail but, in this thread, the sound effect is output from the sound outputter 230 by the sound effect data adjusted (changed) on the basis of the emotion data.
In step S205, the controller 110 determines whether the motion data is included in the control data. When the motion data is not included in the control data (step S205; No), the controller 110 ends the control data change/playback processing.
When the motion data is included in the control data (step S205; Yes), the controller 110 sets a speed change degree and an amplitude change degree on the basis of the emotion data 121 (step S206). Specifically, the speed change degree is set to a value obtained by dividing the X value of the emotion data 121 by 10, and the amplitude change degree is set to a value obtained by dividing the Y value of the emotion data 121 by 10. That is, the speed change degree and the amplitude change degree are both set to values from −30 to 30.
Then, the controller 110 starts up a motion playback thread, described later, with the motion data, the speed change degree, and the amplitude change degree as arguments (step S207), and ends the control data change/playback processing. The motion playback thread is described later in detail but, in this thread, the driver 220 is driven by the motion data adjusted (changed) on the basis of the emotion data 121 and, as a result, an action of the robot 200 is expressed.
Next, the sound effect playback thread called in step S204 of the control data change/playback processing (
Firstly, the controller 110 uses the sound outputter 230 to play back the sound effect data from the beginning to the desinence position at a frequency changed by the frequency change degree (step S301). Any method may be used to change the frequency. For example, the frequency may be changed by changing a playback speed in accordance with the frequency change degree. In one example, when the frequency change degree is 10, the frequency is raised 10% by increasing the playback speed 10% from a normal speed.
Next, the controller 110 uses the sound outputter 230 to play back the sound effect data from the desinence position to the end at a frequency changed by the frequency change degree and the desinence change degree (step S302), and ends the sound effect playback thread. Any method may be used to change the frequency by the frequency change degree and the desinence change degree. For example, the frequency may be changed on the basis of a value obtained by summing these two change degrees, or the frequency may be changed by the frequency change degree and then further changed by the desinence change degree. When the frequency change degree is 10 and the desinence change degree is 5, in the former method, that is, when changing on the basis of a value obtained by summing the frequency change degree and the desinence change degree, the frequency is raised 15% (10+5=15). In the latter method, that is, when changing the frequency by the frequency change degree and then further changing the frequency by the desinence change degree, the frequency is raised 15.5% (1.1×1.05=1.155). In such a case, the controller 110 may raise the frequency by increasing the playback speed.
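For reference, the two ways of changing the frequency of the desinence portion described above can be expressed as playback speed factors in the following minimal sketch; the function names are illustrative assumptions.

```python
# Minimal sketch of steps S301 and S302: the frequency is raised or lowered by
# scaling the playback speed. Two alternatives are shown for the desinence
# portion: summing the change degrees, or applying them one after the other.
def body_speed_factor(frequency_change):
    # e.g. frequency change degree 10 -> play 10% faster (frequency raised 10%)
    return 1.0 + frequency_change / 100.0

def desinence_speed_factor(frequency_change, desinence_change, summed=True):
    if summed:
        # e.g. 10 and 5 -> 15% faster
        return 1.0 + (frequency_change + desinence_change) / 100.0
    # e.g. 10 and 5 -> 1.1 x 1.05 = 1.155, i.e. about 15.5% faster
    return (1.0 + frequency_change / 100.0) * (1.0 + desinence_change / 100.0)

print(body_speed_factor(10))                        # 1.1
print(desinence_speed_factor(10, 5))                # 1.15
print(desinence_speed_factor(10, 5, summed=False))  # approximately 1.155
```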
Next, the motion playback thread called in step S207 of the control data change/playback processing (
Firstly, the controller 110 changes the motion data on the basis of the speed change degree and the amplitude change degree (step S401). More specifically, time data of the motion data is multiplied by (100/(100+speed change degree)), and rotational angle data is multiplied by ((100+amplitude change degree)/100). In one example, when the speed change degree is −10, the speed is reduced 10% by multiplying the time data of the motion data by 100/(100−10) and, when the amplitude change degree is 10, the rotational angle is increased 10% by multiplying the rotational angle data by (100+10)/100.
However, when the changed motion data exceeds the limits of the driver 220, the motion data may be changed so as to be in the range that does not exceed those limits. Additionally, a configuration is possible in which the motion data in the control content table 124 is set, in advance, to values whereby the limits of the driver 220 are not exceeded even when the speed and/or the amplitude is increased +30%.
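For reference, the change of the motion data in step S401, together with clamping to the limits of the driver 220, can be written as the following minimal sketch; the keyframe representation, the limit value, and the function name are illustrative assumptions.

```python
# Minimal sketch of step S401: multiply the time data by
# 100/(100 + speed change degree) and the rotational angle data by
# (100 + amplitude change degree)/100, then clamp the angle so that the
# limits of the driver 220 are not exceeded.
ANGLE_LIMIT = 90  # assumed mechanical limit of the driver 220, in degrees

def change_motion(motion, speed_change, amplitude_change):
    """motion is a list of (time_ms, angle_deg) keyframes."""
    changed = []
    for time_ms, angle_deg in motion:
        new_time = time_ms * 100.0 / (100.0 + speed_change)
        new_angle = angle_deg * (100.0 + amplitude_change) / 100.0
        new_angle = max(-ANGLE_LIMIT, min(ANGLE_LIMIT, new_angle))
        changed.append((new_time, new_angle))
    return changed

# Example: speed change degree -10 (about 10% slower) and amplitude change
# degree +10 (rotational angles increased 10%).
print(change_motion([(500, 30), (1000, -20)],
                    speed_change=-10, amplitude_change=10))
```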
Then, the controller 110 drives the driver 220 on the basis of the motion data changed in step S401 (step S402), and ends the motion playback thread.
As a result of the control data change/playback processing described above, the control data is changed on the basis of the emotion data 121. Accordingly, the robot 200 can perform actions corresponding to emotions (output sound effects from the sound outputter 230, make gestures by the driver 220) without control data being specifically stored for every pseudo-emotion (piece of emotion data 121) of the robot 200. Specifically, even for pieces of the control data for which the sound effect or the gesture is the same, the frequency and the up-down (tone) of the desinence in the case of sound effects, and the speed and amplitude of the action in the case of gestures, are respectively adjusted (changed) on the basis of the coordinates of the emotion on the emotion map 300 at that time and, as a result, sound effects and gestures corresponding to emotions can be expressed. Accordingly, the robot 200 can be made to act in a more emotionally abundant manner than in the conventional technology, even though the amount of control data is the same.
Note that, in the control data change/playback processing described above, when the control data is a sound effect, the controller 110 adjusts (changes) the frequency and/or the up-down (tone) of the desinence on the basis of the emotion data 121. However, the present disclosure is not limited to the frequency and/or the tone of the sound effect being adjusted. A configuration is possible in which the controller 110 controls so as to adjust (change) an amount of output time of the sound effect, for example, on the basis of the emotion data 121.
In the control data change/playback processing described above, the controller 110 adjusts the control data on the basis of the emotion data 121. However, a configuration is possible in which the controller 110 adjusts the control data on the basis of the emotion change data 122 in addition to the emotion data 121 or instead of the emotion data 121.
In one example, in step S202 described above, the change degrees of the sound effect data are set with the frequency change degree=X/10 and the desinence change degree=Y/10. However, a configuration is possible in which the controller 110 sets these change degrees using the emotion change data 122. For example, the following settings are possible.
Frequency change degree=(X+(DXP−10)−(DXM−10))/10
Desinence change degree=(Y+(DYP−10)−(DYM−10))/10
In step S206 described above, the change degrees of the motion data are set with the speed change degree=X/10 and the amplitude change degree=Y/10.
However, a configuration is possible in which the controller 110 sets these change degrees using the emotion change data 122. For example, the following settings are possible.
Speed change degree=(X+(DXP−10)−(DXM−10))/10
Amplitude change degree=(Y+(DYP−10)−(DYM−10))/10
Note that the emotion change data 122 takes a value from 10 to 20 and, as such, in the equations described above, each of (DXP, DXM, DYP, and DYM) is reduced by 10 to set the value in a range from 0 to 10 and, then, the calculation is carried out.
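For reference, these personality-aware change degrees can be computed as in the following minimal sketch; the function name and the example values are illustrative assumptions.

```python
# Minimal sketch: the emotion change data 122 (each value in the range 10 to
# 20) is shifted into the range 0 to 10 and combined with the emotion data 121.
def change_degrees(x, y, dxp, dxm, dyp, dym):
    frequency = (x + (dxp - 10) - (dxm - 10)) / 10.0  # also used as speed
    desinence = (y + (dyp - 10) - (dym - 10)) / 10.0  # also used as amplitude
    return frequency, desinence

# Example: emotion data (X, Y) = (120, -40) and emotion change data
# DXP = 15, DXM = 10, DYP = 20, DYM = 12.
print(change_degrees(120, -40, 15, 10, 20, 12))  # (12.5, -3.2)
```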
In the example described above, the controller 110 adjusts (changes) the sound effect data and the motion data on the basis of only the emotion data 121 (emotion) or on the basis of both the emotion data 121 (emotion) and the emotion change data 122 (personality). However, a configuration is possible in which the controller 110 adjusts (changes) the sound effect data and/or the motion data on the basis of only the emotion change data 122 (personality).
A case is considered in which the robot 200 is configured to output a sound effect such as “AHHHH!” from the sound outputter 230 when the robot 200 detects an abnormality such as falling or the like. In such a case, it is desirable that the sound effect be continuously output during the period in which the abnormality is continuing in order to more clearly notify the user of the abnormality. Embodiment 2, which enables this, is described next.
The functional configuration and the structure of the robot 200 according to Embodiment 2 are the same as in Embodiment 1 and, as such, description thereof is omitted. However, control content for cases in which an abnormality is detected is stored in the control content table 124 according to Embodiment 2. Specifically, a condition of an external stimulus for which an abnormality is detected is defined as the control condition, and sound effect data of a sound effect to be output when the abnormality is detected is stored as the control data. Moreover, as illustrated in
As described later, the controller 110 lengthens the sound effect by repeatedly playing back the data that is from the repeat position P1 to the repeat position P2. However, in many cases, an amplitude value at P1 of the sound effect data differs from an amplitude value at P2 of the sound effect data and, as such, when the data that is from P1 to P2 is simply repeated, steps are generated in the waveform of the output sound at the transitions of the repeated data due to the difference between the amplitude values at P1 and P2, and an unnatural sound is produced. As such, in Embodiment 2, as illustrated in
Note that, although it is possible to prevent the generation of the steps in the waveform at the transitions of the repeating data by playing back from P1 to P2 in a back-and-forth manner, the slope of the waveform at these transitions may change rapidly. Moreover, these rapid changes in the slope of the waveform at the transitions may negatively affect the sound. Accordingly, when setting the repeat positions P1 and P2, a creator of the sound effect data may set the repeat positions P1 and P2 after actually playing back from P1 to P2 in a back-and-forth manner to confirm that there is no unnaturalness and, then, store the resulting data in the control content table 124 as the sound effect data.
When a sound effect is lengthened without playing back from the repeat position P1 to the repeat position P2 in a back-and-forth manner (when a sound effect is lengthened by repeating playback from the repeat position P1 to the repeat position P2 in the forward direction), it is necessary to adjust not only the slope, but also the amplitude at the transitions of the repetitions. However, the amplitude reliably matches as a result of performing the back-and-forth playback. Accordingly, by playing back, in a back-and-forth manner, the portion from the repeat position P1 to the repeat position P2 in the forward direction and the reverse direction, it is possible to remarkably reduce the work of setting the repeat positions P1 and P2 compared to when not performing the back-and-forth playback.
In Embodiment 2, processing for detecting an abnormality such as falling or the like is executed and, as such, an abnormality detection thread is described while referencing
Firstly, the controller 110 initializes the value of the variable T that stores the type of the abnormality (step S501). The value of the variable T at the time of initialization can be any value that can express that there is no abnormality. For example, the value of the variable T at the time of initialization may be set to 0.
Next, the controller 110 acquires an external stimulus detected by the external stimulus detector 210 (step S502). Then, the controller 110 determines, on the basis of the acquired external stimulus, whether an abnormality is detected (step S503). Examples of the abnormality include the robot 200 falling, rolling, being picked up by the fur, being rotated, and the like. Each of these abnormalities can be detected on the basis of the acceleration and/or the angular velocity.
In one example, the controller 110 can determine that “the robot 200 is falling” when the sum of squares of the acceleration on each axis detected by the acceleration sensor is less than a falling threshold. Additionally, the controller 110 can determine that “the robot 200 is being rolled” when the value of the Y-axis angular velocity detected by the gyrosensor exceeds a rolling threshold. Moreover, the controller 110 can determine that “the robot 200 is being picked up by the fur” when the value of the Z-axis acceleration detected by the acceleration sensor exceeds a pick up threshold. Furthermore, the controller 110 can determine that “the robot 200 is being rotated” when the Z-axis angular velocity detected by the gyrosensor exceeds a rotation threshold.
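For reference, these determinations can be written as the following minimal sketch; the threshold values, the use of absolute values, and the sample format are illustrative assumptions and not values defined in the present embodiment.

```python
# Minimal sketch of the abnormality determination of step S503, using assumed
# threshold values. accel is (ax, ay, az) in g; gyro is (wx, wy, wz) in
# degrees per second.
FALL_THRESHOLD = 0.3        # sum of squares of acceleration below this: falling
ROLL_THRESHOLD = 180.0      # Y-axis angular velocity above this: rolling
PICKUP_THRESHOLD = 1.5      # Z-axis acceleration above this: picked up by fur
ROTATION_THRESHOLD = 180.0  # Z-axis angular velocity above this: being rotated

def detect_abnormality(accel, gyro):
    ax, ay, az = accel
    wx, wy, wz = gyro
    if ax * ax + ay * ay + az * az < FALL_THRESHOLD:
        return "fall"
    if abs(wy) > ROLL_THRESHOLD:
        return "roll"
    if az > PICKUP_THRESHOLD:
        return "pickup"
    if abs(wz) > ROTATION_THRESHOLD:
        return "rotation"
    return None  # no abnormality detected

print(detect_abnormality((0.1, 0.1, 0.2), (0.0, 0.0, 0.0)))  # 'fall'
```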
When the controller 110 does not detect these abnormalities (step S503; No), step S502 is executed. When the controller 110 detects any of these abnormalities (step S503; Yes), the controller 110 stores the type (type such as “fall”, “roll”, or the like) of the detected abnormality in the variable T (step S504). In this step, the controller 110 stores a value associated with the type of abnormality in the variable T. For example, the controller 110 stores 1 for “fall”, 2 for “roll”, and the like in the variable T.
Then, the controller 110 starts up a sound effect lengthening thread, described later (step S505). The sound effect lengthening thread is processing in which a sound effect corresponding to the type of the abnormality is lengthened for the period in which the abnormality is continuing. This processing is described later in detail.
Then, the controller 110 acquires the external stimulus again (step S506), and determines whether an abnormality of the type stored in the variable T is detected (step S507). When an abnormality of the type stored in the variable T is detected (step S507; Yes), step S506 is executed. When an abnormality of the type stored in the variable T is not detected (step S507; No), step S501 is executed.
As a result of the abnormality detection thread described above, the type of the abnormality is stored in the variable T during the period in which the abnormality is being detected and, when the abnormality is no longer detected, the variable T is initialized. Next, the sound effect lengthening thread started up in step S505 of the abnormality detection thread (
Firstly, the controller 110 references the variable T and the control content table 124, acquires the sound effect data corresponding to the detected abnormality (step S601), and acquires the repeat positions P1 and P2 included in the sound effect data (step S602).
Next, the controller 110 plays back the sound effect data from the beginning to the position P1 by the sound outputter 230 (step S603), and further plays back from the position P1 to the position P2 in the forward direction (step S604).
Then, the controller 110 references the variable T, and determines whether the abnormality is still continuing, that is, whether the value of the variable T has not changed (has not been initialized) (step S605). When the abnormality is not continuing (step S605; No), the controller 110 plays back the sound effect data from the position P2 to the end by the sound outputter 230 (step S606), and ends the processing of the sound effect lengthening thread.
When the abnormality is continuing (step S605; Yes), the controller 110 plays back the sound effect data from the position P2 to the position P1 in the reverse direction by the sound outputter 230 (step S607), and executes step S604.
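For reference, the lengthening of steps S603 to S607 can be written as the following minimal sketch; the representation of the sound effect data as a list of samples, the playback helper, and the abnormality flag are illustrative assumptions.

```python
# Minimal sketch of the sound effect lengthening thread: play up to the repeat
# position P1, then repeat the P1-P2 portion back and forth (forward, then
# reversed) while the abnormality continues, and finish with the portion
# after P2 so that the amplitude always matches at the transitions.
def lengthen_sound_effect(samples, p1, p2, abnormality_continuing, play):
    """samples: list of amplitude values; p1, p2: repeat positions (indices);
    abnormality_continuing: callable returning True while the abnormality is
    detected; play: callable that outputs a list of samples."""
    play(samples[:p1])              # beginning to P1 (step S603)
    play(samples[p1:p2])            # P1 to P2, forward direction (step S604)
    while abnormality_continuing():
        play(samples[p1:p2][::-1])  # P2 back to P1, reverse direction (S607)
        play(samples[p1:p2])        # P1 to P2, forward direction again (S604)
    play(samples[p2:])              # P2 to the end (step S606)

# Example with dummy data: the abnormality ends after one repetition.
flags = iter([True, False])
lengthen_sound_effect(list(range(10)), p1=3, p2=7,
                      abnormality_continuing=lambda: next(flags),
                      play=print)
```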
As a result of the sound effect lengthening thread described above, the robot 200 according to Embodiment 2 can, for the period in which the abnormality is being detected, lengthen and output a sound effect in a manner so as not to impart unnaturalness.
The present disclosure is not limited to the embodiments described above, and various modifications and uses are possible. For example, a configuration is possible in which Embodiment 1 and Embodiment 2 are combined and, in addition to when there is an abnormality, the sound effect is lengthened and output on the basis of the pseudo-emotion of the robot 200 so as to seem natural.
In the embodiments described above, a configuration is described in which the apparatus control device 100 is built into the robot 200, but a configuration is possible in which the apparatus control device 100 is not built into the robot 200. For example, a configuration is possible in which, as illustrated in
In the embodiments described above, the apparatus control device 100 is a control device that controls the robot 200. However, the apparatus to be controlled is not limited to the robot 200. Examples of the apparatus to be controlled include a wristwatch, and the like. For example, in the case of a wristwatch that is capable of outputting sound and that includes an acceleration sensor and a gyrosensor, and in which a pseudo-creature can be raised as application software, impacts or the like applied to the wristwatch and detected by the acceleration sensor and the gyrosensor can be envisioned as the external stimulus. Additionally, it is expected that the emotion change data 122 and the emotion data 121 are updated in accordance with this external stimulus, and the sound effect data set in the control content table 124 is adjusted (changed) on the basis of the emotion data 121 at the point in time at which the user puts on the wristwatch, and output.
Accordingly, a configuration is possible in which, when the wristwatch is being handled roughly, a sad-like sound effect is emitted when the user puts the wristwatch on, and when the wristwatch is being handled with care, a happy-like sound effect is emitted when the user puts the wristwatch on. Furthermore, when configured so that the emotion change data 122 is set during a first period (for example, fifty days), individuality (pseudo-personality) will develop in the wristwatch on the basis of how the user handles the wristwatch in the first period. That is, the same model of wristwatch becomes a wristwatch that tends to feel happiness in cases in which the wristwatch is handled with care by the user, and becomes a wristwatch that tends to feel sadness in cases in which the wristwatch is handled roughly by the user.
Thus, the apparatus control device 100 is not limited to a robot and can be applied to various apparatuses that include an acceleration sensor, a gyrosensor, and the like, and can provide the applied apparatus with pseudo-emotions, a personality, and the like. Furthermore, the apparatus control device 100 can be applied to various apparatuses to cause a user to feel as if they are pseudo-raising that apparatus.
In the embodiments described above, a description is given in which the action programs executed by the CPU of the controller 110 are stored in advance in the ROM or the like of the storage 120. However, the present disclosure is not limited thereto, and a configuration is possible in which the action programs for executing the various processings described above are installed on an existing general-purpose computer or the like, thereby causing that computer to function as a device corresponding to the apparatus control device 100 according to the embodiments described above.
Any method can be used to provide such programs. For example, the programs may be stored and distributed on a non-transitory computer-readable recording medium (flexible disc, Compact Disc (CD)-ROM, Digital Versatile Disc (DVD)-ROM, Magneto Optical (MO) disc, memory card, USB memory, or the like), or may be provided by storing the programs in a storage on a network such as the internet, and causing these programs to be downloaded.
Additionally, in cases in which the processings described above are realized by being divided between an operating system (OS) and an application/program, or are realized by cooperation between an OS and an application/program, it is possible to store only the portion of the application/program on the non-transitory recording medium or in the storage. Additionally, the programs can be piggybacked on carrier waves and distributed via a network. For example, the programs may be posted to a bulletin board system (BBS) on a network, and distributed via the network. Moreover, a configuration is possible in which the processings described above are executed by starting these programs and, under the control of the operating system (OS), executing the programs in the same manner as other applications/programs.
The foregoing describes some example embodiments for explanatory purposes. Although the foregoing discussion has presented specific embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. This detailed description, therefore, is not to be taken in a limiting sense, and the scope of the invention is defined only by the included claims, along with the full range of equivalents to which such claims are entitled.