This application claims the benefit of Japanese Patent Application No. 2022-154437, filed on Sep. 28, 2022, the entire disclosure of which is incorporated by reference herein.
The present disclosure relates generally to an apparatus, an apparatus control method, and a recording medium.
Devices have been developed for controlling the actions of apparatuses such as robots and the like so as to make the apparatuses more similar to familiar beings such as friends or pets. Additionally, technology is known for imparting pseudo-emotions to such apparatuses. For example, Unexamined Japanese Patent Application Publication No. 2021-069767 describes an apparatus control device that sets, in accordance with an external stimulus acting on an apparatus, an emotion parameter expressing a pseudo-emotion of the apparatus, and controls an action of the apparatus in accordance with the set emotion parameter.
One aspect of an apparatus according to the present disclosure is an apparatus including:
A more complete understanding of this application can be obtained when the following detailed description is considered in conjunction with the following drawings, in which:
Hereinafter, embodiments of the present disclosure are described while referencing the drawings. Note that, in the drawings, identical or corresponding components are denoted with the same reference numerals.
An embodiment, in which an apparatus control device according to an embodiment is applied to a robot 200 illustrated in
As illustrated in
The coupler 205 couples the torso 206 and the head 204 so as to enable rotation (by the twist motor 221) around a first rotational axis that passes through the coupler 205 and extends in a front-back direction of the torso 206. The twist motor 221 rotates the head 204, with respect to the torso 206, clockwise (right rotation) within a forward rotation angle range around the first rotational axis (forward rotation), counter-clockwise (left rotation) within a reverse rotation angle range around the first rotational axis (reverse rotation), and the like. Note that, in this description, the term “clockwise” refers to clockwise when viewing the direction of the head 204 from the torso 206. Additionally, herein, clockwise rotation is also referred to as “twist rotation to the right”, and counter-clockwise rotation is also referred to as “twist rotation to the left.” A maximum value of the angle of twist rotation to the right (right rotation) or the left (left rotation) can be set as desired, and the angle of the head 204 in a state, as illustrated in
The coupler 205 couples the torso 206 and the head 204 so as to enable rotation (by the vertical motor 222) around a second rotational axis that passes through the coupler 205 and extends in a width direction of the torso 206. The vertical motor 222 rotates the head 204 upward (forward rotation) within a forward rotation angle range around the second rotational axis, downward (reverse rotation) within a reverse rotation angle range around the second rotational axis, and the like. A maximum value of the angle of rotation upward or downward can be set as desired, and the angle of the head 204 in a state, as illustrated in
When the head 204 is rotated to the vertical reference angle or upward from the vertical reference angle by vertical rotation around the second rotational axis, the head 204 can contact, via the exterior 201, the placement surface such as the floor or the table on which the robot 200 is placed. Note that, in
As illustrated in
The robot 200 includes an acceleration sensor 212 on the torso 206. The acceleration sensor 212 can detect an attitude (orientation) of the robot 200, and can detect being picked up, the orientation being changed, being thrown, and the like by the user. The robot 200 includes a gyrosensor 213 on the torso 206. The gyrosensor 213 can detect rolling, rotating, and the like of the robot 200.
The robot 200 includes a microphone 214 on the torso 206. The microphone 214 can detect external sounds. Furthermore, the robot 200 includes a speaker 215 on the torso 206. The speaker 215 can be used to emit a sound (sound effect) of the robot 200.
The robot 200 includes a power receiver 251 on a bottom surface of the torso 206. The robot 200 is driven by a rechargeable battery (cell) 252 provided in the housing 207, and receives, by the power receiver 251, power sent from a wireless charger to charge the battery 252. In one example, the wireless charger resembles a pet cage (house), and includes a sheet-like power-supplying placement surface. Charging of the battery 252 starts when the robot 200 is placed on the power-supplying placement surface of the wireless charger.
Note that, in the present embodiment, the acceleration sensor 212, the gyrosensor 213, the microphone 214, and the speaker 215 are provided on the torso 206, but a configuration is possible in which all or a portion of these components are provided on the head 204. Note that a configuration is possible in which, in addition to the acceleration sensor 212, the gyrosensor 213, the microphone 214, and the speaker 215 provided on the torso 206, all or a portion of these components are also provided on the head 204. The touch sensor 211 is provided on each of the head 204 and the torso 206, but a configuration is possible in which the touch sensor 211 is provided on only one of the head 204 and the torso 206. Moreover, a configuration is possible in which a plurality of any of these components is provided.
Next, the functional configuration of the robot 200 is described. As illustrated in
A configuration is possible in which the apparatus control device 100 is connected to the external stimulus detector 210, the driver 220, the sound outputter 230, the operation inputter 240, and the power supply controller 250 by a wired interface such as a universal serial bus (USB) cable or the like, or by a wireless interface such as Bluetooth (registered trademark) or the like. Additionally, a configuration is possible in which the controller 110 and the storage 120 are connected via the bus line BL.
The apparatus control device 100 controls the actions of the robot 200 by using the controller 110 and the storage 120.
In one example, the controller 110 is configured from a central processing unit (CPU) or the like, and executes various processings (robot control processing, emotion change processing, and the like), described later, using programs stored in the storage 120. Note that the controller 110 is compatible with multithreading functionality, in which a plurality of processings are executed in parallel. As such, the controller 110 can execute the various processings (robot control processing, emotion change processing, and the like), described later, in parallel. Additionally, the controller 110 is provided with a clock function and a timer function, and can measure the date and time, and the like.
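To picture how these processings might run side by side, the following is a minimal Python sketch, assuming hypothetical function bodies and timing values; it is not the actual firmware of the controller 110, which simply needs to be able to execute the processings in parallel.

```python
# Minimal sketch (not the actual firmware of the controller 110): the robot
# control processing and the emotion change processing are run in parallel.
# The function bodies and timing values are hypothetical placeholders.
import threading
import time

def robot_control_processing():
    while True:
        # acquire the external stimulus, set the emotion data, drive the motors, ...
        time.sleep(0.25)

def emotion_change_processing():
    while True:
        # check the leave-alone time and change the pseudo-emotion step-wise, ...
        time.sleep(600)

threading.Thread(target=robot_control_processing, daemon=True).start()
threading.Thread(target=emotion_change_processing, daemon=True).start()
time.sleep(1)  # in the actual apparatus, both threads run for as long as the power is ON
```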
The controller 110 includes, as a functional configuration according to the present disclosure, an amount of change learner 111, an amount of change acquirer 112, an emotion setter 113, an action controller 114, and an emotion changer 115.
The amount of change learner 111 learns and stores an emotion change parameter (emotion change data 122, described later) in accordance with the external stimulus detected by the external stimulus detector 210. The emotion change parameter is a parameter for changing the pseudo-emotion of the robot 200. Specifically, the amount of change learner 111 increases/decreases the emotion change data 122 in accordance with the external stimulus by robot control processing, described later.
The amount of change acquirer 112 acquires, in accordance with the detected external stimulus, the emotion change parameter (the emotion change data 122) learned by the amount of change learner 111. The amount of change acquirer 112 is described in detail later.
The emotion setter 113 sets, in accordance with the emotion change parameter (the emotion change data 122) acquired by the amount of change acquirer 112, an emotion parameter (emotion data 121, described later) expressing the pseudo-emotion of the robot 200. The emotion setter 113 is described in detail later.
The action controller 114 controls the actions of the robot 200 in accordance with the external stimulus detected by the external stimulus detector 210. The action controller 114 is described in detail later.
The emotion changer 115 executes emotion change processing, described later, to change the pseudo-emotion of the robot 200 in a step-wise manner when the external stimulus acting on the robot 200 is not detected in a predetermined period. The emotion changer 115 is described in detail later.
The storage 120 is configured from read-only memory (ROM), flash memory, random access memory (RAM), or the like. Programs to be executed by the CPU of the controller 110, and data needed in advance to execute these programs are stored in the ROM. The flash memory is writable non-volatile memory, and stores data that is desired to be retained even after the power is turned OFF. Data that is created or modified during the execution of the programs is stored in the RAM.
The external stimulus detector 210 includes the touch sensor 211, the acceleration sensor 212, the gyrosensor 213, and the microphone 214 described above. The controller 110 acquires, as external stimulus data, detection values detected by the various sensors of the external stimulus detector 210. The external stimulus data expresses the external stimulus acting on the robot 200. Note that a configuration is possible in which the external stimulus detector 210 includes sensors other than the touch sensor 211, the acceleration sensor 212, the gyrosensor 213, and the microphone 214. The types of external stimuli acquirable by the controller 110 can be increased by increasing the types of sensors of the external stimulus detector 210.
The touch sensor 211 detects contacting by some sort of object. The touch sensor 211 is configured from a pressure sensor or a capacitance sensor, for example. A detection value detected by the touch sensor 211 expresses the strength of contact. Additionally, the touch sensor 211 is capable of directional contact detection, and detects the strength of contact in three axial directions, namely contact from the front-back direction (the X-axis direction), contact from a width (left-right) direction (Y-axis direction), and contact from a vertical direction (Z-axis direction) of the torso 206 of the robot 200. Therefore, the detection value of the touch sensor 211 is three-dimensional data constituted by values of the strength of contact from the X-axis direction, the strength of contact from the Y-axis direction, and the strength of contact from the Z-axis direction. The controller 110 can, on the basis of the detection value from the touch sensor 211, detect that the robot 200 is being pet, is being struck, and the like by the user.
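As one way to picture how the controller 110 might distinguish petting from striking on the basis of the three-axis contact strength, the following Python sketch classifies a single reading by its overall magnitude. The threshold value and the classification rule are assumptions made purely for illustration; they are not taken from the embodiment.

```python
import math

def classify_touch(tx, ty, tz, strike_threshold=5.0):
    """Classify one touch sensor reading (X-, Y-, Z-direction contact strength).

    The threshold is a hypothetical value chosen only for this example.
    """
    strength = math.sqrt(tx * tx + ty * ty + tz * tz)
    if strength == 0.0:
        return "no contact"
    return "struck" if strength >= strike_threshold else "petted"

print(classify_touch(0.5, 0.2, 1.0))  # weak contact   -> "petted"
print(classify_touch(2.0, 1.0, 6.0))  # strong contact -> "struck"
```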
The acceleration sensor 212 detects acceleration in three axial directions, namely the front-back direction (X-axis direction), the width (left-right) direction (Y-axis direction), and the vertical direction (Z-axis direction) of the torso 206 of the robot 200. Therefore, the acceleration value detected by the acceleration sensor 212 is three-dimensional data constituted by values of X-axis direction acceleration, Y-axis direction acceleration, and Z-axis direction acceleration. The acceleration sensor 212 detects gravitational acceleration when the robot 200 is stopped and, as such, the controller 110 can detect a current attitude of the robot 200 on the basis of the gravitational acceleration detected by the acceleration sensor 212. Additionally, when, for example, the user picks up or throws the robot 200, the acceleration sensor 212 detects, in addition to the gravitational acceleration, acceleration caused by the movement of the robot 200. Accordingly, the controller 110 can detect the movement of the robot 200 by removing the gravitational acceleration component from the detection value detected by the acceleration sensor 212.
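The removal of the gravitational acceleration component can be pictured with the following sketch, which estimates gravity with a simple low-pass filter and subtracts it from each sample. The filter itself and its constant are assumptions made for illustration, since the embodiment only states that the gravitational acceleration component is removed.

```python
def remove_gravity(samples, alpha=0.9):
    """Separate movement acceleration from gravity by low-pass filtering.

    `samples` is a list of (ax, ay, az) tuples read from the acceleration
    sensor 212. The low-pass constant `alpha` is an illustrative assumption.
    """
    gravity = [0.0, 0.0, 0.0]
    movement = []
    for sample in samples:
        # slowly track the constant (gravity) component ...
        gravity = [alpha * g + (1.0 - alpha) * a for g, a in zip(gravity, sample)]
        # ... and treat the remainder as acceleration caused by movement
        movement.append(tuple(a - g for a, g in zip(sample, gravity)))
    return movement

# Example: a stationary robot followed by a sudden upward movement.
readings = [(0.0, 0.0, 9.8)] * 8 + [(0.0, 0.0, 12.0)]
print(remove_gravity(readings)[-1])  # the Z component reflects the movement
```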
The gyrosensor 213 detects an angular velocity from when rotation is applied to the torso 206 of the robot 200. Specifically, the gyrosensor 213 detects the angular velocity on three axes of rotation, namely rotation around the front-back direction (the X-axis direction), rotation around the width (left-right) direction (the Y-axis direction), and rotation around the vertical direction (the Z-axis direction) of the torso 206. Therefore, an angular velocity value detected by the gyrosensor 213 is three-dimensional data constituted by the values of X-axis rotation angular velocity, Y-axis rotation angular velocity, and Z-axis rotation angular velocity. The controller 110 can more accurately detect the movement of the robot 200 by combining the detection value detected by the acceleration sensor 212 and the detection value detected by the gyrosensor 213.
Note that the touch sensor 211, the acceleration sensor 212, and the gyrosensor 213 are synchronized, detect each of the strength of contact, the acceleration, and the angular velocity at the same timing, and output the detection values to the controller 110 at the same timing. Specifically, the touch sensor 211, the acceleration sensor 212, and the gyrosensor 213 detect the strength of contact, the acceleration, and the angular velocity at the same timing every 0.25 seconds, for example.
The microphone 214 detects ambient sound of the robot 200. The controller 110 can, for example, detect, on the basis of a component of the sound detected by the microphone 214, that the user is speaking to the robot 200, that the user is clapping their hands, and the like.
The driver 220 includes the twist motor 221 and the vertical motor 222. The driver 220 is driven by the controller 110. As a result, the robot 200 can express actions such as, for example, lifting the head 204 up (rotating upward around the second rotational axis), twisting the head 204 sideways (twisting/rotating to the right or to the left around the first rotational axis), and the like. Motion data for driving the driver 220 in order to express these actions is recorded in a control content table 123, described later.
The sound outputter 230 includes the speaker 215, and sound is output from the speaker 215 as a result of sound data being input into the sound outputter 230 by the controller 110. For example, the robot 200 emits a pseudo-animal sound as a result of the controller 110 inputting animal sound data of the robot 200 into the sound outputter 230. This animal sound data is also recorded as sound effect data in the control content table 123.
In one example, the operation inputter 240 is configured from an operation button, a volume knob, or the like. The operation inputter 240 is an interface for receiving user operations such as, for example, turning the power ON/OFF, adjusting the volume of the output sound, and the like.
The power supply controller 250 includes a sub-microcomputer, a charging integrated circuit (IC), a power supply control IC, the power receiver 251, and the like, and performs charging of the battery 252 of the robot 200, acquisition of the remaining charge of the battery 252, and power supply control of the robot 200.
Next, of the data stored in the storage 120 of the apparatus control device 100, the data unique to the present embodiment, namely, the emotion data 121, the emotion change data 122, the control content table 123, leave-alone time data 124, and an amount of change table 125 are described in order.
The emotion data 121 is data for imparting pseudo-emotions to the robot 200, and is data (X, Y) that represents coordinates on an emotion map 300. As illustrated in
The emotion data 121 has two values, namely the X value (degree of relaxation, degree of worry) and the Y value (degree of excitement, degree of disinterest) that express a plurality (in the present embodiment, four) of mutually different pseudo-emotions, and points on the emotion map 300 represented by the X value and the Y value represent the pseudo-emotions of the robot 200. An initial value of the emotion data 121 is (0, 0). The emotion data 121 is a parameter expressing the pseudo-emotion of the robot 200 and, as such, is also called an “emotion parameter.” Note that, in
In the present embodiment, regarding the size of the emotion map 300, as illustrated by frame 301 of
Returning to
In the present embodiment, an example is described in which the initial value of each of these variables is set to 10 and, during robot control processing, described later, the value increases to a maximum of 20 by processing for learning emotion change data. Due to this learning processing, the emotion change data 122, that is, the degree to which the emotion changes, varies and, as such, the robot 200 assumes various personalities in accordance with the manner in which the user interacts with the robot 200. That is, the personality of each individual robot 200 is formed differently on the basis of the manner in which the user interacts with the robot 200.
In the present embodiment, each piece of personality data (personality value) is derived by subtracting 10 from each piece of emotion change data 122. Specifically, a value obtained by subtracting 10 from DXP that expresses a tendency to be relaxed is set as a personality value (happy), a value obtained by subtracting 10 from DXM that expresses a tendency to be worried is set as a personality value (shy), a value obtained by subtracting 10 from DYP that expresses a tendency to be excited is set as a personality value (active), and a value obtained by subtracting 10 from DYM that expresses a tendency to be disinterested is set as a personality value (spoiled). As a result, for example, as illustrated in
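The derivation of the personality values described in this paragraph can be summarized with the following short sketch; the numeric values of the emotion change data 122 used in the example are arbitrary and chosen only for illustration.

```python
# Each personality value is the corresponding emotion change data minus 10
# (initial value 10 -> personality 0; learned maximum 20 -> personality 10).
emotion_change_data = {"DXP": 13, "DXM": 10, "DYP": 17, "DYM": 11}  # example values

personality = {
    "happy":   emotion_change_data["DXP"] - 10,
    "shy":     emotion_change_data["DXM"] - 10,
    "active":  emotion_change_data["DYP"] - 10,
    "spoiled": emotion_change_data["DYM"] - 10,
}
print(personality)  # {'happy': 3, 'shy': 0, 'active': 7, 'spoiled': 1}
```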
As illustrated in
As illustrated in
Regarding the sound effect data, to facilitate ease of understanding, text describing each piece of the sound effect data is included in
Note that, in the control content table 123 illustrated in
Returning to
As illustrated in
Next, the robot control processing executed by the controller 110 of the apparatus control device 100 is described while referencing the flowchart illustrated in
Firstly, the controller 110 initializes the various types of data such as the emotion data 121, the emotion change data 122, and the like (step S101). Note that a configuration is possible in which, for the second and subsequent startups of the robot 200, the various values from when the power of the robot 200 was last turned OFF are set in step S101. This can be realized by the controller 110 storing the various data values in the nonvolatile memory (flash memory or the like) of the storage 120 when the operation for turning the power OFF is performed and, when the power is thereafter turned ON, setting the stored values as the various data values.
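The save-and-restore behavior described here could be sketched as follows; the file name and the JSON format merely stand in for the nonvolatile flash memory of the storage 120 and are assumptions made for illustration.

```python
import json
import os

STATE_FILE = "robot_state.json"  # stand-in for the flash memory of the storage 120

def save_state(emotion_data, emotion_change_data):
    # Called when the operation for turning the power OFF is performed.
    with open(STATE_FILE, "w") as f:
        json.dump({"emotion_data": emotion_data,
                   "emotion_change_data": emotion_change_data}, f)

def load_state():
    # Called in step S101: restore the previous values if they exist,
    # otherwise fall back to the initial values.
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return {"emotion_data": [0, 0],
            "emotion_change_data": {"DXP": 10, "DXM": 10, "DYP": 10, "DYM": 10}}
```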
Then, the controller 110 starts measuring the leave-alone time using the clock function (step S102). The leave-alone time is continuously measured until the leave-alone time is reset by step S109, described later.
Next, the controller 110 acquires the detection value detected by the external stimulus detector 210 (step S103). Then, the controller 110 determines, on the basis of the acquired detection value, whether there is an external stimulus (step S104).
When there is an external stimulus (step S104; Yes), the controller 110 (the amount of change acquirer 112) acquires the emotion change data 122 in accordance with the detection value of the external stimulus acquired in step S103 (step S105). Specifically, when, for example, petting of the head 204 is detected by the touch sensor 211 of the head 204 as the external stimulus, the robot 200 obtains a pseudo sense of relaxation and, as such, the controller 110 acquires DXP as the emotion change data 122 to be added to the X value of the emotion data 121.
Then, the controller 110 (the emotion setter 113) sets the emotion data 121 in accordance with the emotion change data 122 acquired in step S105 (step S106). Specifically, when, for example, DXP is acquired as the emotion change data 122 in step S105, the controller 110 adds the DXP of the emotion change data 122 to the X value of the emotion data 121.
In steps S105 and S106, the type of emotion change data 122 to be acquired and the manner in which the emotion data 121 is set can be defined as desired for each individual external stimulus. Examples are described below.
However, in a case in which a value (X value, Y value) of the emotion data 121 exceeds the maximum value of the emotion map 300 when adding the emotion change data 122, that value of the emotion data 121 is set to the maximum value of the emotion map 300. In addition, in a case in which a value of the emotion data 121 is less than the minimum value of the emotion map 300 when subtracting the emotion change data 122, that value of the emotion data 121 is set to the minimum value of the emotion map 300.
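Steps S105 and S106 together with this clamping can be summarized in the following sketch; the map size of ±100 used in the usage example is an assumption for illustration, since the actual maximum and minimum depend on the current size of the emotion map 300.

```python
def update_emotion(emotion_data, delta_x, delta_y, map_max, map_min):
    """Add the acquired emotion change data to the emotion data (step S106)
    and clamp each value to the range of the emotion map 300."""
    x, y = emotion_data
    x = min(max(x + delta_x, map_min), map_max)
    y = min(max(y + delta_y, map_min), map_max)
    return (x, y)

# Example: petting the head adds DXP to the X value; the result is clamped.
print(update_emotion((98, 0), 5, 0, map_max=100, map_min=-100))  # -> (100, 0)
```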
Next, the controller 110 (the action controller 114) references the control content table 123 and acquires the control data corresponding to the control condition that is satisfied by the acquired detection value of the external stimulus (step S107).
Then, the controller 110 (the action controller 114) plays back the control data acquired in step S107 (step S108). Note that a configuration is possible in which the controller 110 adjusts (changes), on the basis of the emotion data 121, the content of the control data to be played back.
Next, the controller 110 resets the leave-alone time to 0 (step S109), and starts measuring the leave-alone time (step S110). Then, the controller 110 executes step S113.
Meanwhile, when there is not an external stimulus in step S104 (step S104; No), the controller 110 determines whether to perform a spontaneous action (a breathing imitation action that is an action resembling breathing of a creature, or the like) (step S111). Any method may be used as the method for determining whether to perform the spontaneous action but, in the present embodiment, it is assumed that the determination of step S111 is "Yes" and the breathing imitation action is performed every breathing cycle (for example, two seconds).
When not performing the spontaneous action (step S111; No), the controller 110 executes step S113. When performing the spontaneous action (step S111; Yes), the controller 110 executes the spontaneous action (for example, the breathing imitation action) (step S112), and executes step S113.
The control data of this spontaneous action is also stored in the control content table 123 (such as illustrated in, for example, "breathing cycle elapsed" of the "control conditions" of
In step S113, the controller 110 uses the clock function to determine whether a date has changed. When the date has not changed (step S113; No), the controller 110 executes step S103.
When the date has changed (step S113; Yes), the controller 110 (the amount of change learner 111) performs learning of the emotion change data 122 (step S114), and executes step S103. Specifically, the learning of the emotion change data 122 is processing for updating the emotion change data 122 as follows: 1 is added to the DXP of the emotion change data 122 when the X value of the emotion data 121 is set to the maximum value of the emotion map 300 even once in step S106 of that day; 1 is added to the DYP when the Y value of the emotion data 121 is set to the maximum value of the emotion map 300 even once in step S106 of that day; 1 is added to the DXM when the X value of the emotion data 121 is set to the minimum value of the emotion map 300 even once in step S106 of that day; and 1 is added to the DYM when the Y value of the emotion data 121 is set to the minimum value of the emotion map 300 even once in step S106 of that day.
However, when the various values of the emotion change data 122 become exceedingly large, the amount by which the emotion data 121 changes at one time becomes exceedingly large and, as such, the maximum value of the various values of the emotion change data 122 is set to 20, for example, and the various values are limited to that maximum value or less. Here, 1 is added to each piece of the emotion change data 122, but the value to be added is not limited to 1. For example, a configuration is possible in which the number of times that the various values of the emotion data 121 are set to the maximum value or the minimum value of the emotion map 300 is counted and, when that number of times is great, the numerical value to be added to the emotion change data 122 is increased.
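The daily learning in step S114, including the upper limit described here, could be sketched as follows; the flag-based interface is an assumption made only to keep the example self-contained.

```python
def learn_emotion_change_data(ecd, hit_max_x, hit_max_y, hit_min_x, hit_min_y, cap=20):
    """Daily learning of the emotion change data 122 (step S114).

    Each flag indicates whether the corresponding value of the emotion data 121
    reached the maximum/minimum of the emotion map 300 even once that day in
    step S106. The cap of 20 follows the example given in the text.
    """
    if hit_max_x:
        ecd["DXP"] = min(ecd["DXP"] + 1, cap)
    if hit_max_y:
        ecd["DYP"] = min(ecd["DYP"] + 1, cap)
    if hit_min_x:
        ecd["DXM"] = min(ecd["DXM"] + 1, cap)
    if hit_min_y:
        ecd["DYM"] = min(ecd["DYM"] + 1, cap)
    return ecd

print(learn_emotion_change_data({"DXP": 19, "DXM": 10, "DYP": 20, "DYM": 10},
                                True, True, False, False))
# -> {'DXP': 20, 'DXM': 10, 'DYP': 20, 'DYM': 10}
```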
Next, emotion change processing executed by the controller 110 (the emotion changer 115) of the apparatus control device 100 is described while referencing the flowchart illustrated in
Firstly, the controller 110 references the leave-alone time data 124 and determines whether the current leave-alone time is 10 minutes or greater (step S201). When the leave-alone time is less than 10 minutes (step S201; No), the controller 110 stands by without executing the subsequent steps.
When the leave-alone time is 10 minutes or greater (step S201; Yes), the controller 110 references the amount of change table 125 illustrated in
Returning to
When the Y value of the emotion data 121 is a positive value (step S203; Yes), that is, when the robot 200 is in an excited state, the controller 110 changes the values (X, Y) on the emotion map 300 expressed by the emotion data 121 in a direction toward the origin (0, 0) by a distance corresponding to the emotion change amount acquired in step S202 (step S204), and executes step S206. As a result of this processing, the pseudo-emotion of the robot 200 that is in the excited state changes so as to become a neutral emotion, and the robot 200 appears to be in a calm state.
Meanwhile, when the Y value of the emotion data 121 is not a positive value (step S203; No), the controller 110 changes the values (X, Y) on the emotion map 300 in a direction (negative direction) in which that Y value decreases by a distance corresponding to the emotion change amount acquired in step S202 (step S205), and executes step S206. As a result of this processing, the pseudo-emotion of the robot 200 changes so as to become disinterested, and the robot 200 appears to be in a calm state.
In step S206, the controller 110 stands by for 10 minutes. Then, the controller 110 executes step S201.
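The flow of the emotion change processing (steps S201 through S205) could be pictured with the following sketch, one pass of which the controller would execute after each 10-minute stand-by (step S206); the function signature and the dictionary representation of the emotion data are assumptions made for illustration.

```python
import math

def emotion_change_step(leave_alone_minutes, change_amount, emotion):
    """One pass of the emotion change processing (steps S201 to S205).

    `emotion` is a dict {"X": ..., "Y": ...}; `change_amount` stands in for the
    value looked up in the amount of change table 125 (step S202).
    """
    if leave_alone_minutes < 10:                        # step S201
        return emotion                                  # stand by, no change
    x, y = emotion["X"], emotion["Y"]
    if y > 0:                                           # step S203
        dist = math.hypot(x, y)                         # step S204: move toward the origin
        scale = max(dist - change_amount, 0.0) / dist
        return {"X": x * scale, "Y": y * scale}
    return {"X": x, "Y": y - change_amount}             # step S205: decrease the Y value

# Example: an excited emotion (3, 4) left alone moves 3 closer to the origin.
print(emotion_change_step(10, 3, {"X": 3, "Y": 4}))     # -> approximately {'X': 1.2, 'Y': 1.6}
```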
Next, examples of the change in the emotion data 121 resulting from the emotion change processing described above are described. For example, as illustrated in
Then, when left alone for another 10 minutes (when left alone for a total of 20 minutes), Yb is a positive value and, as such, the processing of step S204 is executed again, and the emotion data 121 changes to a point C (Xc, Yc) that is separated from the point B by a distance of 3 in the direction from the point B toward the origin (0, 0). This means that the pseudo-emotion of the robot 200 is changing, in a step-wise manner, toward the state of the time of the first start-up.
Thereafter, every time 10 minutes of the leave-alone time elapses, the emotion data 121 for which the Y value is positive changes in the direction of the origin, and the emotion data 121 changes until reaching the point D (Xd=0, Yd=0). Then, when the robot 200 is left alone in this state for another 10 minutes, Yd is not a positive value and, as such, the processing of step S205 is executed, and the emotion data 121 changes from the point D to the point E (Xe=0, Ye) in the negative direction along the Y axis. Then, every time 10 minutes elapses without the external stimulus being detected, the Y value of the emotion data 121 changes in a step-wise manner in the negative direction. This means that the pseudo-emotion of the robot 200 is lost, and the robot 200 changes in a step-wise manner such that the appearance and actions of the robot 200 seem to become disinterested.
As described above, when the external stimulus is not detected in the predetermined period, the apparatus control device 100 according to the present embodiment changes, in a step-wise manner, the emotion data 121 (the emotion parameter) expressing the pseudo-emotion of the robot 200. That is, even in a case such as when the user leaves the robot 200 alone, the pseudo-emotion of the robot 200 changes and, as such, it is possible to express more lifelike emotions, and it is possible to appropriately generate the pseudo-emotion.
Additionally, when the apparatus control device 100 according to the present embodiment determines that the degree of excitement (the Y value) expressed by the emotion data 121 is a positive value, that is, that the robot 200 tends to be excited (when the pseudo-emotion is close to the predetermined state), the apparatus control device 100 changes the emotion data 121 in the direction toward the origin on the emotion map 300, that is, changes the emotion data 121 so as to become the emotion of the time of the first start-up. Moreover, when the apparatus control device 100 according to the present embodiment determines that the degree of excitement (the Y value) expressed by the emotion data 121 is not a positive value, that is, that the robot 200 tends to be disinterested (when the pseudo-emotion is close to the predetermined state), the apparatus control device 100 changes the emotion data 121 in the negative direction along the Y axis of the emotion map 300, that is, changes the emotion data 121 so as to become more disinterested. By changing in this manner, it is possible to provide a robot 200 that becomes calm and gradually more and more composed as a result of being left alone, that can express emotion such as becoming disinterested, and that demonstrates emotions similar to a real creature or pet.
With the apparatus control device 100 according to the present embodiment, the amount of change table 125 (
The present disclosure is not limited to the embodiment described above, and various modifications and uses are possible. For example, in the embodiment described above, the quantity of the emotion data that is changed in a step-wise manner is determined using the amount of change table 125 illustrated in
For example, a configuration is possible in which the quantity of the emotion data 121 that is changed in a step-wise manner is changed on the basis of an amount of charge of the battery 252 of the robot 200. In such a case, the amount of change table 125 illustrated in
For example, a configuration is possible in which the quantity of the emotion data 121, that is changed in a step-wise manner, is changed on the basis of the pseudo personality of the robot 200 expressed by the emotion change data 122. In such a case, for example, the amount of change table 125 illustrated in
For example, a configuration is possible in which the quantity of the emotion data 121, that is changed in a step-wise manner, is changed on the basis of whether the battery 252 of the robot 200 is being charged. In such a case, for example, the amount of change table 125 illustrated in
Note that a configuration is possible in which the conditions described above are combined to determine the quantity of the emotion data 121 that is changed in a step-wise manner. For example, a configuration is possible in which, in step S202 of the emotion change processing, the controller 110 references the various amount of change tables 125 illustrated in
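How such a combination of conditions might be turned into a single change amount is sketched below; every concrete value and condition in this sketch is an assumption chosen only for illustration, since the actual contents of the amount of change tables 125 are not reproduced in this text.

```python
def combined_change_amount(leave_alone_minutes, battery_percent, is_charging, personality):
    """Illustrative combination of the conditions described above."""
    amount = 1 if leave_alone_minutes < 60 else 3   # hypothetical base amount from the leave-alone time
    if battery_percent < 20:
        amount += 1                                 # e.g. change faster when the battery is low
    if is_charging:
        amount = max(amount - 1, 0)                 # e.g. change more slowly while charging
    if personality.get("active", 0) > 5:
        amount += 1                                 # e.g. an active personality settles down faster
    return amount

print(combined_change_amount(90, 15, False, {"active": 7}))  # -> 5
```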
In the embodiment described above, when the external stimulus is not detected in the predetermined period, the emotion data 121 is changed in the direction toward the origin when the degree of excitement (the Y value) expressed by the emotion data 121 is a positive value, and is changed in the negative direction along the Y axis when the degree of excitement (the Y value) is not a positive value. However, the directions in which the emotion data 121 is changed in a step-wise manner are not limited thereto. For example, a configuration is possible in which the emotion data 121 is changed in a step-wise manner in a direction in which the degree of relaxation (the X value) becomes negative. Additionally, a configuration is possible in which the emotion data 121 is changed in a random direction every 10 minutes.
In the embodiments described above, a configuration is described in which the apparatus control device 100 is built into the robot 200, but a configuration is possible in which the apparatus control device 100 is not built into the robot 200. For example, a configuration is possible in which, as illustrated in
In the embodiments described above, the apparatus control device 100 is a control device that controls the robot 200. However, the apparatus to be controlled is not limited to the robot 200. Examples of the apparatus to be controlled include a wristwatch, and the like. For example, in the case of an apparatus that controls a wristwatch that is capable of outputting sound and that includes an acceleration sensor and a gyrosensor, impacts or the like applied to the wristwatch and detected by the acceleration sensor and the gyrosensor can be envisioned as the external stimulus. In such a case, the emotion change data 122 and the emotion data 121 are updated in accordance with this external stimulus and, from the point in time at which the user puts the wristwatch on, the sound effect data set in the control content table 123 is adjusted (changed) on the basis of the emotion data 121 and output.
Accordingly, a configuration is possible in which, when the wristwatch is being handled roughly, a sad-like sound effect is emitted when the user puts the wristwatch on, and when the wristwatch is being handled with care, a pleased sound effect is emitted when the user puts the wristwatch on. Furthermore, by executing the same emotion change processing in such a wristwatch, the pseudo-emotion can be changed when the user leaves the wristwatch alone and does not use the wristwatch for an extended amount of time.
Thus, the apparatus control device 100 is not limited to a robot and can be applied to various apparatuses, and can provide the applied apparatus with pseudo-emotions, a personality, and the like. Furthermore, the apparatus control device 100 can be applied to various apparatuses to cause a user to feel as if they are pseudo-raising that apparatus.
In the embodiments described above, a description is given in which the action programs executed by the CPU of the controller 110 are stored in advance in the ROM or the like of the storage 120. However, the present disclosure is not limited thereto, and a configuration is possible in which the action programs for executing the various processings described above are installed on an existing general-purpose computer or the like, thereby causing that computer to function as a device corresponding to the apparatus control device 100 according to the embodiments described above.
Any method can be used to provide such programs. For example, the programs may be stored and distributed on a non-transitory computer-readable recording medium (flexible disc, Compact Disc (CD)-ROM, Digital Versatile Disc (DVD)-ROM, Magneto Optical (MO) disc, memory card, USB memory, or the like), or may be provided by storing the programs in a storage on a network such as the internet, and causing these programs to be downloaded.
Additionally, in cases in which the processings described above are realized by being divided between an operating system (OS) and an application/program, or are realized by cooperation between an OS and an application/program, it is possible to store only the portion of the application/program on the non-transitory recording medium or in the storage. Additionally, the programs can be piggybacked on carrier waves and distributed via a network. For example, the programs may be posted to a bulletin board system (BBS) on a network, and distributed via the network. Moreover, a configuration is possible in which the processings described above are executed by starting these programs and, under the control of the operating system (OS), executing the programs in the same manner as other applications/programs.
The foregoing describes some example embodiments for explanatory purposes. Although the foregoing discussion has presented specific embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. This detailed description, therefore, is not to be taken in a limiting sense, and the scope of the invention is defined only by the included claims, along with the full range of equivalents to which such claims are entitled.