This application claims the benefit of Japanese Patent Application No. 2021-207281, filed on Dec. 21, 2021, the entire disclosure of which is incorporated by reference herein.
The present disclosure relates generally to a robot.
Various robots have conventionally been developed, and in recent years development has progressed not only on industrial robots but also on consumer robots, such as communication robots capable of communicating with a person. Communication robots have been developed that employ, as their expression method, body movements, voices, and expressions of the eyes resembling those of a person or an animal.
For example, Unexamined Japanese Patent Application Publication No. 2002-307354 discloses a humanoid or animal-like electronic robot that is provided with a face display animation changing the facial expression for each of the modes "delight", "anger", "sad", and "pleasure", and that further combines sounds and motions for each mode. In addition, Unexamined Japanese Patent Application Publication No. 2020-137687 discloses a robot device that includes a light emitting-type emotional expression portion capable of rapidly or slowly changing the color of the white of the eye from one color to another.
One aspect of a robot according to the present disclosure includes an umbrella portion capable of performing a rotational movement and an opening/closing movement of an umbrella; and a processor. The processor acquires emotion data representing a pseudo emotion in accordance with an external stimulus, and controls, based on the emotion data, at least one of the rotational movement and the opening/closing movement of the umbrella of the umbrella portion.
A more complete understanding of this application can be obtained when the following detailed description is considered in conjunction with the following drawings, in which:
An embodiment of the present disclosure is described below with reference to the drawings. Note that, in the drawings, the same or equivalent constituent elements are designated by the same reference numerals.
In addition, a shaft 36 that extends upward and a slider 37 that is formed in a disk shape and has, at its center, a hole through which the slider 37 is fitted onto the shaft 36 are disposed on the motor frame 28. The slider 37 fitted onto the shaft 36 is movable in the vertical direction along the shaft 36 and rotatable about the shaft 36. On the circumferential surface of the slider 37, a cam groove 37a into which the lever shaft 34 is fitted is formed. The lever shaft 34 slides within the cam groove 37a in association with the rotation of the wheel 30. In addition, one end of each of the fin levers 26 is attached to the slider 37 while being axially supported in a rotatable manner. On the other hand, the other end of each of the fin levers 26 is attached to a lower portion of a corresponding one of the fins 24 while being axially supported in a rotatable manner. Upper-side end portions of the fins 24 are attached, while being axially supported in a rotatable manner, to a disk-shaped upper block 38 that is attached to an upper-side end portion of the shaft 36. The upper block 38, as with the slider 37, is rotatable about the shaft 36. Therefore, the fins 24 are configured to be rotatable in conjunction with the slider 37. All six pairs of a fin 24 and a fin lever 26 are attached in the same manner.
The lever shaft 34, which slides within the cam groove 37a, moves the slider 37 in the vertical direction along the shaft 36 or in the horizontal direction about the shaft 36 in association with the rotation of the wheel 30. The fin levers 26 move in the vertical direction in association with the vertical movement of the slider 37 and, in association therewith, the fins 24 also rotationally move in the vertical direction about the upper block 38, which enables the umbrella to be opened and closed. The structure described above enables rotation of the servo motor 29 to move the fins 24 in the rotational direction and the direction in which the fins 24 are opened.
Next, a movement in which the robot 100 rotates the umbrella and a movement in which the robot 100 opens and closes the umbrella are described.
First, the rotational movement is described. When the umbrella is to be rotated right, the servo motor 29 is rotated from an initial position in the initial state illustrated in
Next, the movement in which the umbrella is opened is described. When the umbrella is to be opened, the servo motor 29 is rotated from the initial position in the initial state illustrated in
Next, a functional configuration of the robot 100 is described. The robot 100 includes a controller 110, a storage 120, a communicator 130, a sensor group 210, a driver 220, an outputter 230, and an operation acceptor 240, as illustrated in
The controller 110 is configured by, for example, a central processing unit (CPU) or the like and executes various types of processing, which are described later, by executing programs stored in the storage 120. Note that, since the controller 110 has a multi-thread capability of performing a plurality of pieces of processing in parallel with one another, the controller 110 is capable of executing the various types of processing, which are described later, in parallel with one another. The controller 110 also has a clock function and a timer function and is capable of measuring dates, times, and the like.
The storage 120 includes a read only memory (ROM), a flash memory, a random access memory (RAM), and the like. In the ROM, programs that the CPU of the controller 110 executes and data that are required in advance for the CPU to execute the programs are stored. A flash memory is a writable non-volatile memory, and data that need to be saved after power is cut off are stored in the flash memory. In the RAM, data that are generated or changed during execution of programs are stored. The storage 120 stores, for example, emotion data 121, emotion change data 122, and a growth table 123, which are described later.
The communicator 130 includes a communication module compatible with a wireless local area network (LAN), Bluetooth (registered trademark), or the like and performs data communication with an external device, such as a smartphone.
The sensor group 210 includes the afore-described pyroelectric sensor 20, triaxial acceleration sensor 16, illuminance sensor 17, and microphone 18. The controller 110 acquires detected values detected by various types of sensors that the sensor group 210 includes as external stimulus data that represent external stimuli acting on the robot 100. Note that the sensor group 210 may include a sensor other than the pyroelectric sensor 20, the triaxial acceleration sensor 16, the illuminance sensor 17, and the microphone 18. Increasing the types of sensors that the sensor group 210 includes enables the types of external stimuli that the controller 110 can acquire to be increased. For example, the sensor group 210 may include an image acquirer, such as a charge-coupled device (CCD) image sensor. In this case, the controller 110 becomes capable of, by recognizing an image that the image acquirer acquired, determining who a person present around the robot 100 is (for example, an owner of the robot 100, a person who always takes care of the robot 100, or a stranger).
The pyroelectric sensor 20 is capable of detecting that a person has moved and, for example, is capable of detecting that a person is coming close to the robot 100. The pyroelectric sensor 20 is configured by an infrared sensor that detects infrared rays emitted from an object, such as a human body, using pyroelectric effect. The controller 110 detects how close the object has come to the robot 100, based on a detected value from the pyroelectric sensor 20.
The triaxial acceleration sensor 16 detects acceleration in three axial directions that are composed of the front-rear direction, the width (right-left) direction, and the vertical direction of the robot 100. Since the triaxial acceleration sensor 16 detects gravitational acceleration when the robot 100 is standing still, the controller 110 is capable of detecting a current attitude of the robot 100, based on gravitational acceleration that the triaxial acceleration sensor 16 detected. In addition, when, for example, a user lifts up, lightly rubs, or slaps the robot 100, the triaxial acceleration sensor 16 detects acceleration associated with movement of the robot 100 in addition to gravitational acceleration. Therefore, the controller 110 is capable of detecting a change in the attitude of the robot 100, by removing a gravitational acceleration component from a detected value that the triaxial acceleration sensor 16 detected. Based on a detected value of the triaxial acceleration sensor 16, external stimuli, such as the robot 100 being rubbed and slapped by the user, can be detected. Note that the controller 110 may detect such external stimuli using a sensor other than the triaxial acceleration sensor 16, such as a touch sensor.
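As a purely illustrative sketch of this idea (the filter coefficient, thresholds, and function names below are hypothetical assumptions, not values or interfaces of the embodiment), separating a gravitational acceleration component from a detected value and classifying the residual as an external stimulus could look as follows.

```python
# Illustrative sketch only: estimating an external stimulus from a
# three-axis acceleration reading by removing a running gravity estimate.
# ALPHA and the thresholds are hypothetical tuning values.
ALPHA = 0.9            # low-pass filter coefficient for the gravity estimate
SLAP_THRESHOLD = 8.0   # residual magnitude treated as a "slapped" stimulus (m/s^2)
RUB_THRESHOLD = 1.0    # residual magnitude treated as a "rubbed" stimulus (m/s^2)

gravity = [0.0, 0.0, 9.8]  # running estimate of the gravitational component

def detect_stimulus(accel):
    """accel: (x, y, z) reading from the triaxial acceleration sensor."""
    global gravity
    # Low-pass filter the raw reading to track the slowly changing gravity component.
    gravity = [ALPHA * g + (1.0 - ALPHA) * a for g, a in zip(gravity, accel)]
    # The remaining high-frequency component reflects motion applied by the user.
    residual = [a - g for a, g in zip(accel, gravity)]
    magnitude = sum(r * r for r in residual) ** 0.5
    if magnitude > SLAP_THRESHOLD:
        return "slapped"
    if magnitude > RUB_THRESHOLD:
        return "rubbed"
    return None
```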
The illuminance sensor 17 senses illuminance around the robot 100 and is also capable of recognizing the level of illuminance. Because the illuminance sensor 17 constantly senses illuminance, the controller 110 is capable of determining whether the surroundings have become bright gradually or rapidly and is thus also capable of detecting, for example, that the user has turned on a light at night.
The microphone 18 detects sound around the robot 100. The controller 110 is capable of detecting, for example, that the user is speaking to the robot 100, that the user is clapping his/her hands, or the like, based on components of the sound that the microphone 18 detected.
The driver 220 includes the servo motor 29 as a movable portion to express motion of the robot 100 and is driven by the controller 110. The controller 110 controlling the driver 220 enables the robot 100 to express movements, such as opening the umbrella and rotating the umbrella. Movement control data for performing such movements are recorded in the storage 120, and the movement of the robot 100 is controlled based on detected external stimuli, a growth value, which is described later, and the like.
Note that the above-described configuration is only an example of the driver 220, and the driver 220 may include wheels, crawlers, hands and feet, or the like and the robot 100 may be capable of moving in an arbitrary direction or arbitrarily moving the body thereof.
The outputter 230 includes the speaker 19, and the controller 110 inputting sound data into the outputter 230 causes a sound to be output from the speaker 19. For example, the controller 110 inputting voice data of the robot 100 into the outputter 230 causes the robot 100 to emit a pseudo voice. The voice data are also recorded in the storage 120, and a type of voice is selected based on a detected external stimulus, a growth value, which is described later, and the like.
The outputter 230 also includes the LEDs 27 and causes the LEDs 27 to emit light, based on a detected external stimulus, a growth value, which is described later, and the like. Since each of the LEDs 27 produces three primary colors (red, green, and blue) in 256 gradations for each color, the LEDs 27 are capable of representing 16 million or more colors. As with movements, any lighting patterns, such as rapid on/off flashing, slow on/off flashing, and lighting gradually changing color, can be achieved by control. In addition, the robot 100 may include, instead of the LEDs 27, a display, such as a liquid crystal display, as the outputter 230 and may display an image based on a detected external stimulus, a growth value, which is described later, and the like on the display.
The operation acceptor 240 includes, for example, an operation button and a volume knob. The operation acceptor 240 is an interface for accepting an operation performed by the user, such as power on or power off and adjustment of the volume of output sound. Note that, in order to increase a feeling as if the robot 100 were a living thing, the robot 100 does not have to include, as the operation acceptor 240, any component, such as an operation button and a volume knob, except a power switch, which is disposed on the inner side of the exterior. Even in this case, an operation of the robot 100, such as adjustment of the volume of output sound, can be performed using an external device, such as a smartphone, that is connected to the robot 100 via the communicator 130.
Next, among the data to be stored in the storage 120, the emotion data 121, the emotion change data 122, the growth table 123, a movement detail table 124, and growth days data 125, which are data required for determining a movement based on the growth value and the like, are described in sequence.
The emotion data 121 are data for causing the robot 100 to have a pseudo emotion and are data (X, Y) representing coordinates on an emotion map 300. The emotion map 300 indicates a distribution of emotion and, as illustrated in
Note that, although, in
In the present embodiment, the size of the emotion map 300 as an initial value is defined by the maximum value of 100 and the minimum value of −100 with respect to both the X value and the Y value, as illustrated by a frame 301 in
The emotion change data 122 are data for setting the amount of change that increases or decreases each of the X value and the Y value of the emotion data 121. In the present embodiment, the emotion change data 122 include, as emotion change data corresponding to the X value of the emotion data 121, DXP that increases the X value and DXM that decreases the X value, and, as emotion change data corresponding to the Y value, DYP that increases the Y value and DYM that decreases the Y value. That is, the emotion change data 122 are composed of the following four variables and are data indicating a degree to which pseudo emotion of the robot 100 is changed.
DXP: Easiness to feel secure (easiness for the X value to change in the positive direction on the emotion map)
DXM: Easiness to feel anxious (easiness for the X value to change in the negative direction on the emotion map)
DYP: Easiness to feel excited (easiness for the Y value to change in the positive direction on the emotion map)
DYM: Easiness to feel apathetic (easiness for the Y value to change in the negative direction on the emotion map)
In the present embodiment, initial values of all of these variables are, as an example, set to 10, and the values of the variables are assumed to increase up to 20 by performing training processing of emotion change data in movement control processing, which is described later. Since the training processing causes the emotion change data 122, that is, degrees to which emotion changes, to change, the robot 100 is to have various characters depending on a manner in which the user deals with the robot 100. In other words, each of the characters of the robot 100 is to be differently formed depending on a manner in which the user deals with the robot 100.
Thus, in the present embodiment, each piece of character data (character value) is derived by subtracting 10 from a corresponding piece of the emotion change data 122. That is, a value obtained by subtracting 10 from DXP, which indicates an easiness to feel secure, is defined as a character value “happy”, a value obtained by subtracting 10 from DXM, which indicates an easiness to feel anxious, is defined as a character value “shy”, a value obtained by subtracting 10 from DYP, which indicates an easiness to feel excited, is defined as a character value “active”, and a value obtained by subtracting 10 from DYM, which indicates an easiness to feel apathetic, is defined as a character value “wanted”. Because of this configuration, for example, a character value radar chart 400 can be generated by plotting the character value “happy”, the character value “active”, the character value “shy”, and the character value “wanted” on an axis 411, an axis 412, an axis 413, and an axis 414, respectively, as illustrated in
Since an initial value of each character value is 0, the initial character of the robot 100 is represented by an origin 410 of the character value radar chart 400. As the robot 100 grows, each character value changes up to a limit of 10 due to external stimuli and the like (a manner in which the user deals with the robot 100) detected by the sensor group 210. When the four character values change in a range from 0 to 10 as in the present embodiment, 11 to the power of four, that is, 14641, types of characters can be expressed.
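As a minimal illustrative sketch of this relationship (the dictionary-based representation is an assumption chosen for illustration, not the data format of the embodiment), the character values could be derived from the emotion change data 122 as follows.

```python
# Illustrative sketch: emotion change data and derived character values.
# The initial value of each variable is 10; training can raise it up to 20.
emotion_change = {"DXP": 10, "DXM": 10, "DYP": 10, "DYM": 10}

# Each character value is the corresponding emotion change datum minus 10,
# so it ranges from 0 (initial state) to 10 (fully grown).
def character_values(change):
    return {
        "happy":  change["DXP"] - 10,   # easiness to feel secure
        "shy":    change["DXM"] - 10,   # easiness to feel anxious
        "active": change["DYP"] - 10,   # easiness to feel excited
        "wanted": change["DYM"] - 10,   # easiness to feel apathetic
    }

print(character_values(emotion_change))  # all 0 at the initial state
```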
In the present embodiment, the largest value among the four character values is used as growth degree data (growth value) that indicate a degree of pseudo growth of the robot 100. The controller 110 performs control in such a way that variations are produced in movement details of the robot 100 as the robot 100 grows in a pseudo manner (as the growth value increases). Data that the controller 110 uses for this purpose is the growth table 123.
In the growth table 123, the type of each of movements that the robot 100 performs in accordance with a movement trigger, such as an external stimulus detected by the sensor group 210, and a probability that the movement is selected depending on the growth value (hereinafter, referred to as a “movement selection probability”) are recorded. Note that the movement trigger is information of an external stimulus or the like that serves as an event causing the robot 100 to perform some movement. The movement selection probabilities are set such that, while the growth value is small, a basic movement that is set in accordance with a movement trigger is selected regardless of a character value and, when the growth value increases, a character movement that is set in accordance with a character value is selected. The movement selection probabilities are also set such that, as the growth value increases, the types of selectable basic movements increase.
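As an illustrative sketch of how a movement might be selected from such a table (the table contents, trigger names, and probabilities below are hypothetical examples, not values recorded in the growth table 123), the selection could proceed as follows.

```python
import random

# Hypothetical excerpt of a growth table: for one movement trigger, each row
# gives (movement type, selection probability) for a given growth value level.
growth_table = {
    "rubbed": {
        0:  [("basic_0", 1.0)],
        5:  [("basic_0", 0.5), ("basic_1", 0.3), ("character", 0.2)],
        10: [("basic_0", 0.2), ("basic_1", 0.3), ("character", 0.5)],
    }
}

def growth_value(character):
    # The growth value is the largest of the four character values.
    return max(character.values())

def select_movement(trigger, character):
    table = growth_table[trigger]
    # Use the nearest defined growth level not exceeding the current growth value.
    level = max(k for k in table if k <= growth_value(character))
    moves, weights = zip(*table[level])
    return random.choices(moves, weights=weights)[0]

# Example: at the initial state all character values are 0, so only the
# basic movement is selectable for the "rubbed" trigger.
initial_character = {"happy": 0, "shy": 0, "active": 0, "wanted": 0}
print(select_movement("rubbed", initial_character))
```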
The movement detail table 124 is a table in which a specific movement detail of each movement type defined in the growth table 123 is recorded. Note, however, that, with regard to character movements, a movement detail is defined for each type of character. Note that the movement detail table 124 is not essential data. For example, the movement detail table 124 is not required when the growth table 123 is configured in a form in which specific movement details are directly recorded in a movement type column in the growth table 123.
The initial value of the growth days data 125 is 1, and the growth days data 125 are incremented by 1 each time one day elapses. The growth days data 125 enable the pseudo growth days (the number of days since the pseudo birth) of the robot 100 to be represented.
Next, referring to a flowchart illustrated in
First, the controller 110 initializes various types of data, such as the emotion data 121, the emotion change data 122, and the growth days data 125 (step S101).
Next, the controller 110 executes processing of acquiring external stimuli from various types of sensors that the sensor group 210 includes (step S102).
Next, the controller 110 determines whether or not an external stimulus detected by the sensor group 210 has been applied (step S103).
When an external stimulus has been applied (step S103; Yes), the controller 110 acquires the emotion change data 122 that are to be added to or subtracted from the emotion data 121 depending on the external stimulus acquired from the various types of sensors (step S104). Specifically, for example, since the robot 100 feels pseudo security when it detects, by the triaxial acceleration sensor 16, being rubbed as an external stimulus, the controller 110 acquires DXP as the emotion change data 122 to be added to the X value of the emotion data 121.
Next, the controller 110 sets the emotion data 121 in accordance with the emotion change data 122 acquired in step S104 (step S105). This processing causes the controller 110 to acquire the emotion data 121 that are updated in accordance with an external stimulus, and, because of this configuration, the controller 110 constitutes an emotion data acquirer. Specifically, when, for example, DXP is acquired as the emotion change data 122 in step S104, the controller 110 adds the DXP, which is the emotion change data 122, to the X value of the emotion data 121. Note, however, that, when addition of the emotion change data 122 causes the value (the X value or the Y value) of the emotion data 121 to exceed the maximum value of the emotion map 300, the value of the emotion data 121 is set to the maximum value of the emotion map 300. In addition, when subtraction of the emotion change data 122 causes the value of the emotion data 121 to be less than the minimum value of the emotion map 300, the value of the emotion data 121 is set to the minimum value of the emotion map 300.
Although what type of emotion change data 122 are acquired in steps S104 and S105 and how they are used to set the emotion data 121 can be set arbitrarily for each external stimulus, examples are described below (a combined illustrative sketch follows the list). Note that, since maximum and minimum values of the X value and the Y value of the emotion data 121 are defined by the size of the emotion map 300, the X value or the Y value is set to the maximum value when the value calculated in accordance with the following procedures exceeds the maximum value and is set to the minimum value when the value falls below the minimum value.
A light is turned on (the robot 100 becomes calm): X=X+DXP and Y=Y−DYM.
(This external stimulus can be detected by the illuminance sensor 17)
The user comes close to the robot 100 (the robot 100 is pleased): X=X+DXP and Y=Y+DYP.
The user moves away from the robot 100 (the robot 100 is sad): X=X−DXM and Y=Y−DYM.
(These external stimuli can be detected by the pyroelectric sensor 20)
The robot 100 is gently rubbed (the robot 100 is pleased): X=X+DXP and Y=Y+DYP.
The robot 100 is slapped (the robot 100 is sad): X=X−DXM and Y=Y−DYM.
The robot 100 is lifted up with the umbrella pointing upward (the robot 100 is excited): Y=Y+DYP.
The robot 100 is suspended in midair with the umbrella pointing downward (the robot 100 becomes apathetic): Y=Y−DYM.
(These external stimuli can be detected by the triaxial acceleration sensor 16)
The robot 100 is called in a gentle voice (the robot 100 becomes calm): X=X+DXP and Y=Y−DYM.
The robot 100 is yelled at loudly (the robot 100 is irritated): X=X−DXM and Y=Y+DYP.
(These external stimuli can be detected by the microphone 18)
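Combining the update rules listed above with the clamping to the emotion map described for steps S104 and S105, a minimal, purely illustrative sketch (the stimulus names and data structures are assumptions for illustration, not part of the embodiment) could look as follows.

```python
# Illustrative sketch: applying the update rules listed above and clamping
# the emotion data to the current size of the emotion map.
EMOTION_MAP_MAX, EMOTION_MAP_MIN = 100, -100  # initial map size

# (x_key, x_sign, y_key, y_sign) per stimulus; None means "no change to that axis".
STIMULUS_RULES = {
    "light_on":      ("DXP", +1, "DYM", -1),
    "user_close":    ("DXP", +1, "DYP", +1),
    "user_away":     ("DXM", -1, "DYM", -1),
    "rubbed":        ("DXP", +1, "DYP", +1),
    "slapped":       ("DXM", -1, "DYM", -1),
    "lifted_up":     (None,  0, "DYP", +1),
    "held_downward": (None,  0, "DYM", -1),
    "gentle_voice":  ("DXP", +1, "DYM", -1),
    "yelled_at":     ("DXM", -1, "DYP", +1),
}

def clamp(v):
    return max(EMOTION_MAP_MIN, min(EMOTION_MAP_MAX, v))

def apply_stimulus(emotion, change, stimulus):
    """emotion: dict with keys 'X' and 'Y'; change: the emotion change data."""
    x_key, x_sign, y_key, y_sign = STIMULUS_RULES[stimulus]
    if x_key is not None:
        emotion["X"] = clamp(emotion["X"] + x_sign * change[x_key])
    if y_key is not None:
        emotion["Y"] = clamp(emotion["Y"] + y_sign * change[y_key])
    return emotion

# Example: being rubbed raises both X and Y by DXP and DYP, respectively.
print(apply_stimulus({"X": 0, "Y": 0}, {"DXP": 10, "DXM": 10, "DYP": 10, "DYM": 10}, "rubbed"))
```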
The controller 110 executes a movement corresponding to an external stimulus (step S106) and proceeds to step S109.
In contrast, when it is determined that no external stimulus has been applied in step S103 (step S103; No), the controller 110 determines whether or not the robot 100 performs a spontaneous movement, such as a breathing movement (step S107). Although any method can be used as a method for determining whether or not the robot 100 performs a spontaneous movement, it is assumed in the present embodiment that the determination in step S107 results in Yes at each breathing period (for example, every 2 seconds) and the breathing movement is performed.
The breathing movement is, for example, a movement in which the umbrella in the umbrella portion 2 is slowly opened and subsequently returned to the original position. At the same time, the LEDs 27 may be configured to slightly emit light or the speaker 19 may be configured to emit a breathing sound. Performing a spontaneous movement as described above when no external stimulus has been applied enables the reality of the robot 100 as a living thing to be better expressed and the user to feel an attachment to the robot 100.
When the controller 110 determines that the robot 100 performs a spontaneous movement (step S107; Yes), the controller 110 executes the spontaneous movement (for example, a breathing movement) (step S108) and proceeds to step S109.
When the controller 110 determines that the robot 100 does not perform a spontaneous movement (step S107; No), the controller 110 determines whether or not the date has changed, using a clock function (step S109). When the date has not changed (step S109; No), the controller 110 returns to step S102.
When the date has changed (step S109; Yes), the controller 110 determines whether or not the current time is within the first period (step S110). When it is assumed that the first period is a period of, for example, 50 days since the pseudo birth (for example, the first activation by the user after purchase) of the robot 100, the controller 110 determines that the current time is within the first period when the growth days data 125 is less than or equal to 50. When the current time is not within the first period (step S110; No), the controller 110 proceeds to step S112.
When the current time is within the first period (step S110; Yes), the controller 110 performs training of the emotion change data 122 and expands the emotion map (step S111). The training of the emotion change data 122 is, specifically, processing of updating the emotion change data 122 as follows: when, in step S105 on the previous day, the X value of the emotion data 121 was set to the maximum value of the emotion map 300 even once, 1 is added to DXP; when the Y value was set to the maximum value of the emotion map 300 even once, 1 is added to DYP; when the X value was set to the minimum value of the emotion map 300 even once, 1 is added to DXM; and when the Y value was set to the minimum value of the emotion map 300 even once, 1 is added to DYM.
Note, however, that, since, when each value of the emotion change data 122 becomes too large, the amount of change in the emotion data 121 per update becomes too large, each value of the emotion change data 122 is limited to a maximum value that is set to, for example, 20. In addition, although, in the present embodiment, 1 is added to each of the relevant data in the emotion change data 122, the value to be added is not limited to 1. For example, it may be configured such that the number of times that each value of the emotion data 121 is set to the maximum value or the minimum value is counted and, when that number is large, the numerical value to be added to the emotion change data 122 is increased.
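A minimal sketch of this training step, assuming hypothetical flags that record whether each boundary value was reached even once on the previous day, might look as follows.

```python
CHANGE_MAX = 20  # upper limit of each emotion change datum

def train_emotion_change(change, hit_x_max, hit_y_max, hit_x_min, hit_y_min):
    """Add 1 to each emotion change datum whose boundary was reached even once
    on the previous day, without exceeding CHANGE_MAX."""
    if hit_x_max:
        change["DXP"] = min(change["DXP"] + 1, CHANGE_MAX)
    if hit_y_max:
        change["DYP"] = min(change["DYP"] + 1, CHANGE_MAX)
    if hit_x_min:
        change["DXM"] = min(change["DXM"] + 1, CHANGE_MAX)
    if hit_y_min:
        change["DYM"] = min(change["DYM"] + 1, CHANGE_MAX)
    return change
```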
Returning to step S111 in
Next, the controller 110 adds 1 to the growth days data 125 and initializes both the X value and the Y value of the emotion data to 0 (step S112) and returns to step S102.
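Tying the above steps together, a highly simplified and purely hypothetical sketch of the overall flow (the robot object and its method names are illustrative assumptions, not part of the embodiment) could be structured as follows.

```python
import time

FIRST_PERIOD_DAYS = 50   # hypothetical length of the first period
BREATHING_PERIOD = 2.0   # seconds between spontaneous breathing movements

def control_loop(robot):
    robot.initialize_data()                            # step S101
    while True:
        stimulus = robot.acquire_external_stimuli()    # step S102
        if stimulus is not None:                       # step S103: Yes
            robot.update_emotion(stimulus)             # steps S104-S105
            robot.execute_movement(stimulus)           # step S106
        elif robot.breathing_due(BREATHING_PERIOD):    # step S107
            robot.execute_breathing_movement()         # step S108
        if robot.date_changed():                       # step S109
            if robot.growth_days <= FIRST_PERIOD_DAYS: # step S110
                robot.train_emotion_change()           # step S111
                robot.expand_emotion_map()
            robot.growth_days += 1                     # step S112
            robot.reset_emotion_data()
        time.sleep(0.1)  # pacing of the loop; hypothetical
```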
Next, emotional expressions in accordance with the emotion map 300 are described. The emotion of the robot 100 is expressed by an opening/closing movement and a rotational movement of the umbrella, change in color, intensity, and on-off patterns of light by the LEDs 27, an emitted sound, and the like.
When the umbrella is brought to the state in
Returning to
Next, specific examples of emotional expression corresponding to external stimuli are enumerated below. For example, when the room is dark, the illuminance sensor 17 detects that the light is off, and the robot 100 is put into a sleep mode in which motion and light emission of the robot 100 are suspended. When the light is turned on while the robot 100 is in this state, the illuminance sensor 17, which constantly performs sensing, detects a rapid change in the illuminance, and the robot 100 determines that the light has been turned on. Then, the robot 100 lights the LEDs 27 in green for approximately one minute in such a manner as to assert its own presence and repeats a movement of opening the umbrella slightly and slowly and a movement of closing the umbrella. In addition, when the user comes close to the robot 100, the pyroelectric sensor 20 detects a person and, in response to the detection, the robot 100 causes the LEDs 27 to slowly emit orange light and, while repeating the opening/closing movement of the umbrella widely and slowly, emits a sound in such a manner as to seek attention and shows an expression of a desire to be taken care of. In addition, when the user gently touches the umbrella of the robot 100, the triaxial acceleration sensor 16 detects the touch and, in response to the detection, the robot 100, while producing an orange color, emits a sound in such a manner as to seek attention, repeats the opening/closing movement of the umbrella slowly and slightly, and, by including a repetitive rotational movement in the lateral direction, exhibits an expression of pleasure. On this occasion, the user is able to feel an attachment to the robot 100 through the visual sense, the auditory sense, and the tactile sense, including a soft touch delivered to the hand. In addition, since the robot 100 grows up as time elapses, the robot 100 comes to exhibit a different emotional expression depending on the character it has formed, such as always exhibiting a cheerful expression when the user often interacts with the robot 100 and exhibiting a lonely expression when the user rarely does.
Note that the present disclosure is not limited to the above-described embodiment and can be subjected to various modifications and applications, and details of emotions and expression methods are only examples. In addition, although the movements are assumed to include a movement of opening the umbrella, a movement of closing the umbrella, and a movement of rotating the umbrella laterally, another movement may be included.
In addition, the configuration of the emotion map 300 and the methods for setting the emotion data 121, the emotion change data 122, the character data, the growth value, and the like in the above-described embodiment are only examples. For example, as a simpler configuration, a numerical value obtained by dividing the growth days data 125 by a certain number (when the numerical value exceeds 10, the numerical value is always set to 10) may be set as a growth value.
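As an illustrative sketch of such a simplified growth value, assuming a hypothetical divisor:

```python
GROWTH_DIVISOR = 10  # hypothetical constant used to scale the growth days

def simple_growth_value(growth_days):
    # Divide the growth days data by a constant and cap the result at 10.
    return min(growth_days // GROWTH_DIVISOR, 10)
```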
In addition, although, in the above-described embodiment, the controller 110 that controls the robot 100 is incorporated in the robot 100, the controller 110 does not necessarily have to be incorporated in the robot 100.
For example, a control device (not illustrated) including a controller, a storage, and a communicator may be configured as a separate device (for example, a server) from the robot 100. In this variation, the communicator 130 of the robot 100 and the communicator of the control device are configured to be able to transmit and receive data to and from each other. The controller of the control device acquires external stimuli detected by the sensor group 210 and controls the driver 220 and the outputter 230 via the communicator of the control device and the communicator 130 of the robot 100.
Note that, when the control device and the robot 100 are configured as separate devices as described above, the robot 100 may be configured to be controlled by the controller 110 on an as-needed basis. For example, it may be configured such that a simple movement is controlled by the controller 110 and a complex movement is controlled by the controller of the control device via the communicator 130.
In the above-described embodiment, movement programs that the CPU of the controller 110 executes are stored in the ROM and the like in the storage 120 in advance. However, the present disclosure is not limited to the configuration, and an existing general-purpose computer may be configured to function as a device equivalent to the controller 110 and the storage 120 of the robot 100 according to the above-described embodiment, by installing movement programs for causing the robot 100 to execute the above-described various types of processing in the computer.
In addition, it may be configured such that data relating to pseudo emotion, such as the emotion change data 122, the growth table 123, the movement detail table 124, and the growth days data 125, that are stored in the storage 120 can be acquired from an external device and edited by the external device. Specifically, data relating to pseudo emotion are acquired from the robot 100, using an application program installed in an information communication device, such as a smartphone, and are displayed on a display screen of the application. Further, it may be configured such that displayed data are edited by the user and subsequently sent to the robot 100.
Such a configuration enables a user who desires to raise the robot 100 into a robot having a character that the user prefers to check the character of the robot 100 on the screen, set the character of the robot 100 to a character that the user prefers, and cause the robot 100 to perform a movement that the user prefers. Further, when the pseudo growth of the robot 100 has already stopped and the character of the robot 100 has been fixed, it becomes possible to reset the growth value of the robot 100 and raise the robot 100 again.
An arbitrary method can be used as a method for providing such programs, and, for example, the programs may be stored in a non-transitory computer-readable recording medium (a flexible disk, a compact disc (CD)-ROM, a digital versatile disc (DVD)-ROM, a magneto-optical disc (MO), a memory card, a USB memory, or the like) and distributed or may be provided by storing the programs in a storage on a network, such as the Internet, and causing the programs to be downloaded.
In addition, when the above-described processing is to be executed through sharing of processing between an operating system (OS) and an application program or collaboration between the OS and the application program, only the application program may be stored in a non-transitory recording medium or a storage. It is also possible to superimpose a program on a carrier wave and distribute the program via a network. For example, the above-described program may be posted on a bulletin board system (BBS) on the network, and the program may be distributed via the network. The above-described processing may be configured to be able to be executed by starting up and executing the distributed program in a similar manner to other application programs under the control of the OS.
In addition, the controller 110 may be configured not only by an arbitrary processor alone, such as a single processor, multiple processors, or a multi-core processor, but also by a combination of such an arbitrary processor and a processing circuit, such as an application specific integrated circuit (ASIC) or a field-programmable gate array (FPGA).
The foregoing describes some example embodiments for explanatory purposes. Although the foregoing discussion has presented specific embodiments, persons skilled in the art will recognize that changes may be made in form and detail without departing from the broader spirit and scope of the invention. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense. This detailed description, therefore, is not to be taken in a limiting sense, and the scope of the invention is defined only by the included claims, along with the full range of equivalents to which such claims are entitled.