The present technology relates to a moving body and a moving method, and more particularly to a moving body and a moving method capable of moving the moving body while causing the moving body to exhibit interactivity.
Conventionally, there are moving bodies that create an environment map or the like representing a surrounding situation by sensing surrounding persons and the environment, and that move autonomously. Examples of such moving bodies include automobiles, robots, and airplanes.
Conventional moving bodies, however, have been limited to those that focus on supporting the movement and activities of persons, such as moving bodies serving as means of transporting persons and moving bodies supporting activities of persons, such as cleaning.
Moreover, conventional moving bodies are limited to those, like pet-type robots, in which information such as emotion and character is given to the robot itself and that act to give a feeling of familiarity in conjunction with a user's action, such as stroking the robot's head.
The present technology has been made in view of such a situation, and makes it possible to move a moving body while causing the moving body to exhibit interactivity.
A moving body of one aspect of the present technology includes a moving unit that moves while controlling a movement speed and a movement direction, depending on a state of the moving body, a state of a person located around the moving body, and a parameter indicating character or emotion of the moving body.
In one aspect of the present technology, the movement speed and the movement direction are controlled depending on the state of the moving body, the state of the person located around the moving body, and the parameter indicating the character or emotion of the moving body.
<Overview of the Present Technology>
The present technology focuses on changes in the character and emotion of a moving body itself, and moves the moving body while causing it to exhibit interactivity, such as interlocking with an action of an object (a human, a robot, or the like), in consideration of the relationship between the object and the moving body as well as the various relationships surrounding the moving body.
Relationships surrounding the moving body include relationships between moving bodies, relationships among moving bodies within a group including a plurality of moving bodies, relationships between groups each including a plurality of moving bodies, and the like.
<Application of Robot System>
The robot system illustrated in
As illustrated in
The robot system is provided with a control device that recognizes a position of each mobile robot 1 and a position of each person, and controls movement of each mobile robot 1.
As illustrated in A of
Inside the main body unit 11, a computer is provided that communicates with the control device and controls actions of the mobile robot 1 in accordance with a control command transmitted from the control device. Furthermore, inside the main body unit 11, a drive unit is also provided that rotates the entire main body unit 11 by changing the amount of rotation and the direction of an omni-wheel.
The main body unit 11 rotates while covered with the cover 12, whereby movement of the mobile robot 1 in any direction can be implemented as illustrated in B of
Each mobile robot 1 illustrated in
Each mobile robot 1 moves in conjunction with motion of a person. For example, the mobile robot 1 takes an action such as approaching a person, or moving away from a person who is nearby.
Furthermore, each mobile robot 1 moves in conjunction with motion of another mobile robot 1. For example, the mobile robot 1 takes an action such as approaching another mobile robot 1 that is nearby, or performing the same motion as another mobile robot 1 and dancing.
As described above, each mobile robot 1 moves alone, or moves by forming a group with another mobile robot 1.
The robot system illustrated in
As illustrated in
Two areas, an area A11 and an area A12, are set in the movable area A1. For example, the mobile robots 1 are divided into those that move in the area A11 and those that move in the area A12.
The area in which each mobile robot 1 moves is set, for example, depending on the time, or depending on the character of the mobile robot 1, which is described later.
As a result, it is possible to prevent a situation in which the mobile robots 1 are unevenly concentrated in a part of the movable area A1.
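As a minimal sketch of such an area assignment, the following Python function assigns a robot to the area A11 or A12 depending on the time or on the robot's character. The rule itself, the function name, and the character labels (which follow the four characters described later) are assumptions for illustration, since the text does not specify the assignment logic.

```python
# Minimal sketch (assumed logic): assign each mobile robot 1 to area A11 or
# A12 by time of day or by character, so that the robots do not concentrate
# in one part of the movable area A1.
import datetime

def assign_area(robot_id: int, character: str, now: datetime.time) -> str:
    """Return "A11" or "A12" for one mobile robot 1."""
    if now < datetime.time(12, 0):
        # Morning: split by character (labels follow the characters below).
        return "A11" if character in ("CUTE", "WILD") else "A12"
    # Otherwise alternate by robot ID to keep the two areas balanced.
    return "A11" if robot_id % 2 == 0 else "A12"
```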
As illustrated in
The operation mode of the mobile robot 1 is appropriately switched from a certain operation mode to another operation mode as illustrated by the bidirectional arrows. Which operation mode is used is set depending on conditions such as the character of the mobile robot 1, the situation of persons in the room, the situation of other mobile robots 1, and the time.
As illustrated in
Furthermore, when the DUO mode is set, the mobile robot 1 takes an action such as shaking together near another mobile robot 1 in its group, chasing the other mobile robot 1, or pushing against the other mobile robot 1.
When the TRIO mode is set, the mobile robot 1 takes an action such as following the other mobile robots 1 in its group along a gently curving path (wave), or moving as if drawing a circle with the other mobile robots 1 (dance).
When the QUARTET mode is set, the mobile robot 1 takes an action such as racing with the other mobile robots 1 in its group (run), or moving as if drawing a circle with the other mobile robots 1 in a connected state (string).
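A minimal sketch of the four operation modes and their associated group actions is shown below. Only the mode names and the action names (shake, chase, push, wave, dance, run, string) come from the description above; the enum layout and the random selection, which stands in for the condition-based switching described in the text, are illustrative assumptions.

```python
# Minimal sketch (assumed structure): the four operation modes and the group
# actions named above.
import random
from enum import Enum

class Mode(Enum):
    SOLO = 1
    DUO = 2
    TRIO = 3
    QUARTET = 4

GROUP_ACTIONS = {
    Mode.DUO: ["shake", "chase", "push"],
    Mode.TRIO: ["wave", "dance"],
    Mode.QUARTET: ["run", "string"],
}

def pick_group_action(mode: Mode) -> str | None:
    """Return one action for the current mode (None in the SOLO mode)."""
    actions = GROUP_ACTIONS.get(mode)
    return random.choice(actions) if actions else None
```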
As the parameters, for example, a parameter representing sociability to persons, a parameter representing sociability to other mobile robots 1, a parameter representing tiredness, and a parameter representing quickness are prepared.
Curious, active, spoiled, and cowardly characters are defined by combinations of the values of the respective parameters.
The curious (CUTE) character is defined by a combination of 5 for the parameter representing sociability to persons, 1 for the parameter representing sociability to other mobile robots 1, 1 for the parameter representing tiredness, and 3 for the parameter representing quickness.
The mobile robot 1 having the curious character takes an action, for example, approaching a person, following a person, or taking a predetermined motion near a person.
The active (WILD) character is defined by a combination of 3 for the parameter representing sociability to persons, 3 for the parameter representing sociability to other mobile robots 1, 5 for the parameter representing tiredness, and 5 for the parameter representing quickness.
The mobile robot 1 having the active character repeatedly performs an action, for example, approaching another mobile robot 1 and then leaving.
The spoiled (DEPENDENT) character is defined by a combination of 3 for the parameter representing sociability to persons, 5 for the parameter representing sociability to other mobile robots 1, 3 for the parameter representing tiredness, and 1 for the parameter representing quickness.
The mobile robot 1 having the spoiled character takes an action, for example, orbiting around another mobile robot 1 or taking a predetermined motion near the other mobile robot 1.
The cowardly (SHY) character is defined by a combination of 1 for the parameter representing sociability to persons, 3 for the parameter representing sociability to other mobile robots 1, 5 for the parameter representing tiredness, and 3 for the parameter representing quickness.
The mobile robot 1 having the cowardly character takes an action, for example, escaping from a person or gradually approaching a person.
Such a character is set for each mobile robot 1. Note that the types of the parameters that define the character are not limited to the four types illustrated in
It can be said that the parameters are information representing not only the character but also the emotion. That is, the parameters are information representing the character or emotion.
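The four characters and their parameter values given above can be summarized in code. A minimal sketch follows; the class and field names are assumptions, while the numeric values are those stated in the text.

```python
# Parameter values for the four characters, as stated in the text:
# (sociability to persons, sociability to other robots, tiredness, quickness).
from dataclasses import dataclass

@dataclass(frozen=True)
class CharacterParams:
    sociability_to_persons: int
    sociability_to_robots: int
    tiredness: int
    quickness: int

CHARACTERS = {
    "CUTE":      CharacterParams(5, 1, 1, 3),  # curious
    "WILD":      CharacterParams(3, 3, 5, 5),  # active
    "DEPENDENT": CharacterParams(3, 5, 3, 1),  # spoiled
    "SHY":       CharacterParams(1, 3, 5, 3),  # cowardly
}
```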
<Example of Action of Mobile Robot 1>
Each mobile robot 1 takes various actions on the basis of not only the character and emotion of the mobile robot 1 itself defined by the parameters as described above, but also the relationship between the mobile robot 1 and the surrounding situation. The surrounding situation includes an action of a person, the character and emotion of a person, an action of another mobile robot 1, and the character and emotion of the other mobile robot 1.
The actions taken by each mobile robot 1 include the following.
(1) Watching over
(2) Becoming attached
(3) Being vigilant
(4) Reacting to a mark
(5) Being distracted
(6) Gathering together among robots
(1) Watching Over
As illustrated in
In this way, an action of “watching over” is implemented.
(2) Becoming Attached
As illustrated in
In this way, an action of “becoming attached” is implemented.
(3) Being Vigilant
As illustrated in
In this way, an action of “being vigilant” is implemented.
(4) Reacting to a Mark
As illustrated in
As illustrated in
In this way, an action of “reacting to a mark” is implemented.
(5) Being Distracted
As illustrated in
In this way, an action of “being distracted” is implemented.
(6) Gathering Together Among Robots
As illustrated in
In this way, an action of “gathering together among robots” is implemented. The action of “gathering together among robots”, in which the mobile robots 1 all ignore persons at once, is performed, for example, at predetermined time intervals.
As described above, each mobile robot 1 takes various actions to communicate with a person or with another mobile robot 1. The robot system can move each mobile robot 1 while causing it to exhibit interactivity with a person or another mobile robot 1.
<Configuration Example of Robot System>
As illustrated in
The mobile robot 1 includes a moving unit 21, a control unit 22, and a communication unit 23. The moving unit 21, the control unit 22, and the communication unit 23 are provided in the main body unit 11.
The moving unit 21 implements movement of the mobile robot 1 by driving the omni-wheel. The moving unit 21 functions as a moving unit that implements the movement of the mobile robot 1 while controlling the movement speed and the movement direction in accordance with control by the control unit 22. Control of the moving unit 21 is performed in accordance with a control command generated in the control device 31 depending on a state of the mobile robot 1, a state of surrounding persons, and the parameters of the mobile robot 1.
Furthermore, the moving unit 21 also implements an action of the mobile robot 1 such as shaking, by driving a motor, or the like. Details of a configuration of the moving unit 21 will be described later.
The control unit 22 includes a computer. The control unit 22 controls the entire operation of the mobile robot 1 by executing a predetermined program on a CPU, and drives the moving unit 21 in accordance with a control command supplied from the communication unit 23.
The communication unit 23 receives a control command transmitted from the control device 31 and outputs the control command to the control unit 22. The communication unit 23 is also provided inside the computer constituting the control unit 22.
The control device 31 includes a data processing device such as a PC. The control device 31 includes a control unit 41 and a communication unit 42.
The control unit 41 generates a control command on the basis of an imaging result by the camera group 32, a detection result by the sensor group 33, and the like, and outputs the control command to the communication unit 42. In the control unit 41, a control command for each mobile robot 1 is generated.
The communication unit 42 transmits a control command supplied from the control unit 41 to the mobile robot 1.
The camera group 32 includes a plurality of cameras arranged at respective positions in the space where the robot system is installed. The camera group 32 may include RGB cameras or IR cameras. Each camera constituting the camera group 32 captures an image of a predetermined range and transmits the image to the control device 31.
The sensor group 33 includes a plurality of sensors arranged at respective positions in the space where the robot system is installed. As the sensors constituting the sensor group 33, for example, a distance sensor, a human sensor, an illuminance sensor, and a microphone are provided. Each sensor constituting the sensor group 33 transmits information representing a sensing result for a predetermined range to the control device 31.
At least some of the functional units illustrated in
In the control device 31, a parameter management unit 51, a group management unit 52, a robot position recognition unit 53, a movement control unit 54, a person position recognition unit 55, and a person state recognition unit 56 are implemented.
The parameter management unit 51 manages the parameters of each mobile robot 1 and outputs the parameters to the group management unit 52 as appropriate.
The group management unit 52 sets the operation mode of each mobile robot 1 on the basis of the parameters managed by the parameter management unit 51.
Furthermore, the group management unit 52 forms and manages a group including the mobile robots 1 in which an operation mode other than the SOLO mode is set, on the basis of the parameters and the like of each mobile robot 1. For example, the group management unit 52 forms a group including the mobile robots 1 whose degree of similarity of the parameters is greater than a threshold value.
The group management unit 52 outputs, to the movement control unit 54, information regarding the operation mode of each mobile robot 1 and information regarding the group to which the mobile robot 1 in which the operation mode other than the SOLO mode is set belongs.
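The text does not specify how the degree of similarity of the parameters is computed. A minimal sketch follows, assuming an inverse Euclidean distance over the four parameter values of the CharacterParams sketch above and a greedy grouping rule; both the measure and the rule are illustrative assumptions.

```python
# Minimal sketch (assumed measure): parameter similarity as an inverse
# Euclidean distance, and a greedy rule that groups mutually similar robots.
# CharacterParams is the dataclass from the sketch above.
import math

def similarity(a: CharacterParams, b: CharacterParams) -> float:
    """1.0 for identical parameters, approaching 0.0 as they diverge."""
    dist = math.dist(
        (a.sociability_to_persons, a.sociability_to_robots,
         a.tiredness, a.quickness),
        (b.sociability_to_persons, b.sociability_to_robots,
         b.tiredness, b.quickness),
    )
    return 1.0 / (1.0 + dist)

def form_group(robots: dict[int, CharacterParams], threshold: float) -> list[int]:
    """Collect robot IDs whose parameters are mutually similar above threshold."""
    ids = list(robots)
    group = ids[:1]
    for rid in ids[1:]:
        if all(similarity(robots[rid], robots[g]) > threshold for g in group):
            group.append(rid)
    return group
```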
The robot position recognition unit 53 recognizes the position of each mobile robot 1 on the basis of the image transmitted from each camera constituting the camera group 32 or on the basis of the sensing result by each sensor constituting the sensor group 33. The robot position recognition unit 53 outputs information representing the position of each mobile robot 1 to the movement control unit 54.
The movement control unit 54 controls movement of each mobile robot 1 on the basis of the information supplied from the group management unit 52 and the position of the mobile robot 1 recognized by the robot position recognition unit 53. The movement of the mobile robot 1 is appropriately controlled also on the basis of the position of the person recognized by the person position recognition unit 55 and the emotion of the person recognized by the person state recognition unit 56.
For example, in the movement control unit 54, in a case where the mobile robot 1 having the curious character acts in the SOLO mode and there is a person within a predetermined distance centered on a current position of the mobile robot 1, a position near the person is set as a destination. The movement control unit 54 generates a control command giving an instruction to move from the current position to the destination.
Furthermore, in the movement control unit 54, in a case where a mobile robot 1 having the active character acts in the DUO mode and forms a group with another mobile robot 1, a destination of each mobile robot 1 is set. The movement control unit 54 generates a control command for each mobile robot 1 giving an instruction to race by moving from the current position to the destination.
The movement control unit 54 generates a control command for each mobile robot 1 and causes the communication unit 42 to transmit the control command. Furthermore, the movement control unit 54 generates a control command for taking each action as described with reference to
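A minimal sketch of the first example above (a curious robot in the SOLO mode heading for a point near a person within a predetermined distance) follows. The predetermined distance, the standoff offset, and the destination format are assumed values, not those of the actual system.

```python
# Minimal sketch (assumed values): set a destination near a person who is
# within a predetermined distance of the robot's current position.
import math

APPROACH_RADIUS = 3.0  # "predetermined distance" (assumed, meters)
STANDOFF = 0.5         # stop this far short of the person (assumed, meters)

def make_destination(robot_pos, person_pos):
    """Return (x, y) near the person, or None if the person is too far away."""
    dx = person_pos[0] - robot_pos[0]
    dy = person_pos[1] - robot_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0 or dist > APPROACH_RADIUS:
        return None
    scale = max(dist - STANDOFF, 0.0) / dist
    return (robot_pos[0] + dx * scale, robot_pos[1] + dy * scale)
```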
The person position recognition unit 55 recognizes the position of the person on the basis of the image transmitted from each camera constituting the camera group 32 or on the basis of the sensing result by each sensor constituting the sensor group 33. The person position recognition unit 55 outputs information representing the position of the person to the movement control unit 54.
The person state recognition unit 56 recognizes the state of the person on the basis of the image transmitted from each camera constituting the camera group 32 or on the basis of the sensing result by each sensor constituting the sensor group 33.
For example, as the state of the person, an action of the person is recognized, such as the person standing at the same position for a predetermined time or longer, or the person crouching. Such a predetermined action serves as a trigger that starts the approach of the mobile robot 1 to the person.
Furthermore, the character and emotion of a person are recognized as the state of the person on the basis of a pattern of motion of the person, and the like. For example, in a case where a child who is curious and touches many mobile robots 1 is near a mobile robot 1 having the curious character, control is performed so that the mobile robot 1 is brought closer to the child.
In this case, the mobile robot 1 takes an action of approaching a person whose character or emotion has a high degree of similarity to its own.
As described above, the action of the mobile robot 1 may be controlled on the basis of the state of the person including the action and emotion. The person state recognition unit 56 outputs information representing a recognition result of the state of the person to the movement control unit 54.
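A minimal sketch of one such trigger, a person standing at the same position for a predetermined time or longer, is shown below; the dwell time and the position tolerance are assumed values.

```python
# Minimal sketch (assumed values): detect that a person has kept standing at
# the same position for a predetermined time or longer.
import math

DWELL_SECONDS = 5.0        # "predetermined time" (assumed)
POSITION_TOLERANCE = 0.3   # allowed drift while still "standing" (meters)

class DwellTrigger:
    def __init__(self):
        self.anchor = None  # position where the person stopped
        self.since = None   # time at which they stopped

    def update(self, pos: tuple[float, float], t: float) -> bool:
        """Feed a position and a time in seconds; True once the trigger fires."""
        if self.anchor is None or math.dist(pos, self.anchor) > POSITION_TOLERANCE:
            self.anchor, self.since = pos, t
            return False
        return t - self.since >= DWELL_SECONDS
```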
As illustrated in
The robot position recognition unit 53 of the control device 31 detects a blinking pattern of the IR light of each mobile robot 1 by analyzing images imaged by the IR cameras constituting the camera group 32. The robot position recognition unit 53 identifies the position of each mobile robot 1 on the basis of the detected blinking pattern of the IR light.
As illustrated in
An omni-wheel 115 is attached to the motor 114. In the example of
The omni-wheel 115 rotates in a state of being in contact with the inner surface of a spherical cover constituting the main body unit 11. By adjusting the amount of rotation of the omni-wheel 115, the entire main body unit 11 rolls, and the movement speed and the movement direction of the mobile robot 1 are controlled.
A guide roller 116 is provided at a predetermined position on the substrate 112 via a support member. The guide roller 116 is pressed against the inner surface of the cover of the main body unit 11 by, for example, a spring material serving as a support column. As the omni-wheel 115 rotates, the guide roller 116 also rotates in a state of being in contact with the inner surface of the cover.
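As a minimal kinematics sketch, the following converts a desired rolling velocity of the main body unit 11 into tangential speeds of the omni-wheels. Three wheels spaced 120 degrees apart are assumed here; the actual wheel count and placement follow the figure, which is not reproduced in this text.

```python
# Minimal sketch (assumed layout): three omni-wheels 115 spaced 120 degrees
# apart drive the spherical main body unit 11 at a desired planar velocity.
import math

WHEEL_ANGLES = [0.0, 2.0 * math.pi / 3.0, 4.0 * math.pi / 3.0]

def wheel_speeds(vx: float, vy: float) -> list[float]:
    """Tangential speed for each wheel so the sphere rolls at (vx, vy)."""
    return [-math.sin(a) * vx + math.cos(a) * vy for a in WHEEL_ANGLES]
```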
Instead of covering the main body unit 11 having the configuration illustrated in
<Example of Control by Movement Control Unit 54>
The control by the movement control unit 54 is performed depending on the state of the mobile robot 1, the state of a person around the mobile robot 1, and the parameters indicating the character and emotion of the mobile robot 1.
As described above, the state of the person also includes the character and emotion of the person recognized by the person state recognition unit 56 on the basis of the action of the person and the like. In this case, the control by the movement control unit 54 is performed depending on a combination of the character and emotion of the mobile robot 1 represented by the parameters and the character and emotion of the person.
In a case where the degree of similarity between the character and emotion of the mobile robot 1 represented by the parameters and the character and emotion of the person is higher than or equal to a threshold value, control may be performed to bring the mobile robot 1 closer to the person. In this case, the mobile robot 1 moves toward a person whose character and emotion are similar to its own.
In a case where the degree of similarity between the character and emotion of the mobile robot 1 represented by the parameters and the character and emotion of the person is smaller than the threshold value, control may be performed to move the mobile robot 1 away from the person. In this case, the mobile robot 1 moves away from a person whose character and emotion are not similar to its own.
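A minimal sketch of this threshold rule follows, reusing the assumed similarity() measure sketched earlier; the threshold value itself is also an assumption.

```python
# Minimal sketch: approach a person with similar character/emotion, move
# away otherwise. similarity() is the assumed measure from the earlier sketch.
def choose_motion(robot_params, person_params, threshold: float = 0.5) -> str:
    if similarity(robot_params, person_params) >= threshold:
        return "approach"  # similar: bring the robot closer to the person
    return "retreat"       # dissimilar: move the robot away from the person
```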
Furthermore, the control by the movement control unit 54 is performed so that the mobile robots 1 form a group depending on a combination of the state of the mobile robot 1 and a state of another mobile robot 1.
For example, the group is formed by the mobile robots 1 being nearby. Furthermore, the group is formed by the mobile robots 1 whose degree of similarity of the parameters is higher than the threshold value and whose character and emotion are similar.
The mobile robot 1 belonging to a predetermined group moves while being in a state of forming the group together with another mobile robot 1.
While in the state of forming the group, an action such as approaching or leaving a person is performed on a group basis. In this case, the action of a certain mobile robot 1 is controlled on the basis of three factors: the state of the person, the state of the mobile robot 1 itself, and the state of another mobile robot 1 belonging to the same group.
One mobile robot 1 out of the mobile robots 1 belonging to a certain group may be set as a master robot. In this case, the other mobile robots 1 belonging to the same group act in accordance with the master robot.
For a group in which the master robot is set, the parameters of the master robot are used as representative parameters representing the character and emotion of the entire group. The action of each mobile robot 1 belonging to the group is controlled in accordance with the representative parameters.
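A minimal sketch of the master-robot rule is shown below. The data shapes are assumptions; the substance, that the master's parameters stand for the whole group, comes from the description above.

```python
# Minimal sketch: the master robot's parameters serve as the representative
# parameters, and every robot in the group is controlled with them.
# CharacterParams is the dataclass from the earlier sketch.
def representative_params(group: list[int], master_id: int,
                          params: dict[int, CharacterParams]) -> CharacterParams:
    """Return the parameters used to control every robot in the group."""
    assert master_id in group, "the master robot must belong to the group"
    return params[master_id]
```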
<Modifications>
It has been described that the action of the mobile robot 1 is controlled by the control device 31; however, the mobile robot 1 may estimate its own position and move autonomously while determining the surrounding situation.
It has been described that the mobile robot 1 takes an action in conjunction with the action of a person or in conjunction with the action of another mobile robot 1; however, the mobile robot 1 may take the actions described above in conjunction with an action of another type of robot such as a pet-type robot.
The series of processing steps described above can be executed by hardware or by software. In a case where the series of processing steps is executed by software, a program constituting the software is installed from a program recording medium onto a computer incorporated in dedicated hardware, a general-purpose personal computer, or the like.
The program executed by the computer may be a program by which the processing is performed in time series in the order described in the present specification, or a program by which the processing is performed in parallel or at necessary timing, such as when a call is made.
In the present specification, a system means an aggregation of a plurality of constituents (devices, modules (components), and the like), and it does not matter whether or not all of the constituents are in the same cabinet. Thus, a plurality of devices housed in separate cabinets and connected to each other via a network, and a single device in which a plurality of modules is housed in one cabinet, are both systems.
Note that the advantageous effects described in this specification are merely examples; the advantageous effects of the present technology are not limited to them and may include other effects.
Embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible without departing from the gist of the present technology.
For example, the present technology can adopt a configuration of cloud computing in which one function is shared and processed cooperatively by a plurality of devices via a network.
Number | Date | Country | Kind
--- | --- | --- | ---
2019-025717 | Feb 2019 | JP | national

Filing Document | Filing Date | Country | Kind
--- | --- | --- | ---
PCT/JP2020/003601 | 1/31/2020 | WO | 00