The present disclosure relates to a robot.
Japanese Unexamined Patent Application Publication No. 2004-306251 discloses a robot that determines whether or not the robot is in a state of being held or a state of being lifted by a user's arms, and stops the operation of joint mechanisms based on a determination result.
However, this related-art technique calls for further improvement.
In one general aspect, the techniques disclosed here feature a robot including: a spherical body; a frame disposed inside the body; a display that is mounted in the frame and displays at least part of a face of the robot; a set of drive wheels that are mounted in the frame, are in contact with an inner circumferential surface of the body, and cause the body to move by rotating the body; a drive mechanism for a weight that is mounted in the frame and causes the weight to reciprocate in a predetermined direction; an angular speed sensor that detects an angular speed, of the display, around an axis in a horizontal direction perpendicular to a moving direction of the body; a memory that stores a correspondence relationship between a reference pitch angle and a minimum control amount, the minimum control amount being used by the drive mechanism to move the body without stopping; and a control circuit that, when the robot moves to a predetermined target point by rotating the body, detects a statistical value of a pitch angle that changes after an instruction to rotate the body is given to the drive mechanism, the pitch angle being a cumulative value of the detected angular speed, determines the minimum control amount corresponding to the detected statistical value of the pitch angle by referring to the correspondence relationship, and, when the robot arrives at a location a predetermined distance short of the predetermined target point, generates a deceleration control amount for the drive mechanism in a range greater than or equal to the minimum control amount, according to a remaining distance to the predetermined target point, and decelerates the rotation of the body by controlling the drive mechanism in accordance with the deceleration control amount.
These general and specific aspects may be implemented using a system, a method, and a computer program, and any combination of systems, methods, and computer programs.
Thus, for instance, when a user calls a robot to move toward the user, the robot can stop at the location of the user.
Additional benefits and advantages of the disclosed embodiments will become apparent from the specification and drawings. The benefits and/or advantages may be individually obtained by the various embodiments and features of the specification and drawings, which need not all be provided in order to obtain one or more of such benefits and/or advantages.
First, the inventor has been studying a robot that has a spherical body and moves by rotating the body.
The inventor has been studying the function that allows a user of the above-mentioned robot to move the robot to the location of the user by calling the name of the robot.
In order to achieve such a function, the inventor has devised the following specifications.
Specifically, the robot recognizes an instruction to move to the location of the user and identifies that location based on the voice uttered by the user. The robot then sets the identified location of the user as a target point and starts to move toward it. Upon detecting arrival at the target point, the robot stops moving.
However, after the inventor conducted various experiments, it was found that stopping the robot at the target point is not necessarily easy. Because the body of the robot is spherical and rolls easily, stopping the robot at a desired location is difficult. As a consequence, the robot sometimes stopped short of the location of the user, or passed the location of the user due to inertia even after driving was stopped.
Therefore, to keep the robot from stopping short of the location of the user or from stopping only after passing that location, the performance of the robot had to be improved so that the robot stops precisely at the location of the user.
After intensive study, the inventor has found that in order to stop the robot at the location of the user, not only information indicating the speed of movement of the robot and information indicating the distance to the target point, but also information indicating the material of a moving surface are needed.
Meanwhile, the robot itself can identify the information indicating the speed of movement of the robot, for instance, from information indicating the number of revolutions of a motor inside the robot. Similarly, the robot itself can identify the information indicating the distance to the target point based on, for instance, information inputted from a camera built in the robot.
As for the information indicating the material of a moving surface, however, the inventor found a problem: such information cannot be directly identified from the information inputted from the sensors provided inside the robot.
As a result of intensive study, the inventor focused on the fact that when the robot starts to move, the rotation angle of the body of the robot varies according to the material of the moving surface. For instance, when the moving surface is a wood floor, the friction between the robot and the moving surface is relatively low, and thus the angle of rotation of the body of the robot is relatively small. In contrast, when the moving surface is carpet, the friction between the robot and the moving surface is relatively high, and thus the angle of rotation of the body of the robot is relatively large. Consequently, although the information indicating the material of the moving surface cannot be identified directly from the information inputted from the sensors provided inside the robot, it can be identified based on the rotation angle of the body of the robot when the robot starts to move.
Based on the knowledge described above, the inventor has devised an aspect of the invention below.
A robot according to an aspect of the present disclosure includes: a spherical body; a frame disposed inside the body; a display that is mounted in the frame and displays at least part of a face of the robot; a set of drive wheels that are mounted in the frame, are in contact with an inner circumferential surface of the body, and cause the body to move by rotating the body; a drive mechanism for a weight that is mounted in the frame and causes the weight to reciprocate in a predetermined direction; an angular speed sensor that detects an angular speed, of the display, around an axis in a horizontal direction perpendicular to a moving direction of the body; a memory that stores a correspondence relationship between a reference pitch angle and a minimum control amount, the minimum control amount being used by the drive mechanism to move the body without stopping; and a control circuit that, when the robot moves to a predetermined target point by rotating the body, detects a statistical value of a pitch angle that changes after an instruction to rotate the body is given to the drive mechanism, the pitch angle being a cumulative value of the detected angular speed, determines the minimum control amount corresponding to the detected statistical value of the pitch angle by referring to the correspondence relationship, and, when the robot arrives at a location a predetermined distance short of the predetermined target point, generates a deceleration control amount for the drive mechanism in a range greater than or equal to the minimum control amount, according to a remaining distance to the predetermined target point, and decelerates the rotation of the body by controlling the drive mechanism in accordance with the deceleration control amount.
According to this aspect, the angular speed sensor detects an angular speed around an axis in a horizontal direction perpendicular to the moving direction of the body, so that when the robot moves to a predetermined target point by rotating the body, a statistical value of the pitch angle, which changes within a predetermined time after an instruction to rotate the body is given to the drive mechanism, is detected.
Thus, the minimum control amount corresponding to the detected statistical value of the pitch angle is determined, and when the robot arrives at a location a predetermined distance short of the target point, a deceleration control amount for the drive mechanism is generated, according to the remaining distance to the target point, in a range greater than or equal to the minimum control amount, so that the rotation of the body is decelerated by controlling the drive mechanism in accordance with the deceleration control amount.
Thus, the robot can stop at the location of the user in consideration of the material of a moving surface based on the rotation angle of the body of the robot at the start of movement of the robot without stopping short of the location of the user or stopping after passing the location of the user.
In other words, the robot decelerates in accordance with the deceleration control amount within a range greater than or equal to the minimum control amount, and thus stopping of the robot short of the location of the user can be prevented. Also, near the predetermined target point, the robot decelerates in accordance with a deceleration control amount in the vicinity of the minimum control amount, and thus rolling of the robot due to inertia after an instruction to stop rotation of the body is given can be avoided. Therefore, when an instruction to stop the rotation of the body is given, the robot can be stopped at that timing.
Also, in the aspect, the reference pitch angle may include a first reference pitch angle and a second reference pitch angle. Before the robot starts to move, the control circuit may detect a maximum pitch angle as the statistical value of the pitch angle and may determine that the minimum control amount is a first control amount corresponding to the first reference pitch angle that corresponds to the maximum pitch angle. After the robot starts to move, the control circuit may detect an average pitch angle as the statistical value of the pitch angle, may determine that the minimum control amount is a second control amount corresponding to the second reference pitch angle that corresponds to the average pitch angle, and may generate the deceleration control amount in a range greater than or equal to the minimum control amount determined before the robot arrives at the location the predetermined distance short of the predetermined target point.
In the aspect, before the robot starts to move, the first control amount corresponding to the first reference pitch angle that corresponds to the detected maximum pitch angle is determined to be the minimum control amount, and after the robot starts to move, the second control amount corresponding to the second reference pitch angle that corresponds to the detected average pitch angle is determined to be the minimum control amount. Therefore, even when the floor surface changes to one of a different material during movement, the robot can be stopped at the target point.
Also, in the aspect, the control circuit may decelerate the rotation of the body by decreasing the deceleration control amount by S-curve control.
In the aspect, the rotation of the body is decelerated by S-curve control, and thus the robot can be stopped at a predetermined target point without wobbling.
Also, in the aspect, when movement of the robot is started by rotating the body, the control circuit may accelerate the rotation of the body by increasing an acceleration control amount for accelerating the rotation of the body by trapezoidal control until a rotational speed of the body reaches a predetermined speed.
In the aspect, when the robot starts to move, the rotation of the body is accelerated by trapezoidal control until the rotational speed of the body reaches a predetermined speed, and thus the movement time of the robot to a predetermined target point can be shortened.
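As an illustrative sketch (not part of the disclosure), a trapezoidal acceleration profile can be expressed as a control amount that rises linearly for a set ramp time and then holds at its maximum; the function name and parameters here are hypothetical.

```python
def trapezoidal_accel(t, ramp_time, max_control):
    """Control amount at time t (s) during trapezoidal acceleration.

    Rises linearly for ramp_time seconds, then holds max_control,
    corresponding to the flat top of the trapezoid.
    """
    if t < ramp_time:
        return max_control * t / ramp_time
    return max_control
```

With a 2 s ramp to 800 Hz, the command grows linearly (400 Hz at 1 s) and then stays at 800 Hz until deceleration begins.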
Also, in the aspect, after the rotational speed of the body reaches the predetermined speed, the control circuit may maintain the rotational speed of the body at the predetermined speed until the robot arrives at the location the predetermined distance short of the predetermined target point.
In the aspect, after the rotational speed of the body reaches the predetermined speed, the rotational speed is maintained at the predetermined speed until the robot arrives at the location the predetermined distance short of the predetermined target point. This prevents the rotational speed of the body from exceeding the predetermined speed and thus from increasing excessively.
Also, in the aspect, the robot may further include: a camera included in the frame; and a microphone included in the frame. The memory may store reference image data for recognizing a person and reference voice data for recognizing voice, and when the control circuit determines, based on voice data inputted from the microphone and the reference voice data, that a predetermined person has uttered predetermined words, and recognizes the predetermined person based on image data inputted from the camera and the reference image data, the control circuit may set a location of the predetermined person as the predetermined target point.
In the aspect, it is determined, based on the voice data inputted from the microphone and the reference voice data, that a predetermined person has uttered predetermined words, and when the predetermined person is recognized based on the image data inputted from the camera and the reference image data, the location of the predetermined person is set as the predetermined target point. Thus, for instance, even when multiple persons are present around the robot, the robot can be stopped at the location of the person who uttered the predetermined words.
Also, in the aspect, the control circuit may generate the deceleration control amount using the calculation expression below:

(SIN(3*π/2 − π/L*d) + 1) * (Max − min) / 2 + min

where, in the calculation expression, d indicates the distance (m) from the location of the robot to the predetermined target point, Max indicates the control amount (Hz) at the time the control circuit starts to control the drive mechanism in accordance with the deceleration control amount, min indicates the minimum control amount, and L indicates the predetermined distance from the target point.
In the aspect, the deceleration control amount is generated using this calculation expression, and thus the robot can be moved smoothly to a predetermined target point by S-curve control and can be stopped at the predetermined target point accurately.
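The calculation expression above can be transcribed directly into code. The following is a minimal sketch; the function name and the sample parameter values (800 Hz starting amount, 400 Hz minimum, 1 m deceleration distance) are assumptions for illustration, not values taken from the disclosure.

```python
import math

def deceleration_control_amount(d, L, max_hz, min_hz):
    """Evaluate (SIN(3*pi/2 - pi/L*d) + 1) * (Max - min)/2 + min.

    d:      remaining distance (m) to the target point, 0 <= d <= L
    L:      predetermined distance (m) short of the target at which
            deceleration starts
    max_hz: control amount (Hz) when deceleration control begins (Max)
    min_hz: minimum control amount (Hz) for the current floor surface (min)
    """
    return (math.sin(3 * math.pi / 2 - math.pi / L * d) + 1) \
        * (max_hz - min_hz) / 2 + min_hz
```

At d = L the expression evaluates to Max, at d = 0 it evaluates to min, and at d = L/2 it passes through the midpoint (Max + min)/2, which is the S-shaped deceleration curve described above.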
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. It is to be noted that the same symbols are used for the same components in the drawings.

First Embodiment
In
As illustrated in
As illustrated in
As illustrated in
A first drive wheel 110 and a second drive wheel 111 are each provided under the lower surface of the second rotating plate 104, and are in contact with the inner circumferential surface of the body 101. Also, the first drive wheel 110 has a first motor 112 that drives the first drive wheel 110. Similarly, the second drive wheel 111 has a second motor 113 that drives the second drive wheel 111. In other words, the first drive wheel 110 and the second drive wheel 111 are driven by independent separate motors. The details of the operation of the robot 1 driven by the first drive wheel 110 and the second drive wheel 111 will be described later. The first drive wheel 110 and the second drive wheel 111 form body drive wheels.
In
As illustrated in
The rotating shaft 118 extends in a perpendicular direction to the drive axis of the first drive wheel 110 and the second drive wheel 111. The rotating shaft 118 corresponds to an example of a shaft provided in the frame 102. The first drive wheel 110 and the second drive wheel 111 are mounted so as to be spaced apart from the ground in front view. In this case, the drive axis of the first drive wheel 110 and the second drive wheel 111 is a virtual axis line that connects the centers of the first drive wheel 110 and the second drive wheel 111, for instance. When the first drive wheel 110 and the second drive wheel 111 are mounted in parallel in front view, the drive axis of the first drive wheel 110 and the second drive wheel 111 provides the actual drive shaft.
The robot 1 further includes a power source (not illustrated) and a microphone 217 (
Next, the operation of the robot 1 using the first drive wheel 110 and the second drive wheel 111 will be described with reference to
As illustrated in
As illustrated in
Next, the basic operation of the robot 1 using the counterweight (weight) 114 will be described with reference to
As illustrated in
As illustrated in
As illustrated in
As illustrated in
As illustrated in
The details of the operation of the robot 1 using the counterweight (weight) 114 will be described with reference to
As illustrated in
As described above, the first display 105, the second display 106, and the third display 107 represent part of the face of the robot 1, for example, the eyes and the mouth. Thus, for instance, a state of being out of breath or a sleepy state of the robot 1 can be expressed by causing the robot 1 to reciprocate using the counterweight 114 while the robot 1 is inclined forward or rearward in the pitch direction. If this control is performed, for instance, when the remaining amount of power of a power source is less than or equal to a predetermined value, the robot 1 can naturally inform a user that the remaining power is low, without displaying information on the remaining amount of power, which is unrelated to the face, on the first display 105, the second display 106, and the third display 107.
As illustrated in
Next, the posture of the robot 1 at the start of movement will be described with reference to
Thus, the pitch angle of the frame 102 including the first display 105 and the second display 106 increases by the effect of a force due to an external factor during a period until the robot 1 starts to move. Here, an angular speed sensor 219 is mounted, for instance, on the upper surface of the first rotating plate 103 in the frame 102. Therefore, the angular speed sensor 219 can detect an angular speed in the pitch direction of the frame 102. Consequently, the pitch angle of the frame 102 is detected by accumulating the angular speed in the pitch direction detected by the angular speed sensor 219.
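The accumulation described above can be sketched as a simple numerical integration of the sampled angular speed; the function name, units, and sampling interval below are illustrative assumptions, not part of the disclosure.

```python
def accumulate_pitch(pitch_rates_deg_s, dt_s):
    """Integrate pitch-direction angular speeds (deg/s), sampled every
    dt_s seconds, into a running pitch angle (deg) of the frame.

    Returns the pitch angle history, i.e. the cumulative value of the
    detected angular speed at each sample.
    """
    pitch = 0.0
    history = []
    for rate in pitch_rates_deg_s:
        pitch += rate * dt_s  # accumulate the detected angular speed
        history.append(pitch)
    return history
```

For example, three samples of 10, 10, and -5 deg/s at 0.1 s intervals accumulate to pitch angles of 1.0, 2.0, and 1.5 degrees.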
For instance, an application that cooperates with the robot 1 is installed in the mobile terminal 3. Through the application, the mobile terminal 3 can give various instructions to the robot 1 and display a result of image recognition of an object presented in front of the robot 1 by the user 1201.
For instance, when a request to read a picture book aloud for the child is received from the mobile terminal 3, the robot 1 starts reading the picture book aloud. When receiving a question from the child while reading, the robot 1 sends the question to the cloud server 2, receives an answer for the question from the cloud server 2, and utters a voice indicating the answer.
In this manner, the user 1201 can treat the robot 1 like a pet, and the child can learn language through interaction with the robot 1.
Next, the details of the internal circuit of the robot 1 according to the first embodiment of the present disclosure will be described with reference to
As illustrated in
The control circuit 109 includes a memory 206, a main controller 200 including a processor such as a CPU, a display information output controller 205, and a computer including a timer (not illustrated) that measures time.
The memory 206 is comprised of, for instance, a nonvolatile rewritable storage, and stores a control program for the robot 1.
The main controller 200 executes the control program for the robot 1 stored in the memory 206. Thus, the main controller 200 serves as a target location generator 201, a movement path generator 202, a self-location estimator 203, and a drive controller 204.
The angular speed sensor 219 is mounted, for instance, on the upper surface of the first rotating plate 103. The angular speed sensor 219 detects an angular speed around each of three directional axes: the directional axis parallel to the direction of gravitational force (the directional axis parallel to the Z-axis illustrated in
The distance sensor 220 obtains distance information indicating the distance distribution in the surroundings of the robot 1 by using, for instance, infrared light or ultrasonic waves. Similarly to the camera 108, the distance sensor 220 is provided on the first rotating plate 103, facing the forward direction of the robot 1. For this reason, the direction of the distance information obtained by the distance sensor 220 matches the direction of an object ahead of the robot 1, and the distance sensor 220 can detect the distance between such an object and the robot 1. The distance sensor 220 is not limited to a front position on the upper surface of the first rotating plate 103; it may be disposed at any position that does not interfere with its distance measurement, such as a front position on the lower surface of the first rotating plate 103 or a front position on the upper surface or the lower surface of the second rotating plate 104.
The microphone 217 is provided in the frame 102 to convert sound into an electrical signal and output the electrical signal to the main controller 200. The microphone 217 may be mounted, for instance, on the upper surface of the first rotating plate 103 or on the upper surface of the second rotating plate 104. The main controller 200 recognizes the presence or absence of the voice of a user in the sound obtained by the microphone 217, accumulates voice recognition results in the memory 206, and manages them. The main controller 200 compares the data for voice recognition stored in the memory 206 with the obtained voice, and recognizes the contents of the voice and the user who uttered it.
The loudspeaker 216 is provided in the frame 102 so that the output face faces the front, and converts an audio electrical signal into physical vibration. The main controller 200 outputs predetermined voice from the loudspeaker 216, and causes the robot 1 to utter the voice.
As described with reference to
The main controller 200 generates a command based on the voice recognition results, the face recognition results, the distance information on the surrounding environment, the angular speeds around the three axes, and information received via the communicator 210, and outputs the command to the display information output controller 205, the shaft controller 213, the body drive wheel controller 214, the weight drive mechanism controller 215, and the communicator 210.
The display information output controller 205 causes the display 211 to display information on the facial expression of the robot 1 according to a command outputted from the main controller 200. The display 211 includes the first display 105, the second display 106, and the third display 107 which have been described with reference to
The shaft controller 213 rotates the rotating shaft 118 which has been described with reference to
The body drive wheel controller 214 causes the body drive wheel 212 of the robot 1 to operate according to a command transmitted from the main controller 200. The body drive wheel controller 214 includes the first motor 112 and the second motor 113 which have been described with reference to
The weight drive mechanism controller 215 causes the weight drive mechanism 218 of the robot 1 to operate according to a command transmitted from the main controller 200. The weight drive mechanism controller 215 includes a motor for weight drive (not illustrated) built in the counterweight 114. The weight drive mechanism 218 includes the guide shaft 115, the swing arm 116, the motor 117 for rotation, the belt 119, and the motor pulley 120 which have been described with reference to
The communicator 210 is comprised of a communication device for connecting the robot 1 to the cloud server 2 (
Next, the target location generator 201, the movement path generator 202, the self-location estimator 203, and the drive controller 204 included in the main controller 200 will be described.
The target location generator 201 will be described with reference to
When the first keyword is included in a voice recognition result of the voice uttered by the first user 1300, the target location generator 201 performs location detection processing on the first user 1300. The target location generator 201 compares a captured image 1302 of the camera 108 with face information on the first user 1300 held in the memory 206, and recognizes the face of the first user 1300 in the captured image 1302. After successfully recognizing the face of the first user 1300 in the captured image 1302, the target location generator 201 extracts an area of the first user 1300 in the captured image 1302, and identifies the direction of the first user 1300 with respect to the robot 1 from the extracted area of the first user 1300. The target location generator 201 obtains distance information corresponding to the identified direction from the distance sensor 220, thereby estimating the distance between the robot 1 and the first user 1300. Also, from the estimated direction of the first user 1300 and distance, the target location generator 201 generates a location at which the first user 1300 is present in the real space as a target location 301 (
The movement path generator 202 generates a movement path for the robot 1 to move to the target location. The movement path generator 202 will be described with reference to
The self-location estimator 203 estimates the current location of the robot 1 in the real space at predetermined time intervals, using environmental information on the surroundings of the robot 1 or a movement amount of the robot 1. For instance, the self-location estimator 203 may refer to image data obtained by capturing the surroundings with the camera 108 and to the distance information detected by the distance sensor 220, which indicates the distance to each object located in the surroundings of the robot 1, and may estimate the current location of the robot 1 using, for instance, visual simultaneous localization and mapping (V-SLAM). Alternatively, the self-location estimator 203 may estimate the current location of the robot 1 by a publicly known method, such as dead reckoning, using the rotational amount of the first motor 112 and the second motor 113 obtainable from the body drive wheel controller 214 and the angular speed around the directional axis (Z-axis) parallel to the direction of gravitational force (the yaw-direction angular speed) obtainable from the angular speed sensor 219.
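A dead-reckoning update of the kind mentioned here can be sketched as follows; the function and parameter names are illustrative assumptions, with the wheel travel standing in for the distance derived from the motor rotational amounts and the yaw rate standing in for the Z-axis angular speed from the angular speed sensor.

```python
import math

def dead_reckon_step(x, y, heading_rad, wheel_travel_m, yaw_rate_rad_s, dt_s):
    """One dead-reckoning update: advance the position by the distance the
    drive wheels rolled along the current heading, then update the heading
    from the yaw-axis angular speed."""
    x += wheel_travel_m * math.cos(heading_rad)
    y += wheel_travel_m * math.sin(heading_rad)
    heading_rad += yaw_rate_rad_s * dt_s
    return x, y, heading_rad
```

Repeating this step at each sampling interval accumulates an estimate of the current location without any external reference, which is why drift makes it complementary to the camera-based V-SLAM estimate.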
The self-location estimator 203 sets the estimated current location of the robot 1 in the map information held in the memory 206. For instance, as illustrated in
The drive controller 204 determines a control amount to be outputted as a command to each of the shaft controller 213, the body drive wheel controller 214, and the weight drive mechanism controller 215, and a control command that controls the display information output controller 205. The control amount includes a control amount C1 that controls the first motor 112 and the second motor 113 included in the body drive wheel controller 214, a control amount C2 that controls a motor for weight drive (not illustrated) included in the weight drive mechanism controller 215, and a control amount C3 that controls the motor 117 for rotation included in the shaft controller 213.
The control amount C1 is a value that controls the rotational amount of each of the first motor 112 and the second motor 113 included in the body drive wheel controller 214; the torque and the rotational speed of the first motor 112 and the second motor 113 increase as the value increases. In this embodiment, the first motor 112 and the second motor 113 are motors under PFM control, and thus the frequency that determines their torque and rotational speed is used as the control amount C1. However, this is an example; when the first motor 112 and the second motor 113 are motors under PWM control, the duty value is used as the control amount C1. The motor 117 for rotation (
The control command is a command for changing the facial expression pattern of the robot 1. Therefore, when changing the facial expression pattern of the robot 1, the drive controller 204 outputs the control command to the display information output controller 205.
Next, the details of the processing performed by the drive controller 204 will be described. The drive controller 204 estimates an effect received by the robot 1 from the floor surface, and determines a control amount to be outputted to each of the display information output controller 205, the shaft controller 213, the body drive wheel controller 214, the weight drive mechanism controller 215, and the communicator 210.
First, an overview of floor surface detection processing performed by the robot 1 according to the first embodiment of the present disclosure will be described with reference to
As described above, the posture of the robot 1 according to the first embodiment of the present disclosure rotates around the Y-axis in the period from reception of a movement start command by the robot 1 until the robot 1 actually starts to move. In the period, the angular speed sensor 219 obtains an angular speed in the direction indicated by the arrow 401 (
As illustrated in
Therefore, it can be concluded that the robot 1 starts to move at the timing at which the pitch angle starts to decrease, and thus the type of floor surface can be determined by monitoring the change in the pitch angle. Accordingly, the drive controller 204 estimates the type of floor surface by determining whether or not the maximum value of the pitch angle (the maximum pitch angle) exceeds a predetermined value set for each floor surface type. The change in the pitch angle may be monitored during the period until the location 300 of the robot 1 moves, by referring to the map information in the memory 206, or the maximum pitch angle within a predetermined time may be monitored.
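The threshold check described above can be sketched as a lookup against reference pitch angles; the angle thresholds and Hz values below are invented for illustration and do not reproduce the actual contents of the control amount determination database T20.

```python
# Hypothetical reference table: (reference pitch angle in degrees,
# minimum control amount in Hz). Values are illustrative only.
REFERENCE_TABLE = [
    (0.0, 200),  # small tilt before moving: low-friction floor such as wood
    (3.0, 400),  # large tilt before moving: high-friction floor such as carpet
]

def minimum_control_amount(max_pitch_deg):
    """Return the minimum control amount for the largest reference pitch
    angle that the observed maximum pitch angle reaches."""
    selected = REFERENCE_TABLE[0][1]
    for ref_angle, min_hz in REFERENCE_TABLE:
        if max_pitch_deg >= ref_angle:
            selected = min_hz
    return selected
```

A maximum pitch angle of 1 degree would thus select the low-friction minimum, while 4.5 degrees would select the high-friction minimum.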
The maximum pitch angle and the minimum control amount illustrated in
As illustrated in
Next, the generation processing for the control amount C1 in the robot 1 according to the first embodiment of the present disclosure will be described with reference to
The body drive wheel controller 214 causes the body drive wheel 212 of the robot 1 to operate according to the control amount C1 for the body drive wheel 212 transmitted from the main controller 200. The control amount C1 controls the rotational amount of the first motor 112 and the second motor 113. The rotational amount of the first motor 112 and the second motor 113 varies directly with the control amount C1. The body drive wheel controller 214 may obtain the rotational amount of the first motor 112 and the second motor 113 from an encoder attached to the first motor 112 and the second motor 113 or may calculate the rotational amount by a publicly known calculation method according to the specifications of the first motor 112 and the second motor 113.
The control amount C1 varies according to the self-location estimated by the self-location estimator 203 and the remaining distance to the target location generated by the target location generator 201. Here, the control amount C1 is updated as needed so as not to fall below the minimum control amount corresponding to the maximum pitch angle determined by referring to the control amount determination database T20. Therefore, the robot 1 can arrive at the target location without stopping partway due to external factors such as friction from the floor surface.
As illustrated in
Here, the reason why the minimum control amount stored in the control amount determination database T20 is referred to will be described. The robot 1 according to the first embodiment of the present disclosure has a spherical shape as illustrated in
Next, the difference between stop locations according to the type of floor surface will be described with reference to
In
In the case of movement on carpet, when the control amount C1 falls below the value (400 Hz) indicated by a line 500, the robot 1 stops. Also, in the case of movement on wood floor, when the control amount C1 falls below the value (200 Hz) indicated by a line 501, the robot 1 stops; the threshold is lower because wood floor has lower friction than carpet.
A distance 502 indicates the difference between the stop location of the robot 1 when the robot 1 is moved on carpet by changing the control amount C1 as indicated by the line 503, and the stop location of the robot 1 when the robot 1 is moved on wood floor by changing the control amount C1 as indicated by the line 503.
The difference between the stop locations indicated by the distance 502 is caused by an external force, such as friction, exerted on the robot 1 by the floor surface. Therefore, the robot 1 needs to maintain the control amount C1 at or above the minimum control amount until the robot 1 arrives at the target location. In other words, when the robot 1 moves on carpet, stopping short of the target location can be prevented provided that the control amount C1 is maintained at or above 400 Hz, the minimum control amount corresponding to carpet. Likewise, when the robot 1 moves on wood floor, stopping short of the target location can be prevented provided that the control amount C1 is maintained at or above 200 Hz, the minimum control amount corresponding to wood floor. Thus, stopping of the robot 1 short of the target location can be avoided by setting the control amount C1 to at least the minimum control amount for the type of floor surface, and the robot 1 can be moved smoothly to the target location.
The drive controller 204 generates the control amount C1 according to the remaining distance to the target location and the minimum control amount. So that the robot 1 performs a similar deceleration even when the type of floor surface differs, the drive controller 204 determines the control amount C1, for instance, by S-curve control using the following Expression (1).
For the method of calculating the control amount C1, a control method which varies according to the floor surface may be used. For instance, when the floor surface is wood, the robot 1 may wobble forward or backward at the time of stopping because the braking effect of the floor surface is small. In this case, it is better to set a smaller rate of change in the control amount C1 immediately before the stop; thus, in this embodiment, the control amount C1 is determined using Expression (1). Conversely, when the floor surface is carpet, the robot 1 is unlikely to wobble forward or backward at the time of stopping because the effect of friction from the floor surface is large, and the control amount C1 may instead be determined using trapezoidal control. In the following example, however, the control amount C1 is determined by the S-curve control of Expression (1) before the robot 1 arrives at the target location, regardless of the type of floor surface.
Control amount C1 = (sin(3π/2 + (π/L) × d) + 1) × (Max − min)/2 + min … (1)
L [m] is the deceleration start distance, a predetermined distance from the target location at which deceleration control starts; d [m] is the remaining distance from the location of the robot 1 to the target location; Max [Hz] is the control amount C1 at the deceleration start location, that is, the location at the deceleration start distance from the target location; and min [Hz] is the minimum control amount. Also, the value calculated using the technique described above with reference to
In the graph, L [m] which is the deceleration start distance from the target location is 1 [m], the control amount C1 at the deceleration start location is 1000 [Hz], the minimum control amount with the floor surface of carpet is 400 [Hz], and the minimum control amount with the floor surface of wood floor is 200 [Hz], and arithmetic results when these values are substituted into Expression (1) are illustrated.
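Expression (1) can be checked numerically. The sketch below is an illustrative Python rendering (not part of the disclosure) that assumes the graph's values: L = 1 m, Max = 1000 Hz, and minimum control amounts of 400 Hz (carpet) and 200 Hz (wood floor).

```python
import math

def deceleration_control_amount(d, L, max_hz, min_hz):
    """Expression (1): S-curve deceleration control amount C1 [Hz].

    d      : remaining distance to the target location [m]
    L      : deceleration start distance [m]
    max_hz : control amount C1 at the deceleration start location [Hz]
    min_hz : minimum control amount for the current floor surface [Hz]
    """
    return (math.sin(3 * math.pi / 2 + math.pi / L * d) + 1) * (max_hz - min_hz) / 2 + min_hz
```

At d = L the sine term equals 1, so C1 = Max; at d = 0 it equals −1, so C1 = min. This matches the endpoints of both curves in the graph.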
As indicated by the curve for carpet (dotted line) and the curve for wood floor (solid line), the control amount C1 gradually decreases along a sine curve from the deceleration start location at the 1 [m] point to the target location at the 0 [m] point. Also, for wood floor and carpet, the control amounts C1 at the target location are 200 [Hz] and 400 [Hz], respectively, and each control amount C1 is maintained at or above the minimum control amount until the robot 1 arrives at the target location. Therefore, the robot 1 is prevented from stopping short of the target location. In the case of wood floor, when the remaining distance is less than 0.15 [m], the slope of the control amount C1 suddenly becomes gentle, which prevents the robot 1 from wobbling at the target location.
The area 600 is an acceleration area. In the area 600, the control amount C1 is an acceleration control amount which is increased over time at a constant rate of change; specifically, the control amount C1 is increased by trapezoidal control. The area 601 is a uniform speed area. In the area 601, the control amount C1 is a uniform speed control amount which maintains a maximum control amount. The maximum control amount refers to a predetermined control amount C1 corresponding to an upper limit speed of the robot 1. As the upper limit speed, a value is used which has been determined in advance in consideration of the performance of the first motor 112 and the second motor 113 and the safety of the robot 1 while moving.
The area 602 is a deceleration area. In the area 602, the control amount C1 is a deceleration control amount determined by S-curve control indicated by Expression (1).
When the robot 1 starts to move, the drive controller 204 increases the control amount C1 by trapezoidal control, and when the control amount C1 reaches a maximum control amount (1000 [Hz]), the drive controller 204 maintains the control amount C1 at the maximum control amount. When the robot 1 arrives at the deceleration start location, the drive controller 204 decreases the control amount C1 in accordance with Expression (1). Consequently, the drive controller 204 is capable of causing the robot 1 to quickly arrive at the target location and stopping the robot 1 without wobbling at the target location. In addition, when the control amount C1 reaches the maximum control amount, the drive controller 204 does not increase the control amount C1 any more, thus the safety of the robot 1 can be secured.
When the distance from the movement start location to the target location is short, the robot 1 may arrive at the deceleration start location before the control amount C1 reaches the maximum control amount. In this case, the drive controller 204 may calculate the control amount C1 by substituting the control amount C1 at the deceleration start location into Max of Expression (1). Consequently, the drive controller 204 can cause the robot 1 to stop at the target location smoothly and accurately.
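The three areas (acceleration 600, uniform speed 601, deceleration 602) and the Max-substitution rule for short moves can be sketched as one profile generator. This is an illustrative reconstruction: the function names, the acceleration step `accel_step`, and the distance discretisation `step_m` are assumptions, not values from the disclosure.

```python
import math

def decel_c1(d, L, max_hz, min_hz):
    # Expression (1): S-curve deceleration control amount.
    return (math.sin(3 * math.pi / 2 + math.pi / L * d) + 1) * (max_hz - min_hz) / 2 + min_hz

def speed_profile(total_m, L, max_hz, min_hz, accel_step, step_m=0.01):
    """Control amounts over a move of total_m metres, one value per step_m of
    travel: trapezoidal acceleration (area 600), uniform speed (area 601),
    S-curve deceleration (area 602), then the stop control amount 0 Hz."""
    n = int(round(total_m / step_m))
    c1, latched_max, out = 0.0, None, []
    for i in range(n):
        d = total_m - i * step_m               # remaining distance at this cycle
        if d <= L:
            if latched_max is None:
                # Entering the deceleration area: the C1 reached so far is
                # substituted for Max, covering short moves that never reach
                # the maximum control amount (floored at min_hz for safety).
                latched_max = max(c1, min_hz)
            c1 = decel_c1(d, L, latched_max, min_hz)
        elif c1 < max_hz:
            c1 = min(c1 + accel_step, max_hz)  # acceleration area
        else:
            c1 = max_hz                        # uniform speed area
        out.append(c1)
    out.append(0.0)                            # stop control amount at the target
    return out
```

With total_m = 3 m, the profile accelerates to 1000 Hz, holds it, then follows Expression (1) down toward the 400 Hz carpet minimum before the stop control amount is output.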
Referring back to
The shaft controller 213 causes the rotating shaft 118 of the robot 1 to operate according to a control amount C3 outputted from the main controller 200. The control amount C3 controls the rotational amount of the motor 117 for rotation.
Hereinafter, processing steps performed by the robot 1 in the first embodiment will be described with reference to
Referring to
The microphone 217 obtains an audio signal in the surrounding environment (Yes in step S1001), and outputs the audio signal to the main controller 200. The main controller 200 performs voice recognition processing on the obtained audio signal (step S1002). The voice recognition processing extracts voice data, which indicates a temporal change in the sound pressure of the voice uttered by a user, and utterance information, which indicates in text format the contents of the user's utterance contained in the voice data. When an audio signal is not obtained by the microphone 217, the target location generator 201 repeats the processing in step S1001 until an audio signal is obtained (No in step S1001).
The target location generator 201 determines whether or not the voice data extracted by the voice recognition processing matches any one of one or multiple pieces of voiceprint information pre-stored in the memory 206 as user information of one or multiple users. When it is determined that the extracted voice data matches the voiceprint information (Yes in step S1003), the target location generator 201 determines that a user with the matched voiceprint information is the first user 1300 (step S1004). When the extracted voice data does not match any of the pieces of voiceprint information stored in the memory 206 (No in step S1003), the target location generator 201 causes the processing to return to S1001.
When a predetermined first keyword is contained in the voice data of the first user 1300 obtained by the voice recognition processing (Yes in step S1005), the target location generator 201 obtains image data from the camera 108 (step S1006). When the first keyword is not contained in the voice data of the first user 1300 obtained by the voice recognition processing (No in step S1005), the target location generator 201 causes the processing to return to step S1001.
The target location generator 201 performs face recognition processing to compare each of one or multiple face images contained in the image data obtained from the camera 108 with the characteristic quantity of the face of the first user 1300 stored in the memory 206 as the user information of the first user 1300, and detects the first user 1300 from the image data (step S1007).
When the first user 1300 is detectable from the image data (Yes in step S1007), the target location generator 201 detects the direction of the first user 1300 with respect to the robot 1 from the location of the first user 1300 in the image data (step S1008).
Of the distance information obtained by the distance sensor 220, the target location generator 201 obtains distance information in the direction in which the first user 1300 is present, as the distance information on the first user 1300 (step S1009). The target location generator 201 detects the location of the first user 1300 in the real space around the robot 1 from the direction and the distance information of the first user 1300, and plots the detected location in the map information (
The target location generator 201 sets the plotted location as the target location 300 of the robot 1 (step S1011). Also, when the first user 1300 is not detectable from the image data (No in step S1007), the target location generator 201 causes the processing to return to S1006.
Next, generation of a movement path for the robot 1 to move to the target location will be described. Referring to
Next, the drive control processing of the robot 1 will be described. Referring to
The drive controller 204 obtains an angular speed in the pitch direction detected by the angular speed sensor 219 (step S1101). Next, the drive controller 204 calculates a rate of change in the pitch angle per unit time from the obtained angular speed in the pitch direction (step S1102).
For instance, the angular speed sensor 219 detects an angular speed in the pitch direction at uniform sampling intervals. In this case, the drive controller 204 can calculate an angular speed in the pitch direction at one sample point detected by the angular speed sensor 219 as the rate of change in the pitch angle per unit time. Alternatively, when a time different from the sampling interval is used as the unit time, the drive controller 204 may calculate a rate of change in the pitch angle per unit time by accumulating the angular speeds in the pitch direction at sample points for unit time, detected by the angular speed sensor 219.
Next, the drive controller 204 accumulates the rates of change in the pitch angle per unit time (step S1103), and calculates the current pitch angle of the frame 102. Referring to
When the pitch angle has continuously decreased a predetermined number of times (Yes in step S1104), the drive controller 204 identifies a maximum pitch angle from the pitch angles stored in the memory 206 in a time series (step S1105). Here, when the pitch angle has continuously decreased a predetermined number of times, the drive controller 204 assumes that the pitch angle has reached a peak as illustrated in
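The peak-detection rule of steps S1104 and S1105 can be sketched as follows; the value of the "predetermined number of times" (`n_decreases`) is a hypothetical choice, and the function name is an assumption.

```python
def find_max_pitch(pitch_history, n_decreases=3):
    """Identify the maximum pitch angle once the angle has decreased
    n_decreases times in a row (the peak rule of steps S1104/S1105).
    Returns None while no peak has been confirmed yet."""
    run = 0
    for prev, cur in zip(pitch_history, pitch_history[1:]):
        run = run + 1 if cur < prev else 0  # count consecutive decreases
        if run >= n_decreases:
            return max(pitch_history)       # step S1105: peak of the series
    return None
```

For instance, a time series that rises to 8 degrees and then falls for three samples confirms 8 degrees as the maximum pitch angle.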
Next, the drive controller 204 refers to the control amount determination database T20 to determine a minimum control amount corresponding to the identified maximum pitch angle (step S1106).
On the other hand, when the pitch angle has not continuously decreased a predetermined number of times (No in step S1104), the drive controller 204 causes the processing to proceed to step S1107 without performing the processing in step S1105 and step S1106.
Next, the self-location estimator 203 estimates the self-location of the robot 1 from the image data obtained by the camera 108 and the distance information obtained by the distance sensor 220 (step S1107). Here, the self-location estimator 203 may estimate the self-location using publicly known V-SLAM.
If the image data obtained by the camera 108 does not sufficiently show a group of characteristic points indicating the objects in the surroundings of the robot 1, the self-location estimator 203 is unable to estimate the self-location using V-SLAM. In this case, the self-location estimator 203 obtains the rotational amounts of the first motor 112 and the second motor 113 from the body drive wheel controller 214, and performs publicly known dead reckoning based on the angular speed in the yaw direction detected by the angular speed sensor 219. Specifically, the self-location estimator 203 interpolates the self-location of the robot 1 by dead reckoning during the period from the point at which the self-location is lost by V-SLAM until the self-location is detected again by V-SLAM. Thus, the self-location estimator 203 can recognize the self-location of the robot 1 at all times.
Next, the drive controller 204 refers to the map information stored in the memory 206, and calculates the remaining distance using the coordinates of the location 300 of the robot 1 and the coordinates of the target location 301 (step S1108). The remaining distance is calculated by multiplying the distance per square cell by the number of square cells indicating the movement path that connects the coordinates of the location 300 of the robot 1 and the coordinates of the target location 301.
When the robot 1 has arrived at the target location 301 (Yes in step S1109), the drive controller 204 generates a stop control amount as the control amount C1 (step S1110), and outputs the generated stop control amount to the body drive wheel controller 214 (step S1116). When outputting the stop control amount to the body drive wheel controller 214 (Yes in step S1117), the drive controller 204 terminates the processing. Here, for instance, 0 [Hz] may be used as the stop control amount.
On the other hand, when the robot 1 has not arrived at the target location 301 (No in step S1109), the drive controller 204 determines whether or not the remaining distance from the location 300 of the robot 1 to the target location 301 is less than or equal to the deceleration start distance (step S1111). When the remaining distance is less than or equal to the deceleration start distance (Yes in step S1111), the drive controller 204 generates a deceleration control amount according to the remaining distance using Expression (1) (step S1112), and outputs the generated deceleration control amount as the control amount C1 to the body drive wheel controller 214 (step S1116).
Here, the drive controller 204 substitutes the remaining distance from the location 300 of the robot 1 to the target location 301, the deceleration start distance, the minimum control amount determined in step S1106, and the control amount C1 at the deceleration start location into d, L, min, and Max of Expression (1), respectively, and generates a deceleration control amount. The deceleration control amount is the control amount C1 generated in the area 602 of
When the remaining distance from the location 300 of the robot 1 to the target location 301 exceeds the deceleration start distance (No in step S1111), the drive controller 204 determines whether or not the control amount C1 is less than the maximum control amount (step S1113). When the control amount C1 is less than the maximum control amount (Yes in step S1113), the drive controller 204 generates an acceleration control amount as the control amount C1 (step S1114), and outputs the generated acceleration control amount to the body drive wheel controller 214 (step S1116). The acceleration control amount is the control amount C1 generated in the area 600 of
When the control amount C1 is greater than or equal to the maximum control amount (No in step S1113), the drive controller 204 generates a uniform speed control amount as the control amount C1 (step S1115), and outputs the generated uniform speed control amount to the body drive wheel controller 214 (step S1116). The uniform speed control amount is the control amount C1 generated in the area 601 of
When the stop control amount has not been outputted to the body drive wheel controller 214 (No in step S1117), the drive controller 204 determines whether or not a minimum control amount has been determined by the processing in step S1106 (step S1118). When a minimum control amount has not been determined (No in step S1118), the drive controller 204 causes the processing to return to step S1101 because the robot 1 has not started to move yet.
On the other hand, when a minimum control amount has been determined (Yes step S1118), the drive controller 204 causes the processing to return to step S1107 because the robot 1 has started to move.
On the other hand, when the stop control amount has been outputted to the body drive wheel controller 214 (Yes in step S1117), the drive controller 204 terminates the processing because the robot 1 has arrived at the target location 301.
Referring to the flowchart of
Also, during acceleration control after the start of movement, the loop of No in step S1109, No in step S1111, Yes in step S1113, No in step S1117, and Yes in step S1118 is repeated, and the robot 1 moves at a constant acceleration.
During uniform speed control, the loop of No in step S1109, No in step S1111, No in step S1113, No in step S1117, and Yes in step S1118 is repeated, and the robot 1 moves at a constant speed.
During deceleration control, the loop of No in step S1109, Yes in step S1111, No in step S1117, and Yes in step S1118 is repeated, and the robot 1 is decelerated in accordance with S-curve control indicated by Expression (1).
As described above, with the robot 1 according to the first embodiment, a minimum control amount corresponding to a maximum pitch angle of the frame 102 detected by the angular speed sensor 219 is determined, and the deceleration control is performed on the robot 1 so that the control amount C1 does not fall below the minimum control amount. Consequently, the robot 1 can be stopped at the target location accurately and smoothly.
Next, a second embodiment will be described with reference to the drawings. It is to be noted that the same symbols are used for the same components in the drawings, and descriptions of the same components and processing as in the first embodiment are omitted. Even when the type of floor surface changes during movement, the robot 1 according to the second embodiment determines an appropriate minimum control amount according to the type of floor surface after the change, and arrives at the target location accurately without wobbling.
Also, at the time of moving, the robot 1 wobbles forward and backward in the pitch direction, and thus the pitch angle of the internal mechanism such as the frame 102 vibrates with a certain amplitude. Thus, in the second embodiment, an average pitch angle, which is the average value of the pitch angles while the robot 1 is moving, is determined; the type of floor surface is then estimated from the average pitch angle, and a minimum control amount is determined.
Since the frictional force of wood floor is smaller than that of carpet, a smaller minimum control amount is set for wood floor than for carpet. Thus, if the minimum control amount of the robot 1 remains set to the value for wood floor even after the robot 1 moves onto carpet, the control amount C1 becomes insufficient before the robot 1 reaches the first user 1300, and the robot 1 may not be able to arrive at the location of the first user 1300.
In order to prevent this, the second embodiment adopts the following configuration.
The “average pitch angle during operation” specifies, for each type of floor surface, the range of the average pitch angle taken by the internal mechanism such as the frame 102 while the robot 1 is moving. When the control amount determination database T29 is compared with the control amount determination database T20, in the control amount determination database T20, for instance, as shown in the first row, a maximum pitch angle of “0 degrees or greater and 5 degrees or less” is set for a minimum control amount of “100 Hz”. In contrast, in the control amount determination database T29, an average pitch angle during operation of “0 degrees or greater and 3 degrees or less” is set for a minimum control amount of “100 Hz”. Similarly, in the second row, a maximum pitch angle of “5 degrees or greater and 10 degrees or less” is set for a minimum control amount of “150 Hz” in the control amount determination database T20, whereas an average pitch angle during operation of “3 degrees or greater and 6 degrees or less” is set for a minimum control amount of “150 Hz” in the control amount determination database T29.
In this way, in the control amount determination database T29, the pitch angle ranges are set smaller overall for the same minimum control amount as compared with the control amount determination database T20. That is, in the control amount determination database T29, for the same type of floor surface, the pitch angle is set smaller than in the control amount determination database T20.
This is because the frictional force received by the robot 1 from the floor surface is smaller at the time of moving than at the time of stopping; thus, even for the same type of floor surface, the pitch angle of the internal mechanism such as the frame 102 is smaller at the time of moving than at the time of stopping.
Thus, in the second embodiment, similarly to the first embodiment, before the robot 1 starts to move, a minimum control amount is determined by referring to the control amount determination database T20, whereas after the robot 1 starts to move, a minimum control amount is determined by referring to the control amount determination database T29. It is to be noted that the maximum pitch angle and the minimum control amount stored in the control amount determination database T20 are an example of the first reference pitch angle and the first control amount, and the average pitch angle during operation and the minimum control amount stored in the control amount determination database T29 are an example of the second reference pitch angle and the second control amount.
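The two lookups can be sketched with the rows quoted above. Only the first two rows of each table appear in the text, so the tables below are deliberately truncated, and the helper names are assumptions.

```python
# Rows quoted in the text, as (upper bound [deg], minimum control amount [Hz]);
# the real databases contain further rows for other floor types.
T20 = [(5.0, 100), (10.0, 150)]  # maximum pitch angle (before the move starts)
T29 = [(3.0, 100), (6.0, 150)]   # average pitch angle during operation

def lookup_min_control(pitch_deg, table):
    """Return the minimum control amount whose reference pitch-angle range
    contains pitch_deg; beyond the last row, reuse the largest stored amount."""
    for upper, hz in table:
        if pitch_deg <= upper:
            return hz
    return table[-1][1]

def min_control_amount(pitch_deg, moving):
    """Before the move, consult T20 with the maximum pitch angle; once the
    robot is moving, consult T29 with the average pitch angle."""
    return lookup_min_control(pitch_deg, T29 if moving else T20)
```

Note how the same pitch angle maps to a larger minimum control amount while moving, reflecting the smaller pitch angles observed in motion.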
Hereinafter, a control method for allowing the robot 1 to arrive at the target location accurately and smoothly by referring to the control amount determination database T29 even when the type of floor surface changes during the movement of the robot 1 will be described.
In the second embodiment, the main routine is the same as in the first embodiment, that is, the same as in
A determination of No is made in step S1111 when the robot 1 has started to move and the remaining distance to the target location is greater than the deceleration start distance. Therefore, the processing in step S1201 is performed while the robot 1 is moving and the remaining distance to the target location is greater than the deceleration start distance. In other words, the processing in step S1201 is performed when acceleration control indicated by the area 600 of
The drive controller 204 obtains an angular speed in the pitch direction detected by the angular speed sensor 219, calculates a pitch angle of the internal mechanism such as the frame 102, and stores the calculated pitch angle in the ring buffer B33 illustrated in
When the ring buffer B33 is filled with pitch angles (Yes in step S1302), the drive controller 204 adds N pitch angles stored in the ring buffer B33 together, and divides the total pitch angles by N to calculate an average pitch angle (step S1303).
Next, the drive controller 204 determines a minimum control amount corresponding to the calculated average pitch angle by referring to the control amount determination database T29 illustrated in
Next, the drive controller 204 updates the minimum control amount stored in the memory 209 with the determined minimum control amount (step S1305).
For instance, when a minimum control amount determined from the control amount determination database T20 is stored in the memory 209 in step S1105, in step S1305, the minimum control amount is updated with a minimum control amount determined in step S1304.
Also, even when a minimum control amount updated by the processing in previous step S1305 is stored in the memory 209, the minimum control amount is updated with a minimum control amount determined in step S1304.
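Steps S1301 to S1305 amount to a ring buffer whose average indexes the control amount determination database T29. The sketch below is illustrative: the class name, the buffer length `n`, and the T29 rows beyond the two quoted earlier are hypothetical.

```python
from collections import deque

class MinControlUpdater:
    """Steps S1301-S1305: keep the last n pitch angles in a ring buffer and,
    once it is full, refresh the stored minimum control amount from their
    average via database T29."""

    T29 = [(3.0, 100), (6.0, 150), (9.0, 200)]  # (upper bound [deg], Hz)

    def __init__(self, n=10):
        self.buf = deque(maxlen=n)
        self.min_hz = None  # the value held in the memory

    def push(self, pitch_deg):
        self.buf.append(pitch_deg)                 # step S1301
        if len(self.buf) < self.buf.maxlen:
            return self.min_hz                     # No in step S1302
        avg = sum(self.buf) / len(self.buf)        # step S1303
        for upper, hz in self.T29:                 # step S1304
            if avg <= upper:
                self.min_hz = hz                   # step S1305: overwrite
                break
        else:
            self.min_hz = self.T29[-1][1]          # above all stored ranges
        return self.min_hz
```

Each new pitch sample displaces the oldest one, so the stored minimum control amount tracks the floor surface currently under the robot.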
When the robot 1 arrives at the deceleration start location, the minimum control amount stored in the memory 209 at that time is substituted into min of Expression (1) to calculate the control amount C1.
Consequently, even when the type of floor surface changes after the start of movement, an appropriate minimum control amount is set according to the type of floor surface after the change. Also, even when the type of floor surface changes twice or more during movement of the robot 1, an appropriate minimum control amount is set according to the type of floor surface at the target location.
In the flowchart, since the distance from the deceleration start location to the target location is short, it is assumed that the type of floor surface does not change along the way; accordingly, upon arrival at the deceleration start location, the minimum control amount stored in the memory 209 is substituted into min of Expression (1).
In step S1302, when the ring buffer B33 is not filled with the pitch angles (No in step S1302), the drive controller 204 does not perform the processing in step S1303 to step S1305, and terminates the update processing for minimum control amount.
As described above, with the robot 1 according to the second embodiment, even when the type of floor surface changes during movement of the robot 1, an appropriate minimum control amount according to the type of floor surface after the change is determined, thus the robot 1 can be moved smoothly without stopping the robot 1 on the way to the target location.
In the second embodiment, the control amount determination database T20 and the control amount determination database T29 are formed of different databases. However, this is an example, and both databases may be integrated into one database as illustrated in
It can be seen that in the control amount determination database T34, a maximum pitch angle for a minimum control amount and an average pitch angle during operation are stored in association with each other, and the control amount determination database T20 and the control amount determination database T29 are integrated into one database.
When the control amount determination database T34 is used, in step S1105 of
Also, in the flowchart of
Consequently, even when the distance to the target location falls below the deceleration start distance before an average pitch angle is determined, the present disclosure allows the deceleration control to be performed using a minimum control amount.
The present disclosure is useful for a household robot.
Priority claim: Japanese Patent Application No. 2017-138272, filed July 2017 (JP, national).