The present disclosure relates to a moving body, a movement control method, and a program, and more particularly, to a moving body, a movement control method, and a program capable of suppressing erroneous determination in obstacle detection.
Some moving bodies such as drones have a function of decelerating or stopping in a case where an obstacle having a possibility of collision is detected on a trajectory in a traveling direction.
However, in a case where the drone flies at a low altitude near the ground, the ground may be detected as an obstacle having a possibility of collision, and the drone may stop unintentionally.
Meanwhile, Patent Document 1 discloses a technique of detecting a normal vector in units of pixels from a polarized image transmitted through a plurality of polarizing filters having different polarization directions.
If the normal vector can be detected on the trajectory in the traveling direction, it is considered that the accuracy of determination in obstacle detection can be improved.
The present disclosure has been made in view of such a situation, and aims to suppress erroneous determination in obstacle detection.
A moving body of the present disclosure is a moving body including: a normal vector estimation unit that estimates a normal vector on the basis of sensor data obtained by sensing a traveling direction of an own device; and a control information generation unit that generates control information for controlling movement of the own device on the basis of the normal vector.
A movement control method of the present disclosure is a movement control method including: estimating a normal vector on the basis of sensor data obtained by sensing an object in a traveling direction of a moving body; and generating control information for controlling movement of the moving body on the basis of the normal vector.
A program of the present disclosure is a program for causing a computer to execute processing of: estimating a normal vector on the basis of sensor data obtained by sensing an object in a traveling direction of a moving body; and generating control information for controlling movement of the moving body on the basis of the normal vector.
In the present disclosure, a normal vector is estimated on the basis of sensor data obtained by sensing an object in a traveling direction of a moving body, and control information for controlling movement of the moving body is generated on the basis of the normal vector.
Hereinafter, modes for carrying out the present disclosure (hereinafter referred to as embodiments) will be described. Note that the description will be made in the following order.
Some moving bodies such as drones have a function of decelerating or stopping in a case where an obstacle having a possibility of collision is detected on a trajectory in a traveling direction.
For example, as illustrated in
In the drawing, as surrounded by a frame of a one-dot chain line, in a case where the point cloud data Pc exists on the spatial region Sa through which the moving body 10 passes along the predicted future trajectory, the moving body 10 determines that an obstacle having a possibility of collision has been detected, and decelerates or stops.
However, as illustrated in
Therefore, as illustrated in
As a result, even in a case where the moving body 10 is flying at a low altitude or at a high speed, it is possible to suppress erroneous determination in obstacle detection.
The moving body 10 is a moving object such as a drone, a vehicle, or a ship. Hereinafter, an example in which the technology according to the present disclosure is applied to a drone flying in the air will be described. The technology according to the present disclosure can be applied not only to a moving body that moves by a user's operation, but also to a drone that flies autonomously, an autonomous traveling vehicle that moves on land, an autonomous navigation vessel that moves on water or under water, and an autonomous mobile robot such as an autonomous mobile cleaner that moves indoors.
The moving body 10 includes a sensor 20, a communication unit 21, a control unit 22, a movement control unit 23, a moving mechanism 24, and a storage unit 25.
The sensor 20 includes various sensors including the above-described distance measuring sensor 11, and senses each direction around the moving body 10 including the traveling direction of the moving body 10. Sensor data obtained by sensing is supplied to the control unit 22.
The communication unit 21 includes a network interface or the like, and performs wireless or wired communication with a controller for operating the moving body 10 and with any other device. For example, the communication unit 21 may directly communicate with a device to be communicated with, or may perform network communication via a base station or a repeater for Wi-Fi (registered trademark), 4G, 5G, or the like.
The control unit 22 includes a central processing unit (CPU), a memory, and the like, and controls the communication unit 21, the movement control unit 23, and the storage unit 25 by executing a predetermined program. For example, the control unit 22 controls the movement control unit 23 on the basis of the sensor data from the sensor 20.
The movement control unit 23 includes a circuit such as a dedicated IC or a field-programmable gate array (FPGA), and controls driving of the moving mechanism 24 under the control of the control unit 22.
The moving mechanism 24 is a mechanism for moving the moving body 10, and includes a flight mechanism, a traveling mechanism, a propulsion mechanism, and the like. In this example, the moving body 10 is configured as a drone, and the moving mechanism 24 includes a motor, a propeller, and the like as a flight mechanism. Furthermore, in a case where the moving body 10 is configured as an autonomous traveling vehicle, the moving mechanism 24 includes wheels or the like as a traveling mechanism. In a case where the moving body 10 is configured as an autonomous navigation vessel, the moving mechanism 24 includes a screw propeller and the like as a propulsion mechanism. The moving mechanism 24 is driven according to the control of the movement control unit 23 to move the moving body 10.
In the moving body 10, the control unit 22 drives the moving mechanism 24 by controlling the movement control unit 23 according to a control signal from the controller received by the communication unit 21, for example. As a result, the moving body 10 moves according to the operation of the controller by the user.
The storage unit 25 includes a nonvolatile memory such as a flash memory, and stores various types of information according to control of the control unit 22.
Hereinafter, embodiments of the moving body 10 that realize suppression of erroneous determination in obstacle detection will be described.
The moving body 10 illustrated in
The polarization image sensors 51-1 and 51-2 are configured by forming polarizers in a plurality of directions on photodiodes of pixels, respectively. For example, polarizers in four directions are mounted on the polarization image sensors 51-1 and 51-2, and polarized images in the four directions can be acquired. The polarization image sensors 51-1 and 51-2 are one of various sensors constituting the sensor 20 in
The moving body 10 further includes a normal vector estimation unit 52, luminance image construction units 53-1 and 53-2, parallelization processing units 54-1 and 54-2, calibration data 55, and a normal vector correction unit 56.
The normal vector estimation unit 52 estimates a normal vector for each pixel position of the polarized image on the basis of the polarized image acquired by the polarization image sensor 51-1. The polarized image used for estimating the normal vector may be a polarized image acquired by the polarization image sensor 51-2.
Specifically, the normal vector estimation unit 52 obtains the relationship between the luminance and the polarization angle from the polarization direction and the luminance of the polarized image on the basis of the polarized image having three or more polarization directions, and determines the azimuth angle φ at which the luminance is the maximum. Furthermore, the normal vector estimation unit 52 calculates the polarization degree p using the maximum luminance and the minimum luminance obtained from the relationship between the luminance and the polarization angle, and determines the zenith angle θ corresponding to the polarization degree p on the basis of the characteristic curve indicating the relationship between the polarization degree and the zenith angle. In this way, the normal vector estimation unit 52 estimates the azimuth angle φ and the zenith angle θ for each pixel position as the normal vector of the subject on the basis of the polarized image having three or more polarization directions.
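As a concrete illustration of this estimation, the following is a minimal sketch in Python, assuming polarized images in the four directions of 0, 45, 90, and 135 degrees and a precomputed characteristic curve relating the polarization degree to the zenith angle; the function and parameter names are illustrative and are not part of the present disclosure.

```python
import numpy as np

def estimate_normals(i0, i45, i90, i135, rho_curve, theta_curve):
    """Per-pixel normal estimation from four-direction polarized images.

    i0..i135    : 2-D luminance arrays behind polarizers at 0/45/90/135 degrees.
    rho_curve,
    theta_curve : sampled characteristic curve mapping the degree of polarization
                  to the zenith angle (assumed precomputed; rho_curve increasing).
    Returns the azimuth phi, the zenith theta, and unit normals of shape (H, W, 3).
    """
    # Stokes-like parameters of the linear polarization component.
    s0 = (i0 + i45 + i90 + i135) / 2.0
    s1 = i0 - i90
    s2 = i45 - i135

    # Azimuth: polarizer angle at which the observed luminance is maximal.
    phi = 0.5 * np.arctan2(s2, s1)

    # Degree of polarization, then zenith via the characteristic curve.
    rho = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-6)
    theta = np.interp(rho, rho_curve, theta_curve)

    # Normal vector in camera coordinates from azimuth and zenith.
    n = np.stack([np.sin(theta) * np.cos(phi),
                  np.sin(theta) * np.sin(phi),
                  np.cos(theta)], axis=-1)
    return phi, theta, n
```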
The luminance image construction units 53-1 and 53-2 construct two-view luminance images on the basis of the luminance value of each pixel of the two-view polarized images acquired by the polarization image sensors 51-1 and 51-2, respectively.
The parallelization processing units 54-1 and 54-2 perform parallelization processing by stereo rectification on the two-view luminance images constructed by the luminance image construction units 53-1 and 53-2, respectively. The parallelization processing is performed using the internal parameters, the external parameters, and the distortion coefficients of the polarization cameras 30-1 and 30-2 held as the calibration data 55. By the parallelization processing, the two-view luminance images are corrected into parallelized luminance images.
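As a reference for this parallelization processing, the following sketch assumes that the calibration data 55 consists of the intrinsics, the distortion coefficients, and the relative rotation/translation of the two cameras, and uses OpenCV's stereo rectification; the names and parameters are illustrative. The returned rectification rotations are also the kind of information the normal vector correction described next would use.

```python
import cv2
import numpy as np

def rectify_pair(img_l, img_r, K1, D1, K2, D2, R, T):
    """Stereo rectification of the two-view luminance images.

    K1/K2, D1/D2 : intrinsics and distortion coefficients of the two cameras.
    R, T         : rotation and translation between the cameras (calibration data).
    """
    size = (img_l.shape[1], img_l.shape[0])
    R1, R2, P1, P2, Q, _, _ = cv2.stereoRectify(K1, D1, K2, D2, size, R, T)
    map_lx, map_ly = cv2.initUndistortRectifyMap(K1, D1, R1, P1, size, cv2.CV_32FC1)
    map_rx, map_ry = cv2.initUndistortRectifyMap(K2, D2, R2, P2, size, cv2.CV_32FC1)
    rect_l = cv2.remap(img_l, map_lx, map_ly, cv2.INTER_LINEAR)
    rect_r = cv2.remap(img_r, map_rx, map_ry, cv2.INTER_LINEAR)
    return rect_l, rect_r, Q  # Q reprojects disparity to depth later
```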
The normal vector correction unit 56 corrects the normal vector estimated for each pixel position of the polarized image according to the correction of the luminance image by the parallelization processing by the parallelization processing units 54-1 and 54-2. Internal parameters, external parameters, and distortion coefficients of the polarization cameras 30-1 and 30-2 are also used for correction of the normal vector.
The moving body 10 further includes a parallax estimation unit 57, a visual odometry unit 58, a GPS sensor 59, an IMU 60, a barometer 61, a geomagnetic sensor 62, and a self-position estimation unit 63.
The parallax estimation unit 57 estimates the parallax on the luminance image by stereo matching using the luminance image after the parallelization processing. On the basis of the estimated parallax, the parallax estimation unit 57 outputs a parallax map including point cloud data indicating the distance (depth) to the subject.
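A minimal sketch of the parallax estimation and the conversion to point cloud data is shown below, assuming a semi-global block matcher; the actual matching method and parameters used by the parallax estimation unit 57 are not specified here.

```python
import cv2
import numpy as np

# Assumed matcher; numDisparities and blockSize are illustrative values.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)

def disparity_to_points(rect_l, rect_r, Q):
    """Disparity map and per-pixel 3-D points from the rectified pair."""
    disp = matcher.compute(rect_l, rect_r).astype(np.float32) / 16.0  # SGBM output is fixed point
    points = cv2.reprojectImageTo3D(disp, Q)   # per-pixel 3-D coordinates (the point cloud)
    valid = disp > 0                           # mask of pixels with a valid match
    return disp, points, valid
```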
The visual odometry unit 58 estimates the trajectory of the own device (moving body 10) by visual odometry on the basis of the luminance image after the parallelization processing, and supplies the trajectory to the self-position estimation unit 63.
The global positioning system (GPS) sensor 59 acquires GPS information of the own device (moving body 10) and supplies the GPS information to the self-position estimation unit 63. The inertial measurement unit (IMU) 60 detects a three-dimensional angular velocity and acceleration of the own device (moving body 10), and supplies the three-dimensional angular velocity and acceleration to the self-position estimation unit 63. The barometer 61 measures the atmospheric pressure and supplies the atmospheric pressure to the self-position estimation unit 63. The geomagnetic sensor 62 detects geomagnetism and supplies the detected geomagnetism to the self-position estimation unit 63. Each of the GPS sensor 59, the IMU 60, the barometer 61, and the geomagnetic sensor 62 is also one of various sensors constituting the sensor 20 in
The self-position estimation unit 63 performs sensor fusion using the extended Kalman filter on the basis of data obtained by each of the visual odometry unit 58, the GPS sensor 59, the IMU 60, the barometer 61, and the geomagnetic sensor 62. As a result, the self-position estimation unit 63 can calculate the self-position and the velocity vector of the own device (moving body 10).
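The concrete state vector and sensor models are not described here; the following generic extended Kalman filter cycle is only a sketch of the kind of prediction/correction the self-position estimation unit 63 performs, with the process and observation models passed in as functions.

```python
import numpy as np

def ekf_step(x, P, u, z, f, F, h, H, Q, R):
    """One extended-Kalman-filter cycle: predict with an input u (e.g. IMU data),
    correct with an observation z (e.g. GPS position or barometric height).
    f/h are the process and observation models, F/H their Jacobians at x."""
    # Prediction with the process model.
    x_pred = f(x, u)
    P_pred = F(x, u) @ P @ F(x, u).T + Q
    # Correction with the observation.
    y = z - h(x_pred)                                  # innovation
    S = H(x_pred) @ P_pred @ H(x_pred).T + R
    K = P_pred @ H(x_pred).T @ np.linalg.inv(S)        # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H(x_pred)) @ P_pred
    return x_new, P_new
```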
The moving body 10 further includes an obstacle collision determination unit 64 and a flight controller 65.
The obstacle collision determination unit 64 determines the possibility of collision of the moving body 10 with an obstacle on the basis of the normal vector from the normal vector correction unit 56, the parallax map (point cloud data) from the parallax estimation unit 57, and the velocity vector from the self-position estimation unit 63.
The obstacle collision determination unit 64 includes a division unit 64a, a calculation unit 64b, and a setting unit 64c.
The division unit 64a divides the spatial region in the traveling direction of the moving body 10 into small regions continuous in the traveling direction. Hereinafter, the divided small region is referred to as a collision determination region. The calculation unit 64b calculates a collision risk with an obstacle for each collision determination region divided by the division unit 64a on the basis of the normal vector, the point cloud data, and the velocity vector. The setting unit 64c sets an obstacle region where an obstacle is likely to exist on the basis of the collision risk for each collision determination region calculated by the calculation unit 64b.
Then, the obstacle collision determination unit 64 generates control information for controlling the movement of the own device on the basis of the distance from the own device (moving body 10) to the obstacle region. That is, the obstacle collision determination unit 64 has a function as a control information generation unit that generates control information for controlling the movement of the moving body 10 on the basis of the collision risk.
The flight controller 65 corresponds to the movement control unit 23 in
Note that the obstacle collision determination unit 64 can generate control information or the flight controller 65 can control the movement of the moving body 10 on the basis of a control signal input from a controller for operating the own device (moving body 10). The controller can not only input a control signal for controlling the movement of the moving body 10 in real time, but also input, for example, a destination, a moving route, and the like as the control signal. In this case, the flight controller 65 controls the movement of the moving body 10 so as to autonomously move the moving body 10 on the basis of the destination or the moving route input as the control signal.
The flow of the obstacle detection processing by the moving body 10 in
In step S11, the polarization cameras 30-1 and 30-2 (polarization image sensors 51-1 and 51-2) start capturing polarized images.
In step S12, the normal vector estimation unit 52 estimates a normal vector on the basis of the polarized image captured by the polarization camera 30-1 or the polarization camera 30-2.
In step S13, the luminance image construction units 53-1 and 53-2 construct luminance images from the two-view polarized images captured by the polarization cameras 30-1 and 30-2, respectively.
In step S14, the parallelization processing units 54-1 and 54-2 perform parallelization processing on the two-view luminance images constructed by the luminance image construction units 53-1 and 53-2, respectively.
In step S15, the normal vector correction unit 56 corrects the normal vector in accordance with the parallelization processing by the parallelization processing units 54-1 and 54-2.
In step S16, the parallax estimation unit 57 estimates the parallax from the luminance image after the parallelization processing.
In step S17, the self-position estimation unit 63 calculates the self-position and the velocity vector of the own device (moving body 10) on the basis of the data obtained by each of the visual odometry unit 58, the GPS sensor 59, the IMU 60, the barometer 61, and the geomagnetic sensor 62.
In step S18 of
The division of the collision determination region will be described with reference to
For example, the division unit 64a divides the spatial region Sa through which the moving body 10 passes into the collision determination regions Cd on the basis of the distance Dr according to the distance resolution in the optical axis direction Ax of the distance measuring sensor (the polarization cameras 30-1 and 30-2 as stereo cameras). Since the distance resolution of the distance measuring sensor becomes coarser as the distance from the own device increases, the distance Dr increases with the distance from the own device. The length of the collision determination region Cd in the direction of the velocity vector v (depth direction) is set according to the distance Dr. That is, the length of the collision determination region Cd in the direction of the velocity vector v increases as the distance from the own device increases.
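A sketch of this division is shown below, assuming that the stereo depth resolution follows dZ ≈ Z²·Δd/(f·B); the focal length, baseline, and disparity step are illustrative parameters, not values taken from the disclosure.

```python
import numpy as np

def split_regions(z_near, z_far, f_px, baseline, disp_step=1.0):
    """Split the spatial region ahead into collision determination regions whose
    depth extent follows the stereo distance resolution dZ ~ Z^2 * disp_step / (f * B)."""
    bounds = [z_near]
    while bounds[-1] < z_far:
        z = bounds[-1]
        dr = (z ** 2) * disp_step / (f_px * baseline)   # distance resolution at depth z
        bounds.append(z + max(dr, 0.01))                # regions get longer with distance
    return np.array(bounds)   # consecutive pairs delimit each region Cd along v
```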
Returning to the flowchart of
In step S20, the calculation unit 64b calculates the collision risk for each collision determination region on the basis of the normal vector for the extracted point cloud data.
The calculation of the collision risk for each collision determination region will be described with reference to
In
Here, assuming that the number of point cloud data i included in the k-th collision determination region Cd is N, the collision risk Rk of the k-th collision determination region Cd is expressed by the following Equation (1).
The Rarea in Equation (1) is a value proportional to the area of the real space corresponding to each pixel of the point cloud data i. As illustrated in
In Equation (2), the product of d/fx and d/fy is proportional to the area of the real space corresponding to one pixel of the point cloud data, and ωarea is a fixed weight parameter.
The Rcount in Equation (1) is a value (weight) uniformly set for each point cloud data, and is expressed by the following Equation (3).
In Equation (3), ωcount is the total number of point cloud data included in the collision determination region Cd, and is a value for preventing the above-described Rarea, which is proportional to the area of the real space, from becoming too small in a case where there is an obstacle at a close distance.
Rnormal in Equation (1) is a value (gain) calculated on the basis of the normal vector n for each point cloud data, and is expressed by the following Equation (4) using an inner product of the velocity vector v and the normal vector n of each point cloud data.
In Equation (4), ωnormal is a fixed weight parameter. The absolute value of the inner product of the velocity vector v and the normal vector n of each point cloud data increases as the traveling direction of the moving body 10 and the surface of the subject face each other more directly. Meanwhile, even in a case where the moving body 10 is flying at a low altitude near the ground, the traveling direction of the moving body 10 is substantially orthogonal to the normal direction of the ground surface, so the absolute value of the inner product of the velocity vector v and the normal vector n of each point cloud data remains small.
Note that both the velocity vector v and the normal vector n of each point cloud data are expressed in a coordinate system fixed to the moving body 10.
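Since Equations (1) to (4) themselves are not reproduced above, the following sketch only illustrates one plausible way of combining the terms Rarea, Rcount, and Rnormal described here; the actual composition of Equation (1) and the handling of ωcount may differ.

```python
import numpy as np

def collision_risk(normals, depths, v, fx, fy,
                   w_area=1.0, w_count=1.0, w_normal=1.0):
    """Hedged reconstruction of the per-region collision risk Rk.

    normals, depths : normal vectors n and depths d of the N points in the region
    v               : velocity vector of the moving body (body-fixed frame)
    Assumed combination: Rk = sum_i (Rarea_i + Rcount) * Rnormal_i.
    """
    v_unit = v / np.linalg.norm(v)
    r_area = w_area * (depths / fx) * (depths / fy)     # cf. Eq.(2): real-space area per pixel
    r_count = w_count / max(len(depths), 1)             # cf. Eq.(3): uniform per-point weight
    r_normal = w_normal * np.abs(normals @ v_unit)      # cf. Eq.(4): |v . n| gain per point
    return float(np.sum((r_area + r_count) * r_normal))
```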
When the collision risk for each collision determination region is calculated as described above, in step S21, the setting unit 64c sets an obstacle region where an obstacle is likely to exist on the basis of the calculated collision risk for each collision determination region.
Setting of the obstacle region will be described with reference to
In
The setting unit 64c sets, as the obstacle region Ob, a collision determination region Cd in which the collision risk Rk is higher than a predetermined threshold Rth and which is closest to the own device (moving body 10). In the example of
Returning to the flowchart of
In step S23, the obstacle collision determination unit 64 determines whether or not the distance to the obstacle region is shorter than the calculated stoppable distance. In a case where it is determined that the distance to the obstacle region is shorter than the stoppable distance, the process proceeds to step S24.
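As an illustration, the stoppable distance can be computed from the current speed, a maximum deceleration, and a control latency; the formula and the values below are assumptions, not figures from the disclosure.

```python
def stoppable_distance(speed, a_max, latency=0.1):
    """Distance needed to stop from the current speed, assuming a constant
    maximum deceleration a_max and a control latency (both assumed values)."""
    return speed * latency + speed ** 2 / (2.0 * a_max)

# Deceleration is triggered when the nearest obstacle region is closer than this,
# e.g. stoppable_distance(10.0, 5.0) -> 11.0 m at 10 m/s with 5 m/s^2 deceleration.
```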
For example, in the example of
In step S24, the flight controller 65 decelerates the moving body 10 on the basis of the control information generated by the obstacle collision determination unit 64. Here, control information for decelerating the moving body 10 to a speed at which it can stop before the obstacle is generated. However, control information for stopping the moving body 10 before the obstacle may be generated instead.
Meanwhile, in the example of
Thereafter, the process returns to step S12, and the processes of steps S12 to S24 are repeated.
According to the above processing, the movement of the own device is controlled using the normal vector estimated on the basis of the polarized image obtained by imaging the traveling direction of the own device, and thus, it is possible to suppress erroneous determination in obstacle detection even in a case where the moving body 10 is flying at a low altitude. As a result, the user can perform a more natural manual operation.
In the second embodiment, as illustrated in
A moving body 10A illustrated in
The representative normal vector calculation unit 64d calculates a normal vector representative of the obstacle region (hereinafter referred to as a representative normal vector) on the basis of the normal vectors for the point cloud data included in the obstacle region set by the setting unit 64c.
The angular acceleration calculation unit 64e predicts a velocity vector when the own device (moving body 10A) reaches the obstacle region, and calculates angular acceleration such that the velocity vector (predicted velocity vector) and the representative normal vector calculated by the representative normal vector calculation unit 64d are orthogonal to each other.
As a result, the obstacle collision determination unit 64 can generate control information for correcting the trajectory of the own device such that the predicted velocity vector and the representative normal vector are orthogonal to each other.
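A sketch of this angular acceleration calculation is shown below under a constant angular acceleration assumption; the representative normal Rn, the predicted velocity vector, and the distance to the obstacle region are taken as given, and the constant-acceleration model is an assumption.

```python
import numpy as np

def correction_angular_acceleration(v_pred, rn, distance):
    """Angular acceleration that rotates the predicted velocity vector so that it
    becomes orthogonal to the representative normal Rn by the time the obstacle
    region is reached (constant angular acceleration from rest, a sketch)."""
    speed = np.linalg.norm(v_pred)
    t = distance / speed                              # time until the region is reached
    cos_a = np.dot(v_pred, rn) / (speed * np.linalg.norm(rn))
    angle_err = abs(np.pi / 2.0 - np.arccos(np.clip(cos_a, -1.0, 1.0)))
    alpha = 2.0 * angle_err / t ** 2                  # rotate by angle_err within time t
    return alpha   # compared against a threshold before the trajectory is corrected
```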
The flow of the obstacle detection processing by the moving body 10A of
Note that the processing up to step S23 in the flowchart of
That is, in a case where it is determined in step S23 that the distance to the obstacle region is shorter than the stoppable distance, the process proceeds to step S50. In step S50, the obstacle collision determination unit 64 executes trajectory correction processing.
In step S51, the representative normal vector calculation unit 64d of the obstacle collision determination unit 64 calculates a representative normal vector of the obstacle region.
The calculation of the representative normal vector will be described with reference to
A of
B of the drawing illustrates a distribution of the normal vectors n in the obstacle region Ob. The representative normal vector calculation unit 64d analyzes the distribution of the normal vectors n, and determines the vector in the most dominant direction in the obstacle region Ob as the representative normal vector Rn illustrated in C of the drawing.
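One possible way to determine the most dominant direction is sketched below using the principal eigenvector of the normal distribution; the actual analysis method (for example, a histogram or clustering) is not specified in the description above.

```python
import numpy as np

def representative_normal(normals):
    """Most dominant normal direction among the normals (N, 3) in the obstacle region."""
    m = normals.T @ normals                     # 3x3 scatter of the normal vectors
    w, v = np.linalg.eigh(m)
    rn = v[:, np.argmax(w)]                     # eigenvector of the largest eigenvalue
    # Orient it against the mean normal so the sign is consistent.
    if np.dot(rn, normals.mean(axis=0)) < 0:
        rn = -rn
    return rn / np.linalg.norm(rn)
```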
When the representative normal vector of the obstacle region is calculated as described above, in step S52, the angular acceleration calculation unit 64e calculates the angular acceleration at which the predicted velocity vector when the own device (moving body 10A) reaches the obstacle region and the representative normal vector are orthogonal to each other.
Specifically, as illustrated in
In step S53, the obstacle collision determination unit 64 determines whether or not the angular acceleration calculated by the angular acceleration calculation unit 64e exceeds a predetermined value. In a case where it is determined that the calculated angular acceleration does not exceed the predetermined value, the process proceeds to step S54.
In step S54, the obstacle collision determination unit 64 determines whether or not point cloud data corresponding to an object having a possibility of collision exists on the trajectory after correction (corrected trajectory). In a case where it is determined that the point cloud data corresponding to the object having a possibility of collision does not exist on the corrected trajectory, the process proceeds to step S55. At this time, the obstacle collision determination unit 64 outputs the angular acceleration calculated by the angular acceleration calculation unit 64e to the flight controller 65 as control information for correcting the trajectory of the own device.
In step S55, the flight controller 65 corrects the trajectory of the own device by controlling the posture of the moving body 10A on the basis of the angular acceleration output as the control information by the obstacle collision determination unit 64. Thereafter, the process returns to step S12 (
Meanwhile, in a case where it is determined in step S53 that the calculated angular acceleration exceeds the predetermined value, or in a case where it is determined in step S54 that the point cloud data corresponding to the object having a possibility of collision exists on the corrected trajectory, the process proceeds to step S56. At this time, the obstacle collision determination unit 64 generates control information for decelerating the moving body 10A to a speed at which the moving body 10A can stop before the obstacle, and outputs the control information to the flight controller 65.
In step S56, the flight controller 65 decelerates the moving body 10A on the basis of the control information generated by the obstacle collision determination unit 64. Thereafter, the process returns to step S12 (
According to the above processing, since the trajectory of the own device is corrected using the normal vector estimated on the basis of the polarized image obtained by imaging the traveling direction of the own device, it is possible to avoid collision with an obstacle even in a case where the moving body 10A is flying at a high speed.
A first person view (FPV) camera having a gimbal mechanism is mounted on a drone, and an image captured by the FPV camera is transmitted to a controller operated by a user, so that the user can remotely operate the drone while viewing the image. However, in the image captured by the FPV camera, it may be difficult to visually recognize the unevenness of the surrounding environment in which the drone flies.
In the third embodiment, a normal vector image generated on the basis of the estimated normal vector is superimposed on the image captured by the FPV camera, thereby assisting the user in remotely operating the moving body 10.
A moving body 10B illustrated in
The FPV camera 100 has a gimbal mechanism and can capture images at various angles. The RGB image sensor 111 included in the FPV camera 100 is configured by arranging R, G, and B color filters on pixels in a Bayer array, for example, and captures RGB images.
On the basis of the RGB image captured by the RGB image sensor 111, the posture estimation unit 112 estimates the current posture of the FPV camera 100 with reference to the origin position of the camera coordinate system of the FPV camera 100.
The coordinate conversion unit 113 converts the coordinate system of the normal vector map including the normal vector for each pixel position of the polarized image into the camera coordinate system of the FPV camera 100. For the coordinate conversion of the normal vector map, the parallax map from the parallax estimation unit 57, the self-position from the self-position estimation unit 63, the posture information of the polarization cameras 30-1 and 30-2, the relative position of the FPV camera 100 with respect to the polarization cameras 30-1 and 30-2, and the current posture information are used.
The normal vector image generation unit 114 generates a normal vector image colored according to the direction of the normal vector on the basis of the coordinate-converted normal vector map. That is, the normal vector image converted into the camera coordinate system of the FPV camera 100 is generated.
The superimposition unit 115 generates a superimposed image in which the normal vector image generated by the normal vector image generation unit 114 is superimposed on the RGB image captured by the RGB image sensor 111.
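A sketch of the coloring and the superimposition is shown below, assuming that the coordinate-converted normal map holds unit normals with components in [-1, 1] and that simple alpha blending is used; the color mapping and blend ratio are assumptions rather than the scheme actually used.

```python
import cv2
import numpy as np

def superimpose_normal_image(rgb, normal_map, alpha=0.5):
    """Color each pixel by its normal direction and alpha-blend the result onto
    the RGB image of the FPV camera (both assumed to be 3-channel images)."""
    color = ((normal_map + 1.0) * 0.5 * 255.0).astype(np.uint8)   # map [-1, 1] to [0, 255]
    color = cv2.resize(color, (rgb.shape[1], rgb.shape[0]))       # align to the RGB image size
    return cv2.addWeighted(rgb, 1.0 - alpha, color, alpha, 0.0)
```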
The transmission unit 116 corresponds to the communication unit 21 in
In step S111, the FPV camera 100 (the RGB image sensor 111) captures an RGB image.
In step S112, the posture estimation unit 112 estimates the current posture of the FPV camera 100 with reference to the origin position of the camera coordinate system of the FPV camera 100 on the basis of the captured RGB image.
In step S113, the coordinate conversion unit 113 converts the coordinate system of the normal vector map into the camera coordinate system of the FPV camera 100.
In step S114, the normal vector image generation unit 114 generates a normal vector image colored according to the direction of the normal vector on the basis of the coordinate-converted normal vector map.
In step S115, the superimposition unit 115 superimposes the normal vector image generated by the normal vector image generation unit 114 on the RGB image captured by the RGB image sensor 111.
Generation of a superimposed image will be described with reference to
In a case where the RGB image 130 illustrated in
Therefore, the normal vector image generation unit 114 generates the normal vector image 140 colored according to the direction of the normal vector on the basis of the normal vector map, and the superimposition unit 115 generates the superimposed image 150 in which the normal vector image 140 is superimposed on the RGB image 130.
As a result, an image in which the unevenness of the subject is easily visually recognized is obtained.
Returning to the flowchart of
According to the above processing, an image in which the unevenness of the subject can be easily visually recognized is obtained, which assists the user in remotely operating the moving body 10B.
Note that, in the present embodiment, only the normal vector image may be transmitted to the controller without providing the FPV camera 100.
Hereinafter, modifications of the above-described embodiments will be described.
In the above-described embodiments, the sensor 20 that realizes the estimation of the normal vector and the distance measurement is configured by the polarization image sensors 51-1 and 51-2 constituting the two-view stereo camera.
Alternatively, as illustrated in A of
Similarly, as illustrated in B of
In any configuration, the sensor 20 can realize estimation of a normal vector and distance measurement.
Furthermore, in the above-described embodiments, the normal vector is estimated on the basis of the polarized image acquired by the polarization image sensor.
In addition to this, the normal vector can be estimated on the basis of sensor data in which the traveling direction of the own device (moving body 10) is sensed by a predetermined sensor. For example, it is possible to estimate the normal vector on the basis of data obtained by performing predetermined processing on depth information acquired by a general stereo camera or a distance measuring device such as LiDAR.
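For example, normals can be estimated from such depth information by taking the cross product of local surface tangents of the 3-D point map; the following is a minimal sketch of that approach, and is only one example of the predetermined processing mentioned above.

```python
import numpy as np

def normals_from_points(points):
    """Per-pixel normal estimation from a 3-D point map (H, W, 3) obtained from
    a stereo camera or a distance measuring device such as LiDAR."""
    du = np.gradient(points, axis=1)            # surface tangent along image columns
    dv = np.gradient(points, axis=0)            # surface tangent along image rows
    n = np.cross(du, dv)                        # normal as the cross product of tangents
    norm = np.linalg.norm(n, axis=-1, keepdims=True)
    return n / np.maximum(norm, 1e-9)
```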
In the above-described embodiments, the components from the configuration for estimating the normal vector (the normal vector estimation unit 52) to the configuration for determining collision with an obstacle (the obstacle collision determination unit 64) are realized by the control unit 22 in
For example, in a case where the movement of the moving body 310 is controlled by the information processing apparatus 320 configured on a cloud, these configurations may be realized by the control unit 331 included in the information processing apparatus 320, as in the moving body control system illustrated in
Also in the above configuration, it is possible to suppress erroneous determination in obstacle detection in a case where the moving body 310 is flying at a low altitude.
The moving body 10 to which the technology according to the present disclosure is applied has been described as exhibiting the effect in a case where the moving body is flying at a low altitude near the ground. However, the moving body 10 to which the technology according to the present disclosure is applied can also exhibit the effect in a case where the moving body is moving along a wall surface, for example.
The above-described series of processing can be executed by hardware or software. In a case where the series of processing is executed by software, a program constituting the software is installed in a computer. Here, examples of the computer include a computer incorporated in dedicated hardware, and for example, a general-purpose personal computer capable of executing various functions by installing various programs.
In the computer, a CPU 501, a read only memory (ROM) 502, and a random access memory (RAM) 503 are mutually connected by a bus 504.
An input/output interface 505 is further connected to the bus 504. An input unit 506, an output unit 507, a storage unit 508, a communication unit 509, and a drive 510 are connected to the input/output interface 505.
The input unit 506 includes a keyboard, a mouse, a microphone, and the like. The output unit 507 includes a display, a speaker, and the like. The storage unit 508 includes a hard disk, a non-volatile memory and the like. The communication unit 509 includes, for example, a network interface and the like. The drive 510 drives a removable medium 511 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
In the computer configured as described above, for example, the CPU 501 loads a program stored in the storage unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executes the program, whereby the above-described series of processing is performed.
The program executed by the computer (CPU 501) can be provided by being recorded on, for example, a removable medium 511 as a package medium or the like. Furthermore, the program can be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
In the computer, the program can be installed in the storage unit 508 via the input/output interface 505 by mounting the removable medium 511 to the drive 510. Furthermore, the program can be received by the communication unit 509 via a wired or wireless transmission medium and installed in the storage unit 508. In addition, the program can be installed in the ROM 502 or the storage unit 508 in advance.
Note that the program executed by the computer may be a program in which processing is performed in time series in the order described in the present specification, or may be a program in which processing is performed in parallel or at a necessary timing such as when a call is made.
The embodiments of the present disclosure are not limited to the above-described embodiments, and various modifications can be made without departing from the gist of the present disclosure.
The effects described in the present specification are merely examples and are not limited, and other effects may be provided.
Moreover, the technology according to the present disclosure can have the following configurations.
A moving body including:
The moving body according to (1), further including
The moving body according to (2),
The moving body according to (3), further including
The moving body according to (4),
The moving body according to (4) or (5),
The moving body according to (6),
The moving body according to any one of (4) to (7), further including
The moving body according to (8),
The moving body according to (8) or (9), further including
The moving body according to (10),
The moving body according to (11),
The moving body according to (12),
The moving body according to any one of (1) to (13), further including:
The moving body according to (14),
The moving body according to (14) or (15),
The moving body according to any one of (14) to (16), further including
The moving body according to any one of (1) to (17),
A movement control method including:
A program for causing a computer to execute processing of:
Number | Date | Country | Kind
---|---|---|---
2021-079528 | May 2021 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2022/003738 | 2/1/2022 | WO |