The present application claims priority to and incorporates by reference the entire contents of Japanese Patent Application No. 2021-214918 filed in Japan on Dec. 28, 2021.
The present disclosure relates to a control method for a mobile object, a mobile object, and a computer-readable storage medium.
For example, there is known a technique of causing a mobile object such as a forklift to automatically move. Normally, such a mobile object moves while successively detecting a self-position. Patent Literature 1 discloses a pallet conveyance vehicle that conveys a pallet while recognizing a self-position by detecting a sign disposed on a floor or a ceiling of a workplace.
Patent Literature 1: Japanese Patent Application Laid-open No. H1-302408
Herein, a mobile object may approach a parked transport vehicle at the time of unloading or loading an object from/onto the transport vehicle. The transport vehicle is parked in a parking region set in advance, but the mobile object cannot appropriately approach the transport vehicle in some cases due to a shift of a parking position in the parking region.
The present disclosure is intended to solve the problem described above, and an object of the present disclosure is to provide a control method for a mobile object, a mobile object, and a computer-readable storage medium that can appropriately approach a transport vehicle parked in a parking region.
To solve the above problem and achieve the object, a control method according to the present disclosure is for a mobile object that automatically moves. The control method includes: acquiring positional information on a transport vehicle parked in a parking region, the positional information including information on a position of a rear end portion of the transport vehicle, information on an attitude of the transport vehicle, and information on a length of the transport vehicle; setting a first path toward the transport vehicle based on the positional information on the transport vehicle; and causing the mobile object to move along the first path.
To solve the above problem and achieve the object, a mobile object according to the present disclosure automatically moves, and includes: a first path acquisition unit configured to acquire a first path toward a transport vehicle parked in a parking region, the first path being set based on positional information on the transport vehicle including information on a position of a rear end portion of the transport vehicle, information on an attitude of the transport vehicle, and information on a length of the transport vehicle; and a movement control unit configured to cause the mobile object to move along the first path.
To solve the above problem and achieve the object, a non-transitory computer-readable storage medium according to the present disclosure stores a computer program for causing a computer to perform a control method for a mobile object that automatically moves. The computer program includes: acquiring a first path toward a transport vehicle parked in a parking region, the first path being set based on positional information on the transport vehicle including information on a position of a rear end portion of the transport vehicle, information on an attitude of the transport vehicle, and information on a length of the transport vehicle; and causing the mobile object to move along the first path.
According to the present disclosure, it is possible to appropriately approach a transport vehicle parked in a parking region.
The following describes a preferred embodiment of the present disclosure in detail with reference to the attached drawings. The present disclosure is not limited to the embodiment. In a case in which there are a plurality of embodiments, the present disclosure encompasses a combination of the embodiments.
Entire Configuration of Movement Control System
Transport Vehicle
The following describes the transport vehicle V more specifically.
Parking Region AR0
The following describes the parking region AR0 illustrated in
The parking region AR0 is set in advance as a region in which the transport vehicle V should be parked. That is, a position (coordinates), a shape, and a size of the parking region AR0 are set in advance, and the parking region AR0 may be marked out by a white line and the like, for example. The parking region AR0 is preferably set to have a shape and a size so that orientation of the transport vehicle V parked in the parking region AR0 can be defined. In the present embodiment, the parking region AR0 is set to have a shape and a size so that, in a case in which the transport vehicle V is parked in the parking region AR0, a direction from a rear end toward a front end of the transport vehicle V is oriented toward the direction Y side. For example, in the example of
In the storage chamber VA of the transport vehicle V, a plurality of the target objects P are disposed along the front and rear direction of the transport vehicle V (direction from the rear end toward the front end of the transport vehicle V). Thus, in the parking region AR0, the target objects P are disposed side by side in the direction Y in the storage chamber VA. In the present embodiment, in the storage chamber VA, the target objects P are disposed side by side also in the right and left direction of the transport vehicle V (in the X-direction in the parking region AR0). In the example of
Mobile Object
The mobile object 10 is a device that can automatically move. In the present embodiment, the mobile object 10 is a forklift, more specifically, what is called an Automated Guided Vehicle (AGV) or an Automated Guided Forklift (AGF). As exemplified in
Each of the sensors 26 detects at least one of a position and an attitude of an object that is present around the vehicle body 20. It can also be said that the sensor 26 detects a position of the object with respect to the mobile object 10 and the attitude of the object with respect to the mobile object 10. In the present embodiment, the sensors 26 are disposed on side surfaces of the mast 22 and on a rear direction side of the vehicle body 20. However, the positions at which the sensors 26 are disposed are not limited thereto. The sensors 26 may be disposed at optional positions, and the number of the sensors 26 to be disposed may also be optional. For example, a safety sensor installed on the mobile object 10 may also be used as the sensor 26. By using the safety sensor as the sensor 26, a new sensor is not required to be disposed.
The sensor 26 is, for example, a sensor that emits laser light. The sensor 26 emits laser light while performing scanning in one direction (herein, a lateral direction), and detects the position and orientation of the object from reflected light of the emitted laser light. That is, the sensor 26 can be assumed to be what is called a two-dimensional (2D) Light Detection and Ranging (LiDAR) sensor. However, the sensor 26 is not limited thereto, and may be a sensor that detects the object using an optional method. For example, the sensor 26 may be what is called a three-dimensional (3D) LiDAR that scans in a plurality of directions, or may be a camera.
The control device 28 controls movement of the mobile object 10. The control device 28 will be described later.
Management System
The communication unit 30 is a module that is used by the control unit 34 to communicate with an external device such as the information processing device 14, and may include an antenna and the like, for example. A communication scheme used by the communication unit 30 is wireless communication in the present embodiment, but the communication scheme may be optional. The storage unit 32 is a memory that stores various kinds of information such as a computer program or arithmetic content of the control unit 34, and includes at least one of a random access memory (RAM), a main storage device such as a read only memory (ROM), and an external storage device such as a hard disk drive (HDD), for example.
The control unit 34 is an arithmetic device, and includes an arithmetic circuit such as a central processing unit (CPU), for example. The control unit 34 includes a work determination unit 36. The control unit 34 implements the work determination unit 36 by reading out, from the storage unit 32, and executing a computer program (software), and performs processing thereof. The control unit 34 may perform the processing with one CPU, or may include a plurality of CPUs and perform the processing with the CPUs. The work determination unit 36 may be implemented with a hardware circuit. A computer program for the control unit 34 stored in the storage unit 32 may be stored in a (non-transitory) computer-readable storage medium that can be read by the management system 12.
The work determination unit 36 determines the target object P as an object to be conveyed. Specifically, the work determination unit 36 determines work content indicating information on the target object P as the object to be conveyed based on an input work plan, for example. It can also be said that the work content is information for specifying the target object P as the object to be conveyed. In the example of the present embodiment, as the work content, it is determined which target object P in which facility is to be conveyed, to where, and by when. That is, the work content is information indicating the facility in which the target object P to be conveyed is kept, the target object P to be conveyed, a conveyance destination of the target object P, and a conveyance timing of the target object P. The work determination unit 36 transmits the determined work content to the information processing device 14 via the communication unit 30. The work determination unit 36 is not an essential configuration in the present embodiment.
Information Processing Device
The control unit 44 is an arithmetic device, and includes an arithmetic circuit such as a CPU, for example. The control unit 44 includes a work content acquisition unit 50, a mobile object selection unit 52, a vehicle information acquisition unit 54, a relative target position acquisition unit 56, and a first path acquisition unit 58. The control unit 44 implements the work content acquisition unit 50, the mobile object selection unit 52, the vehicle information acquisition unit 54, the relative target position acquisition unit 56, and the first path acquisition unit 58 by reading out, from the storage unit 42, and executing a computer program (software), and performs processing thereof. The control unit 44 may perform the processing with one CPU, or may include a plurality of CPUs and perform the processing with the CPUs. At least part of the work content acquisition unit 50, the mobile object selection unit 52, the vehicle information acquisition unit 54, the relative target position acquisition unit 56, and the first path acquisition unit 58 may be implemented with a hardware circuit. A computer program for the control unit 44 stored in the storage unit 42 may be stored in a (non-transitory) computer-readable storage medium that can be read by the information processing device 14.
Work Content Acquisition Unit and Mobile Object Selection Unit
The work content acquisition unit 50 acquires information on the work content determined by the management system 12, that is, information on the target object P to be conveyed. The work content acquisition unit 50 specifies the parking region AR0 of the transport vehicle V on which the target object P is mounted based on the information on the target object P in the work content. For example, the storage unit 42 stores the target object P, the transport vehicle V on which the target object P is mounted, and the parking region AR0 of the transport vehicle V in association with each other, and the work content acquisition unit 50 specifies the parking region AR0 by reading out the information from the storage unit 42. The mobile object selection unit 52 selects the mobile object 10 to be operated. For example, the mobile object selection unit 52 selects the mobile object 10 to be operated from among a plurality of the mobile objects belonging to the facility W. The mobile object selection unit 52 may select the mobile object 10 to be operated, using an optional method. For example, based on the parking region AR0 specified by the work content acquisition unit 50, the mobile object selection unit 52 may select the mobile object 10 suitable for conveyance of the target object P in the parking region AR0 as the mobile object 10 to be operated. The work content acquisition unit 50 and the mobile object selection unit 52 are not essential configurations in the present embodiment.
Vehicle Information Acquisition Unit
The vehicle information acquisition unit 54 acquires positional information on the transport vehicle V. As the positional information on the transport vehicle V, the vehicle information acquisition unit 54 acquires information on a position of a rear end portion Vb of the transport vehicle V parked in the parking region AR0, information on an attitude of the transport vehicle V parked in the parking region AR0, and information on a length of the transport vehicle V. The position of the rear end portion Vb of the transport vehicle V parked in the parking region AR0 is a position (coordinates) of the rear end portion Vb in a coordinate system on a two-dimensional surface on the region AR (coordinate system of the region AR). The attitude of the transport vehicle V parked in the parking region AR0 is orientation of the transport vehicle V in the coordinate system of the region AR, and is assumed to be a yaw angle (rotation angle) of the transport vehicle V assuming that the X-direction is 0° when viewed from the direction Z orthogonal to the direction X and the direction Y. The position in the present embodiment may indicate coordinates in the coordinate system of the facility W unless otherwise specified. Similarly, the attitude in the present embodiment may indicate a yaw angle assuming that the X-direction is 0° when viewed from the direction Z unless otherwise specified. The length of the transport vehicle V indicates the length in the front and rear direction of the transport vehicle V. In the present embodiment, the length of the transport vehicle V indicates a distance in the front and rear direction (distance VL in
The vehicle information acquisition unit 54 may acquire the position of the rear end portion Vb of the transport vehicle V parked in the parking region AR0 and the attitude of the transport vehicle V parked in the parking region AR0 using an optional method. In the present embodiment, as illustrated in
For example, in a case in which the sensor S1 is configured to emit laser light, the vehicle information acquisition unit 54 causes the sensor S1 to perform scanning in the lateral direction (horizontal direction) while causing the sensor S1 to emit laser light toward the parking region AR0 side. The laser light from the sensor S1 is incident on the rear end portion Vb of the transport vehicle V parked in the parking region AR0, and is reflected by the rear end portion Vb. The sensor S1 receives reflected light from the rear end portion Vb. The vehicle information acquisition unit 54 acquires a point group as a set of measuring points based on a detection result of the reflected light received by the sensor S1. The measuring point is a point indicating a position (coordinates) at which the laser light is reflected, and the point group indicates a set of such points. In the present embodiment, based on the detection result of the reflected light, the vehicle information acquisition unit 54 calculates, as the measuring point, the position (coordinates) of the point at which the laser light is reflected. The vehicle information acquisition unit 54 extracts a straight line by using, for example, a RANSAC algorithm based on the measuring points (point group), and calculates a position of the straight line as the position of the rear end portion Vb of the transport vehicle V. The vehicle information acquisition unit 54 then assumes the attitude of the straight line as the attitude of the rear end portion Vb, and calculates the attitude of the transport vehicle V based on the attitude of the rear end portion Vb. For example, the vehicle information acquisition unit 54 may calculate a direction orthogonal to the attitude of the rear end portion Vb as the attitude of the transport vehicle V.
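As a rough illustration of this step, the following is a minimal sketch of extracting a straight line from the measuring points with RANSAC and taking a direction orthogonal to it as the vehicle attitude. It is not the implementation of the present disclosure; the function names, the inlier threshold, the iteration count, and the sign convention of the orthogonal direction are all assumptions.

```python
import math
import random

def ransac_line(points, iterations=200, threshold=0.05, seed=0):
    """Fit a 2D line to measuring points (x, y) with RANSAC.
    Returns (point_on_line, unit_direction, inliers)."""
    rng = random.Random(seed)
    best_inliers, best_model = [], None
    for _ in range(iterations):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        dx, dy = x2 - x1, y2 - y1
        norm = math.hypot(dx, dy)
        if norm == 0:
            continue
        ux, uy = dx / norm, dy / norm
        # Keep points whose perpendicular distance to the candidate
        # line is below the threshold.
        inliers = [(px, py) for (px, py) in points
                   if abs((px - x1) * uy - (py - y1) * ux) < threshold]
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, ((x1, y1), (ux, uy))
    return best_model[0], best_model[1], best_inliers

def vehicle_attitude_from_rear_end(direction):
    """Yaw angle (degrees, X-direction = 0) orthogonal to the rear-end
    line; the sign of the orthogonal direction is an assumed convention."""
    ux, uy = direction
    return math.degrees(math.atan2(ux, -uy))
```

With measuring points lying along the rear end portion Vb, the extracted line direction runs along the rear end, and the orthogonal direction gives a candidate for the attitude of the transport vehicle V.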
However, a calculation method for the position of the rear end portion Vb of the transport vehicle V and the attitude of the transport vehicle V based on the detection result of the sensor S1 is not limited to the above description, and may be optional. For example, in a case of using a camera as the sensor S1, the position of the rear end portion Vb of the transport vehicle V and the attitude of the transport vehicle V may be calculated by causing the camera to image the rear end portion Vb and performing image analysis based on image data in which the rear end portion Vb is reflected. For example, a sensor for detecting the position of the rear end portion Vb of the transport vehicle V and a sensor for detecting the attitude of the transport vehicle V may be respectively disposed, and the position of the rear end portion Vb of the transport vehicle V and the attitude of the transport vehicle V may be calculated based on detection results obtained by the respective sensors.
The vehicle information acquisition unit 54 may acquire the length in the front and rear direction of the transport vehicle V using an optional method. In the present embodiment, the vehicle information acquisition unit 54 acquires vehicle type information indicating a vehicle type of the transport vehicle V, and acquires information on the length of the transport vehicle V based on the vehicle type information. Specifically, in the present embodiment, as illustrated in
In the present embodiment, the sensor S2 is a camera, and the point indicating the vehicle type of the transport vehicle V is a number plate disposed on the transport vehicle V. In the present embodiment, the vehicle information acquisition unit 54 causes the sensor S2 to image the number plate of the transport vehicle V, and performs image analysis on a taken image of the number plate to acquire the vehicle type information on the transport vehicle V. In this case, for example, the vehicle information acquisition unit 54 acquires, as the vehicle type information on the transport vehicle V, an identifier specific to the vehicle (for example, a number, a symbol, a character, or a combination thereof) printed on the number plate of the transport vehicle V. The vehicle information acquisition unit 54 then reads out relation information indicating a relation between the vehicle type information and the length of the vehicle, and acquires information on the length of the transport vehicle V based on the relation information and the acquired vehicle type information on the transport vehicle V. The relation information is a database indicating a correspondence relation between the vehicle type information and the length of the vehicle. The vehicle information acquisition unit 54 acquires, as the information on the length of the transport vehicle V, a length of the vehicle associated with the acquired vehicle type information on the transport vehicle V in the relation information. The relation information is set in advance. The vehicle information acquisition unit 54 may read out the relation information from the storage unit 42, or may read out the relation information via the communication unit 40. In this way, in the present embodiment, the vehicle type information is acquired by imaging the number plate with the camera as the sensor S2, but the method of acquiring the vehicle type information is not limited thereto. 
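The relation information described above can be pictured as a simple lookup table from vehicle type information to vehicle length. The sketch below assumes illustrative identifiers and lengths; `RELATION_INFO` and `acquire_vehicle_length` are names invented for this example, and the image analysis of the number plate itself is outside its scope.

```python
# Relation information: vehicle type identifier -> vehicle length (metres).
# Both the identifiers and the lengths are illustrative assumptions.
RELATION_INFO = {
    "TYPE-10T": 12.0,
    "TYPE-4T": 8.4,
}

def acquire_vehicle_length(vehicle_type_id, relation_info=RELATION_INFO):
    """Look up the length of the transport vehicle V from the vehicle
    type information read off the number plate (the image-analysis step
    producing vehicle_type_id is assumed to have already run)."""
    try:
        return relation_info[vehicle_type_id]
    except KeyError:
        raise KeyError(f"unknown vehicle type: {vehicle_type_id}")
```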
For example, the point indicating the vehicle type is not limited to the number plate, but may be an optional portion of the transport vehicle V indicating the vehicle type, for example, a portion in which a model name of the vehicle type is marked. For example, the sensor S2 is not limited to the camera, but may be an optional sensor that can detect a point indicating the vehicle type.
Relative Target Position Acquisition Unit
The relative target position acquisition unit 56 acquires relative target position information. The relative target position information is information indicating a relative position (relative target position) of a target position with respect to the transport vehicle V. That is, in the present embodiment, it can be said that the relative target position information is not information indicating coordinates of the target position in the coordinate system of the region AR, but indicates a position at which the target position is present in the transport vehicle V, that is, coordinates of the target position in the coordinate system of the transport vehicle V. More specifically, it can be said that the relative target position information is not obtained by directly detecting the target position in the transport vehicle V by the sensor, but is information indicating an approximate position of the target position in the transport vehicle V. In the description of the present embodiment, exemplified is a case in which the mobile object 10 picks up the target object P disposed in the transport vehicle V, so that it can be said that the relative target position is an approximate relative position, in the transport vehicle V, of the target object P to be picked up.
The relative target position acquisition unit 56 may acquire the relative target position information using an optional method other than a method of directly detecting the relative target position with the sensor. For example, the relative target position acquisition unit 56 acquires the relative target position information based on number information, which is set in advance, indicating the number of the target objects P disposed in the transport vehicle V. The number of the target objects P disposed in the transport vehicle V may be a total number of the target objects P to be picked up and the target objects P not to be picked up. In this case, the relative target position acquisition unit 56 acquires the number information set in advance, and calculates the relative target position in the transport vehicle V based on the number information. That is, respective positions at which the target objects P are disposed and disposition order thereof are determined in advance in the transport vehicle V, so that the relative target position acquisition unit 56 can calculate the position (relative target position) of the target object P to be picked up based on the number information, and the positions at which the respective target objects P are disposed and the disposition order thereof. The relative target position acquisition unit 56 may acquire the number information using an optional method. For example, the number information may be manually set by a driver of the transport vehicle V or a staff of the facility W. In this case, the number information is input to a terminal carried by the driver of the transport vehicle V or the staff of the facility W, and the relative target position acquisition unit 56 acquires the number information from the terminal via the communication unit 40.
Alternatively, for example, the number information may be automatically set at the time when the target object P is mounted on the transport vehicle V, and the relative target position acquisition unit 56 may acquire the number information that is automatically set.
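The calculation from number information can be sketched as follows, assuming a fixed disposition order in the storage chamber VA. The slot pitches, the number of rows in the right-left direction, the rear-first pick order, and the function name are all assumptions of this example, not values given in the present disclosure.

```python
# Disposition layout in the storage chamber VA, expressed in the vehicle
# coordinate system (x in the right-left direction, y in the front-rear
# direction). All numeric values below are illustrative assumptions.
SLOT_PITCH_X = 1.1   # metres between the right and left rows
SLOT_PITCH_Y = 1.1   # metres between slots in the front-rear direction
ROWS = 2             # target objects side by side in the right-left direction

def relative_target_position(number_loaded):
    """Relative target position of the next target object P to pick up,
    derived only from the number information (no direct sensing).
    Assumes the rearmost loaded object, in a fixed disposition order,
    is picked up first."""
    if number_loaded <= 0:
        raise ValueError("no target objects loaded")
    index = number_loaded - 1      # 0-based index of the rearmost object
    col = index % ROWS             # right-left slot
    row = index // ROWS            # front-rear slot
    return (col * SLOT_PITCH_X, row * SLOT_PITCH_Y)
```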
The relative target position acquisition unit 56 may acquire the relative target position information set in advance. In this case, for example, among the respective positions at which the target objects P are disposed, the position at which the target object P to be picked up is disposed is set as the relative target position information in advance. The relative target position acquisition unit 56 acquires, as the relative target position information, the information on the position at which the target object P to be picked up is disposed that is set in advance. In this case, for example, an identifier (for example, a number and the like) is given to each of the positions at which the target objects P are disposed, and the relative target position acquisition unit 56 acquires information on the identifier given to the position at which the target object P to be picked up is disposed. The relative target position acquisition unit 56 then acquires the position corresponding to the acquired identifier as the relative target position in the transport vehicle V, that is, as the relative target position information. The identifier may be set by using an optional method. For example, the identifier may be manually set by the driver of the transport vehicle V or the staff of the facility W. In this case, for example, the identifier is input to the terminal carried by the driver of the transport vehicle V or the staff of the facility W, and the relative target position acquisition unit 56 acquires the identifier from the terminal via the communication unit 40. Alternatively, for example, the identifier may be automatically set at the time when the target object P is mounted on the transport vehicle V, and the relative target position acquisition unit 56 may acquire the identifier that is automatically set.
First Path Acquisition Unit
In the present embodiment, the first path acquisition unit 58 preferably sets the first path R1 toward the target position based on the positional information on the transport vehicle V acquired by the vehicle information acquisition unit 54 and the relative target position information acquired by the relative target position acquisition unit 56. Specifically, the first path acquisition unit 58 calculates the position and the attitude of the target position (herein, the target object P) in the coordinate system of the region AR based on the position of the rear end portion Vb of the transport vehicle V parked in the parking region AR0, the attitude of the transport vehicle V parked in the parking region AR0, the length in the front and rear direction of the transport vehicle V, and the relative target position indicating the relative position of the target position (herein, the target object P) in the transport vehicle V. That is, it can be said that the first path acquisition unit 58 converts the position and the attitude of the target position (relative target position) in the coordinate system of the transport vehicle V into the position and the attitude of the target position in the coordinate system of the region AR based on the position of the rear end portion Vb in the coordinate system of the region AR, the attitude of the transport vehicle V in the coordinate system of the region AR, and the length of the transport vehicle V. That is, for example, the first path acquisition unit 58 can calculate the position and the attitude of the front end portion Va in the coordinate system of the region AR based on the position of the rear end portion Vb, the attitude of the transport vehicle V, and the length of the transport vehicle V, and calculates the position and the attitude of the target position in the coordinate system of the region AR based on the position and the attitude of the front end portion Va and the relative target position.
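The coordinate conversion described above can be sketched as a planar rigid transform: the front end portion Va is obtained by advancing from the rear end portion Vb along the vehicle attitude by the length VL, and the relative target position is then rotated into the coordinate system of the region AR. The function name and the convention that the relative target position is measured from Va are assumptions of this sketch.

```python
import math

def target_in_region_coordinates(rear_end, yaw_deg, vehicle_length,
                                 relative_target):
    """Convert a relative target position (vehicle coordinate system)
    into the coordinate system of the region AR.

    rear_end: (x, y) of the rear end portion Vb in region coordinates.
    yaw_deg: attitude of the transport vehicle V, X-direction = 0 degrees.
    vehicle_length: distance VL from the rear end Vb to the front end Va.
    relative_target: (dx, dy) measured from the front end portion Va in
        the vehicle coordinate system (an assumed convention)."""
    th = math.radians(yaw_deg)
    c, s = math.cos(th), math.sin(th)
    # Front end portion Va in region coordinates.
    va = (rear_end[0] + vehicle_length * c, rear_end[1] + vehicle_length * s)
    dx, dy = relative_target
    # Rotate the relative offset into region coordinates and add it to Va.
    x = va[0] + dx * c - dy * s
    y = va[1] + dx * s + dy * c
    return (x, y), yaw_deg  # position and attitude of the target position
```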
As illustrated in
In the present embodiment, the first path acquisition unit 58 sets the first path R1 to include a track R1a, a track R1b, and a track R1c. The track R1a is a track to a predetermined position B. The track R1b is a track that is connected to the track R1a, and runs in the Y-direction from the predetermined position B to a predetermined position C on a lateral side (herein, the X-direction side) of the target position. The track R1c is a track that is connected to the track R1b, and runs toward the opposite side of the X-direction from the predetermined position C to the target arrival position A0. It can be said that the track R1c is a track for turning the mobile object 10 so that the traveling direction of the mobile object 10 is switched from the Y-direction to the direction opposite to the X-direction. The predetermined position B may be optionally set. For example, the predetermined position B may be a position within a predetermined distance range with respect to the parking region AR0. The predetermined position C may also be optionally set. For example, the predetermined position C may be a position within a predetermined distance range with respect to the target position.
The track R1a may be a track along a global path set in advance. The global path is a track toward the parking region AR0, and is set in advance based on map information on the facility W. The global path may be set also based on information about vehicle specifications of the mobile object 10 in addition to the map information on the facility W. The information about the vehicle specifications is, for example, specifications that influence a route along which the mobile object 10 can move, such as a size or a minimum turning radius of the mobile object 10. In a case in which the global path is set also based on the information about the vehicle specifications, the global path may be set for each mobile object. The global path may be set by a person based on the map information, the information about the vehicle specifications, and the like, or may be automatically set by a device such as the information processing device 14 based on the map information, the information about the vehicle specifications, and the like. In a case of automatically setting the global path, a point desired to be passed through (Waypoint) may be designated, for example. In this case, the shortest global path can be set to pass through the point desired to be passed through while avoiding an obstacle (a fixed object such as a wall).
The track R1b and the track R1c are set based on the positional information on the transport vehicle V and the relative target position information. That is, for example, the track R1b may be a track that runs straight in a direction along the attitude of the transport vehicle V in the parking region AR0, and the track R1c may be a track that runs from the predetermined position C on the X-direction side of the parking region AR0 toward the opposite side of the X-direction to reach the target arrival position A0.
In this way, the first path acquisition unit 58 uses the track R1a set in advance as the global path to the predetermined position B, and sets the tracks R1b and R1c from the predetermined position B to the target arrival position A0 based on the positional information on the transport vehicle V and the relative target position information. However, the first path R1 is not limited to the track including the track R1a, the track R1b, and the track R1c as described above, but may be an optional track to the target arrival position A0.
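The setting of the tracks R1b and R1c from the positional information can be sketched as follows. Representing a track as a pair of endpoints, the lateral offset of the predetermined position C, and the function name are assumptions of this example; the present disclosure does not fix a path representation.

```python
def set_tracks_r1b_r1c(position_b, target_position, lateral_offset):
    """Set the tracks R1b and R1c.

    position_b: predetermined position B (x, y) in region coordinates.
    target_position: target arrival position A0 (x, y).
    lateral_offset: assumed distance of the predetermined position C on
        the X-direction side of the target position."""
    # Predetermined position C: on the X-direction side of A0.
    position_c = (target_position[0] + lateral_offset, target_position[1])
    track_r1b = (position_b, position_c)        # run in the Y-direction
    track_r1c = (position_c, target_position)   # run toward the opposite
                                                # side of the X-direction
    return track_r1b, track_r1c
```

Prepending the track R1a (the global path to the predetermined position B) to these two tracks yields the full first path R1.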
Control Device for Mobile Object
Next, the following describes the control device 28 for the mobile object 10.
The control unit 64 is an arithmetic device, and includes an arithmetic circuit such as a CPU, for example. The control unit 64 includes a first path acquisition unit 70, a movement control unit 72, a detection control unit 74, a second path acquisition unit 76, and a fork control unit 78. The control unit 64 implements the first path acquisition unit 70, the movement control unit 72, the detection control unit 74, the second path acquisition unit 76, and the fork control unit 78 by reading out, from the storage unit 62, and executing a computer program (software), and performs processing thereof. The control unit 64 may perform the processing with one CPU, or may include a plurality of CPUs and perform the processing with the CPUs. At least part of the first path acquisition unit 70, the movement control unit 72, the detection control unit 74, the second path acquisition unit 76, and the fork control unit 78 may be implemented with a hardware circuit. A computer program for the control unit 64 stored in the storage unit 62 may be stored in a (non-transitory) computer-readable storage medium that can be read by the control device 28.
First Path Acquisition Unit
The first path acquisition unit 70 acquires information on the first path R1. When the mobile object 10 is selected to be operated, the first path acquisition unit 70 may acquire the information on the first path R1 from the information processing device 14, or may read out the information on the first path R1 previously stored in the storage unit 62. The first path acquisition unit 70 does not necessarily acquire the first path R1 from the information processing device 14, but may set the first path R1 by itself. In this case, the first path acquisition unit 70 may acquire the positional information on the transport vehicle V and the relative target position information from the information processing device 14, and may set the first path R1 based on the positional information on the transport vehicle V and the relative target position information. Alternatively, for example, the control device 28 may include the vehicle information acquisition unit 54 and the relative target position acquisition unit 56, the control device 28 may acquire the positional information on the transport vehicle V and the relative target position information, and the first path acquisition unit 70 may set the first path R1 based thereon.
Movement Control Unit
The movement control unit 72 controls movement of the mobile object 10 by controlling a moving mechanism such as a driving unit or a steering system of the mobile object 10. The movement control unit 72 causes the mobile object 10 to move along the first path R1 and a second path R2 (described later). By successively grasping the positional information on the mobile object 10, the movement control unit 72 causes the mobile object 10 to move to pass through the first path R1 and the second path R2. A method of acquiring the positional information on the mobile object 10 is optional. For example, in the present embodiment, a detection body (not illustrated) is disposed in the facility W, and the movement control unit 72 acquires information about the position and the attitude of the mobile object 10 based on detection of the detection body. Specifically, the mobile object 10 emits laser light toward the detection body, and receives reflected light of the laser light reflected from the detection body to detect the position and the attitude of the mobile object 10 in the facility W. Herein, the position of the mobile object 10 means two-dimensional coordinates in the direction X and the direction Y in the region AR of the facility W. Also in the following description, the position means two-dimensional coordinates in the region AR unless otherwise specified. The attitude of the mobile object 10 is a yaw angle (rotation angle) of the mobile object 10 assuming that the X-direction is 0° when viewed from the direction Z orthogonal to the direction X and the direction Y. The method of acquiring the information about the position and the attitude of the mobile object 10 is not limited to using the detection body. For example, Simultaneous Localization and Mapping (SLAM) may be used.
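The pose acquisition from a detection body described above can be sketched under the assumption that two detection bodies with known facility coordinates are observed in the mobile object's own frame; the function name and the two-landmark formulation are illustrative, not the embodiment's actual method:

```python
import math

def pose_from_two_landmarks(world_a, world_b, meas_a, meas_b):
    """Estimate the mobile object's (x, y, yaw) in the facility frame.

    world_a, world_b: known facility coordinates of two detection bodies.
    meas_a, meas_b: the same bodies as measured in the mobile object's
        own frame (e.g. from laser reflection).
    """
    # Yaw is the difference between the landmark baseline directions
    # seen in the facility frame and in the mobile object's frame.
    yaw = (math.atan2(world_b[1] - world_a[1], world_b[0] - world_a[0])
           - math.atan2(meas_b[1] - meas_a[1], meas_b[0] - meas_a[0]))
    c, s = math.cos(yaw), math.sin(yaw)
    # Position places the rotated measurement of body A onto its
    # known facility position.
    x = world_a[0] - (c * meas_a[0] - s * meas_a[1])
    y = world_a[1] - (s * meas_a[0] + c * meas_a[1])
    return x, y, yaw
```

The returned yaw matches the attitude convention in the text: a rotation angle about the Z-direction with the X-direction taken as 0°. SLAM-based localization, also mentioned above, would replace this landmark step entirely.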
Detection Control Unit
The detection control unit 74 causes the sensor 26 to detect the position and the attitude of the target object P, and acquires a detection result of the position and the attitude of the target object P obtained by the sensor 26. Specific processing performed by the detection control unit 74 will be described later.
Second Path Acquisition Unit
The second path acquisition unit 76 acquires information on the second path R2 that is set based on the position and the attitude of the target object P. Specific processing performed by the second path acquisition unit 76 will be described later.
Fork Control Unit
The fork control unit 78 controls an operation of the fork 24 of the mobile object 10.
Processing Performed by Control Device
Next, the following describes processing performed by the control device 28 at the time when the mobile object 10 approaches the transport vehicle V.
Movement Along First Path
As illustrated in
Herein, the first path R1 is set based on the relative target position indicating an approximate target position, but the target position (herein, the target object P) in the transport vehicle V may be shifted from the relative target position described above. Thus, in the present embodiment, an actual target position is detected, and the path is switched based thereon.
Detection of Target Object on First Path
The detection control unit 74 acquires a detection result of the position and the attitude of the target position (herein, the target object P) in the transport vehicle V. A method of acquiring the detection result of the position and the attitude of the target position in the transport vehicle V is optional. In the present embodiment, the detection control unit 74 causes the sensor 26 to detect the position and the attitude of the target position (herein, the target object P) while the mobile object 10 is moving along the first path R1. That is, the detection control unit 74 causes the sensor 26 to perform detection for the parking region AR0, the position of which is known, and thereby detects the target object P in the transport vehicle V in the parking region AR0.
For example, in a case in which the sensor 26 is configured to emit laser light, the detection control unit 74 causes the sensor 26 to perform scanning in the lateral direction (horizontal direction) while causing the sensor 26 to emit laser light toward the parking region AR0 side during a period in which the mobile object 10 is moving along the first path R1. The target object P reflects the laser light from the sensor 26. The sensor 26 receives reflected light from the target object P. The detection control unit 74 acquires a point group as a set of measuring points based on a detection result of the reflected light received by the sensor 26. In the present embodiment, based on the detection result of the reflected light, the detection control unit 74 calculates, as the measuring point, a position (coordinates) of a point at which the reflected light is reflected. The detection control unit 74 extracts a straight line by using, for example, a RANSAC algorithm based on the measuring points (point group), and calculates a position and an attitude of the straight line as the position and the attitude of the target object P. However, a calculation method for the position and the attitude of the target object P based on the detection result of the sensor 26 may be optional.
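The RANSAC-style straight-line extraction from the point group can be sketched as follows; this is a minimal sketch, and the iteration count, inlier tolerance, fixed random seed, and least-squares refit step are illustrative assumptions rather than the embodiment's exact procedure:

```python
import math
import random

def ransac_line(points, iterations=200, inlier_tol=0.05, seed=0):
    """Extract the dominant straight line from a 2-D point group.

    Returns (point_on_line, angle), usable as the position and
    attitude of the detected face of the target object P.
    """
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iterations):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        dx, dy = x2 - x1, y2 - y1
        norm = math.hypot(dx, dy)
        if norm == 0.0:
            continue
        # Keep points whose perpendicular distance to the candidate
        # line is within the inlier tolerance.
        inliers = [p for p in points
                   if abs((p[0] - x1) * dy - (p[1] - y1) * dx) / norm <= inlier_tol]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    # Refit on the inlier set: centroid plus principal direction.
    n = len(best_inliers)
    mx = sum(p[0] for p in best_inliers) / n
    my = sum(p[1] for p in best_inliers) / n
    sxx = sum((p[0] - mx) ** 2 for p in best_inliers)
    syy = sum((p[1] - my) ** 2 for p in best_inliers)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in best_inliers)
    angle = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    return (mx, my), angle
```

The centroid and the line angle correspond to the "position and attitude of the straight line" that the detection control unit 74 adopts as the position and attitude of the target object P.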
In the present embodiment, the target object P is detected while the mobile object 10 is moving along the track R1c, but the detection control unit 74 may cause the transport vehicle V to be detected at an optional timing when the mobile object 10 is positioned on the first path R1. For example, the detection control unit 74 may cause the transport vehicle V to be detected while the mobile object 10 is moving along the track R1b, or may cause the transport vehicle V to be detected at the time when the mobile object 10 is stopped at an optional position on the first path R1.
Hereinafter, information indicating the position and the attitude of the target object P detected by the sensor 26 is appropriately referred to as position and attitude information on the target object P. It can be said that the detection control unit 74 acquires the position and attitude information on the target object P detected by the sensor 26. The position and the attitude of the target object P are not necessarily detected by the sensor 26, or not necessarily acquired at the time when the mobile object 10 is positioned on the first path R1. For example, a sensor for detecting the position and the attitude of the target object P may be disposed in the facility W, and the detection control unit 74 may acquire a detection result of the position and the attitude of the target object P from the sensor. That is, the position and attitude information on the target object P acquired by the detection control unit 74 is not limited to the information detected by the sensor 26, but may be information detected by a sensor disposed in the facility W, for example.
Setting of Second Path
In the present embodiment, the second path acquisition unit 76, that is, the mobile object 10 itself sets the second path R2 based on the position and attitude information on the target object P. However, the embodiment is not limited thereto. A unit other than the mobile object 10 (for example, the information processing device 14) may set the second path R2, and the second path acquisition unit 76 may acquire the information on the second path R2 set by the unit via the communication unit 60.
Movement Along Second Path
After the second path acquisition unit 76 acquires the second path R2, the movement control unit 72 switches the first path R1 to the second path R2, and causes the mobile object 10 to move along the second path R2.
After arriving at the target arrival position A1 by moving along the second path R2, the mobile object 10 causes the fork control unit 78 to move the fork 24 and insert the fork 24 into the opening Pb to pick up the target object P. The movement control unit 72 then causes the mobile object 10 that has picked up the target object P to convey the target object P to the set conveyance destination. Herein, the mobile object 10 may successively detect the position of the opening Pb of the target object P while moving along the second path R2, and may move the fork 24 to align the position of the fork 24 with the position of the opening Pb by feedback control. In this case, for example, the fork control unit 78 may move (side-shift) the fork 24 in the lateral direction to align the position of the fork 24 with the position of the opening Pb.
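The feedback alignment of the fork 24 with the opening Pb can be sketched as simple proportional control; the gain, tolerance, and one-dimensional lateral model are illustrative assumptions:

```python
def side_shift_alignment(fork_y, opening_y, gain=0.5, tol=0.001, max_steps=100):
    """Iteratively side-shift the fork toward the detected opening Pb.

    fork_y: current lateral position of the fork 24.
    opening_y: successively detected lateral position of the opening Pb.
    Each step moves the fork by a fraction (gain) of the remaining
    lateral error, mimicking proportional feedback control.
    """
    for _ in range(max_steps):
        error = opening_y - fork_y
        if abs(error) <= tol:
            break
        fork_y += gain * error
    return fork_y
```

In a real controller the detected opening position would be refreshed on every cycle as the mobile object 10 advances along the second path R2; the sketch holds it fixed only for brevity.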
However, the second path R2 is not necessarily set based on the position and attitude information on the target object P. For example, the movement control unit 72 may cause the mobile object 10 to move to the target object P along the first path R1 without setting the second path R2.
The first path R1 may be updated by detecting the position and the attitude of the transport vehicle V by the sensor 26 to set the first path R1 with higher accuracy. In this case, for example, the detection control unit 74 causes the sensor 26 to detect the transport vehicle V while the mobile object 10 is moving along the first path R1 (for example, the track R1b), and acquires the position and the attitude of the transport vehicle V based on a detection result of the transport vehicle V obtained by the sensor 26. A method of acquiring the position and the attitude of the transport vehicle V based on the detection result of the sensor 26 may be the same as the method of acquiring the position and the attitude of the target object P. The first path acquisition unit 70 then recalculates the target arrival position A0 based on the acquired position and attitude of the transport vehicle V and the relative target position, sets a track to the recalculated target arrival position A0, updates the first path R1 to the set track, and continues movement along the updated first path R1.
In the present embodiment, the mobile object 10 moves with the side opposite to the fork 24 as the front until passing through the track R1b, turns back on the track R1c or the second path R2 so that the fork 24 side becomes the front, and moves toward the target object P in a direction opposite to the X-direction. However, the track R1c and the second path R2 are not limited to tracks for turning back, but may be tracks for causing the mobile object 10 to turn toward the target object P in a direction opposite to the X-direction without turning back. In this case, for example, the mobile object 10 moves with the fork 24 facing the front until passing through the track R1b, and switches the track to the track R1c or the second path R2 while keeping the fork 24 at the front.
Processing Flow
The following describes a processing flow for the movement control system 1 described above based on a flowchart.
As described above, in the present embodiment, before the mobile object 10 starts to move toward the transport vehicle V, the first path R1 is set based on the position of the rear end portion Vb of the transport vehicle V, the attitude of the transport vehicle V, and the length of the transport vehicle V. That is, the first path R1 is a track that is set based on the position and the attitude of the transport vehicle V, so that, even in a case in which the transport vehicle V is parked while being shifted from a reference position in the parking region AR0 (position at which the transport vehicle V is parked in the parking region AR0 without a shift), the mobile object 10 can appropriately approach the transport vehicle V. The first path R1 is set by using the position of the rear end portion Vb, the attitude of the transport vehicle V, and the length of the transport vehicle V, so that the first path R1 can be set with high accuracy while appropriately reflecting the position and the attitude of the transport vehicle V. More specifically, by setting the first path R1 as described above, a change amount of the track can be reduced at the time of switching the track to the second path R2, for example. Accordingly, a large track change such as turning back can be suppressed, and throughput can be improved.
In the above description, exemplified is a case in which the mobile object 10 unloads the target object P mounted on the transport vehicle V, but the mobile object 10 may load the target object P onto the transport vehicle V. In this case, assuming that a position at which the target object P should be loaded is the target position, the mobile object 10 may approach the target arrival position A0 along the first path R1 to the target arrival position A0, which is a point where a predetermined position and attitude are achieved with respect to the target position, without setting the second path R2 based on the positional information on the target object P in the transport vehicle V. The position at which the target object P should be loaded may be set using the same method as the method of setting the target position used for setting the first path R1 in a case of unloading the target object P, that is, based on the positional information on the transport vehicle V and the relative target position, for example. For example, the mobile object 10 may also set the second path R2 in a case of loading the target object P. In this case, for example, the mobile object 10 may cause the sensor 26 to detect the position at which the target object P should be loaded in the transport vehicle V, set the second path R2 to the target arrival position A1 assuming that the target arrival position A1 is the point at which the predetermined position and attitude are achieved with respect to the position at which the target object P should be loaded, and approach the target arrival position A1 along the second path R2. In this case, for example, the sensor 26 may be caused to detect a wall in the storage chamber VA and the position and the attitude of the other target object P, and detect a position distant from the wall and the other target object P by a reference distance in a predetermined direction as the position at which the target object P should be loaded.
Next, the following describes other examples of the present embodiment.
Alternatively, for example, the transport vehicle V may be stopped in a detectable region of the sensor S3. The sensor S3 may then be caused to emit laser light from the front end portion Va to the rear end portion Vb of the transport vehicle V while performing scanning, the point group as a set of measuring points may be acquired based on a detection result of the reflected light received by the sensor S3, and the length in the front and rear direction of the transport vehicle V may be calculated based on the point group.
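The length calculation from the scanned point group can be sketched as projecting each measuring point onto the vehicle's longitudinal axis and taking the extent of the projection; the function name and the use of an already-known attitude are assumed simplifications:

```python
import math

def vehicle_length_from_scan(point_group, attitude_rad):
    """Estimate the front-rear length of the transport vehicle V.

    point_group: 2-D measuring points from the front end Va to the
        rear end Vb obtained by the scanning laser sensor.
    attitude_rad: vehicle attitude (longitudinal axis direction).
    """
    c, s = math.cos(attitude_rad), math.sin(attitude_rad)
    # Project every point onto the longitudinal axis; the extent of
    # the projections is the distance from Va to Vb.
    proj = [p[0] * c + p[1] * s for p in point_group]
    return max(proj) - min(proj)
```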
The sensor S4 is disposed on a ceiling of the parking region AR0. The sensor S4 may be an optional sensor that can detect the front end portion Va and the rear end portion Vb of the transport vehicle V, for example, a sensor that emits laser light. In this example, two sensors S4a and S4b are disposed as sensors S4. However, the number of the sensors S4 may be optional.
As illustrated in
The sensor S4a receives reflected light from the surface VBa, which is the laser light emitted by the sensor S4a itself. It can be said that the sensor S4a acquires a point group Q1a of the reflected light from the surface VBa. The point group Q1a is aligned in the scanning direction of the sensor S4a. Similarly, the sensor S4b receives reflected light from the surface VBa, which is the laser light emitted by the sensor S4b itself, and acquires a point group Q2a of the reflected light from the surface VBa. The point group Q2a is aligned in the scanning direction of the sensor S4b. The vehicle information acquisition unit 54 acquires information on these point groups Q1a and Q2a (information on coordinates of the point groups Q1a and Q2a). The vehicle information acquisition unit 54 reads out information on dimensions of the transport vehicle V from the storage unit 42, for example. The vehicle information acquisition unit 54 calculates the positional information on the transport vehicle V based on the information on the point groups Q1a and Q2a and the information on the dimensions of the transport vehicle V. For example, the information on the dimensions of the transport vehicle V is information indicating a ratio between the length in the front and rear direction and the length in the lateral direction of the transport vehicle V. In this case, the position of the rear end portion Vb of the transport vehicle V, the attitude of the transport vehicle V, and the length in the front and rear direction of the transport vehicle V can be calculated based on this dimension ratio and a ratio between the entire length of the arranged point group Q1a and the entire length of the arranged point group Q2a.
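One plausible reading of this calculation can be sketched as follows. This Python sketch is only an illustrative interpretation: it assumes the vehicle attitude can be taken from the line joining the centroids of the two scan lines, that each scan-line point group is ordered end to end, and that the front-rear length follows from the lateral extent of one scan line multiplied by the known dimension ratio; the embodiment's actual calculation may differ.

```python
import math

def vehicle_geometry_from_door_scan(q1a, q2a, length_to_width_ratio):
    """Derive attitude and front-rear length from point groups on the
    opened lateral door surface VBa (illustrative interpretation).

    q1a, q2a: scan-line point groups from sensors S4a and S4b,
        each assumed ordered from one end of the scan to the other.
    length_to_width_ratio: front-rear length / lateral length of V.
    """
    # Attitude: direction of the line joining the two scan-line
    # centroids, which runs along the vehicle's front-rear axis.
    c1 = (sum(p[0] for p in q1a) / len(q1a), sum(p[1] for p in q1a) / len(q1a))
    c2 = (sum(p[0] for p in q2a) / len(q2a), sum(p[1] for p in q2a) / len(q2a))
    attitude = math.atan2(c2[1] - c1[1], c2[0] - c1[0])
    # Lateral length: extent of one scan line across the door surface.
    lateral = max(math.hypot(p[0] - q1a[0][0], p[1] - q1a[0][1]) for p in q1a)
    return attitude, lateral * length_to_width_ratio
```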
As described above, in this example, the vehicle information acquisition unit 54 calculates the position and the attitude of the transport vehicle V based on the information on the point groups Q1a and Q2a and the information on the dimensions of the transport vehicle V. The information on the dimensions of the transport vehicle V can be acquired by using the same method as the method of acquiring the vehicle type information in the embodiment described above, for example.
The surface VBa of the lateral door VB faces the lateral side when the transport vehicle V moves, so that foreign substances such as snow, water, or dust hardly adhere thereto. Thus, lowering of detection accuracy can be suppressed by detecting the surface VBa. A flap for reducing air resistance is disposed between the storage chamber VA and the driver's seat, and it can be considered that detection accuracy at a position of the front end portion Va may be lowered due to the flap. However, the surface VBa is distant from the flap, so that lowering of detection accuracy due to the flap can also be suppressed.
The methods in the embodiment and the other examples described above may be combined with each other to acquire the positional information on the transport vehicle V. That is, for example, the position of the rear end portion Vb of the transport vehicle V, the attitude of the transport vehicle V, and the length of the transport vehicle V may each be acquired by using different methods, or at least one of them may be acquired by using a method different from that used for the others.
Alternatively, a moving mechanism for changing a position of the sensor S5 in the Z-direction may be disposed. In this case, for example, the sensor S5 is attached to a pole CL extending in the Z-direction in a manner movable in the Z-direction, and the moving mechanism changes the position of the sensor S5 in the Z-direction so that the sensor S5 can emit laser light onto the target object P in accordance with a height of the load-carrying platform of the vehicle V.
In the description about
Effects
As described above, the control method for the mobile object 10 according to the present embodiment is the control method for the mobile object 10 that automatically moves, and the control method includes: a step of acquiring the positional information on the transport vehicle V parked in the parking region AR0 including the information on the position of the rear end portion Vb of the transport vehicle V, the information on the attitude of the transport vehicle V, and the information on the length of the transport vehicle V; a step of setting the first path R1 toward the transport vehicle V based on the positional information on the transport vehicle V; and a step of causing the mobile object 10 to move along the first path R1. The first path R1 is a track that is set based on the position and the attitude of the transport vehicle V parked in the parking region AR0, so that, even in a case in which the transport vehicle V is parked with a shift in the parking region AR0, the mobile object 10 can appropriately approach the transport vehicle V. The first path R1 is set by using the position of the rear end portion Vb, the attitude of the transport vehicle V, and the length of the transport vehicle V, so that the first path R1 can be set with high accuracy while appropriately reflecting the position and the attitude of the transport vehicle V. More specifically, by setting the first path R1 as described above, a change amount of the track can be reduced at the time of switching the track to the second path R2, for example. Accordingly, a large track change such as turning back can be suppressed, and throughput can be improved.
This control method further includes a step of acquiring the relative target position information indicating the relative position of the target position with respect to the transport vehicle V using a method other than the method of directly detecting the target position in the transport vehicle V with the sensor. At the step of setting the first path R1, the first path R1 toward the target position is preferably set based on the positional information on the transport vehicle V and the relative target position information. According to this control method, the first path R1 is set by also using the relative target position in the transport vehicle V, so that the mobile object 10 can appropriately approach the target position in the transport vehicle V.
At the step of acquiring the relative target position information, the relative target position information is preferably acquired based on the information indicating the number of the target objects P disposed in the transport vehicle V. According to this control method, the relative target position information is set based on the number of the target objects P disposed in the transport vehicle V, so that the mobile object 10 can appropriately approach the target position in the transport vehicle V.
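A minimal sketch of deriving a relative target offset from the number of target objects P follows; the regular-pitch loading assumption, the function name, and the parameter values are all illustrative rather than part of the control method:

```python
def relative_target_offset(num_unloaded, object_depth, gap, rear_margin=0.1):
    """Offset from the rear end Vb to the next target object P.

    Assumes the target objects P are packed at a regular pitch on the
    load-carrying platform and are unloaded rear-first, so each object
    already unloaded shifts the next target one pitch toward the front.
    """
    return rear_margin + num_unloaded * (object_depth + gap)
```

Combined with the positional information on the transport vehicle V, such an offset gives the relative target position without directly detecting the target position with a sensor.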
At the step of acquiring the relative target position information, the relative target position information set in advance is preferably acquired. According to this control method, the first path R1 is set by also using the relative target position, so that the mobile object 10 can appropriately approach the target position in the transport vehicle V.
At the step of acquiring the positional information on the transport vehicle, the position of the rear end portion Vb and the attitude of the transport vehicle V are preferably acquired by causing the sensor S1 disposed in the facility W in which the parking region AR0 is disposed to detect the rear end portion Vb of the transport vehicle V. In this way, by acquiring the position of the rear end portion Vb and the attitude of the transport vehicle V, the first path R1 can be set with high accuracy while appropriately reflecting the position and the attitude of the transport vehicle V.
At the step of acquiring the positional information on the transport vehicle, it is preferable to acquire the vehicle type information indicating the vehicle type of the transport vehicle V, and acquire the information on the length of the transport vehicle V based on the vehicle type information. By acquiring the information on the length of the transport vehicle V using the vehicle type information, the first path R1 can be appropriately set.
At the step of acquiring the positional information on the transport vehicle, it is preferable to acquire the vehicle type information by causing the sensor S2 disposed in the facility W in which the parking region AR0 is disposed to detect the point indicating the vehicle type of the transport vehicle V, and acquire the information on the length of the transport vehicle V based on the relation information indicating the relation between the vehicle type information and the length of the vehicle and the acquired vehicle type information. By acquiring the information on the length of the transport vehicle V using the vehicle type information, the first path R1 can be appropriately set.
At the step of acquiring the positional information on the transport vehicle, it is preferable to acquire the information on the length of the transport vehicle V by causing the sensor S3 disposed in the facility W in which the parking region AR0 is disposed to detect the rear end portion Vb and the front end portion Va of the transport vehicle V. By acquiring the information on the length of the transport vehicle V using the sensor S3, the first path R1 can be appropriately set.
The transport vehicle V is in a state in which the lateral doors VB are opened so that the surface VBa of the lateral door VB faces upward in a vertical direction in the parking region AR0. At the step of acquiring the positional information on the transport vehicle V, the positional information on the transport vehicle V is acquired by causing the sensor S4 disposed on the ceiling of the parking region AR0 to detect the lateral door VB. By acquiring the positional information on the transport vehicle V using the sensor S4, the first path R1 can be appropriately set.
The mobile object 10 according to the present disclosure is an object that automatically moves, and includes the first path acquisition unit 70 and the movement control unit 72. The first path acquisition unit 70 acquires the first path R1 toward the transport vehicle V that is set based on the positional information on the transport vehicle V parked in the parking region AR0 including the information on the position of the rear end portion Vb of the transport vehicle V, the information on the attitude of the transport vehicle V, and the information on the length of the transport vehicle V. The movement control unit 72 causes the mobile object 10 to move along the first path R1. The mobile object 10 moves along the first path R1, so that the mobile object 10 can appropriately approach the transport vehicle V.
The embodiments of the present disclosure have been described above, but the embodiments are not limited thereto. The constituent elements described above encompass a constituent element that is easily conceivable by those skilled in the art, substantially the same constituent element, and what is called an equivalent. Additionally, the constituent elements described above can be appropriately combined with each other. Furthermore, the constituent elements can be variously omitted, replaced, or modified without departing from the gist of the embodiments described above.
Number | Date | Country | Kind |
---|---|---|---|
2021-214918 | Dec 2021 | JP | national |