The present disclosure relates to location estimation systems and vehicles including the location estimation systems.
Vehicles capable of autonomous movement, such as automated guided vehicles (or automated guided cars) and mobile robots, are under development.
Japanese Laid-Open Patent Publication No. 2008-250905 discloses a mobile robot that performs localization by matching a preliminarily prepared map against a local map acquired from a laser range finder.
In carrying out matching, the mobile robot disclosed in Japanese Laid-Open Patent Publication No. 2008-250905 removes unnecessary points from an environmental map so as to estimate its own position.
Example embodiments of the present disclosure provide location estimation systems and vehicles that are each able to reduce an amount of computation in generating a map.
In a non-limiting and illustrative example embodiment of the present disclosure, a location estimation system is used by being connected to an external sensor that scans an environment so as to periodically output scan data. The location estimation system includes a processor and a memory to store a computer program to operate the processor. In accordance with commands included in the computer program, the processor performs: acquiring the scan data from the external sensor so as to generate a reference map from the scan data; executing, upon newly acquiring scan data from the external sensor, matching of the newly acquired latest scan data against the reference map so as to estimate a location and an attitude of the external sensor on the reference map, and adding the latest scan data to the reference map so that the reference map is updated; removing, from the reference map that has been updated a plurality of times, a portion thereof other than a portion including the latest scan data so as to reset the reference map; and updating, in resetting the reference map, an environmental map in accordance with the reference map that has been updated a plurality of times before the resetting.
In a non-limiting and illustrative example embodiment according to the present disclosure, a vehicle includes the location estimation system, an external sensor, a storage to store the environmental map generated by the location estimation system, and a driver.
In a non-limiting and illustrative example embodiment according to the present disclosure, a non-transitory computer readable medium includes a computer program to be used in any one of the location estimation systems described above.
According to example embodiments of the present disclosure, it is possible to execute matching of a plurality of pieces of scan data, which are periodically output from an external sensor, with a small amount of computation in generating an environmental map.
The above and other elements, features, steps, characteristics and advantages of the present disclosure will become more apparent from the following detailed description of the example embodiments with reference to the attached drawings.
The term “automated guided vehicle” (AGV) refers to an unguided vehicle that has cargo loaded on its body manually or automatically, performs automated travel to a designated place, and then has the cargo unloaded manually or automatically. The term “automated guided vehicle” encompasses an unmanned tractor unit and an unmanned forklift.
The term “unmanned” refers to the absence of need for a person to steer a vehicle, and does not preclude an automated guided vehicle from carrying a “person (who loads/unloads cargo, for example)”.
The term “unmanned tractor unit” refers to an unguided vehicle that performs automated travel to a designated place while towing a car on which cargo is loaded manually or automatically and from which cargo is unloaded manually or automatically.
The term “unmanned forklift” refers to an unguided vehicle that includes a mast for raising and lowering, for example, a fork for cargo transfer, automatically transfers cargo on, for example, the fork, and performs automated travel to a designated place so as to perform an automatic cargo-handling operation.
The term “unguided vehicle” refers to a vehicle including a wheel and an electric motor or an engine to rotate the wheel.
The term “vehicle” refers to a device that moves while carrying a person or cargo on board, the device including a driving unit (such as a wheel, a two-legged or multi-legged walking device, or a propeller) to produce traction for movement. The term “vehicle” according to the present disclosure encompasses not only an automated guided vehicle in a strict sense but also a mobile robot, a service robot, and a drone.
The term “automated travel” encompasses travel based on a command from an operation management system of a computer to which an automated guided vehicle is connected via communications, and autonomous travel effected by a controller included in an automated guided vehicle. The term “autonomous travel” encompasses not only travel of an automated guided vehicle to a destination along a predetermined path but also travel that follows a tracking target. An automated guided vehicle may temporarily perform manual travel based on an instruction from an operator. The term “automated travel” usually refers to both travel in a “guided mode” and travel in a “guideless mode”. In the present disclosure, however, the term “automated travel” refers to travel in a “guideless mode”.
The term “guided mode” refers to a mode that involves placing guiding objects continuously or continually, and guiding an automated guided vehicle by using the guiding objects.
The term “guideless mode” refers to a mode that involves guiding without placing any guiding objects. The automated guided vehicle according to an example embodiment of the present disclosure includes a localization device and is thus able to travel in a guideless mode.
The term “location estimation device” refers to a device to estimate a location of the device itself on an environmental map in accordance with sensor data acquired by an external sensor, such as a laser range finder.
The term “external sensor” refers to a sensor to sense an external state of a vehicle. Examples of such an external sensor include a laser range finder (which may also be referred to as a “laser range scanner”), a camera (or an image sensor), light detection and ranging (LIDAR), a millimeter wave radar, an ultrasonic sensor, and a magnetic sensor.
The term “internal sensor” refers to a sensor to sense an internal state of a vehicle. Examples of such an internal sensor include a rotary encoder (which may hereinafter be simply referred to as an “encoder”), an acceleration sensor, and an angular velocity sensor (e.g., a gyroscope sensor).
The term “SLAM” is an abbreviation for Simultaneous Localization and Mapping and refers to simultaneously carrying out localization and generation of an environmental map.
The external sensor 102 performs environmental scanning, for example, on an environment in the range of 135 degrees to the right and to the left (which is 270 degrees in total) with respect to the front surface of the external sensor 102. Specifically, the external sensor 102 emits pulsed laser beams while changing the direction of each laser beam by a predetermined step angle within a horizontal plane, and then detects reflected light of each laser beam so as to measure a distance. A step angle of 0.3 degrees makes it possible to obtain measurement data on a distance to a point of reflection in each direction determined by an angle corresponding to a total of 901 steps. In this example, the external sensor 102 scans its surrounding space in a direction substantially parallel to the floor surface, which means that the external sensor 102 performs planar (or two-dimensional) scanning. The external sensor 102, however, may perform three-dimensional scanning.
A typical example of scan data may be expressed by position coordinates of each point included in a point cloud acquired for each round of scanning. The position coordinates of each point are defined by a local coordinate system that moves together with the vehicle 10. Such a local coordinate system may be referred to as a “vehicle coordinate system” or a “sensor coordinate system”. In the present disclosure, the origin point of the local coordinate system fixed to the vehicle 10 is defined as the “location” of the vehicle 10, and the orientation of the local coordinate system is defined as the “attitude” of the vehicle 10. The location and attitude may hereinafter be collectively referred to as a “pose”.
When represented by a polar coordinate system, scan data may include a numerical value set that indicates the location of each point by the “direction” and “distance” from the origin point of the local coordinate system. An indication based on a polar coordinate system may be converted into an indication based on an orthogonal coordinate system. The following description assumes that scan data output from the external sensor is represented by an orthogonal coordinate system, for the sake of simplicity.
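By way of a non-limiting illustration, the conversion from a polar indication to an orthogonal indication may be sketched as follows in Python; the function and variable names are illustrative assumptions and are not part of the present disclosure.

```python
import numpy as np

def polar_to_cartesian(angles_rad, ranges_m):
    # Convert one round of scan data from a polar indication
    # (direction, distance) into an orthogonal indication (x, y)
    # in the sensor coordinate system.
    x = ranges_m * np.cos(angles_rad)
    y = ranges_m * np.sin(angles_rad)
    return np.column_stack((x, y))  # point cloud of shape (N, 2)

# Example: a 270-degree scan with a step angle of 0.3 degrees (901 beams),
# using dummy distances of 5 m for every beam.
angles = np.deg2rad(np.linspace(-135.0, 135.0, 901))
points = polar_to_cartesian(angles, np.full(901, 5.0))
```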
The vehicle 10 includes a storage device 104 to store an environmental map, and a location estimation system 115.
The location estimation system 115 is used by being connected to the external sensor 102. The location estimation system 115 includes a processor 106 and a memory 107 storing a computer program to control the operation of the processor.
The location estimation system 115 matches the scan data acquired from the external sensor 102 against the environmental map read from the storage device 104 so as to estimate the location and attitude (i.e., the pose) of the vehicle 10. This matching may be referred to as “pattern matching” or “scan matching” and may be executed in accordance with various algorithms. A typical example of a matching algorithm is an iterative closest point (ICP) algorithm.
As will be described below, the location estimation system 115 performs matching of a plurality of pieces of scan data output from the external sensor 102 so that the plurality of pieces of scan data are aligned and linked with each other, thus generating an environmental map.
The location estimation system 115 according to an example embodiment of the present disclosure is implemented by the processor 106 and the memory 107 storing the computer program to operate the processor 106. In accordance with a command included in the computer program, the processor 106 performs the following operation:
(1) acquiring scan data from the external sensor 102 so as to generate a reference map from the scan data;
(2) executing, upon newly acquiring the scan data from the external sensor 102, matching of the newly acquired latest scan data with the reference map so as to estimate a location and an attitude of the external sensor 102 (i.e., the location and attitude of the vehicle 10) on the reference map and add the latest scan data to the reference map so that the reference map is updated;
(3) removing, from the reference map that has been updated a plurality of times, a portion thereof other than a portion including the latest scan data so as to reset the reference map; and
(4) updating, in resetting the reference map, the environmental map in accordance with the reference map that has been updated a plurality of times before the resetting.
The above operation will be described in more detail below.
In the illustrated example, the vehicle 10 further includes a driving unit 108, an automated travel control unit 110, and a communication circuit 112. The driving unit 108 is a unit to generate the traction necessary for the vehicle 10 to move. Examples of the driving unit 108 include a wheel (or a driving wheel) to be rotated by an electric motor or an engine, and a two-legged or multi-legged walking device to be actuated by a motor or other actuator. The wheel may be an omnidirectional wheel, such as a Mecanum wheel. The vehicle 10 may be a vehicle that moves in the air or water, or a hovercraft; the driving unit 108 in such a case includes a propeller to be rotated by a motor.
The automated travel control unit 110 operates the driving unit 108 so as to control conditions (such as velocity, acceleration, and the direction of movement) for movement of the vehicle 10. The automated travel control unit 110 may move the vehicle 10 along a predetermined traveling path, or move the vehicle 10 in accordance with a command provided from outside. When the vehicle 10 is in motion or at rest, the location estimation system 115 calculates an estimated value of the location and attitude of the vehicle 10. The automated travel control unit 110 controls the travel of the vehicle 10 by referring to the estimated value.
The location estimation system 115 and the automated travel control unit 110 may be collectively referred to as a “travel control unit 120”. Together with the location estimation system 115, the automated travel control unit 110 may include the processor 106 and the memory 107 storing the computer program to control the operation of the processor 106. The processor 106 and the memory 107 just mentioned may be implemented by one or more semiconductor integrated circuits.
The communication circuit 112 is a circuit through which the vehicle 10 is connected to an external management device, another vehicle(s), or a communication network (which includes, for example, a mobile terminal of an operator) so as to exchange data and/or commands therewith.
Scan data acquired by the external sensor 102 of the vehicle 10 has different point cloud arrangements when the vehicle 10 is at different locations, such as a location PA, a location PB, and a location PC.
When the latest scan data and the immediately preceding scan data, which are sequentially output from the external sensor 102, are similar to each other, matching will be relatively easily performed. This means that highly reliable matching is expected to be finished in a short period of time. When the moving velocity of the vehicle 10 is relatively high, however, the latest scan data may not be similar to the immediately preceding scan data. This may increase the time required for matching or may prevent matching from being completed within a predetermined period of time.
In the present specification, a period during which the location estimation system 115 acquires scan data from the external sensor 102 is represented as Δt. For example, Δt is 200 milliseconds. During movement of the vehicle 10, contents of the scan data periodically acquired from the external sensor 102 may change.
When the period Δt is, for example, 200 milliseconds, movement of the vehicle 10 at a speed of one meter per second causes the vehicle 10 to move by about 20 centimeters during the period Δt. Usually, movement of the vehicle 10 by about 20 centimeters does not cause a great change in the environment for the vehicle 10. Therefore, an environment scanned at the time t+Δt by the external sensor 102 and an environment scanned at the time t by the external sensor 102 include a wide overlapping area. Accordingly, a point cloud in the scan data SD (t) and a point cloud in the scan data SD (t+Δt) include a large number of corresponding points.
Thus, linking a plurality of pieces of periodically acquired scan data, i.e., the scan data SD (t), SD (t+Δt), . . . , and SD (t+N×Δt), makes it possible to generate a local environmental map (or reference map). In this example, N is an integer equal to or greater than 1.
In this example, the previously acquired scan data SD (a) constitutes a “reference map RM”. The reference map RM is a portion of an environmental map that is being generated. Matching is executed such that the location and orientation of the latest scan data SD (b) are aligned with the location and orientation of the previously acquired scan data SD (a).
Executing such matching makes it possible to know the location and attitude of the vehicle 10b on the reference map RM. After completion of matching, the scan data SD (b) is added to the reference map RM so that the reference map RM is updated.
The coordinate system of the scan data SD (b) is linked to the coordinate system of the scan data SD (a). This link is represented by a matrix that defines rotational and translational transformation (or rigid transformation) for the two coordinate systems. Such a transformation matrix makes it possible to convert the coordinate values of each point on the scan data SD (b) into coordinate values in the coordinate system of the scan data SD (a).
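By way of a non-limiting illustration, such a rotational and translational (rigid) transformation may be sketched as follows, using a homogeneous transformation matrix; the helper names are illustrative assumptions.

```python
import numpy as np

def link_matrix(x, y, theta):
    # Matrix linking the coordinate system of SD(b) to that of SD(a):
    # a point p_b of SD(b) maps to p_a = R(theta) p_b + (x, y).
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def to_reference_frame(points_b, x, y, theta):
    # Convert an (N, 2) point cloud of SD(b) into coordinate values
    # of the SD(a) coordinate system via homogeneous coordinates.
    T = link_matrix(x, y, theta)
    homog = np.hstack([points_b, np.ones((len(points_b), 1))])
    return (T @ homog.T).T[:, :2]
```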
Because the reference map RM is thus sequentially updated, the number of points included in the reference map RM increases each time scanning is performed by the external sensor 102. This causes an increase in the amount of computation when the latest scan data is to be matched against the reference map RM. For example, suppose that a piece of scan data includes at most about 1000 points. In this case, when one reference map RM is generated by connecting 2000 pieces of scan data together, the number of points included in this reference map RM will reach at most about two million. When matching involves finding corresponding points and iterating matching computations, this matching may not be completed within the period Δt (which is a scanning period) if the point cloud of the reference map RM is too large.
The location estimation system according to the present disclosure removes, from a reference map that has been updated a plurality of times, a portion thereof other than a portion including the latest scan data, thus resetting the reference map. The location estimation system according to the present disclosure updates, in resetting the reference map, an environmental map in accordance with the reference map that has been updated a plurality of times before the resetting. Thus, the environmental map itself is able to retain the environmental information obtained by scanning, rather than losing it.
The reference map is resettable, for example, (i) when the number of times the reference map is updated has reached a predetermined number of times, (ii) when the data volume of the reference map has reached a predetermined volume, or (iii) when a lapse of time from the preceding resetting has reached a predetermined length. The “predetermined number of times” in the case (i) may be, for example, 100 times. The “predetermined volume” in the case (ii) may be, for example, 10000 points. The “predetermined length” in the case (iii) may be, for example, five minutes.
To minimize the data volume of the reference map after resetting, it is preferable to leave only the latest scan data (i.e., data acquired by a single round of the newest scanning at the time of resetting) and remove the other scan data. When the number of points included in the latest scan data is equal to or smaller than a predetermined value, however, not only the latest scan data but also a plurality of pieces of scan data obtained near the present time may be included in the reference map after resetting, so as to enhance the matching precision after resetting.
In generating a reference map from a plurality of pieces of scan data, an increase in the density of points per unit area of a point cloud that exceeds a predetermined value may result in a waste in matching. For example, if a large number of points (or measurement points) are present in a rectangular region having a size of 10×10 cm2 in an environment, the matching precision may not improve sufficiently in proportion to the rate of increase in the amount of computation required for matching, and may thus level off. To reduce or eliminate such a waste, when the density of a point cloud included in scan data and/or a reference map has exceeded a predetermined density, some points may be removed from the point cloud so that the density of the point cloud is reduced to or below the predetermined density. The “predetermined density” may be, for example, 1/(10 cm)2.
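One possible way to keep the density at or below such a predetermined density is a simple grid filter that retains at most one point per cell, sketched below under the assumption that a point cloud is held as an (N, 2) array in meters; the function name is illustrative.

```python
import numpy as np

def thin_point_cloud(points, cell=0.10):
    # Keep at most one point per (cell x cell) square (10 cm x 10 cm by
    # default), capping the density at roughly 1/(10 cm)^2.
    keys = np.floor(points / cell).astype(np.int64)
    _, first = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(first)]
```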
Repeating the process of updating the environmental map M in the above-described manner will eventually finalize the environmental map M. The environmental map thus generated is then used for localization, during movement of the vehicle 10.
In an example embodiment of the present disclosure, two methods may be adopted in determining the initial values of the location and attitude used for matching.
A first method involves measuring, by using odometry, the amount of change from the location and attitude estimated by the preceding matching. When the vehicle 10 moves with two driving wheels, for example, a moving amount of the vehicle 10 and the direction of movement of the vehicle 10 are determinable by an encoder attached to each of the driving wheels or motor(s) thereof. Because methods that use odometry are known in the art, it would not be necessary to go into any further details.
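For reference, a minimal dead-reckoning update for a vehicle with two driving wheels may be sketched as follows; the function and parameter names are illustrative assumptions rather than the disclosed implementation.

```python
import math

def odometry_update(x, y, theta, d_left, d_right, track_width):
    # d_left and d_right are the travel distances of the left and right
    # wheels since the last update, obtained from the encoders;
    # track_width is the distance between the two driving wheels.
    d_center = (d_left + d_right) / 2.0          # forward displacement
    d_theta = (d_right - d_left) / track_width   # change in attitude
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```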
A second method involves predicting the current location and attitude in accordance with a history of estimated values of locations and attitudes of the vehicle 10. The following description will focus on this point.
An example embodiment of the present disclosure involves calculating predicted values of the current location and attitude from a history of locations and attitudes obtained in the past by the location estimation device.
The location and attitude of the vehicle obtained by the preceding matching are defined as (xi−1, yi−1, θi−1). The location and attitude of the vehicle obtained by the matching previous to the preceding matching are defined as (xi−2, yi−2, θi−2). Predicted values of the current location and attitude of the vehicle are defined as (xi, yi, θi). Here, the following assumptions are made.
Assumption 1: The time required for movement from the location (xi−1, yi−1) to the location (xi, yi) is equal to the time required for movement from the location (xi−2, yi−2) to the location (xi−1, yi−1).
Assumption 2: The moving velocity during the movement from the location (xi−1, yi−1) to the location (xi, yi) is equal to the moving velocity during the movement from the location (xi−2, yi−2) to the location (xi−1, yi−1).
Assumption 3: A change in the attitude (or orientation) of the vehicle that is represented as θi−θi−1 is equal to Δθ (where Δθ=θi−1−θi−2).
Based on these assumptions, Eq. 1 below is established.

[xi]   [xi−1]   [cos Δθ  −sin Δθ] [xi−1−xi−2]
[yi] = [yi−1] + [sin Δθ   cos Δθ] [yi−1−yi−2]   [Eq. 1]
As mentioned above, Δθ is equal to θi−1−θi−2. For the attitude (or orientation) of the vehicle, the relationship represented by Eq. 2 below is established based on Assumption 3.
θi=θi−1+Δθ [Eq. 2]
Making an approximation such that Δθ is zero allows the matrix in the second term on the right side of Eq. 1 to be simplified as a unit matrix.
If Assumption 1 is not satisfied, the time required for movement from the location (xi−1, yi−1) to the location (xi, yi) is defined as Δt, and the time required for movement from the location (xi−2, yi−2) to the location (xi−1, yi−1) is defined as Δs. In this case, (xi−1−xi−2) and (yi−1−yi−2) on the right side of Eq. 1 may each be corrected by being multiplied by Δt/Δs, and Δθ in the matrix on the right side of Eq. 1 may be corrected by being multiplied by Δt/Δs.
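A minimal sketch of this prediction, following Eq. 1 and Eq. 2 and including the Δt/Δs correction, is given below; the function and parameter names are illustrative assumptions.

```python
import numpy as np

def predict_pose(x2, y2, t2, x1, y1, t1, ratio=1.0):
    # (x2, y2, t2): pose from the matching previous to the preceding one;
    # (x1, y1, t1): pose from the preceding matching;
    # ratio: the correction factor dt/ds when Assumption 1 does not hold.
    d_theta = (t1 - t2) * ratio
    dx, dy = (x1 - x2) * ratio, (y1 - y2) * ratio
    c, s = np.cos(d_theta), np.sin(d_theta)
    xi = x1 + c * dx - s * dy    # Eq. 1: preceding location plus the
    yi = y1 + s * dx + c * dy    # rotated preceding displacement
    return xi, yi, t1 + d_theta  # Eq. 2
```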
The flow of the map generating process performed by the processor 106 will now be described.
In step S10, the processor 106 of the location estimation system 115 acquires the latest (or current) scan data from the external sensor 102.
In step S12, the processor 106 acquires values of the current location and attitude by odometry.
In step S14, the processor 106 performs initial positioning of the latest scan data with respect to a reference map by using, as initial values, values of the current location and attitude acquired by odometry.
In step S16, the processor 106 makes positional gap correction by using an ICP algorithm.
In step S18, the processor 106 adds the latest scan data to the existing reference map so as to update the reference map.
In step S20, the processor 106 determines whether the reference map satisfies an updating requirement. As previously mentioned, the updating requirement is determined to be satisfied, for example, (i) when the number of times the reference map is updated has reached the predetermined number of times, (ii) when the data volume of the reference map has reached the predetermined volume, or (iii) when a lapse of time from the preceding resetting has reached the predetermined length. When the answer is No, the process returns to step S10 so as to acquire next scan data. When the answer is Yes, the process goes to step S22.
In step S22, the processor 106 updates an environmental map in accordance with the reference map that has been updated a plurality of times.
In step S24, the processor 106 removes, from the reference map that has been updated a plurality of times, a portion thereof other than a portion including the latest scan data so as to reset the reference map. This makes it possible to reduce the number and density of points in a point cloud included in the reference map.
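By way of a non-limiting illustration, the loop of steps S10 through S24 may be summarized as in the following sketch; the class and method names are illustrative assumptions, and the matching routine (for example, the ICP sketch given further below) is supplied from outside.

```python
import numpy as np

class ReferenceMapBuilder:
    def __init__(self, match, reset_after=100):
        self.match = match                  # scan-matching routine
        self.reset_after = reset_after      # e.g., reset every 100 updates
        self.reference_map = None           # (M, 2) point cloud
        self.environmental_map = np.empty((0, 2))
        self.updates = 0

    def feed(self, scan, initial_pose):
        # S10/S12: latest scan plus initial values from odometry.
        if self.reference_map is None:
            self.reference_map = scan.copy()   # seed the reference map
            return initial_pose
        # S14/S16: initial positioning, then positional gap correction.
        x, y, th = self.match(scan, self.reference_map, initial_pose)
        c, s = np.cos(th), np.sin(th)
        placed = scan @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
        # S18: add the latest scan so that the reference map is updated.
        self.reference_map = np.vstack([self.reference_map, placed])
        self.updates += 1
        if self.updates >= self.reset_after:
            # S22: update the environmental map from the reference map.
            self.environmental_map = np.vstack(
                [self.environmental_map, self.reference_map])
            # S24: reset, leaving only the latest scan data.
            self.reference_map = placed
            self.updates = 0
        return x, y, th
```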
Next, the flow of the matching process will be described in more detail.
First, in step S32, the processor 106 searches two point clouds for corresponding points. Specifically, the processor 106 selects points on the environmental map, each corresponding to an associated one of points of a point cloud included in scan data.
In step S34, the processor 106 performs rotational and translational rigid transformation (i.e., coordinate transformation) of the scan data so that distances between the corresponding points of the scan data and the environmental map are reduced. This is equivalent to optimizing the parameters of a coordinate transformation matrix so that the sum total (or square sum) of the distances between the corresponding points (i.e., the errors between the corresponding points) is reduced. This optimization is performed by iterative calculations.
In step S36, the processor 106 determines whether results of the iterative calculations have converged. Specifically, the processor 106 determines that the results have converged when a decrement in the sum total (or square sum) of the errors between the corresponding points remains below a predetermined value even if the parameters of the coordinate transformation matrix are changed. When the results have not yet converged, the process returns to step S32, and the processor 106 repeats the process beginning from the search for corresponding points. When the results of the iterative calculations are determined to have converged in step S36, the process goes to step S38.
In step S38, by using the coordinate transformation matrix, the processor 106 converts coordinate values of the scan data from values of the sensor coordinate system into values of the coordinate system of the environmental map. The coordinate values of the scan data thus obtained are usable to update the environmental map.
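By way of a non-limiting illustration, steps S32 through S38 may be sketched as a minimal two-dimensional ICP routine, assuming that NumPy and SciPy are available; all names are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_match(scan, ref_map, init_pose, max_iter=50, tol=1e-6):
    # Estimate the pose (x, y, theta) that places `scan` (N, 2) onto
    # `ref_map` (M, 2), starting from the initial values `init_pose`.
    x, y, theta = init_pose
    tree = cKDTree(ref_map)   # accelerates the corresponding-point search
    prev_err = np.inf
    for _ in range(max_iter):
        c, s = np.cos(theta), np.sin(theta)
        moved = scan @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
        # S32: for each scan point, select the nearest map point.
        dists, idx = tree.query(moved)
        matched = ref_map[idx]
        # S34: best rigid transformation between the matched point sets
        # (singular value decomposition method).
        mu_s, mu_m = moved.mean(axis=0), matched.mean(axis=0)
        H = (moved - mu_s).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)
        if np.linalg.det(Vt.T @ U.T) < 0:   # guard against a reflection
            Vt[-1] *= -1
        R_step = Vt.T @ U.T
        t_step = mu_m - R_step @ mu_s
        # Compose the incremental transformation into the current pose.
        theta += np.arctan2(R_step[1, 0], R_step[0, 0])
        x, y = R_step @ np.array([x, y]) + t_step
        # S36: converged when the square-sum error stops decreasing.
        err = float(np.mean(dists ** 2))
        if prev_err - err < tol:
            break
        prev_err = err
    return x, y, theta
```

The pose returned by such a routine may then be used in step S38 to convert the coordinate values of the scan data into the coordinate system of the environmental map.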
The location estimation system according to the present disclosure does not need to be used by being installed on a vehicle including a driving unit. The location estimation system according to the present disclosure may be used for map generation by being installed, for example, on a handcart to be pushed by a user.
An example embodiment of the vehicle including the location estimation system according to the present disclosure will be described below in more detail. In the present example embodiment, an automated guided vehicle will be used as an example of the vehicle. In the following description, the automated guided vehicle will be abbreviated as “AGV”. The “AGV” will hereinafter be identified by the reference sign “10” similarly to the vehicle 10.
The AGV 10 is able to travel in a “guideless mode” that requires no guiding object, such as a magnetic tape, for travel. The AGV 10 is able to perform localization and transmit estimation results to the terminal device 20 and the operation management device 50. The AGV 10 is able to perform automated travel in an environment S in accordance with a command from the operation management device 50.
The operation management device 50 is a computer system that tracks the location of each AGV 10 and manages the travel of each AGV 10. The operation management device 50 may be a desktop PC, a notebook PC, and/or a server computer. The operation management device 50 communicates with each AGV 10 through a plurality of access points 2. For example, the operation management device 50 transmits, to each AGV 10, data on the coordinates of the next destination for each AGV 10. Each AGV 10 transmits, to the operation management device 50, data indicative of the location and attitude (or orientation) of each AGV 10 at regular time intervals (e.g., every 250 milliseconds). When the AGV 10 has reached the designated location, the operation management device 50 transmits data on the coordinates of the next destination to the AGV 10. Each AGV 10 may be able to travel in the environment S in accordance with an operation input to the terminal device 20 by the user 1. An example of the terminal device 20 is a tablet computer.
The AGV 10 is able to not only convey cargo placed on its top but also convey cargo by using a trailer unit connected to the AGV 10.
The AGV 10 may be coupled to the trailer unit 5 by any method. An example of the coupling method will be described below. A plate 6 is secured to the top of the AGV 10. The trailer unit 5 is provided with a guide 7 including a slit. The AGV 10 approaches the trailer unit 5 so that the plate 6 is inserted into the slit of the guide 7. Upon completion of the insertion, the AGV 10 has an electromagnetic lock pin (not shown) passed through the plate 6 and the guide 7 and activates an electromagnetic lock. The AGV 10 and the trailer unit 5 are thus physically coupled to each other.
A map of the environment S is generated so that each AGV 10 is able to travel while estimating its own location. Each AGV 10 is equipped with a location estimation device and a laser range finder (LRF) and is thus able to generate a map by using an output from the LRF.
Each AGV 10 shifts to a data acquisition mode in response to an operation performed by a user. In the data acquisition mode, each AGV 10 starts acquiring sensor data (i.e., scan data) by using the LRF. The subsequent processes are performed as described above.
Movement within the environment S for acquisition of sensor data may be enabled by travel of each AGV 10 in accordance with an operation performed by the user. For example, each AGV 10 wirelessly receives, from the user through the terminal device 20, a travel command that instructs each AGV 10 to move in each of the front/rear/right/left directions. Each AGV 10 travels in the front/rear/right/left directions in the environment S in accordance with the travel command so as to generate a map. When each AGV 10 is connected by wire to an operating device, such as a joystick, each AGV 10 may travel in the front/rear/right/left directions in the environment S in accordance with a control signal from the operating device so as to generate a map. A person may walk while pushing a measuring car equipped with an LRF, thus acquiring sensor data.
Upon generation of the map, each AGV 10 is able to, from then on, perform automated travel while estimating its own location using the map.
The travel control unit 14 is a unit to control the operation of the AGV 10. The travel control unit 14 includes an integrated circuit whose main component is a microcontroller (which will be described below), an electronic component(s), and a substrate on which the integrated circuit and the electronic component(s) are mounted. The travel control unit 14 receives and transmits data from and to the terminal device 20 described above and performs preprocessing computations.
The LRF 15 is an optical instrument that emits, for example, infrared laser beams 15a and detects reflected light of each laser beam 15a, thus measuring a distance to a point of reflection. In the present example embodiment, the LRF 15 of the AGV 10 emits the laser beams 15a in a pulsed form to, for example, a space in the range of 135 degrees to the right and to the left (for a total of 270 degrees) with respect to the front surface of the AGV 10 while changing the direction of each laser beam 15a in steps of 0.25 degrees, and detects reflected light of each laser beam 15a. This makes it possible to obtain, for every 0.25 degrees, data on a distance to a point of reflection in a direction determined by an angle corresponding to a total of 1081 steps. In the present example embodiment, the LRF 15 scans its surrounding space in a direction substantially parallel to a floor surface, which means that the LRF 15 performs planar (or two-dimensional) scanning. The LRF 15, however, may perform scanning in a height direction.
The AGV 10 is able to generate a map of the environment S in accordance with the location and attitude (or orientation) of the AGV 10 and scanning results obtained by the LRF 15. The map may be reflective of the location(s) of a structure(s), such as a wall(s) and/or a pillar(s) around the AGV, and/or an object(s) placed on a floor. Data on the map is stored in a storage device provided in the AGV 10.
The location and attitude, i.e., the pose (x, y, θ), of the AGV 10 may hereinafter be simply referred to as a “location”.
The travel control unit 14 compares measurement results obtained by the LRF 15 with map data retained in itself so as to estimate its own current location in the manner described above. The map data may be map data generated by the other AGV(s) 10.
The AGV 10 includes the travel control unit 14, the LRF 15, two motors 16a and 16b, a driving unit 17, and the wheels 11a and 11b.
The travel control unit 14 includes a microcontroller 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a location estimation device 14e. The microcontroller 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the location estimation device 14e are connected to each other through a communication bus 14f and are thus able to exchange data with each other. The LRF 15 is also connected to the communication bus 14f through a communication interface (not shown) and thus transmits measurement data (which is measurement results) to the microcontroller 14a, the location estimation device 14e, and/or the memory 14b.
The microcontroller 14a is a processor or a control circuit (e.g., a computer) that performs computations to control the entire AGV 10 including the travel control unit 14. The microcontroller 14a is typically a semiconductor integrated circuit. The microcontroller 14a transmits a pulse width modulation (PWM) signal (which is a control signal) to the driving unit 17 and thus controls the driving unit 17 so as to adjust voltages to be applied to the motors. This rotates each of the motors 16a and 16b at a desired rotation speed.
One or more control circuits (e.g., one or more microcontrollers) to control driving of the left motor 16a and the right motor 16b may be provided independently of the microcontroller 14a. For example, the driving unit 17 may include two microcontrollers each of which controls driving of an associated one of the motors 16a and 16b.
The memory 14b is a volatile storage device to store a computer program to be executed by the microcontroller 14a. The memory 14b may also be used as a working memory when the microcontroller 14a and the location estimation device 14e perform computations.
The storage device 14c is a non-volatile semiconductor memory device. Alternatively, the storage device 14c may be a magnetic storage medium, such as a hard disk, or an optical storage medium, such as an optical disc. The storage device 14c may include a head device to write and/or read data to and/or from any of the storage media, and a controller for the head device.
The storage device 14c stores an environmental map M of the environment S in which the AGV 10 travels, and data on one or a plurality of traveling paths (i.e., traveling path data R). The environmental map M is generated by operating the AGV 10 in a map generating mode and is stored in the storage device 14c. The traveling path data R is transmitted from outside after the environmental map M is generated. In the present example embodiment, the environmental map M and the traveling path data R are stored in the same storage device 14c. Alternatively, the environmental map M and the traveling path data R may be stored in different storage devices.
An example of the traveling path data R will be described below.
When the terminal device 20 is a tablet computer, the AGV 10 receives, from the tablet computer, the traveling path data R indicative of a traveling path(s). The traveling path data R in this case includes marker data indicative of the locations of a plurality of markers. The “markers” indicate locations (or passing points) to be passed by the traveling AGV 10. The traveling path data R includes at least location information on a start marker indicative of a travel start location and an end marker indicative of a travel end location. The traveling path data R may further include location information on a marker(s) indicative of one or more intermediate passing points. Supposing that a traveling path includes one or more intermediate passing points, a path extending from the start marker and sequentially passing through the intermediate passing points so as to reach the end marker is defined as a “traveling path”. Data on each marker may include, in addition to coordinate data on the marker, data on the orientation (or angle) and traveling velocity of the AGV 10 until the AGV 10 moves to the next marker. When the AGV 10 temporarily stops at the location of each marker, performs localization, and provides, for example, notification to the terminal device 20, the data on each marker may include data on acceleration time required for acceleration to reach the traveling velocity, and/or deceleration time required for deceleration from the traveling velocity so as to stop at the location of the next marker.
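By way of a non-limiting illustration, marker data of the kind described above may be organized as follows; the field names and values are illustrative assumptions, not the actual format of the traveling path data R.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Marker:
    x: float                            # marker coordinates
    y: float
    theta: Optional[float] = None       # orientation toward the next marker
    velocity: Optional[float] = None    # traveling velocity to the next marker
    accel_time: Optional[float] = None  # time to accelerate to `velocity`
    decel_time: Optional[float] = None  # time to decelerate and stop

# A traveling path: start marker, one intermediate passing point, end marker.
path_r: List[Marker] = [
    Marker(x=0.0, y=0.0, theta=0.0, velocity=1.0, accel_time=0.5),
    Marker(x=5.0, y=0.0, theta=90.0, velocity=1.0),
    Marker(x=5.0, y=3.0, decel_time=0.5),
]
```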
Instead of the terminal device 20, the operation management device 50 (e.g., a PC and/or a server computer) may control movement of the AGV 10. In this case, each time the AGV 10 reaches a marker, the operation management device 50 may instruct the AGV 10 to move to the next marker. From the operation management device 50, for example, the AGV 10 receives, in the form of the traveling path data R of a traveling path(s), coordinate data of a target location (which is the next destination) or data on a distance to the target location and an angle at which the AGV 10 should travel.
The AGV 10 is able to travel along the stored traveling path(s) while estimating its own location using the generated map and the sensor data acquired during travel and output from the LRF 15.
The communication circuit 14d is, for example, a wireless communication circuit to perform wireless communication compliant with Bluetooth (registered trademark) standards and/or Wi-Fi (registered trademark) standards. The Bluetooth standards and Wi-Fi standards both include a wireless communication standard that uses a frequency band of 2.4 GHz. For example, in a mode of generating a map by running the AGV 10, the communication circuit 14d performs wireless communication compliant with Bluetooth (registered trademark) standards so as to communicate with the terminal device 20 on a one-to-one basis.
The location estimation device 14e performs the process of generating a map and the process of estimating, during travel, its own location. The location estimation device 14e generates a map of the environment S in accordance with the location and attitude of the AGV 10 and scanning results obtained by the LRF. During travel, the location estimation device 14e receives sensor data from the LRF 15 and reads the environmental map M stored in the storage device 14c. Local map data (or sensor data) generated from the scanning results obtained by the LRF 15 is matched against the environmental map M covering a larger range, thus identifying its own location (x, y, θ) on the environmental map M. The location estimation device 14e generates data on “reliability” indicative of the degree of agreement between the local map data and the environmental map M. The respective data of its own location (x, y, θ) and reliability may be transmitted from the AGV 10 to the terminal device 20 or the operation management device 50. The terminal device 20 or the operation management device 50 is able to receive the respective data of the location (x, y, θ) and reliability and present them on a display device built into the terminal device 20 or the operation management device 50 or connected thereto.
In the present example embodiment, the microcontroller 14a and the location estimation device 14e are separate components by way of example. Alternatively, a single chip circuit or semiconductor integrated circuit that enables the microcontroller 14a and the location estimation device 14e to operate independently may be provided.
The two motors 16a and 16b are each attached to an associated one of the two wheels 11a and 11b so that each wheel is rotated. In other words, each of the two wheels 11a and 11b is a driving wheel. Each of the motors 16a and 16b is described herein as a motor to drive an associated one of the right and left wheels of the AGV 10.
The vehicle 10 may further include a rotary encoder to measure rotational positions and rotational speeds of the wheels 11a and 11b. The microcontroller 14a may estimate the location and attitude of the vehicle 10 by using not only a signal received from the location estimation device 14e but also a signal received from the rotary encoder.
The driving unit 17 includes motor driving circuits 17a and 17b to adjust voltages to be applied to the two motors 16a and 16b. The motor driving circuits 17a and 17b each include an “inverter circuit”. The motor driving circuits 17a and 17b each turn on and off a current flowing through an associated one of the motors by a PWM signal transmitted from the microcontroller 14a or a microcontroller in the motor driving circuit 17a, thus adjusting a voltage to be applied to an associated one of the motors.
The laser positioning system 14h includes the location estimation device 14e and the LRF 15. The location estimation device 14e and the LRF 15 are connected through, for example, an Ethernet (registered trademark) cable. The location estimation device 14e and the LRF 15 each operate as described above. The laser positioning system 14h outputs information indicative of the pose (x, y, θ) of the AGV 10 to the microcontroller 14a.
The microcontroller 14a includes various general-purpose I/O interfaces or general-purpose input and output ports (not shown). The microcontroller 14a is directly connected through the general-purpose input and output ports to other components in the travel control unit 14, such as the communication circuit 14d and the laser positioning system 14h.
The AGV 10 according to an example embodiment of the present disclosure may include safety sensors, such as an obstacle detecting sensor and a bumper switch (not shown).
The CPU 51, the memory 52, the location DB 53, the communication circuit 54, the map DB 55, and the image processing circuit 56 are connected to each other through a communication bus 57 and are thus able to exchange data with each other.
The CPU 51 is a signal processing circuit (computer) to control the operation of the operation management device 50. The CPU 51 is typically a semiconductor integrated circuit.
The memory 52 is a volatile storage device to store a computer program to be executed by the CPU 51. The memory 52 may also be used as a working memory when the CPU 51 performs computations.
The location DB 53 stores location data indicative of each location that may be a destination for each AGV 10. The location data may be represented, for example, by coordinates virtually set in a factory by an administrator. The location data is determined by the administrator.
The communication circuit 54 performs wired communication compliant with, for example, Ethernet (registered trademark) standards. The communication circuit 54 is connected by wire to the access points 2 (
The map DB 55 stores data on maps of the inside of, for example, a factory where each AGV 10 travels. The data may be in any format as long as each map can be placed in a one-to-one corresponding relationship with the location of an associated one of the AGVs 10. The maps stored in the map DB 55 may be, for example, maps generated by CAD.
The location DB 53 and the map DB 55 may be generated on a non-volatile semiconductor memory, a magnetic storage medium, such as a hard disk, or an optical storage medium, such as an optical disc.
The image processing circuit 56 is a circuit to generate data on an image to be presented on a monitor 58. The image processing circuit 56 is used exclusively when the administrator operates the operation management device 50. Further details on this point are omitted in the present example embodiment. The monitor 58 may be integral with the operation management device 50. The CPU 51 may perform the processes to be performed by the image processing circuit 56.
In the foregoing example embodiments, an AGV that travels in a two-dimensional space (e.g., on a floor surface) has been described by way of example. The present disclosure, however, may be applicable to a vehicle that moves in a three-dimensional space, such as a flying vehicle (e.g., a drone). In the case where a drone generates a map of a three-dimensional space while flying, a two-dimensional space can be extended to a three-dimensional space.
The example embodiments described above may be implemented by a system, a method, an integrated circuit, a computer program, or a storage medium. Alternatively, the example embodiments described above may be implemented by any combination of a system, a device, a method, an integrated circuit, a computer program, and a storage medium.
Vehicles according to example embodiments of the present disclosure may be suitably used to move and convey articles (e.g., cargo, components, and finished products) in places, such as, factories, warehouses, construction sites, distribution centers, and hospitals.
While example embodiments of the present disclosure have been described above, it is to be understood that variations and modifications will be apparent to those skilled in the art without departing from the scope and spirit of the present disclosure. The scope of the present disclosure, therefore, is to be determined solely by the following claims.
This is a U.S. national stage of PCT Application No. PCT/JP2018/030308, filed on Aug. 14, 2018, and priority under 35 U.S.C. § 119(a) and 35 U.S.C. § 365(b) is claimed from Japanese Application No. 2017-169728, filed Sep. 4, 2017; the entire disclosures of each of which are hereby incorporated herein by reference.