The present disclosure relates to trajectory planning, and particularly to an image-based trajectory planning method and a movement control method as well as a mobile machine using the same.
With the maturing of artificial intelligence (AI) techniques, mobile machines such as mobile robots and vehicles have been used in various scenarios of daily life to provide various services such as housework, healthcare, and transportation. To provide the services in a more mobile way, automatic navigation capability is needed. Trajectory planning is one of the essential technologies for realizing automatic navigation, which provides trajectories for a mobile machine to move to the destination while considering kinematic constraints, path length, obstacle avoidance, and the like.
The most common trajectory planning techniques are realized in the Cartesian coordinate system (which has x, y, and z axes), which is straightforward and has already been widely used. However, not all states of the mobile machine in the Cartesian coordinate system can be measured directly or accurately, for example, when the poses of the mobile machine are to be measured but the measurement can only be realized through a single onboard camera. One solution, called the “image-based” technique, is to use the image features of images captured by the camera, which are expressed in the image plane coordinate. The image features can be calculated from multiple feature points fixed in the environment and treated as the states of the mobile machine in the image plane coordinate.
As a newly developed technique, the existing image-based trajectory planning in the image plane coordinate still focuses on fully-actuated manipulators and three-dimensional (3D) drones (which are holonomic), and the planned trajectory may be neither reachable nor practical for mobile machines with nonholonomic and underactuated constraints.
In order to more clearly illustrate the technical solutions in this embodiment, the drawings used in the embodiments or the description of the prior art will be briefly introduced below. In the drawing(s), like reference numerals designate corresponding parts throughout the figures. It should be understood that, the drawings in the following description are only examples of the present disclosure. For those skilled in the art, other drawings can be obtained based on these drawings without creative works.
In order to make the objects, features and advantages of the present disclosure more obvious and easy to understand, the technical solutions in this embodiment will be clearly and completely described below with reference to the drawings. Apparently, the described embodiments are part of the embodiments of the present disclosure, not all of the embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of the present disclosure without creative efforts are within the scope of the present disclosure.
It is to be understood that, when used in the description and the appended claims of the present disclosure, the terms “including”, “comprising”, “having” and their variations indicate the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or a plurality of other features, integers, steps, operations, elements, components and/or combinations thereof.
It is also to be understood that, the terminology used in the description of the present disclosure is only for the purpose of describing particular embodiments and is not intended to limit the present disclosure. As used in the description and the appended claims of the present disclosure, the singular forms “one”, “a”, and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It is also to be further understood that the term “and/or” used in the description and the appended claims of the present disclosure refers to any combination of one or more of the associated listed items and all possible combinations, and includes such combinations.
In the present disclosure, the terms “first”, “second”, and “third” are for descriptive purposes only, and are not to be comprehended as indicating or implying the relative importance or implicitly indicating the amount of technical features indicated. Thus, the feature limited by “first”, “second”, and “third” may include at least one of the feature either explicitly or implicitly. In the description of the present disclosure, the meaning of “a plurality” is at least two, for example, two, three, and the like, unless specifically defined otherwise.
In the present disclosure, the descriptions of “one embodiment”, “some embodiments” or the like described in the specification mean that one or more embodiments of the present disclosure can include particular features, structures, or characteristics which are related to the descriptions of the described embodiments. Therefore, the phrases “in one embodiment”, “in some embodiments”, “in other embodiments” and the like that appear in different places of the specification do not necessarily all refer to the same embodiments, but mean “in one or more but not all embodiments” unless otherwise specifically emphasized.
The present disclosure relates to trajectory planning and movement control for a mobile machine. As used herein, the term “mobile machine” refers to a machine such as a vehicle or a mobile robot that has the capability to move around in its environment. The term “trajectory planning” refers to finding a sequence of valid configurations that moves a mobile machine from the source to the destination and is parametrized by time, where “trajectory” denotes a sequence of poses with time stamps (cf. “path”, which denotes a sequence of poses or positions without time stamps). The term “pose” refers to position (e.g., x and y coordinates on the x and y axes) and posture (e.g., a yaw angle about the z axis). The term “nonholonomic constraint” refers to a constraint on the movement manners of a mobile machine (for example, the mobile machine can move straight ahead and rotate but cannot move directly left or right). The term “navigation” refers to the process of monitoring and controlling the movement of a mobile machine from one place to another, and the term “collision avoidance” refers to preventing or reducing the severity of a collision. The term “sensor” refers to a device, module, machine, or subsystem such as an ambient light sensor or an image sensor (e.g., a camera) whose purpose is to detect events or changes in its environment and send the information to other electronics (e.g., a processor).
In some embodiments, the navigation and/or the trajectory planning of the mobile machine 100 may be actuated through the mobile machine 100 itself (e.g., a control interface on the mobile machine 100) or a control device 200 such as a remote control, a smart phone, a tablet computer, a notebook computer, a desktop computer, or other electronic device by, for example, providing a request for the navigation and/or the trajectory planning of the mobile machine 100. The mobile machine 100 and the control device 200 may communicate over a network which may include, for example, the Internet, intranet, extranet, local area network (LAN), wide area network (WAN), wired network, wireless networks (e.g., Wi-Fi network, Bluetooth network, and mobile network), or other suitable networks, or any combination of two or more such networks.
The navigation module 121 in the storage unit 120 of the mobile machine 100 may be a software module (of the operation system of the mobile machine 100), which has instructions In (e.g., instructions for actuating motor(s) M of the mobile machine 100 to move the mobile machine 100) for implementing the navigation of the mobile machine 100, a map builder 1211, and trajectory planner(s) 1212. The map builder 1211 may be a software module having instructions Ib for building maps for the mobile machine 100. The trajectory planner(s) 1212 may be software module(s) having instructions Ip for planning paths for the mobile machine 100. The trajectory planner(s) 1212 may include a global trajectory planner for planning global trajectories (e.g., trajectory T1) for the mobile machine 100 and a local trajectory planner for planning local trajectories (e.g., trajectory T2) for the mobile machine 100. The global trajectory planner may be, for example, a trajectory planner based on Dijkstra's algorithm, which plans global trajectories based on map(s) built by the map builder 1211 through, for example, simultaneous localization and mapping (SLAM). The local trajectory planner may be, for example, a trajectory planner based on the TEB (timed elastic band) algorithm, which plans local trajectories based on the global trajectory Pg and other data collected by the mobile machine 100. For example, images may be collected through the camera C of the mobile machine 100, and the collected images may be analyzed so as to identify obstacles, so that the local trajectory can be planned with reference to the identified obstacles, and the obstacles can be avoided by moving the mobile machine 100 according to the planned local trajectory.
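As an illustration of the cooperation between the global trajectory planner and the local trajectory planner described above, the following is a minimal sketch in Python (the language, the networkx library, and all names such as map_graph and refine_locally are assumptions introduced for illustration only, not the disclosed implementation):

import networkx as nx

def plan_navigation(map_graph, start, goal, obstacles, refine_locally):
    # Global trajectory (e.g., the trajectory T1): shortest path over the map built
    # by the map builder 1211 (e.g., through SLAM), using Dijkstra's algorithm.
    global_waypoints = nx.dijkstra_path(map_graph, start, goal, weight="distance")
    # Local trajectory (e.g., the trajectory T2): refined around the obstacles
    # identified from the images collected through the camera C; refine_locally
    # stands in for a TEB-style local optimizer supplied by the caller.
    return refine_locally(global_waypoints, obstacles)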
Each of the map builder 1211 and the trajectory planner(s) 1212 may be a submodule separated from the instructions In or other submodules of the navigation module 121, or a part of the instructions In for implementing the navigation of the mobile machine 100. The trajectory planner(s) 1212 may further have data (e.g., input/output data and temporary data) related to the trajectory planning of the mobile machine 100 which may be stored in the one or more memories and accessed by the processing unit 110. In some embodiments, each of the trajectory planner(s) 1212 may be a module in the storage unit 120 that is separated from the navigation module 121.
In some embodiments, the instructions In may include instructions for implementing collision avoidance of the mobile machine 100 (e.g., obstacle detection and path replanning). In addition, the global trajectory planner may replan the global trajectory(s) (i.e., plan new global trajectory(s)) to detour in response to, for example, the original global trajectory(s) being blocked (e.g., blocked by an unexpected obstacle) or inadequate for collision avoidance (e.g., impossible to avoid a detected obstacle when adopted). In other embodiments, the navigation module 121 may be a navigation unit communicating with the processing unit 110, the storage unit 120, and the control unit 130 over the one or more communication buses or signal lines L, and may further include one or more memories (e.g., high-speed random access memory (RAM) and non-transitory memory) for storing the instructions In, the map builder 1211, and the trajectory planner(s) 1212, and one or more processors (e.g., MPU and MCU) for executing the stored instructions In, Ib and Ip to implement the navigation of the mobile machine 100.
The mobile machine 100 may further include a communication subunit 131 and an actuation subunit 132. The communication subunit 131 and the actuation subunit 132 communicate with the control unit 130 over one or more communication buses or signal lines that may be the same or at least partially different from the above-mentioned one or more communication buses or signal lines L. The communication subunit 131 is coupled to communication interfaces of the mobile machine 100, for example, network interface(s) 1311 for the mobile machine 100 to communicate with the control device 200 via the network(s) N and I/O interface(s) 1312 (e.g., a physical button), and the like. The actuation subunit 132 is coupled to component(s)/device(s) for implementing the motions of the mobile machine 100 by, for example, actuating motor(s) M of wheels E (see
The mobile machine 100 may further include a sensor subunit 133 which may include a set of sensor(s) and related controller(s), for example, the camera C and an IMU U (or an accelerometer and a gyroscope), for detecting the environment in which the mobile machine 100 is located so as to realize its navigation. The sensor subunit 133 communicates with the control unit 130 over one or more communication buses or signal lines that may be the same or at least partially different from the above-mentioned one or more communication buses or signal lines L. In other embodiments, in the case that the navigation module 121 is the above-mentioned navigation unit, the sensor subunit 133 may communicate with the navigation unit over one or more communication buses or signal lines that may be the same or at least partially different from the above-mentioned one or more communication buses or signal lines L. In addition, the sensor subunit 133 may just be an abstract component for representing the logical relationships between the components of the mobile machine 100.
In some embodiments, the map builder 1211, the trajectory planner(s) 1212, the sensor subunit 133, and the motor(s) M (and wheels and/or joints of the mobile machine 100 coupled to the motor(s) M) jointly compose a (navigation) system which implements map building, (global and local) trajectory planning, and motor actuating so as to realize the navigation of the mobile machine 100. In addition, the various components shown in
According to the trajectory planning method, the processing unit 110 obtains a desired image Id of the referenced target T and an initial image Ii of the referenced target T through the camera C of the mobile machine 100 (block 310 of
The processing unit 110 may further calculate a homography H0 of the desired image Id and the initial image Ii (block 320 of
[y1(t), . . . ,yn(t)]∝H(t)[y1(tƒ), . . . ,yn(tƒ)];
where, H(t) is the Euclidean homography of the two images, t∈[0, tƒ], 0 is the starting time corresponding to the initial image Ii, and tƒ is the ending time (which is with respect to the starting time, in seconds) corresponding to the desired image Id (the sequence of H(0)-H(tƒ) can be seen as the trajectory of homography); yi(t) is the i-th image point on the image plane coordinate (unit in pixels) of an image which is captured at the time t (the image will be the initial image Ii when t=0, and will be the desired image Id when t=tƒ), and there are n points in the initial image Ii which are matched with n points in the desired image Id (n is larger than or equal to three); and yi(tƒ) is the i-th image point on the image plane coordinate (unit in pixels) of the desired image Id which is captured at the ending time tƒ, and there are also n points in the desired image Id which are matched with the n points in the initial image Ii. ∝ means that the left-hand side is proportional to the right-hand side but they may not be equal. That is, the homography H(t) can be solved up to a scale rather than obtaining its truth value. Given those 2n points in the initial image Ii and the desired image Id, the homography H(t) can be solved. For example, if 4 pairs of coplanar points (i.e., n=4) can be detected and matched in the initial image Ii and the desired image Id (the points are assumed to lie in a plane Π (not shown)), the homography H(t) can be solved through a corresponding equation set containing 8 equations. Since the homography H(t) has 9 entries (3×3), the solution of the equation set is linearly dependent on (i.e., proportional to) the truth value as mentioned above. H0 is the homography corresponding to the initial image Ii (captured at the starting time of 0) and the desired image Id, that is, the homography H(0).
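As a minimal illustration of solving the homography up to a scale from the matched points as described above, the following sketch applies the direct linear transform with a singular value decomposition (Python and numpy are assumptions; the disclosure does not specify an implementation):

import numpy as np

def estimate_homography(pts_t, pts_tf):
    """pts_t, pts_tf: (n, 2) arrays of matched pixel points yi(t) and yi(tf), n >= 4."""
    rows = []
    for (u, v), (uf, vf) in zip(pts_t, pts_tf):
        # Each correspondence yi(t) ∝ H(t) yi(tf) contributes two linear equations.
        rows.append([uf, vf, 1, 0, 0, 0, -u * uf, -u * vf, -u])
        rows.append([0, 0, 0, uf, vf, 1, -v * uf, -v * vf, -v])
    A = np.asarray(rows, dtype=float)
    # The 9 entries of H(t) are determined only up to scale: take the right singular
    # vector associated with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / np.linalg.norm(H)   # scale-ambiguous estimate of H(t)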
The processing unit 110 may further decompose the homography H0 into an initial translation component b0 and an initial rotation matrix R0 (block 330 of
where, the rotation matrix R(t) represents a rotation between the two images corresponding to the homography H(t) in the camera coordinate system F, the translation component b(t) represents a translation between the two images corresponding to the homography H(t) in the camera coordinate system F, dƒ is the depth from the origin of the desired image Id to the plane Π, and αƒ is the unitary normal to the plane Π in the desired image Id. If dƒ is known in advance, the truth value of the homography H(t) can be calculated, and the truth values of the rotation matrix R(t) and the translation component b(t) can be obtained too. As mentioned above, the homography H(t) is given, hence the rotation matrix R(t) and the translation component b(t) can be obtained in an analytical form. As mentioned above, H0 is the homography H(0), hence the initial translation component b0 is b(0), that is, the translation component corresponding to the initial image Ii (captured at the starting time of 0) and the desired image Id, and the initial rotation matrix R0 is R(0), that is, the rotation matrix corresponding to the initial image Ii and the desired image Id.
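A minimal sketch of the decomposition of the homography H0 into the initial rotation matrix R0 and the initial translation component b0, assuming the standard Euclidean relation H = R + b·αƒᵀ/dƒ that the definitions above describe and using OpenCV (the library and the selection heuristic are assumptions, not the disclosed implementation):

import numpy as np
import cv2

def decompose_h(H, K, alpha_f):
    """H: homography from pixel correspondences; K: camera intrinsics; alpha_f: plane normal."""
    n_sol, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    # decomposeHomographyMat returns up to four candidates; keep the one whose plane
    # normal best matches the expected unitary normal alpha_f.
    best = max(range(n_sol), key=lambda i: float(normals[i].ravel() @ alpha_f))
    R0 = rotations[best]              # initial rotation matrix R0
    b0 = translations[best].ravel()   # initial translation component b0 (scaled by 1/df)
    return R0, b0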
The processing unit 110 may further obtain optimized translation components br(t) corresponding to constraint(s) for the mobile machine 100 based on the initial translation component b0 (block 340 of
In some embodiments, each constraint for the mobile machine 100 may be represented through one objective function of translation component b, and the sequence of intermediate pose(s) Sm may be planned (in the camera coordinate system F) based on the objective function(s) by, for example, the above-mentioned local trajectory planner using TEB algorithm. In the case that TEB algorithm is used, the optimization problem will be transformed into a hyper-graph. The poses S of the mobile machine 100 and the time interval therebetween are used as nodes of the hyper-graph, and objective functions are used as edges of the hyper-graph. Each node is connected by edge(s) to form the hyper-graph, and the hyper-graph is optimized to solve the optimization problem accordingly.
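The hyper-graph transformation mentioned above can be pictured with a small data structure, sketched below (illustrative Python; the class and field names are assumptions, not the disclosed implementation):

from dataclasses import dataclass, field

@dataclass
class HyperGraph:
    poses: list                                # nodes: poses Sj, each as (x, y, theta)
    dts: list                                  # nodes: time intervals dtj between consecutive poses
    edges: list = field(default_factory=list)  # edges: (objective_fn, node_indices, weight)

    def add_objective(self, objective_fn, node_indices, weight=1.0):
        # Each objective function (nonholonomic, velocity, fastest trajectory, ...)
        # becomes an edge connecting the nodes it constrains.
        self.edges.append((objective_fn, node_indices, weight))

    def total_cost(self):
        # The optimizer minimizes the weighted sum of all edge costs.
        return sum(w * fn(self, idx) for fn, idx, w in self.edges)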
Trajectory Optimization: With Nonholonomic Constraint
γj=xj×lj; and
γj+1=lj×xj+1;
where, xj=[cos θj, sin θj, 0]T and xj+1=[cos θj+1, sin θj+1, 0]T represent the heading directions of the two consecutive poses in the world coordinate system, and lj is the unit vector from the origin of the coordinate frame of the pose j to the origin of the coordinate frame of the pose j+1. The objective function to penalize the nonholonomic constraint is ƒl(b)=∥(xj+xj+1)×lj∥2.
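A minimal sketch of evaluating this nonholonomic penalty for two consecutive poses (Python and numpy are assumptions; the pose layout (x, y, θ) is introduced for illustration):

import numpy as np

def nonholonomic_penalty(pose_j, pose_j1):
    """pose = (x, y, theta); returns fl(b) = ||(xj + xj+1) x lj||^2."""
    xj = np.array([np.cos(pose_j[2]), np.sin(pose_j[2]), 0.0])
    xj1 = np.array([np.cos(pose_j1[2]), np.sin(pose_j1[2]), 0.0])
    d = np.array([pose_j1[0] - pose_j[0], pose_j1[1] - pose_j[1], 0.0])
    lj = d / (np.linalg.norm(d) + 1e-12)      # unit vector between the two poses
    # On a feasible arc, gamma_j = xj x lj equals gamma_j+1 = lj x xj+1, which is
    # equivalent to (xj + xj+1) x lj = 0; the residual is penalized.
    return float(np.linalg.norm(np.cross(xj + xj1, lj)) ** 2)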
Trajectory Optimization: With Kinematic Constraint
If the mobile machine 100 has the kinematic constraint of maximum velocity, for every two consecutive poses, the linear velocity vj and the angular velocity ωj can be approximated by:
where, dtj is the time interval between the two consecutive poses. The objective function to penalize a velocity which is over the maximum value is:
where, gj(x) is a smooth function, and
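The exact velocity approximations and the smooth function gj are given by the referenced equations; the following sketch shows one common TEB-style choice, as an assumption rather than the disclosed expressions:

import numpy as np

def velocity_penalty(pose_j, pose_j1, dt_j, v_max, w_max, eps=0.05):
    # Finite-difference approximations of the linear and angular velocities.
    vj = np.hypot(pose_j1[0] - pose_j[0], pose_j1[1] - pose_j[1]) / dt_j
    wj = (pose_j1[2] - pose_j[2]) / dt_j
    # Smooth one-sided penalty: zero while below the bound (minus a small margin),
    # quadratic once the bound is exceeded.
    def g(x, x_max):
        return max(0.0, abs(x) - (x_max - eps)) ** 2
    return g(vj, v_max) + g(wj, w_max)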
Trajectory Optimization: With Fastest Trajectory Constraint
As to the constraint of fastest trajectory for the mobile machine 100, for every two consecutive poses, the objective function for the fastest trajectory constraint is:
where, dtj is the time interval between the two consecutive poses.
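As an illustration, a fastest-trajectory objective of this kind is often taken as the sum of the (squared) time intervals, so that minimizing it shortens the overall transit time; the squared form below is an assumption, not necessarily the disclosed expression:

def time_penalty(dts):
    # dts: the time intervals dtj between consecutive poses of the trajectory.
    return sum(dt ** 2 for dt in dts)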
The processing unit 110 may further obtain the optimized translation components br(t) based on the initial pose Si, the planned intermediate pose(s) Sm, and the desired pose Sd of the trajectory T2 (block 342 of
where, ƒk(b) is an objective function corresponding to a constraint of the mobile machine 100 (each objective function corresponds to one constraint) and βk is its weight, k∈[1, m] (m is the number of the constraints of the mobile machine 100), and b is the translation component. Upon minimizing the sum of all the objective functions ƒk(b) of all the constraints of the mobile machine 100, the translation component b corresponding to each of the objective functions ƒk(b) is taken as an optimized translation component br(t) at time t. A sequence of the optimized translation components br(t) at the times in-between the starting time 0 and the ending time tƒ is obtained. The objective functions ƒk(b) may be solved through the quasi-Newton method to obtain the optimal solution of the translation component b.
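A minimal sketch of this weighted-sum minimization with a quasi-Newton method (scipy and the flattened parameter layout are assumptions introduced for illustration):

import numpy as np
from scipy.optimize import minimize

def optimize_translations(b_init, objectives, weights):
    """b_init: initial guess for the translation component(s);
    objectives: list of callables fk(b); weights: corresponding beta_k."""
    total = lambda b: sum(beta * f(b) for f, beta in zip(objectives, weights))
    # BFGS is a quasi-Newton method, as mentioned above.
    result = minimize(total, np.asarray(b_init, dtype=float), method="BFGS")
    return result.x   # optimized translation component(s) br(t)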
In other embodiments, the optimization against the constraints for the mobile machine 100 (block 340 of
The processing unit 110 may further obtain optimized homography(s) Hr(t) based on the optimized translation component(s) br(t) and the initial rotation matrix R0 (block 350 of
where, R0 is a rotation matrix corresponding to the initial image Ii and the desired image Id, [φ0]=log(R0T)∈so(3), dƒ is the depth from the origin of the desired image Id to the plane Π, and αƒ is the unitary normal to the plane Π in the desired image Id; t∈[0, tƒ], 0 is the starting time corresponding to the initial image Ii, and tƒ is the ending time corresponding to the desired image Id.
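The following heavily hedged sketch shows one plausible way to recompose an optimized homography Hr(t) from the optimized translation component br(t) and the initial rotation matrix R0, by scaling the matrix logarithm [φ0] over [0, tƒ] and reusing the decomposition H = R + b·αƒᵀ/dƒ; the interpolation law is an assumption, not the disclosed formula:

import numpy as np
from scipy.linalg import expm, logm

def recompose_homography(t, t_f, R0, b_r_t, alpha_f, d_f):
    phi0 = logm(R0.T)                         # [phi0] = log(R0^T), a skew-symmetric matrix
    # Interpolate the remaining rotation from R0 (at t=0) to identity (at t=t_f);
    # note exp(-[phi0]) = R0, since log(R0^T) = -log(R0).
    R_t = expm(-phi0 * (1.0 - t / t_f)).real
    return R_t + np.outer(b_r_t, alpha_f) / d_f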
The processing unit 110 may further obtain objective image features sr(t) corresponding to the trajectory T2 based on the optimized homographies Hr(t) (block 360 of
The processing unit 110 may further obtain objective feature points hi of each of the initial image Ii, the desired image Id, and the intermediate image(s) Im by projecting the obtained feature points Pi of each of the initial image Ii, the desired image Id, and the intermediate image(s) Im onto a virtual unitary sphere V (see
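A minimal sketch of projecting pixel feature points Pi onto a virtual unitary sphere V: back-project with the camera intrinsic matrix and normalize to unit length (Python, numpy, and the intrinsic matrix K are assumptions for illustration):

import numpy as np

def project_to_unit_sphere(points_px, K):
    """points_px: (n, 2) pixel coordinates Pi; returns (n, 3) unit vectors hi."""
    K_inv = np.linalg.inv(K)
    ones = np.ones((points_px.shape[0], 1))
    rays = (K_inv @ np.hstack([points_px, ones]).T).T   # back-projected viewing rays
    return rays / np.linalg.norm(rays, axis=1, keepdims=True)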
The processing unit 110 may further obtain the objective image features sr(t) of each of the initial image Ii, the intermediate image(s) Im, and the desired image Id using the plurality of obtained objective feature points hi of each of the initial image Ii, the intermediate image(s) Im, and the desired image Id (block 363 of
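As an illustration of building invariant image features from the objective feature points hi, the sketch below uses the inverses of pairwise distances dab mentioned later in the disclosure; the feature layout itself is an assumption:

import numpy as np
from itertools import combinations

def invariant_features(h_points):
    """h_points: (n, 3) unit vectors hi on the virtual unitary sphere V."""
    feats = []
    for a, b in combinations(range(len(h_points)), 2):
        d_ab = np.linalg.norm(h_points[a] - h_points[b])   # distance dab on the sphere
        feats.append(1.0 / d_ab)                           # inverse-distance feature
    return np.asarray(feats)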
The image-based trajectory planning method plans a trajectory for a mobile machine with constraints in terms of image features in the image plane coordinate. Then, during the navigation of the mobile machine, the poses of the mobile machine can be regulated according to the image features without requiring the poses of the mobile machine relative to either an inertial frame or the visual target for feedback control, so that the mobile machine can eventually reach the desired pose of the trajectory without the need for pose measurements or estimations.
According to the movement control method, the processing unit 110 may obtain a current image Ic of the referenced target T captured at a current pose Sc of the mobile machine 100 through the camera C of the mobile machine 100 (block 770). The current image Ic has to include the referenced target T, so that feature points Pi in the current image Ic that correspond to the referenced target T can be obtained. The current pose Sc is the pose of the mobile machine 100 at the time when the current image Ic is captured, which may be just the initial pose Si if the mobile machine 100 has not moved yet after the initial image Ii is captured (i.e., at the starting point of the trajectory T2).
The processing unit 110 may further obtain the feature points Pi of the current image Ic that are matched with the feature points Pi of the desired image Id (block 780). The matched feature points Pi in the current image Ic are found so as to obtain current image features sr(t) of the current image Ic for comparing with the objective image features sr(t) of the intermediate image(s) Im and the desired image Id.
The processing unit 110 may further obtain objective feature points hi by projecting the obtained feature points Pi onto the virtual unitary sphere V (block 790). As mentioned above, the feature points Pi are projected onto the virtual unitary sphere V as the objective feature points hi (see
The processing unit 110 may further obtain the current image features sr(t) of the current image Ic using the obtained objective feature points hi (block 7100). As mentioned above, in some embodiments, the invariant visual features of image are used as the objective image features sr(t), and the objective image features sr(t) may include, for example, centroid, moments, distances, areas, and other invariant visual features of image. In the case that the obtained objective image features sr(t) include distances, the inverse of the distances dab (e.g., d01 and d02 in
The processing unit 110 may further calculate a difference D between the current image features sr(t) of the current image Ic and the objective image features sr(t) corresponding to the current pose Sc (block 7110). The current image features sr(t) of the current image Ic are actual image features, while the objective image features sr(t) corresponding to the current pose Sc are expected image features. Hence, the current image features sr(t) of the current image Ic may be compared with the objective image features sr(t) of the time of the current pose Sc, that is, the current image features sr(t) and the objective image features sr(t) of the same time t are compared so as to obtain the difference D. For example, in the case that the objective image features sr(t) and the current image features sr(t) include distances, the above-mentioned inverse of distances dab (e.g., d01 and d02 in
The processing unit 110 may further control the mobile machine 100 to move according to the calculated difference D (block 7120). In the case that the objective image features sr(t) and the current image features sr(t) include distances, since the current image features sr(t) are actual image features while the objective image features sr(t) are expected image features, the mobile machine 100 may be controlled to move according to the above-mentioned differences D between the distances dab in the objective image features sr(t) and that in the current image features sr(t), so as to offset the differences therebetween so that the mobile machine 100 can be moved according to the planned trajectory (e.g., the trajectory T2).
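A heavily hedged sketch of one generic way such feedback could be realized: the difference D is mapped to velocity commands through a gain and a feature Jacobian; the gain, the Jacobian, and the pseudo-inverse mapping are assumptions of a generic image-based controller, not the disclosed control law:

import numpy as np

def feature_feedback(s_current, s_objective, jacobian, gain=0.5):
    D = s_current - s_objective                  # difference D between feature vectors
    # Velocity command that drives the feature error toward zero.
    v_cmd = -gain * np.linalg.pinv(jacobian) @ D
    return v_cmd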
The movement control method uses a trajectory for a mobile machine with constraints which is planned in terms of image features in the image plane coordinate. The poses of the mobile machine will be regulated according to the image features during the movement of the mobile machine, and the mobile machine will be moved in a smoother way because the trajectory has been planned with the optimization against the constraints for the mobile machine. In addition, in the movement control method, natural backward movements are also allowed.
It can be understood by those skilled in the art that, all or part of the method in the above-mentioned embodiment(s) can be implemented by one or more computer programs to instruct related hardware. In addition, the one or more programs can be stored in a non-transitory computer readable storage medium. When the one or more programs are executed, all or part of the corresponding method in the above-mentioned embodiment(s) is performed. Any reference to a storage, a memory, a database or other medium may include non-volatile and/or volatile memory. Non-volatile memory may include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, solid-state drive (SSD), or the like. Volatile memory may include random access memory (RAM), external cache memory, or the like.
The processing unit 110 (and the above-mentioned processor) may be a central processing unit (CPU), or another general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), another programmable logic device, a discrete gate, a transistor logic device, or a discrete hardware component. The general purpose processor may be a microprocessor, or the processor may also be any conventional processor. The storage unit 120 (and the above-mentioned memory) may include an internal storage unit such as a hard disk and an internal memory. The storage unit 120 may also include an external storage device such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card.
The exemplary units/modules and methods/steps described in the embodiments may be implemented through software, hardware, or a combination of software and hardware. Whether these functions are implemented through software or hardware depends on the specific application and design constraints of the technical schemes. The above-mentioned trajectory planning method and mobile machine may be implemented in other manners. For example, the division of units/modules is merely a logical functional division, and other division manners may be used in actual implementations, that is, multiple units/modules may be combined or be integrated into another system, or some of the features may be ignored or not performed. In addition, the above-mentioned mutual coupling/connection may be direct coupling/connection or communication connection, and may also be indirect coupling/connection or communication connection through some interfaces/devices, and may also be electrical, mechanical or in other forms.
The above-mentioned embodiments are merely intended for describing but not for limiting the technical schemes of the present disclosure. Although the present disclosure is described in detail with reference to the above-mentioned embodiments, the technical schemes in each of the above-mentioned embodiments may still be modified, or some of the technical features may be equivalently replaced, so that these modifications or replacements do not make the essence of the corresponding technical schemes depart from the spirit and scope of the technical schemes of each of the embodiments of the present disclosure, and should be included within the scope of the present disclosure.