The present disclosure generally relates to the technical field of automatic control of autonomous mobile devices and, more specifically, to a method for controlling movement of an autonomous mobile device.
Autonomous mobile devices, i.e., devices that can autonomously carry out pre-set or pre-configured tasks within a pre-defined closed space (i.e., a work environment), have been widely used in consumer and industrial applications. Currently available autonomous mobile devices typically include, but are not limited to, cleaning robots (e.g., smart floor sweeping robots, smart floor mopping robots, window cleaning robots, etc.), escort or companion type mobile robots (e.g., smart electronic pets, nanny robots), service type mobile robots (e.g., robots for restaurants, hotels, conference room reception), industrial inspection smart devices (e.g., electric power line inspection robots, smart forklifts, etc.), and security robots (e.g., consumer or commercial smart security robots).
An autonomous mobile device typically can move autonomously within a limited space. For example, a cleaning robot or a companion type mobile robot typically operates indoors. Service type mobile robots typically operate within a limited space in a hotel, conference room, etc. Currently, some autonomous mobile devices can obtain locations of obstacles through sensing environmental information, thereby establishing a two-dimensional planar map for the environment, e.g., the space in which the autonomous mobile devices operate. For example, an autonomous mobile device may directly measure distances to obstacles through a laser-based distance measurement device, and may convert the distances into coordinates. Alternatively, the autonomous mobile device may obtain coordinates of an obstacle based on a wheel encoder, an inertial measurement unit (“IMU”), and a collision sensor that detects collision with the obstacle. In some situations, the autonomous mobile device may capture images of objects in the environment through a camera (e.g., an ordinary camera). The autonomous mobile device may determine coordinates of an obstacle, through a visual simultaneous localization and mapping (“vSLAM”) algorithm, based on the captured images and motion data obtained through dead reckoning sensors such as the wheel encoder and IMU. The autonomous mobile device may obtain a map for the environment based on the coordinates of the obstacles.

An obstacle refers to any physical object the autonomous mobile device senses or detects in a work zone that blocks the autonomous mobile device from moving into a target space, such as a wall, a floor cabinet, a bed that does not have sufficient space at its lower portion for the autonomous mobile device to move through, legs of a chair, a refrigerator, a toilet, etc. The autonomous mobile device may detect the physical object that blocks the passage of the autonomous mobile device based on an onboard collision sensor or a proximity sensor. Another example of the obstacle is a magnetic strip disposed on a floor or an object. The magnetic field of the magnetic strip may be detected by a Hall-effect element disposed at the autonomous mobile device. The autonomous mobile device may be controlled, e.g., through an algorithm executed by a processor disposed on the autonomous mobile device, to refrain from traversing the location or space where the magnetic field is present. Another example of the obstacle is stairs leading to a lower floor. When the autonomous mobile device moves to an edge of the stairs, a cliff sensor downwardly disposed at a front portion of the autonomous mobile device can detect the stairs leading to the lower floor. As such, the stairs leading to the lower floor can function as an obstacle to block the passage of the autonomous mobile device. The location where the stairs start to descend may be deemed as an obstacle. Objects that do not block the autonomous mobile device from moving into the floor space occupied by the objects are not obstacles. For example, some beds or tables may have a space with a sufficient height at a lower portion adjacent the floor, which may allow a cleaning robot to move under the beds or tables. The beds or tables configured with a sufficiently high space may not be treated as obstacles.
The legs of the beds or the legs of the tables, however, may be obstacles, because they can block the autonomous mobile device (e.g., the cleaning robot) from moving into the locations they occupy.
In a conventional autonomous mobile device, when the autonomous mobile device moves along a boundary or edge of a first obstacle and arrives at a corner location, i.e., a location where the first obstacle and a second obstacle (or boundaries of the two obstacles) are substantially perpendicular to one another, the autonomous mobile device may spin in place around its own vertical axis (e.g., a predetermined spinning axis passing through the autonomous mobile device) by, e.g., 90 degrees, such that the autonomous mobile device can change its forward moving direction and move along the second obstacle. The spinning may not be an issue for an autonomous mobile device having a substantially round, circular shape, in which the circumferential portions of the housing have substantially equal distances to the spinning axis. Thus, as the autonomous mobile device having a circular shape spins, the circumferential portions of the housing of the autonomous mobile device may not collide with the first obstacle. However, for an autonomous mobile device having a non-circular shape, such as an oval shape, a rectangular shape, or any shape in which the circumferential portions of the housing of the autonomous mobile device have different distances to the spinning axis, a portion of the housing having the furthest distance to the spinning axis of the autonomous mobile device may collide with the first obstacle. For example, when the spinning is initially started, a first circumferential portion of the housing that is closest to the obstacle may have a first distance to the spinning axis. As the autonomous mobile device spins, a second circumferential portion of the housing having a larger, second distance to the spinning axis may collide with the obstacle. Thus, there is a need for an improved algorithm for controlling the movement (e.g., turning) of the autonomous mobile device.
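The geometric cause of such a collision can be illustrated with a short calculation. The sketch below is merely illustrative (the dimensions and variable names are assumptions, not part of any disclosed embodiment): for a rectangular housing, the farthest circumferential point from the spinning axis is a corner, and if the corner's swept radius exceeds the side clearance to the first obstacle, an in-place spin produces a collision.

```python
import math

# Illustrative dimensions in meters (assumed values, not from the disclosure).
half_length = 0.20      # spinning axis to the front/rear face of the housing
half_width = 0.15       # spinning axis to the left/right face of the housing
wall_clearance = 0.05   # gap between the side face and the first obstacle

# For a rectangular housing, the farthest circumferential point from the
# spinning axis is a corner of the rectangle.
swept_radius = math.hypot(half_length, half_width)

# During an in-place spin, the corner sweeps beyond the side face by
# (swept_radius - half_width); if that exceeds the clearance, it hits the wall.
if swept_radius - half_width > wall_clearance:
    print("An in-place spin would collide with the first obstacle.")
else:
    print("An in-place spin clears the first obstacle.")
```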
Embodiments of the present disclosure provide a method executable by a processor of an autonomous mobile device for controlling the movement of the autonomous mobile device. The method may control the turning of the autonomous mobile device at a corner location, or at any location where a turn is needed to change the moving path of the autonomous mobile device. The method disclosed herein resolves the technical issues associated with existing conventional autonomous mobile devices, i.e., that when making a turn or spinning at a location adjacent an obstacle, a circumferential portion of the housing of an autonomous mobile device having a non-circular shape may collide with the obstacle.
According to an aspect of the present disclosure, a method for controlling movement of an autonomous mobile device is provided. The method includes controlling the autonomous mobile device to move along an edge of a first obstacle. The method also includes determining detection of a second obstacle in a moving direction, the first obstacle and the second obstacle forming a corner. The method also includes determining a first location of the autonomous mobile device. The method also includes determining a projected location adjacent the second obstacle. The method also includes determining a curved path connecting the first location and the projected location. The method also includes controlling the autonomous mobile device to move from the first location to the projected location along the curved path.
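The overall flow of this aspect may be outlined as in the following sketch. The helper functions and their names are hypothetical placeholders rather than an actual interface of the disclosed device; the sketch only mirrors the sequence of steps described above.

```python
def corner_turn_control(device):
    """Hypothetical outline of the corner-turning method (illustrative only)."""
    device.follow_edge(obstacle="first")             # move along the edge of the first obstacle
    if device.detects_obstacle_ahead():              # second obstacle forming a corner
        first_location = device.current_location()   # determine the first location
        projected_location = device.project_location_adjacent_second_obstacle()
        curved_path = device.plan_curved_path(first_location, projected_location)
        device.follow_path(curved_path)              # turn along the curve, with no in-place spin
```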
According to another aspect of the present disclosure, a method for controlling movement of an autonomous mobile device in a cornrow pattern is provided. The method includes controlling the autonomous mobile device to move along a first long path in the cornrow pattern. The method also includes determining detection of an obstacle in a moving direction along the first long path. The method also includes determining a first location of the autonomous mobile device. The method also includes determining a projected location on a second long path in the cornrow pattern, the second long path being parallel with the first long path. The method also includes determining a curved path connecting the first location and the projected location. The method also includes controlling the autonomous mobile device to move from the first location to the projected location along the curved path.
According to another aspect of the present disclosure, a method for controlling movement of an autonomous mobile device in a cornrow pattern is provided. The method includes controlling the autonomous mobile device to move along a first long path in the cornrow pattern. The method also includes determining detection of an obstacle in a moving direction along the first long path. The method also includes determining a first location of the autonomous mobile device. The method also includes determining a first projected location. The method also includes determining a first curved path connecting the first location and the first projected location. The method also includes controlling the autonomous mobile device to move from the first location to the first projected location along the first curved path. The method also includes determining a second projected location on a second long path in the cornrow pattern, the second long path being parallel with the first long path. The method also includes determining a second curved path connecting the first projected location and the second projected location. The method further includes controlling the autonomous mobile device to move from the first projected location to the second projected location.
According to another aspect of the present disclosure, an autonomous mobile device is provided. The autonomous mobile device includes a sensor configured to detect an obstacle. The autonomous mobile device includes a motion unit configured to move the autonomous mobile device along a surface. The autonomous mobile device includes a processor configured to control the motion unit to control movement of the autonomous mobile device based on detection of the obstacle. The processor is configured to perform a method including controlling the motion unit to move the autonomous mobile device along an edge of a first obstacle. The method also includes determining detection of a second obstacle in a moving direction, the first obstacle and the second obstacle forming a corner. The method also includes determining a first location of the autonomous mobile device. The method also includes determining a projected location adjacent the second obstacle. The method also includes determining a curved path connecting the first location and the projected location. The method also includes controlling the motion unit to move the autonomous mobile device from the first location to the projected location along the curved path.
According to another aspect of the present disclosure, an autonomous mobile device is provided. The autonomous mobile device includes a sensor configured to detect an obstacle. The autonomous mobile device also includes a motion unit configured to move the autonomous mobile device along a surface. The autonomous mobile device also includes a processor configured to control the motion unit to control movement of the autonomous mobile device based on detection of the obstacle. The processor is configured to perform a method including controlling the autonomous mobile device to move along a first long path in a cornrow pattern. The method also includes determining detection of an obstacle in a moving direction along the first long path. The method also includes determining a first location of the autonomous mobile device. The method also includes determining a projected location on a second long path in the cornrow pattern, the second long path being parallel with the first long path. The method also includes determining a curved path connecting the first location and the projected location. The method also includes controlling the motion unit to move the autonomous mobile device from the first location to the projected location along the curved path.
According to another aspect of the present disclosure, an autonomous mobile device is provided. The autonomous mobile device includes a sensor configured to detect an obstacle. The autonomous mobile device also includes a motion unit configured to move the autonomous mobile device along a surface. The autonomous mobile device also includes a processor configured to control the motion unit to control movement of the autonomous mobile device based on detection of the obstacle. The processor is configured to perform a method including controlling the motion unit to move the autonomous mobile device along a first long path in a cornrow pattern. The method also includes determining detection of an obstacle in a moving direction along the first long path. The method also includes determining a first location of the autonomous mobile device. The method also includes determining a first projected location. The method also includes determining a first curved path connecting the first location and the first projected location. The method also includes controlling the motion unit to move the autonomous mobile device from the first location to the first projected location along the first curved path. The method also includes determining a second projected location on a second long path in the cornrow pattern, the second long path being parallel with the first long path. The method also includes determining a second curved path connecting the first projected location and the second projected location. The method also includes controlling the motion unit to move the autonomous mobile device from the first projected location to the second projected location.
Other aspects of the present disclosure can be understood by those skilled in the art in light of the description, the claims, and the drawings of the present disclosure. The foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims.
The accompanying drawings are provided to illustrate some, but not all, embodiments of the present disclosure to facilitate the understanding of the present disclosure. The drawings are parts of the present disclosure. The illustrative embodiments and the descriptions are for explaining the principles of the present disclosure, and are not intended to limit the scope of the present disclosure. In the drawings:
To assist a person having ordinary skills in the art in understanding the technical solutions of the present disclosure, various embodiments of the present disclosure will be explained in detail, with reference to the accompanying drawings. The embodiments illustrated in the drawings and described below are merely some, but not all, embodiments of the present disclosure. Based on the illustrated embodiments, a person having ordinary skills in the art can derive other embodiments without creative efforts. The derived embodiments are also within the scope of protection of the present disclosure.
It should be understood that in the specification, claims, and drawings of the present disclosure, relational terms such as “first” and “second,” etc., are only used to distinguish similar objects, and do not necessarily describe a specific order or sequence. It should be understood that data modified by such terms may be interchanged in suitable situations, such that the embodiments described herein can be implemented in orders or sequences other than those described or illustrated herein. In addition, the terms “comprise,” “include,” and their variations are intended to mean non-exclusive inclusion. For example, processes, methods, systems, products, or devices (or apparatus) including a series of steps or units are not limited to the listed steps or units, and may also include other steps or units that are not explicitly listed or that are inherent to the processes, methods, products, or devices.
The phrase “at least one of A or B” may encompass all combinations of A and B, such as A only, B only, or A and B. Likewise, the phrase “at least one of A, B, or C” may encompass all combinations of A, B, and C, such as A only, B only, C only, A and B, A and C, B and C, or A and B and C. The phrase “A and/or B” may be interpreted in a manner similar to that of the phrase “at least one of A or B.” For example, the phrase “A and/or B” may encompass all combinations of A and B, such as A only, B only, or A and B. Likewise, the phrase “A, B, and/or C” has a meaning similar to that of the phrase “at least one of A, B, or C.” For example, the phrase “A, B, and/or C” may encompass all combinations of A, B, and C, such as A only, B only, C only, A and B, A and C, B and C, or A and B and C.
The main body 110 may include a first bumper (or first cover, front bumper) 121 and a second bumper (or second cover, rear bumper) 122 at a circumferential side of the main body 110. The bumpers 121 and 122 may be parts of the housing 105, or may be treated as separate parts different from the housing 105. The first bumper 121 may be separated from the second bumper 122 by one or more gaps 120. At least one of the first bumper 121 or the second bumper 122 may be resiliently coupled with the other portions of the housing 105 or other portions of the main body 110 through an elastic member, such as a spring (not shown). When the autonomous mobile device 100 collides with an obstacle, such as a wall or furniture, the first bumper 121 or the second bumper 122 may retract when pushed by the obstacle, thereby providing buffering or impact absorption for the autonomous mobile device 100. One or more collision sensors may be disposed at or behind the first bumper 121 and/or the second bumper 122. When the first bumper 121 and/or the second bumper 122 collides with an object, the one or more collision sensors may detect the collision and generate a signal indicating the occurrence of the collision. In some embodiments, the collision sensor may detect a potential collision and generate a warning signal, or trigger a controller (such as a processor 190) to perform collision avoidance control.
The autonomous mobile device 100 may also include an environmental data sensing device configured to acquire environmental data of a work environment (e.g., a room) in which the autonomous mobile device 100 moves or operates. The environmental data sensing device may include at least one of a camera 125 or a distance measuring device 175. In some embodiments, the autonomous mobile device 100 may include both the camera 125 and the distance measuring device 175. The camera 125 may be configured to capture one or more images of the environment (e.g., room) in which the autonomous mobile device 100 moves. For illustrative purposes, the camera 125 is shown as being mounted at the top portion of the housing 105 of the autonomous mobile device 100. It is understood that the camera 125 may be mounted at any other portion of the autonomous mobile device main body 110, e.g., at the front bumper 121. Although one camera 125 is shown, it is understood that the autonomous mobile device 100 may include multiple cameras 125 disposed at multiple portions of the autonomous mobile device 100, such as at the rear bumper 122, at the left and right sides of the housing 105, etc. The orientation of the camera 125 may be any suitable direction, such as facing front, facing back, facing sides, facing up (e.g., ceiling of a room), facing a direction forming an acute angle relative to the moving direction of the autonomous mobile device 100, etc. In some embodiments, the facing direction of the camera 125 may be adjustable manually or automatically through an electrical adjustment mechanism, such as a motor. In some embodiments, the facing direction of the camera 125 may be fixed.
The autonomous mobile device 100 may include the processor 190 disposed within the inner space enclosed by the housing 105. The processor 190 may include a controller, or may be a controller, or may be part of a controller. The processor 190 may be any suitable processor, such as a central processing unit (“CPU”), a graphics processing unit (“GPU”), an application-specific integrated circuit (“ASIC”), a programmable logic device (“PLD”), or a combination thereof. Other processors not listed above may also be used. The processor 190 may be implemented as software, hardware, firmware, or a combination thereof. For example, the processor 190 may include circuits, logic gates, and/or software codes encoded therein, etc.
The processor 190 may perform various control functions to control the operations and/or movements of the autonomous mobile device 100. The processor 190 may process data and/or signals received from various sensors (or sensing devices) equipped in or on the autonomous mobile device 100, or received from another external device electrically coupled with the autonomous mobile device 100. Based on analysis of the data and/or signals, the processor 190 may control the operation and/or movement of the autonomous mobile device 100. For example, the processor 190 may execute computer-readable instructions or codes to perform a method for controlling the movement, such as turning, of the autonomous mobile device 100 to avoid collision with a detected obstacle.
The autonomous mobile device 100 may include a data storage device 191 configured to store data, signals, images, processor-executable (or computer-executable) instructions or codes, etc. The data storage device 191 may also be referred to as a non-transitory computer-readable medium. The non-transitory computer-readable medium may be any suitable medium for storing, transferring, communicating, broadcasting, or transmitting data, signal, or information. For example, the non-transitory computer-readable medium may include a memory, a hard disk, a magnetic disk, an optical disk, a tape, etc. The memory may include a read-only memory (“ROM”), a random-access memory (“RAM”), a flash memory, etc. The processor 190 may be electrically coupled with the data storage device 191, e.g., through a data transmission or communication bus. The processor 190 may store data into the data storage device 191, or may retrieve data from the data storage device 191.
The autonomous mobile device 100 may include a communication device 180 configured to communicate with another device, such as a cloud server, a docking station of the autonomous mobile device 100, a smart phone, another similar autonomous mobile device operating in the same work environment, etc. The communication device 180 may be at least partially disposed at the housing 105, or may be entirely disposed within the inner space enclosed by the housing 105. The communication device 180 may include a receiver 181 configured to receive data or signals from another device, such as a sensor or sensing device, a cloud server, a smart phone, etc. The communication device 180 may also include a transmitter 182 configured to transmit data or signals to another device. In some embodiments, the receiver 181 and the transmitter 182 may be a single integral transceiver. The autonomous mobile device 100 may further include one or more cleaning devices, such as one or more brushes. For illustrative purposes,
In some embodiments, the distance measuring device 175 may be configured to measure a distance between the autonomous mobile device 100 and an obstacle, such as a wall, furniture, a human being, an animal, etc. For example, in some embodiments, the distance measuring device 175 may be a laser-based distance measuring device, such as a Light Detection and Ranging (“Lidar”) sensor.
The distance measuring device 175 and/or the camera 125 may be configured as a sensor for detecting an obstacle. For example, the processor 190 may identify an obstacle located in front of the autonomous mobile device 100 and/or in other directions around the autonomous mobile device 100 based on signals received from the Lidar sensor (an example of the distance measuring device 175). In some embodiments, the processor 190 may identify an obstacle in front of the autonomous mobile device 100 and/or in other directions around the autonomous mobile device 100 based on analysis of one or more environmental images captured by the camera 125. In some embodiments, one or more additional proximity sensors 171 may be disposed at one or more suitable locations on or in the autonomous mobile device 100, and may be configured to detect an obstacle within a predetermined distance at one or more sides of the autonomous mobile device 100.
The water tank 129 shown in
The autonomous mobile device 100 may include a motion mechanism (or motion unit) configured to enable the autonomous mobile device 100 to move along a surface (e.g., a floor, a ground). The motion mechanism may include a wheel assembly. The wheel assembly may include an omnidirectional wheel 135 disposed at a front portion of the bottom surface 155. The omnidirectional wheel 135 may be a non-driving, passively rotating wheel. The wheel assembly may also include at least two driving wheels 140 disposed at two sides (e.g., left and right sides) of the bottom surface 155. The positions of the omnidirectional wheel 135 and the two driving wheels 140 may form a triangle, as shown in
In some embodiments, the autonomous mobile device 100 may include a mopping mechanism 160 disposed at the bottom surface 155. The mopping mechanism 160 may include at least one movable mopping plate attached with a mop to mop the surface to be cleaned (e.g., a floor). For illustrative purposes, the mopping mechanism 160 is shown as a rectangle in
Similar issues may be encountered in other scenarios in conventional technologies. For example, when operating in a cornrow mode using conventional technologies, the autonomous mobile device 200 may need to change its direction and may need to rotate in place (i.e., spin around its own vertical spinning axis), e.g., after the autonomous mobile device 200 encounters or detects an obstacle in the current moving direction. For example, after detecting the obstacle in front of the autonomous mobile device 200 in the current moving direction, the autonomous mobile device 200 may move backwardly for a predetermined distance, and may then rotate in place. When rotating in place (e.g., spinning around its own vertical spinning axis), due to the non-circular shape, the rear portion or another portion of the autonomous mobile device 200 may collide with the obstacle on the side. The collision may cause damage to the obstacle and/or the autonomous mobile device 200.
In some embodiments, the autonomous mobile device 100 may detect the second wall 212 when colliding with the second wall 212 (e.g., a collision sensor mounted at or behind the front bumper may detect the collision). After colliding with the second wall 212, the autonomous mobile device 100 may move backwardly from the second wall 212 such that the distance L (shown in
It is presumed that at the location A, the forward moving direction of the autonomous mobile device 100 (as indicated by the arrow) forms an acute angle θ with respect to the first wall 211. That is, the forward moving direction of the autonomous mobile device 100 may form the angle θ with respect to the y-axis shown in
At the current location A, the autonomous mobile device 100 may determine the curved path (or arc) p1. The curved path (or arc) p1 connecting the location A and the projected location B may be determined as follows. Based on the current orientation of the autonomous mobile device 100 at the location A (e.g., facing direction or forward moving direction), as indicated by the arrow at the location A, and the orientation of the autonomous mobile device 100 at the projected location B (as indicated by the arrow at the projected location B), a center point of a target circle may be determined. The curved path p1 may be a portion of the target circle. First, the location A and the projected location B are connected to obtain a straight segment AB, as indicated by the dashed line connecting A and B. The segment AB forms an angle α with respect to the x-axis. Next, a perpendicular bisector 301 is drawn (i.e., a line that is perpendicular to the segment AB and that passes through the midpoint m1 of the segment AB). In theory, any point on the perpendicular bisector 301 to the right side of the segment AB may be used as the center of the target circle for generating the curved path (e.g., arc) p1. For example, taking any point on the perpendicular bisector 301 as the center and drawing a circle passing through both the location A and the projected location B generates a curved path p1 (e.g., an arc), which may be used as the projected moving path for the autonomous mobile device 100 to move from the location A to the projected location B. The center O may be selected such that turning along the curved path p1 would not cause any portion of the autonomous mobile device 100 to collide with the wall 211. In the example shown in
The coordinates of the center O of the target circle may be determined as (x0+d, y0+L−wr−lw−r). Based on the coordinates of the center O, the radius r of the target circle, the coordinates of the current location A, and the coordinates of the projected location B after making the turn, the arc (or the curved path p1) connecting the location A and the location B may be determined. The autonomous mobile device 100 may move from the location A to the location B along the determined curved path p1, rather than along substantially straight lines following the first wall 211 and the second wall 212 between the location A and the location B according to conventional technologies. After the autonomous mobile device 100 moves to the location B along the curved path p1, the autonomous mobile device 100 may continue to move in the wall-following mode, following the edge or boundary of the second wall 212. During the movement (e.g., turn) from the location A to the location B along the curved path p1, the motion-related parameters of the autonomous mobile device 100 may be set as follows: the linear velocity of the autonomous mobile device 100 may be a fixed value, the angular velocity = the linear velocity/r, and the rotation angle from the location A to the location B = the angle between the segment OA and the segment OB.
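The arc-following computation just described may be sketched as below. This is a simplified illustration under the stated assumption that the selected center O lies on the perpendicular bisector of segment AB (so the same radius reaches both points); the function name, coordinate values, and the fixed linear velocity are assumptions for illustration only, not the exact implementation of the disclosure.

```python
import math

def arc_turn_parameters(ax, ay, bx, by, ox, oy, linear_velocity=0.3):
    """Derive the radius, angular velocity, and rotation angle for a turn
    from location A to location B along an arc of a circle centered at O.

    Assumes O lies on the perpendicular bisector of segment AB, so that
    |OA| == |OB| and both A and B lie on the same target circle.
    """
    radius = math.hypot(ax - ox, ay - oy)
    angular_velocity = linear_velocity / radius          # from v = omega * r
    # Rotation angle = the angle between segment OA and segment OB.
    oa = (ax - ox, ay - oy)
    ob = (bx - ox, by - oy)
    cos_angle = (oa[0] * ob[0] + oa[1] * ob[1]) / (radius * math.hypot(*ob))
    rotation_angle = math.acos(max(-1.0, min(1.0, cos_angle)))
    return radius, angular_velocity, rotation_angle

# Illustrative values only: a quarter-circle turn at a right-angle corner.
radius, omega, angle = arc_turn_parameters(ax=0.0, ay=0.0, bx=0.3, by=0.3, ox=0.3, oy=0.0)
# Here radius = 0.3 m and the rotation angle is pi / 2 (90 degrees).
```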
A typical cornrow mode (or cornrow pattern) is shown in
The present disclosure provides a method for controlling movement (e.g., the turning) of the autonomous mobile device 100, such that the potential collision associated with turning in the cornrow mode may be avoided.
In the movement control method shown in
For example, in some embodiments, an obstacle detection sensor may detect the wall 311 at a distance from the autonomous mobile device 100. If the distance is greater than a predetermined distance such that the autonomous mobile device 100 can make a turn along a curved path from the current location to the projected location G without colliding with the wall 311, the current location may be used as the first location F. If the distance is less than or equal to the predetermined distance (i.e., not sufficient for the autonomous mobile device 100 to make a turn along a curved path from the current location to the projected location G without colliding with the wall 311), the autonomous mobile device 100 may move backward along the reverse direction of the first long path L2 until the distance from the current location to the wall 311 is greater than the predetermined distance. The current location may then be used as the first location F. After the first location F is determined, the autonomous mobile device 100 may rotate by 90 degrees to face the +y-axis direction, as shown in
In some embodiments, a collision sensor may detect collision of the autonomous mobile device 100 with the wall 311, thereby detecting the wall 311. If the wall 311 is detected based on collision, the autonomous mobile device 100 may move backwardly along the reverse direction of the first long path L2 until the distance from the current location to the wall 311 is greater than the predetermined distance such that the autonomous mobile device 100 can make a turn along a curved path from the current location to the projected location G without colliding with the wall 311. After the autonomous mobile device 100 has moved backwardly such that the distance to the wall 311 is greater than the predetermined distance, the current location of the autonomous mobile device 100 may be used as the first location F.
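The selection of the first location F described in the preceding two paragraphs amounts to a simple back-up-until-clear loop. The following is an illustrative sketch only; the callback name and the clearance parameter are assumptions rather than any actual interface of the device.

```python
def select_first_location(distance_to_wall, current_location, min_clearance, back_up_step):
    """Back up along the reverse of the first long path until the clearance to
    the detected wall is large enough for a collision-free curved turn.

    distance_to_wall: current measured distance to the wall ahead (meters)
    current_location: (x, y) coordinates of the device on the first long path
    min_clearance:    predetermined distance required to complete the turn
    back_up_step:     hypothetical callback that moves the device backward a
                      small step and returns (new_distance_to_wall, new_location)
    """
    while distance_to_wall <= min_clearance:
        distance_to_wall, current_location = back_up_step()
    return current_location  # used as the first location F
```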
The first location F may be determined after determining the detection of the obstacle (e.g., the wall 311). The autonomous mobile device 100 may determine the coordinates of the first location F. The autonomous mobile device 100 may determine the pose (e.g., coordinates and orientation) of the autonomous mobile device 100 at the projected location G. The projected location G (also referred to as the direction-reversing point) is located on the second long path L3, which is parallel with the current moving path, i.e., the first long path L2 where the first location F is located. The projected location G may be selected based on a variety of factors. For example, in some embodiments, the projected location G may be selected to be a predetermined distance dG in the x-axis direction from the first location F. The predetermined distance dG may be pre-set. The pose of the autonomous mobile device 100 at the first location F may include coordinates and orientation. The coordinates of the autonomous mobile device 100 at the first location F may be determined as (x1, y1), and the orientation of the autonomous mobile device 100 may be the +y-axis (or y-axis) direction. The predetermined distance dG may be equal to the length of the projection of the distance between the projected location G (which may be the third location G shown in
Because in a cornrow mode, at the location G, the moving direction of the autonomous mobile device 100 is predetermined, i.e., the +x-axis direction, the orientation of the autonomous mobile device 100 at the location G may be determined as the +x-axis direction. As such, the pose of the autonomous mobile device 100 at the location G may be determined.
According to the movement control method disclosed herein, the autonomous mobile device 100 may move from the first location F to the projected location G through a curved path (or arc) p11. In some embodiments, the curved path p11 may be a portion of a target circle. The target circle and the curved path p11 may be determined as follows. Based on the pose of the autonomous mobile device 100 at the first location F, the pose of the autonomous mobile device 100 at the direction-reversing location G, the predetermined distance dG, and the predetermined distance b between adjacent parallel paths in the cornrow mode (or the cornrow gap b), the autonomous mobile device 100 may determine the center point O of the target circle. For example, in some embodiments, a segment FG may be formed by connecting the locations F and G. The angle between the segment FG and the x-axis may be denoted by β. The midpoint of the segment FG may be denoted by m2. A perpendicular bisector pb may be drawn, passing through the midpoint m2 and being perpendicular to the segment FG. In theory, any point on the perpendicular bisector pb to the right of the segment FG may be used as the center of the target circle. The center O may be selected such that turning along the curved path p11 would not cause any portion of the autonomous mobile device 100 to collide with the wall 311. Based on the selected center O, and using the distance between the first location F and the center O (“OF”) or the distance between the center O and the projected location G (“OG”) as the radius r, drawing an arc passing through both the first location F and the projected location G generates the curved path p11 (or arc) for moving the autonomous mobile device 100. The curved path p11 (or arc) is part of the target circle. For simplicity of illustration, only the arc p11 of the target circle is shown in
Based on the determined radius of the target circle, the coordinates of the center O of the target circle can be calculated as (x1+dG, y1+b−r). The curved path p11 (i.e., the arc of the target circle between the first location F and the projected location G) may be determined based on the coordinates of the first location F, the coordinates of the projected location G, the coordinates of the center O, and the radius r of the target circle. The curved path p11 may be used as the travel path for the autonomous mobile device 100 when the autonomous mobile device 100 makes a turn from the first location F on the first long path L2 to the projected location G on the second long path L3 in the cornrow mode. The autonomous mobile device 100 may move from the first location F to the projected location G following the curved path p11 (or arc), and then continue to move from the projected location G in a pre-set moving direction (which is the +x-axis direction) along the path L3. During the movement (e.g., turn) from the first location F to the projected location G along the curved path p11, the motion-related parameters of the autonomous mobile device 100 may be set as follows: the linear velocity of the autonomous mobile device 100 may be a fixed value, the angular velocity = the linear velocity/r, and the rotation angle from the first location F to the projected location G = the angle between the segment OF and the segment OG.
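The single-arc cornrow turn described above may be sketched as follows. One assumption is made that the text leaves open: the radius r is chosen here so that the target circle passes through both F = (x1, y1) and G = (x1+dG, y1+b) while its center sits directly below G, which is consistent with the center coordinates (x1+dG, y1+b−r) given above. The function name and the numeric values are illustrative only.

```python
import math

def cornrow_single_arc(x1, y1, d_g, b, linear_velocity=0.3):
    """Turn from F = (x1, y1) on the first long path to G = (x1 + d_g, y1 + b)
    on the second long path along a single arc about center O."""
    # One consistent radius choice (an assumption): circle through F and G with
    # its center directly below G by r, i.e., O = (x1 + d_g, y1 + b - r).
    r = (d_g ** 2 + b ** 2) / (2.0 * b)
    ox, oy = x1 + d_g, y1 + b - r
    angular_velocity = linear_velocity / r               # from v = omega * r
    # Rotation angle = the angle between segment OF and segment OG.
    of = (x1 - ox, y1 - oy)
    og = ((x1 + d_g) - ox, (y1 + b) - oy)
    cos_angle = (of[0] * og[0] + of[1] * og[1]) / (math.hypot(*of) * math.hypot(*og))
    rotation_angle = math.acos(max(-1.0, min(1.0, cos_angle)))
    return (ox, oy), r, angular_velocity, rotation_angle

# Illustrative call: cornrow gap b = 0.3 m, x-axis offset dG = 0.3 m.
center, r, omega, angle = cornrow_single_arc(x1=0.0, y1=0.0, d_g=0.3, b=0.3)
# Here r = 0.3 m and the rotation angle is pi / 2 (a quarter circle).
```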
According to the disclosed movement control method, at the first location F, the autonomous mobile device 100 may determine a pose (e.g., coordinates and orientation) of the autonomous mobile device 100 at the first projected location H if the autonomous mobile device 100 makes a turn from the first location F following a curved path (which is to be determined). The first projected location H may be selected through a suitable algorithm. For example, in one embodiment, the first projected location H may be selected to have a first predetermined distance d1 from the first location F in the y-axis direction (or in the extending direction of the detected obstacle 311). For example, the first projected location H may be selected to be a point such that the distance d1 is ½, ⅓, or any other fraction of the entire length of the connecting path L23 (or the cornrow gap b). It is noted that in
Based on the first predetermined distance d1, the y-axis coordinate of the first projected location H may be determined as yh = y2 + d1. Here, the first predetermined distance d1 is pre-set, such that the y-axis coordinate of the first projected location H can be determined. The coordinates of the first projected location H, if the autonomous mobile device 100 makes a turn from the location F, may be determined as (x2−Lf+wr+lw, y2+d1).
Before the autonomous mobile device 100 moves from the first location F to the first projected location H, the autonomous mobile device 100 may determine a first target circle and a first curved path (or arc) p21 as follows. Based on the orientation of the autonomous mobile device 100 at the first location F and the orientation of the autonomous mobile device 100 at the first projected location H (if the autonomous mobile device 100 makes a turn), a first center of the first target circle may be determined. Connecting the locations F and H provides a segment FH. A perpendicular bisector pb1 of the segment FH may be drawn. The perpendicular bisector pb1 passes through a midpoint m3, and is perpendicular to the segment FH. In theory, any point on the perpendicular bisector pb1 to the right of the segment FH may be used as the first center of the first target circle for generating the first curved path p21 (or arc p21) for moving the autonomous mobile device 100 from the first location F to the first projected location H. The first center may be selected such that, when making the turn from the location F to the first projected location H, no portion of the autonomous mobile device 100 would collide with the wall 311. The first radius r1 is the distance between the first center and the first location F or the first projected location H. As an example, in the embodiment shown in
The coordinates of the first center O1 of the first target circle may be determined as (x2−Lf+wr+lw+r1, y2+d1). Based on the coordinates of the first center O1, the first radius r1 of the first target circle, the coordinates of the first location F, and the coordinates of the first projected location H, the first curved path p21 (or arc p21) connecting the first location F and the first projected location H may be determined. The autonomous mobile device 100 may move from the first location F to the first projected location H along the first curved path p21. During the movement (e.g., turn) from the first location F to the first projected location H along the first curved path p21, the motion-related parameters of the autonomous mobile device 100 may be set as follows: the linear velocity of the autonomous mobile device 100 may be a fixed value, the angular velocity = the linear velocity/r1, and the rotation angle from the first location F to the first projected location H = the angle between the segments O1F and O1H.
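A corresponding sketch of the first arc of the two-arc turn follows. It mirrors the coordinates given above, treating Lf, wr, and lw as device dimensions whose precise meanings come from the portions of the description referencing the drawings, and taking the first radius r1 as already selected; the function name and any numeric values supplied to it are illustrative assumptions.

```python
import math

def first_arc_parameters(x2, y2, lf, wr, lw, d1, r1, linear_velocity=0.3):
    """First arc p21 from F = (x2, y2) to H = (x2 - lf + wr + lw, y2 + d1),
    about the first center O1 = (x2 - lf + wr + lw + r1, y2 + d1)."""
    hx, hy = x2 - lf + wr + lw, y2 + d1
    o1x, o1y = hx + r1, hy                      # first center O1 as given in the text
    angular_velocity = linear_velocity / r1     # from v = omega * r1
    # Rotation angle = the angle between segments O1F and O1H.
    o1f = (x2 - o1x, y2 - o1y)
    o1h = (hx - o1x, hy - o1y)
    cos_angle = (o1f[0] * o1h[0] + o1f[1] * o1h[1]) / (math.hypot(*o1f) * math.hypot(*o1h))
    rotation_angle = math.acos(max(-1.0, min(1.0, cos_angle)))
    return (o1x, o1y), (hx, hy), angular_velocity, rotation_angle
```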
As shown in
In a cornrow mode, when the autonomous mobile device 100 arrives at the location G, the autonomous mobile device 100 continues to move in the +x-axis direction. Thus, the orientation of the autonomous mobile device 100 at the second projected location G is the +x-axis direction. As such, the pose (which includes the coordinates and orientation) of the autonomous mobile device 100 at the second projected location G can be determined.
The second target circle and the second curved path p22 (or arc p22) may be determined as follows. Based on the pose (e.g., coordinates and orientation) of the autonomous mobile device 100 at the first location F, the location (e.g., coordinates) of the autonomous mobile device 100 at the first projected location H, the pose of the autonomous mobile device 100 at the second projected location G, the first predetermined distance d1, the second predetermined distance d2, and the predetermined distance b (or cornrow gap b), the second center O2 of the second target circle may be determined. Connecting the first projected location H and the second projected location G forms a segment HG. The angle between the segment HG and the y-axis is presumed to be γ. A perpendicular bisector pb2 may be drawn for the segment HG. In theory, any point on the perpendicular bisector pb2 to the right of the segment HG may be used as the second center of the second target circle, and the distance between the second center and the location H or the location G may be used as the second radius. The second center may be selected such that, when the autonomous mobile device 100 moves along the second curved path, no portion of the autonomous mobile device 100 would collide with the wall 311. Thus, the second target circle may be determined. As an example, the second center O2 of the second target circle in the embodiment shown in
Based on the coordinates of the second projected location G and the second radius r2 of the second target circle, the coordinates of the second center O2 of the second target circle may be determined as (x2+d2, y2+b−r2). Based on the coordinates of the first projected location H, the coordinates of the second projected location G, the coordinates of the second center O2, and the second radius r2 of the second target circle, the second curved path p22 (or arc p22) connecting the first projected location H and the second projected location G may be determined. The second curved path p22 is a portion of the second target circle, and passes through both the first projected location H and the second projected location G. The second curved path p22 may be used as the travel path for the autonomous mobile device 100 to move from the first projected location H to the second projected location G. The autonomous mobile device 100 moves from the first projected location H to the second projected location G along the second curved path p22, and at the second projected location G, the autonomous mobile device 100 may continue to move in the pre-set direction (i.e., the +x-axis direction in this embodiment) of the cornrow mode. During the movement (e.g., turn) from the first projected location H to the second projected location G along the second curved path p22, the motion-related parameters of the autonomous mobile device 100 may be set as follows: the linear velocity of the autonomous mobile device 100 may be a fixed value, the angular velocity = the linear velocity/r2, and the rotation angle from the first projected location H to the second projected location G = the angle between the segments O2H and O2G.
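A sketch of the second arc is given below under the same caveats: the second radius r2 is taken as already selected, d2 and the cornrow gap b follow the text, and the function name is an illustrative assumption rather than an actual implementation.

```python
import math

def second_arc_parameters(x2, y2, d2, b, hx, hy, r2, linear_velocity=0.3):
    """Second arc p22 from H = (hx, hy) to G = (x2 + d2, y2 + b),
    about the second center O2 = (x2 + d2, y2 + b - r2)."""
    gx, gy = x2 + d2, y2 + b
    o2x, o2y = x2 + d2, y2 + b - r2             # second center O2 as given in the text
    angular_velocity = linear_velocity / r2     # from v = omega * r2
    # Rotation angle = the angle between segments O2H and O2G.
    o2h = (hx - o2x, hy - o2y)
    o2g = (gx - o2x, gy - o2y)
    cos_angle = (o2h[0] * o2g[0] + o2h[1] * o2g[1]) / (math.hypot(*o2h) * math.hypot(*o2g))
    rotation_angle = math.acos(max(-1.0, min(1.0, cos_angle)))
    return (o2x, o2y), (gx, gy), angular_velocity, rotation_angle
```

In this sketch, the complete two-arc turn consists of following the first arc from F to H and then the second arc from H to G, after which the device continues along the second long path in the pre-set direction.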
Various embodiments have been described to illustrate the exemplary implementations. Based on the disclosed embodiments, a person having ordinary skills in the art may make various other changes, modifications, rearrangements, and substitutions without departing from the scope of the present disclosure. Thus, while the present disclosure has been described in detail with reference to the above embodiments, the present disclosure is not limited to the above described embodiments. The present disclosure may be embodied in other equivalent forms without departing from the scope of the present disclosure. The scope of the present disclosure is defined in the appended claims.
This application is a continuation of International Application No. PCT/CN2022/113676, filed on Aug. 19, 2022, the content of which is incorporated herein by reference in its entirety.
Parent application: PCT/CN2022/113676, filed Aug. 19, 2022 (WO). Child application: 19056714 (US).