This application claims all benefits accruing under 35 U.S.C. § 119 from TW Patent Application No. 105124849, filed on Aug. 4, 2016 in the TW Intellectual Property Office, the contents of which are hereby incorporated by reference.
The subject matter herein generally relates to an autonomous mobile device and a method of forming a guiding path.
Simultaneous localization and mapping (SLAM) is commonly used in an autonomous mobile device for positioning. In SLAM, the autonomous mobile device starts from an unknown location in an unknown environment, establishes its own location and posture by repeatedly observing map features during movement, and incrementally constructs a map, thereby achieving self-localization and map construction simultaneously. SLAM commonly achieves positioning with additional information from sensors, such as GPS, an inertial measurement unit (IMU), or odometry. When the autonomous mobile device moves on universal wheels or omni wheels, the odometry cannot provide a reliable reference for the moving distance, and GPS cannot be used in an indoor environment.
An artificial marker is commonly used to achieve positioning. However, the artificial marker is generally pre-set at a desired location, and a procedure is then written to control the autonomous mobile device moving on the map. After a user buys the autonomous mobile device, technical staff must provide on-site service to set the artificial markers and write the procedure according to the environment of the desired area. The artificial marker is generally located at a starting point, a destination, or a corner. A wheel rotation direction and a motor output of the autonomous mobile device can be calculated from the distance between two artificial markers and the road surface environment. However, the autonomous mobile device may not accurately arrive at the destination, because a wheel of the autonomous mobile device may slip or idle during movement. Thus, several rounds of back-and-forth debugging and modification are needed before the autonomous mobile device can accurately arrive at the destination. Requiring on-site service from technical staff every time is time-consuming and laborious.
Implementations of the present technology will now be described, by way of example only, with reference to the attached figures.
The disclosure is illustrated by way of example and not by way of limitation in the figures of the accompanying drawings in which like references indicate similar elements. It should be noted that references to “another,” “an,” or “one” embodiment in this disclosure are not necessarily to the same embodiment, and such references mean “at least one.”
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures, and components have not been described in detail so as not to obscure the related relevant feature being described. Also, the description is not to be considered as limiting the scope of the embodiments described herein. The drawings are not necessarily to scale, and the proportions of certain parts have been exaggerated to better illustrate details and features of the present disclosure.
Several definitions that apply throughout this disclosure will now be presented.
The term “substantially” is defined to be essentially conforming to the particular dimension, shape, or other feature described, such that the component need not be exactly conforming to such feature. The term “comprise,” when utilized, means “include, but not necessarily limited to”; it specifically indicates open-ended inclusion or membership in the so-described combination, group, series, and the like.
Referring to
The autonomous mobile device can be any mobile device, such as a robot or an unmanned vehicle. The autonomous mobile device moves by feet or wheels.
The desired moving area can be a workplace, such as a workshop, a restaurant, or a tourist station. The artificial markers and the guidance points are located in the desired moving area. Each artificial marker corresponds to an ID. The ID may include a number, a character, or the like. Each ID represents a name of an artificial marker, such as a starting point or a destination. The artificial marker can be from the Tag36h11, Tag36h10, Tag25h9, or Tag16h5 marker series.
The map interpretation module stores the map of the desired moving area, the map description file corresponding to the map, and the location information of the ID corresponding to each artificial marker. The plurality of artificial markers are located in the desired moving area, and the autonomous mobile device moves between the plurality of artificial markers. The map is stored as an extensible markup language (XML) file, or a file in another format, in which the artificial markers are defined. The map description file includes a description of the vicinity of each artificial marker on the map. The map description file may be a place name marked by the artificial marker on the map.
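By way of non-limiting illustration only, such an XML map file and its interpretation might be sketched as follows; the disclosure does not specify a schema, so every element and attribute name below is an assumption:

```python
# Minimal sketch of an XML map file in which the artificial markers are
# defined. The schema (element and attribute names) is assumed for
# illustration only.
import xml.etree.ElementTree as ET

MAP_XML = """
<map name="workshop">
  <marker id="0" family="tag36h11" x="0.0" y="0.0" description="starting point"/>
  <marker id="1" family="tag36h11" x="12.5" y="3.0" description="destination"/>
</map>
"""

def load_marker_locations(xml_text):
    """Return a dict mapping each marker ID to its location and description."""
    markers = {}
    for m in ET.fromstring(xml_text).iter("marker"):
        markers[int(m.get("id"))] = {
            "xy": (float(m.get("x")), float(m.get("y"))),
            "description": m.get("description"),
        }
    return markers

print(load_marker_locations(MAP_XML))  # {0: {... 'starting point'}, 1: {... 'destination'}}
```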
The map interpretation module further stores a plurality of guidance points, the location of the guidance point relative to the artificial marker, and the actual movement information of the autonomous mobile device.
The image collection module comprises a camera. The camera is located on the side of the autonomous mobile device facing the direction of movement to capture images in its field of view, so as to be capable of capturing the artificial marker to form the image signal. The image collection module transmits the image signal to the artificial marker identification module through a data line. The camera can be a web camera based on a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor.
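By way of non-limiting illustration only, continuous image capture from such a camera might be sketched with OpenCV as follows; the device index and the downstream hook are assumptions:

```python
# Minimal sketch: continuously capture frames from a CCD/CMOS webcam with
# OpenCV and hand each frame to the marker identification step.
import cv2

cap = cv2.VideoCapture(0)            # camera facing the direction of movement (index assumed)
try:
    while True:
        ok, frame = cap.read()       # one BGR frame, i.e. the "image signal"
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # identify_markers(gray)     # hypothetical hook into the identification module
finally:
    cap.release()
```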
The artificial marker identification module receives the image captured by the image collection module, and reads and identifies the artificial marker in the image. The artificial marker identification module transmits the ID of the artificial marker to the map interpretation module to determine a position and an angle of the autonomous mobile device relative to the artificial marker, so as to realize positioning. The artificial marker identification module can calculate the distance and the angle between the autonomous mobile device and the artificial marker according to the captured artificial marker, and the control module can fine-tune the movement of the autonomous mobile device toward the artificial marker.
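By way of non-limiting illustration only, identifying a Tag36h11-family marker and estimating the distance to it might be sketched with the pupil_apriltags package as follows; the disclosure names no library, and the camera intrinsics and tag size below are assumed values:

```python
# Sketch of marker identification and relative pose estimation for the
# Tag36h11 family using pupil_apriltags (library choice is an assumption).
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")

def identify_markers(gray):
    """Detect markers in a grayscale frame; report each ID and its distance."""
    detections = detector.detect(
        gray,
        estimate_tag_pose=True,
        camera_params=(600.0, 600.0, 320.0, 240.0),  # fx, fy, cx, cy (assumed)
        tag_size=0.16,                               # marker edge in meters (assumed)
    )
    for d in detections:
        x, y, z = d.pose_t.ravel()                   # marker position in the camera frame
        distance = (x * x + y * y + z * z) ** 0.5
        print(f"marker ID {d.tag_id}: distance {distance:.2f} m")
        # d.pose_R holds the relative rotation, from which the angle between
        # the device and the marker can be derived.
    return detections
```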
The personal guidance module activates the personal guidance mode to make the autonomous mobile device move and follow a guidance person. The personal guidance module continuously defines specific points as guidance points, and stores the location of each guidance point relative to the artificial marker and the actual movement information of the autonomous mobile device in the map interpretation module to form the guidance path.
The voice input module is used to input the name of the guidance path, and the guidance path is automatically added to the autonomous mobile device.
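By way of non-limiting illustration only, the voice input step might be sketched with the SpeechRecognition package as follows; the disclosure names no speech engine, so the library and the registration hook are assumptions:

```python
# Sketch of inputting a guidance-path name by voice, using the
# SpeechRecognition package (library choice is an assumption).
import speech_recognition as sr

def input_path_name():
    """Listen once and return the spoken guidance-path name as text."""
    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        print("Say the name of the guidance path...")
        audio = recognizer.listen(source)
    return recognizer.recognize_google(audio)  # e.g. "dodging the obstacle"

# guidance_paths[input_path_name()] = current_path  # hypothetical registration
```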
The control module controls the autonomous mobile device to move and arrive at the destination according to the guidance path stored by the map interpretation module.
S1: providing an autonomous mobile device comprising a map interpretation module, an image collection module, an artificial marker identification module, a personal guidance module, a voice input module, and a control module;
S2: locating an artificial marker at a desired location, and storing location information of an ID corresponding to the artificial marker in the map interpretation module;
S3: activating a personal guidance mode to make the autonomous mobile device follow a guidance person and move from a starting point to a destination, continuously defining specific points as guidance points during movement of the autonomous mobile device, and storing a location of each guidance point relative to the artificial marker and actual movement information of the autonomous mobile device in the map interpretation module to form a guidance path; and
S4: inputting a name of the guidance path by the voice input module, and automatically adding the guidance path to the autonomous mobile device.
In step S1, the autonomous mobile device can be any mobile device, such as a robot or an unmanned vehicle.
In step S2, the artificial marker is located at the desired location. The location information of the ID corresponding to the artificial marker is stored in the map interpretation module. The image collection module continuously captures images of the artificial markers around the desired moving area to form an image signal, and transmits the image signal to the artificial marker identification module. The artificial marker identification module identifies the image of the artificial marker and transmits the ID of the artificial marker to the map interpretation module. The map interpretation module determines the position of the autonomous mobile device according to the ID of the artificial marker, so as to achieve positioning of the autonomous mobile device.
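By way of non-limiting illustration only, this positioning step might be sketched in planar (2D) geometry as follows; the marker table and all variable names are assumptions:

```python
# Sketch of positioning: look up the identified marker's stored location,
# then offset it by the measured marker-relative distance and bearing.
# Planar geometry and all names are assumed for illustration.
import math

def locate_device(marker_id, distance, bearing, markers):
    """markers: dict of ID -> (x, y, heading) kept by the map interpretation module."""
    mx, my, heading = markers[marker_id]
    # The device lies `distance` meters from the marker, at `bearing` radians
    # off the marker's facing direction.
    return (mx + distance * math.cos(heading + bearing),
            my + distance * math.sin(heading + bearing))

markers = {0: (0.0, 0.0, 0.0), 1: (12.5, 3.0, math.pi)}
print(locate_device(1, 0.8, 0.1, markers))
```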
In step S3, the personal guidance mode of the autonomous mobile device is activated to make the autonomous mobile device follow the guidance person and move from the starting point to the destination. A plurality of specific points is continuously defined as guidance points during movement. Each specific point represents a location where there is an obstacle or a corner, or where the path roughness changes. The autonomous mobile device follows the guidance person through each specific point and continuously defines the specific points as guidance points. The autonomous mobile device stores the location of the artificial marker corresponding to each guidance point and the actual movement information of the autonomous mobile device in the map interpretation module to form the guidance path. The actual movement information of the autonomous mobile device includes a rotational direction of a wheel, a rotational speed of the wheel, and a number of rotations of the wheel.
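By way of non-limiting illustration only, the record stored for each guidance point might be sketched as follows; the field set follows the actual movement information named above, while the class and field names themselves are assumptions:

```python
# Sketch of the per-guidance-point record: a marker-relative location plus
# the wheel's rotational direction, speed, and number of rotations.
# Class and field names are assumed for illustration.
from dataclasses import dataclass, field

@dataclass
class GuidancePoint:
    marker_id: int            # artificial marker the point is referenced to
    offset_xy: tuple          # location relative to that marker, in meters
    wheel_direction: int      # +1 forward, -1 reverse
    wheel_speed: float        # rotational speed, revolutions per second
    wheel_rotations: float    # number of rotations to reach the next point

@dataclass
class GuidancePath:
    name: str = ""
    points: list = field(default_factory=list)

    def add_point(self, point):
        self.points.append(point)  # invoked as the device passes each specific point
```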
In step S4, the name of the guidance path is inputted by the voice input module, and the guidance path is automatically added to the autonomous mobile device.
The control module controls the autonomous mobile device to smoothly and accurately move to the destination according to the guidance path. After the autonomous mobile device arrives at the vicinity of the destination, the artificial marker identification module calculates the distance and the angle between the autonomous mobile device and the destination according to the captured artificial marker at the destination, and the control module then fine-tunes the movement of the autonomous mobile device to arrive at the destination.
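By way of non-limiting illustration only, this final fine-tuning might be sketched as a simple proportional controller on the measured distance and angle; the gains, tolerance, and drive interface are assumptions:

```python
# Sketch of fine-tuning near the destination: steer toward the marker
# using the distance and angle reported by the identification module.
# Gains and tolerance are assumed values.
def fine_tune_step(distance, angle, k_lin=0.5, k_ang=1.5, tol=0.05):
    """Return (linear_velocity, angular_velocity); (0, 0) once within tolerance."""
    if distance < tol:
        return 0.0, 0.0            # arrived at the destination
    return k_lin * distance, k_ang * angle

# loop: distance, angle = identify_marker(); drive(*fine_tune_step(distance, angle))
```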
In the autonomous mobile device and the method of forming a guiding path, the personal guidance mode of the autonomous mobile device is activated to make the autonomous mobile device follow a guidance person and move from the starting point to the destination. The autonomous mobile device follows the guidance person through each specific point and continuously defines the specific points as guidance points. The autonomous mobile device stores the location of each guidance point relative to the artificial marker and the actual movement information of the autonomous mobile device in the map interpretation module to form the guidance path. The autonomous mobile device then moves according to the guidance path. Therefore, on-site service by a technician and manual programming can be avoided, saving time and effort, and the autonomous mobile device can smoothly and accurately arrive at the destination.
Referring to
In the process of the robot moving from the starting point to the destination, the robot cannot go straight forward when it encounters the obstacle. A guidance point F is added to indicate that there is an obstacle. The location of the guidance point relative to the artificial marker and the actual movement information of the autonomous mobile device are stored in the map interpretation module to form a guidance path.
The guidance path is named “dodging the obstacle.” The name is inputted through the voice input module, and the guidance path is automatically added to the robot.
The control module controls the robot to smoothly and accurately move to the destination according to the guidance path named “dodging the obstacle.”
When there is a corner between the artificial marker A and the artificial marker B, or the road roughness between the artificial marker A and the artificial marker B changes, the autonomous mobile device can smoothly and accurately move to the destination by locating a plurality of guidance points.
Depending on the embodiment, certain of the steps of methods described may be removed, others may be added, and the sequence of steps may be altered. It is also to be understood that the description and the claims drawn to a method may include some indication in reference to certain steps. However, the indication used is only to be viewed for identification purposes and not as a suggestion as to an order for the steps.
Finally, it is to be understood that the above-described embodiments are intended to illustrate rather than limit the disclosure. Variations may be made to the embodiments without departing from the spirit of the disclosure as claimed. Elements associated with any of the above embodiments are envisioned to be associated with any other embodiments. The above-described embodiments illustrate the scope of the disclosure but do not restrict the scope of the disclosure.