This application claims priority to Taiwanese Patent Application No. 105101311 filed on Jan. 16, 2016 in the Taiwan Intellectual Property Office, the contents of which are incorporated by reference herein.
The subject matter herein generally relates to indoor positioning methods based on dead reckoning and indoor positioning systems (IPS).
Several indoor positioning techniques currently exist, such as distance measurement to nearby anchor nodes (Wi-Fi, Bluetooth, etc.), image positioning, and dead reckoning. However, each of the aforementioned techniques has its own flaws.
The distance measurement approach requires extra apparatus (e.g., a Wi-Fi emitter/receiver) to be set up around the building, resulting in cumbersome work as well as added cost. Dead reckoning avoids the additional setup and expenditure, but has no means to correct the error that accumulates over time. Although image positioning does not require additional infrastructure around the building and does not accumulate errors, it is expensive and involves a significant amount of processing.
What is needed, therefore, is an indoor positioning method and an indoor positioning system that can overcome the shortcomings described above.
Many aspects of the embodiments can be better understood with reference to the following drawings. The components in the drawings are not necessarily drawn to scale, the emphasis instead being placed upon clearly illustrating the principles of the embodiments. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
It will be appreciated that for simplicity and clarity of illustration, where appropriate, reference numerals have been repeated among the different figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein can be practiced without these specific details. In other instances, methods, procedures, and components have not been described in detail so as not to obscure the relevant features being described. The drawings are not necessarily to scale and the proportions of certain parts may be exaggerated to better illustrate details and features. The description is not to be considered as limiting the scope of the embodiments described herein.
Several definitions that apply throughout this disclosure will now be presented.
When two objects are described as connected, the connection can be such that the objects are permanently connected or releasably connected. The term “inside” indicates that at least a portion of a region is partially contained within a boundary formed by the object. The term “substantially” is defined to be essentially conforming to the particular dimension, shape, or other word that “substantially” modifies, such that the component need not be exact. The term “comprising” means “including, but not necessarily limited to”; it specifically indicates open-ended inclusion or membership in a so-described combination, group, series, and the like. It should be noted that references to “an” or “one” embodiment in this disclosure are not necessarily to the same embodiment, and such references mean at least one.
The present disclosure relates to indoor positioning methods and indoor positioning systems, which are described in detail below.
Referring to the drawings, an indoor positioning system 10 of one embodiment is configured to determine a position of an object 12 inside a building 107 having a wall 105, and includes a processor 100, a gyroscope 102, a digital compass 104, an accelerometer 106, a distance sensor 108, and an indoor map 101 of the building 107.
Compared with a conventional indoor positioning system based on dead reckoning, the indoor positioning system 10 further includes the distance sensor 108 and the correction module of the processor 100. Compared with a conventional indoor positioning system based on distance measurement, the indoor positioning system 10 omits extra apparatus, such as a Wi-Fi or Bluetooth emitter and receiver. Compared with a conventional indoor positioning system based on images, the indoor positioning system 10 does not need an image acquisition device or an image processing device.
Referring to the drawings, an indoor positioning method of one embodiment comprises the following steps:
step (S11), obtaining the estimated position 120 of the object 12 on the indoor map 101;
step (S12), selecting a target area 103 on the indoor map 101, wherein the estimated position 120 is inside the target area 103;
step (S13), detecting the actual distance data D between the object 12 and the wall 105 surrounding the object 12, wherein the actual distance data D includes a 1st actual distance D1 along a 1st direction, a 2nd actual distance D2 along a 2nd direction, . . . and an Mth actual distance DM along an Mth direction, and M≧2;
step (S14), judging whether a unique point inside the target area 103 matches the actual distance data D; if yes, using the unique point as the actual position 122 of the object 12; if no, going to step (S15); and
step (S15), increasing the value of M, and going back to step (S13).
In step (S11), the estimated position 120 of the object 12 is obtained by the gyroscope 102, the digital compass 104 and the accelerometer 106 through dead reckoning and then sent to the processor 100.
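For illustration only, step (S11) can be sketched as a simple dead-reckoning update, shown below in Python. The function and variable names (dead_reckon, heading_rad, step_length_m) are hypothetical and not part of the present disclosure; the sketch only assumes that the digital compass 104 and the gyroscope 102 supply a heading and that the accelerometer 106 supplies a detected step length.

```python
import math

def dead_reckon(position_xy, heading_rad, step_length_m):
    """Advance an estimated position by one detected step (dead reckoning).

    position_xy   -- (x, y) estimated position on the indoor map, in metres
    heading_rad   -- heading from the digital compass/gyroscope, in radians
                     (0 = +X/east, pi/2 = +Y/north)
    step_length_m -- step length derived from the accelerometer, in metres
    """
    x, y = position_xy
    return (x + step_length_m * math.cos(heading_rad),
            y + step_length_m * math.sin(heading_rad))

# Example: starting at (10.0, 5.0), one 0.7 m step heading due north.
estimated_position_120 = dead_reckon((10.0, 5.0), math.pi / 2, 0.7)
print(estimated_position_120)   # approximately (10.0, 5.7)
```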
In step (S12), the selection of the target area 103 is performed by the processor 100. The target area 103 is inside the indoor map 101. The shape of the target area 103 is not limited and can be circular, triangular, rectangular, square, or polygonal. The estimated position 120 can be the geometrical center of the target area 103.
The selection of the target area 103 strongly affects the workload of determining the actual position 122 of the object 12. In the process of determining the actual position 122 of the object 12 according to the actual distance data D, only the points inside the target area 103, rather than all the points on the entire indoor map 101, are calculated. The target area 103 can be selected according to the accuracy and the cumulative error of the gyroscope 102, the digital compass 104, and the accelerometer 106.
In step (S13), the actual distance data D is detected by the distance sensor 108 and then sent to the processor 100. When M=2, the 1st direction and the 2nd direction are substantially perpendicular to each other, and at least one of the 1st direction and the 2nd direction is substantially perpendicular to the wall 105 of the building 107. When M=4, in a rectangular coordinate system, the 1st direction is the +X direction, the 2nd direction is the +Y direction, the 3rd direction is the −X direction, and the 4th direction is the −Y direction. The actual distance data D includes the 1st actual distance D1 along the +X direction, the 2nd actual distance D2 along the +Y direction, the 3rd actual distance D3 along the −X direction, and the 4th actual distance D4 along the −Y direction. When 8≧M>4, the 5th direction, the 6th direction, the 7th direction, and the 8th direction can be directions between adjacent two of the +X direction, the +Y direction, the −X direction, and the −Y direction.
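For the evenly spaced case used in the examples below (α=360°/M), the M measurement directions can be generated as unit vectors, as in the following illustrative sketch. The helper name measurement_directions is hypothetical and not part of the disclosure; the M=2 case described above (two mutually perpendicular directions) is not covered by this even spacing.

```python
import math

def measurement_directions(m):
    """Return m evenly spaced unit direction vectors, starting from +X (east),
    with an angle of 360 degrees / m between adjacent directions."""
    alpha = 2 * math.pi / m
    return [(math.cos(i * alpha), math.sin(i * alpha)) for i in range(m)]

# M = 4 gives +X, +Y, -X, -Y; M = 8 adds the four diagonal directions in between.
for direction in measurement_directions(4):
    print(tuple(round(c, 6) for c in direction))
```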
In step (S14), the judging can be performed by different judging methods, described in the following examples.
In step (S15), the value of M can be increased by different methods, such as M=M+A, where A is a natural number, or M=B×M, where B is a natural number greater than one.
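Putting steps (S11) through (S15) together, the overall flow can be sketched as below. The callables passed to locate are placeholders for the operations described above; the upper bound m_max and the fallback to the dead-reckoned estimate are added only so the sketch terminates, and none of the names come from the present disclosure.

```python
def locate(estimate_position, select_target_area, measure_distances,
           find_matching_points, m=2, m_max=8):
    """One positioning cycle following steps (S11)-(S15).

    The four callables stand for the operations described above:
      estimate_position()                    -- step (S11), dead reckoning
      select_target_area(estimated)          -- step (S12)
      measure_distances(m)                   -- step (S13), distances D1..DM
      find_matching_points(area, distances)  -- step (S14), points matching D
    """
    estimated = estimate_position()                              # (S11)
    target_area = select_target_area(estimated)                  # (S12)
    while m <= m_max:
        distances = measure_distances(m)                         # (S13)
        matches = find_matching_points(target_area, distances)   # (S14)
        if len(matches) == 1:
            return matches[0]   # the unique point is taken as the actual position 122
        m += 1                  # (S15) no unique point: increase M and measure again
    return estimated            # fallback (not in the disclosure): keep the estimate
```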
Different examples of the indoor positioning method, based on different judging methods, are described in detail below.
Referring to the drawings, an indoor positioning method of example 1 comprises the following steps:
step (S11), obtaining the estimated position 120 of the object 12 on the indoor map 101;
step (S12), selecting a target area 103 on the indoor map 101, wherein the estimated position 120 is inside the target area 103;
step (S13), detecting the actual distance data D between the object 12 and the wall 105 surrounding the object 12, wherein the actual distance data D includes the 1st actual distance D1 along the 1st direction, the 2nd actual distance D2 along the 2nd direction . . . and the Mth actual distance DM along the Mth direction, M=4, an angle between adjacent two of the 1st direction, the 2nd direction . . . and the Mth direction is α, and α=360°/M;
step (S14), judging whether a unique point inside the target area 103 matches the actual distance data D by the following substeps:
step (S15), increasing the value of M, and going back to step (S13).
In step (S12), the target area 103 is circular, with the estimated position 120 as its geometrical center.
In step (S13), when M=4, the 1st direction is east, the 2nd direction is north, the 3rd direction is west, and the 4th direction is south. When 8≧M>4, the 5th direction is northeast, the 6th direction is northwest, the 7th direction is southwest, and the 8th direction is southeast.
In step (S141), as shown in the drawings, a plurality of points is selected inside the target area 103, and the map distance data d of each of the plurality of points is calculated on the indoor map 101.
Furthermore, the more points inside the target area 103 that are selected in step (S141), the more accurate the actual position 122 obtained by the method is; however, more workload is needed. In example 1, the interval H between adjacent two of the plurality of points is the same as the accuracy S of the distance sensor 108. The term “matches” means that the map distance data d is identical to the actual distance data D, or that the distance difference between the map distance data d and the actual distance data D is less than the accuracy S of the distance sensor 108. The judging whether the unique point inside the target area 103 has map distance data d which matches the actual distance data D comprises judging whether the unique point inside the target area 103 satisfies a condition: |D−d|≦S.
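A minimal sketch of the grid search of example 1 is given below, under simplifying assumptions that are not part of the disclosure: the indoor map is reduced to a single rectangular room with axis-aligned walls, candidate points are spaced at the sensor accuracy S, and a candidate matches when |D−d|≦S in every measured direction. All names are hypothetical.

```python
def map_distances(point, room):
    """Map distance data d from `point` to the walls of a rectangular room.

    room = (x_min, y_min, x_max, y_max); the returned distances are along
    +X (east), +Y (north), -X (west) and -Y (south), matching M = 4."""
    x, y = point
    x_min, y_min, x_max, y_max = room
    return (x_max - x, y_max - y, x - x_min, y - y_min)

def match_in_target_area(room, center, radius, actual_d, accuracy_s):
    """Return every candidate point inside the circular target area whose map
    distance data d matches the actual distance data D within the accuracy S."""
    matches = []
    steps = int(radius / accuracy_s)
    for i in range(-steps, steps + 1):
        for j in range(-steps, steps + 1):
            # Candidate grid with interval H equal to the accuracy S (step S141).
            p = (center[0] + i * accuracy_s, center[1] + j * accuracy_s)
            if (p[0] - center[0]) ** 2 + (p[1] - center[1]) ** 2 > radius ** 2:
                continue  # keep only points inside the circular target area 103
            d = map_distances(p, room)
            if all(abs(D - dd) <= accuracy_s for D, dd in zip(actual_d, d)):
                matches.append(p)  # |D - d| <= S holds in every direction
    return matches

# 10 m x 8 m room, estimated position near (4, 3), S = 0.25 m,
# measured D = (6, 5, 4, 3): prints the candidate points around (4.0, 3.0).
print(match_in_target_area((0.0, 0.0, 10.0, 8.0), (4.2, 3.1), 1.0,
                           (6.0, 5.0, 4.0, 3.0), 0.25))
```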
Referring to the drawings, an indoor positioning method of example 2 comprises the following steps:
step (S11), obtaining the estimated position 120 of the object 12 on the indoor map 101;
step (S12), selecting a target area 103 on the indoor map 101, wherein the estimated position 120 is inside the target area 103;
step (S13), detecting the actual distance data D between the object 12 and the wall 105 surrounding the object 12, wherein the actual distance data D includes the 1st actual distance D1 along the 1st direction, the 2nd actual distance D2 along the 2nd direction . . . and the Mth actual distance DM along the Mth direction, M=4, an angle between adjacent two of the 1st direction, the 2nd direction . . . and the Mth direction is α, and α=360°/M;
step (S14), judging whether a unique point inside the target area 103 matches the actual distance data D by the following substeps:
step (S15), increasing the value of M, and going back to step (S13).
The method of example 2 is similar to the method of example 1, except that in step (S14), the judging whether the unique point inside the target area 103 matches the actual distance data D is performed by judging whether the 1st line 130, the 2nd line 132 . . . and the Nth line have a unique common point. The 1st direction is east and the 2nd direction is north. The wall 105 of the indoor map 101 can be a straight line as shown in the drawings.
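Under the further simplifying assumption of one straight wall to the east (x = wall_x_east) and one straight wall to the north (y = wall_y_north), the 1st line 130 and the 2nd line 132 of example 2 can be obtained by moving each wall toward the object by the measured distance, and their common point can be tested against the target area, as sketched below. The names are hypothetical and the sketch is an illustration, not the implementation of the disclosure.

```python
def lines_from_measurements(wall_x_east, wall_y_north, d1_east, d2_north):
    """Move each straight wall toward the object by the measured distance.

    The 1st line 130 (all points whose eastward distance to the east wall
    equals d1) is the vertical line x = wall_x_east - d1; the 2nd line 132 is
    the horizontal line y = wall_y_north - d2."""
    return wall_x_east - d1_east, wall_y_north - d2_north

def common_point(line1_x, line2_y, target_area):
    """Return the common point of the two lines if it lies inside the circular
    target area (center cx, cy and radius), otherwise None (increase M)."""
    cx, cy, radius = target_area
    if (line1_x - cx) ** 2 + (line2_y - cy) ** 2 <= radius ** 2:
        return (line1_x, line2_y)
    return None

# East wall at x = 10, north wall at y = 8, measured D1 = 6 and D2 = 5:
# the lines x = 4 and y = 3 cross at (4, 3), inside the target area -> position.
l1, l2 = lines_from_measurements(10.0, 8.0, 6.0, 5.0)
print(common_point(l1, l2, (4.2, 3.1, 1.0)))
```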
Referring to the drawings, an indoor positioning method of example 3 comprises the following steps:
step (S11), obtaining the estimated position 120 of the object 12 on the indoor map 101;
step (S12), selecting a target area 103 on the indoor map 101, wherein the estimated position 120 is inside the target area 103;
step (S13), detecting the actual distance data D between the object 12 and the wall 105 surrounding the object 12, wherein the actual distance data D includes the 1st actual distance D1 along the 1st direction, the 2nd actual distance D2 along the 2nd direction . . . and the Mth actual distance DM along the Mth direction, M=4, an angle between adjacent two of the 1st direction, the 2nd direction . . . and the Mth direction is α, and α=360°/M;
step (S14), judging whether a unique point inside the target area 103 matches the actual distance data D by the following substeps:
step (S15), increasing the value of M, and going back to step (S13).
The method of example 3 is similar to the method of example 2, except that in step (S14C), the 1st line 130, the 2nd line 132 . . . and the Nth line are obtained by moving the wall 105 of the indoor map 101. The 1st direction is east and the 2nd direction is north.
Referring to the drawings, an indoor positioning method of example 4 comprises the following steps:
step (S11), obtaining the estimated position 120 of the object 12 on the indoor map 101;
step (S12), selecting a target area 103 on the indoor map 101, wherein the estimated position 120 is inside the target area 103;
step (S13), detecting the actual distance data D between the object 12 and the wall 105 surrounding the object 12, wherein the actual distance data D includes the 1st actual distance D1 along the 1st direction, the 2nd actual distance D2 along the 2nd direction . . . and the Mth actual distance DM along the Mth direction, M=4, an angle between adjacent two of the 1st direction, the 2nd direction . . . and the Mth direction is α, and α=360°/M;
step (S14), judging whether a unique point inside the target area 103 matches the actual distance data D by the following substeps:
step (S15), increasing the value of M, and going back to step (S13).
The method of example 4 is similar to the method of example 2, except that in the first judging step (S144) of judging whether the lines have a unique common point, three lines, namely the 1st line 130, the 2nd line 132, and the 3rd line 134, rather than two lines, are used. The 1st direction is east, the 2nd direction is north, and the 3rd direction is west.
Referring to the drawings, an indoor positioning method of example 5 comprises the following steps:
step (S11), obtaining the estimated position 120 of the object 12 on the indoor map 101;
step (S12), selecting a target area 103 on the indoor map 101, wherein the estimated position 120 is inside the target area 103;
step (S13), detecting the actual distance data D between the object 12 and the wall 105 surrounding the object 12, wherein the actual distance data D includes the 1st actual distance D1 along the 1st direction, the 2nd actual distance D2 along the 2nd direction . . . and the Mth actual distance DM along the Mth direction, M=4, an angle between adjacent two of the 1st direction, the 2nd direction . . . and the Mth direction is α, and α=360°/M;
step (S14), judging whether a unique point inside the target area 103 matches the actual distance data D by the following substeps:
step (S15), increasing the value of M, and going back to step (S13).
The method of example 5 is similar to the method of example 1, except that in step (S14), the 1st line 130 inside the target area 103 is defined first, and all the points on the 1st line 130, rather than all the points inside the target area 103, are selected to calculate the map distance data d. The 1st direction is east, and the 2nd direction is north.
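A minimal sketch of the example 5 style search is given below, reusing the rectangular-room assumption of the earlier sketch: candidate points are taken only on the vertical 1st line 130 obtained from the 1st actual distance D1, and are checked against the actual distance data D within the sensor accuracy S. The names are hypothetical and not part of the disclosure.

```python
def match_on_line(room, line_x, center, radius, actual_d, accuracy_s):
    """Example 5 style search: candidate points lie only on the vertical
    1st line 130 (x = line_x, obtained from the 1st actual distance D1) and are
    checked against the actual distance data D within the sensor accuracy S.

    room = (x_min, y_min, x_max, y_max) -- axis-aligned walls (assumption)."""
    x_min, y_min, x_max, y_max = room
    cx, cy = center
    matches = []
    steps = int(radius / accuracy_s)
    for j in range(-steps, steps + 1):
        y = cy + j * accuracy_s
        if (line_x - cx) ** 2 + (y - cy) ** 2 > radius ** 2:
            continue  # keep the candidate inside the circular target area 103
        d = (x_max - line_x, y_max - y, line_x - x_min, y - y_min)  # map distances
        if all(abs(D - dd) <= accuracy_s for D, dd in zip(actual_d, d)):
            matches.append((line_x, y))
    return matches

# Same 10 m x 8 m room as before: D1 = 6 fixes the 1st line at x = 4, and only
# points on that line are tested against D = (6, 5, 4, 3).
print(match_on_line((0.0, 0.0, 10.0, 8.0), 4.0, (4.2, 3.1), 1.0,
                    (6.0, 5.0, 4.0, 3.0), 0.25))
```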
Referring to the drawings, an indoor positioning method of example 6 comprises the following steps:
step (S11), obtaining the estimated position 120 of the object 12 on the indoor map 101;
step (S12), selecting a target area 103 on the indoor map 101, wherein the estimated position 120 is inside the target area 103;
step (S13), detecting the actual distance data D between the object 12 and the wall 105 surrounding the object 12, wherein the actual distance data D includes the 1st actual distance D1 along the 1st direction, the 2nd actual distance D2 along the 2nd direction . . . and the Mth actual distance DM along the Mth direction, M=4, an angle between adjacent two of the 1st direction, the 2nd direction . . . and the Mth direction is α, and α=360°/M;
step (S14), judging whether a unique point inside the target area 103 matches the actual distance data D by the following substeps:
step (S15), increasing the value of M, and going back to step (S13).
The method of example 6 is similar to the method of example 5, except that in step (S141), the 1st line 130 inside the target area 103 is defined by moving the wall 105 of the indoor map 101. In example 6, two points that satisfy d2=D2 are obtained in step (S143). In step (S144), the unique point that satisfies dN=DN is the point that satisfies d3=D3, where N=3. The 1st direction is east, the 2nd direction is north, and the 3rd direction is west.
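The disambiguation in steps (S143) and (S144) of example 6 can be illustrated with made-up numbers, as below: two candidate points on the 1st line 130 both satisfy d2=D2 because the map is assumed not to be a simple rectangle, and the 3rd actual distance D3 (toward the west) keeps only one of them. All values and names are hypothetical.

```python
# Two candidate points on the 1st line 130 that both satisfy d2 = D2 (made-up
# values; the map is assumed to be L-shaped, so the two points face different
# west walls).
candidates = [(4.0, 3.0), (4.0, 6.0)]
west_wall_x = {(4.0, 3.0): 0.0, (4.0, 6.0): 2.0}  # x of the west wall each faces

# 3rd actual distance D3, measured toward the west (the 3rd direction).
D3 = 4.0

# Step (S144): keep the candidate whose map distance d3 matches D3.
actual_position = [p for p in candidates
                   if abs((p[0] - west_wall_x[p]) - D3) < 1e-9]
print(actual_position)   # [(4.0, 3.0)]
```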
The indoor positioning method can be used inside buildings with a roof, such as shopping malls, offices, warehouses, or workshops. The indoor positioning method can also be used in places without a roof, such as open-air football fields, open-air plazas, or open-air parking lots, as long as the place is enclosed by a wall. The indoor positioning system can be utilized in both unmanned and manned moving platforms, such as drones, autonomous ground vehicles, robots, cars, or motorcycles.
The indoor positioning method of this disclosure not only overcomes the flaws of high cost and the significant amount of processing of conventional indoor positioning systems based on distance measurement or image processing, but also overcomes the flaw of conventional indoor positioning methods based only on dead reckoning, which have no means to correct the error accumulated over time.
The embodiments shown and described above are only examples. Even though numerous characteristics and advantages of the present technology have been set forth in the foregoing description, together with details of the structure and function of the present disclosure, the disclosure is illustrative only, and changes may be made in the details, including in matters of shape, size, and arrangement of the parts, within the principles of the present disclosure up to, and including, the full extent established by the broad general meaning of the terms used in the claims.
Depending on the embodiment, certain of the steps of methods described may be removed, others may be added, and the sequence of steps may be altered. The description and the claims drawn to a method may include some indication in reference to certain steps. However, the indication used is only to be viewed for identification purposes and not as a suggestion as to an order for the steps.
Number | Date | Country | Kind
--- | --- | --- | ---
105101311 A | Jan. 16, 2016 | TW | national