The use of accurate position information of a mobile station, such as a cellular or other wireless communication device, is becoming prevalent in the communications industry. A satellite positioning system (SPS), such as the Global Positioning System (GPS), offers one approach to wireless mobile station position determination. An SPS user can derive precise navigation information, including three-dimensional position, velocity, and time of day, from signals received from satellite vehicles (SVs) in orbit around the earth. The signals received from the SVs, however, are typically weak. Therefore, to determine the position of a mobile station, the SPS receiver must be sufficiently sensitive to receive these weak signals and interpret the information they represent.
One limitation of current SPS receivers is that they operate only when multiple satellites are clearly in view, without obstructions, and when a good quality antenna is properly positioned to receive their signals. As such, they are normally unusable in areas with blockage conditions, such as where there is significant foliage or where buildings or other obstacles (e.g., urban canyons) intervene, and within buildings.
Navigation assistance information may be acquired for a mobile station based on feature descriptors of an image of a visual beacon captured by the mobile station. The navigation assistance information includes locations of neighboring visual beacons, locations of wireless positioning resources, and user and environmental context information. The navigation assistance information may then be used to assist in navigating within the local environment. The visual beacon may be an artificial feature, such as a QR code or other type of matrix or bar code, or a natural feature, such as a statue or an architectural detail. The mobile station may request navigation assistance by transmitting the feature descriptors to a server. The server retrieves the navigation assistance information from a database based on the feature descriptors and, optionally, location information, which may also be provided in the navigation assistance request, and transmits the navigation assistance information to the mobile station.
A system and method described herein uses a visual beacon to acquire navigation assistance information, which enables position determination and navigation without the need for signals from a satellite positioning system (SPS). Acquiring information that may assist in navigation of a mobile station based on a visual beacon, such as a data code label or QR code, is described in general in U.S. Ser. No. 12/540,498, filed on Aug. 13, 2009, entitled “Accessing Positional Information for a Mobile Station Using a Data Code Label,” by Edward T. L. Hardie, which is assigned to the same assignee as the present patent document, and which is incorporated by reference herein in its entirety. The system and method described herein may include a mobile station that uses features from visual beacons to request navigation assistance and in response receives a navigation assistance message. The navigation assistance message may include information with respect to the position of the visual beacon 104, and therefore the mobile station 102, as well as information with respect to neighboring visual beacons, wireless positioning resources, and user context and environmental context.
As used herein, a mobile station refers to a device such as a cellular or other wireless communication device, personal communication system (PCS) device, personal navigation device, Personal Information Manager (PIM), Personal Digital Assistant (PDA), laptop or other suitable mobile device which is capable of receiving wireless communications. The term “mobile station” is also intended to include devices which communicate with a personal navigation device (PND), such as by short-range wireless, infrared, wireline connection, or other connection—regardless of whether satellite signal reception, assistance data reception, and/or position-related processing occurs at the device or at the PND. Also, “mobile station” is intended to include all devices, including wireless communication devices, computers, laptops, etc. which are capable of communication with a server, such as via the Internet, Wi-Fi, or other network, and regardless of whether satellite signal reception, assistance data reception, and/or position-related processing occurs at the device, at a server, or at another device associated with the network. Any operable combination of the above is also considered a “mobile station.”
Acquiring navigation assistance information for a mobile station using visual beacons as described herein may be advantageous if the mobile station does not have SPS capabilities, if SPS is inactive, or in locations where SPS may not work adequately, e.g., locations that suffer from blockage conditions. Blockage conditions may exist where the SPS receiver in the mobile station has difficulty acquiring and/or tracking SPS signals from SPS satellites. For example, blockage conditions may exist in indoor environments, in urban canyon environments, and in certain outdoor environments with natural obstacles, such as foliage and interfering topology.
Navigation without SPS or in blockage conditions presents two related problems: keeping an accurate sense of position and having access to local information about the topology. Navigation without the benefits of SPS is hampered by the relative difficulty of substituting other technologies. For example, almanacs of wireless access points can supply some data, but they would be expensive to compile and the source of almanac data appropriate for a local area may not be obvious to the user of a mobile station. Another example is inertial sensors, which may provide information based on the tracking of movement through dead reckoning, but these tend to amass errors over time. These techniques can benefit from access to information which associates location information with a specific position as well as from access to information which associates a position with related almanac data or available maps.
As illustrated in the figures, a mobile station 102 may capture an image of a visual beacon 104 and use features of the visual beacon 104 to acquire navigation assistance information. The visual beacon 104 may be an “artificial” beacon, i.e., a feature produced to serve as a reference, such as a QR code or other type of matrix or bar code.
The visual beacon 104, however, may be a “natural” beacon, which includes any visible object other than an artificial beacon. Natural beacons are processed through SIFT or other image processing methods to extract feature descriptors. Natural visual beacons may include features such as doors, buildings, architectural details, sculptures, or other features that can uniquely identify the visual beacon 104, particularly if the approximate position of the mobile station 102 is known. If a natural feature lacks sufficient uniqueness within the search window, i.e., the vicinity of the mobile station's approximate location, the server search may produce more than one matching result. These multiple candidate matches may be preserved along with their corresponding matching quality metrics and likelihoods. The mobile station may be informed of either the best candidate or of multiple candidates with their corresponding likelihoods. On the mobile station side, when no other positioning information is available, the most likely candidate is presented as the best estimate of location and can be fed into a positioning filter (e.g., a Kalman filter). If other positioning information is available, the likelihoods are re-computed in consideration of that information.
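By way of illustration only, the following minimal Python sketch shows one way the candidate re-weighting described above might be carried out on the mobile station side; the field names ("position", "likelihood") and the Gaussian re-weighting against a prior fix are assumptions made for illustration, not part of the original description:

```python
import math

def select_beacon_match(candidates, prior_fix=None, sigma_m=50.0):
    """Pick the most likely beacon match from server-supplied candidates.

    Each candidate is assumed to be a dict with a 'position' ((x, y) in
    meters) and a 'likelihood' (server matching score). If another position
    estimate (prior_fix) is available, each likelihood is re-weighted by a
    Gaussian prior centered on that estimate.
    """
    weighted = []
    for cand in candidates:
        weight = cand["likelihood"]
        if prior_fix is not None:
            dx = cand["position"][0] - prior_fix[0]
            dy = cand["position"][1] - prior_fix[1]
            # Penalize candidates far from the prior position estimate.
            weight *= math.exp(-(dx * dx + dy * dy) / (2.0 * sigma_m ** 2))
        weighted.append((weight, cand))
    total = sum(w for w, _ in weighted)
    if total == 0:
        return None  # no plausible candidate
    best_w, best = max(weighted, key=lambda wc: wc[0])
    # Return the best candidate and its normalized posterior likelihood;
    # this estimate could then be fed into a positioning (e.g., Kalman) filter.
    return best, best_w / total

# Example: two ambiguous matches; a coarse prior fix breaks the tie.
candidates = [
    {"position": (10.0, 20.0), "likelihood": 0.55},
    {"position": (400.0, -90.0), "likelihood": 0.45},
]
print(select_beacon_match(candidates, prior_fix=(15.0, 25.0)))
```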
Navigation assistance information is embedded in or indexed by the visual beacon 104. For example, if the visual beacon 104 is an artificial visual beacon that is capable of encoding information in a sufficiently dense manner, e.g., using colorized QR codes, the navigation assistance information may be embedded in the visual beacon 104 itself. Alternatively, the navigation assistance information may be indexed by the artificial or natural visual beacon 104. The visual beacon 104 may include a feature that is observed by the mobile station 102 and reported to a navigation assistance server 108 via a wireless network 106 in order to obtain the navigation assistance information. For example, the visual beacon 104 may be encoded with a specific identifier that is reported to the navigation assistance server 108 via the wireless network 106. Alternatively, the visual beacon feature may be a unique or semi-unique attribute of the visual beacon 104, such as its size, shape, or location, which can be used to identify the visual beacon 104. Thus, all navigation assistance information requests may be routed to the same navigation assistance server 108, where features of the visual beacon 104 are used to access the navigation assistance information. Alternatively, the visual beacon 104 may be embedded with, e.g., a Uniform Resource Identifier (URI), which can be used by the mobile station 102 to access different navigation assistance servers via the wireless network 106.
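As a non-authoritative sketch, the following Python routine illustrates the three routing cases described above (embedded payload, identifier, and URI); the convention that an embedded payload is JSON, and the example URI, are assumptions for illustration only:

```python
import json
from urllib.parse import urlparse

def route_beacon_payload(payload: str):
    """Route a decoded artificial-beacon payload to the proper handling path.

    Three cases from the description above: (1) navigation assistance
    embedded directly in a sufficiently dense code (assumed here to be
    JSON), (2) a URI identifying a particular navigation assistance server,
    and (3) a plain identifier to report to the default server 108.
    """
    payload = payload.strip()
    if payload.startswith("{"):
        # Case 1: assistance information carried in the beacon itself.
        return ("embedded", json.loads(payload))
    if urlparse(payload).scheme in ("http", "https"):
        # Case 2: fetch assistance from the server the URI points at.
        return ("uri", payload)
    # Case 3: report the identifier to the default navigation assistance server.
    return ("identifier", payload)

print(route_beacon_payload("https://assist.example.com/beacon/1234"))  # hypothetical URI
print(route_beacon_payload("BEACON-0042"))  # hypothetical identifier
```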
As illustrated in the figures, the navigation assistance server 108 may be coupled to a feature descriptor database 110. If the visual beacon 104 is encoded with an identifier, the reported identifier may be used directly as the assistance message index.
If the visual beacon 104 is not encoded with an identifier, i.e., the visual beacon is a natural visual beacon, the reported features of the visual beacon 104 may be compared to the candidate feature descriptors provided by the feature descriptor database 110 to determine the assistance message index. For example, if the visual beacon 104 is a natural feature, such as an architectural detail or statue, the mobile station 102 may report to the navigation assistance server 108 a description of the image of the visual beacon 104, including, e.g., keypoint descriptors determined using the Scale Invariant Feature Transform (SIFT), as well as the approximate location, which may be determined from the last known valid position of the mobile station 102. The candidate feature descriptors provided by the feature descriptor database 110 can then be compared to the reported features to identify the visual beacon 104 and determine the assistance message index.
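For illustration, a minimal sketch of such descriptor matching using OpenCV's SIFT implementation is shown below; the shape of the candidate list and the use of Lowe's ratio test are assumptions rather than requirements of the described system:

```python
import cv2

def match_natural_beacon(query_img, candidates, ratio=0.75):
    """Identify a natural beacon by comparing SIFT descriptors.

    candidates: assumed list of (beacon_id, descriptors) pairs, e.g., the
    location-filtered rows returned from a feature descriptor database.
    Returns the best-matching beacon id and its good-match count.
    """
    sift = cv2.SIFT_create()
    _, query_desc = sift.detectAndCompute(query_img, None)
    if query_desc is None:
        return None, 0  # no usable keypoints in the query image
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    best_id, best_count = None, 0
    for beacon_id, cand_desc in candidates:
        pairs = matcher.knnMatch(query_desc, cand_desc, k=2)
        # Lowe's ratio test: keep matches clearly better than the runner-up.
        good = [p[0] for p in pairs
                if len(p) == 2 and p[0].distance < ratio * p[1].distance]
        if len(good) > best_count:
            best_id, best_count = beacon_id, len(good)
    return best_id, best_count
```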
The navigation assistance server 108 is coupled to a navigation assistance database 112, which is queried based on the assistance message index. A navigation assistance message that corresponds to the assistance message index is retrieved by the server 108 and provided to the mobile station 102 via the wireless network 106.
As illustrated in the figures, the navigation assistance message may include a digital map 150 of the local environment, which may be displayed by the mobile station 102 along with the current position 152 of the mobile station 102 with respect to a local or global coordinate system.
As the mobile station 102 moves within the local environment, the position of the mobile station 102 as determined, e.g., by inertial sensors, by triangulation or other measurement of signals from wireless signals, or by visual localization techniques, may be updated with reference to the local or global coordinate system and displayed on the digital map 150, e.g., as updated position 154. Because inertial sensors alone tend to amass errors over time, the wireless signals and/or visual localization techniques may be used in conjunction with the inertial sensors to minimize errors. Moreover, by capturing images of other artificial or natural visual beacons, the position of the mobile station 102 may be periodically updated or corrected.
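A minimal sketch of this combination follows, assuming a simple fixed blending gain in place of the full Kalman filter that a real implementation would likely use:

```python
class DeadReckoner:
    """Inertially propagated 2-D position with periodic external corrections.

    A real implementation would likely use a Kalman filter, as suggested
    above; here a fixed blending gain stands in for the filter gain.
    """

    def __init__(self, x: float, y: float):
        self.x, self.y = x, y

    def propagate(self, dx: float, dy: float):
        # Inertial step: accumulate displacement (and, over time, its error).
        self.x += dx
        self.y += dy

    def correct(self, fix_x: float, fix_y: float, gain: float = 0.5):
        # Pull the estimate toward an absolute fix from a wireless
        # measurement, a visual localization result, or a new visual beacon.
        self.x += gain * (fix_x - self.x)
        self.y += gain * (fix_y - self.y)

dr = DeadReckoner(0.0, 0.0)
dr.propagate(1.2, 0.3)   # inertial displacement since the last update
dr.correct(1.0, 0.5)     # correction from, e.g., a visual beacon fix
print(dr.x, dr.y)
```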
Because the mobile station 102 is in the vicinity of the visual beacon 104 that is acquired, the navigation assistance message may include micro-indexed contents that significantly enhance the positioning and navigation of the mobile station 102. Current navigation assistance is given within a large geographical context, e.g., tens or hundreds of kilometers, with weak relevance to the mobile station's current location. The micro-indexed contents of the navigation assistance message, on the other hand, focus on localized assistance information, e.g., within approximately hundreds of meters, in order to optimize the assistance information for the local environment. The localized assistance information is enabled by the high positioning accuracy of visual beacon-based positioning, which provides sub-meter accuracy in contrast to the tens or hundreds of meters of accuracy of other urban/indoor positioning technologies. This high urban/indoor positioning accuracy is critical to both data accumulation and distribution. When a mobile station observes a visual beacon, the observation of neighboring navigation resources is tied to that visual beacon. Thus, the provided navigation assistance message includes information in the vicinity of the mobile station's current location with an accurate description of the current environment and surrounding navigation resources.
The proposed contents of a navigation assistance message include a description of the visual beacon 104 itself, a list of neighboring artificial and natural features, a list of available wireless positioning resources, statistics of user context, and environmental context. The list of artificial or natural visual beacons enables faster and more reliable acquisition of neighboring visual beacons based on their descriptions. Additionally, the micro-indexed navigation assistance information specified in the vicinity of the visual beacon 104 provides a more accurate picture of the region and assists the mobile station 102 in determining the best set of positioning sources to be used and how to acquire and track them. The self-description of the visual beacon 104 may include information as provided in Table 1 below.
The list of neighboring artificial features may include information such as the visual beacon description, including size and type, and the location, e.g., absolute or relative. The description of neighboring artificial features may include information as provided in Table 2 below.
The list of neighboring natural features may include information such as the visual beacon description, including size and type, and the location, e.g., absolute or relative. The description of neighboring natural features may include information as provided in Table 3 below.
The list of available wireless positioning resources, such as SPS, WiFi, Femtocell, and cellular signals, may include acquisition information, such as signal type, signal power level, number of channels, frequency and time offset, and corresponding wireless channel characteristics, as well as tracking statistics, such as accuracy, continuity, integrity, and availability. The description of available wireless positioning resources may include information as provided in Table 4 below.
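For illustration only, one plausible in-memory representation of such a resource entry is sketched below; all field names are assumptions, since the text enumerates only the categories of information involved:

```python
from dataclasses import dataclass

@dataclass
class WirelessResourceEntry:
    """One entry of the wireless positioning resource list (cf. Table 4).

    All field names are illustrative assumptions; the description above
    enumerates only the kinds of acquisition information and tracking
    statistics such an entry would carry.
    """
    signal_type: str         # e.g., "SPS", "WiFi", "Femtocell", "cellular"
    power_level_dbm: float   # expected signal power level for acquisition
    num_channels: int
    frequency_offset_hz: float
    time_offset_s: float
    accuracy_m: float        # tracking statistics
    continuity: float
    integrity: float
    availability: float
```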
The list of statistics of user context may include statistical information obtained from other users in the same location, such as statistics of whether the average user is stationary, walking, running, or driving. The user context statistical information may be used to assist the navigation engine of the mobile station 102 in determining the proper mode of operation. For example, the description of previous user behavior in the area can provide a proper model of expected user behavior specific to the area, which can be used to adjust and adapt the navigation software configuration (such as the positioning rate), the proper state of the navigation filter (stationary, walking, driving), and the set of positioning sources to be used. The description of user context may include information as provided in Table 5 below.
The list of environmental context may include information such as whether the local environment is deep indoor, light indoor, dense urban, urban, suburban, residential, or rural, as well as information about building materials in the local environment, which may affect the received signal strength of wireless signals, and maps of the local environment. The environmental context in the vicinity of the visual beacon assists the navigation engine in determining the proper mode of operation. For example, the description of environmental context can provide a proper model of expected user behavior specific to the area and can be used to adjust and adapt the navigation process as performed by the position determination unit, such as adjusting the rate of position acquisition, the proper state of the navigation filter (stationary, walking, driving), and the set of positioning sources to be used. The description of environmental context may include information as provided in Table 6 below.
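The following Python sketch illustrates how a navigation engine might adapt its configuration from such user and environmental context statistics; the specific rates, labels, and source names are assumptions introduced only for illustration:

```python
def configure_navigation(user_context, environment):
    """Adapt navigation engine settings to user and environmental context.

    user_context: assumed dict of mode probabilities, e.g.,
    {"stationary": 0.1, "walking": 0.8, "driving": 0.1};
    environment: one of the labels above ("deep indoor", "urban", ...).
    """
    # Expected user behavior sets the navigation filter state.
    mode = max(user_context, key=user_context.get)
    # Faster expected motion warrants a higher position acquisition rate.
    rate_hz = {"stationary": 0.2, "walking": 1.0,
               "running": 2.0, "driving": 5.0}.get(mode, 1.0)
    # Indoors, favor Wi-Fi, visual, and inertial sources; outdoors, SPS.
    if environment in ("deep indoor", "light indoor"):
        sources = ["wifi", "visual_beacon", "inertial"]
    else:
        sources = ["sps", "cellular", "wifi", "inertial"]
    return {"filter_state": mode, "positioning_rate_hz": rate_hz,
            "positioning_sources": sources}

print(configure_navigation({"stationary": 0.1, "walking": 0.8, "driving": 0.1},
                           "light indoor"))
```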
The mobile station control unit 124 includes an image processing unit 126 that receives the image of the visual beacon 104 from the camera 122 and processes the image to determine identifying features of the visual beacon 104. For example, the image processing unit 126 may decode an artificial visual beacon 104 using suitable matrix code label reading software stored in memory 127 to determine the encoded identifier feature or, where the artificial visual beacon 104 is encoded in a sufficiently dense manner, to determine the navigation assistance information embedded in the artificial visual beacon 104. The image processing unit 126 may also determine a description of the image of a natural visual beacon 104, including, e.g., keypoint descriptors determined using the Scale Invariant Feature Transform (SIFT) or other well-known image recognition techniques. If desired, the image processing unit 126 may include multiple sub-units to perform the different types of image analysis, e.g., decoding matrix codes and feature recognition.
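As a hedged example, the two-stage dispatch between artificial and natural beacons might look like the following sketch, which uses OpenCV's QR detector and SIFT; a deployed image processing unit 126 could of course use other matrix code readers or recognition techniques:

```python
import cv2

def extract_beacon_features(img):
    """Two-stage dispatch: decode an artificial beacon, else extract SIFT.

    Returns ("artificial", decoded_string) when a QR code is found, or
    ("natural", descriptors) with SIFT keypoint descriptors otherwise.
    """
    data, _, _ = cv2.QRCodeDetector().detectAndDecode(img)
    if data:
        # Encoded identifier, URI, or densely embedded assistance payload.
        return ("artificial", data)
    # Fall back to a natural-feature description for the server to match.
    _, descriptors = cv2.SIFT_create().detectAndCompute(img, None)
    return ("natural", descriptors)
```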
With the identifying feature from the visual beacon 104 determined, the mobile station control unit 124 accesses the server 108 via the wireless network 106 to request navigation assistance and, in response, receives the navigation assistance message.
The mobile station 102 may include inertial sensors 142 which may be used to provide data to a position determination unit 131 that determines the position of the mobile station 102. Inertial data, including the direction and magnitude of movement of the mobile station 102, is provided by the inertial sensors 142 to the position determination unit 131 in the mobile station control unit 124, which may then update the determined position of the mobile station accordingly. Examples of inertial sensors that may be used with the mobile station 102 include accelerometers, quartz sensors, gyros, or micro-electromechanical system (MEMS) sensors used as linear accelerometers. Additionally, the mobile station 102 may include a received signal strength indicator (RSSI) system 146 that is connected to the wireless transceiver 144 and the position determination unit 131 of the mobile station control unit 124. The RSSI system 146 determines the signal strength of any signal received by the wireless transceiver 144 and provides the measured signal strength to the position determination unit 131. The measured radio signal strength may be compared to, e.g., a wireless positioning resource almanac received in the navigation assistance message. By way of example, the navigation assistance message may include a database of the signal strengths of wireless access points for different positions within the local environment. The current position of the mobile station 102 can then be determined to lie in the area that corresponds to the data point in the wireless access point almanac with the highest correlation to the measured radio signal strength. Additionally, the position determination unit 131 may determine a current position based on triangulation with multiple wireless positioning resources, based on location information provided in the navigation assistance message. The position determination unit 131 may also determine the current position of the mobile station using visual localization techniques based on images from the camera 122. If the mobile station 102 includes a GPS signal receiver, the position determination unit 131 may be used to determine the position of the mobile station 102 based on the received GPS signals. It should be understood that the position determination unit 131 may include separate sub-units that determine position in accordance with the different techniques described. Moreover, the position determination unit 131 may be part of or separate from the processor 125. The description of user and/or environmental context can be used to adjust and adapt the navigation process as performed by the position determination unit 131, such as adjusting the rate of position acquisition, the proper state of the navigation filter (stationary, walking, driving), and the set of positioning sources to be used.
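A minimal sketch of the almanac correlation step follows, assuming the almanac arrives as a list of (position, fingerprint) pairs and using mean absolute RSSI difference as a simple stand-in for the correlation measure described above:

```python
def locate_by_rssi(measured, almanac):
    """Pick the almanac position whose fingerprint best matches measured RSSI.

    measured: dict mapping access point IDs to RSSI in dBm.
    almanac: assumed list of (position, fingerprint) pairs from the
    navigation assistance message, where each fingerprint is a dict
    shaped like `measured`.
    """
    best_pos, best_err = None, float("inf")
    for position, fingerprint in almanac:
        shared = measured.keys() & fingerprint.keys()
        if not shared:
            continue  # no access points in common with this data point
        err = sum(abs(measured[ap] - fingerprint[ap]) for ap in shared) / len(shared)
        if err < best_err:
            best_pos, best_err = position, err
    return best_pos

almanac = [((0.0, 0.0), {"ap1": -40, "ap2": -70}),
           ((25.0, 5.0), {"ap1": -65, "ap2": -45})]
print(locate_by_rssi({"ap1": -62, "ap2": -48}, almanac))  # -> (25.0, 5.0)
```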
The mobile station 102 also includes a user interface 134 that is in communication with the mobile station control unit 124, e.g., the mobile station control unit 124 accepts data from and controls the user interface 134. The user interface 134 includes a means for displaying images, such as a digital display 136, which displays images produced by the camera 122 as well as information from the navigation assistance message, such as a digital map. The display 136 may further display control menus and positional information. The user interface 134 further includes a keypad 135 or other input device through which the user can input information into the mobile station 102. In one embodiment, the keypad 135 may be integrated into the display 136, such as a touch screen display. The user interface 134 may also include a microphone 137 and speaker 138, e.g., when the mobile station 102 is a cellular telephone. Additionally, the inertial sensors 142 may be used as part of the user interface 134 by detecting user commands in the form of gestures.
The mobile station control unit 124 may include a position database 128 that stores and updates the position of the mobile station 102 based on information obtained from the navigation assistance message, from the inertial sensors 142, or from the access points 114, which act as wireless positioning resources, via the wireless transceiver 144. As the mobile station control unit 124 determines that the position of the mobile station 102 has changed, the position database 128 is updated with the new position. The updated positional information can then be provided, e.g., by displaying a digital map with the new position on the display 136 or by providing additional navigation instructions on the display and/or via the speaker 138.
Once the positional information is downloaded, the mobile station 102 can navigate using the inertial sensors 142 even after the radio has been turned off, e.g., in “airplane mode” on a cell phone. Moreover, if the visual beacon 104 is capable of embedding the positional information, the mobile station 102 can obtain the map and navigate while in “airplane mode”.
The methodologies described herein may be implemented by various means depending upon the application. For example, these methodologies may be implemented in hardware 130, firmware 132, software 129, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic devices, other electronic units designed to perform the functions described herein, or a combination thereof.
For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. Memory may be implemented within the processor unit or external to the processor unit. As used herein the term “memory” refers to any type of long term, short term, volatile, nonvolatile, or other memory and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
For example, software 129 may be stored in memory 127 and executed by the processor 125 to control the operation of the mobile station 102 as described herein. For example, program code stored in a computer-readable medium, such as memory 127, may include program code to determine feature descriptors of a visual beacon 104, to request navigation assistance using the feature descriptors, and to receive navigation assistance information and use the navigation assistance information to assist in navigating within an environment. The computer-readable medium may include program code to update the position of the mobile station using inertial data provided by the inertial sensors 142. Additionally, the computer-readable medium may include program code to measure the strength of a signal from one or more wireless access points that are included in the received navigation assistance information, and to update the position of the mobile station using the measured signal strength.
If implemented in firmware and/or software, the functions may be stored as one or more instructions or code on a computer-readable medium. Examples include computer-readable media encoded with a data structure and computer-readable media encoded with a computer program. Computer-readable media includes physical computer storage media. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
As illustrated in the figures, the server 108 receives a navigation assistance request from the mobile station 102 (302). The navigation assistance request may include an identifier decoded from the visual beacon 104 or feature descriptors extracted from an image of the visual beacon 104, along with optional location information. If an identifier is attached to the request, the server 108 queries the navigation assistance database 112 using the identifier as the assistance message index.
If the server 108 determines that a feature descriptor is attached to the request (304), the server 108 checks whether location information for the mobile station 102 is provided with the navigation assistance request (318). In assistance retrieval, a coarse location provided by the mobile station narrows the search process by limiting the search window and enhances the probability of unique identification of the feature. If no location information is provided, the server 108 conducts a global feature search in the navigation assistance database 112 (320). If location information is provided, the server 108 conducts a localized feature search in the navigation assistance database 112 within a position boundary based on the provided location information (322). The position boundary may vary based on the uncertainty of the provided location information or may be a predetermined area around the provided location, e.g., 100 square feet, a square mile, or any other desired area. If no feature can be identified within the navigation assistance database 112 (324), the server 108 provides a failure message to the mobile station (316). In that case, the server 108 may also store the feature descriptor in the navigation assistance database 112, along with any provided location information and any known information with respect to environmental context, user context, and neighboring artificial and natural visual beacons, so that the feature can be identified in future navigation assistance requests based on the same feature.
If a feature is identified within the navigation assistance database 112 (324), the server 108 retrieves the navigation assistance message from the navigation assistance database 112 (326). The server 108 then determines whether a navigation assistance message is available (312) and either provides the navigation assistance message to the mobile station 102 (314) or provides a failure message to the mobile station (316) if no navigation assistance is available.
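For illustration, the overall server-side flow (steps 302 through 326) might be sketched as follows; the request fields and database methods are assumptions introduced only to make the sketch self-contained:

```python
def handle_assistance_request(request, db):
    """Server-side flow for a navigation assistance request (steps 302-326).

    request: assumed dict with an optional 'identifier', optional
    'descriptors', and optional coarse 'location'.
    db: assumed database object exposing the lookup methods used below.
    """
    if request.get("identifier"):
        # An encoded identifier is itself the assistance message index.
        index = request["identifier"]
    elif request.get("descriptors"):                          # (304)
        if request.get("location"):                           # (318)
            # Localized search within a boundary around the coarse location (322).
            index = db.search_features(request["descriptors"],
                                       near=request["location"], radius_m=200)
        else:
            index = db.search_features(request["descriptors"])  # global search (320)
        if index is None:                                     # (324)
            # Store the unknown feature so future requests can be served.
            db.store_feature(request["descriptors"], request.get("location"))
            return {"status": "failure"}                      # (316)
    else:
        return {"status": "failure"}                          # (316)
    message = db.get_assistance_message(index)                # (326)
    if message is None:                                       # (312)
        return {"status": "failure"}                          # (316)
    return {"status": "ok", "message": message}               # (314)
```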
Although the present invention is illustrated in connection with specific embodiments for instructional purposes, the present invention is not limited thereto. Various adaptations and modifications may be made without departing from the scope of the invention. Therefore, the spirit and scope of the appended claims should not be limited to the foregoing description.
Number | Name | Date | Kind |
---|---|---|---|
6452544 | Hakala et al. | Sep 2002 | B1 |
6542824 | Berstis | Apr 2003 | B1 |
6556722 | Russell et al. | Apr 2003 | B1 |
6594600 | Arnoul et al. | Jul 2003 | B1 |
6859729 | Breakfield et al. | Feb 2005 | B2 |
7487042 | Odamura | Feb 2009 | B2 |
7845560 | Emanuel et al. | Dec 2010 | B2 |
7991222 | Dimsdale et al. | Aug 2011 | B2 |
8060302 | Epshtein et al. | Nov 2011 | B2 |
20020126364 | Miles | Sep 2002 | A1 |
20020169551 | Inoue et al. | Nov 2002 | A1 |
20030155413 | Kovesdi et al. | Aug 2003 | A1 |
20040028258 | Naimark et al. | Feb 2004 | A1 |
20040039519 | Burg et al. | Feb 2004 | A1 |
20040189517 | Pande et al. | Sep 2004 | A1 |
20050008256 | Uchiyama et al. | Jan 2005 | A1 |
20050020309 | Moeglein et al. | Jan 2005 | A1 |
20050026630 | Iso et al. | Feb 2005 | A1 |
20050037775 | Moeglein et al. | Feb 2005 | A1 |
20050143916 | Kim et al. | Jun 2005 | A1 |
20050159154 | Goren | Jul 2005 | A1 |
20050167504 | Meier | Aug 2005 | A1 |
20050190972 | Thomas et al. | Sep 2005 | A1 |
20050222767 | Odamura | Oct 2005 | A1 |
20050253718 | Droms et al. | Nov 2005 | A1 |
20060095349 | Morgan et al. | May 2006 | A1 |
20060136129 | Yokozawa | Jun 2006 | A1 |
20060229058 | Rosenberg | Oct 2006 | A1 |
20070263924 | Kochi et al. | Nov 2007 | A1 |
20070293146 | Lai et al. | Dec 2007 | A1 |
20080069405 | Endo et al. | Mar 2008 | A1 |
20080082258 | Pham et al. | Apr 2008 | A1 |
20080095402 | Kochi et al. | Apr 2008 | A1 |
20080096527 | Lamba et al. | Apr 2008 | A1 |
20080109302 | Salokannel et al. | May 2008 | A1 |
20080137912 | Kim et al. | Jun 2008 | A1 |
20080153516 | Hsieh | Jun 2008 | A1 |
20080214149 | Ramer et al. | Sep 2008 | A1 |
20090005975 | Forstall et al. | Jan 2009 | A1 |
20090048779 | Zeng et al. | Feb 2009 | A1 |
20090093959 | Scherzinger et al. | Apr 2009 | A1 |
20090098873 | Gogic | Apr 2009 | A1 |
20090115656 | Raman et al. | May 2009 | A1 |
20090248304 | Roumeliotis et al. | Oct 2009 | A1 |
20090254276 | Faulkner et al. | Oct 2009 | A1 |
20090262974 | Lithopoulos | Oct 2009 | A1 |
20100017115 | Gautama | Jan 2010 | A1 |
20100030469 | Hwang et al. | Feb 2010 | A1 |
20100083169 | Athsani et al. | Apr 2010 | A1 |
20100125409 | Prehofer | May 2010 | A1 |
20100329513 | Klefenz | Dec 2010 | A1 |
20110021207 | Morgan et al. | Jan 2011 | A1 |
20110039573 | Hardie | Feb 2011 | A1 |
20110143779 | Rowe et al. | Jun 2011 | A1 |
20110178708 | Zhang et al. | Jul 2011 | A1 |
Number | Date | Country |
---|---|---|
1272922 | Nov 2000 | CN |
1286424 | Mar 2001 | CN |
1637769 | Jul 2005 | CN |
101046378 | Oct 2007 | CN |
101144722 | Mar 2008 | CN |
1555541 | Jul 2005 | EP |
1790993 | May 2007 | EP |
2441644 | Mar 2008 | GB |
H07231473 | Aug 1995 | JP |
10213644 | Aug 1998 | JP |
H11304489 | Nov 1999 | JP |
2000046856 | Feb 2000 | JP |
2004213191 | Jul 2004 | JP |
2005189225 | Jul 2005 | JP |
2005257738 | Sep 2005 | JP |
2005530365 | Oct 2005 | JP |
2006170872 | Jun 2006 | JP |
2006184235 | Jul 2006 | JP |
2007525094 | Aug 2007 | JP |
2008111828 | May 2008 | JP |
2011525234 | Sep 2011 | JP |
593981 | Jun 2004 | TW |
2005004528 | Jan 2005 | WO |
2005124594 | Dec 2005 | WO |
WO-2008025013 | Feb 2008 | WO |
2008093103 | Aug 2008 | WO |
2009098319 | Aug 2009 | WO |
Entry |
---|
Alberto et al., “Inertial Navigation Systems for User-Centric Indoor Applications”, Oct. 2010, XP002663219, Retrieved from the Internet: URL: http://geoweb.crs4.it/lib/exe/fetch.php?id=indoornavigation&cache=cache&media=ne [retrieved on Nov. 10, 2011], the whole document. |
International Search Report and Written Opinion, PCT/US2011/040098, International Searching Authority, European Patent Office, Aug. 19, 2011. |
Mirzaei et al., “A Kalman Filter-Based Algorithm for IMU-Camera Calibration: Observability Analysis and Performance Evaluation”, IEEE Transactions on Robotics, vol. 24, No. 5, Oct. 2008. |
Number | Date | Country | |
---|---|---|---|
20110306323 A1 | Dec 2011 | US |