This is the first application filed for the present technology.
The present technology relates to map databases and, in particular, to methods for interacting with map databases using mobile devices.
Map databases store map data for one or more geographical regions. This map data is used to render digital two-dimensional maps or, more recently, three-dimensional renderings of geographical areas. The map data in the map databases is typically created from government documents and pre-existing maps, from satellite or aerial imagery, or from ground-based data-collection vehicles.
These map databases must be frequently updated as new roads, buildings, landmarks, points of interest, or other structures are built or removed. Updating map databases can be an arduous task, typically requiring new satellite or aerial imagery, or new ground-based data collection, etc. Map data inevitably becomes stale (out of date) when the geographical landscape changes (e.g. due to natural disaster or climate change) or when man-made features change (e.g. when new structures are built or old structures are demolished). Furthermore, detailed map data for remote or uninhabited areas may not be available.
Accordingly, there is a need in the industry to provide an improved technique for updating map data in a map database.
Further features and advantages of the present technology will become apparent from the following detailed description, taken in combination with the appended drawings, in which:
It will be noted that throughout the appended drawings, like features are identified by like reference numerals.
In general, the present technology provides an innovative technique for updating map data in a map database by capturing digital imagery. New map data is created for an object (such as a newly erected building) in the captured image. The location of this object is determined by knowing the current location of the device that took the digital photo (e.g. via a GPS position fix) and the position of the object in the photo relative to the device. The new map data corresponding to the new object may then be added to an existing map database to represent the new object that was previously not represented by any map data in the map database. This enables new objects or map features (whether they are man-made or natural) to be added to a map database. Mobile users may thus contribute map data to a map database by capturing images. This is useful when there are new landmarks, roads, buildings, etc. It is also useful for areas that have not been mapped or that have not been fully mapped in detail.
Thus, an aspect of the present technology is a method of updating map data, the method comprising capturing an image using a camera, determining a location of an object in the image, creating new map data to represent the object in the image, and updating a map database to include the new map data for the object in the image.
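The four steps of this method — capture, locate, create, update — can be sketched as a short pipeline. The sketch below is illustrative Python by the editor, not part of the disclosure; the class and function names (MapObject, MapDatabase, update_map) are hypothetical, and the relative position of the object is assumed to have already been derived from the image.

```python
from dataclasses import dataclass, field

# Hypothetical data structures; names are illustrative, not from the disclosure.
@dataclass
class MapObject:
    lat: float
    lon: float
    kind: str

@dataclass
class MapDatabase:
    objects: list = field(default_factory=list)

    def contains(self, obj, tolerance=1e-4):
        """True if an object of the same kind is already mapped near (lat, lon)."""
        return any(o.kind == obj.kind
                   and abs(o.lat - obj.lat) < tolerance
                   and abs(o.lon - obj.lon) < tolerance
                   for o in self.objects)

    def add(self, obj):
        self.objects.append(obj)

def update_map(db, device_fix, object_offset, kind):
    """Create map data for an object located at the device's GPS fix plus
    the object's position relative to the device (derived from the image)."""
    lat, lon = device_fix
    dlat, dlon = object_offset
    obj = MapObject(lat + dlat, lon + dlon, kind)
    if not db.contains(obj):   # only add objects not already in the database
        db.add(obj)
    return db
```

The duplicate check mirrors the idea, discussed later, that the device or server first determines whether the object is already accounted for before new map data is created.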
Another aspect of the present technology is a computer-readable medium (or machine-readable medium) comprising instructions in code which when loaded into memory and executed on a processor of a wireless communications device or mobile device causes the device to generate new map data by capturing an image using a camera, determining a location of an object in the image, creating new map data to represent the object in the image, and updating a map database to include the new map data for the object in the image.
Yet another aspect of the present technology is a mobile device or wireless communications device for updating map data. The device includes a camera for capturing an image, a Global Positioning System (GPS) receiver for determining a current location of the wireless communications device, and a memory for storing the image. The memory is operatively coupled to a processor for determining a location of an object in the image. The processor is configured to cause new map data representing the object in the image to be created.
The details and particulars of these aspects of the technology will now be described below, by way of example, with reference to the attached drawings.
As shown schematically in
As further depicted in
Although the present disclosure refers expressly to the “Global Positioning System”, it should be understood that this term and its abbreviation “GPS” are being used expansively to include any satellite-based navigation-signal broadcast system, and would therefore include other systems used around the world, including the Beidou (COMPASS) system being developed by China, the multi-national Galileo system being developed by the European Union in collaboration with China, Israel, India, Morocco, Saudi Arabia and South Korea, Russia's GLONASS system, India's proposed Regional Navigational Satellite System (IRNSS), and Japan's proposed QZSS regional system.
References herein to “GPS” are meant to include Assisted GPS and Aided GPS.
The wireless communications device 100 may optionally also include a Wi-Fi transceiver 197 for connecting to a Wi-Fi station and/or a Bluetooth® transceiver 199 for short-range data transfer with another Bluetooth® device with which it has been paired.
Signals emitted by terrestrially based transmitting stations, such as cellular base stations, TV stations, or Wi-Fi access points, can also be used to determine a mobile's location. One example of a base-station-based location technology is Qualcomm's hybrid solution, which can obtain a position fix using a few GPS satellites plus a few base stations. Emergency 911 (E911) phase 1 is based on cell ID, which uses the base station location to approximate the user location and may therefore be inaccurate. Depending on the positioning accuracy that is sought, another sort of position-determining subsystem may be contemplated, e.g. a Wi-Fi positioning subsystem or a radiolocation subsystem that determines its current location using radiolocation techniques. Although GPS is believed to provide the most accurate positioning, for other applications where precise positioning is not so important, the approximate location of the mobile device can be determined using Wi-Fi positioning or triangulation of signals from in-range base towers, such as is used for Wireless E911. Wireless Enhanced 911 services enable a cell phone or other wireless device to be located geographically using radiolocation techniques such as (i) angle of arrival (AOA), which entails locating the caller at the point where bearing lines from two towers intersect; (ii) time difference of arrival (TDOA), which uses multilateration like GPS, except that the network determines the time difference and therefore the distance from each tower; and (iii) location signature, which uses “fingerprinting” to store and recall patterns (such as multipath) that mobile phone signals exhibit at different locations in each cell. Radiolocation techniques may also be used in conjunction with GPS in a hybrid positioning system.
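The angle-of-arrival technique mentioned above reduces to a ray-intersection calculation. The following is an illustrative sketch by the editor (not part of the disclosure), working in a local Cartesian frame in metres, with bearings measured in degrees clockwise from north:

```python
import math

def aoa_fix(t1, b1, t2, b2):
    """Locate a handset at the intersection of two bearing rays.
    t1, t2: tower positions (x east, y north) in metres;
    b1, b2: bearings from each tower to the handset, degrees clockwise from north."""
    # Direction vectors: bearing 0 deg = +y (north), 90 deg = +x (east).
    d1 = (math.sin(math.radians(b1)), math.cos(math.radians(b1)))
    d2 = (math.sin(math.radians(b2)), math.cos(math.radians(b2)))
    # Solve t1 + s*d1 = t2 + u*d2 for s by Cramer's rule.
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; no unique fix")
    rx, ry = t2[0] - t1[0], t2[1] - t1[1]
    s = (rx * (-d2[1]) - (-d2[0]) * ry) / det
    return (t1[0] + s * d1[0], t1[1] + s * d1[1])
```

In practice bearing measurements are noisy, so a real system would combine more than two towers and solve in a least-squares sense.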
With reference still to
To summarize, therefore, the novel wireless communications device 100 disclosed herein includes a camera 195 for capturing a photographic image. For operational efficiency, the camera should be a digital camera but the technique may also be implemented using a non-digital camera and a digital scanner. The device 100 also includes a GPS receiver 190 for determining a current location of the device 100, and a memory 120, 130 for storing the image(s). The memory 120, 130 is operatively coupled to a processor 110 for determining a location of an object in the image(s). Various techniques may be employed for locating the object in the image(s), as will be explained below. The processor 110 is furthermore configured to cause new map data representing the object in the image(s) to be created, as will also be explained in greater detail below. In one implementation, as shown by way of example in
As introduced above, a related aspect of this technology is a novel method of updating map data. As depicted in the flowchart of
In one implementation, the step of determining the relative position of the object comprises using multiple images to triangulate the relative position of the object.
In one implementation, the step of determining the relative position of the object comprises using a stereoscopic camera to determine the relative position of the object.
In another implementation, the step of determining the relative position of the object comprises using a rangefinder (e.g. laser, radar, infrared beam, or any other wave-emitting and receiving device) and compass sensor to determine the relative position of the object. In one implementation, the GPS receiver 190 itself provides the compass heading, e.g., while moving. Alternatively, the orientation of the camera when taking the picture is determined through a combination of previous GPS position and heading and sensor-reported parameters, such as those reported by a gyro sensor and/or an accelerometer sensor.
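Once a range and a compass bearing to the object are known, the object's absolute coordinates follow from the device's GPS fix. A minimal sketch (editor's illustration, not part of the disclosure) using a flat-earth approximation, which is adequate for rangefinder distances of a few kilometres:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius; assumption for this sketch

def object_position(lat, lon, bearing_deg, range_m):
    """Project the device's (lat, lon) fix along a compass bearing by range_m
    metres to obtain the object's coordinates (flat-earth approximation)."""
    north = range_m * math.cos(math.radians(bearing_deg))
    east = range_m * math.sin(math.radians(bearing_deg))
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon
```

For long ranges or high latitudes a proper geodesic (e.g. Vincenty) formulation would be substituted.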
As shown in
As shown in
Alternatively, as shown in
Determining the location of an object in an image may be performed by the mobile device or it may be offloaded to a server (e.g. 700 in the network). If it is offloaded to a server, the device may simplify the image to reduce the amount of data being transmitted. In one implementation, the device may provide a low-resolution photo file to the server for calculation. In a variant, the mobile device may convert the photo into a wire mesh image showing only lines representing the edges of the new objects.
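The edge-only simplification described above can be approximated with a gradient filter. The sketch below is the editor's illustration of the general idea (a Sobel-style gradient threshold standing in for the "wire mesh" conversion), not the disclosed implementation:

```python
import numpy as np

def edge_map(gray):
    """Reduce a greyscale image (2-D array, values 0-255) to a binary edge map,
    keeping only lines representing object edges to cut transmitted data."""
    g = gray.astype(float)
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    gx[:, 1:-1] = g[:, 2:] - g[:, :-2]   # horizontal gradient (central difference)
    gy[1:-1, :] = g[2:, :] - g[:-2, :]   # vertical gradient
    mag = np.hypot(gx, gy)               # gradient magnitude
    threshold = mag.mean() + mag.std()   # simple adaptive threshold (assumption)
    return (mag > threshold).astype(np.uint8)  # 1 on edges, 0 elsewhere
```

A binary edge map compresses far better than a photograph, which is the motivation for converting before uploading to the server.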
Likewise, determining whether an object is already accounted for in the map data of the map database may be done by the mobile device or offloaded to a server. Similarly, the creation of new map data for a new object can be done by the mobile device or offloaded to a server (which may or may not be the map server itself).
In one implementation, only designated (authorized) users may update the map database. Authorized users may upload new map data to the map database by logging in with their credentials, using a private key, or other cryptographic techniques for restricting write/edit access to the database. In another implementation, any user may contribute new map data (an open “wiki map” concept).
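One common cryptographic technique for restricting write access is to require uploads to carry a keyed message-authentication tag. The sketch below (editor's illustration; a deployed system would also use TLS and per-user key management) shows the idea with an HMAC over the uploaded map data:

```python
import hmac
import hashlib

def sign_update(payload: bytes, key: bytes) -> str:
    """Tag a map-data upload with an HMAC-SHA256 over the payload,
    keyed by the authorized user's shared secret."""
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify_update(payload: bytes, key: bytes, tag: str) -> bool:
    """Server-side check: recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign_update(payload, key), tag)
```

Under the open "wiki map" model, verification would instead be replaced by moderation or cross-validation of contributions from multiple users.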
Once the map database has been updated, subsequent requests for map data may include the new map data for the new object, such as the new Building B.
Various techniques may be used to determine the position of the object relative to the current location of the mobile device 100. These techniques may involve geometric calculations based on triangulation based on multiple images, stereoscopic images, focal-length techniques, etc. Other techniques may rely upon a laser rangefinder for laser detection and ranging (LIDAR), radar, infrared beam, or other such wave-reflection technique.
In monocular optics, the principles of the rangefinding camera may be used. The so-called looming equation assumes a constant focal length f. The size of the projection of an object onto the focal plane depends on the distance between the object and the camera. The object's image height p0 is obtained from the image. Its corresponding distance d0 is unknown. By moving the camera a known distance Δd and taking another picture, the image height p1 can be obtained from the second image. Its corresponding distance d1 is unknown but is equal to d0+Δd. Since Δd is known (here, Δd is mechanically adjusted by the camera itself, which moves the lens back by a small amount), the unknown d0 can be easily calculated from Δd, p0 and p1 using the equations
d1=−p0 Δd/(p1−p0), d0=−p1 Δd/(p1−p0).
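These equations follow from the pinhole relation p·d = f·h being constant for a fixed object: p0·d0 = p1·(d0+Δd). A direct sketch (editor's illustration) of the calculation:

```python
def looming_distance(p0, p1, delta_d):
    """Recover the unknown distances from two image heights.
    p0: image height at unknown distance d0;
    p1: image height after the camera moves a known delta_d farther
        from the object (so d1 = d0 + delta_d).
    Derived from p0*d0 = p1*d1 (constant focal length and object height)."""
    if p1 == p0:
        raise ValueError("image height unchanged; delta_d zero or object too far")
    d0 = -p1 * delta_d / (p1 - p0)
    d1 = -p0 * delta_d / (p1 - p0)
    return d0, d1
```

For example, with p0=11, p1=10 and Δd=1, the object was 10 units away in the first image (since p1 < p0, the camera moved away and p1−p0 is negative, keeping both distances positive).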
Another aspect of the technology is an image overlay technique that immediately uses the images of the actual physical surroundings while the map database is being updated. In one implementation, the mobile device can update its own map data and render a new map with the new map data. In another implementation, the mobile device offloads the creation of new map data to the map server or other server and awaits new map data to render a new map onscreen. In the latter scenario, where data is offloaded, there is a lag in updating the map database due to the computation time for determining the location of the new object and the delivery time for delivering this new map data to the map server (not to mention the time needed for the map server to integrate the new map data). During this lag period, the device is still without an accurate portrayal of its surroundings. One solution to this problem is to use the captured images to create composite three-dimensional renderings that incorporate the images as the backdrop to a two- or three-dimensionally rendered map.
In other words, the image taken by the camera may be simultaneously displayed in real-time as a backdrop to a three-dimensionally rendered map of an area corresponding to the current position of the mobile device.
This means that new objects not in the map database can be seen in the background or backdrop image. This technique immediately provides an accurate portrayal of the actual physical surroundings without having to wait for the map database to be updated and for new map data to be downloaded to the device.
Another aspect of the technology uses predetermined position coordinates for objects in the surrounding area as reference points for improving the accuracy of the device's positioning system.
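One simple form of this correction treats the positioning error as a locally constant bias: the offset between where the device measures a surveyed landmark to be and where it is known to be is subtracted from the device's own fix. The sketch below is the editor's illustration of that idea, not the disclosed algorithm:

```python
def corrected_fix(device_fix, measured_landmark, known_landmark):
    """Correct the device's GPS fix using a reference point with
    predetermined coordinates. Assumes the positioning error is a
    locally constant (lat, lon) bias shared by both measurements."""
    err_lat = measured_landmark[0] - known_landmark[0]
    err_lon = measured_landmark[1] - known_landmark[1]
    return device_fix[0] - err_lat, device_fix[1] - err_lon
```

With several reference points in view, the per-point offsets could instead be averaged or fed to a filter to reduce measurement noise.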
The foregoing methods can be implemented in hardware, software, firmware or as any suitable combination thereof. That is, the computer-readable medium comprises instructions in code which, when loaded into memory and executed on a processor of a mobile device, cause the device to perform any of the foregoing method steps.
These method steps may be implemented as software, i.e. as coded instructions stored on a computer-readable medium which, when loaded into memory and executed by the microprocessor of the mobile device, perform the foregoing steps. A computer-readable medium can be any means that contains, stores, communicates, propagates or transports the program for use by or in connection with the instruction execution system, apparatus or device. The computer-readable medium may be electronic, magnetic, optical, electromagnetic, infrared or any semiconductor system or device. For example, computer-executable code to perform the methods disclosed herein may be tangibly recorded on a computer-readable medium including, but not limited to, a floppy disk, a CD-ROM, a DVD, RAM, ROM, EPROM, Flash memory or any suitable memory card, etc. The method may also be implemented in hardware. A hardware implementation might employ discrete logic circuits having logic gates for implementing logic functions on data signals, an application-specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), etc.
This new technology has been described in terms of specific implementations and configurations which are intended to be exemplary only. Persons of ordinary skill in the art will appreciate that many obvious variations, refinements and modifications may be made without departing from the inventive concepts presented in this application. The scope of the exclusive right sought by the Applicant(s) is therefore intended to be limited solely by the appended claims.
Published as US 20120166074 A1, June 2012, United States.