BACKGROUND

1. Field
Embodiments of the subject matter described herein are related generally to position determination and tracking, and more particularly to tracking the changing position of a remote mobile device.
2. Relevant Background
Tracking is used to estimate a mobile device's position and orientation (pose) relative to an object or to a coordinate system. One use of tracking is in augmented reality (AR) systems, which render computer generated information that is closely registered to real world objects and places when displayed. When tracking is successful, the AR system can display the computer generated information tightly coupled to the real world objects; without successful tracking, the computer generated information is displayed with little or no connection to those objects. Conventionally, successful tracking and augmentation can be performed only for known objects, i.e., objects that have been modeled or for which reference images are available, or in static scenes, i.e., scenes in which there are no moving unknown objects. Current systems are not capable of tracking unknown moving objects. Accordingly, an improved system for tracking unknown objects is desired.
SUMMARY

A mobile platform in a multi-user system tracks its own position with respect to an object and tracks remote mobile platforms even though they are unknown moving objects. The mobile platform captures multiple images of the object and uses them to track its position with respect to the object as that position changes over time. The mobile platform receives the position of a remote mobile platform with respect to the object as that position changes over time. The mobile platform then tracks its position with respect to the remote mobile platform using its own position and the received position of the remote mobile platform. The mobile platform may render virtual content with respect to the object and the remote mobile platform based on the tracked positions, or may detect or control interactions or trigger events based on the tracked positions.
In one implementation, a method includes capturing multiple images of an object with a first mobile platform; tracking a first position of the first mobile platform with respect to the object using the multiple images as the first position changes over time; receiving a second position of a remote mobile platform with respect to the object as the second position changes over time; and tracking a third position of the first mobile platform with respect to the remote mobile platform using the first position and the second position as the third position changes over time.
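To make the geometry of these steps concrete, the following is a minimal sketch, assuming each position is expressed as a 4x4 homogeneous transform from object coordinates into the respective platform's camera frame; the function and variable names are illustrative and not taken from the disclosure.

```python
import numpy as np

def relative_pose(T_a_obj: np.ndarray, T_b_obj: np.ndarray) -> np.ndarray:
    """Compute the "third position": the pose of platform A relative to
    remote platform B, given both poses relative to the same object.

    T_a_obj maps object coordinates into A's frame (the first position);
    T_b_obj maps object coordinates into B's frame (the second position).
    """
    # X_a = T_a_obj @ X_obj and X_b = T_b_obj @ X_obj, therefore
    # X_a = T_a_obj @ inv(T_b_obj) @ X_b.
    return T_a_obj @ np.linalg.inv(T_b_obj)
```

Recomputing this product whenever either tracked pose is updated keeps the relative pose current as all three positions change over time.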
In another implementation, an apparatus includes a camera adapted to image an object; a transceiver adapted to receive a first position of a remote mobile platform with respect to the object as the first position changes over time; and a processor coupled to the camera and the transceiver, the processor being adapted to track a second position of the camera with respect to the object using images captured by the camera as the second position changes over time, and track a third position of the camera with respect to the remote mobile platform using the first position and the second position as the third position changes over time.
In another implementation, an apparatus includes means for capturing multiple images of an object with a first mobile platform; means for tracking a first position of the first mobile platform with respect to the object using the multiple images as the first position changes over time; means for receiving a second position of a remote mobile platform with respect to the object as the second position changes over time; and means for tracking a third position of the first mobile platform with respect to the remote mobile platform using the first position and the second position as the third position changes over time.
In yet another implementation, a non-transitory computer-readable medium includes program code stored thereon, the program code including: program code to track a first position of a first mobile platform with respect to an object, using multiple captured images of the object, as the first position changes over time; program code to receive a second position of a remote mobile platform with respect to the object as the second position changes over time; and program code to track a third position of the first mobile platform with respect to the remote mobile platform using the first position and the second position as the third position changes over time.
DETAILED DESCRIPTION

Each mobile platform 110 includes a camera 112 for imaging the environment and a display 113 on the front side of the mobile platform 110 (not shown on mobile platform 110B) for displaying the real world environment as well as any rendered virtual content. The real world environment being imaged includes an object to be tracked, e.g., a game board 102.
Thus, each mobile platform 110A and 110B independently tracks its own respective position with respect to the game board 102.
Conventional systems, however, are not capable of tracking unknown moving objects. Thus, a conventional system is not able to track another mobile platform and, accordingly, virtual content is not conventionally rendered with respect to other mobile platforms.
The mobile platforms 110 in the multi-user AR system 100, however, are capable of tracking other mobile platforms by communicating their respective positions to each other, e.g., in a peer-to-peer network using transceivers 119. For example, the mobile platforms 110 may communicate directly with each other, as illustrated by arrow 114, or through a network 130, which may be coupled to a server (router) 133, illustrated with dotted lines.
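As a hedged sketch of what the pose exchange could look like on the wire, assume each platform sends its object-relative pose as a JSON datagram; the message fields and the UDP transport below are assumptions for illustration, not part of the disclosure.

```python
import json
import socket
from dataclasses import dataclass, asdict

@dataclass
class PoseMessage:
    platform_id: str   # identifies the sending mobile platform
    timestamp: float   # capture time, so peers can pair up pose updates
    rotation: list     # 3x3 rotation w.r.t. the object, row-major
    translation: list  # 3-vector w.r.t. the object, e.g., in meters

def broadcast_pose(sock: socket.socket, msg: PoseMessage, peers: list) -> None:
    """Send this platform's object-relative pose directly to each known
    peer (the peer-to-peer case); peers is a list of (host, port) pairs."""
    payload = json.dumps(asdict(msg)).encode("utf-8")
    for host, port in peers:
        sock.sendto(payload, (host, port))
```

Routing the same message through a server instead of directly to peers would cover the networked configuration.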
Thus, each mobile platform 110 visually tracks its own pose with tracking system 116 and receives via transceiver 119 the pose of the other mobile platform 110 with respect to the same object, e.g., the game board 102.
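For illustration, one plausible per-update step that combines the visually tracked local pose with the most recently received remote pose is sketched below; the tracker and transceiver interfaces are hypothetical stand-ins for the tracking system 116 and transceiver 119.

```python
import numpy as np

def tracking_update(tracker, transceiver):
    """One update cycle: estimate this platform's pose with respect to
    the object from the latest camera image, then combine it with the
    last pose reported by the remote platform."""
    T_me_obj = tracker.estimate_pose()        # own pose w.r.t. the object
    T_remote_obj = transceiver.latest_pose()  # remote pose w.r.t. the object
    if T_remote_obj is None:                  # no peer message received yet
        return T_me_obj, None
    # Relative pose: remote platform's frame -> this platform's frame.
    T_me_remote = T_me_obj @ np.linalg.inv(T_remote_obj)
    return T_me_obj, T_me_remote
```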
The tracked position of the mobile platform with respect to the remote mobile platform may be used for various desired applications.
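As one concrete, hypothetical example of triggering an event from the tracked positions, an application might fire a game event whenever the two platforms come within a chosen distance of each other; the threshold value and meter units below are assumptions.

```python
import numpy as np

def proximity_event(T_me_remote: np.ndarray, trigger_radius_m: float = 0.15) -> bool:
    """Return True when the remote platform is within trigger_radius_m
    meters. The translation column of the relative pose is the remote
    platform's offset expressed in this platform's frame."""
    return float(np.linalg.norm(T_me_remote[:3, 3])) < trigger_radius_m
```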
The mobile platform 110 also includes a control unit 160 that is connected to and communicates with the camera 112 and transceiver 119. The control unit 160 accepts and processes images captured by the camera 112 and controls the transceiver 119 to receive the position and orientation of any remote mobile platform and to send the mobile platform's own position and orientation to any remote mobile platform. The control unit 160 further controls the user interface 150, including the display 113. The control unit 160 may be provided by a bus 160b, processor 161 and associated memory 164, hardware 162, software 165, and firmware 163. The control unit 160 may include a detection and tracking processor 166 that serves as the tracking system 116 to detect and track objects in images captured by the camera 112 in order to determine the position and orientation of the mobile platform 110 with respect to a tracked object in the captured images. The control unit 160 may further include a pose processor 167 for determining the pose of the mobile platform 110 with respect to a remote mobile platform using the pose of the mobile platform 110 with respect to a tracked object, from the detection and tracking processor 166, and the pose of the remote mobile platform with respect to the object, as received by the transceiver 119. The control unit 160 may further include an AR processor 168, which may include a graphics engine to render desired AR content with respect to the tracked object, using the tracked position, and with respect to the remote mobile platform, using the position of the remote mobile platform received by the transceiver 119. The rendered AR content is displayed on the display 113.
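A minimal sketch of how an AR processor such as AR processor 168 might use the two tracked poses when registering virtual content, assuming content anchors are given as 3D points in the object's frame and in the remote platform's frame; the names are illustrative, not from the disclosure.

```python
import numpy as np

def content_anchors_in_view(T_me_obj, T_me_remote, p_obj, p_remote):
    """Map virtual-content anchor points into this camera's frame so a
    graphics engine can render them registered to the scene.

    p_obj: anchor in object (e.g., game board) coordinates;
    p_remote: anchor in the remote platform's coordinates.
    """
    hom = lambda p: np.append(np.asarray(p, dtype=float), 1.0)  # homogeneous
    anchor_obj = (np.asarray(T_me_obj) @ hom(p_obj))[:3]
    anchor_remote = (np.asarray(T_me_remote) @ hom(p_remote))[:3]
    return anchor_obj, anchor_remote
```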
The detection and tracking processor 166, pose processor 167, and AR processor 168 are illustrated separately from processor 161 for clarity, but they may be part of the processor 161 or implemented in the processor based on instructions in the software 165 run in the processor 161. It will be understood that, as used herein, the processor 161 can, but need not necessarily, include one or more microprocessors, embedded processors, controllers, application specific integrated circuits (ASICs), digital signal processors (DSPs), and the like. The term processor is intended to describe the functions implemented by the system rather than specific hardware. Moreover, as used herein the term “memory” refers to any type of computer storage medium, including long term, short term, or other memory associated with the mobile platform, and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
The methodologies described herein may be implemented by various means depending upon the application. For example, these methodologies may be implemented in hardware 162, firmware 163, software 165, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic devices, other electronic units designed to perform the functions described herein, or a combination thereof.
For a firmware and/or software implementation, the methodologies may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the methodologies described herein. For example, software codes may be stored in memory 164 and executed by the processor 161. Memory may be implemented within or external to the processor 161. If implemented in firmware and/or software, the functions may be stored as one or more instructions or code on a computer-readable medium. Examples include non-transitory computer-readable media encoded with a data structure and computer-readable media encoded with a computer program. Computer-readable media includes physical computer storage media. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer; disk and disc, as used herein, includes compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
The mobile platform may include a means for capturing multiple images of an object with a first mobile platform such as the camera 112 or other similar means. The mobile platform may further include a means for tracking a first position of the first mobile platform with respect to the object using the multiple images as the first position changes over time, which may include the camera 112 as well as the detection/tracking processor 166, which may be implemented in hardware, firmware and/or software. The mobile platform may further include a means for receiving a second position of a remote mobile platform with respect to the object as the second position changes over time, which may include the transceiver 119 as well as the processor 161, which may be implemented in hardware, firmware and/or software. The mobile platform may further include means for tracking a third position of the first mobile platform with respect to the remote mobile platform using the first position and the second position as the third position changes over time, which may include the pose processor 167, and which may be implemented in hardware, firmware and/or software. The mobile platform may further include a means for rendering a first virtual content with respect to the object using the first position and means for rendering a second virtual content with respect to the remote mobile platform using the third position, which may include a display 113 and the AR processor 168, which may be implemented in hardware, firmware and/or software. The mobile platform may further include a means for using the first position and the third position to at least one of detect interactions between the mobile platform and the remote mobile platform; control interactions between the mobile platform and the remote mobile platform; and trigger an event in the mobile platform, which may include the processor 161 and may be implemented in hardware, firmware and/or software.
Although the present invention is illustrated in connection with specific embodiments for instructional purposes, the present invention is not limited thereto. Various adaptations and modifications may be made without departing from the scope of the invention. Therefore, the spirit and scope of the appended claims should not be limited to the foregoing description.
This application claims priority under 35 USC 119 to provisional application No. 61/529,135, filed Aug. 30, 2011, which is assigned to the assignee hereof and which is incorporated herein by reference.