1. Field
The present disclosure relates generally to mobile devices, and more particularly, to methods and apparatus for determining the orientation of a mobile phone in an indoor environment.
2. Background
Determination of the orientation of a mobile device in indoor environments may be useful in a number of applications. For example, the orientation of a mobile device may be needed to help mobile phone users navigate office/commercial environments, to enable customers to find items in a supermarket or retail outlet, for coupon issuance and redemption, and for customer service and accountability. However, achieving precise orientation estimates in indoor venues is a challenging task. Mobile devices typically estimate their orientation using a built-in compass. Such orientation estimates, however, are often highly inaccurate due to the presence of metallic objects inside the walls, door frames, and furniture of most indoor venues.
SUMMARY

In an aspect of the disclosure, a method, a computer program product, and an apparatus are provided. The apparatus captures one or more images of at least a first indicator and a second indicator, identifies the first indicator based on first identifying information and identifies the second indicator based on second identifying information, and determines an orientation of the mobile device based on the captured one or more images of the at least the first indicator and the second indicator.
DETAILED DESCRIPTION

The detailed description set forth below in connection with the appended drawings is intended as a description of various configurations and is not intended to represent the only configurations in which the concepts described herein may be practiced. The detailed description includes specific details for the purpose of providing a thorough understanding of various concepts. However, it will be apparent to those skilled in the art that these concepts may be practiced without these specific details. In some instances, well known structures and components are shown in block diagram form in order to avoid obscuring such concepts.
Several aspects of a mobile device will now be presented with reference to various apparatus and methods. These apparatus and methods will be described in the following detailed description and illustrated in the accompanying drawings by various blocks, modules, components, circuits, steps, processes, algorithms, etc. (collectively referred to as “elements”). These elements may be implemented using electronic hardware, computer software, or any combination thereof. Whether such elements are implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system.
By way of example, an element, or any portion of an element, or any combination of elements may be implemented with a “processing system” that includes one or more processors. Examples of processors include microprocessors, microcontrollers, digital signal processors (DSPs), field programmable gate arrays (FPGAs), programmable logic devices (PLDs), state machines, gated logic, discrete hardware circuits, and other suitable hardware configured to perform the various functionality described throughout this disclosure. One or more processors in the processing system may execute software. Software shall be construed broadly to mean instructions, instruction sets, code, code segments, program code, programs, subprograms, software modules, applications, software applications, software packages, routines, subroutines, objects, executables, threads of execution, procedures, functions, etc., whether referred to as software, firmware, middleware, microcode, hardware description language, or otherwise.
Accordingly, in one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or encoded as one or more instructions or code on a computer-readable medium. Computer-readable media includes computer storage media. Storage media may be any available media that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), and floppy disk, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
As used herein, the term mobile device may refer to a cellular phone, a smart phone, a session initiation protocol (SIP) phone, a laptop, a personal digital assistant (PDA), a satellite radio, a global positioning system, a multimedia device, a video device, a digital audio player (e.g., MP3 player), a camera, a game console, a tablet, or any other similar functioning device. Moreover, the term mobile device may also be referred to by those skilled in the art as a mobile station, a subscriber station, a mobile unit, a subscriber unit, a wireless unit, a remote unit, a wireless device, a remote device, a mobile subscriber station, an access terminal, a mobile terminal, a wireless terminal, a remote terminal, a handset, a user agent, a mobile client, a client, or some other suitable terminology.
In an aspect, the floor 102 may include two or more orientation indicators (also referred to as “indicators” or “luminaires”) located above the mobile device 104. In the configuration of FIG. 1, the indicators 108, 110, 112, 114, 116, 118, 120, and 122 are installed in the ceiling of the floor 102, above the mobile device 104.
In an aspect, the indicators (e.g., indicators 108, 110, 112, 114, 116, 118, 120, and 122) may be LED devices configured to transmit visible light communication (VLC) signals. The VLC signals may be detected by the front facing camera 105 and the digital image sensor of the mobile device 104. The VLC signals may then be decoded by the mobile device 104. In such aspect, the VLC signals transmitted by an indicator may contain identification information of the indicator. The mobile device 104 may associate the indicator with the identification information transmitted by the indicator. For example, the identification information transmitted by an indicator may be a 48 bit MAC address that is unique with respect to other indicators. It should be understood that other types of identification information may be transmitted by the indicators if such identification information is unique and allows for disambiguation of an indicator located in a particular venue (e.g., a floor of an office building, supermarket, or shopping mall). In an aspect, the mobile device 104 may be configured to simultaneously decode VLC signals from multiple indicators.
For example, the front facing camera 105 may detect and decode first VLC signals transmitted by indicator E 116 and second VLC signals transmitted by indicator F 118. The mobile device 104 may decode the first VLC signals transmitted by indicator E 116 in order to determine the identifying information included in the first VLC signals and to identify the indicator E 116. The mobile device 104 may decode the second VLC signals transmitted by indicator F 118 in order to determine the identifying information included in the second VLC signals and to identify the indicator F 118. In this example, the mobile device 104 may identify the indicator E 116 based on a first 48 bit MAC address received from the indicator E 116 via the first VLC signals, where the first 48 bit MAC address identifies or corresponds to the indicator E 116. The mobile device 104 may identify the indicator F 118 based on a second 48 bit MAC address received from the indicator F 118 via the second VLC signals, where the second 48 bit MAC address identifies or corresponds to the indicator F 118.
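As a loose illustration of this identification step (the disclosure does not specify an implementation), the Python sketch below maps decoded VLC payloads to known indicators by their 48 bit MAC addresses. The payload layout, the table contents, and the function name are assumptions made for this example only.

```python
# Hypothetical sketch: associate a decoded VLC payload with a known
# indicator via its unique 48 bit MAC address. The table below is
# illustrative; a real deployment would provision it per venue.
KNOWN_INDICATORS = {
    0x001B44112233: "indicator_E",  # e.g., indicator E 116
    0x001B44112234: "indicator_F",  # e.g., indicator F 118
}

def identify_indicator(decoded_payload: bytes):
    """Interpret the first 6 bytes of a decoded VLC payload as a 48 bit
    MAC address and return the matching indicator label, if any."""
    if len(decoded_payload) < 6:
        return None  # too short to contain a 48 bit address
    mac = int.from_bytes(decoded_payload[:6], "big")
    return KNOWN_INDICATORS.get(mac)
```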
In an aspect, one or more of the indicators (e.g., indicators 108, 110, 112, 114, 116, 118, 120, and 122) may not transmit any information. In such aspect, information may be embedded in the shape, color, and/or visual structure of the indicator, which may be detected and interpreted by the digital image sensor (e.g., a CMOS sensor) installed in the front facing camera 105.
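For indicators that transmit nothing, one possible (hypothetical) realization of such appearance-based identification is a lookup keyed on detected visual attributes; the attribute names and table below are assumptions for illustration.

```python
# Hypothetical sketch: infer an indicator's identity from its detected
# visual attributes when it transmits no VLC signal. The signature table
# is illustrative and would be provisioned per venue.
VISUAL_SIGNATURES = {
    ("circular", "red"): "indicator_A",
    ("square", "white"): "indicator_B",
}

def identify_by_appearance(shape: str, color: str):
    """Map a detected (shape, color) pair to an indicator label, provided
    the pair uniquely identifies one indicator within the venue."""
    return VISUAL_SIGNATURES.get((shape, color))
```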
In an aspect, after the mobile device 104 has identified two or more indicators, the mobile device 104 may reference a map of the venue in which the mobile device 104 is currently located. In one configuration, the map of the venue may include the locations of two or more of the indicators (e.g., indicators 108, 110, 112, 114, 116, 118, 120, and 122) located at the venue.
In an aspect, the map 216 may be stored in a memory of the mobile device 104. In another aspect, the map 216 may be stored on a remote server (not shown). In such aspect, the mobile device 104 may query the remote server for orientation information. For example, the mobile device 104 may send information regarding the identified indicators 116 and 118 to the remote server (also referred to as a network) along with the query. In one configuration, the remote server may respond with the orientation of the mobile device 104. In an aspect, the map 216 may be downloaded to the mobile device 104 using an out-of-band (RF) signal from a wireless local area network (WLAN), a wide area network (WAN), or other network. For example, such downloading of the map 216 may be triggered automatically by the mobile device 104 when the mobile device 104 determines that it has entered an indoor venue. For example, the mobile device 104 may determine that it has entered an indoor venue using contextual information or by employing a positioning system that uses a combination of GPS and terrestrial RF technologies.
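Continuing the sketch above (and again using names not found in the disclosure), the map can be modeled minimally as a table from indicator labels to planar coordinates:

```python
# Hypothetical sketch: a venue map recording the planar location of each
# indicator, in meters within an arbitrary venue frame.
VENUE_MAP = {
    "indicator_E": (12.0, 30.0),  # e.g., indicator E 116
    "indicator_F": (12.0, 34.0),  # e.g., indicator F 118
}

def indicator_locations(first_id: str, second_id: str):
    """Return the map locations of two identified indicators, or None if
    either indicator is absent from the venue map."""
    try:
        return VENUE_MAP[first_id], VENUE_MAP[second_id]
    except KeyError:
        return None
```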
For example, with reference to FIG. 2, the map 216 may include the respective locations of the indicator E 116 and the indicator F 118.
An example orientation determination operation of the mobile device 104 will now be described with reference to FIGS. 1 and 2.
As shown in FIG. 2, the front facing camera 105 may capture an image 206 in which the indicator E 116 appears as a set of pixels 208 and the indicator F 118 appears as a set of pixels 210.
The mobile device 104 may draw a vector 212 on the image 206 captured by the front facing camera 105. In an aspect, the vector 212 may be drawn to pass through the center of the set of pixels identified as indicator E 208 and the center of the set of pixels identified as indicator F 210, as shown in FIG. 2.
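As a concrete (and purely illustrative) rendering of this step, the sketch below computes the angle θ between the image indicator axis (vector 212) and the screen axis (vector 214, described further below) from the two pixel centers. It assumes image coordinates with the origin at the top left and y increasing downward, so the screen axis, pointing from the bottom of the screen to the top, lies along −y; the function name and sign convention are assumptions.

```python
import math

def image_axis_angle_deg(center_e, center_f):
    """Signed angle (degrees) between the image indicator axis -- the line
    from the pixel center of indicator E (208) to that of indicator F
    (210) -- and the screen axis (vector 214), taken as pointing from the
    bottom of the screen toward the top (-y in image coordinates)."""
    dx = center_f[0] - center_e[0]
    dy = center_f[1] - center_e[1]
    # 0 degrees when the indicator axis points "up" the screen; positive
    # values correspond to a clockwise rotation on the screen.
    return math.degrees(math.atan2(dx, -dy))
```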
In the previously described aspect where the mobile device 104 queries the remote server for orientation information, the mobile device 104 may transmit a query that includes the identities of the indicators 116 and 118 and one or more of the captured images of the indicators 116 and 118 to the remote server. The remote server may then determine the orientation of the mobile device 104 using the identities of the indicators 116 and 118 and the one or more captured images of the indicators 116 and 118. The remote server may then transmit, to the mobile device 104, information regarding the orientation of the mobile device 104. The mobile device 104 may receive the information regarding the orientation of the mobile device 104 and may determine its orientation using the received information. For example, the information regarding the orientation of the mobile device 104 received from the remote server may indicate the orientation of the mobile device 104 with respect to a reference axis (e.g., the north axis 222 represented as vector 218 in FIG. 2).
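The disclosure does not specify a message format for this exchange; one possible shape, shown only to make the flow concrete (all field names are invented for this sketch), is:

```python
import json

def build_orientation_query(first_id: str, second_id: str,
                            image_b64: str) -> str:
    """Assemble a hypothetical orientation query carrying the identities
    of the two indicators and one captured image (base64-encoded)."""
    return json.dumps({
        "indicators": [first_id, second_id],
        "image": image_b64,
    })

# A hypothetical response might simply report the orientation relative to
# the agreed reference axis, e.g.:
#   {"orientation_deg": 330.0, "reference": "north"}
```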
It should be understood that the reference axis may be selected to be an axis different from the north axis 222. In an aspect, the reference axis may be any fixed reference axis, such as a magnetic/geographic north axis or south axis, where the reference axis is stored in the map. In another aspect, the reference axis may be derived from a feature contained in the map. For example, the reference axis may be an axis corresponding to a hallway 224 on the map 216. As another example, the reference axis may be an axis corresponding to a particular aisle in a supermarket.
It should also be understood that the disclosure herein may be applied to a configuration where the indicators (e.g., indicators 108, 110, 112, 114, 116, 118, 120, and 122) are installed on the ground of the floor 102 (i.e., below the mobile device 104) and where the mobile device 104 uses a rear camera (not shown) to receive information for identifying two or more of the indicators and for capturing one or more images of the indicators.
At step 302, the mobile device captures one or more images of at least a first indicator and a second indicator. For example, with reference to FIGS. 1 and 2, the mobile device 104 may use the front facing camera 105 to capture an image 206 of the indicator E 116 and the indicator F 118.
At step 304, the mobile device receives first identifying information from the first indicator and receives second identifying information from the second indicator. In an aspect, the first and second indicators may be LEDs configured to communicate the identifying information. For example, with reference to FIG. 1, the mobile device 104 may receive first VLC signals including the first identifying information from the indicator E 116 and second VLC signals including the second identifying information from the indicator F 118.
At step 306, the mobile device identifies the first indicator based on the first identifying information and identifies the second indicator based on the second identifying information. In one example, with reference to FIG. 1, the mobile device 104 may identify the indicator E 116 based on a first 48 bit MAC address received via the first VLC signals and may identify the indicator F 118 based on a second 48 bit MAC address received via the second VLC signals.
At step 308, the mobile device receives the map via a wireless communication. In an aspect, the map is automatically received when the mobile device is located indoors. For example, with reference to FIG. 2, the mobile device 104 may automatically download the map 216 using an out-of-band (RF) signal from a WLAN, WAN, or other network upon determining that it has entered an indoor venue.
At step 310, the mobile device determines respective locations of the first and second indicators on a map. For example, with reference to FIG. 2, the mobile device 104 may determine the respective locations of the indicator E 116 and the indicator F 118 on the map 216.
At step 312, the mobile device determines a reference axis on the map. For example, with reference to FIG. 2, the mobile device 104 may determine the north axis 222 (represented as vector 218) to be the reference axis on the map 216.
At step 314, the mobile device determines an orientation of the mobile device based on the captured one or more images of the at least the first indicator and the second indicator. In an aspect, the orientation of the mobile device is determined relative to the reference axis. For example, as shown in FIG. 2, the orientation of the mobile device 104 may be determined relative to the north axis 222 (represented as vector 218).
The mobile device 104 may draw the image indicator axis (e.g., vector 212) on the image 206 captured by the front facing camera 105. The mobile device 104 may determine the angle of the image indicator axis (e.g., vector 212) relative to the screen axis (e.g., vector 214), which is defined as the axis extending from the bottom of the screen to the top of the screen. The angle θ of the image indicator axis (e.g., vector 212) relative to the screen axis (e.g., vector 214) represents the orientation of the image indicator axis (e.g., vector 212) relative to the screen axis (e.g., vector 214). The negative of the angle θ represents the orientation axis (e.g., vector 106) of the mobile device 104 relative to the indicator axis (e.g., vector 220). Therefore, the orientation axis (e.g., vector 106) of the mobile device 104 relative to the reference axis (e.g., the north axis 222 represented as vector 218 in FIG. 2) may be determined by adding the negative of the angle θ to the angle of the indicator axis (e.g., vector 220) relative to the reference axis, where the latter angle may be obtained from the locations of the indicators on the map 216.
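Pulling these pieces together, a minimal sketch of this computation, assuming map coordinates in which the reference axis (north) lies along +y and bearings are measured clockwise, matching the sign convention of the earlier angle sketch:

```python
import math

def device_orientation_deg(theta_deg, loc_e, loc_f):
    """Orientation of the mobile device's orientation axis (vector 106)
    relative to the reference axis (north, vector 218): the bearing of the
    indicator axis (vector 220) minus the on-screen angle theta. A sketch
    under assumed conventions, not the disclosed implementation."""
    # Bearing of the indicator axis, from indicator E toward indicator F,
    # measured clockwise from north (+y).
    dx = loc_f[0] - loc_e[0]
    dy = loc_f[1] - loc_e[1]
    indicator_axis_deg = math.degrees(math.atan2(dx, dy))
    # The device orientation relative to the indicator axis is -theta, so
    # relative to north it is the indicator-axis bearing minus theta.
    return (indicator_axis_deg - theta_deg) % 360.0
```

Using the hypothetical VENUE_MAP entries above, loc_e = (12.0, 30.0) and loc_f = (12.0, 34.0) give an indicator-axis bearing of 0° (due north); if the indicators appear rotated 30° clockwise on the screen (θ = 30°), the device's orientation axis points at 330°, i.e., 30° west of north.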
It should be understood that the steps 304, 308, 310, and 312 indicated with dotted lines in FIG. 3 represent optional steps.
At step 402, the mobile device captures one or more images of at least the first indicator and the second indicator. For example, with reference to FIGS. 1 and 2, the mobile device 104 may use the front facing camera 105 to capture an image 206 of the indicator E 116 and the indicator F 118.
At step 404, the mobile device receives first identifying information from the first indicator and receives second identifying information from the second indicator. In an aspect, each of the first and second indicators may be an LED configured to communicate the identifying information. For example, with reference to FIG. 1, the mobile device 104 may receive first VLC signals including the first identifying information from the indicator E 116 and second VLC signals including the second identifying information from the indicator F 118.
At step 406, the mobile device identifies the first indicator based on the first identifying information and identifies the second indicator based on the second identifying information. In one example, with reference to FIG. 1, the mobile device 104 may identify the indicator E 116 based on a first 48 bit MAC address received via the first VLC signals and may identify the indicator F 118 based on a second 48 bit MAC address received via the second VLC signals.
At step 408, the mobile device transmits at least one of the one or more captured images and the identities of the first and second indicators to a network.
At step 410, the mobile device receives information regarding the orientation of the mobile device from the network.
At step 412, the mobile device determines an orientation of the mobile device based on the captured one or more images of the at least the first indicator and the second indicator. In an aspect, determination of the orientation of the mobile device is further based on the received information regarding the orientation of the mobile device. For example, the information regarding the orientation of the mobile device received from the network may indicate the orientation of the mobile device 104 with respect to a reference axis (e.g., the north axis 222 represented as vector 218 in FIG. 2).
It should be understood that the steps 404, 408, and 410 indicated with dotted lines in FIG. 4 represent optional steps.
The apparatus may include additional modules that perform each of the steps of the algorithm in the aforementioned flow charts of FIGS. 3 and 4.
The processing system 614 may be coupled to a transceiver 610. The transceiver 610 is coupled to one or more antennas 620. The transceiver 610 provides a means for communicating with various other apparatus over a transmission medium. The transceiver 610 receives a signal from the one or more antennas 620, extracts information from the received signal, and provides the extracted information to the processing system 614, specifically the receiving module 504. In addition, the transceiver 610 receives information from the processing system 614, specifically the transmission module 514, and based on the received information, generates a signal to be applied to the one or more antennas 620. The processing system 614 includes a processor 604 coupled to a computer-readable medium 606. The processor 604 is responsible for general processing, including the execution of software stored on the computer-readable medium 606. The software, when executed by the processor 604, causes the processing system 614 to perform the various functions described supra for any particular apparatus. The computer-readable medium 606 may also be used for storing data that is manipulated by the processor 604 when executing software. The processing system 614 further includes at least one of the modules 504, 506, 508, 510, 512, and 514. The modules may be software modules running in the processor 604, resident/stored in the computer-readable medium 606, one or more hardware modules coupled to the processor 604, or some combination thereof.
In one configuration, the apparatus 502/502′ for wireless communication includes means for capturing one or more images of at least a first indicator and a second indicator, means for identifying the first indicator based on first identifying information and identifying the second indicator based on second identifying information, means for determining an orientation of the apparatus based on the captured one or more images of the at least a first indicator and a second indicator, means for receiving the first identifying information from the first indicator and the second identifying information from the second indicator, means for determining respective locations of the first and second indicators on a map, means for determining a reference axis on the map, means for transmitting at least one of the one or more captured images and the identities of the first and second indicators to a network, means for receiving information regarding the orientation of the apparatus from the network, and means for receiving the map via a wireless communication. The aforementioned means may be one or more of the aforementioned modules of the apparatus 502 and/or the processing system 614 of the apparatus 502′ configured to perform the functions recited by the aforementioned means.
It is understood that the specific order or hierarchy of steps in the processes disclosed is an illustration of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged. Further, some steps may be combined or omitted. The accompanying method claims present elements of the various steps in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
The previous description is provided to enable any person skilled in the art to practice the various aspects described herein. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects. Thus, the claims are not intended to be limited to the aspects shown herein, but are to be accorded the full scope consistent with the language of the claims, wherein reference to an element in the singular is not intended to mean “one and only one” unless specifically so stated, but rather “one or more.” Unless specifically stated otherwise, the term “some” refers to one or more. Combinations such as “at least one of A, B, or C,” “at least one of A, B, and C,” and “A, B, C, or any combination thereof” include any combination of A, B, and/or C, and may include multiples of A, multiples of B, or multiples of C. Specifically, combinations such as “at least one of A, B, or C,” “at least one of A, B, and C,” and “A, B, C, or any combination thereof” may be A only, B only, C only, A and B, A and C, B and C, or A and B and C, where any such combinations may contain one or more member or members of A, B, or C. All structural and functional equivalents to the elements of the various aspects described throughout this disclosure that are known or later come to be known to those of ordinary skill in the art are expressly incorporated herein by reference and are intended to be encompassed by the claims. Moreover, nothing disclosed herein is intended to be dedicated to the public regardless of whether such disclosure is explicitly recited in the claims. No claim element is to be construed as a means plus function unless the element is expressly recited using the phrase “means for.”