Aspects relate to optimized data processing for faster visible light communication (VLC) positioning.
Determining the position of a mobile device in an indoor environment can be useful in a number of applications, such as navigating mobile phone users in office/commercial environments, enabling customers to find items in a supermarket or retail outlet, coupon issuance and redemption, customer service and accountability, etc. However, achieving precise position estimates can be a challenging task. Indoor positioning is typically achieved using radio frequency (RF) signals received from Wi-Fi access points (or similar means). A drawback to this technique is that it requires mobile devices to learn RF signal propagation parameters, which presents a significant technical challenge for achieving high precision (e.g., less than one meter) position accuracy.
To provide greater indoor positioning accuracy, visible light communication (VLC) is being developed to transmit identification information for positioning operations by using variations of visible light (color, intensity, or position). Such communication technology for transmitting identification information is based on high-frequency blinking visible lights, referred to as VLC light sources. Specifically, the identification information to be transmitted is compiled into a digital signal. The digital signal is then applied to modulate the duration or frequency of the driving current or driving voltage of the VLC light source, causing the VLC light source to blink at a high frequency. This high-frequency blinking signal can be detected by a photosensitive device, for example, an image sensor (e.g., a camera of a smartphone). By detecting the light signals from one or more VLC light sources, a mobile device can determine its position to a high degree of accuracy (e.g., within centimeters).
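The disclosure does not specify a particular line code, but the idea of compiling identification information into a digital blink signal can be sketched as follows, using Manchester coding purely as a hypothetical example (the function names and the 8-bit identifier are illustrative, not from the disclosure):

```python
# Hypothetical sketch: a light-source identifier encoded as a Manchester-coded
# on/off blink pattern, one possible way to modulate a VLC driving current.

def manchester_encode(bits):
    """Map each bit to an on/off chip pair: 1 -> (1, 0), 0 -> (0, 1)."""
    chips = []
    for b in bits:
        chips += [1, 0] if b else [0, 1]
    return chips

def manchester_decode(chips):
    """Recover bits from chip pairs; each pair's first chip is the bit value."""
    return [chips[i] for i in range(0, len(chips), 2)]

ident = [1, 0, 1, 1, 0, 0, 1, 0]      # example 8-bit light-source ID
waveform = manchester_encode(ident)   # high-frequency blink pattern
assert manchester_decode(waveform) == ident
```

Manchester coding is chosen here only because it keeps the average light output constant regardless of the identifier bits, which matters when the same luminaire also provides illumination.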
The following presents a simplified summary relating to one or more aspects disclosed herein. As such, the following summary should not be considered an extensive overview relating to all contemplated aspects, nor should the following summary be regarded to identify key or critical elements relating to all contemplated aspects or to delineate the scope associated with any particular aspect. Accordingly, the following summary has the sole purpose to present certain concepts relating to one or more aspects relating to the mechanisms disclosed herein in a simplified form to precede the detailed description presented below.
In an aspect, a method for optimizing visible light communication (VLC) processing performed at a mobile device includes capturing, by an image sensor of the mobile device having a rolling shutter, a plurality of lines of an image frame, writing, by an image signal processor (ISP) coupled to the image sensor, each line of the plurality of lines of the image frame into a dual port memory as each line of the plurality of lines of the image frame is received from the image sensor, and, for each line of the plurality of lines of the image frame, reading, by a VLC decoder of the mobile device, a first line of the plurality of lines of the image frame from the dual port memory while the ISP writes a second line of the plurality of lines of the image frame to the dual port memory.
In an aspect, an apparatus for optimizing VLC processing includes an image sensor of a mobile device having a rolling shutter configured to capture a plurality of lines of an image frame, an ISP for the image sensor configured to write each line of the plurality of lines of the image frame into a dual port memory as each line of the plurality of lines of the image frame is received from the image sensor, and a VLC decoder of the mobile device configured to read, for each line of the plurality of lines of the image frame, a first line of the plurality of lines of the image frame from the dual port memory while the ISP writes a second line of the plurality of lines of the image frame to the dual port memory.
In an aspect, a non-transitory computer-readable medium storing computer-executable instructions for optimizing VLC processing performed at a mobile device includes at least one instruction instructing an image sensor of the mobile device having a rolling shutter to capture a plurality of lines of an image frame, at least one instruction instructing an ISP for the image sensor to write each line of the plurality of lines of the image frame into a dual port memory as each line of the plurality of lines of the image frame is received from the image sensor, and at least one instruction instructing a VLC decoder of the mobile device to read, for each line of the plurality of lines of the image frame, a first line of the plurality of lines of the image frame from the dual port memory while the ISP writes a second line of the plurality of lines of the image frame to the dual port memory.
Other objects and advantages associated with the aspects disclosed herein will be apparent to those skilled in the art based on the accompanying drawings and detailed description.
A more complete appreciation of aspects of the disclosure will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings which are presented solely for illustration and not limitation of the disclosure, and in which:
Disclosed are techniques for optimizing visible light communication (VLC) processing performed at a mobile device. In an aspect, an image sensor of the mobile device having a rolling shutter captures a plurality of lines of an image frame, an image signal processor (ISP) coupled to the image sensor writes each line of the plurality of lines of the image frame into a dual port memory as each line of the plurality of lines of the image frame is received from the image sensor, and, for each line of the plurality of lines of the image frame, a VLC decoder of the mobile device reads a first line of the plurality of lines of the image frame from the dual port memory while the ISP writes a second line of the plurality of lines of the image frame to the dual port memory.
These and other aspects of the disclosure are disclosed in the following description and related drawings directed to specific aspects of the disclosure. Alternate aspects may be devised without departing from the scope of the disclosure. Additionally, well-known elements of the disclosure will not be described in detail or will be omitted so as not to obscure the relevant details of the disclosure.
The words “exemplary” and/or “example” are used herein to mean “serving as an example, instance, or illustration.” Any aspect described herein as “exemplary” and/or “example” is not necessarily to be construed as preferred or advantageous over other aspects. Likewise, the term “aspects of the disclosure” does not require that all aspects of the disclosure include the discussed feature, advantage or mode of operation.
Further, many aspects are described in terms of sequences of actions to be performed by, for example, elements of a computing device. It will be recognized that various actions described herein can be performed by specific circuits (e.g., application specific integrated circuits (ASICs)), by program instructions being executed by one or more processors, or by a combination of both. Additionally, these sequences of actions described herein can be considered to be embodied entirely within any form of computer-readable storage medium having stored therein a corresponding set of computer instructions that upon execution would cause an associated processor to perform the functionality described herein. Thus, the various aspects of the disclosure may be embodied in a number of different forms, all of which have been contemplated to be within the scope of the claimed subject matter. In addition, for each of the aspects described herein, the corresponding form of any such aspects may be described herein as, for example, “logic configured to” perform the described action.
The access points 105 may wirelessly communicate with the mobile devices 115 via one or more access point antennas. Each of the access points 105 may provide communication coverage for a respective coverage area 110. In some aspects, an access point (such as access points 105) may be referred to as a base station, a base transceiver station (BTS), a radio base station, a radio transceiver, a basic service set (BSS), an extended service set (ESS), a NodeB, an evolved NodeB (eNB), a Home NodeB, a Home eNodeB, a WLAN access point, or some other suitable terminology. The coverage area 110 for an access point may be divided into sectors making up only a portion of the coverage area (not shown). The system 100 may include access points 105 of different types (e.g., macro, micro, and/or pico base stations). The access points 105 may also utilize different radio technologies. The access points 105 may be associated with the same or different access networks. The coverage areas of different access points 105, including the coverage areas of the same or different types of access points 105, utilizing the same or different radio technologies, and/or belonging to the same or different access networks, may overlap.
The system 100 may be a heterogeneous network in which different types of access points 105 provide coverage for various geographical regions. For example, each access point may provide communication coverage for a macro cell, a pico cell, a femto cell, and/or other types of cells. A macro cell generally covers a relatively large geographic area (e.g., several kilometers in radius) and may allow unrestricted access by mobile devices 115 with service subscriptions with the network provider. A pico cell generally covers a relatively smaller geographic area and may allow unrestricted access by mobile devices 115 with service subscriptions with the network provider. A femto cell also generally covers a relatively small geographic area (e.g., a home) and, in addition to unrestricted access, may also provide restricted access by mobile devices 115 having an association with the femto cell (e.g., mobile devices 115 in a closed subscriber group (CSG), mobile devices 115 for users in the home, and the like). An access point for a macro cell may be referred to as a macro base station. An access point for a pico cell may be referred to as a pico base station. And, an access point for a femto cell may be referred to as a femto base station or a home base station. An access point may support one or multiple (e.g., two, three, four, and the like) cells.
The core network 130 may communicate with the access points 105 via a backhaul 132 (e.g., S1, etc.). The access points 105 may also communicate with one another, e.g., directly or indirectly via backhaul links 134 (e.g., X2, etc.) and/or via backhaul 132 (e.g., through core network 130). The wireless communications system 100 may support synchronous or asynchronous operation. For synchronous operation, the access points 105 may have similar frame timing, and transmissions from different access points 105 may be approximately aligned in time. For asynchronous operation, the access points 105 may have different frame timing, and transmissions from different access points 105 may not be aligned in time. The techniques described herein may be used for either synchronous or asynchronous operations.
The mobile devices 115 may be dispersed throughout the wireless communications system 100, and each mobile device 115 may be stationary (but capable of mobility) or mobile. A mobile device 115 may also be referred to by those skilled in the art as a user equipment (UE), a mobile station, a subscriber station, a mobile unit, a subscriber unit, a wireless unit, a remote unit, a wireless device, a wireless communication device, a remote device, a mobile subscriber station, an access terminal, a mobile terminal, a wireless terminal, a remote terminal, a handset, a user agent, a mobile client, a client, or some other suitable terminology. A mobile device 115 may be a cellular phone, a personal digital assistant (PDA), a wireless communication device, a handheld device, a tablet computer, a laptop computer, a cordless phone, a wearable item such as a watch or glasses, or the like. A mobile device 115 may be able to communicate with macro base stations, pico base stations, femto base stations, relays, and the like. A mobile device 115 may also be able to communicate over different access networks, such as cellular or other wireless wide area network (WWAN) access networks, or WLAN access networks.
The communication links 125 shown in system 100 may include uplinks for carrying uplink (UL) transmissions (e.g., from one of mobile devices 115 to one of access points 105) and/or downlinks for carrying downlink (DL) transmissions (e.g., from one of access points 105 to one of mobile devices 115). The UL transmissions may also be called reverse link transmissions, while the DL transmissions may also be called forward link transmissions.
A particular small cell access point 105A (e.g., a pico cell, a femto cell, a WiFi access point, etc.) may be located within a venue (e.g., a building, stadium, ship, etc.), not shown in
In some cases, a mobile device 115 may be capable of receiving information-carrying light signals, such as visible light communication (VLC) signals or infrared signals. VLC uses modulated visible light to transmit data. The VLC light source, such as light source 205, is typically a light-emitting diode (LED), although other sources, such as fluorescent light bulbs, may, in some cases, be utilized. Reception at the mobile device 115 is typically based on photodiodes, either individually or in a digital image sensor or other array of photodiodes, such as those found in cell phones and digital cameras. Arrays of photodiodes may, in some cases, be utilized to provide multi-channel communication and/or spatial awareness relating to multiple VLC light sources.
When illuminated by a light source 205 capable of transmitting an information-carrying light signal, such as a VLC signal, the mobile device 115 may receive and decode the light signal to obtain identification information for the light source 205. The identification information contained in the light signal may in some cases include a repeated codeword that identifies the light source 205. As will be described further herein, the identification information may enable the mobile device 115 to determine the location of the light source 205 (e.g., by looking up the location in a local database or retrieving the location from the location server 170). By identifying the angle of arrival of the light signal, the mobile device 115 may be able to determine positioning information based on the light signal. In some cases, the positioning information may include a direction of one or more light sources 205 with respect to the mobile device. In some cases, the positioning information may also or alternately include an estimate of the distance from the mobile device 115 to one or more light sources 205. In some cases, the mobile device 115 may receive light signals from more than one light source 205 and determine additional positioning information, such as the location of the mobile device 115.
Turning now to
The mobile devices 115A and 115B may be examples of the mobile devices 115 described with reference to
Each of the light sources 205A, 205B, and 205C may contain (or be associated with) circuitry for generating a modulated light signal (e.g., an information-carrying light signal), such as a VLC signal or infrared signal. The modulated light signal may be generated using the primary luminaire of the light source 205A, 205B, and 205C, or using a secondary luminaire, such as a luminaire that is provided particularly for the purpose of generating a modulated light signal. In the latter case, and by way of example, a light source 205 might use a CFL luminaire as its primary light producing mechanism and use a light emitting diode (LED) luminaire particularly for the purpose of generating a modulated light signal.
Each of the mobile devices 115A and 115B may include circuitry for receiving and decoding a modulated light signal. The circuitry may in some cases include an image sensor, such as an image sensor containing an array of photodiodes (e.g., a complementary metal-oxide semiconductor (CMOS) image sensor).
In an aspect, by receiving and decoding the modulated light signal received from each of the three light sources 205A, 205B, and 205C, identifying a location of each of the three light sources 205A, 205B, and 205C as described herein, and identifying the angle of arrival of the light signal received from each light source 205A, 205B, and 205C, the mobile device 115 may not only estimate the distances 305A, 305B, 305C from the mobile device 115 to each light source 205A, 205B, and 205C, but may also determine a position (e.g., location) of the mobile device 115 (e.g., using trilateration) with a high degree of accuracy (e.g., less than a meter).
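As a rough illustration of the trilateration step referenced above (the function name and the 2-D, noise-free simplification are ours, not from the disclosure), subtracting the first range equation from the other two linearizes the problem into a 2×2 linear system:

```python
# Hypothetical sketch of 2-D trilateration from three known light-source
# locations and estimated ranges. Assumes exact distances and
# non-collinear anchors; a real implementation would use least squares
# over noisy measurements.
import math

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Solve for the (x, y) position given three anchors and ranges."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting circle 1's equation from circles 2 and 3 cancels the
    # quadratic terms, leaving a linear system A x = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# A device at (1, 1) observing three ceiling lights at known coordinates:
x, y = trilaterate((0, 0), (4, 0), (0, 4),
                   math.sqrt(2), math.sqrt(10), math.sqrt(10))
```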
Alternatively, where the identification information contained in the light signal from the light source 205B includes information representing the size (e.g., dimensions of the light source such as 24 in×36 in, 12 in diameter, etc.) and shape (e.g., circle, square, rectangle, etc.) of the fixture containing the light source 205B and coordinates (e.g., x, y, and optionally z, relative to a floor plan of the venue in which the light source 205B is located) of at least one point (e.g., a corner) on the light fixture, the mobile device 115 may be able to determine its position within the venue based on identifying the at least one point on the light fixture, the orientation of the mobile device 115 with respect to the at least one point on the light fixture (using orientation sensors of the mobile device 115, e.g., an accelerometer and/or a gyroscope, and the information representing the shape of the light fixture), and the distance between the mobile device 115 and the at least one point on the light fixture (using the angle of arrival of the light signal received from each light source 205B and the information representing the size of the light fixture). Thus, the mobile device 115 may be able to determine its position based on information received from a single light source, here, the light source 205B.
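One elementary way the transmitted fixture size can combine with the angle of arrival to yield range, sketched under a pinhole-camera approximation (the function name and numbers are illustrative, not from the disclosure), is:

```python
# Hypothetical sketch: range to a light fixture of known physical width
# from the angle it subtends on the image sensor (pinhole approximation).
import math

def distance_from_angular_size(fixture_width_m, subtended_angle_rad):
    """A fixture of width W subtending angle theta lies at W / (2 tan(theta/2))."""
    return fixture_width_m / (2.0 * math.tan(subtended_angle_rad / 2.0))

# A 0.6 m wide fixture 3 m away subtends 2*atan(0.3/3) radians:
rng = distance_from_angular_size(0.6, 2.0 * math.atan(0.1))
```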
Note that although the light sources 205 in
The mobile device 115 may include one or more wide area network (WAN) transceiver(s) 404 that may be connected to one or more antennas 402. The WAN transceiver 404 comprises suitable devices, hardware, and/or software for communicating with and/or detecting signals to/from WAN access points 105, and/or directly with other wireless devices within the system 100. In one aspect, the WAN transceiver 404 may comprise a code division multiple access (CDMA) communication system suitable for communicating with a CDMA network of wireless base stations; however, in other aspects, the wireless communication system may comprise another type of cellular telephony network, such as, for example, time division multiple access (TDMA) or the Global System for Mobile Communications (GSM). Additionally, any other type of wide area wireless networking technology may be used, for example, WiMAX (IEEE 802.16), etc.
The mobile device 115 may also include one or more WLAN and/or personal area network (PAN) transceivers 406 that may be connected to the one or more antennas 402. The one or more WLAN/PAN transceivers 406 comprise suitable devices, hardware, and/or software for communicating with and/or detecting signals to/from access points 105, and/or directly with other wireless devices within a network. In one aspect, the one or more WLAN/PAN transceivers 406 may include a Wi-Fi (802.11x) or Bluetooth® transceiver. Additionally, any other type of wireless networking technology may be used, for example, Ultra Wide Band, ZigBee, wireless Universal Serial Bus (USB), etc.
A satellite positioning system (SPS) receiver 408 may also be included in the mobile device 115. The SPS receiver 408 may be connected to the one or more antennas 402 for receiving satellite signals. The SPS receiver 408 may comprise any suitable hardware and/or software for receiving and processing SPS signals. The SPS receiver 408 requests information and operations as appropriate from the other systems, and performs the calculations necessary to determine the mobile device's 115 position using measurements obtained by any suitable SPS algorithm.
One or more orientation sensors 412 may be coupled to a processor 410 to provide movement and/or orientation information that is independent of motion data derived from signals received by the WAN transceiver 404, the local area network (LAN) transceiver 406, and the SPS receiver 408. For example, the one or more orientation sensors 412 may comprise one or more accelerometers and/or a three-dimensional (3-D) accelerometer, a gyroscope, a geomagnetic sensor (e.g., a compass), a motion sensor, and/or any other type of movement detection sensor. Moreover, the one or more orientation sensors 412 may include a plurality of different types of devices and combine their outputs in order to provide motion information. For example, the one or more orientation sensors 412 may use a combination of a multi-axis accelerometer and orientation sensors to provide the ability to compute positions in two-dimension (2-D) and/or 3-D coordinate systems. Although not shown, the mobile device 115 may further include an altimeter (e.g., a barometric pressure altimeter).
One or more image sensors 414 may also be coupled to the processor 410. The one or more image sensors 414 may be image sensors containing an array of photodiodes (e.g., a complementary metal-oxide semiconductor (CMOS) image sensor), and may correspond to a front and/or a rear-facing camera of the mobile device 115. One or more light sensors 416 (e.g., photosensors or photodetectors) may also be coupled to the processor 410. The one or more light sensors 416 may be one or more photodiodes, photo transistors, etc.
The processor 410 may include one or more microprocessors, microcontrollers, and/or digital signal processors that provide processing functions, as well as other calculation and control functionality. The processor 410 may also be coupled to memory 418 for storing data and software instructions for executing programmed functionality within the mobile device 115. For example, the processor 410 may be operatively configurable based on instructions in the memory 418 to selectively initiate one or more routines that exploit motion data for use in other portions of the mobile device 115. The memory 418 may be on-board the processor 410 (e.g., within the same integrated circuit (IC) package), and/or the memory 418 may be external memory to the processor 410 and functionally coupled over a data bus.
A number of software modules and data tables may reside in memory 418 and be utilized by the processor 410 in order to manage both communications and positioning determination functionality as described herein. As illustrated in
The processor 410, the one or more orientation sensors 412, and the coarse positioning module 424 may cooperatively perform positioning operations based on dead reckoning (DR) to estimate the position of the mobile device 115 when other methods of estimating the position of the mobile device 115 are not available, such as when the mobile device 115 is in an indoor environment. Dead reckoning is the process of calculating the current position of the mobile device 115 by using a previously determined position, or fix, and advancing that position based upon known or estimated speeds over elapsed time and course, for example, as sensed by the one or more orientation sensors 412. More specifically, one or more accelerometers of the one or more orientation sensors 412 and one or more gyroscopes of the one or more orientation sensors 412 continuously calculate the movement, orientation, and velocity of the mobile device 115 to calculate changes in position of the mobile device 115 from the last known position fix.
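The dead-reckoning update described above, advancing a known fix by an estimated speed along a sensed heading over elapsed time, can be sketched in its simplest 2-D form (function and variable names are ours, not from the disclosure):

```python
# Hypothetical sketch of a single dead-reckoning position update.
import math

def dead_reckon(last_fix, heading_rad, speed_mps, dt_s):
    """Advance an (x, y) fix along a sensed heading at an estimated speed."""
    x, y = last_fix
    return (x + speed_mps * dt_s * math.cos(heading_rad),
            y + speed_mps * dt_s * math.sin(heading_rad))

# Walking at 1.4 m/s along heading 0 for 10 s from the last fix at (0, 0):
pos = dead_reckon((0.0, 0.0), 0.0, 1.4, 10.0)
```

In practice these updates run continuously, with the accelerometer and gyroscope supplying the speed and heading estimates, and the error grows with time until the next absolute fix (e.g., from a light signal) resets it.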
The processor 410, the one or more image sensors 414 and/or the one or more light sensors 416, and the light signal positioning module 422 may cooperatively perform positioning operations based on light signals from one or more light sources 205 to estimate the position of the mobile device 115. For example, the one or more image sensors 414 and/or the one or more light sensors 416 may receive and decode the light signal(s) to obtain identification information for the light source(s) 205. Based on the identification information, the WAN transceiver 404 and/or the LAN transceiver 406 may obtain the location(s) of the light source(s) 205 from a local server (e.g., a location server, such as location server 170, associated with the venue in which the mobile device 115 is located). Alternatively, if location information for the light source(s) 205 was previously downloaded and stored in the light source location database 426, the processor 410 can retrieve the location(s) of the light source(s) 205 from the light source location database 426. Based on the location of the light source(s) 205, the angle of arrival of the light signal, and optionally the size (e.g., dimensions), shape, orientation, coordinates of a point within the light source(s) 205, or any combination thereof, the light signal positioning module 422, as executed by the processor 410, may determine positioning information, such as the location of the mobile device 115.
While the modules shown in
The mobile device 115 may further include a user interface 450 that provides any suitable interface systems, such as a microphone/speaker 452, keypad 454, and display 456 that allows user interaction with the mobile device 115. The microphone/speaker 452 provides for voice communication services using the WAN transceiver 404 and/or the LAN transceiver 406. The keypad 454 comprises any suitable buttons for user input. The display 456 comprises any suitable display, such as, for example, a backlit liquid crystal display (LCD), and may further include a touch screen display for additional user input modes.
As used herein, the mobile device 115 may be any portable or movable device or machine that is configurable to acquire wireless signals transmitted from, and transmit wireless signals to, one or more wireless communication devices or networks. As shown in
This is illustrated in
Referring back to
In general, the image sensor 502 acts as a VLC input, since it reads out data line by line; the image frame is not structured until it reaches the ISP/VFE 506. More specifically, the ISP/VFE 506 and a MIPI decoder (not shown) decode the image frame line by line and structure the image frame. The ISP/VFE 506 then writes the structured image frame to the Double Data Rate (DDR) memory 508 over an Advanced eXtensible Interface (AXI) bus using the write masters. The VLC decoder 510 accesses the buffered image frame from the DDR memory 508 and decodes/demodulates it. The VLC decoder 510 then passes the decoded/demodulated data to the ISP/VFE 506 for further processing. For example, the ISP/VFE 506 may perform positioning operations, such as those described above, using the decoded/demodulated data.
The structuring of the buffered image frame depends upon the frame rate at which the image is captured by the image sensor 502. For example, if the frame rate is 30 frames per second (fps), the ISP/VFE 506 writes an image frame buffer to the DDR memory 508 once every 33 ms. The image frame buffer is passed to the display (e.g., display 456), the video encoder (not shown), or the VLC decoder 510 for their respective operations.
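The 33 ms figure follows directly from the frame rate (the function name is ours):

```python
def frame_period_ms(fps):
    """Interval at which the ISP/VFE writes a completed frame buffer to memory."""
    return 1000.0 / fps

period = frame_period_ms(30)  # roughly 33 ms at 30 fps
```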
Note that the ISP/VFE 506, the DDR memory 508, and the VLC decoder 510 correspond to a camera subsystem, which may be part of the main processor of the mobile device 115, such as processor 410 in
The camera software may allocate approximately six (6) to ten (10) preview buffers for the entire real time operation of capturing the image line by line. Considering a 16 megapixel (M) image sensor, each frame buffer would be approximately 30 megabytes (MB) (i.e., 16 M×10 bits×1.5/8=30 MB/frame, where each pixel is “10 bits,” “1.5” is the RGB to YCbCr (or YUV) conversion, and “8” is the bits-to-byte conversion), for a total allocation of approximately 300 MB across ten buffers. At a minimum, two buffers are required for acceptable latency: while the ISP/VFE 506 writes to the first buffer, the second buffer is consumed by the VLC decoder 510. This is known as “ping-pong” buffer management.
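The ping-pong scheme can be sketched as a sequential simulation (names and structure are illustrative, not from the disclosure): while the producer fills one buffer, the consumer drains the other, and the two swap roles each frame.

```python
# Hypothetical model of "ping-pong" frame-buffer management between the
# ISP/VFE (producer) and the VLC decoder (consumer).

# 16 M pixels x 10 bits x 1.5 (RGB->YCbCr) / 8 (bits->bytes) = 30 MB/frame
PER_FRAME_BYTES = int(16e6 * 10 * 1.5 / 8)

def ping_pong(frames):
    buffers = [None, None]
    write_idx = 0
    consumed = []
    for frame in frames:
        buffers[write_idx] = frame           # ISP/VFE writes this buffer...
        read_idx = write_idx ^ 1             # ...while the decoder reads the other
        if buffers[read_idx] is not None:
            consumed.append(buffers[read_idx])
        write_idx = read_idx                 # swap roles for the next frame
    consumed.append(buffers[write_idx ^ 1])  # drain the final frame
    return consumed
```

Note the cost this sketch makes visible: the decoder always trails the producer by one full frame, which is exactly the latency the dual port line memory described below is designed to remove.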
A drawback to current VLC signal processing techniques is that the VLC decoder 510 must wait for the full image frame to be delivered to the DDR memory 508 before it can retrieve the image frame from the DDR memory 508 and decode the VLC signal captured in the image frame. For camera operations (e.g., where the user is taking a picture), all lines of an image frame should be read for further processing. In the case of VLC signal processing, however, the VLC decoder 510 does not need to wait for the complete frame to be structured in order to perform further processing. That is because, as illustrated in
In the exemplary system 700, the ISP/VFE 706 fetches captured image data from the image sensor (e.g., image sensor 502) line by line and writes each line of data into the dual port memory 708 as it is fetched. The dual port memory 708 may be structured as a queue of line buffers, where the last line(s) in the queue are read by the VLC decoder 710 from one end while new line(s) are written by the ISP/VFE 706 to the other end. In an aspect, the ISP/VFE 706 and the VLC decoder 710 may be in the same clock domain, meaning that they both perform operations (i.e., writes and reads, respectively) on the dual port memory 708 during the same clock cycle. This reduces the number of clock cycles it takes to read the lines of the image frame. By writing the image data from the image sensor to the dual port memory 708 line by line, the VLC decoder 710 can receive an image frame line by line, rather than having to wait to receive an entire image frame at once.
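The line-buffer queue can be modeled as a bounded producer-consumer pipeline (a software sketch of the hardware arrangement; the thread structure, queue depth, and names are ours, not from the disclosure):

```python
# Hypothetical model of the dual port line memory: a bounded queue that an
# ISP/VFE thread writes into at one end while a VLC-decoder thread reads
# from the other, so decoding begins before the full frame is captured.
import queue
import threading

def run_pipeline(lines, depth=4):
    line_mem = queue.Queue(maxsize=depth)   # dual-port line-buffer queue
    decoded = []

    def isp_writer():                       # producer: one line at a time
        for line in lines:
            line_mem.put(line)
        line_mem.put(None)                  # end-of-frame marker

    def vlc_reader():                       # consumer: process lines as they arrive
        while (line := line_mem.get()) is not None:
            decoded.append(line)

    writer = threading.Thread(target=isp_writer)
    reader = threading.Thread(target=vlc_reader)
    writer.start(); reader.start()
    writer.join(); reader.join()
    return decoded
```

Because the queue is bounded to a few lines, this model also reflects the memory saving over full-frame buffering: only `depth` lines are ever resident instead of one or more complete frames.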
The functionality of the modules of
In addition, the components and functions represented by
Those of skill in the art will appreciate that information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
Further, those of skill in the art will appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the aspects disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The various illustrative logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The methods, sequences and/or algorithms described in connection with the aspects disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random-access memory (RAM), flash memory, read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal (e.g., UE). In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary aspects, the functions described herein, for example, with reference to
While the foregoing disclosure shows illustrative aspects of the disclosure, it should be noted that various changes and modifications could be made herein without departing from the scope of the disclosure as defined by the appended claims. The functions, steps and/or actions of the method claims in accordance with the aspects of the disclosure described herein need not be performed in any particular order. Furthermore, although elements of the disclosure may be described or claimed in the singular, the plural is contemplated unless limitation to the singular is explicitly stated.