Aspects of the present disclosure generally relate to inertial navigation and, more specifically, to inertial navigation using machine-learning models.
For a contemporary mobile device (such as a smartphone), six degrees of freedom (6DoF) pose tracking can be a valuable capability, supporting applications such as augmented/extended reality, autonomous vehicle control, immersive gaming, health and fitness monitoring, autonomous package handling, robotics, and others. In conjunction with 6DoF pose tracking, estimates of the translational and rotational motion of a mobile device can be inferred from inertial measurements provided by an inertial measurement unit (IMU). It may be possible to improve the accuracy of IMU-based pose tracking via multimodal sensor fusion, according to which additional modalities (such as camera imaging and global navigation satellite system (GNSS) location estimation) may be used in concert with motion estimation based on IMU data. However, pose tracking using multimodal sensor fusion may consume more power and, depending on the additional modalities used, may be unreliable under some conditions.
An example method for inertial navigation, according to this disclosure, may include obtaining inertial measurement unit (IMU) data from an IMU, generating, based on the IMU data, a respective pose measurement vector according to each of a plurality of machine-learning models, resulting in a plurality of pose measurement vectors, wherein each of the plurality of pose measurement vectors is associated with a respective one of multiple motion classes, and determining a device pose estimate based on the IMU data and the plurality of pose measurement vectors.
An example apparatus for inertial navigation aided with multi-interval pose measurements, according to this disclosure, may include an IMU, a transceiver, a memory, and one or more processors communicatively coupled with the transceiver and the memory, wherein the one or more processors are configured to obtain IMU data from the IMU, generate, based on the IMU data, a respective pose measurement vector according to each of a plurality of machine-learning models, resulting in a plurality of pose measurement vectors, wherein each of the plurality of pose measurement vectors is associated with a respective one of multiple motion classes, and determine a device pose estimate based on the IMU data and each of the plurality of pose measurement vectors.
An example apparatus for inertial navigation aided with multi-interval pose measurements, according to this disclosure, may include means for obtaining inertial measurement unit (IMU) data from an IMU, means for generating, based on the IMU data, a respective pose measurement vector according to each of a plurality of machine-learning models, resulting in a plurality of pose measurement vectors, wherein each of the plurality of pose measurement vectors is associated with a respective one of multiple motion classes, and means for determining a device pose estimate based on the IMU data and each of the plurality of pose measurement vectors.
This summary is neither intended to identify key or essential features of the claimed subject matter, nor is it intended to be used in isolation to determine the scope of the claimed subject matter. The subject matter should be understood by reference to appropriate portions of the entire specification of this disclosure, any or all drawings, and each claim. The foregoing, together with other features and examples, will be described in more detail below in the following specification, claims, and accompanying drawings.
Like reference symbols in the various drawings indicate like elements, in accordance with certain example implementations. In addition, multiple instances of an element may be indicated by following a first number for the element with a letter or a hyphen and a second number. For example, multiple instances of an element 110 may be indicated as 110-1, 110-2, 110-3, etc. or as 110a, 110b, 110c, etc. When referring to such an element using only the first number, any instance of the element is to be understood (e.g., element 110 in the previous example would refer to elements 110-1, 110-2, and 110-3 or to elements 110a, 110b, and 110c).
The following description is directed to certain implementations for the purposes of describing innovative aspects of various embodiments. However, a person having ordinary skill in the art will readily recognize that the teachings herein can be applied in a multitude of different ways. The described implementations may be implemented in any device, system, or network that is capable of transmitting and receiving radio frequency (RF) signals according to any communication standard, such as any of the Institute of Electrical and Electronics Engineers (IEEE) 802.15.4 standards for ultra-wideband (UWB), IEEE 802.11 standards (including those identified as Wi-Fi® technologies), the Bluetooth® standard, code division multiple access (CDMA), frequency division multiple access (FDMA), time division multiple access (TDMA), Global System for Mobile communications (GSM), GSM/General Packet Radio Service (GPRS), Enhanced Data GSM Environment (EDGE), Terrestrial Trunked Radio (TETRA), Wideband-CDMA (W-CDMA), Evolution Data Optimized (EV-DO), 1×EV-DO, EV-DO Rev A, EV-DO Rev B, High Rate Packet Data (HRPD), High Speed Packet Access (HSPA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Evolved High Speed Packet Access (HSPA+), Long Term Evolution (LTE), Advanced Mobile Phone System (AMPS), or other known signals that are used to communicate within a wireless, cellular, or internet of things (IoT) network, such as a system utilizing 3G, 4G, 5G, or 6G technology, or further implementations thereof.
As used herein, an “RF signal” comprises an electromagnetic wave that transports information through the space between a transmitter (or transmitting device) and a receiver (or receiving device). As used herein, a transmitter may transmit a single “RF signal” or multiple “RF signals” to a receiver. However, the receiver may receive multiple “RF signals” corresponding to each transmitted RF signal due to the propagation characteristics of RF signals through multiple channels or paths.
Additionally, unless otherwise specified, references to “reference signals,” “positioning reference signals,” “reference signals for positioning,” and the like may be used to refer to signals used for positioning of a user equipment (UE). As described in more detail herein, such signals may comprise any of a variety of signal types but may not necessarily be limited to a Positioning Reference Signal (PRS) as defined in relevant wireless standards.
Further, unless otherwise specified, the term “positioning” as used herein may refer to absolute location determination, relative location determination, ranging, or a combination thereof. Such positioning may include and/or be based on timing, angular, phase, or power measurements, or a combination thereof (which may include RF sensing measurements) for the purpose of location or sensing services.
Various aspects relate generally to inertial navigation, and more particularly to the use of machine-learning modeling to aid inertial navigation. Some aspects more specifically relate to machine-learning model-based inertial navigation using estimation filtering. In some examples, the estimation filtering can be performed using linear quadratic estimation (also known as Kalman filtering). According to some aspects, in conjunction with conducting inertial navigation for a mobile device, a pose estimation engine can determine device pose estimates based on pose measurement vectors generated according to multiple different machine-learning models. Each machine-learning model can be associated with a respective one of multiple motion classes, and can produce pose measurement vectors according to a time interval value associated with its motion class. In some examples, the pose measurement vectors can describe current pose characteristics, such as current positions, orientations, linear or angular velocities, or the like. In other examples, the pose measurement vectors can be delta measurements that indicate changes in pose characteristics over intervals in time, such as linear or angular displacements, differences in linear or angular velocities, or the like. According to some aspects, pose measurement vectors generated according to the various machine-learning models can be used to update a multi-measurement estimation filter bank. In various examples, position and orientation measurements of the pose measurement vectors can be weighted by their associated uncertainties to determine an estimated position and orientation of the mobile device. According to aspects of the disclosure, applying multiple different machine-learning models trained for estimation of motion of various types (and various associated time interval values) can allow the pose estimation engine to achieve greater overall levels of inertial navigation accuracy.
It should be noted that
Depending on desired functionality, the network 170 may comprise any of a variety of wireless and/or wireline networks. The network 170 can, for example, comprise any combination of public and/or private networks, local and/or wide-area networks, and the like. Furthermore, the network 170 may utilize one or more wired and/or wireless communication technologies. In some embodiments, the network 170 may comprise a cellular or other mobile network, a wireless local area network (WLAN), a wireless wide-area network (WWAN), and/or the Internet, for example. Examples of network 170 include a Long-Term Evolution (LTE) wireless network, a Fifth Generation (5G) wireless network (also referred to as New Radio (NR) wireless network or 5G NR wireless network), a Wi-Fi WLAN, and the Internet. LTE, 5G and NR are wireless technologies defined, or being defined, by the 3rd Generation Partnership Project (3GPP). Network 170 may also include more than one network and/or more than one type of network.
The base stations 120 and access points (APs) 130 may be communicatively coupled to the network 170. In some embodiments, the base stations 120 may be owned, maintained, and/or operated by a cellular network provider, and may employ any of a variety of wireless technologies, as described herein below. Depending on the technology of the network 170, a base station 120 may comprise a node B, an Evolved Node B (eNodeB or eNB), a base transceiver station (BTS), a radio base station (RBS), an NR NodeB (gNB), a Next Generation eNB (ng-eNB), or the like. A base station 120 that is a gNB or ng-eNB may be part of a Next Generation Radio Access Network (NG-RAN) which may connect to a 5G Core Network (5GC) in the case that Network 170 is a 5G network. The functionality performed by a base station 120 in earlier-generation networks (e.g., 3G and 4G) may be separated into different functional components (e.g., radio units (RUs), distributed units (DUs), and central units (CUs)) and layers (e.g., L1/L2/L3) in view of Open Radio Access Network (O-RAN) and/or Virtualized Radio Access Network (V-RAN or vRAN) architectures in 5G or later networks, which may be executed on different devices at different locations connected, for example, via fronthaul, midhaul, and backhaul connections. As referred to herein, a “base station” (or ng-eNB, gNB, etc.) may include any or all of these functional components. An AP 130 may comprise a Wi-Fi AP or a Bluetooth® AP or an AP having cellular capabilities (e.g., 4G LTE and/or 5G NR), for example. Thus, UE 105 can send and receive information with network-connected devices, such as location server 160, by accessing the network 170 via a base station 120 using a first communication link 133.
Additionally or alternatively, because APs 130 also may be communicatively coupled with the network 170, UE 105 may communicate with network-connected and Internet-connected devices, including location server 160, using a second communication link 135, or via one or more other mobile devices 145.
As used herein, the term “base station” may generically refer to a single physical transmission point, or multiple co-located physical transmission points, which may be located at a base station 120. A Transmission Reception Point (TRP) (also known as transmit/receive point) corresponds to this type of transmission point, and the term “TRP” may be used interchangeably herein with the terms “gNB,” “ng-eNB,” and “base station.” In some cases, a base station 120 may comprise multiple TRPs—e.g., with each TRP associated with a different antenna or a different antenna array for the base station 120. As used herein, the transmission functionality of a TRP may be performed by a transmission point (TP) and/or the reception functionality of a TRP may be performed by a reception point (RP), which may be physically separate or distinct from a TP. That said, a TRP may comprise both a TP and an RP. Physical transmission points may comprise an array of antennas of a base station 120 (e.g., as in a Multiple Input-Multiple Output (MIMO) system and/or where the base station employs beamforming). The term “base station” may additionally refer to multiple non-co-located physical transmission points; for example, the physical transmission points may be a Distributed Antenna System (DAS) (a network of spatially separated antennas connected to a common source via a transport medium) or a Remote Radio Head (RRH) (a remote base station connected to a serving base station).
As used herein, the term “cell” may generically refer to a logical communication entity used for communication with a base station 120, and may be associated with an identifier for distinguishing neighboring cells (e.g., a Physical Cell Identifier (PCID), a Virtual Cell Identifier (VCID)) operating via the same or a different carrier. In some examples, a carrier may support multiple cells, and different cells may be configured according to different protocol types (e.g., Machine-Type Communication (MTC), Narrowband Internet-of-Things (NB-IoT), Enhanced Mobile Broadband (eMBB), or others) that may provide access for different types of devices. In some cases, the term “cell” may refer to a portion of a geographic coverage area (e.g., a sector) over which the logical entity operates.
Satellites 110 may be utilized for positioning of the UE 105 in one or more ways. For example, satellites 110 (also referred to as space vehicles (SVs)) may be part of a Global Navigation Satellite System (GNSS) such as the Global Positioning System (GPS), GLONASS, Galileo or Beidou. Positioning using RF signals from GNSS satellites may comprise measuring multiple GNSS signals at a GNSS receiver of the UE 105 to perform code-based and/or carrier-based positioning, which can be highly accurate. Additionally or alternatively, satellites 110 may be utilized for NTN-based positioning, in which satellites 110 may functionally operate as TRPs (or TPs) of a network (e.g., LTE and/or NR network) and may be communicatively coupled with network 170. In particular, reference signals (e.g., PRS) transmitted by satellites 110 for NTN-based positioning may be similar to those transmitted by base stations 120, and may be coordinated by a location server 160. In some embodiments, satellites 110 used for NTN-based positioning may be different from those used for GNSS-based positioning. In some embodiments, NTN nodes may include non-terrestrial vehicles such as airplanes, balloons, drones, etc., which may be in addition or as an alternative to NTN satellites.
The location server 160 may comprise a server and/or other computing device configured to determine an estimated location of UE 105 and/or provide data (e.g., “assistance data”) to UE 105 to facilitate location measurement and/or location determination by UE 105. According to some embodiments, location server 160 may comprise a Home Secure User Plane Location (SUPL) Location Platform (H-SLP), which may support the SUPL user plane (UP) location solution defined by the Open Mobile Alliance (OMA) and may support location services for UE 105 based on subscription information for UE 105 stored in location server 160. In some embodiments, the location server 160 may comprise a Discovered SLP (D-SLP) or an Emergency SLP (E-SLP). The location server 160 may also comprise an Enhanced Serving Mobile Location Center (E-SMLC) that supports location of UE 105 using a control plane (CP) location solution for LTE radio access by UE 105. The location server 160 may further comprise a Location Management Function (LMF) that supports location of UE 105 using a control plane (CP) location solution for NR or LTE radio access by UE 105.
In a CP location solution, signaling to control and manage the location of UE 105 may be exchanged between elements of network 170 and with UE 105 using existing network interfaces and protocols and as signaling from the perspective of network 170. In a UP location solution, signaling to control and manage the location of UE 105 may be exchanged between location server 160 and UE 105 as data (e.g. data transported using the Internet Protocol (IP) and/or Transmission Control Protocol (TCP)) from the perspective of network 170.
As previously noted (and discussed in more detail below), the estimated location of UE 105 may be based on measurements of RF signals sent from and/or received by the UE 105. In particular, these measurements can provide information regarding the relative distance and/or angle of the UE 105 from one or more components in the positioning system 100 (e.g., GNSS satellites 110, APs 130, base stations 120). The location of the UE 105 can be estimated geometrically (e.g., using multiangulation and/or multilateration), based on the distance and/or angle measurements, along with the known positions of the one or more components.
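As a minimal sketch of the geometric estimation described above (the two-dimensional setup, function name, and sample values are illustrative assumptions, not from this disclosure), subtracting one anchor's range equation from the others linearizes the multilateration problem, which can then be solved by least squares:

```python
import numpy as np

def multilaterate(anchors, ranges):
    """Least-squares position fix from anchor positions and range measurements.

    Linearizes |p - a_i|^2 = r_i^2 by subtracting the first anchor's equation,
    yielding 2 (a_i - a_0) . p = |a_i|^2 - |a_0|^2 + r_0^2 - r_i^2.
    """
    a = np.asarray(anchors, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (a[1:] - a[0])
    b = (np.sum(a[1:] ** 2, axis=1) - np.sum(a[0] ** 2)
         + r[0] ** 2 - r[1:] ** 2)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Three anchors at known positions; ranges measured to the true point (2, 1).
anchors = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
true_p = np.array([2.0, 1.0])
ranges = [np.linalg.norm(true_p - np.array(a)) for a in anchors]
print(multilaterate(anchors, ranges))  # ≈ [2. 1.]
```

With noisy ranges and more than the minimum number of anchors, the same least-squares solve yields a best-fit position rather than an exact one.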
Although terrestrial components such as APs 130 and base stations 120 may be fixed, embodiments are not so limited. Mobile components may be used. For example, in some embodiments, a location of the UE 105 may be estimated at least in part based on measurements of RF signals 140 communicated between the UE 105 and one or more other mobile devices 145, which may be mobile or fixed. As illustrated, other mobile devices may include, for example, a mobile phone 145-1, vehicle 145-2, static communication/positioning device 145-3, or other static and/or mobile device capable of providing wireless signals used for positioning the UE 105, or a combination thereof. Wireless signals from mobile devices 145 used for positioning of the UE 105 may comprise RF signals using, for example, Bluetooth® (including Bluetooth Low Energy (BLE)), IEEE 802.11x (e.g., Wi-Fi®), Ultra Wideband (UWB), IEEE 802.15x, or a combination thereof. Mobile devices 145 may additionally or alternatively use non-RF wireless signals for positioning of the UE 105, such as infrared signals or other optical technologies.
Mobile devices 145 may comprise other UEs communicatively coupled with a cellular or other mobile network (e.g., network 170). When one or more other mobile devices 145 comprising UEs are used in the position determination of a particular UE 105, the UE 105 for which the position is to be determined may be referred to as the “target UE,” and each of the other mobile devices 145 used may be referred to as an “anchor UE.” For position determination of a target UE, the respective positions of the one or more anchor UEs may be known and/or jointly determined with the target UE. Direct communication between the one or more other mobile devices 145 and UE 105 may comprise sidelink and/or similar Device-to-Device (D2D) communication technologies. Sidelink, which is defined by 3GPP, is a form of D2D communication under the cellular-based LTE and NR standards. UWB may be one such technology by which the positioning of a target device (e.g., UE 105) may be facilitated using measurements from one or more anchor devices (e.g., mobile devices 145).
According to some embodiments, such as when the UE 105 comprises and/or is incorporated into a vehicle, a form of D2D communication used by the mobile device 105 may comprise vehicle-to-everything (V2X) communication. V2X is a communication standard for vehicles and related entities to exchange information regarding a traffic environment. V2X can include vehicle-to-vehicle (V2V) communication between V2X-capable vehicles, vehicle-to-infrastructure (V2I) communication between the vehicle and infrastructure-based devices (commonly termed roadside units (RSUs)), vehicle-to-person (V2P) communication between vehicles and nearby people (pedestrians, cyclists, and other road users), and the like. Further, V2X can use any of a variety of wireless RF communication technologies. Cellular V2X (CV2X), for example, is a form of V2X that uses cellular-based communication such as LTE (4G), NR (5G) and/or other cellular technologies in a direct-communication mode as defined by 3GPP. The UE 105 illustrated in
An estimated location of UE 105 can be used in a variety of applications—e.g. to assist direction finding or navigation for a user of UE 105 or to assist another user (e.g. associated with external client 180) to locate UE 105. A “location” is also referred to herein as a “location estimate”, “estimated location”, “position”, “position estimate”, “position fix”, “estimated position”, “location fix” or “fix”. The process of determining a location may be referred to as “positioning,” “position determination,” “location determination,” or the like. A location of UE 105 may comprise an absolute location of UE 105 (e.g. a latitude and longitude and possibly altitude) or a relative location of UE 105 (e.g. a location expressed as distances north or south, east or west and possibly above or below some other known fixed location (including, e.g., the location of a base station 120 or AP 130) or some other location such as a location for UE 105 at some known previous time, or a location of a mobile device 145 (e.g., another UE) at some known previous time). A location may be specified as a geodetic location comprising coordinates which may be absolute (e.g. latitude, longitude and optionally altitude), relative (e.g. relative to some known absolute location) or local (e.g. X, Y and optionally Z coordinates according to a coordinate system defined relative to a local area such as a factory, warehouse, college campus, shopping mall, sports stadium or convention center). A location may instead be a civic location and may then comprise one or more of a street address (e.g. including names or labels for a country, state, county, city, road and/or street, and/or a road or street number), and/or a label or name for a place, building, portion of a building, floor of a building, and/or room inside a building etc.
A location may further include an uncertainty or error indication, such as a horizontal and possibly vertical distance by which the location is expected to be in error or an indication of an area or volume (e.g. a circle or ellipse) within which UE 105 is expected to be located with some level of confidence (e.g. 95% confidence).
The external client 180 may be a web server or remote application that may have some association with UE 105 (e.g. may be accessed by a user of UE 105) or may be a server, application, or computer system providing a location service to some other user or users which may include obtaining and providing the location of UE 105 (e.g. to enable a service such as friend or relative finder, or child or pet location). Additionally or alternatively, the external client 180 may obtain and provide the location of UE 105 to an emergency services provider, government agency, etc.
The UE 105 may comprise and/or be referred to as a device, a mobile device, a wireless device, a mobile terminal, a terminal, a mobile station (MS), a Secure User Plane Location (SUPL)-Enabled Terminal (SET), or by some other name. Moreover, UE 105 may correspond to a cellphone, smartphone, laptop, tablet, personal data assistant (PDA), navigation device, Internet of Things (IoT) device, or some other portable or moveable device. Typically, though not necessarily, the UE 105 may support wireless communication using one or more Radio Access Technologies (RATs) such as using GSM, CDMA, W-CDMA, LTE, High Rate Packet Data (HRPD), IEEE 802.11 Wi-Fi®, Bluetooth, Worldwide Interoperability for Microwave Access (WiMAX™), 5G NR, etc.
The UE 105 may include a single entity or may include multiple entities, such as in a personal area network where a user may employ audio, video and/or data I/O devices, and/or body sensors and a separate wireline or wireless modem. An estimate of a location of the UE 105 may be referred to as a location, location estimate, location fix, fix, position, position estimate, or position fix, and may be geodetic, thus providing location coordinates for the UE 105 (e.g., latitude and longitude), which may or may not include an altitude component (e.g., height above sea level, height above or depth below ground level, floor level or basement level). Alternatively, a location of the UE 105 may be expressed as a civic location (e.g., as a postal address or the designation of some point or small area in a building such as a particular room or floor). A location of the UE 105 may also be expressed as an area or volume (defined either geodetically or in civic form) within which the UE 105 is expected to be located with some probability or confidence level (e.g., 67%, 95%, etc.). A location of the UE 105 may further be a relative location comprising, for example, a distance and direction or relative X, Y (and Z) coordinates defined relative to some origin at a known location which may be defined geodetically, in civic terms, or by reference to a point, area, or volume indicated on a map, floor plan or building plan. In the description contained herein, the use of the term location may comprise any of these variants unless indicated otherwise. When computing the location of a UE, it is common to solve for local X, Y, and possibly Z coordinates and then, if needed, convert the local coordinates into absolute ones (e.g. for latitude, longitude and altitude above or below mean sea level).
According to aspects of the disclosure, based on estimates of translational and/or rotational motion that it infers based on IMU data 206, pose estimation engine 208 can determine device pose estimates 230. Each device pose estimate 230 can generally indicate an estimated position and/or estimated orientation of mobile device 202 in terms of one or more dimensions of the coordinate system defined by the x, y, and z axes. According to aspects of the disclosure, in order to use IMU data 206 to determine device pose estimates 230, pose estimation engine 208 can apply a pose estimation scheme 210.
According to pose estimation scheme 400, the pose estimate is handled as an estimated system state that is tracked and updated using an estimation filter (EF). In some examples, the estimation filter can be a Kalman filter (KF). In other examples, the estimation filter can be an estimation filter of another type, such as an extended KF (EKF), unscented KF, cubature KF, alpha-beta filter, Gaussian-sum filter, interactive multiple model filter, or particle filter. IMU data provided by an IMU is both propagated to the estimation filter and input into the machine-learning model. The machine-learning model is used to generate pose measurements and uncertainties based on the IMU data.
The pose measurements can include three translational motion measurements (or position measurements implying those translational motion measurements), including a respective translational motion measurement (or position measurement implying that translational motion measurement) for each of three dimensions, and three rotational motion measurements (or orientation measurements implying those rotational motion measurements), including a respective rotational motion measurement (or orientation measurement implying that rotational motion measurement) for each of the three dimensions. The uncertainties can include respective uncertainties for each of the three translational motion (or position) measurements and each of the three rotational motion (or orientation) measurements.
The machine-learning model can generate the pose measurements according to a time interval value. The time interval value can define an amount of time across which changes in position and orientation (as a result of translational and rotational motion, respectively) are to be measured. Thus, for example, if the machine-learning model generates the pose measurements according to a time interval value of 1 second, the pose measurements can include translational motion measurements and rotational motion measurements corresponding to changes in position and orientation, respectively, occurring over a particular 1 second interval in time.
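One way to realize measurement generation according to a time interval value is to buffer IMU samples in a fixed-duration sliding window whose length is the interval value times the IMU sample rate. The sketch below is a hypothetical illustration (the `IntervalWindow` helper and its names are assumptions, not from this disclosure):

```python
from collections import deque

class IntervalWindow:
    """Fixed-duration sliding window of IMU samples for one time interval value.

    A model with time interval value `interval_s` seconds, at an IMU sample
    rate of `rate_hz`, consumes the most recent round(interval_s * rate_hz)
    samples each time it generates a pose measurement.
    """
    def __init__(self, interval_s, rate_hz):
        self.size = max(1, round(interval_s * rate_hz))
        self.buf = deque(maxlen=self.size)  # old samples drop off automatically

    def push(self, sample):
        self.buf.append(sample)

    def ready(self):
        """True once a full interval's worth of samples has accumulated."""
        return len(self.buf) == self.size

    def window(self):
        return list(self.buf)

# At 100 Hz, a 1-second interval holds 100 samples; a 0.1-second interval, 10.
w1, w01 = IntervalWindow(1.0, 100), IntervalWindow(0.1, 100)
print(w1.size, w01.size)  # 100 10
```

Under this assumption, a model associated with a longer interval value simply consumes a longer window of the same IMU stream.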
Each time the estimation filter is updated, pose measurements and uncertainties generated using the machine-learning model serve as bases for correction of the estimated system state. Upon completion of any given update, a pose estimate can be determined based on the estimated system state and can be provided to the client. The pose estimate can also be fed back to the machine-learning model, for reference in conjunction with generating pose measurements and uncertainties for a next estimation filter update.
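As a minimal sketch of the correction step described above, a scalar Kalman update fusing the filter's predicted state with a model-generated measurement and its uncertainty might look like the following (the one-dimensional state and sample values are simplifications for illustration, not the disclosed filter):

```python
def kf_update(x, P, z, R):
    """Scalar Kalman correction: fuse prediction (x, P) with measurement (z, R).

    x, P: predicted state and its variance; z, R: measurement and its variance.
    """
    K = P / (P + R)          # Kalman gain: trusts whichever variance is smaller
    x = x + K * (z - x)      # corrected state, pulled toward the measurement
    P = (1.0 - K) * P        # corrected (reduced) variance
    return x, P

# Predicted position 10.0 with variance 4.0; a model-generated measurement of
# 12.0 with variance 1.0 pulls the estimate most of the way toward 12.0.
x, P = kf_update(10.0, 4.0, 12.0, 1.0)
print(x, P)  # ≈ 11.6, 0.8
```

Because the gain depends on the measurement variance, a pose measurement with a small reported uncertainty corrects the estimated system state strongly, while a highly uncertain one barely moves it.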
Based on IMU data provided by an IMU, each machine-learning model can generate pose measurements and uncertainties associated with a different respective time interval value, which can correspond to the motion class for which that machine-learning model is responsible. For example, the machine-learning models A, B, C, and D shown in
According to pose estimation scheme 500, the pose estimate can be determined based on an estimated system state that is tracked and updated using a multi-measurement estimation filter bank. The multi-measurement estimation filter bank can be realized via the implementation of estimation filtering according to measurement and update equations that are modified to use multiple sets of pose measurements and uncertainties as inputs. In some examples, the multi-measurement estimation filter bank can be a bank of Kalman filters (a “Kalman filter bank”). In other examples, in lieu of—or in addition to—Kalman filters (KFs), the multi-measurement estimation filter bank can include estimation filters of one or more other types, such as extended KFs (EKFs), unscented KFs, cubature KFs, alpha-beta filters, Gaussian-sum filters, interactive multiple model filters, particle filters, or a combination thereof. According to aspects of the disclosure, updates of the multi-measurement estimation filter bank can be performed based on pose measurements and uncertainties generated by machine-learning models A, B, C, and D to correct the estimated system state. In this context, the translational and rotational motion measurements provided by the various machine-learning models can be weighted according to their corresponding uncertainties. Motion measurements having lesser associated levels of uncertainty can be afforded greater weights in conjunction with determining the pose estimate, and motion measurements having greater associated levels of uncertainty can be afforded lesser weights.
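The uncertainty weighting described above can be sketched, for a single dimension, as inverse-variance weighted fusion of the per-model measurements (the `fuse` helper and the sample numbers are illustrative assumptions, not the disclosed update equations):

```python
def fuse(measurements, variances):
    """Inverse-variance weighted fusion of per-model measurements (one axis).

    Each measurement is weighted by the reciprocal of its variance, so
    less-uncertain models contribute more to the fused estimate.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    est = sum(w * m for w, m in zip(weights, measurements)) / total
    var = 1.0 / total  # fused variance is smaller than any single input
    return est, var

# Hypothetical x-displacement measurements from four models with differing
# uncertainties; the low-variance models dominate the fused result.
est, var = fuse([1.0, 1.2, 0.8, 1.1], [0.01, 0.04, 0.25, 0.04])
print(est, var)
```

This is the standard minimum-variance combination for independent Gaussian measurements; a full filter bank would additionally account for correlations and the propagated state, but the weighting intuition is the same.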
In operating environment 600, based on IMU data 606 obtained from IMU 204, pose estimation engine 208 can generate a plurality of pose measurement vectors 614. Each pose measurement vector 614 can generally describe one or more pose characteristics of mobile device 202, such as one or more of a position of mobile device 202, an orientation of mobile device 202, a linear velocity of mobile device 202, and an angular velocity of mobile device 202. Each pose measurement vector 614 can generally describe such pose characteristic(s) with respect to one or more coordinate dimensions, such as one or more of those defined by the x, y, and z axes in
According to aspects of the disclosure, each of the plurality of pose measurement vectors 614 can be generated according to a respective one of a plurality of machine-learning models 612 (such as machine-learning models A, B, C, and D of
In some examples, the position measurement parameters 616 can indicate positions according to a three-dimensional coordinate system, such as that defined by the x, y, and z axes in
In some examples, each pose measurement vector 614 can include one or more measurement parameters, including a respective measurement parameter for each of one or more dimensions, and one or more uncertainty parameters, including a respective uncertainty parameter for each of the one or more measurement parameters. In some examples, for each of the one or more dimensions, the one or more measurement parameters can include a respective position measurement parameter 616, a respective orientation measurement parameter 618, or a combination of both. In some examples, for each of the one or more dimensions, the one or more uncertainty parameters can include a respective position uncertainty parameter 617, a respective orientation uncertainty parameter 619, or a combination of both.
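By way of a non-limiting illustration, a pose measurement vector of this kind can be sketched as a simple container holding per-dimension measurement and uncertainty parameters; the Python names and values below are hypothetical and chosen solely for illustration:

```python
from dataclasses import dataclass

@dataclass
class PoseMeasurementVector:
    """Illustrative container for one model's pose measurement output:
    three position and three orientation measurement parameters, each with
    a corresponding per-dimension uncertainty parameter."""
    position: tuple[float, float, float]           # x, y, z position measurements
    orientation: tuple[float, float, float]        # per-axis orientation measurements
    position_sigma: tuple[float, float, float]     # per-dimension position uncertainties
    orientation_sigma: tuple[float, float, float]  # per-dimension orientation uncertainties

# Hypothetical output of one machine-learning model for one measurement epoch.
vec = PoseMeasurementVector(
    position=(1.0, 2.0, 0.5),
    orientation=(0.0, 0.1, 1.2),
    position_sigma=(0.2, 0.2, 0.4),
    orientation_sigma=(0.05, 0.05, 0.1),
)
```

Each measurement parameter is paired with its own uncertainty parameter, which is what enables the per-dimension, uncertainty-based weighting described below.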
In some examples, each pose measurement vector 614 can include three position measurement parameters 616, including a respective position measurement parameter 616 for each of three dimensions (such as the x, y, and z dimensions in
Based on IMU data 606 and the plurality of pose measurement vectors 614, pose estimation engine 208 can generate a device pose estimate 630, which can comprise one or more position estimate parameters 632, one or more orientation estimate parameters 634, or a combination of both. In some examples, the device pose estimate can comprise, for each of one or more dimensions, a respective position estimate parameter 632, a respective orientation estimate parameter 634, or a combination of both. In some examples, the device pose estimate 630 can comprise three position estimate parameters 632, including a respective position estimate parameter 632 for each of three dimensions, and can comprise three orientation estimate parameters 634, including a respective orientation estimate parameter 634 for each of the three dimensions.
In some examples, for each of one or more dimensions, pose estimation engine 208 can determine a respective position estimate parameter 632 for that dimension by weighting respective position measurement parameters 616 for that dimension among position measurement parameters 616 of the plurality of pose measurement vectors 614 according to respective position uncertainty parameters 617 for that dimension among position uncertainty parameters 617 of the plurality of pose measurement vectors 614.
In some examples, for each of the one or more dimensions, pose estimation engine 208 can additionally or alternatively determine a respective orientation estimate parameter 634 for that dimension by weighting respective orientation measurement parameters 618 for that dimension among orientation measurement parameters 618 of the plurality of pose measurement vectors 614 according to respective orientation uncertainty parameters 619 for that dimension among orientation uncertainty parameters 619 of the plurality of pose measurement vectors 614.
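One conventional way to realize such uncertainty-based weighting is inverse-variance weighting, in which each measurement's weight is the reciprocal of its variance. The sketch below (Python, with hypothetical values) illustrates the principle for a single dimension; it is not intended to represent the exact weighting used by pose estimation engine 208:

```python
def inverse_variance_fuse(measurements, sigmas):
    """Fuse scalar measurements for one dimension, weighting each by
    1/sigma^2. Measurements with smaller uncertainty receive larger
    weight, consistent with the weighting principle described above.
    Inputs are parallel sequences of per-model measurements and their
    standard deviations."""
    weights = [1.0 / (s * s) for s in sigmas]
    total = sum(weights)
    return sum(w * m for w, m in zip(weights, measurements)) / total

# Example: four models' x-position measurements with differing uncertainties.
x_est = inverse_variance_fuse([1.0, 1.2, 0.8, 1.1], [0.1, 0.5, 0.5, 0.2])
```

With these inputs, the first and fourth measurements dominate the fused estimate because their uncertainties are smallest, while the two noisier measurements contribute comparatively little.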
According to aspects of the disclosure, pose estimation engine 208 can determine the device pose estimate 630 based on an estimated system state that pose estimation engine 208 tracks and updates using a multi-measurement estimation filter bank 620. In some examples, the multi-measurement estimation filter bank can be realized via the implementation of estimation filtering according to measurement and update equations that are modified to use multiple sets of pose measurements and uncertainties as inputs. In some examples, the multi-measurement estimation filter bank can be a Kalman filter bank. In other examples, in lieu of—or in addition to—Kalman filters (KFs), the multi-measurement estimation filter bank can include estimation filters of one or more other types, such as extended KFs (EKFs), unscented KFs, cubature KFs, alpha-beta filters, Gaussian-sum filters, interactive multiple model filters, particle filters, or a combination thereof.
In some examples, pose estimation engine 208 can update a state of the multi-measurement estimation filter bank 620 based on the IMU data 606 and the plurality of pose measurement vectors 614, and can determine the device pose estimate 630 based on the updated state of the multi-measurement estimation filter bank 620. In some examples, the multi-measurement estimation filter bank 620 can be implemented as a state buffer, and pose estimation engine 208 can update the state of the multi-measurement estimation filter bank 620 by performing, for each of the plurality of pose measurement vectors 614, a respective update of the state buffer.
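As a minimal illustration of how each pose measurement vector can contribute a respective update, the sketch below applies a sequence of scalar Kalman measurement updates, one per measurement, with each measurement's influence governed by its variance. The scalar simplification and the numeric values are hypothetical; an actual filter bank would operate on a multi-dimensional system state:

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update: state estimate x with
    variance P is corrected by measurement z with variance R. The gain K
    implicitly weights the measurement by its uncertainty, approaching 1
    for precise measurements and 0 for noisy ones."""
    K = P / (P + R)        # Kalman gain
    x = x + K * (z - x)    # correct the state toward the measurement
    P = (1.0 - K) * P      # reduce state uncertainty after the update
    return x, P

# Sequentially apply each model's (measurement, variance) pair to the state.
x, P = 0.0, 1.0
for z, R in [(1.0, 0.01), (1.2, 0.25), (0.8, 0.25), (1.1, 0.04)]:
    x, P = kalman_update(x, P, z, R)
```

After all four updates, the state estimate settles near the low-variance measurements and the state variance has shrunk well below its initial value.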
In some examples, one or more of the machine-learning models 612 according to which pose estimation engine 208 generates the plurality of pose measurement vectors 614 can be selected based on one or more navigation context parameters 622. The one or more navigation context parameters 622 can generally comprise information characterizing underlying circumstances under which pose estimation is being conducted. In some examples, the one or more navigation context parameters 622 can be included among context information that serves as a basis for context-based model selection according to pose estimation scheme 505 of
In some examples, based on the context of pose estimation, as characterized by one or more navigation context parameters 622, the uncertainty parameters (such as position uncertainty parameters 617 and/or orientation uncertainty parameters 619) associated with one or more machine-learning models 612 can be scaled (for instance, increased or decreased). The pose estimates produced by those one or more machine-learning models 612 can then be weighted according to their scaled uncertainties.
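A sketch of such context-based scaling (Python; the scale factor and values are hypothetical) might look like the following, where a factor above 1.0 de-emphasizes a model's measurements in subsequent uncertainty-based weighting and a factor below 1.0 emphasizes them:

```python
def scale_uncertainties(sigmas, context_scale):
    """Scale a model's per-dimension uncertainty parameters by a
    context-derived factor before uncertainty-based weighting is applied."""
    return [s * context_scale for s in sigmas]

# Example: de-emphasize one model's measurements in an unfavorable context
# by doubling its position uncertainty parameters.
scaled = scale_uncertainties([0.2, 0.2, 0.4], context_scale=2.0)
```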
In some examples, pose estimation engine 208 can obtain sensor information 624 from one or more sensors (not shown in
In some examples, pose estimation engine 208 can determine or estimate a confidence level associated with device pose estimate 630 based on uncertainty parameters comprised in pose measurement vectors 614, such as position uncertainty parameters 617 and/or orientation uncertainty parameters 619. In some examples, the confidence level can be included among uncertainty feedback that serves as a basis for determining whether or not to acquire and utilize supplemental measurements in conjunction with pose estimation according to pose estimation scheme 520 of
Updates to the multi-measurement estimation filter bank can be conducted by modifying buffer elements of the state buffer 721 according to outputs of multiple machine-learning models, including machine-learning models A, B, C, and D. The scope (in terms of buffer elements) of modifications to the state buffer 721 involved in an update based on outputs of a given machine-learning model corresponds to the time interval value associated with that machine-learning model. Updates based on outputs of machine-learning model A, which has an associated time interval value of 1 second, involve modifications spanning from buffer element (BE) 1 to buffer element 2, and thus a scope of one buffer element. Similarly, updates based on outputs of machine-learning models B, C, and D, which have respective associated time interval values of 2, 4, and 8 seconds, involve modifications having respective scopes of two, four, and eight buffer elements. In some examples, any given pose measurement (which can include position and/or orientation components in one or multiple dimensions) can take the form of a current pose measurement for a specific buffer element in state buffer 721. In some other examples, any given pose measurement can take the form of a delta pose change from one buffer element of state buffer 721 to another.
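Under the time interval values above, and assuming one buffer element per second (an assumption made purely for illustration), the span of buffer elements touched by an update from each model can be sketched as:

```python
# Time interval value (in seconds) associated with each machine-learning
# model, matching the example above.
INTERVALS = {"A": 1, "B": 2, "C": 4, "D": 8}

def update_span(model):
    """Return the (start, end) buffer elements modified by an update based
    on outputs of `model`: the scope, in buffer elements, equals the
    model's associated time interval value."""
    return (1, 1 + INTERVALS[model])

# Model A modifies buffer elements 1 through 2 (scope: one element);
# model D modifies buffer elements 1 through 9 (scope: eight elements).
```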
At block 810, the functionality comprises obtaining IMU data from an IMU. For example, in operating environment 600 of
At block 820, the functionality comprises generating, based on the IMU data, a respective pose measurement vector according to each of a plurality of machine-learning models, resulting in a plurality of pose measurement vectors, where each of the plurality of pose measurement vectors is associated with a respective one of multiple motion classes. For example, in operating environment 600 of
In some examples, each pose measurement vector of the plurality of pose measurement vectors can include three position measurement parameters, including a respective position measurement parameter for each of three dimensions, and three orientation measurement parameters, including a respective orientation measurement parameter for each of the three dimensions. For example, in operating environment 600 of
In some examples, each pose measurement vector of the plurality of pose measurement vectors can comprise six uncertainty parameters, which can include three position uncertainty parameters, including a respective position uncertainty parameter for each of the three position measurement parameters, and three orientation uncertainty parameters, including a respective orientation uncertainty parameter for each of the three orientation measurement parameters. For example, in operating environment 600 of
In some examples, at least one of the plurality of machine-learning models can be selected based on one or more navigation context parameters. For example, in operating environment 600 of
At block 830, the functionality comprises determining a device pose estimate based on the IMU data and each of the plurality of pose measurement vectors. For example, in operating environment 600 of
In some examples, a state of a multi-measurement estimation filter bank can be updated based on the IMU data and each of the plurality of pose measurement vectors, and the device pose estimate can be determined based on the updated state of the multi-measurement estimation filter bank. For example, in operating environment 600 of
In some examples, the multi-measurement estimation filter bank can be implemented as a state buffer, and updating the state of the multi-measurement estimation filter bank based on the IMU data and each of the plurality of pose measurement vectors can include performing, for each of the plurality of pose measurement vectors, a respective update of the state buffer. For example, updating the state of the multi-measurement estimation filter bank 620 based on the IMU data 606 and each of the plurality of pose measurement vectors 614 in operating environment 600 of
In some examples, the device pose estimate can comprise three position estimate parameters, including a respective position estimate parameter for each of three dimensions, and three orientation estimate parameters, including a respective orientation estimate parameter for each of the three dimensions. For example, in operating environment 600 of
In some examples, for each of the three dimensions, the respective position estimate parameter for that dimension can be determined by weighting the respective position measurement parameters for that dimension among the position measurement parameters of the plurality of pose measurement vectors according to the respective position uncertainty parameters for that dimension among the position uncertainty parameters of the plurality of pose measurement vectors. For example, for each of the x, y, and z dimensions in operating environment 600 of
In some examples, for each of the three dimensions, the respective orientation estimate parameter for that dimension can be determined by weighting the respective orientation measurement parameters for that dimension among the orientation measurement parameters of the plurality of pose measurement vectors according to the respective orientation uncertainty parameters for that dimension among the orientation uncertainty parameters of the plurality of pose measurement vectors. For example, for each of the x, y, and z dimensions in operating environment 600 of
In some examples, sensor information can be obtained from one or more sensors, and the device pose estimate can be determined at block 830 based on the IMU data, each of the plurality of pose measurement vectors, and the sensor information. For example, in operating environment 600 of
In some examples, based on uncertainty parameters of the plurality of pose measurement vectors, one or more types of supplemental measurements can be selected as inputs for determining the device pose estimate, and the sensor information can be obtained from the one or more sensors based on the one or more selected types of supplemental measurements. For example, in operating environment 600 of
The mobile device 900 is shown comprising hardware elements that can be electrically coupled via a bus 905 (or may otherwise be in communication, as appropriate). The hardware elements may include processor(s) 910, which can include without limitation one or more general-purpose processors (e.g., an application processor), one or more special-purpose processors (such as digital signal processor (DSP) chips, graphics acceleration processors, application specific integrated circuits (ASICs), and/or the like), and/or other processing structures or means. Processor(s) 910 may comprise one or more processing units, which may be housed in a single integrated circuit (IC) or multiple ICs. As shown in
The mobile device 900 may also include a wireless communication interface 930, which may comprise without limitation a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset (such as a Bluetooth® device, an IEEE 802.11 device, an IEEE 802.15.4 device, a Wi-Fi device, a WiMAX device, a WAN device, and/or various cellular devices, etc.), and/or the like, which may enable the mobile device 900 to communicate with other devices as described in the embodiments above. The wireless communication interface 930 may permit data and signaling to be communicated (e.g., transmitted and received) with TRPs of a network, for example, via eNBs, gNBs, ng-eNBs, access points, various base stations and/or other access node types, and/or other network components, computer systems, and/or any other electronic devices communicatively coupled with TRPs, as described herein. The communication can be carried out via one or more wireless communication antenna(s) 932 that send and/or receive wireless signals 934. According to some embodiments, the wireless communication antenna(s) 932 may comprise a plurality of discrete antennas, antenna arrays, or any combination thereof. The antenna(s) 932 may be capable of transmitting and receiving wireless signals using beams (e.g., Tx beams and Rx beams). Beam formation may be performed using digital and/or analog beam formation techniques, with respective digital and/or analog circuitry. The wireless communication interface 930 may include such circuitry.
Depending on desired functionality, the wireless communication interface 930 may comprise a separate receiver and transmitter, or any combination of transceivers, transmitters, and/or receivers to communicate with base stations (e.g., ng-eNBs and gNBs) and other terrestrial transceivers, such as wireless devices and access points. The mobile device 900 may communicate with different data networks that may comprise various network types. For example, a WWAN may be a CDMA network, a Time Division Multiple Access (TDMA) network, a Frequency Division Multiple Access (FDMA) network, an Orthogonal Frequency Division Multiple Access (OFDMA) network, a Single-Carrier Frequency Division Multiple Access (SC-FDMA) network, a WiMAX (IEEE 802.16) network, and so on. A CDMA network may implement one or more RATs such as CDMA2000®, WCDMA, and so on. CDMA2000® includes IS-95, IS-2000 and/or IS-856 standards. A TDMA network may implement GSM, Digital Advanced Mobile Phone System (D-AMPS), or some other RAT. An OFDMA network may employ LTE, LTE Advanced, 5G NR, and so on. 5G NR, LTE, LTE Advanced, GSM, and WCDMA are described in documents from 3GPP. CDMA2000® is described in documents from a consortium named “3rd Generation Partnership Project 2” (3GPP2). 3GPP and 3GPP2 documents are publicly available. A wireless local area network (WLAN) may also be an IEEE 802.11x network, and a wireless personal area network (WPAN) may be a Bluetooth network, an IEEE 802.15x network, or some other type of network. The techniques described herein may also be used for any combination of WWAN, WLAN and/or WPAN.
The mobile device 900 can further include sensor(s) 940. Sensor(s) 940 may comprise, without limitation, one or more inertial sensors and/or other sensors (e.g., accelerometer(s), gyroscope(s), camera(s), magnetometer(s), altimeter(s), microphone(s), proximity sensor(s), light sensor(s), barometer(s), and the like), some of which may be used to obtain position-related measurements and/or other information.
Embodiments of the mobile device 900 may also include a Global Navigation Satellite System (GNSS) receiver 980 capable of receiving signals 984 from one or more GNSS satellites using an antenna 982 (which could be the same as antenna 932). Positioning based on GNSS signal measurement can be utilized to complement and/or incorporate the techniques described herein. The GNSS receiver 980 can extract a position of the mobile device 900, using conventional techniques, from GNSS satellites of a GNSS system, such as Global Positioning System (GPS), Galileo, GLONASS, Quasi-Zenith Satellite System (QZSS) over Japan, IRNSS over India, BeiDou Navigation Satellite System (BDS) over China, and/or the like. Moreover, the GNSS receiver 980 can be used with various augmentation systems (e.g., a Satellite Based Augmentation System (SBAS)) that may be associated with or otherwise enabled for use with one or more global and/or regional navigation satellite systems, such as, e.g., Wide Area Augmentation System (WAAS), European Geostationary Navigation Overlay Service (EGNOS), Multi-functional Satellite Augmentation System (MSAS), and Geo Augmented Navigation system (GAGAN), and/or the like.
It can be noted that, although GNSS receiver 980 is illustrated in
The mobile device 900 may further include and/or be in communication with a memory 960. The memory 960 can include, without limitation, local and/or network accessible storage, a disk drive, a drive array, an optical storage device, a solid-state storage device, such as a random access memory (RAM), and/or a read-only memory (ROM), which can be programmable, flash-updateable, and/or the like. Such storage devices may be configured to implement any appropriate data stores, including without limitation, various file systems, database structures, and/or the like.
The memory 960 of the mobile device 900 also can comprise software elements (not shown in
It will be apparent to those skilled in the art that substantial variations may be made in accordance with specific requirements. For example, customized hardware might also be used and/or particular elements might be implemented in hardware, software (including portable software, such as applets, etc.), or both. Further, connection to other computing devices such as network input/output devices may be employed.
With reference to the appended figures, components that can include memory can include non-transitory machine-readable media. The terms “machine-readable medium” and “computer-readable medium,” as used herein, refer to any storage medium that participates in providing data that causes a machine to operate in a specific fashion. In embodiments provided hereinabove, various machine-readable media might be involved in providing instructions/code to processors and/or other device(s) for execution. Additionally or alternatively, the machine-readable media might be used to store and/or carry such instructions/code. In many implementations, a computer-readable medium is a physical and/or tangible storage medium. Such a medium may take many forms, including but not limited to, non-volatile media and volatile media. Common forms of computer-readable media include, for example, magnetic and/or optical media, any other physical medium with patterns of holes, a RAM, a programmable ROM (PROM), erasable PROM (EPROM), a FLASH-EPROM, any other memory chip or cartridge, or any other medium from which a computer can read instructions and/or code.
The methods, systems, and devices discussed herein are examples. Various embodiments may omit, substitute, or add various procedures or components as appropriate. For instance, features described with respect to certain embodiments may be combined in various other embodiments. Different aspects and elements of the embodiments may be combined in a similar manner. The various components of the figures provided herein can be embodied in hardware and/or software. Also, technology evolves and, thus many of the elements are examples that do not limit the scope of the disclosure to those specific examples.
It has proven convenient at times, principally for reasons of common usage, to refer to such signals as bits, information, values, elements, symbols, characters, variables, terms, numbers, numerals, or the like. It should be understood, however, that all of these or similar terms are to be associated with appropriate physical quantities and are merely convenient labels. Unless specifically stated otherwise, as is apparent from the discussion above, it is appreciated that throughout this Specification discussions utilizing terms such as “processing,” “computing,” “calculating,” “determining,” “ascertaining,” “identifying,” “associating,” “measuring,” “performing,” or the like refer to actions or processes of a specific apparatus, such as a special purpose computer or a similar special purpose electronic computing device. In the context of this Specification, therefore, a special purpose computer or a similar special purpose electronic computing device is capable of manipulating or transforming signals, typically represented as physical electronic, electrical, or magnetic quantities within memories, registers, or other information storage devices, transmission devices, or display devices of the special purpose computer or similar special purpose electronic computing device.
The terms “and” and “or,” as used herein, may include a variety of meanings that are also expected to depend, at least in part, upon the context in which such terms are used. Typically, “or” if used to associate a list, such as A, B, or C, is intended to mean A, B, and C, here used in the inclusive sense, as well as A, B, or C, here used in the exclusive sense. In addition, the term “one or more” as used herein may be used to describe any feature, structure, or characteristic in the singular or may be used to describe some combination of features, structures, or characteristics. However, it should be noted that this is merely an illustrative example and claimed subject matter is not limited to this example. Furthermore, the term “at least one of” if used to associate a list, such as A, B, or C, can be interpreted to mean any combination of A, B, and/or C, such as A, AB, AA, AAB, AABBCCC, etc.
Having described several embodiments, various modifications, alternative constructions, and equivalents may be used without departing from the scope of the disclosure. For example, the above elements may merely be a component of a larger system, wherein other rules may take precedence over or otherwise modify the application of the various embodiments. Also, a number of steps may be undertaken before, during, or after the above elements are considered. Accordingly, the above description does not limit the scope of the disclosure.
In view of this description embodiments may include different combinations of features. Implementation examples are described in the following numbered clauses:
Clause 1. A method for inertial navigation, including obtaining inertial measurement unit (IMU) data from an IMU, generating, based on the IMU data, a respective pose measurement vector according to each of a plurality of machine-learning models, resulting in a plurality of pose measurement vectors, where each of the plurality of pose measurement vectors is associated with a respective one of multiple motion classes, and determining a device pose estimate based on the IMU data and the plurality of pose measurement vectors.
Clause 2. The method of clause 1, including updating a state of a multi-measurement estimation filter bank based on the IMU data and each of the plurality of pose measurement vectors, and determining the device pose estimate based on the updated state of the multi-measurement estimation filter bank.
Clause 3. The method of clause 2, where the multi-measurement estimation filter bank comprises a state buffer, and where updating the state of the multi-measurement estimation filter bank based on the IMU data and each of the plurality of pose measurement vectors includes performing, for each of the plurality of pose measurement vectors, a respective update of the state buffer.
Clause 4. The method of any of clauses 1 to 3, where each pose measurement vector of the plurality of pose measurement vectors includes one or more measurement parameters, including a respective measurement parameter for each of one or more dimensions, and one or more uncertainty parameters, including a respective uncertainty parameter for each of the one or more measurement parameters.
Clause 5. The method of clause 4, where for each of the one or more dimensions, the one or more measurement parameters include a respective position measurement parameter, a respective orientation measurement parameter, or a combination of both, and for each of the one or more dimensions, the one or more uncertainty parameters include a respective position uncertainty parameter, a respective orientation uncertainty parameter, or a combination of both.
Clause 6. The method of any of clauses 1 to 5, where the device pose estimate includes, for each of the one or more dimensions, a respective position estimate parameter, a respective orientation estimate parameter, or a combination of both.
Clause 7. The method of clause 6, including, for each of the one or more dimensions, determining the respective position estimate parameter for that dimension by weighting respective position measurement parameters for that dimension among position measurement parameters of the plurality of pose measurement vectors according to respective position uncertainty parameters for that dimension among position uncertainty parameters of the plurality of pose measurement vectors, determining the respective orientation estimate parameter for that dimension by weighting respective orientation measurement parameters for that dimension among orientation measurement parameters of the plurality of pose measurement vectors according to respective orientation uncertainty parameters for that dimension among orientation uncertainty parameters of the plurality of pose measurement vectors, or a combination of the above.
Clause 8. The method of any of clauses 1 to 7, where each of the multiple motion classes corresponds to a different respective time interval value.
Clause 9. The method of any of clauses 1 to 8, including selecting at least one of the plurality of machine-learning models based on one or more navigation context parameters.
Clause 10. The method of clause 9, including adjusting uncertainty parameters associated with a machine-learning model among the plurality of machine-learning models based on the one or more navigation context parameters, resulting in adjusted uncertainty parameters, and weighting a pose measurement vector associated with the machine-learning model according to the adjusted uncertainty parameters.
Clause 11. The method of any of clauses 1 to 10, including obtaining sensor information from one or more sensors, and determining the device pose estimate based on the IMU data, each of the plurality of pose measurement vectors, and the sensor information.
Clause 12. The method of clause 11, where the one or more sensors include one or more of a camera, a radar sensor, a pressure sensor, an ultrasound sensor, and a global navigation satellite system (GNSS) receiver.
Clause 13. The method of any of clauses 11 to 12, including selecting, based on uncertainty parameters of the plurality of pose measurement vectors, one or more types of supplemental measurements as inputs for determining the device pose estimate, and obtaining the sensor information from the one or more sensors based on the one or more selected types of supplemental measurements.
Clause 14. An apparatus for inertial navigation, including an inertial measurement unit (IMU), a transceiver, a memory, and one or more processors communicatively coupled with the transceiver and the memory, where the one or more processors are configured to obtain IMU data from the IMU, generate, based on the IMU data, a respective pose measurement vector according to each of a plurality of machine-learning models, resulting in a plurality of pose measurement vectors, where each of the plurality of pose measurement vectors is associated with a respective one of multiple motion classes, and determine a device pose estimate based on the IMU data and each of the plurality of pose measurement vectors.
Clause 15. The apparatus of clause 14, where to determine the device pose estimate based on the IMU data and each of the plurality of pose measurement vectors, the one or more processors are configured to update a state of a multi-measurement estimation filter bank based on the IMU data and each of the plurality of pose measurement vectors, and determine the device pose estimate based on the updated state of the multi-measurement estimation filter bank.
Clause 16. The apparatus of clause 15, where the multi-measurement estimation filter bank comprises a state buffer, and where to update the state of the multi-measurement estimation filter bank based on the IMU data and each of the plurality of pose measurement vectors, the one or more processors are configured to perform, for each of the plurality of pose measurement vectors, a respective update of the state buffer.
Clause 17. The apparatus of any of clauses 14 to 16, where each pose measurement vector of the plurality of pose measurement vectors includes one or more measurement parameters, including a respective measurement parameter for each of one or more dimensions, and one or more uncertainty parameters, including a respective uncertainty parameter for each of the one or more measurement parameters.
Clause 18. The apparatus of clause 17, where for each of the one or more dimensions, the one or more measurement parameters include a respective position measurement parameter, a respective orientation measurement parameter, or a combination of both, and for each of the one or more dimensions, the one or more uncertainty parameters include a respective position uncertainty parameter, a respective orientation uncertainty parameter, or a combination of both.
Clause 19. The apparatus of any of clauses 14 to 18, where the device pose estimate includes, for each of the one or more dimensions, a respective position estimate parameter, a respective orientation estimate parameter, or a combination of both.
Clause 20. The apparatus of clause 19, where to determine the device pose estimate based on the IMU data and each of the plurality of pose measurement vectors, the one or more processors are configured to, for each of the one or more dimensions, determine the respective position estimate parameter for that dimension by weighting respective position measurement parameters for that dimension among position measurement parameters of the plurality of pose measurement vectors according to respective position uncertainty parameters for that dimension among position uncertainty parameters of the plurality of pose measurement vectors, determine the respective orientation estimate parameter for that dimension by weighting respective orientation measurement parameters for that dimension among orientation measurement parameters of the plurality of pose measurement vectors according to respective orientation uncertainty parameters for that dimension among orientation uncertainty parameters of the plurality of pose measurement vectors, or a combination of the above.
Clause 21. The apparatus of any of clauses 14 to 20, where each of the multiple motion classes corresponds to a different respective time interval value.
Clause 22. The apparatus of any of clauses 14 to 21, where the one or more processors are configured to select at least one of the plurality of machine-learning models based on one or more navigation context parameters.
Clause 23. The apparatus of clause 22, where the one or more processors are configured to adjust uncertainty parameters associated with a machine-learning model among the plurality of machine-learning models based on the one or more navigation context parameters, resulting in adjusted uncertainty parameters, and weight a pose measurement vector associated with the machine-learning model according to the adjusted uncertainty parameters.
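One plausible reading of clauses 22 and 23, sketched here with hypothetical names and a made-up scale factor, is that a model associated with a mismatched navigation context has its reported uncertainty parameters inflated, which in turn shrinks the weight its pose measurement vector receives during fusion:

```python
def adjust_uncertainties(sigmas, model_context, current_context, mismatch_scale=5.0):
    """Inflate a model's uncertainty parameters when the navigation
    context it was trained for differs from the current context."""
    scale = 1.0 if model_context == current_context else mismatch_scale
    return [s * scale for s in sigmas]

# A "walking" model used while "driving" is down-weighted via larger sigmas.
adjusted = adjust_uncertainties([0.2, 0.3], "walking", "driving")
```

Because fusion weights are typically inversely related to uncertainty, inflating the sigmas is equivalent to down-weighting that model's pose measurement vector without discarding it outright.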
Clause 24. The apparatus of any of clauses 14 to 23, where the one or more processors are configured to obtain sensor information from one or more sensors, and determine the device pose estimate based on the IMU data, each of the plurality of pose measurement vectors, and the sensor information.
Clause 25. The apparatus of clause 24, where the one or more sensors include one or more of a camera, a radar sensor, a pressure sensor, an ultrasound sensor, and a global navigation satellite system (GNSS) receiver.
Clause 26. The apparatus of any of clauses 24 to 25, where the one or more processors are configured to select, based on uncertainty parameters of the plurality of pose measurement vectors, one or more types of supplemental measurements as inputs for determining the device pose estimate, and obtain the sensor information from the one or more sensors based on the one or more selected types of supplemental measurements.
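The selection step of clause 26 can be sketched as a simple threshold rule; the sensor names, dictionary layout, and threshold below are assumptions for illustration only, not the claimed selection logic.

```python
def select_supplemental_measurements(uncertainty, threshold=0.5):
    """Choose supplemental measurement types when IMU-derived pose
    measurement uncertainties exceed a threshold."""
    selected = set()
    if max(uncertainty["position"]) > threshold:
        selected.update({"gnss", "camera"})   # aid drifting position
    if max(uncertainty["orientation"]) > threshold:
        selected.add("camera")                # aid orientation
    return sorted(selected)

# High position uncertainty triggers GNSS and camera aiding.
types = select_supplemental_measurements(
    {"position": [0.9, 0.1, 0.1], "orientation": [0.1, 0.1, 0.1]})
```

Gating supplemental sensors on uncertainty in this way reflects the power-consumption motivation noted in the disclosure: additional modalities are activated only when IMU-only estimates degrade.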
Clause 27. An apparatus for inertial navigation, including means for obtaining inertial measurement unit (IMU) data from an IMU, means for generating, based on the IMU data, a respective pose measurement vector according to each of a plurality of machine-learning models, resulting in a plurality of pose measurement vectors, where each of the plurality of pose measurement vectors is associated with a respective one of multiple motion classes, and means for determining a device pose estimate based on the IMU data and each of the plurality of pose measurement vectors.
Clause 28. The apparatus of clause 27, where the means for determining the device pose estimate based on the IMU data and each of the plurality of pose measurement vectors includes means for updating a state of a multi-measurement estimation filter bank based on the IMU data and each of the plurality of pose measurement vectors, and means for determining the device pose estimate based on the updated state of the multi-measurement estimation filter bank.
Clause 29. The apparatus of clause 28, where the multi-measurement estimation filter bank comprises a state buffer, and where the means for updating the state of the multi-measurement estimation filter bank based on the IMU data and each of the plurality of pose measurement vectors includes means for performing, for each of the plurality of pose measurement vectors, a respective update of the state buffer.
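Clauses 28 and 29 describe a multi-measurement estimation filter bank whose state buffer receives a respective update for each pose measurement vector. The sketch below shows only that one-update-per-vector sequencing; the buffer contents and the scalar update rule are placeholders, not the claimed filter.

```python
class FilterBankStateBuffer:
    """Toy state buffer: each incoming pose measurement vector triggers
    its own update, so the buffer records one state per vector."""
    def __init__(self):
        self.history = []   # past states, newest last
        self.state = 0.0    # placeholder scalar state

    def update(self, measurement, gain=0.5):
        # Placeholder correction step standing in for the real filter update.
        self.state += gain * (measurement - self.state)
        self.history.append(self.state)

bank = FilterBankStateBuffer()
for vector in [1.0, 1.0, 3.0]:   # one update per pose measurement vector
    bank.update(vector)
```

The device pose estimate would then be read from the updated buffer state, per clause 28.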
Clause 30. The apparatus of any of clauses 27 to 29, where each pose measurement vector of the plurality of pose measurement vectors includes one or more measurement parameters, including a respective measurement parameter for each of one or more dimensions, and one or more uncertainty parameters, including a respective uncertainty parameter for each of the one or more measurement parameters.
Clause 31. The apparatus of clause 30, where for each of the one or more dimensions, the one or more measurement parameters include a respective position measurement parameter, a respective orientation measurement parameter, or a combination of both, and for each of the one or more dimensions, the one or more uncertainty parameters include a respective position uncertainty parameter, a respective orientation uncertainty parameter, or a combination of both.
Clause 32. The apparatus of any of clauses 27 to 31, where the device pose estimate includes, for each of the one or more dimensions, a respective position estimate parameter, a respective orientation estimate parameter, or a combination of both.
Clause 33. The apparatus of clause 32, including means for, for each of the one or more dimensions, determining the respective position estimate parameter for that dimension by weighting respective position measurement parameters for that dimension among position measurement parameters of the plurality of pose measurement vectors according to respective position uncertainty parameters for that dimension among position uncertainty parameters of the plurality of pose measurement vectors, determining the respective orientation estimate parameter for that dimension by weighting respective orientation measurement parameters for that dimension among orientation measurement parameters of the plurality of pose measurement vectors according to respective orientation uncertainty parameters for that dimension among orientation uncertainty parameters of the plurality of pose measurement vectors, or a combination of the above.
Clause 34. The apparatus of any of clauses 27 to 33, where each of the multiple motion classes corresponds to a different respective time interval value.
Clause 35. The apparatus of any of clauses 27 to 34, including means for selecting at least one of the plurality of machine-learning models based on one or more navigation context parameters.
Clause 36. The apparatus of clause 35, including means for adjusting uncertainty parameters associated with a machine-learning model among the plurality of machine-learning models based on the one or more navigation context parameters, resulting in adjusted uncertainty parameters, and means for weighting a pose measurement vector associated with the machine-learning model according to the adjusted uncertainty parameters.
Clause 37. The apparatus of any of clauses 27 to 36, including means for obtaining sensor information from one or more sensors, and means for determining the device pose estimate based on the IMU data, each of the plurality of pose measurement vectors, and the sensor information.
Clause 38. The apparatus of clause 37, where the one or more sensors include one or more of a camera, a radar sensor, a pressure sensor, an ultrasound sensor, and a global navigation satellite system (GNSS) receiver.
Clause 39. The apparatus of any of clauses 37 to 38, including means for selecting, based on uncertainty parameters of the plurality of pose measurement vectors, one or more types of supplemental measurements as inputs for determining the device pose estimate, and means for obtaining the sensor information from the one or more sensors based on the one or more selected types of supplemental measurements.
Clause 40. A non-transitory computer-readable medium storing instructions for inertial navigation, the instructions including code for obtaining inertial measurement unit (IMU) data from an IMU, generating, based on the IMU data, a respective pose measurement vector according to each of a plurality of machine-learning models, resulting in a plurality of pose measurement vectors, where each of the plurality of pose measurement vectors is associated with a respective one of multiple motion classes, and determining a device pose estimate based on the IMU data and the plurality of pose measurement vectors.
Clause 41. The non-transitory computer-readable medium of clause 40, the instructions including code for updating a state of a multi-measurement estimation filter bank based on the IMU data and each of the plurality of pose measurement vectors, and determining the device pose estimate based on the updated state of the multi-measurement estimation filter bank.
Clause 42. The non-transitory computer-readable medium of clause 41, where the multi-measurement estimation filter bank comprises a state buffer, and where updating the state of the multi-measurement estimation filter bank based on the IMU data and each of the plurality of pose measurement vectors includes performing, for each of the plurality of pose measurement vectors, a respective update of the state buffer.
Clause 43. The non-transitory computer-readable medium of any of clauses 40 to 42, where each pose measurement vector of the plurality of pose measurement vectors includes one or more measurement parameters, including a respective measurement parameter for each of one or more dimensions, and one or more uncertainty parameters, including a respective uncertainty parameter for each of the one or more measurement parameters.
Clause 44. The non-transitory computer-readable medium of clause 43, where for each of the one or more dimensions, the one or more measurement parameters include a respective position measurement parameter, a respective orientation measurement parameter, or a combination of both, and for each of the one or more dimensions, the one or more uncertainty parameters include a respective position uncertainty parameter, a respective orientation uncertainty parameter, or a combination of both.
Clause 45. The non-transitory computer-readable medium of any of clauses 40 to 44, where the device pose estimate includes, for each of the one or more dimensions, a respective position estimate parameter, a respective orientation estimate parameter, or a combination of both.
Clause 46. The non-transitory computer-readable medium of clause 45, the instructions including code for, for each of the one or more dimensions, determining the respective position estimate parameter for that dimension by weighting respective position measurement parameters for that dimension among position measurement parameters of the plurality of pose measurement vectors according to respective position uncertainty parameters for that dimension among position uncertainty parameters of the plurality of pose measurement vectors, determining the respective orientation estimate parameter for that dimension by weighting respective orientation measurement parameters for that dimension among orientation measurement parameters of the plurality of pose measurement vectors according to respective orientation uncertainty parameters for that dimension among orientation uncertainty parameters of the plurality of pose measurement vectors, or a combination of the above.
Clause 47. The non-transitory computer-readable medium of any of clauses 40 to 46, where each of the multiple motion classes corresponds to a different respective time interval value.
Clause 48. The non-transitory computer-readable medium of any of clauses 40 to 47, the instructions including code for selecting at least one of the plurality of machine-learning models based on one or more navigation context parameters.
Clause 49. The non-transitory computer-readable medium of clause 48, the instructions including code for adjusting uncertainty parameters associated with a machine-learning model among the plurality of machine-learning models based on the one or more navigation context parameters, resulting in adjusted uncertainty parameters, and weighting a pose measurement vector associated with the machine-learning model according to the adjusted uncertainty parameters.
Clause 50. The non-transitory computer-readable medium of any of clauses 40 to 49, the instructions including code for obtaining sensor information from one or more sensors, and determining the device pose estimate based on the IMU data, each of the plurality of pose measurement vectors, and the sensor information.
Clause 51. The non-transitory computer-readable medium of clause 50, where the one or more sensors include one or more of a camera, a radar sensor, a pressure sensor, an ultrasound sensor, and a global navigation satellite system (GNSS) receiver.
Clause 52. The non-transitory computer-readable medium of any of clauses 50 to 51, the instructions including code for selecting, based on uncertainty parameters of the plurality of pose measurement vectors, one or more types of supplemental measurements as inputs for determining the device pose estimate, and obtaining the sensor information from the one or more sensors based on the one or more selected types of supplemental measurements.