The invention relates generally to computer-vision-based object tracking and guidance apparatuses.
E-commerce continues to see significant year-over-year growth and is expected to do so for the foreseeable future. Many online retailers ship purchased goods to a customer's front door. With the rise of “porch pirates”, namely, people who steal packages from customers' porches or front-door areas, many customers want their online orders shipped to a store, where the purchased goods await their pickup. This process has the further advantage of saving money on shipping costs. Retailers are thus leveraging their brick-and-mortar stores to fulfill online sales, which increases customer foot traffic at their sites, wins more customers, and results in more sales volume.
Retailers, however, are not equipped to efficiently handle in-store pickups. Most buy-online-pickup-in-store (BOPIS) solutions are expensive and require additional staff or significant changes in operation. A poorly designed pickup process can cause delay and frustrate customers. Once a customer has had a bad pickup experience, he or she is unlikely to try in-store pickup again. Other self-pickup solutions, such as package lockers and package towers, are expensive, restrictive, fixed in place, and take up space, and staffing a pickup counter takes staff away from the business of selling or other more productive business operations.
All examples and features mentioned below can be combined in any technically possible way.
In one aspect, the invention is related to an apparatus comprising a mount body by which to secure the apparatus to a structure and a camera assembly fixed to the mount body. The camera assembly includes an image sensor that captures images within its field of view. The apparatus further comprises a lighting assembly rotatably connected to the mount body. The lighting assembly houses one or more light sources including a directional light source secured to a laser assembly. A control-board assembly, fixed to the mount body, houses control boards that are in electrical communication with the camera assembly to acquire the images captured by the image sensor and with the lighting assembly to control operation of the one or more light sources. The control boards include one or more processors configured to acquire information about an object, to associate a location within the field of view of the image sensor with the object, to point light emitted by the directional light source at the location associated with the object by rotating the lighting assembly and turning the laser assembly, and, based on an image acquired from the camera assembly, to detect change within the field of view of the image sensor corresponding to placement or removal of the object.
In some embodiments, the camera assembly further comprises a depth sensor fixed to a mounting surface and a plurality of support mounts of different heights attached to a frame of the camera assembly, and the image sensor is mounted to a board held by the plurality of support mounts at a non-zero offset angle relative to the mounting surface upon which the depth sensor is fixed. The support mounts can have rivet holes, and the camera assembly can further comprise push rivets that pass through the board into the rivet holes of the support mounts to secure the image sensor within the camera assembly.
In some embodiments, the mount body has a channel extending therethrough. The channel has opposing upper and lower surfaces and a sidewall therebetween. The sidewall has two angled surfaces that determine a full range of angles at which the mount body can be mounted to a rail. One of the surfaces of the channel has a retaining boss extending therefrom. The retaining boss is located on the one surface to align with a groove of the rail. The retaining boss has a size that fits closely within the groove of the rail. The apparatus may further comprise a bracket with two arms and a mounting surface, and a channel bar attached between ends of the two arms. The channel bar has dimensions adapted to fit closely within and pass through the channel of the mount body. In another embodiment, the bracket has two opposing walls and a sidewall disposed therebetween, and the mount body includes a pair of flanges, one flange of the pair on each side of the mount body, each flange having an opening therein. A first wall of the two walls of the bracket enters the channel of the mount body and has openings that align with the openings of the flanges for receiving fasteners therethrough that secure the first wall to the flanges. A second wall of the two walls has openings therein for receiving fasteners therethrough that secure the second wall to a surface.
In another aspect, the invention is related to an apparatus comprising a mount body, a lighting assembly, attached to the mount body, that houses a directional light source, and a camera assembly, attached to the mount body, that houses an RGB (red green blue) camera and a depth camera that capture image information within their fields of view. The camera assembly has a mounting surface upon which the depth camera is fixed and a plurality of support mounts of different heights attached to a frame of the camera assembly. The RGB camera is mounted to a board supported by the plurality of support mounts of different heights and held at a non-zero offset angle relative to the mounting surface upon which the depth camera is fixed. The apparatus further comprises a control-board assembly that is attached to the mount body. The control-board assembly is in communication with the camera assembly to receive image information captured by the cameras and with the lighting assembly to control operation of the directional light source. The control-board assembly houses control boards that include a processor configured to receive and process images captured by the camera assembly and to operate the directional light source in response to the processed images.
The support mounts may have rivet holes, and the camera assembly may further comprise push rivets that pass through the board into the rivet holes of the support mounts to secure the RGB camera within the camera assembly. The mount body may have a channel extending therethrough. The channel has opposing upper and lower surfaces and a sidewall therebetween. The sidewall has two angled surfaces that determine a full range of angles at which the mount body can be mounted to a rail. One of the surfaces of the channel may have a retaining boss extending therefrom. The retaining boss is located and sized to align with and fit within a groove of the rail.
The apparatus may further comprise a bracket with two arms that meet at a mounting surface, and a channel bar attached between ends of the two arms. The channel bar has dimensions adapted to fit closely within and pass through the channel of the mount body. In another embodiment, the bracket has two opposing walls and a sidewall disposed therebetween, and the mount body includes a pair of flanges, one flange of the pair on each side of the mount body, each flange having an opening therein. A first wall of the two walls of the bracket enters the channel of the mount body and has openings that align with the openings of the flanges for receiving fasteners therethrough that secure the first wall to the flanges. A second wall of the two walls has openings therein for receiving fasteners therethrough that secure the second wall to a surface.
In another aspect, the invention is related to an apparatus comprising a mount body by which to secure the apparatus to a rail. The mount body has a channel sized to receive the rail therethrough. The channel has a sidewall disposed between opposing walls. The sidewall has multiple angled surfaces that determine a full range of angles at which the rail can be secured to the mount body. The apparatus further comprises a camera assembly housing a camera, a light-guidance assembly, and a control-board assembly. The camera assembly is attached to the mount body such that the camera has a field of view that faces downwards when the apparatus is secured to the rail. The light-guidance assembly is rotatably attached to the mount body and houses one or more light sources. The control-board assembly is attached to the mount body and is in communication with the camera assembly to receive image information captured by the camera and with the light-guidance assembly to control operation of the one or more light sources. The control-board assembly houses control boards configured to receive and process images captured by the camera assembly and to rotate the light-guidance assembly and operate the one or more light sources in response to the processed images.
One of the surfaces of the channel may have a retaining boss extending therefrom. The retaining boss is located and sized to align with and fit within a groove of the rail.
The apparatus may further comprise a bracket with two arms that end at a mounting surface, and a channel bar attached between ends of the two arms. The channel bar has dimensions adapted to fit closely within and pass through the channel of the mount body. In an alternative embodiment, the bracket has two opposing walls and a sidewall disposed therebetween, and the mount body includes a pair of flanges, one flange of the pair on each side of the mount body, each flange having an opening therein. A first wall of the two walls of the bracket enters the channel of the mount body and has openings that align with the openings of the flanges for receiving fasteners therethrough that secure the first wall to the flanges. A second wall of the two walls has openings therein for receiving fasteners therethrough that secure the second wall to a surface.
In some embodiments, the camera assembly has a depth sensor fixed to a mounting surface and a plurality of support mounts of different heights attached to a frame of the camera assembly, and the image sensor is mounted to a board supported by the plurality of support mounts and held at a non-zero offset angle relative to the mounting surface to which the depth sensor is fixed. The support mounts may have rivet holes, and the camera assembly may further comprise push rivets that pass through the board into the rivet holes of the support mounts to secure the image sensor within the camera assembly.
In one embodiment, the one or more light sources includes a directional light source fixed to a laser assembly, and the apparatus further comprises a first motor operably coupled to the lighting assembly to pan the directional light source horizontally and a second motor operably coupled to the laser assembly to tilt the directional light source vertically.
The above and further advantages of this invention may be better understood by referring to the following description in conjunction with the accompanying drawings, in which like numerals indicate like structural elements and features in various figures. The drawings are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention.
Computer-vision-based object tracking and guidance apparatuses described herein can be used to provide a secure, self-service, buy-online-pickup-in-store (BOPIS) solution without the aforementioned shortcomings of package lockers, package towers, and staffed pickup counters. Embodiments of such apparatuses, referred to herein as modules, enable users to locate, identify, and pick up items, such as packages, using light and audio cues.
In brief overview, a module can register and track objects within the module's field of view and, additionally or alternatively, guide users to specific objects using light, audio, or both. The module comprises a computer-vision system connected to and controlling a guidance system. The computer-vision system includes an image sensor, a depth sensor, or both, connected to a data-processing unit capable of executing image-processing algorithms. The guidance system contains a directional light source and a mechanical and/or electrical system for operating and orienting the directional light source and/or an audio system.
During operation of the module, the data-processing unit acquires information or data about an object. The information may include, for example, a product description, package dimensions, and addressor and addressee data. A code reader may acquire the information from a label affixed to or adjacent the object and transmit that information to the module. This object may be in the process of being placed within or picked up from the module's field of view, typically on a shelf or other support surface.
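By way of illustration only, the acquired information might be represented and ingested as in the following sketch, which assumes a JSON payload from the code reader; all field names here are hypothetical rather than part of the above description.

```python
import json
from dataclasses import dataclass

@dataclass
class ObjectInfo:
    """Information acquired about an object (all field names are illustrative)."""
    object_id: str        # e.g., an order or tracking number from the label
    description: str      # product description
    dimensions_mm: tuple  # package dimensions (length, width, height)
    addressor: str        # sender
    addressee: str        # recipient picking up the object

def ingest_label(payload: str) -> ObjectInfo:
    """Parse a hypothetical JSON payload transmitted by a code reader."""
    data = json.loads(payload)
    return ObjectInfo(
        object_id=data["id"],
        description=data["description"],
        dimensions_mm=tuple(data["dimensions_mm"]),
        addressor=data["from"],
        addressee=data["to"],
    )
```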
In the case of object placement, the guidance system can direct light at or illuminate the location where the object should be placed and/or play audio instructing the user where the object should be placed. The computer-vision system can then detect the presence and location of an object within the module's field of view based on changes detected in one or more images captured by the camera assembly and determine whether the placement of the object occurred as expected. If object placement is correct, the data-processing unit registers the object at the placement location. The module can further signify correct placement by illuminating a green light (LED), for example, or by audibly announcing successful placement. Conversely, the module can signify incorrect placement by illuminating a red light (LED), for example, or by audibly announcing detection of an error.
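One plausible implementation of the change detection described above is frame differencing restricted to the expected placement location. The sketch below uses OpenCV and NumPy; the region-of-interest convention and the threshold values are assumptions, not values taken from this description.

```python
import cv2
import numpy as np

def placement_detected(before: np.ndarray, after: np.ndarray,
                       roi: tuple, min_changed_fraction: float = 0.2) -> bool:
    """Return True if the region of interest (x, y, width, height, in pixels)
    changed enough between two BGR frames to indicate an object was placed."""
    x, y, w, h = roi
    before_gray = cv2.cvtColor(before[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    after_gray = cv2.cvtColor(after[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(before_gray, after_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    changed_fraction = cv2.countNonZero(mask) / float(w * h)
    return changed_fraction >= min_changed_fraction
```

On a positive result, the data-processing unit could register the object at the location and drive the green LED; on a negative result, the red LED.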
In the case of picking up the object, the data-processing unit determines the registered location of the object being picked up based on the information acquired about the object, and the light-guidance system can direct light at or illuminate the object at that location or audibly direct a user to the location. The computer-vision system can then detect whether the object has been removed from that location based on changes detected in the one or more images captured by the camera assembly. From the captured images, the computer-vision system can also determine whether the wrong package has been removed. As in the instance of object placement, the module can use light guidance (e.g., illuminating a red or green light) to signify success or failure.
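The pickup flow above presupposes a registry that maps object identifiers to their registered locations. A minimal sketch of such a registry follows; its structure is an assumption, and it reuses the hypothetical placement_detected() function from the previous sketch, since removal also changes the registered region.

```python
class ObjectRegistry:
    """Maps object identifiers to registered locations (illustrative only)."""

    def __init__(self):
        self._locations = {}  # object_id -> (x, y, w, h) in image pixels

    def register(self, object_id: str, roi: tuple) -> None:
        self._locations[object_id] = roi

    def lookup(self, object_id: str):
        return self._locations.get(object_id)

    def confirm_removal(self, object_id: str, before, after, detector) -> bool:
        """Verify the object left its registered location; `detector` may be
        the placement_detected() sketch above."""
        roi = self.lookup(object_id)
        if roi is None:
            return False
        if detector(before, after, roi):
            del self._locations[object_id]  # deregister on confirmed removal
            return True
        return False
```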
Example applications of modules described herein can be found in U.S. Pat. No. 10,148,918, issued Dec. 4, 2018; in U.S. application Ser. No. 15/861,414, U.S. Pat. Pub. No. US20180197139, published Jul. 12, 2018, titled “Modular Shelving Systems for Package Tracking”; and in U.S. application Ser. No. 15/259,474, U.S. Pat. Pub. No. US20180068266, published Mar. 8, 2018, titled “System and Method of Object Tracking Using Weight Confirmation,” the entireties of which U.S. patent and U.S. published patent applications are incorporated by reference herein for all purposes.
The control-board assembly 102 has a panel 110 (encircled by circle A) with various electrical connectors 112 for communicating with control boards housed within the control-board assembly 102. In general, the control boards perform the operations of object registration, image processing, object tracking, and light guidance.
The mount body 104 has, in the shown embodiment, two joined sections: an upper mount section 114 and a lower mount section 116. The joined sections of the mount body form a channel 118 that receives a rail (not shown). The channel 118 is defined by opposing upper and lower interior surfaces of the upper and lower mount sections 114, 116, respectively, and a sidewall disposed therebetween. This sidewall has two angled surfaces, described further below, that determine the range of angles at which the mount body 104 can be mounted to the rail.
The camera assembly 106 has an RGB camera (i.e., image sensor) 128, a depth sensor 130, and side vents 132 that allow heat generated by the internal optical sensors 128, 130 to leave the assembly. The RGB camera 128 provides color information, and the depth sensor 130 provides an estimated depth for each pixel of a captured image. The slant of the raised arm 126 holds the camera assembly such that the fields of view of the RGB camera 128 and of the depth sensor 130 face forward and generally downwards. One embodiment of the camera assembly 106 has no depth sensor 130. References made herein to the field of view of the camera assembly or to the field of view of the module correspond to the field of view of the camera 128 or to the intersection (i.e., overlap) of the fields of view of the camera 128 and depth sensor 130. In one embodiment, the camera 128 and depth sensor 130 are each capable of data acquisition at a distance of 3 meters, and the module 100 monitors a 4-foot wide by 8-foot high by 1.5-foot deep zone, depending on the distance of the module from its target viewing area and/or on the fields of view of the RGB camera and depth sensor. This zone is monitored for changes in depth and/or color. The control boards of the control-board assembly 102 are in communication with the camera and optional depth sensor via wiring that runs from the camera assembly directly to a camera receptacle 202 of the control-board assembly.
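The extent of the monitored zone follows from the mounting distance and the sensors' fields of view. The sketch below computes the width and height of the viewed area at a given distance; the example field-of-view angles are assumptions, since the description above does not specify lens angles.

```python
import math

def viewed_area(distance_m: float, hfov_deg: float, vfov_deg: float):
    """Width and height (meters) of the area covered at a given distance,
    from w = 2 * d * tan(fov / 2)."""
    width = 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)
    height = 2.0 * distance_m * math.tan(math.radians(vfov_deg) / 2.0)
    return width, height

# Example: at the 3-meter acquisition distance with an assumed 69° x 42°
# field of view, roughly a 4.1 m x 2.3 m area is covered.
print(viewed_area(3.0, 69.0, 42.0))
```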
The lighting assembly 108 has a translucent dome-shaped cover 134 with a frontally located slot 136. The slot 136 runs vertically along the side of the cover 134 and extends along the bottom (or crown) of the dome-shaped cover 134. Directed light (e.g., laser light), when activated, originates from within the lighting assembly and passes through this slot 136 in a direction determined by the electronics on the control boards of the control-board assembly 102. The control boards are in communication with one or more light sources (not shown) in the lighting assembly (via wiring that runs from the lighting assembly, through the mount body, and into an opening in the base of the control-board assembly) to control each light source in order to provide light guidance to certain objects or areas within the field of view of the camera assembly 106, depending upon the object or region of interest. A pan pivot base 138 is fixed to the lower mount section 116.
The lighting assembly 108 further comprises a laser tilt base 139, which is rotatably coupled to the pan pivot base 138. The dome-shaped cover 134 is removably secured to the laser tilt base 139 by three snap hooks 140, only two of which are visible.
The dome-shaped cover 134 has three tabs (of which tabs 142-1 and 142-2 (generally, 142) are shown). The third tab is located on the far side of the dome-shaped cover, directly opposite the laser slot. The two tabs 142-1, 142-2 are each spaced 135 degrees from the far-side tab, one on either side of it. The uneven spacing between the tabs ensures there is only one way to attach the cover 134 to the laser tilt base 139, ensuring correct assembly of the dome-shaped cover. The dome-shaped cover 134 is effectively keyed by its three indexing tabs 142.
When the laser slot 136 faces forward, in line with the camera assembly 106, the laser tilt base 139 is considered to be at center. The rotatable laser tilt base 139 can rotate a total of 60°: 30° to either side of center. When the laser tilt base 139 rotates, the internally located laser (not shown) and the dome-shaped cover rotate with it, thereby changing the direction in which the laser points and towards which the laser slot faces.
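Because the laser tilt base 139 can rotate only 30° to either side of center, any commanded pan angle must be clamped to that range before being converted to motor steps. A minimal sketch follows; the step size is an assumed value, not one given above.

```python
def pan_to_steps(pan_deg: float, deg_per_step: float = 0.9) -> int:
    """Clamp a requested pan angle to the ±30° mechanical range of the
    laser tilt base and convert it to stepper-motor steps."""
    clamped = max(-30.0, min(30.0, pan_deg))
    return round(clamped / deg_per_step)
```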
When deployed for operation, the module 100 is mounted in a fixed position with its RGB camera 128 and optional depth camera 130 facing a target area of interest, for example, a supporting surface or an object-holding area. Examples of the supporting surface include, but are not limited to, desktops, tables, shelves, and floor space. The object-holding area can be in a store, supermarket, warehouse, business enterprise, inventory area, room, closet, hallway, cupboard, or locker, each with or without secured access. Examples of identified and tracked objects include, but are not limited to, packages, parcels, boxes, equipment, tools, food products, bottles, jars, and cans. (People may also be identified and tracked.) Each optical sensor 128, 130 has its own perspective of the area and of the objects placed on the supporting surface.
Modules 100 may be adjustably mounted, for example, on a sliding rail in a surveillance configuration so that all corners of an enterprise are covered. Although particularly suited for mounting to an overhead rail, modules can also be secured to other types of structures, for example, walls, posts, shelves, and pillars. In general, these modules are small and non-intrusive and can track the identifications and paths of individuals through the enterprise, for example, as described in U.S. Pat. Pub. No. US-2018-0164103-A1, published Jun. 14, 2018, titled “System and Method of Personalized Navigation inside a Business Enterprise,” the entirety of which application is incorporated by reference herein.
By the POE+ port 200, also called an RJ45 receptacle, the module 100 can be added to a network and communicated with remotely over the Internet. For example, over a network connection, the module may communicate with one or more servers (i.e., a server system), which may perform third-party services, such as “cloud services”, for the module. As used herein, the “cloud” refers to software and services that run on a remote network, such as the Internet. In addition, the POE+ connection supplies power to the module and supports other operations, for example, firmware updates and remote troubleshooting.
Through the RGB camera receptacle 202, a device (e.g., a computer) may communicate with and operate the camera 128. The motor/optical sensor receptacle 204 allows a device to communicate with and control the pan and tilt stepper motors and the optical sensor boards used for pan and tilt motion of a laser gimbal 1310.
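To point the laser at a location within the camera's field of view, the control boards must convert that location into pan and tilt angles for the gimbal. The following is a simplified sketch that assumes the laser shares the camera's origin; a real module would correct for the fixed mechanical offset between the two.

```python
import math

def aim_angles(x_m: float, y_m: float, z_m: float):
    """Pan and tilt (degrees) toward a point (x, y, z) in a camera-centered
    frame (x right, y down, z forward), e.g., as reported by a depth sensor."""
    pan = math.degrees(math.atan2(x_m, z_m))
    tilt = math.degrees(math.atan2(y_m, math.hypot(x_m, z_m)))
    return pan, tilt
```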
Above the RJ45 receptacle 200 are the depth sensor receptacle 208, an HDMI (High-Definition Multimedia Interface) port 210, a 5 V DC power input port 212, and a power button 214. The depth sensor receptacle 208 enables communication with the depth sensor 130 of the camera assembly 106. By the HDMI port 210, the module 100 can transmit streams of audio and video to another device (e.g., a high-definition television or display). The power button 214 turns power on and off to the processor board (not shown) within the control-board assembly 102.
Extending from the upper mount section 114 is the arm 126 of the mount body 104. In this embodiment, the arm 126 holds the camera assembly 106 at a fixed downward-facing slant. The downward-facing slant accommodates the installation of such modules at an elevated position relative to the object-holding area, to place as much of the object-holding area as possible within the fields of view of the cameras 128, 130 housed in the camera assembly 106. In another embodiment, the arm 126 is movable to allow for a manual or automated change in the mounting angle of the camera assembly.
With the module mounted in this position, the rail 400 rests flush against the angled surface 402-1 of the channel sidewall.
The control-board assembly 102 houses a complex of control boards 1200-1, 1200-2, 1200-3 (generally, 1200) and a spacer board 1200-4 in a tower arrangement. Control board 1200-1, atop the tower, is the processor core board, which provides the computational power to run algorithms and process images. On the processor core board 1200-1 is a processor (not shown) that executes the algorithms and performs the image processing. Mounted to the processor core board 1200-1 is a heat sink 1202. Disposed below the control board 1200-1 is the control board 1200-2, also referred to as the POE+ board. The POE+ board includes the RJ45 receptacle 200 described previously.
The mount body 104 includes the upper mount section 114, the lower mount section 116, and the arm 126. The upper mount section 114 includes the retaining boss 404, which projects into the channel 118 between the sections 114, 116. Fasteners 1210 (only one of three shown) secure the control-board assembly 102 to the upper mount section 114, fasteners 1212 (only one of three shown) secure the upper mount section 114 to the lower mount section 116, and fasteners 1214 (only one of three shown) secure the lower mount section 116 to the pan pivot base 138. Bosses in the lower mount section 116 ensure assembly can occur in only one manner. Two electrical connections 1216 pass through the two sections 114, 116, for the pan motor and accompanying optical sensor board. The lower mount section 116 includes a cavity 1218, within which a stepper motor 1220 is disposed. The shaft 1222 of the stepper motor 1220 projects into the laser tilt base 139, by which the stepper motor 1220 rotates the laser tilt base 139, and thus the lighting assembly 108. The laser tilt base 139 can rotate a total of 60°, 30° to either side of center.
The camera assembly 106 houses the RGB camera 128 and the depth sensor 130. Because of the slant at which the arm 126 holds the camera assembly 106, the lower mount section 116 has a recessed region 1224 that allows the bottom of the camera assembly 106 to extend into it.
The lighting assembly 108 houses a laser tilt assembly 1225, which includes a wheel-shaped laser assembly 1226 with the laser 1206 housed therein. In one embodiment, the laser 1206 is a class IIIR red laser. The light-emitting end of the laser 1206 is at the circumference of the wheel-shaped laser assembly 1226. The laser assembly 1226 rotates about an axis 1228 that is perpendicular to the drawn page. Rotating the laser assembly 1226 tilts the laser and, thus, the pointing direction of the laser; the laser tilts vertically, in accordance with the rotation of the laser assembly. In one embodiment, the full range by which the laser assembly can tilt the laser is 135 degrees. Below the laser assembly 1226 is the LED board 1208, which has an array of LEDs. The LED board 1208 produces RGB light. Under control of the processor, the LED board can provide a variety of signals; for example, red indicates a warning, green indicates success, and blinking attracts attention.
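The LED signaling described above (red for a warning, green for success, blinking to attract attention) could be driven by a small mapping such as the following sketch; the LED interface shown is hypothetical.

```python
import time

SIGNALS = {
    "warning": ("red", False),    # steady red: warning
    "success": ("green", False),  # steady green: success
    "alert":   ("green", True),   # blinking: attracts the user's attention
}

def show_signal(led, name: str, blinks: int = 3) -> None:
    """Drive the LED board through one signal. `led` is any object with
    set_color(color) and off() methods (a hypothetical interface)."""
    color, blinking = SIGNALS[name]
    if not blinking:
        led.set_color(color)
        return
    for _ in range(blinks):
        led.set_color(color)
        time.sleep(0.25)
        led.off()
        time.sleep(0.25)
```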
From a first side of the laser mount upright 2000, the shaft 2014 of the stepper motor 2008 passes through an opening 2016 in the laser mount upright 2000 and enters the central keyed opening of the hub 1900.
Fasteners 2020 secure the stepper motor 2008 to the laser mount upright 2000. On the opposite side of the laser mount upright 2000, fasteners 2022 secure the optical sensor board 2006 to the laser mount upright 2000. The optical sensor board 2006 determines when the stepper motor 2008 has rotated the laser assembly 1226.
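One common use of such an optical sensor is homing: stepping the motor until the sensor trips establishes a reference position from which all other rotational positions are counted. The routine below is a plausible sketch under that assumption, not logic drawn from the description above; step_motor() and sensor_tripped() are hypothetical hardware interfaces.

```python
def home_tilt(step_motor, sensor_tripped, max_steps: int = 400) -> int:
    """Rotate the tilt stepper one step at a time until the optical sensor
    trips, then treat that position as step 0."""
    for _ in range(max_steps):
        if sensor_tripped():   # home/limit position reached
            return 0           # subsequent positions counted from here
        step_motor(-1)         # one step toward the home position
    raise RuntimeError("tilt homing failed: sensor never tripped")
```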
The rectangular-shaped bracket 2200 can be mounted in at least six different ways.
In a second configuration, the bracket 2200 can be mounted against a wall or similar surface by fastening the mounting surface 2206 flush against the wall using four bolts through the slots 2214. In this configuration, the channel bar 2204 is mounted in holes 2216-3 and 2216-2, or in holes 2216-2 and 2216-1, with the groove in the channel bar 2204 parallel to and facing the panel 2220. Before mounting, the inner material of the two selected holes is punched out to allow the bolts 2222 to pass through them and into the channel bar 2204.
In the third and fourth configurations, the bracket 2200 can be mounted with either the side 2202-1 or the side 2202-2 pressed flush against a surface, using holes 2218-1, 2218-2, 2218-3, and 2218-4 to mount to the surface, or to a circular tube measuring 1″ in diameter using U-bolts.
In a fifth configuration, the bracket 2200 can be mounted with the surface 2206 flush on top of a surface or shelf and fastened using bolts and the slots 2214. In this configuration, the bracket 2200 is upside down relative to its previously described orientation.
In a sixth configuration, the bracket 2200 can be mounted on a vertical rail or on a vertical surface using the holes 2208 to fasten the bracket to the rail or surface. In this configuration, the channel bar 2204 is mounted either in holes 2216-3 and 2216-2 or in holes 2216-2 and 2216-1, with the groove in the channel bar parallel to and facing the panel 2220.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, and apparatus. Thus, some aspects of the present invention may be embodied entirely in hardware, entirely in software (including, but not limited to, firmware, program code, resident software, microcode), or in a combination of hardware and software.
Having described above several aspects of at least one embodiment, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be part of this disclosure and are intended to be within the scope of the invention. Embodiments of the methods and apparatuses discussed herein are not limited in application to the details of construction and the arrangement of components set forth in the foregoing description or illustrated in the accompanying drawings. The methods and apparatuses are capable of implementation in other embodiments and of being practiced or of being carried out in various ways. Examples of specific implementations are provided herein for illustrative purposes only and are not intended to be limiting. References to “one embodiment” or “an embodiment” or “another embodiment” mean that a feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment described herein. References to one embodiment within the specification do not necessarily all refer to the same embodiment. The features illustrated or described in connection with one exemplary embodiment may be combined with the features of other embodiments.
Also, the phraseology and terminology used herein are for the purpose of description and should not be regarded as limiting. The use herein of “including,” “comprising,” “having,” “containing,” “involving,” and variations thereof is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. References to “or” may be construed as inclusive so that any terms described using “or” may indicate any of a single, more than one, and all the described terms. Any references to front and back, left and right, top and bottom, upper and lower, and vertical and horizontal are intended for convenience of description, not to limit the present systems and methods or their components to any one positional or spatial orientation. Accordingly, the foregoing description and drawings are by way of example only, and the scope of the invention should be determined from proper construction of the appended claims, and their equivalents.
This application claims priority to and the benefit of co-pending U.S. Provisional Application No. 62/791,413, titled “Computer Vision Tracking and Guidance Module”, filed on Jan. 11, 2019, the entirety of which provisional application is incorporated by reference herein for all purposes.
Number | Name | Date | Kind |
---|---|---|---|
2408122 | Wirkler | Sep 1946 | A |
3824596 | Guion et al. | Jul 1974 | A |
3940700 | Fischer | Feb 1976 | A |
4018029 | Safranski et al. | Apr 1977 | A |
4328499 | Anderson et al. | May 1982 | A |
4570416 | Shoenfeld | Feb 1986 | A |
5010343 | Andersson | Apr 1991 | A |
5343212 | Rose et al. | Aug 1994 | A |
5426438 | Peavey et al. | Jun 1995 | A |
5510800 | McEwan | Apr 1996 | A |
5545880 | Bu et al. | Aug 1996 | A |
5574468 | Rose | Nov 1996 | A |
5592180 | Yokev et al. | Jan 1997 | A |
5600330 | Blood | Feb 1997 | A |
5657026 | Culpepper et al. | Aug 1997 | A |
5923286 | Divakaruni | Jul 1999 | A |
5953683 | Hansen et al. | Sep 1999 | A |
6088653 | Sheikh et al. | Jul 2000 | A |
6101178 | Beal | Aug 2000 | A |
6167347 | Lin | Dec 2000 | A |
6255991 | Hedin | Jul 2001 | B1 |
6285916 | Kadaba et al. | Sep 2001 | B1 |
6292750 | Lin | Sep 2001 | B1 |
6409687 | Foxlin | Jun 2002 | B1 |
6417802 | Diesel | Jul 2002 | B1 |
6492905 | Mathias et al. | Dec 2002 | B2 |
6496778 | Lin | Dec 2002 | B1 |
6512748 | Mizuki et al. | Jan 2003 | B1 |
6593885 | Wisherd et al. | Jul 2003 | B2 |
6619550 | Good | Sep 2003 | B1 |
6630904 | Gustafson et al. | Oct 2003 | B2 |
6634804 | Toste et al. | Oct 2003 | B1 |
6683568 | James et al. | Jan 2004 | B1 |
6697736 | Lin | Feb 2004 | B2 |
6720920 | Breed et al. | Apr 2004 | B2 |
6721657 | Ford et al. | Apr 2004 | B2 |
6744436 | Chirieleison, Jr. et al. | Jun 2004 | B1 |
6750816 | Kunysz | Jun 2004 | B1 |
6861982 | Forstrom et al. | Mar 2005 | B2 |
6867774 | Halmshaw et al. | Mar 2005 | B1 |
6988079 | Or-Bach et al. | Jan 2006 | B1 |
6989789 | Ferreol et al. | Jan 2006 | B2 |
7009561 | Menache et al. | Mar 2006 | B2 |
7104453 | Zhu | Sep 2006 | B1 |
7143004 | Townsend et al. | Nov 2006 | B2 |
7168618 | Schwartz | Jan 2007 | B2 |
7190309 | Hill | Mar 2007 | B2 |
7193559 | Ford et al. | Mar 2007 | B2 |
7236091 | Xiang et al. | Jun 2007 | B2 |
7292189 | Orr et al. | Nov 2007 | B2 |
7295925 | Breed et al. | Nov 2007 | B2 |
7315281 | Dejanovic et al. | Jan 2008 | B2 |
7336078 | Merewether et al. | Feb 2008 | B1 |
7353994 | Farrall | Apr 2008 | B2 |
7409290 | Lin | Aug 2008 | B2 |
7443342 | Shirai et al. | Oct 2008 | B2 |
7499711 | Hoctor et al. | Mar 2009 | B2 |
7533569 | Sheynblat | May 2009 | B2 |
7612715 | Macleod | Nov 2009 | B2 |
7646330 | Karr | Jan 2010 | B2 |
7689465 | Shakes et al. | Mar 2010 | B1 |
7844507 | Levy | Nov 2010 | B2 |
7868760 | Smith et al. | Jan 2011 | B2 |
7876268 | Jacobs | Jan 2011 | B2 |
7933730 | Li | Apr 2011 | B2 |
8009918 | Van Droogenbroeck et al. | Aug 2011 | B2 |
8189855 | Opalach et al. | May 2012 | B2 |
8201737 | Palacios Durazo et al. | Jun 2012 | B1 |
8219438 | Moon et al. | Jul 2012 | B1 |
8269624 | Chen et al. | Sep 2012 | B2 |
8295542 | Albertson et al. | Oct 2012 | B2 |
8406470 | Jones et al. | Mar 2013 | B2 |
8457655 | Zhang et al. | Jun 2013 | B2 |
8619144 | Chang et al. | Dec 2013 | B1 |
8749433 | Hill | Jun 2014 | B2 |
8843231 | Ragusa et al. | Sep 2014 | B2 |
8860611 | Anderson et al. | Oct 2014 | B1 |
8957812 | Hill | Feb 2015 | B1 |
9063215 | Perthold et al. | Jun 2015 | B2 |
9092898 | Fraccaroli et al. | Jul 2015 | B1 |
9120621 | Curlander et al. | Sep 2015 | B1 |
9141194 | Keyes et al. | Sep 2015 | B1 |
9171278 | Kong et al. | Oct 2015 | B1 |
9174746 | Bell et al. | Nov 2015 | B1 |
9269022 | Rhoads et al. | Feb 2016 | B2 |
9349076 | Liu et al. | May 2016 | B1 |
9424493 | He et al. | Aug 2016 | B2 |
9482741 | Min et al. | Nov 2016 | B1 |
9497728 | Hill | Nov 2016 | B2 |
9500396 | Yoon et al. | Nov 2016 | B2 |
9514389 | Erhan et al. | Dec 2016 | B1 |
9519344 | Hill | Dec 2016 | B1 |
9544552 | Takahashi | Jan 2017 | B2 |
9594983 | Alattar et al. | Mar 2017 | B2 |
9656749 | Hanlon | May 2017 | B1 |
9740937 | Zhang et al. | Aug 2017 | B2 |
9782669 | Hill | Oct 2017 | B1 |
9872151 | Puzanov et al. | Jan 2018 | B1 |
9904867 | Fathi et al. | Feb 2018 | B2 |
9933509 | Hill et al. | Apr 2018 | B2 |
9961503 | Hill | May 2018 | B2 |
9996818 | Ren et al. | Jun 2018 | B1 |
10001833 | Hill | Jun 2018 | B2 |
10148918 | Seiger et al. | Dec 2018 | B1 |
10163149 | Famularo et al. | Dec 2018 | B1 |
10180490 | Schneider et al. | Jan 2019 | B1 |
10257654 | Hill | Apr 2019 | B2 |
10324474 | Hill et al. | Jun 2019 | B2 |
10332066 | Palaniappan et al. | Jun 2019 | B1 |
10373322 | Buibas et al. | Aug 2019 | B1 |
10399778 | Shekhawat | Sep 2019 | B1 |
10416276 | Hill et al. | Sep 2019 | B2 |
10444323 | Min et al. | Oct 2019 | B2 |
10455364 | Hill | Oct 2019 | B2 |
10605904 | Min et al. | Mar 2020 | B2 |
20010027995 | Patel et al. | Oct 2001 | A1 |
20020021277 | Kramer et al. | Feb 2002 | A1 |
20020095353 | Razumov | Jul 2002 | A1 |
20020140745 | Ellenby et al. | Oct 2002 | A1 |
20020177476 | Chou | Nov 2002 | A1 |
20030024987 | Zhu | Feb 2003 | A1 |
20030053492 | Matsunaga | Mar 2003 | A1 |
20030110152 | Hara | Jun 2003 | A1 |
20030115162 | Konick | Jun 2003 | A1 |
20030120425 | Stanley et al. | Jun 2003 | A1 |
20030176196 | Hall et al. | Sep 2003 | A1 |
20030184649 | Mann | Oct 2003 | A1 |
20030195017 | Chen et al. | Oct 2003 | A1 |
20040002642 | Dekel et al. | Jan 2004 | A1 |
20040095907 | Agee et al. | May 2004 | A1 |
20040107072 | Dietrich et al. | Jun 2004 | A1 |
20040176102 | Lawrence et al. | Sep 2004 | A1 |
20040203846 | Caronni et al. | Oct 2004 | A1 |
20040267640 | Bong et al. | Dec 2004 | A1 |
20050001712 | Yarbrough | Jan 2005 | A1 |
20050057647 | Nowak | Mar 2005 | A1 |
20050062849 | Foth et al. | Mar 2005 | A1 |
20050074162 | Tu et al. | Apr 2005 | A1 |
20050143916 | Kim et al. | Jun 2005 | A1 |
20050154685 | Mundy et al. | Jul 2005 | A1 |
20050184907 | Hall et al. | Aug 2005 | A1 |
20050275626 | Mueller et al. | Dec 2005 | A1 |
20060013070 | Holm et al. | Jan 2006 | A1 |
20060022800 | Krishna et al. | Feb 2006 | A1 |
20060061469 | Jaeger et al. | Mar 2006 | A1 |
20060066485 | Min | Mar 2006 | A1 |
20060101497 | Hirt et al. | May 2006 | A1 |
20060192709 | Schantz et al. | Aug 2006 | A1 |
20060279459 | Akiyama et al. | Dec 2006 | A1 |
20060290508 | Moutchkaev et al. | Dec 2006 | A1 |
20070060384 | Dohta | Mar 2007 | A1 |
20070138270 | Reblin | Jun 2007 | A1 |
20070205867 | Kennedy et al. | Sep 2007 | A1 |
20070210920 | Panotopoulos | Sep 2007 | A1 |
20070222560 | Posamentier | Sep 2007 | A1 |
20070237356 | Dwinell et al. | Oct 2007 | A1 |
20080007398 | DeRose et al. | Jan 2008 | A1 |
20080035390 | Wurz | Feb 2008 | A1 |
20080048913 | Macias et al. | Feb 2008 | A1 |
20080143482 | Shoarinejad et al. | Jun 2008 | A1 |
20080150678 | Giobbi et al. | Jun 2008 | A1 |
20080154691 | Wellman et al. | Jun 2008 | A1 |
20080156619 | Patel | Jul 2008 | A1 |
20080174485 | Carani et al. | Jul 2008 | A1 |
20080183328 | Danelski | Jul 2008 | A1 |
20080204322 | Oswald et al. | Aug 2008 | A1 |
20080266253 | Seeman et al. | Oct 2008 | A1 |
20080281618 | Mermet et al. | Nov 2008 | A1 |
20080316324 | Rofougaran et al. | Dec 2008 | A1 |
20090043504 | Bandyopadhyay et al. | Feb 2009 | A1 |
20090073428 | Magnus | Mar 2009 | A1 |
20090114575 | Carpenter et al. | May 2009 | A1 |
20090121017 | Cato et al. | May 2009 | A1 |
20090149202 | Hill et al. | Jun 2009 | A1 |
20090224040 | Kushida et al. | Sep 2009 | A1 |
20090243932 | Moshfeghi | Oct 2009 | A1 |
20090323586 | Hohl et al. | Dec 2009 | A1 |
20100019905 | Boddie et al. | Jan 2010 | A1 |
20100076594 | Salour et al. | Mar 2010 | A1 |
20100090852 | Eitan et al. | Apr 2010 | A1 |
20100097208 | Rosing et al. | Apr 2010 | A1 |
20100103173 | Lee et al. | Apr 2010 | A1 |
20100103989 | Smith et al. | Apr 2010 | A1 |
20100123664 | Shin et al. | May 2010 | A1 |
20100159958 | Naguib et al. | Jun 2010 | A1 |
20110002509 | Nobori et al. | Jan 2011 | A1 |
20110006774 | Baiden | Jan 2011 | A1 |
20110037573 | Choi | Feb 2011 | A1 |
20110066086 | Aarestad et al. | Mar 2011 | A1 |
20110166694 | Griffits et al. | Jul 2011 | A1 |
20110187600 | Landt | Aug 2011 | A1 |
20110208481 | Slastion | Aug 2011 | A1 |
20110210843 | Kummetz | Sep 2011 | A1 |
20110241942 | Hill | Oct 2011 | A1 |
20110256882 | Markhovsky et al. | Oct 2011 | A1 |
20110264520 | Puhakka | Oct 2011 | A1 |
20110286633 | Wang et al. | Nov 2011 | A1 |
20110313893 | Weik, III | Dec 2011 | A1 |
20110315770 | Patel | Dec 2011 | A1 |
20120013509 | Wisherd et al. | Jan 2012 | A1 |
20120020518 | Taguchi | Jan 2012 | A1 |
20120081544 | Wee | Apr 2012 | A1 |
20120087572 | Dedeoglu et al. | Apr 2012 | A1 |
20120127088 | Pance et al. | May 2012 | A1 |
20120176227 | Nikitin | Jul 2012 | A1 |
20120184285 | Sampath et al. | Jul 2012 | A1 |
20120257061 | Edwards et al. | Oct 2012 | A1 |
20120286933 | Hsiao | Nov 2012 | A1 |
20120319822 | Hansen | Dec 2012 | A1 |
20130018582 | Miller et al. | Jan 2013 | A1 |
20130021417 | Ota et al. | Jan 2013 | A1 |
20130029685 | Moshfeghi | Jan 2013 | A1 |
20130036043 | Faith | Feb 2013 | A1 |
20130051624 | Iwasaki et al. | Feb 2013 | A1 |
20130063567 | Burns et al. | Mar 2013 | A1 |
20130073093 | Songkakul | Mar 2013 | A1 |
20130113993 | Dagit, III | May 2013 | A1 |
20130182114 | Zhang et al. | Jul 2013 | A1 |
20130191193 | Calman et al. | Jul 2013 | A1 |
20130226655 | Shaw | Aug 2013 | A1 |
20130281084 | Batada et al. | Oct 2013 | A1 |
20130293684 | Becker et al. | Nov 2013 | A1 |
20130293722 | Chen | Nov 2013 | A1 |
20130314210 | Schoner et al. | Nov 2013 | A1 |
20130335318 | Nagel et al. | Dec 2013 | A1 |
20130335415 | Chang | Dec 2013 | A1 |
20140022058 | Striemer et al. | Jan 2014 | A1 |
20140108136 | Zhao et al. | Apr 2014 | A1 |
20140139426 | Kryze et al. | May 2014 | A1 |
20140253368 | Holder | Sep 2014 | A1 |
20140270356 | Dearing et al. | Sep 2014 | A1 |
20140300516 | Min et al. | Oct 2014 | A1 |
20140317005 | Balwani | Oct 2014 | A1 |
20140330603 | Corder et al. | Nov 2014 | A1 |
20140357295 | Skomra et al. | Dec 2014 | A1 |
20140361078 | Davidson | Dec 2014 | A1 |
20150009949 | Khoryaev et al. | Jan 2015 | A1 |
20150012396 | Puerini et al. | Jan 2015 | A1 |
20150019391 | Kumar et al. | Jan 2015 | A1 |
20150029339 | Kobres et al. | Jan 2015 | A1 |
20150039458 | Reid | Feb 2015 | A1 |
20150055821 | Fotland | Feb 2015 | A1 |
20150059374 | Hebei | Mar 2015 | A1 |
20150085096 | Smits | Mar 2015 | A1 |
20150091757 | Shaw et al. | Apr 2015 | A1 |
20150130664 | Hill et al. | May 2015 | A1 |
20150133162 | Meredith et al. | May 2015 | A1 |
20150134418 | Leow et al. | May 2015 | A1 |
20150169916 | Hill et al. | Jun 2015 | A1 |
20150170002 | Szegedy et al. | Jun 2015 | A1 |
20150202770 | Patron | Jul 2015 | A1 |
20150210199 | Payne | Jul 2015 | A1 |
20150221135 | Hill et al. | Aug 2015 | A1 |
20150227890 | Bednarek et al. | Aug 2015 | A1 |
20150248765 | Criminisi et al. | Sep 2015 | A1 |
20150254906 | Berger et al. | Sep 2015 | A1 |
20150278759 | Harris et al. | Oct 2015 | A1 |
20150310539 | McCoy et al. | Oct 2015 | A1 |
20150323643 | Hill et al. | Nov 2015 | A1 |
20150341551 | Perrin et al. | Nov 2015 | A1 |
20150362581 | Friedman et al. | Dec 2015 | A1 |
20150371178 | Abhyanker et al. | Dec 2015 | A1 |
20150371319 | Argue et al. | Dec 2015 | A1 |
20150379366 | Nomura et al. | Dec 2015 | A1 |
20160035078 | Lin et al. | Feb 2016 | A1 |
20160063610 | Argue et al. | Mar 2016 | A1 |
20160093184 | Locke et al. | Mar 2016 | A1 |
20160098679 | Levy | Apr 2016 | A1 |
20160140436 | Yin et al. | May 2016 | A1 |
20160142868 | Kulkarni et al. | May 2016 | A1 |
20160150196 | Horvath | May 2016 | A1 |
20160156409 | Chang | Jun 2016 | A1 |
20160178727 | Bottazzi | Jun 2016 | A1 |
20160195602 | Meadow | Jul 2016 | A1 |
20160232857 | Tamaru | Aug 2016 | A1 |
20160238692 | Hill et al. | Aug 2016 | A1 |
20160248969 | Hurd | Aug 2016 | A1 |
20160256100 | Jacofsky et al. | Sep 2016 | A1 |
20160286508 | Khoryaev et al. | Sep 2016 | A1 |
20160300187 | Kashi et al. | Oct 2016 | A1 |
20160335593 | Clarke et al. | Nov 2016 | A1 |
20160366561 | Min et al. | Dec 2016 | A1 |
20160370453 | Boker et al. | Dec 2016 | A1 |
20160371574 | Nguyen et al. | Dec 2016 | A1 |
20170030997 | Hill et al. | Feb 2017 | A1 |
20170031432 | Hill | Feb 2017 | A1 |
20170066597 | Hiroi | Mar 2017 | A1 |
20170117233 | Anayama | Apr 2017 | A1 |
20170123426 | Hill et al. | May 2017 | A1 |
20170140329 | Bernhardt et al. | May 2017 | A1 |
20170234979 | Mathews et al. | Aug 2017 | A1 |
20170261592 | Min et al. | Sep 2017 | A1 |
20170280281 | Pandey et al. | Sep 2017 | A1 |
20170293885 | Grady et al. | Oct 2017 | A1 |
20170313514 | Lert, Jr. et al. | Nov 2017 | A1 |
20170323174 | Joshi et al. | Nov 2017 | A1 |
20170323376 | Glaser et al. | Nov 2017 | A1 |
20170350961 | Hill et al. | Dec 2017 | A1 |
20170351255 | Anderson et al. | Dec 2017 | A1 |
20170359573 | Kim et al. | Dec 2017 | A1 |
20170372524 | Hill | Dec 2017 | A1 |
20170374261 | Teich et al. | Dec 2017 | A1 |
20180003962 | Urey et al. | Jan 2018 | A1 |
20180033151 | Matsumoto et al. | Feb 2018 | A1 |
20180068100 | Seo | Mar 2018 | A1 |
20180068266 | Kirmani et al. | Mar 2018 | A1 |
20180094936 | Jones et al. | Apr 2018 | A1 |
20180108134 | Venable et al. | Apr 2018 | A1 |
20180139431 | Simek | May 2018 | A1 |
20180164103 | Hill | Jun 2018 | A1 |
20180197139 | Hill | Jul 2018 | A1 |
20180197218 | Mallesan et al. | Jul 2018 | A1 |
20180231649 | Min et al. | Aug 2018 | A1 |
20180242111 | Hill | Aug 2018 | A1 |
20180339720 | Singh | Nov 2018 | A1 |
20190029277 | Skrædderdal | Jan 2019 | A1 |
20190053012 | Hill | Feb 2019 | A1 |
20190073785 | Hafner | Mar 2019 | A1 |
20190090744 | Mahfouz | Mar 2019 | A1 |
20190098263 | Seiger et al. | Mar 2019 | A1 |
20190138849 | Zhang | May 2019 | A1 |
20190295290 | Schena et al. | Sep 2019 | A1 |
20190394448 | Ziegler et al. | Dec 2019 | A1 |
20200005116 | Kuo | Jan 2020 | A1 |
20200011961 | Hill et al. | Jan 2020 | A1 |
20200012894 | Lee | Jan 2020 | A1 |
20200097724 | Chakravarty et al. | Mar 2020 | A1 |
Number | Date | Country |
---|---|---|
102017205958 | Oct 2018 | DE |
2001006401 | Jan 2001 | WO |
2005010550 | Feb 2005 | WO |
2009007198 | Jan 2009 | WO |
2020061276 | Mar 2020 | WO |
Entry |
---|
Non-Final Office Action in U.S. Appl. No. 15/091,180 dated Oct. 1, 2020. |
Non-Final Office Action in U.S. Appl. No. 16/206,745 dated Sep. 23, 2020. |
Non-Final Office Action in U.S. Appl. No. 15/416,379 dated Oct. 2, 2020. |
Non-Final Office Action in U.S. Appl. No. 15/259,474, dated Sep. 1, 2020; 17 pages. |
Final Office Action in U.S. Appl. No. 15/861,414 dated Nov. 17, 2020. |
International Search Report and Written Opinion in PCT/US2020/013280 dated Mar. 10, 2020; 9 pages. |
Szeliski, Richard, “Image Alignment and Stitching: A Tutorial,” Technical Report, MSR-TR-2004-92, Dec. 10, 2006. |
Xu, Wei and Jane Mulligan, “Performance Evaluation of Color Correction Approaches for Automatic Multi-view Image and Video Stitching,” International Conference on Computer Vision and Pattern Recognition (CVPR10), San Francisco, CA, 2010. |
Brown, Matthew and David G. Lowe “Automatic Panoramic Image Stitching using Invariant Features,” International Journal of Computer Vision, vol. 74, No. 1, pp. 59-73, 2007. |
Hill, et al. “Package Tracking Systems and Methods” U.S. Appl. No. 15/091,180, filed Apr. 5, 2016. |
Hill, et al. “Video for Real-Time Confirmation in Package Tracking Systems” U.S. Appl. No. 15/416,366, filed Jan. 26, 2017. |
Piotrowski, et al. “Light-Based Guidance for Package Tracking Systems” U.S. Appl. No. 15/416,379, filed Jan. 26, 2017. |
Chakravarty, et al. “Machine-Learning-Assisted Self-Improving Object-Identification System and Method” U.S. Appl. No. 16/575,837, filed Sep. 19, 2019. |
Final Office Action in U.S. Appl. No. 16/206,745 dated Feb. 5, 2020; 15 pages. |
Non-Final Office Action in U.S. Appl. No. 16/206,745 dated Oct. 18, 2019; 8 pages. |
Final Office Action in U.S. Appl. No. 15/416,366 dated Oct. 7, 2019; 14 pages. |
Non-Final Office Action in U.S. Appl. No. 15/416,366 dated Apr. 6, 2020; 13 pages. |
Final Office Action in U.S. Appl. No. 15/416,379 dated Jan. 27, 2020; 15 pages. |
Non-Final Office Action in U.S. Appl. No. 15/416,379, dated Jun. 27, 2019; 12 pages. |
Final Office Action in U.S. Appl. No. 15/259,474 dated Jan. 10, 2020; 19 pages. |
Non-Final Office Action in U.S. Appl. No. 15/861,414 dated Apr. 6, 2020; 14 pages. |
Morbella N50: 5-inch GPS Navigator User's Manual, Maka Technologies Group, May 2012. |
Final Office Action in U.S. Appl. No. 16/206,745 dated May 22, 2019; 9 pages. |
Non-Final Office Action in U.S. Appl. No. 16/206,745 dated Jan. 7, 2019; 10 pages. |
Non-Final Office Action in U.S. Appl. No. 15/416,366 dated Jun. 13, 2019; 11 pages. |
Non-Final Office Action in U.S. Appl. No. 15/259,474 dated May 29, 2019; 19 pages. |
Wilde, Andreas, “Extended Tracking Range Delay-Locked Loop,” Proceedings IEEE International Conference on Communications, Jun. 1995, pp. 1051-1054. |
Notice of Allowance in U.S. Appl. No. 15/270,749 dated Oct. 4, 2018; 5 pages. |
Li, et al. “Multifrequency-Based Range Estimation of RFID Tags,” IEEE International Conference on RFID, 2009. |
Welch, Greg and Gary Bishop, “An Introduction to the Kalman Filter,” Department of Computer Science, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-3175, Updated: Monday, Jul. 24, 2006. |
Non-Final Office Action in U.S. Appl. No. 15/270,749 dated Apr. 4, 2018; 8 pages. |
“ADXL202/ADXL210 Product Sheet,” Analog.com, 1999. |
Farrell & Barth, “The Global Positioning System & Inertial Navigation”, 1999, McGraw-Hill; pp. 245-252. |
Pourhomayoun, Mohammad and Mark Fowler, “Improving WLAN-based Indoor Mobile Positioning Using Sparsity,” Conference Record of the Forty Sixth Asilomar Conference on Signals, Systems and Computers, Nov. 4-7, 2012, pp. 1393-1396, Pacific Grove, California. |
Schmidt & Phillips, “INS/GPS Integration Architectures”, NATO RTO Lecture Series, First Presented Oct. 20-21, 2003. |
Grewal & Andrews, “Global Positioning Systems, Inertial Navigation, and Integration”, 2001, John Wiley and Sons, pp. 252-256. |
Jianchen Gao, “Development of a Precise GPS/INS/On-Board Vehicle Sensors Integrated Vehicular Positioning System”, Jun. 2007, UCGE Reports No. 20555. |
Yong Yang, “Tightly Coupled MEMS INS/GPS Integration with INS Aided Receiver Tracking Loops”, Jun. 2008, UCGE Reports No. 20270. |
Goodall, Christopher L., “Improving Usability of Low-Cost INS/GPS Navigation Systems using Intelligent Techniques”, Jan. 2009, UCGE Reports No. 20276. |
Debo Sun, “Ultra-Tight GPS/Reduced IMU for Land Vehicle Navigation”, Mar. 2010, UCGE Reports No. 20305. |
Sun, et al., “Analysis of the Kalman Filter With Different INS Error Models for GPS/INS Integration in Aerial Remote Sensing Applications”, Beijing, 2008, The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXVII, Part B5. |
Adrian Schumacher, “Integration of a GPS aided Strapdown Inertial Navigation System for Land Vehicles”, Master of Science Thesis, KTH Electrical Engineering, 2006. |
Vikas Kumar N., “Integration of Inertial Navigation System and Global Positioning System Using Kalman Filtering”, M.Tech Dissertation, Indian Institute of Technology, Bombay, Mumbai, Jul. 2004. |
Jennifer Denise Gautier, “GPS/INS Generalized Evaluation Tool (GIGET) for the Design and Testing of Integrated Navigation Systems”, Dissertation, Stanford University, Jun. 2003. |
Farrell, et al., “Real-Time Differential Carrier Phase GPS-Aided INS”, Jul. 2000, IEEE Transactions on Control Systems Technology, vol. 8, No. 4. |
Filho, et al., “Integrated GPS/INS Navigation System Based on a Gyroscope-Free IMU”, DINCON Brazilian Conference on Dynamics, Control, and Their Applications, May 22-26, 2006. |
Santiago Alban, “Design and Performance of a Robust GPS/INS Attitude System for Automobile Applications”, Dissertation, Stanford University, Jun. 2004. |
Proakis, John G. and Masoud Salehi, “Communication Systems Engineering”, Second Edition, Prentice-Hall, Inc., Upper Saddle River, New Jersey, 2002. |
International Search Report & Written Opinion in international patent application PCT/US12/64860, dated Feb. 28, 2013; 8 pages. |
Dictionary Definition for Peripheral Equipment. (2001). Hargrave's Communications Dictionary, Wiley. Hoboken, NJ: Wiley. Retrieved from Https://search.credorefemce.com/content/entry/hargravecomms/peripheral_equioment/0 (Year:2001). |
Non-Final Office Action in U.S. Appl. No. 15/091,180, dated Jun. 27, 2019; 11 pages. |
Restriction Requirement in U.S. Appl. No. 15/091,180 dated Mar. 19, 2019; 8 pages. |
Final Office Action in U.S. Appl. No. 15/091,180 dated Jan. 23, 2020; 17 pages. |
Corrected Notice of Allowability in U.S. Appl. No. 15/270,749 dated Oct. 30, 2018; 5 pages. |
Non-Final Office Action in U.S. Appl. No. 16/437,767, dated Jul. 15, 2020; 19 pages. |
International Search Report and Written Opinion in PCT/US2019/051874 dated Dec. 13, 2020; 9 pages. |
Notice of Allowance in U.S. Appl. No. 15/416,366 dated Aug. 19, 2020; 13 pages. |
Raza, Rana Hammad “Three Dimensional Localization and Tracking for Site Safety Using Fusion of Computer Vision and RFID,” 2013, Dissertation, Michigan State University. |
Final Office Action in U.S. Appl. No. 15/091,180, dated Mar. 10, 2021; 24 pages. |
Notice of Allowance and Fees Due in U.S. Appl. No. 16/206,745, dated Mar. 12, 2021; 9 pages. |
Final Office Action in U.S. Appl. No. 15/259,474, dated Mar. 9, 2021; 23 pages. |
Final Office Action in U.S. Appl. No. 15/861,414 dated Feb. 8, 2021; 13 pages. |
Final Office Action in U.S. Appl. No. 16/437,767 dated Feb. 5, 2021; 18 pages. |
Final Office Action in U.S. Appl. No. 15/416,379, dated May 13, 2021; 18 pages. |
Notice of Allowance and Fees Due in U.S. Appl. No. 16/437,767, dated May 14, 2021; 8 pages. |
Non-Final Office Action in U.S. Appl. No. 16/575,837, dated Apr. 21, 2021; 18 pages. |
International Preliminary Report on Patentability in PCT/US2019/051874, dated Apr. 1, 2021; 8 pages. |
Number | Date | Country | |
---|---|---|---|
20200228697 A1 | Jul 2020 | US |
Number | Date | Country | |
---|---|---|---|
62791413 | Jan 2019 | US |