A common complaint among shoppers is that they do not know where particular items are located within the store. They wander inefficiently through the aisles searching for items on their shopping lists, often retracing their steps and taking long paths to reach the desired items.
Aspects and embodiments relate generally to systems and methods for providing personalized navigation for people while inside certain business enterprises.
According to one embodiment, a system for navigating a person through a business enterprise comprises one or more radio-frequency (RF) receivers, disposed near an entrance to the business enterprise, receiving RF signals from an RF-transmitting device carried by the person, a controller in communication with the one or more RF receivers to receive therefrom information carried by the RF signals, the controller being adapted to identify the person based on the information carried by the RF signals and to determine a location of the person based on a location of the RF receiver that receives the RF signals, and at least one optical device adapted to capture images of the person identified by the controller and to send the captured images to the controller, the controller determining from the captured images a current location of the person and using the determined current location to track, in real time, a relative position of the person within the business enterprise as the person moves through the business enterprise.
In one example the system further comprises an RF transmitter in communication with the controller to receive therefrom the current location determined by the controller, the RF transmitter being adapted to transmit the current location of the person within the business enterprise to the RF-transmitting device carried by the person. In one example an application executing on the RF-transmitting device carried by the person produces a floor plan of the business enterprise and indicates the current location of the person in the floor plan.
In another example the information carried by the RF signals includes a shopping list, and the controller identifies where in the business enterprise items on the shopping list may be found. The controller may produce a route through the business enterprise based on the items on the shopping list and send the route to the RF-transmitting device. In one example the controller identifies an item of potential interest determined from a previous shopping experience of the person.
The business enterprise may be a grocery store, hardware store, warehouse, retail outlet, shipping hub or terminal, transportation hub, mall, amusement park, indoor area and/or outdoor area used or managed by the enterprise, for example.
According to another embodiment, a method for directing a person through a business enterprise comprises identifying the person based on information carried by radio-frequency (RF) signals received from an RF-transmitting device carried by the person, capturing images of the identified person, determining from the captured images a current location of the person within the business enterprise, and optically tracking the current location of the person within the business enterprise as the person moves.
In one example the method further comprises transmitting radio signals carrying the current location to the RF-transmitting device carried by the person. In another example the method further comprises displaying, on a screen of the RF-transmitting device, a floor plan of the business enterprise and indicating the current location of the person in the displayed floor plan.
In one example information carried by the RF signals includes a shopping list, and the method further comprises determining where items on the shopping list may be found in the business enterprise. In another example the method further comprises determining a route through the business enterprise based on the items on the shopping list, and transmitting the route to the RF-transmitting device for display on a screen of the RF-transmitting device. The method may further comprise identifying an item of potential interest determined from a previous shopping experience of the person, and transmitting a notification to the RF-transmitting device when the current location of the person is near the identified item of potential interest to alert the person of the identified item.
In one example information carried by the RF signals includes instructions for the person with the RF-transmitting device to perform a service inside the business enterprise.
According to another embodiment a mobile device comprises a screen, a radio frequency (RF) transmitter, an RF receiver, memory storing application software for directing a user of the mobile device through a business enterprise, and a processor adapted to execute the application software stored in memory, the processor, when executing the application software, instructing the RF transmitter to transmit RF signals carrying information that identifies the user, the RF receiver receiving, in response to the transmitted RF signals, a current location of the user within the business enterprise, the processor, when executing the application software, displaying on the screen a floor plan of the business enterprise, and receiving the current location of the user from the RF receiver, and indicating the current location of the user in the displayed floor plan.
In one example the information carried by the RF signals further includes a shopping list, the RF receiver receives RF signals carrying information where items on the shopping list may be found in the business enterprise, and the processor, when executing the application software, indicates the items on the shopping list in the displayed floor plan. In another example the RF receiver receives RF signals carrying information related to a route through the business enterprise based on the items on the shopping list, and the processor, when executing the application software, indicates the route in the displayed floor plan.
In one example the RF receiver receives RF signals carrying information related to an item of potential interest when the current location of the identified user is near the item of potential interest, and the processor, when executing the application software, alerts the user of the identified item.
The mobile device may further comprise an RF transceiver that includes the RF transmitter and the RF receiver.
Another embodiment is directed to a navigation and tracking system for installation at a business enterprise, the system comprising a radio-frequency (RF) node having an RF receiver antenna and configured to receive first RF signals from a mobile device, the RF node being disposed proximate an entrance to the business enterprise, a plurality of video cameras disposed within the business enterprise and configured to collect images, and a controller coupled to the RF node and to the plurality of video cameras, the controller being configured to receive the first RF signals from the RF node, to determine an identity of a person associated with the mobile device based on information carried in the first RF signals, to receive and analyze the images from the plurality of video cameras, to detect a human in at least one of the images, and to match the human with the person associated with the mobile device.
In one example the controller is further configured to track the person through the business enterprise based on detecting the person in the images received from at least some of the plurality of video cameras over time.
The navigation and tracking system may further comprise a database coupled to the controller.
In one example the controller is further configured to merge the images received from at least some of the plurality of video cameras into a map of the business enterprise. In another example the information carried in the first RF signals includes a shopping list identifying a plurality of items of interest. In another example the RF node is further configured to send second RF signals to the mobile device, the second RF signals containing information specifying locations of the items of interest within the business enterprise.
Still other aspects, embodiments, and advantages of these exemplary aspects and embodiments are discussed in detail below. Embodiments disclosed herein may be combined with other embodiments in any manner consistent with at least one of the principles disclosed herein, and references to “an embodiment,” “some embodiments,” “an alternate embodiment,” “various embodiments,” “one embodiment” or the like are not necessarily mutually exclusive and are intended to indicate that a particular feature, structure, or characteristic described may be included in at least one embodiment. The appearances of such terms herein are not necessarily all referring to the same embodiment.
Various aspects of at least one embodiment are discussed below with reference to the accompanying figures, which are not intended to be drawn to scale. The figures are included to provide illustration and a further understanding of the various aspects and embodiments, and are incorporated in and constitute a part of this specification, but are not intended as a definition of the limits of the invention. In the figures, each identical or nearly identical component that is illustrated in various figures is represented by a like numeral. For purposes of clarity, not every component may be labeled in every figure.
Personalized navigation systems according to certain aspects and embodiments use a combination of radio-frequency (RF) technology and optical imaging technology to identify and track persons at business enterprises, and software to provide those persons with individualized information and an individualized navigation experience within the business enterprise. As discussed in more detail below, a person can provide the personalized navigation system with information, such as a shopping list or service request, for example, and receive personalized navigation or other information in response to aid the person in efficiently completing their objective(s) at the business enterprise. In this manner, the person's experience at the business enterprise can be improved.
According to one embodiment, a personalized navigation system uses RF technology to initially identify a shopper who approaches an entrance to the business enterprise, and uses optical technology to detect and track movement of the shopper after the shopper arrives at and enters the business enterprise. To cooperate with the navigation system, the shopper carries a mobile device (e.g., a smartphone or smart watch) with RF transmitting and RF receiving capability. In certain embodiments, the mobile device runs certain application software that transmits RF signals containing an identity of the shopper and the shopper's shopping list. The shopper can acquire the application software and download it to the mobile device from an “app store”. Many business enterprises are currently equipped with RF transmitters, RF receivers, and video cameras, and advantageously, the navigation systems described herein do not require any hardware modifications to this existing RF and video equipment.
During typical operation, a person with the mobile device 140 approaches an entrance to the business enterprise 110. The mobile device 140 runs a personalized navigation app and transmits RF signals. In certain examples the RF signals carry an identifier associated with the person by which an operator of the business enterprise 110 knows the person. For example, the identifier may include the person's name, a telephone number, a rewards program number connected with the business enterprise, or other identifying information. The RF signals may also carry the person's shopping list identifying those items that the person wishes to find upon visiting the business enterprise 110. Typically, the person may prepare this shopping list before visiting the business enterprise 110; however, the shopping list can be constructed or edited at any time before or after the person arrives at the business enterprise 110.
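By way of illustration only, the identifier and shopping list might be represented as a small structured payload that the application software assembles before transmission; the field names and serialization below are hypothetical and are not prescribed by the system.

```python
# Hypothetical sketch of the check-in payload a navigation app might send to the
# RF node 120; the field names and structure are illustrative only.
import json

check_in_payload = {
    "shopper_id": "rewards-0012345",              # name, phone, or rewards number
    "shopping_list": ["milk", "bread", "AA batteries"],
    "service_request": None,                      # e.g., instructions for a technician
}

# Serialized for transmission over whatever RF link the mobile device and the
# RF node share; the transport itself is outside the scope of this sketch.
encoded = json.dumps(check_in_payload).encode("utf-8")
```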
When the person comes into range of an RF receiver antenna 130, the mobile device 140 establishes communications with the RF node 120. In particular, in certain examples the mobile device 140 may pass to the RF node 120 the identifier and shopping list. The mobile device may also or alternatively pass other data to the RF node 120, such as a set of instructions or other information for a technician or similar repair staff performing certain services at the business enterprise 110, for example. In certain examples, the RF node 120 forwards the identifier and shopping list or other data to a computer processing unit (also called a controller) 150, which can use this identifier to access a database 160 where information relating to the person associated with the identifier is kept. This information can include records of prior visits to the business enterprise 110 by the person, for example. Although the computer processing unit 150 and database 160 are shown in
As discussed above, tracking of persons within the business enterprise 110 can be accomplished using optical technology; in particular, by capturing and processing images from the video cameras 220 located throughout the business enterprise 110. According to one embodiment, during typical operation of the personalized navigation system, the video cameras 220 continuously capture images within their fields of view. At least one video camera 220 can be placed proximate each entrance of the business enterprise 110 to acquire images of persons entering the business enterprise. In some embodiments, multiple video cameras 220 can be placed proximate each entrance in such a way as to provide a complete field of view, or at least a functionally sufficient field of view, of the area around the entrance such that images of all persons entering the business enterprise 110 can be acquired. As discussed above, when a person having a mobile device 140 configured to run the application software to engage with the personalized navigation system 100 (referred to as a tracked person) arrives at an entrance to the business enterprise 110, the RF node 120 at that entrance receives the identifier, and optionally other information (such as the shopping list), from the mobile device 140. At the same time, the video camera(s) 220 proximate that entrance capture images of the area around the entrance, and at least some of these images should contain the tracked person. As discussed above, in certain examples, the computer processing unit 150 knows which entrance a person used to enter the enterprise 110 based on which RF node 120 detected the person and known locations of each RF node. Accordingly, the computer processing unit 150 knows which video camera or cameras 220 are in position to capture images of the person. These video cameras 220 pass their captured images to the networking device 240, which sends the captured images to the central processing unit 150. The central processing unit 150 includes an image processor that performs image processing techniques adapted to detect a person within the image and to associate the detected person with the most recently acquired identifier and shopping list.
Techniques for processing images to identify a person within the images are known in the art, and any such image processing techniques can be implemented by the central processing unit 150. For example, the image processor can be adapted to examine images captured by the video camera 220-1 positioned at the relevant entrance for a smartphone in the hand of an individual, which may indicate that the individual is engaging with the personalized navigation system 100. Alternatively, or in conjunction, the image processor can be adapted to examine the captured images for the head or hands of a person. Since the central processing unit 150 expects the next person to fall within the field of view of the video camera 220-1 located at the entrance to be the same as the person who communicated with the RF node 120-1 located at that entrance, that detected person becomes associated with the received identifier and shopping list. Once a person has been identified in an image and associated with the received identifier, the personalized navigation system 100 tracks and guides the person as he or she moves through the business enterprise 110.
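A minimal sketch of this association step is shown below, assuming the controller keeps the most recently acquired check-in for each entrance and pairs it with the next person detected by that entrance's camera; the function and variable names are illustrative only.

```python
# Per-entrance record of the most recently received RF check-in (illustrative).
latest_checkin = {}   # entrance id -> (identifier, shopping_list)

def on_rf_checkin(entrance, identifier, shopping_list):
    # Called when the RF node at an entrance forwards a new identifier.
    latest_checkin[entrance] = (identifier, shopping_list)

def on_person_detected(entrance, bounding_box):
    # Called when the image processor detects a new person at that entrance's
    # camera; the person is associated with the most recently acquired check-in.
    checkin = latest_checkin.pop(entrance, None)
    if checkin is None:
        return None   # person is not engaging with the navigation system
    identifier, shopping_list = checkin
    return {"id": identifier, "shopping_list": shopping_list, "bbox": bounding_box}
```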
Tracking can be accomplished by collecting images from the various video cameras 220 located amongst the aisles 210 and processing these images to follow the tracked person. In certain examples the central processing unit 150 follows the movement of the tracked person as he or she moves from one camera field of view to another, dynamically registering and updating the location of the person within the business enterprise 110. In one example the video cameras 220 operate in parallel, with all or some of the video cameras providing images to the central processing unit simultaneously. The images can be merged into a map or layout of the business enterprise 110, such as shown in
In certain examples, an image stitching process first performs image alignment using algorithms that can discover the relationships among images with varying degrees of overlap. These algorithms are suited for applications such as video stabilization, summarization, and the creation of panoramic mosaics, and can be applied to the images taken by the cameras 220. After alignment is complete, image-stitching algorithms take the estimates produced by such algorithms and blend the images in a seamless manner, while taking care of potential problems, such as blurring or ghosting caused by parallax and scene movement, as well as varying image exposures inside the business enterprise 110. Various image stitching algorithms and processes are known in the art, and any such image processing techniques can be implemented by the central processing unit 150.
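As one possible realization of this alignment-and-blending step, OpenCV's high-level Stitcher interface can be applied to frames from the cameras 220; the file names below are placeholders, and this is a sketch rather than the required implementation.

```python
import cv2

# Load one frame per camera; the file names are placeholders for live frames
# delivered by the networking device 240.
frames = [cv2.imread(p) for p in ("cam_220_1.jpg", "cam_220_2.jpg", "cam_220_3.jpg")]

# SCANS mode suits roughly planar, overhead store views; PANORAMA is the default.
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, store_map = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("enterprise_map.jpg", store_map)   # merged map/layout image
else:
    print("stitching failed with status", status)
```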
A handoff can be made when a tracked person moves from one viewpoint to another or is seen by one camera 220 and not the others. These handoffs may be made using the image streams processed in parallel by the central processing unit 150, with the tracked person's location and movement determined by the central processing unit using whichever camera 220 has the best view of the tracked person.
In certain examples, the video cameras 220 can include depth sensors. In such examples, the image stitching operation can be omitted, and each camera's stream data is processed independently for change detection, person detection, and recognition. Then, the resulting "areas of interest" are converted to individual point clouds (described further below) and transformed into a single common coordinate system. The translation and rotation transformations used for this process are based on the position and orientation of the video cameras (and their associated depth sensors) in relation to one another. In one example, one camera is picked as the main sensor and all other camera data is transformed into the main coordinate system, achieving the same end result as the image stitching procedure, namely, unification of the actual location of the tracked person among sensors.
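A brief sketch of this coordinate transformation follows, assuming each camera's rotation and translation relative to the main sensor are known from installation calibration; the example extrinsic values are illustrative.

```python
import numpy as np

def to_main_coordinates(points, rotation, translation):
    """Transform an N x 3 point cloud from one camera's frame into the main
    camera's frame, given that camera's rotation (3 x 3) and translation (3,)
    relative to the main sensor."""
    return points @ rotation.T + translation

# Illustrative extrinsics: a camera rotated 90 degrees about the vertical axis
# and offset 5 m along the aisle; real values come from installation calibration.
R = np.array([[0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0]])
t = np.array([5.0, 0.0, 0.0])

cloud_cam2 = np.random.rand(1000, 3)          # stand-in for a real point cloud
cloud_main = to_main_coordinates(cloud_cam2, R, t)
```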
In some examples the central processing unit 150 may use known information about the floor plan of the business enterprise to assist with identifying and tracking persons based on the images acquired from the video cameras 220. For example, the central processing unit can use known shapes and positions of shelving along the aisles 210 to provide reference points. At times, a tracked person may be occluded in a camera's field of view, for example, by another person, equipment, or shelving. The personalized navigation system 100 can be configured to store the tracked person's prior-determined position and compare multiple image frames to relocate the tracked person after a temporary occlusion. As discussed further below, the personalized navigation system 100 can be configured to provide a proposed route for the tracked person through the business enterprise 110, and therefore the central processing unit can use a predicted future location of the tracked person to relocate the person after a temporary occlusion.
According to certain embodiments, the central processing unit 150 can run an image-processing process, optionally supplemented with depth information, to track a person as discussed above. A two-dimensional (2D) optical image capture device (i.e., a video camera 220) with a single aperture is capable of capturing 2D image information on a plane (film, CCD, etc.). Acquiring three-dimensional (3D) information typically requires acquisition of additional data. Three-dimensional data can be acquired using multiple video cameras 220 or by combining one or more video cameras with one or more depth sensors. The video cameras 220 can utilize visible light, infrared light, or other optical wavelength ranges. Depth sensors can be based on infrared, laser, or other wavelength emitters that transmit light to an object. Depth sensors typically determine the distance to the object from the light that is reflected or backscattered by it. Alternatively, depth sensors can utilize acoustic signals to determine distance. In one embodiment, depth sensing is integrated into the video cameras 220.
Image frames are acquired from the video cameras 220. A video camera system with depth sensing capability typically outputs video (e.g., RGB, CYMG) and depth field information. Video may optionally be encoded to a well-known format, such as MPEG. The optical and depth information are stitched together; open libraries such as OpenCV and OpenNI (used to capture depth images) enable this stitching. Alternatively, a user of the personalized navigation system 100 may develop customized software for generating 3D information or object data from the optical images and depth sensors.
An initial calibration can be performed over multiple image frames to determine background information both for the 2D optical images and for the depth sensing. During the calibration, any motion (e.g., people) is extracted or ignored until stable background optical (RGB) and depth information can be stored, for example, in the database 160. Calibration may be performed periodically, or may be initiated by the personalized navigation system 100, for example, if errors are detected.
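One simple way to realize such a calibration is a per-pixel median over a stack of frames, which suppresses transient motion such as people walking through the scene; this is a sketch of one possible approach under that assumption, not the only one.

```python
import numpy as np

def calibrate_background(frames):
    """Estimate a stable background image from a list of frames (RGB or depth)
    by taking the per-pixel median, which rejects transient motion."""
    stack = np.stack(frames).astype(np.float32)   # shape: (N, H, W[, C])
    return np.median(stack, axis=0)

# background_rgb = calibrate_background(rgb_frames)
# background_depth = calibrate_background(depth_frames)
# The results could then be stored, e.g., in the database 160, for later masking.
```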
After calibration is complete, the resulting spatial filter masks can be used to extract an “area of interest” for each video camera 220. For example, for a video camera located near an entrance to the business enterprise 110, the area of interest may correspond to the area between the background and a foreground (area where a person is expected to be), so that everything that is not walls, doors, or other infrastructure (for background) and also not a detected person, is ignored. This ignoring of the background and foreground focuses on data within the depth threshold of the area of interest being monitored. Alternatively, the “area of interest” can include a different part of the scene, for example, the foreground in order to see where the person is in later recognition steps and can be expanded or contracted as system requirements dictate. In general, the area of interest applies to any cut-out of a scene that is to be the focus within which to perform person tracking.
According to certain embodiments, multiple image frames (e.g., N−1 and N) are obtained and compared, and in certain examples the image frames can include depth information in addition to RGB (color) data, as discussed above. Image and depth information can be filtered for noise and then processed to determine if a difference between two frames exists. This can be done with edge detection, threshold and difference algorithms, or other image processing techniques. In certain examples, information from the depth sensors is also processed to compare image frames. The system can use changes between image frames, in particular, changes in the position or orientation of a detected person, to track the movement of the person. In some embodiments, change detection can be limited to the area of interest to increase processing speed.
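A minimal sketch of the difference-and-threshold step, restricted to the area of interest, might look like the following; the threshold value and the use of a simple Gaussian blur for noise filtering are illustrative choices.

```python
import cv2

def detect_change(frame_prev, frame_curr, roi_mask, thresh=30):
    """Compare consecutive frames within the area of interest and return a
    binary change mask (255 where the frames differ)."""
    gray_prev = cv2.cvtColor(frame_prev, cv2.COLOR_BGR2GRAY)
    gray_curr = cv2.cvtColor(frame_curr, cv2.COLOR_BGR2GRAY)
    gray_prev = cv2.GaussianBlur(gray_prev, (5, 5), 0)   # noise filtering
    gray_curr = cv2.GaussianBlur(gray_curr, (5, 5), 0)
    diff = cv2.absdiff(gray_prev, gray_curr)
    _, change = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    # Limit the result to the area of interest to speed up later processing.
    return cv2.bitwise_and(change, change, mask=roi_mask)
```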
In one embodiment, when the area of interest is determined, a "point cloud" is generated using the video camera's extrinsic and intrinsic parameters through algorithms for "2D to 3D" data representation conversion performed on the RGB and/or depth images obtained and processed through OpenNI and OpenCV. In one embodiment, the Point Cloud Library may be used. The object shape and location information generated from the Point Cloud Library are used to identify and track a person in three dimensions using edge detection, color detection, object recognition, and/or other algorithms for determining the presence of a person within the scene. If the object information is in the shape of a human, for example, then the process continues to track the person. However, if the size, shape, or other appearance information indicates that the object is not a person, subsequent image frames can be analyzed until a person is detected. In some examples, images captured by the video cameras 220 may include more than one person. Accordingly, the process may compare expected features and/or appearance attributes of the tracked person with persons detected in the image frames to continue to track the correct person.
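The "2D to 3D" conversion can be sketched as a back-projection of the depth image through the camera's intrinsic parameters, as below; a production system might instead rely on the Point Cloud Library as noted above, and the variable names here are illustrative.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into an N x 3 point cloud using
    the camera's intrinsic parameters (focal lengths fx, fy and principal
    point cx, cy)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack((x, y, z), axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]   # drop pixels with no valid depth reading
```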
As discussed above, the central processing unit 150 can merge the acquired images from the video cameras 220 into a map to be able to track identified persons as they moved through the business enterprise. In certain examples, the application software running on the mobile device 140 can be configured to display the map or a similar map view or virtual layout of the floor plan of the business enterprise 110, such that the tracked person can view their location within the business enterprise. The central processing unit 150 can send commands to the RF transmitters 230—by way of the networking device 240—to transmit RF signals carrying the updated location of the tracked person, which can be determined using image processing techniques as discussed above. The mobile device 140—with its RF receiver—receives these signals and registers the updated location of the person within the application software, which can show the location of the person within the virtual layout of the business enterprise 110 displayed on the mobile device 140.
In addition to identifying the desired items 330, the central processing unit 150 can notify the person of an item that may be of interest, as the person's current location approaches the location of that item, even if that item is not on the shopping list. Such an advertisement may be based on the shopping history of the person, for example. As discussed above, in certain examples the information provided from the mobile device 140 to the RF node 120 (and therefore to the central processing unit 150) can include a service request. Accordingly, in such examples instead of or in addition to displaying the locations of the desired items 330, the location of the service desk or other relevant information can be displayed on the map, and the route 320 can be configured to guide the person to that location.
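A sketch of such a proximity-based notification check follows, assuming the item's shelf location and the tracked person's location are expressed in a common planar coordinate system; the distance threshold and message wording are illustrative.

```python
import math

def maybe_notify(current_xy, item_xy, item_name, radius_m=3.0):
    """Return an alert message when the tracked location comes within a few
    meters of an item of potential interest, otherwise None."""
    dist = math.dist(current_xy, item_xy)
    if dist <= radius_m:
        return f"Nearby: {item_name} is about {dist:.0f} m away."
    return None
```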
For example, a retailer's costs can be reduced with a system in place that automatically keeps track of the inventory on shelves and of what customers take from those shelves. The ability to track inventory and the products customers remove from shelves can improve the cost basis for retailers by eliminating the need for cashiers or for extra staff to constantly inspect shelves to determine which items need replacing and re-stocking. In addition, the system can update the shopping list received from a tracked person based on the items found and taken by the tracked person, and update the displayed route 320 based on the progress made by the tracked person.
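A sketch of how the remaining route might be re-planned as items are taken is shown below, using a simple nearest-neighbor ordering from the shopper's current position; a deployed system could instead route along the aisles 210, and the names and data structures are illustrative.

```python
import math

def replan_route(current_xy, remaining_items):
    """Re-order the items still on the shopping list by a nearest-neighbor
    heuristic from the shopper's current position.

    remaining_items: dict mapping item name -> (x, y) shelf location."""
    route, position = [], current_xy
    items = dict(remaining_items)
    while items:
        name = min(items, key=lambda n: math.dist(position, items[n]))
        route.append(name)
        position = items.pop(name)
    return route

# After the shopper picks up an item, drop it from the list and re-plan:
# remaining.pop("milk", None); route_320 = replan_route(tracked_xy, remaining)
```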
It is appreciated that variations of image processing can be used for shelf and product tracking. One aspect of the system 400 includes a product recognition camera 420 facing the shelves to view which products are on the shelf and which products are removed by customers. The system may have one or more shelf-facing cameras 420 with a view angle 422 focused on the shelf. However, one or more shelf-focused product recognition cameras 420 may not be sufficient on their own: two people may reach for products in the same area, potentially even crossing arms while reaching for their individual products, and may block the view of the product tracking camera(s) 420 when reaching for and grabbing a product on the shelf.
Thus, an embodiment of the system incorporates an additional outward-looking (aisle-facing) camera 430. In this embodiment, at least two cameras are mounted on an integrated arm mount 440: at least one first product tracking camera 420 oriented to focus on the products on the shelf, and at least one second aisle tracking camera 430 oriented to focus on the aisle and the customers doing the shopping. Both cameras can be video cameras, and both can be among the video cameras 220-1 through 220-n (generally 220) placed throughout the business enterprise, as discussed above, to provide full coverage of the interior of the business enterprise 110. Thus, in this embodiment, at least one camera 420 (the "shelf tracking camera") may be used primarily for product recognition on the shelf, and at least one additional camera 430 (the "aisle tracking camera") may be used primarily for customer skeletal tracking to confirm where that customer is reaching.
Some advantages of this embodiment of the system 400 are that, by using at least one aisle tracking camera 430 focused into the aisle and on the shopper, the system can eliminate occlusion issues caused by the shopper standing in front of the shelf-facing camera 420 or any of the other video cameras 220. In addition, the combination of the first shelf-facing camera 420 and the second aisle-facing camera 430 can prevent the system from confusing which item was taken when two shoppers reach into the same area for products and either cross arms or occlude a camera, which could otherwise cause the system to charge the wrong customer for the item taken.
Aspects of this embodiment of the system 400 having the dual cameras include accomplishing multiple functions from the at least one first camera 420 and the at least one second camera 430, including shopper registration, shopper movement tracking, product identification on retail shelving, inventory tracking, and monitoring the amount of product on the shelving.
It is appreciated that aspects of the cameras and the system can also include color sensing and comparison and depth sensing, which can be accomplished, for example, with infrared sensing. The first shelf tracking camera 420 can use either or both of color and depth sensing to register the products' positions and to recognize the actual products on the shelf. The second aisle tracking camera 430 can use depth sensing to perform skeletal tracking to confirm where each customer is reaching. Confirmation of which customer selected which product on the shelf is achieved by combining the product identification and removal (from shelf) detection provided by the shelf camera 420 with the position of the person's arm in relation to the item and, upon removal, the item actually in the hand of the customer, as provided by the aisle camera 430. Fusing the functions of these two cameras provides a much more robust method for confirming what was taken off the shelf and by which shopper.
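A minimal sketch of this fusion step follows, assuming the shelf camera 420 reports a removal event with the item's shelf position and the aisle camera 430 reports tracked hand positions in the same coordinate system; the data structures and the reach threshold are illustrative.

```python
import math

def attribute_pick(removal_event, tracked_hands, max_reach_m=1.0):
    """Fuse a product-removal event from the shelf camera with skeletal hand
    positions from the aisle camera: the shopper whose hand was closest to the
    removed item's shelf position (within reach) is credited with the pick.

    removal_event: {"item": name, "shelf_position": (x, y, z)}
    tracked_hands: dict mapping shopper id -> (x, y, z) hand position."""
    item_xyz = removal_event["shelf_position"]          # from shelf camera 420
    best_shopper, best_dist = None, max_reach_m
    for shopper_id, hand_xyz in tracked_hands.items():  # from aisle camera 430
        dist = math.dist(item_xyz, hand_xyz)
        if dist <= best_dist:
            best_shopper, best_dist = shopper_id, dist
    return best_shopper   # None if no hand was close enough to attribute the pick
```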
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, and computer program product. Thus, aspects of the present invention may be embodied entirely in hardware, entirely in software (including, but not limited to, firmware, program code, resident software, microcode), or in a combination of hardware and software. In addition, aspects of the present invention may be in the form of a computer program product embodied in one or more computer readable media having computer readable program code stored thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable medium may be a non-transitory computer readable storage medium, examples of which include, but are not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination thereof.
As used herein, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, device, computer, computing system, computer system, or any programmable machine or device that inputs, processes, and outputs instructions, commands, or data. A non-exhaustive list of specific examples of a computer readable storage medium includes an electrical connection having one or more wires, a portable computer diskette, a floppy disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), a USB flash drive, a non-volatile RAM (NVRAM or NOVRAM), an erasable programmable read-only memory (EPROM or Flash memory), a flash memory card, an electrically erasable programmable read-only memory (EEPROM), an optical fiber, a portable compact disc read-only memory (CD-ROM), a DVD-ROM, an optical storage device, a magnetic storage device, or any suitable combination thereof.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. As used herein, a computer readable storage medium is not a computer readable propagating signal medium or a propagated signal.
Program code may be embodied as computer-readable instructions stored on or in a computer readable storage medium as, for example, source code, object code, interpretive code, executable code, or combinations thereof. Any standard or proprietary programming or interpretive language can be used to produce the computer-executable instructions. Examples of such languages include C, C++, Pascal, JAVA, BASIC, Smalltalk, Visual Basic, and Visual C++.
Transmission of program code embodied on a computer readable medium can occur using any appropriate medium including, but not limited to, wireless, wired, optical fiber cable, radio frequency (RF), or any suitable combination thereof.
The program code may execute entirely on a user's device, such as the mobile device 140, partly on the user's device, as a stand-alone software package, partly on the user's device and partly on a remote computer or entirely on a remote computer or server. Any such remote computer may be connected to the user's device through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet, using an Internet Service Provider).
Additionally, the methods of this invention can be implemented on a special purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as a discrete element circuit, or a programmable logic device such as a PLD, PLA, FPGA, PAL, or the like. In general, any device capable of implementing a state machine that is in turn capable of implementing the methods proposed herein can be used to implement the principles of this invention.
Furthermore, the disclosed methods may be readily implemented in software using object or object-oriented software development environments that provide portable source code that can be used on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or a VLSI design. Whether software or hardware is used to implement the systems in accordance with this invention depends on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized. The methods illustrated herein, however, can be readily implemented in hardware and/or software using any known or later developed systems or structures, devices, and/or software by those of ordinary skill in the applicable art from the functional description provided herein and with a general basic knowledge of the computer and image processing arts.
Moreover, the disclosed methods may be readily implemented in software executed on a programmed general-purpose computer, a special purpose computer, a microprocessor, or the like. In these instances, the systems and methods of this invention may be implemented as a program embedded on a personal computer, such as a JAVA® or CGI script, as a resource residing on a server or graphics workstation, as a plug-in, or the like. The system may also be implemented by physically incorporating the system and method into a software and/or hardware system.
Having described above several aspects of at least one embodiment, it is to be appreciated various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be part of this disclosure and are intended to be within the scope of the invention. Embodiments of the methods and apparatuses discussed herein are not limited in application to the details of construction and the arrangement of components set forth in the foregoing description or illustrated in the accompanying drawings. The methods and apparatuses are capable of implementation in other embodiments and of being practiced or of being carried out in various ways. Examples of specific implementations are provided herein for illustrative purposes only and are not intended to be limiting. Also, the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use herein of “including,” “comprising,” “having,” “containing,” “involving,” and variations thereof is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. References to “or” may be construed as inclusive so that any terms described using “or” may indicate any of a single, more than one, and all the described terms. Any references to front and back, left and right, top and bottom, upper and lower, and vertical and horizontal are intended for convenience of description, not to limit the present systems and methods or their components to any one positional or spatial orientation. Accordingly, the foregoing description and drawings are by way of example only, and the scope of the invention should be determined from proper construction of the appended claims, and their equivalents.
This application claims the benefit under 35 U.S.C. § 119(e) of U.S. Provisional Application No. 62/432,876 titled “SYSTEM AND METHOD OF PERSONALIZED NAVIGATION INSIDE A BUSINESS ENTERPRISE” and filed on Dec. 12, 2016, which is herein incorporated by reference in its entirety for all purposes.