Dynamically created and updated indoor positioning map

Information

  • Patent Grant
  • Patent Number
    10,395,116
  • Date Filed
    Thursday, October 29, 2015
  • Date Issued
    Tuesday, August 27, 2019
Abstract
A system for creating and/or dynamically updating indoor positioning maps includes an augmented reality (AR) device and a computing device communicatively coupled to the AR device. The AR device has a display for displaying AR content to a user that overlaps the AR device's perspective view of an environment. The AR device also has one or more depth sensors for gathering mapping data of physical objects in the environment. The computing device has a processor that is configured by software to create a three-dimensional (3D) indoor positioning map of the environment in a building based on the mapping data gathered by the AR device, and/or dynamically update the 3D indoor positioning map of the environment in the building based on the mapping data gathered by the AR device.
Description
FIELD OF THE INVENTION

The present invention relates to indoor positioning maps and more specifically, to a method of dynamically creating and updating indoor positioning maps utilizing augmented reality (AR) or an AR device.


BACKGROUND

Mixed reality refers to the merging of real and virtual (i.e., computer-generated) worlds. Augmented reality (AR) lies within the spectrum of mixed reality experiences. The use of AR devices is becoming more prevalent. AR devices are typically worn on a user's head and are used to display information that augments the user's visual experience. The AR experience is created by presenting AR content (e.g., text, graphics, images, etc.) that overlays the user's field of view (FOV). This AR content is typically positioned so that it lends context to things (e.g., objects, people, etc.) within the user's immediate environment. When used in the workplace, a worker may use this information to analyze and understand their environment, leading to enhanced productivity and effectiveness.


However, the AR experience and AR devices have not conventionally been used for dynamically creating and/or updating indoor positioning maps.


Generally speaking, there are several different approaches to indoor positioning, which require varying degrees of infrastructure capital expenditure. Some of the less expensive systems use known locations of RF anchor nodes, such as Bluetooth low energy beacons and WiFi access points, and even magnetic fields to calculate position. With these types of systems, a mapping of RF signal strengths at different locations within a building needs to be created. This can be time-consuming and is subject to environmental changes that affect the map's accuracy over time.


Therefore, a need exists for a method and system that dynamically creates and updates indoor positioning maps utilizing AR devices.


SUMMARY

Accordingly, in one aspect, the present invention embraces a system for dynamically creating and/or updating indoor positioning maps. The system may generally include an augmented reality (AR) device and a computing device communicatively coupled to the AR device. The AR device has a display for displaying AR content to a user that overlaps the AR device's perspective view of an environment. The AR device also has one or more depth sensors for gathering mapping data of physical objects in the environment. The computing device may have a processor that is configured by software to dynamically create and/or update a three-dimensional (3D) indoor positioning map of the environment in a building based on the mapping data gathered by the AR device.


In an exemplary embodiment, the dynamic creation and/or updating of the 3D indoor positioning map may include utilizing simultaneous localization and mapping (SLAM) techniques for positioning within the 3D map. The SLAM techniques utilized for positioning may include collecting available environmental data at various points within the building. The environmental data collected may include any available environmental data, including, but not limited to, all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, the like, and/or combinations thereof. In addition, the environmental data collected may include strengths and/or intensities associated with each collection and an orientation of the AR device at the time of each collection. For example, the RF signal information collected may contain signal strength, transmit power, and/or an identifier for the RF source. As another example, the light collected may contain an intensity, color, and/or frequency of light. The orientation of the AR device may be recorded for each location where environmental data may be collected. The collected environmental data may then be sent to the computing device and saved with the specific location and orientation in the 3D indoor positioning map.
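
By way of a non-limiting illustration only, one possible way to structure such a collected observation before it is sent to the computing device is sketched below in Python. The record layout and field names (e.g., `rf_signals`, `orientation_quat`) are assumptions made for this sketch and are not specified by the disclosure.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class RFReading:
    source_id: str                          # identifier for the RF source (e.g., a beacon ID)
    rssi_dbm: float                         # received signal strength
    tx_power_dbm: Optional[float] = None    # transmit power, if available

@dataclass
class LightReading:
    intensity_lux: float                    # measured light intensity
    color_temp_k: Optional[float] = None    # color/frequency information, if available

@dataclass
class EnvironmentalObservation:
    """One collection point to be saved into the 3D indoor positioning map."""
    position_xyz: tuple                     # AR device location in map coordinates
    orientation_quat: tuple                 # AR device orientation at the time of collection
    rf_signals: list = field(default_factory=list)    # list of RFReading
    light: Optional[LightReading] = None
    magnetic_field_ut: Optional[tuple] = None          # 3-axis magnetometer reading
    ambient_sound_db: Optional[float] = None
```

Each such observation, keyed by its location and orientation, could then be stored alongside the 3D indoor positioning map.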


In another exemplary embodiment, the computing device may include a remote program for communicating with a remote device. The remote device may be any remote device with at least one remote sensor for collecting and sharing remote data. The computing device may determine the location of the remote device on the 3D indoor positioning map by comparing the remote data with the recorded environmental data. The location of the remote device on the 3D indoor positioning map may include a probability of accuracy based on a number of degrees of freedom of the remote device.
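
Although the disclosure does not prescribe a particular comparison algorithm, a minimal sketch of one plausible approach, nearest-neighbor matching of RF fingerprints, is shown below; the function name and data layout are assumptions for illustration.

```python
import math

def estimate_remote_location(remote_rssi: dict, fingerprint_map: list):
    """Return the stored map location whose RF fingerprint best matches the
    remote device's readings, together with a match score (lower is closer).

    remote_rssi:     {source_id: rssi_dbm} reported by the remote device
    fingerprint_map: list of (position_xyz, {source_id: rssi_dbm}) records
                     previously saved in the 3D indoor positioning map
    """
    best_pos, best_score = None, float("inf")
    for position, stored_rssi in fingerprint_map:
        common = set(remote_rssi) & set(stored_rssi)
        if not common:
            continue
        # Euclidean distance in signal-strength space over the shared RF sources
        score = math.sqrt(sum((remote_rssi[s] - stored_rssi[s]) ** 2 for s in common))
        if score < best_score:
            best_pos, best_score = position, score
    return best_pos, best_score
```

A probability of accuracy could then be derived from the match score and from how many independent measurements (degrees of freedom) the remote device was able to report.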


In another exemplary embodiment, the AR device may be configured to provide guidance for gathering missing data and/or data older than a defined refresh period. For such guidance, the processor of the AR device may be configured to create guidance AR content corresponding to the AR device's perspective view of the environment, and transmit the 3D indoor positioning map with the guidance AR content to the display. The missing data and/or data older than a defined refresh period, collected by the depth sensors while the AR device is guided through the building, may be sent to the computing device and constantly updated in real time. The guidance AR content may include visible instructions to rotate through 360 degrees at each location so that an omni-directional mapping of the environment is recorded at each location. As an example, the visible instructions may include an AR visual graphic that shows how to turn and how long to stay at each location so that an accurate reading can be recorded.
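
A minimal sketch of how stale or missing map data might be identified, so that guidance targets can be generated, is shown below; the cell representation and the one-week refresh period are assumptions for illustration only.

```python
import time

REFRESH_PERIOD_S = 7 * 24 * 3600  # assumed refresh period of one week

def cells_needing_guidance(map_cells: dict, now: float = None) -> list:
    """Return map locations whose mapping data is missing or older than the
    refresh period.

    map_cells: {cell_xyz: last_update_epoch_seconds or None}; a None timestamp
    means no data has ever been collected for that cell.
    """
    if now is None:
        now = time.time()
    targets = []
    for cell, last_update in map_cells.items():
        if last_update is None or (now - last_update) > REFRESH_PERIOD_S:
            targets.append(cell)
    return targets  # the guidance AR content would direct the user to these locations
```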


In another exemplary embodiment, the processor of the computing device may be further configured to create a two dimensional (2D) view of the indoor positioning map, and show at least one location of a remote device on the 2D view of the indoor positioning map in real time. The computing device may include a remote program for communicating the 2D view of the indoor positioning map, with the remote device's location, to the remote device.
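
One simple way such a 2D view could be derived from the 3D indoor positioning map is to collapse the vertical axis into a floor index, as sketched below; the assumed floor height and function name are illustrative only.

```python
def project_to_2d(positions_3d: dict, floor_height_m: float = 3.0) -> dict:
    """Collapse 3D map coordinates into per-floor 2D coordinates for display
    on a 2D floor plan.

    positions_3d: {device_id: (x, y, z)} locations on the 3D map
    Returns {device_id: (floor_index, x, y)}.
    """
    view = {}
    for device_id, (x, y, z) in positions_3d.items():
        floor = int(z // floor_height_m)   # assume a uniform floor height
        view[device_id] = (floor, x, y)
    return view
```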


In another exemplary embodiment, the at least one depth sensor may include an optical 3D scanner.


In another exemplary embodiment, the display may be a head mounted display (HMD). The HMD may comprise a transparent plate that may be positioned in front of the user's eye or eyes, allowing the user to view the environment through the transparent plate. The transparent plate may also be arranged to display AR content to the user's eye or eyes so that the AR content appears superimposed on the user's view of the environment.


In another aspect, the present invention embraces a method for dynamically creating and/or updating indoor positioning maps. The method may generally include the steps of: collecting position information from at least one depth sensor of an AR device; dynamically creating a three-dimensional (3D) indoor positioning map based on the received position information from the AR device; and/or dynamically updating the 3D indoor positioning map based on the received position information from the AR device.


In an exemplary embodiment, the steps of creating the 3D indoor positioning map and/or dynamically updating the 3D indoor positioning map may include positioning the AR device using simultaneous localization and mapping (SLAM) techniques. The SLAM techniques used for positioning may include, but are not limited to, collecting any available environmental data at various points within the building. The environmental data may include all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, the like, and/or combinations thereof. In addition, the environmental data collected may include strengths and/or intensities associated with each collection and an orientation of the AR device at the time of each collection. For example, the RF signal information collected may contain signal strength, transmit power, and/or an identifier for the RF source. As another example, the light collected may contain an intensity, color, and/or frequency of light. The orientation of the AR device may be recorded at each location where environmental data may be collected. The collected environmental data may then be sent to the computing device and saved with the specific location and orientation in the 3D indoor positioning map.


In another exemplary embodiment, the method for creating and/or updating indoor positioning maps may further include the step of guiding the AR device to gather missing data and/or data older than a defined refresh period. The step of guiding may include creating guidance AR content corresponding to the AR device's perspective view of the environment, and transmitting the 3D indoor positioning map with the created guidance AR content to the display.


In another aspect, the present invention embraces an augmented reality (AR) device. The AR device generally includes a display, one or more depth sensors, and a processor. The display may display guidance AR content to a user that may overlap the AR device's perspective view of an environment. The one or more depth sensors may gather mapping data of physical objects in the environment. The processor may be communicatively coupled to the one or more depth sensors. The processor may be configured by software to: dynamically create and/or update a three-dimensional (3D) indoor positioning map of the environment in a building based on the mapping data gathered by the AR device.


In an exemplary embodiment, the AR device may use simultaneous localization and mapping (SLAM) techniques for the positioning of the AR device in the dynamic creation and/or updating of the 3D indoor positioning map. The SLAM techniques used for positioning may include, but are not limited to, collecting any available environmental data at various points within the building. The environmental data may include all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, the like, and/or combinations thereof. In addition, the environmental data collected may include strengths and/or intensities associated with each collection and an orientation of the AR device at the time of each collection. For example, the RF signal information collected may contain a signal strength, transmit power, and/or an identifier for the RF source. As another example, the light collected may contain an intensity, color, and/or frequency of light. The orientation of the AR device may be recorded at each location where environmental data may be collected. The collected environmental data may then be sent to the computing device and saved with the specific location and orientation in the 3D indoor positioning map.


In another exemplary embodiment, the AR device may be configured to provide guidance for gathering missing data and/or data older than a defined refresh period. In this guidance embodiment, the processor of the AR device may be configured to create guidance AR content corresponding to the AR device's perspective view of the environment, and transmit the 3D indoor positioning map with the guidance AR content to the display.


In another exemplary embodiment, the at least one depth sensor of the AR device may include an optical 3D scanner.


In another exemplary embodiment, the display of the AR device may be a head mounted display (HMD). The HMD may include a transparent plate that may be positioned in front of the user's eye or eyes, allowing the user to view the environment through the transparent plate. The transparent plate may also be arranged to display AR content to the user's eye or eyes so that the AR content appears superimposed on the user's view of the environment.


In yet another exemplary embodiment, the display of the AR device may include a liquid crystal display (LCD).


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 graphically depicts, according to an embodiment of the present invention, a user wearing an exemplary AR device and an exemplary output of said AR device seen by the user.



FIG. 2 schematically depicts a system/method for creating and/or dynamically updating indoor positioning maps according to an embodiment of the present invention.





DETAILED DESCRIPTION

The present invention embraces a system, method, and device that utilize augmented reality (AR) and AR devices for creating and/or updating indoor positioning maps. AR systems allow a user to view and (in some cases) interact with an enhanced version of the physical world. AR systems combine a user's perspective view of the physical world (i.e., the user's environment) with virtual objects. The virtual objects may be overlaid and positioned within the user's perspective view to provide contextually relevant information.


Virtual objects may include graphics or text and may be presented in two dimensions (2D) and/or three dimensions (3D). The virtual objects (i.e., AR content) are continually updated (e.g., in real time) to correspond with a user's change in perspective. As such, AR systems typically include body-worn cameras/displays (e.g., head mounted display) or hand-held cameras/displays (e.g., smartphone, tablet, etc.).


A head mounted display (HMD) may be part of an AR system. One possible HMD type is the video see-through HMD. Here, the environment is presented as a video stream to the user via a display (e.g., a liquid crystal display). Another possible HMD type is the optical see-through HMD (e.g., smart glasses), wherein the user looks through a transparent plate. The transparent plate is configured to display AR content so the AR content is overlaid with the user's perspective view of the environment.


An exemplary AR device is shown in FIG. 1. The AR device 12 is a smart glasses type HMD (e.g., MICROSOFT™ HOLOLENS™). When a user 2 wears the AR device like a pair of glasses, AR content 15 is presented to both eyes. This AR content may appear 3D as a result of the stereoscopic view and the display's ability to create “holograms” of virtual objects. The user's perspective view 3 of an environment 4 is presented to the user with AR content 15 overlaid and positioned to help the user understand the environment 4.


The AR content 15 may change in response to movement of the AR device 12 within the environment (i.e., position). These changes typically occur in real time allowing a user to move freely while the AR content 15 updates appropriately to match changes in the user's perspective.


Tracking of the AR device's position/orientation is required to update the AR content 15 appropriately. Tracking may utilize one or more sensors to determine the user's position/orientation. For example, inertial measurement sensors (e.g., gyroscope, accelerometer, magnetometer, etc.) may facilitate tracking. In addition, tracking may also utilize depth sensors.


Depth sensing may be used to create range images of the AR system's perspective. Range images are images with pixel values corresponding to the range between the AR system and points within the AR system's field of view.
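
For illustration only, the sketch below shows how a single range-image pixel can be back-projected into a 3D point using a simple pinhole camera model; the model and parameter names are generic assumptions and not a description of any particular depth sensor.

```python
def range_pixel_to_point(u: int, v: int, depth_m: float,
                         fx: float, fy: float, cx: float, cy: float) -> tuple:
    """Back-project one range-image pixel into a 3D point in the sensor frame.

    (u, v)   pixel coordinates
    depth_m  depth value stored at that pixel (taken here as depth along the optical axis)
    fx, fy   focal lengths in pixels; (cx, cy) principal point
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)
```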


Depth sensors (i.e., range cameras) may produce these range images using one of several possible techniques (e.g., stereo triangulation, sheet of light triangulation, structured light, time of flight, interferometry, coded aperture, etc.). Structured light depth sensors, for example, illuminate an environment with a specially designed light pattern (e.g., points, checkerboard, lines, etc.). The reflected light pattern is compared to a reference pattern to obtain a range image.


AR systems may include a camera to aid tracking and mapping. This camera (e.g., CCD camera, CMOS camera, etc.) is typically aligned with the perspective view of the user. The images captured by the camera may be processed by processors running algorithms (such as simultaneous localization and mapping (SLAM)) to track and map. SLAM algorithms may aid in the creation of maps (i.e., models) of the environment, which include the locations of physical objects and/or light sources in the environment.


Detecting light sources for mapping may be accomplished using the camera or by using one of a variety of possible photo sensor types (e.g., photodiodes, phototransistors, etc.). For example, light levels measured by the light sensor (e.g., camera, photo sensor, etc.) may be compared to a threshold as part of a light-source detection process.
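
A minimal sketch of such a threshold-based light-source detection step is given below; the normalized image representation and the threshold value are assumptions for illustration.

```python
def detect_light_sources(intensity_image: list, threshold: float = 0.9) -> list:
    """Return (row, col) coordinates of pixels whose measured light level
    exceeds the threshold, as candidate light sources.

    intensity_image: 2D list of normalized light levels in [0, 1]
    """
    return [(r, c)
            for r, row in enumerate(intensity_image)
            for c, level in enumerate(row)
            if level > threshold]
```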


However, the AR experience and AR devices are not known to be utilized for dynamically creating and/or updating indoor positioning maps. Therefore, the instant disclosure recognizes the need for systems, methods, and devices that dynamically create and/or update indoor positioning maps utilizing such AR devices.


Referring to FIG. 2, an embodiment of a system 10 for dynamically creating and/or updating indoor positioning maps is shown. The system 10 may generally include AR device 12 and computing device 18 communicatively coupled to the AR device 12 (e.g., via a wireless connection such as WiFi). AR device 12 may have display 14 for displaying AR content 15 to the user 2 that overlaps the AR device's perspective view 3 of the environment 4. AR device 12 may also have one or more depth sensors 16 for gathering mapping data of physical objects in the environment 4. The computing device 18 may have a processor that is configured by software to dynamically create and/or update a three-dimensional (3D) indoor positioning map of the environment 4 in a building based on the mapping data gathered by the AR device 12.


The dynamic creation and/or updating of the 3D indoor positioning map may include utilizing simultaneous localization and mapping (SLAM) techniques for positioning of the AR device 12 in the environment 4 while dynamically creating and/or updating the 3D indoor positioning map. The SLAM techniques utilized for positioning may include, but are not limited to, collecting available or perceivable environmental data at various points within the building of environment 4. The environmental data collected may include any available environmental data, including, but not limited to, all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, the like, and/or combinations thereof. In addition, the environmental data collected may include strengths and/or intensities associated with each collection and an orientation of the AR device 12 at the time of each collection. Recording the orientation of the device may be important, as the signal strengths may change depending on the direction AR device 12 is facing. For example, the RF signal information collected may contain a signal strength and an identifier (e.g., MAC address) for the RF source. The RF signal information collected might also include the transmit power, which may be needed for signal strength multilateration. As another example, the light collected may contain an intensity, color, and/or frequency of light. The orientation of the AR device 12 may be recorded for each location where environmental data may be collected. The collected environmental data may then be sent to the computing device 18 and saved with the specific location and orientation in the 3D indoor positioning map.
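
To illustrate why the transmit power is worth recording, the sketch below converts a received signal strength into an estimated distance using a log-distance path-loss model; the model, its exponent, and the treatment of the transmit power as the expected RSSI at a 1 m reference distance are assumptions made for this sketch, since indoor propagation varies widely.

```python
def rssi_to_distance_m(rssi_dbm: float, tx_power_dbm: float,
                       path_loss_exponent: float = 2.0) -> float:
    """Estimate the distance to an RF source from its received signal strength
    using a log-distance path-loss model.

    tx_power_dbm is treated as the expected RSSI at a 1 m reference distance.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))
```

Distances estimated to three or more RF sources at known locations could then be combined, for example by least squares, to multilaterate a position.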


The computing device 18 may be any computing device, such as a processor, a server, and/or combinations thereof, that is in communication with AR device 12 for dynamically creating/updating 3D indoor positioning maps. Computing device 18 may be remote to AR device 12 and/or coupled with AR device 12.


The computing device 18 may include a remote program for communicating with a remote device. The remote device may be any remote device with at least one remote sensor for collecting remote data, including environmental data for use with SLAM techniques. This remote program of the computing device 18 could then allow less capable devices (i.e., devices less capable than AR device 12) to share environmental data to aid in dynamically creating/updating the 3D indoor positioning map. The computing device may determine the location of the remote device on the 3D indoor positioning map by comparing the remote data with the recorded environmental data. In one embodiment, the location of the remote device on the 3D indoor positioning map may include a probability of accuracy based on a number of degrees of freedom of the remote device.


Referring to FIG. 1, one feature of the instant disclosure may be that AR device 12 may be configured to provide guidance for gathering missing data (i.e., environmental or structural information not yet captured, or new environmental data known to need updating) and/or data older than a defined refresh period (i.e., stale data). It may be important to update data frequently, as something as simple as a new table in a room could completely alter the RF signal paths within that room. For such guidance, the processor of AR device 12 may be configured to create guidance AR content 15 corresponding to the AR device's perspective view 3 of the environment 4, and transmit the 3D indoor positioning map with the guidance AR content 15 to the display 14. The missing data and/or data older than a defined refresh period, collected by the depth sensors 16 while AR device 12 is guided through the building, may be sent to the computing device and constantly updated in real time. As shown in FIG. 1, the guidance AR content 15 may include visible instructions to rotate through 360 degrees at each location so that an omni-directional mapping of the environment is recorded at each location. As shown in the example, the visible instructions may include an AR visual graphic that shows how to turn and how long to stay at each location so that an accurate reading can be recorded.


The processor of computing device 18 may be further configured to create a two dimensional (2D) view of the indoor positioning map, and show at least one location of the remote device on the 2D view of the indoor positioning map in real time. The remote program may then communicate the 2D view of the indoor positioning map, with the remote device's location, to the remote device. This 2D feature of the disclosure may show a device's user where the device is positioned at any given moment, as well as show an administrator where all of their assets are at any given moment.


Referring again to FIG. 2, in another aspect the present invention embraces a method for dynamically creating and/or updating indoor positioning maps. The method may generally include the steps of: a step 21 of collecting or gathering position information from at least one depth sensor 16 of AR device 12; step 22 of dynamically creating a three-dimensional (3D) indoor positioning map based on the received position information from AR device 12; and/or step 23 of dynamically updating the 3D indoor positioning map based on the received position information from AR device 12.


Steps 22 and 23 of creating the 3D indoor positioning map and/or dynamically updating the 3D indoor positioning map may include positioning AR device 12 in the 3D indoor positioning map using SLAM techniques. The SLAM techniques utilized for positioning may be any of the SLAM techniques as known and/or described herein.


The method for creating and/or updating indoor positioning maps may further include step 24 of guiding the AR device to provide missing data and/or data older than a defined refresh period. Step 24 of guiding may include step 25 of creating guidance AR content 15 corresponding to the AR device's perspective view 3 of the environment 4 for guidance, and step 26 of transmitting the 3D indoor positioning map with the created guidance AR content 15 to the display 14 for guidance.


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A system for creating and dynamically updating indoor positioning maps comprising: an augmented reality (AR) device comprising: a display for displaying AR content to a user, the AR content overlapping the AR device's perspective view of an environment; and one or more depth sensors for gathering mapping data of physical objects in the environment; a computing device communicatively coupled to the AR device, the computing device comprising a processor that is configured by software to: dynamically create a three-dimensional (3D) indoor positioning map of the environment in a building based on the mapping data gathered by the AR device; create guidance AR content in response to the mapping data of the physical objects in the environment in the 3D indoor positioning map being older than a refresh period, wherein the guidance AR content is created corresponding to the AR device's perspective view of the environment for guidance; transmit the 3D indoor positioning map with the guidance AR content to the display for the guidance to gather updated mapping data; and dynamically update the 3D indoor positioning map of the environment in the building based on the updated mapping data gathered by the AR device when at least the mapping data of the physical objects in the environment in the 3D indoor positioning map is older than the refresh period.
  • 2. The system according to claim 1, wherein the creation of the 3D indoor positioning map and/or the dynamically updating of the 3D indoor positioning map includes utilizing simultaneous location and mapping (SLAM) techniques for positioning, wherein the SLAM techniques utilized for positioning include collecting environmental data at various points within the building, where the environmental data includes all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, and/or combinations thereof, wherein: the RF signal information contains a signal strength, a transmit power, and/or an identifier for an RF source; and the light contains an intensity, a color and/or a frequency of light.
  • 3. The system according to claim 2, wherein the AR device is configured to record an orientation of the AR device and a location of the AR device with the environmental data, wherein the AR device is further configured to send the environmental data, the recorded location, and the recorded orientation to the computing device, wherein the processor of the computing device is configured to save the environmental data with the recorded location and the recorded orientation in the 3D indoor positioning map.
  • 4. The system according to claim 2, wherein the computing device includes a remote program for communicating with a remote device with at least one remote sensor for sharing remote data.
  • 5. The system according to claim 4, wherein the computing device may determine the location of the remote device on the 3D indoor positioning map by comparing the remote data shared with the environmental data, where the location of the remote device on the 3D indoor positioning map includes a probability of accuracy based on a number of degrees of freedom of the remote device.
  • 6. The system of claim 1, wherein the AR device is configured to send the missing data and/or the data older than the refresh period collected by the one or more depth sensors, while the AR device is guided through the building, to the computing device and constantly updated in real time.
  • 7. The system of claim 1, wherein the guidance AR content includes visible instructions to rotate around 360 degrees at each location so that an omnidirectional mapping of the environment is recorded at each location, the visible instructions include an AR visual graphic that shows how to turn and how long to stay at each location configured for an accurate reading to be recorded.
  • 8. The system of claim 1, wherein the processor of the computing device is further configured to: create a two dimensional (2D) view of the indoor positioning map; and show at least one location of a remote device on the 2D view of the indoor positioning map in real time; wherein the computing device includes a remote program for communicating with the remote device the 2D view of the indoor positioning map with its location.
  • 9. The system according to claim 1, wherein: the one or more depth sensors include an optical 3D scanner; and the display is a head mounted display (HMD) and comprises a transparent plate that is (i) positioned in front of the user's eye or eyes, allowing the user to view the environment through the transparent plate and (ii) arranged to display AR content to the user's eye or eyes so that the AR content appears superimposed on the user's view of the environment.
  • 10. A method for creating and updating indoor positioning maps, the method comprising: gathering mapping data of physical objects in an environment from at least one depth sensor of an AR device; dynamically creating a three-dimensional (3D) indoor positioning map based on the gathered mapping data from the AR device; creating guidance AR content in response to the mapping data of the physical objects in the environment in the 3D indoor positioning map being older than a refresh period, wherein the guidance AR content is created corresponding to the AR device's perspective view of the environment for guidance; transmitting the 3D indoor positioning map with the guidance AR content to a display in the AR device for the guidance to gather updated mapping data; and dynamically updating the 3D indoor positioning map based on the updated mapping data gathered at least when the mapping data of the physical objects in the environment in the 3D indoor positioning map is older than the refresh period.
  • 11. The method according to claim 10, wherein creating the 3D indoor positioning map and/or dynamically updating the 3D indoor positioning map further includes using simultaneous location and mapping (SLAM) techniques for positioning, including collecting environmental data at various points within a building, where the environmental data includes all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, and/or combinations thereof, wherein: the RF signal information contains a signal strength, transmit power, and/or an identifier for an RF source; and the light contains an intensity, color and/or frequency of light.
  • 12. The method according to claim 11, wherein collecting environmental data at various points within a building includes recording an orientation and a location of the AR device with the environmental data, and saving the recorded location and the recorded orientation of the AR device with the environmental data in the 3D indoor positioning map.
  • 13. An augmented reality (AR) device, comprising: a display for displaying guidance AR content to a user, the guidance AR content overlapping the AR device's perspective view of an environment; and one or more depth sensors for gathering mapping data of physical objects in the environment; and a processor communicatively coupled to the one or more depth sensors, the processor configured by software to: dynamically create a three-dimensional (3D) indoor positioning map of the environment in a building based on the mapping data gathered by the one or more depth sensors; and create guidance AR content in response to the mapping data of the physical objects in the environment in the 3D indoor positioning map being older than a refresh period, wherein the guidance AR content is created corresponding to the AR device's perspective view of the environment for guidance; transmit the 3D indoor positioning map with the guidance AR content to the display for the guidance to gather updated mapping data; and dynamically update the 3D indoor positioning map of the environment in the building based on the updated mapping data gathered by the AR device when at least the mapping data of physical objects in the environment in the 3D indoor positioning map is older than the refresh period.
  • 14. The AR device according to claim 13, wherein the creation of the 3D indoor positioning map and/or the dynamically updating of the 3D indoor positioning map includes utilizing simultaneous location and mapping (SLAM) techniques for positioning, wherein the SLAM techniques utilized for positioning include collecting environmental data at various points within a building, where the environmental data includes all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, and/or combinations thereof, wherein: the RF signal information contains a signal strength, a transmit power, and/or an identifier for an RF source; and the light contains an intensity, a color and/or a frequency of light.
  • 15. The AR device according to claim 14, wherein the processor is configured to record an orientation of the AR device and a location of the AR device with the environmental data, wherein the processor is further configured to save the recorded location and the recorded orientation of the AR device with the environmental data in the 3D indoor positioning map.
  • 16. The AR device according to claim 13, wherein: the one or more depth sensors include an optical 3D scanner; and the display is a head mounted display (HMD) and comprises a transparent plate that is (i) positioned in front of the user's eye or eyes, allowing the user to view the environment through the transparent plate and (ii) arranged to display AR content to the user's eye or eyes so that the AR content appears superimposed on the user's view of the environment.
  • 17. The AR device according to claim 16, wherein the display comprises a liquid crystal display (LCD).
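
Purely as an illustration of the refresh-period behavior recited in claims 1, 10, and 13 (create guidance AR content when a region's mapping data grows stale, then fold the newly gathered depth data back into the map), the following minimal Python sketch shows how such a check and update might be organized. All names, the region structure, and the one-day refresh period are hypothetical and are not the claimed implementation.

```python
from dataclasses import dataclass
from time import time

REFRESH_PERIOD_S = 24 * 60 * 60  # hypothetical refresh period: one day


@dataclass
class MapRegion:
    """One region of the 3D indoor positioning map (hypothetical structure)."""
    region_id: str
    mesh: object = None        # 3D geometry gathered by the depth sensors
    last_mapped: float = 0.0   # timestamp of the most recent mapping data


def stale_regions(regions, now=None):
    """Return regions whose mapping data is older than the refresh period."""
    now = now if now is not None else time()
    return [r for r in regions if now - r.last_mapped > REFRESH_PERIOD_S]


def guidance_content(regions_to_refresh):
    """Create guidance AR content directing the wearer toward stale regions."""
    return [{"region": r.region_id, "instruction": "walk to this area and look around"}
            for r in regions_to_refresh]


def apply_update(region, new_mesh, timestamp):
    """Dynamically fold freshly gathered depth data back into the map."""
    region.mesh = new_mesh
    region.last_mapped = timestamp
```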
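Claims 2, 3, 11, 12, 14, and 15 describe collecting environmental data (RF signal strength, transmit power and source identifiers, magnetic fields, lighting conditions, ambient sound) together with the recorded location and orientation of the AR device. A minimal sketch of one such fingerprint record, with hypothetical field names and units, might look like this:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class RFReading:
    source_id: str                        # e.g., a BLE beacon or WiFi access point identifier
    signal_strength_dbm: float
    transmit_power_dbm: Optional[float] = None


@dataclass
class Fingerprint:
    """Environmental data captured at one point while SLAM builds the map."""
    position: Tuple[float, float, float]             # recorded AR device location
    orientation: Tuple[float, float, float, float]   # recorded orientation (quaternion)
    rf: List[RFReading] = field(default_factory=list)
    magnetic_field_ut: Optional[Tuple[float, float, float]] = None
    light_intensity_lux: Optional[float] = None
    light_color: Optional[str] = None
    ambient_sound_db: Optional[float] = None


def save_fingerprint(fingerprints: List[Fingerprint], fp: Fingerprint) -> None:
    """Store the environmental data together with the recorded pose in the map."""
    fingerprints.append(fp)
```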
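Claims 4 and 5 describe locating a remote device by comparing its shared sensor data against the stored environmental data, with a probability of accuracy tied to the remote device's degrees of freedom. One simple, purely illustrative way to sketch such a comparison is nearest-neighbor matching over RF signal strengths; the distance metric, the fingerprint layout, and the probability heuristic below are assumptions, not the disclosed method.

```python
import math


def rssi_distance(scan_a, scan_b, floor_dbm=-100.0):
    """Euclidean distance between two {source_id: rssi_dbm} dictionaries."""
    keys = set(scan_a) | set(scan_b)
    return math.sqrt(sum((scan_a.get(k, floor_dbm) - scan_b.get(k, floor_dbm)) ** 2
                         for k in keys))


def locate_remote_device(remote_scan, fingerprints, remote_dof=3):
    """Pick the stored fingerprint whose RF environment best matches the remote scan.

    `fingerprints` is a list of {"position": (x, y, z), "rf": {source_id: rssi}} dicts.
    The returned probability is a placeholder heuristic: fewer degrees of freedom
    reported by the remote device yields lower confidence.
    """
    best = min(fingerprints, key=lambda fp: rssi_distance(remote_scan, fp["rf"]))
    probability = min(1.0, remote_dof / 6.0)   # hypothetical 6-DOF reference
    return best["position"], probability
```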
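Claim 7's guidance AR content instructs the wearer to rotate a full 360 degrees and to remain at each location long enough for an accurate reading. A rough sketch of how such an instruction could be chosen, assuming yaw samples in degrees and a hypothetical dwell threshold:

```python
def rotation_coverage(yaw_samples_deg):
    """Fraction of a full 360-degree turn covered by the recorded yaw samples."""
    covered_degrees = {int(yaw) % 360 for yaw in yaw_samples_deg}
    return len(covered_degrees) / 360.0


def guidance_instruction(yaw_samples_deg, dwell_seconds, required_dwell_seconds=5.0):
    """Choose the visible instruction shown at the current location."""
    if rotation_coverage(yaw_samples_deg) < 1.0:
        return "Keep turning until a full rotation is recorded."
    if dwell_seconds < required_dwell_seconds:
        return "Hold still a moment longer so an accurate reading can be recorded."
    return "This location is captured; move to the next highlighted spot."
```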
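Claim 8 adds a two-dimensional view of the indoor positioning map with a remote device's location shown in real time. A minimal sketch, assuming the map is available as 3D points and a floor is selected by a height band (both assumptions for illustration only):

```python
def project_to_2d(map_points, floor_z_min, floor_z_max):
    """Flatten the 3D map points belonging to one floor into a top-down 2D view."""
    return [(x, y) for (x, y, z) in map_points if floor_z_min <= z <= floor_z_max]


def two_d_view_with_remote_device(map_points, remote_position, floor_z_min, floor_z_max):
    """Bundle the 2D view with the remote device's live (x, y) location for display."""
    x, y, _ = remote_position
    return {"floor_plan": project_to_2d(map_points, floor_z_min, floor_z_max),
            "remote_device": (x, y)}
```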
Related Publications (1)
US 2017/0124396 A1, published May 2017 (US)