The present invention relates to indoor positioning maps and, more specifically, to a method of dynamically creating and updating indoor positioning maps utilizing augmented reality (AR) or an AR device.
Mixed reality refers to the merging of real and virtual (i.e., computer generated) worlds. Augmented reality (AR) lies within the spectrum of mixed reality experiences. The use of AR devices is becoming more prevalent. AR devices are typically worn on a user's head and are used to display information that augments the user's visual experience. The AR experience is created by presenting AR content (e.g., text, graphics, images, etc.) that overlays the user's field of view (FOV). This AR content is typically positioned so that it lends context to things (e.g., objects, people, etc.) within the user's immediate environment. When used in the workplace, a worker may use this information to analyze/understand the environment, leading to enhanced productivity and effectiveness.
However, the AR experience and AR devices are not known to be utilized for dynamically creating and/or updating indoor positioning maps.
Generally speaking, there are several different approaches to indoor positioning, which require varying degrees of infrastructure capital expenditure. Some of the less expensive systems use known locations of RF anchor nodes, such as Bluetooth low energy beacons and WiFi access points, and even magnetic fields to calculate position. With these types of systems, a mapping of RF signal strengths at different locations within a building needs to be created. This can be time-consuming, and the mapping is subject to environmental changes that degrade its accuracy over time.
Therefore, a need exists for a method and system that dynamically create and update indoor positioning maps utilizing AR devices.
Accordingly, in one aspect, the present invention embraces a system for dynamically creating and/or updating indoor positioning maps. The system may generally include an augmented reality (AR) device and a computing device communicatively coupled to the AR device. The AR device has a display for displaying AR content to a user that overlaps the AR device's perspective view of an environment. The AR device also has one or more depth sensors for gathering mapping data of physical objects in the environment. The computing device may have a processor that is configured by software to dynamically create and/or update a three-dimensional (3D) indoor positioning map of the environment in a building based on the mapping data gathered by the AR device.
In an exemplary embodiment, the dynamic creation and/or updating of the 3D indoor positioning map may include utilizing simultaneous location and mapping (SLAM) techniques for positioning within the 3D map. The SLAM techniques utilized for positioning may include collecting available environmental data at various points within the building. The environmental data collected may include any available environmental data, including, but not limited to, all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, the like, and/or combinations thereof. In addition, the environmental data collected may include strength and/or intensities associated with each collection and an orientation of the AR device at the time of each collection. For example, the RF signal information collected may contain signal strength, transmit power, and/or an identifier for the RF source. As another example, the light collected may contain an intensity, color, and/or frequency of light. The orientation of the AR device may be recorded for each location where environmental data may be collected. The collected environmental data may then be sent to the computing device and saved with the specific location and orientation in the 3D indoor positioning map.
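By way of illustration only, the following minimal sketch shows one way such a collection could be represented as a data record to be saved with a location in the 3D indoor positioning map; the field names, units, and structure are assumptions for the example and are not prescribed by this disclosure.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class RFObservation:
    source_id: str                               # e.g., a MAC address or beacon identifier
    signal_strength_dbm: float                   # received signal strength
    transmit_power_dbm: Optional[float] = None   # included if advertised by the source

@dataclass
class EnvironmentSample:
    """One environmental data collection at a known point and orientation in the 3D map."""
    position: Tuple[float, float, float]          # x, y, z in map coordinates
    orientation_deg: Tuple[float, float, float]   # yaw, pitch, roll of the AR device
    rf_observations: List[RFObservation] = field(default_factory=list)
    magnetic_field_ut: Optional[Tuple[float, float, float]] = None  # microtesla
    light_intensity_lux: Optional[float] = None
    ambient_sound_db: Optional[float] = None
    timestamp_s: float = 0.0
```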
In another exemplary embodiment, the computing device may include a remote program for communicating with a remote device. The remote device may be any remote device with at least one remote sensor for collecting and sharing remote data. The computing device may determine the location of the remote device on the 3D indoor positioning map by comparing the remote data with the recorded environmental data. The location of the remote device on the 3D indoor positioning map may include a probability of accuracy based on a number of degrees of freedom of the remote device.
In another exemplary embodiment, the AR device may be configured to provide guidance for collecting missing data and/or data older than a defined refresh period. For this guidance, the processor of the AR device may be configured to create guidance AR content corresponding to the AR device's perspective view of the environment, and to transmit the 3D indoor positioning map with the guidance AR content to the display. The missing data and/or stale data collected by the depth sensors while the AR device is guided through the building may be sent to the computing device and updated continually in real time. The guidance AR content may include visible instructions to rotate through 360 degrees at each location so that an omni-directional mapping of the environment is recorded at each location. As an example, the visible instructions may include an AR visual graphic that shows how to turn and how long to remain at each location so that an accurate reading can be recorded.
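By way of illustration only, the sketch below shows one way the guidance targets could be chosen, assuming the map is discretized into cells that each carry a last-collection timestamp; the cell structure, refresh period, and dwell time are hypothetical values for the example.

```python
import time

REFRESH_PERIOD_S = 7 * 24 * 3600   # example refresh period: one week

def cells_needing_collection(map_cells, now=None):
    """Return map cells whose environmental data is missing or older than the refresh period."""
    now = time.time() if now is None else now
    stale = []
    for cell in map_cells:
        last = cell.get("last_collected")            # None if never collected
        if last is None or (now - last) > REFRESH_PERIOD_S:
            stale.append(cell)
    return stale

def guidance_instruction(cell):
    """Compose a simple AR guidance instruction for one target cell."""
    return {
        "target_position": cell["position"],   # (x, y, z) in map coordinates
        "action": "rotate_360",                # turn in place for an omni-directional reading
        "dwell_seconds": 10,                   # example dwell time for an accurate reading
    }
```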
In another exemplary embodiment, the processor of the computing device may be further configured to create a two dimensional (2D) view of the indoor positioning map, and to show at least one location of a remote device on the 2D view of the indoor positioning map in real time. The computing device may include a remote program for communicating the 2D view of the indoor positioning map, along with the remote device's location, to the remote device.
In another exemplary embodiment, the at least one depth sensor may include an optical 3D scanner.
In another exemplary embodiment, the display may be a head mounted display (HMD). The HMD may comprise a transparent plate that may be positioned in front of the user's eye or eyes, allowing the user to view the environment through the transparent plate. The transparent plate may also be arranged to display AR content to the user's eye or eyes so that the AR content appears superimposed on the user's view of the environment.
In another aspect, the present invention embraces a method for dynamically creating and/or updating indoor positioning maps. The method may generally include the steps of: collecting position information from at least one depth sensor of an AR device; dynamically creating a three-dimensional (3D) indoor positioning map based on the received position information from the AR device; and/or dynamically updating the 3D indoor positioning map based on the received position information from the AR device.
In an exemplary embodiment, the steps of creating the 3D indoor positioning map and/or dynamically updating the 3D indoor positioning map may include positioning the AR device using simultaneous location and mapping (SLAM) techniques. The SLAM techniques used for positioning may include, but are not limited to, collecting any available environmental data at various points within the building. The environmental data may include all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, the like, and/or combinations thereof. In addition, the environmental data collected may include strength and/or intensities associated with each collection and an orientation of the AR device at the time of each collection. For example, the RF signal information collected may contain signal strength, transmit power, and/or an identifier for the RF source. As another example, the light collected may contain an intensity, color, and/or frequency of light. The orientation of the AR device may be recorded at each location where environmental data may be collected. The collected environmental data may then be sent to the computing device and saved with the specific location and orientation in the 3D indoor positioning map.
In another exemplary embodiment, the method for creating and/or updating indoor positioning maps may further include the step of guiding the AR device to collect missing data and/or data older than a defined refresh period. The step of guiding may include creating guidance AR content corresponding to the AR device's perspective view of the environment, and transmitting the 3D indoor positioning map with the created guidance AR content to the display.
In another aspect, the present invention embraces an augmented reality (AR) device. The AR device generally includes a display, one or more depth sensors, and a processor. The display may display guidance AR content to a user that may overlap the AR device's perspective view of an environment. The one or more depth sensors may gather mapping data of physical objects in the environment. The processor may be communicatively coupled to the one or more depth sensors. The processor may be configured by software to dynamically create and/or update a three-dimensional (3D) indoor positioning map of the environment in a building based on the mapping data gathered by the AR device.
In an exemplary embodiment, the AR device may use simultaneous location and mapping (SLAM) techniques for the positioning of the AR device in the dynamic creation and/or updating of the 3D indoor positioning map. The SLAM techniques used for positioning may include, but are not limited to, collecting any available environmental data at various points within the building. The environmental data may include all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, the like, and/or combinations thereof. In addition, the environmental data collected may include strength and/or intensities associated with each collection and an orientation of the AR device at the time of each collection. For example, the RF signal information collected may contain a signal strength, transmit power, and/or an identifier for the RF source. As another example, the light collected may contain an intensity, color, and/or frequency of light. The orientation of the AR device may be recorded at each location where environmental data may be collected. The collected environmental data may then be sent to the computing device and saved with the specific location and orientation in the 3D indoor positioning map.
In another exemplary embodiment, the AR device may be configured to provide guidance for collecting missing data and/or data older than a defined refresh period. In this guidance embodiment, the processor of the AR device may be configured to create guidance AR content corresponding to the AR device's perspective view of the environment, and to transmit the 3D indoor positioning map with the guidance AR content to the display.
In another exemplary embodiment, the at least one depth sensor of the AR device may include an optical 3D scanner.
In another exemplary embodiment, the display of the AR device may be a head mounted display (HMD). The HMD may include a transparent plate that may be positioned in front of the user's eye or eyes, allowing the user to view the environment through the transparent plate. The transparent plate may also be arranged to display AR content to the user's eye or eyes so that the AR content appears superimposed on the user's view of the environment.
In yet another exemplary embodiment, the display of the AR device may include a liquid crystal display (LCD).
The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.
The present invention embraces a system, method, and device that utilize augmented reality (AR) and AR devices for creating and/or updating indoor positioning maps. AR systems allow a user to view and (in some cases) interact with an enhanced version of the physical world. AR systems combine a user's perspective view of the physical world (i.e., the user's environment) with virtual objects. The virtual objects may be overlaid and positioned within the user's perspective view to provide contextually relevant information.
Virtual objects may include graphics or text and may be presented in two dimensions (2D) and/or three dimensions (3D). The virtual objects (i.e., AR content) are continually updated (e.g., in real time) to correspond with a user's change in perspective. As such, AR systems typically include body-worn cameras/displays (e.g., head mounted displays) or hand-held cameras/displays (e.g., smartphones, tablets, etc.).
A head mounted display (HMD) may be part of an AR system. One possible HMD type is the video see-through HMD. Here, the environment is presented as a video stream to the user via a display (e.g., a liquid crystal display). Another possible HMD type is the optical see-through HMD (e.g., smart glasses), wherein the user looks through a transparent plate. The transparent plate is configured to display AR content so the AR content is overlaid with the user's perspective view of the environment.
An exemplary AR device is shown in
The AR content 15 may change in response to movement of the AR device 12 within the environment (i.e., position). These changes typically occur in real time allowing a user to move freely while the AR content 15 updates appropriately to match changes in the user's perspective.
Tracking of the AR device's position/orientation is required to update the AR content 15 appropriately. Tracking may utilize one or more sensors to determine the user's position/orientation. For example, inertial measurement sensors (e.g., gyroscope, accelerometer, magnetometer, etc.) may facilitate tracking. In addition, tracking may also utilize depth sensors.
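As an illustration of how inertial measurements might be fused during tracking, the following sketch shows a simple complementary filter of the kind commonly used to combine gyroscope and accelerometer readings into one orientation estimate; it is an assumption for the example rather than a tracking method required by this disclosure.

```python
def complementary_filter(prev_angle_rad, gyro_rate_rad_s, accel_angle_rad, dt_s, alpha=0.98):
    """Fuse a gyroscope rate with an accelerometer-derived angle into one orientation estimate.

    The gyroscope term tracks fast motion between samples; the accelerometer term
    slowly corrects the drift that pure integration would accumulate.
    """
    integrated = prev_angle_rad + gyro_rate_rad_s * dt_s
    return alpha * integrated + (1.0 - alpha) * accel_angle_rad
```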
Depth sensing may be used to create range images of the AR system's perspective. Range images are images with pixel values corresponding to the range between the AR system and points within the AR system's field of view.
Depth sensors (i.e., range cameras) may produce these range images using one of several possible techniques (e.g., stereo triangulation, sheet of light triangulation, structured light, time of flight, interferometry, coded aperture, etc.). Structured light depth sensors, for example, illuminate an environment with a specially designed light pattern (e.g., points, checkerboard, lines, etc.). The reflected light pattern is compared to a reference pattern to obtain a range image.
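For instance, with stereo triangulation the range at each pixel can be recovered from the disparity between two camera views. The sketch below assumes a rectified disparity map and illustrative parameter names.

```python
import numpy as np

def disparity_to_range(disparity_px, focal_length_px, baseline_m):
    """Convert a stereo disparity map (pixels) into a range image (meters).

    range = focal_length * baseline / disparity; zero disparity is treated as invalid.
    """
    disparity = np.asarray(disparity_px, dtype=float)
    range_img = np.full(disparity.shape, np.nan)
    valid = disparity > 0
    range_img[valid] = focal_length_px * baseline_m / disparity[valid]
    return range_img
```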
AR systems may include a camera to aid tracking and mapping. This camera (e.g., CCD camera, CMOS camera, etc.) is typically aligned with the perspective view of the user. The images captured by the camera may be processed by processors running algorithms (such as simultaneous localization and mapping (SLAM)) to track and map. SLAM algorithms may aid in the creation of maps (i.e., models) of the environment, which include the locations of physical objects and/or light sources in the environment.
Detecting light sources for mapping may be accomplished using the camera or by using one of a variety of possible photo sensor types (e.g., photodiodes, phototransistors, etc.). For example, light levels measured by the light sensor (e.g., camera, photo sensor, etc.) may be compared to a threshold as part of a light-source detection process.
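A minimal sketch of such a threshold comparison is shown below; the threshold value and grayscale image format are assumptions for the example.

```python
import numpy as np

def detect_light_sources(gray_image, threshold=240):
    """Return (row, col) pixel coordinates whose measured brightness meets or exceeds the threshold."""
    bright = np.argwhere(np.asarray(gray_image) >= threshold)
    return [tuple(int(v) for v in pt) for pt in bright]
```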
However, the AR experience and AR devices are not known to be utilized for dynamically creating and/or updating indoor positioning maps. Therefore, the instant disclosure recognizes the need for systems, methods, and devices that dynamically create and/or update indoor positioning maps utilizing such AR devices.
Referring to
The dynamic creation and/or updating of the 3D indoor positioning map may include utilizing simultaneous location and mapping (SLAM) techniques for positioning of the AR device 12 in the environment 4 while dynamically creating and/or updating the 3D indoor positioning map. The SLAM techniques utilized for positioning may include, but are not limited to, collecting available or perceivable environmental data at various points within the building of environment 4. The environmental data collected may include any available environmental data, including, but not limited to, all available RF signals, magnetic fields, lighting conditions, GPS, three dimensional imagery, ambient sound, the like, and/or combinations thereof. In addition, the environmental data collected may include strength and/or intensities associated with each collection and an orientation of the AR device 12 at the time of each collection. Recording the orientation of the device may be important, as the signal strengths may change depending on the direction AR device 12 is facing. For example, the RF signal information collected may contain signal strength and an identifier (e.g., MAC address) for the RF source. The RF signal information collected might also include the transmit power, which may be needed for signal strength multilateration. As another example, the light collected may contain an intensity, color, and/or frequency of light. The orientation of the AR device 12 may be recorded for each location where environmental data may be collected. The collected environmental data may then be sent to the computing device 18 and saved with the specific location and orientation in the 3D indoor positioning map.
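To illustrate why the transmit power can matter for signal strength multilateration, the sketch below applies a common log-distance path-loss model; the model choice and path-loss exponent are assumptions for the example, not requirements of the disclosure.

```python
def rssi_to_distance_m(rssi_dbm, tx_power_dbm, path_loss_exponent=2.0):
    """Estimate the distance to an RF source from its received signal strength.

    Log-distance path-loss model: RSSI = TxPower - 10 * n * log10(d), where TxPower
    is the expected signal strength at 1 meter and n is the path-loss exponent.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exponent))
```

Distances estimated this way from three or more RF sources of known position could then be combined by multilateration to bound a device's position.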
The computing device 18 may be any computing device, such as a processor, a server, and/or combinations thereof, that is in communication with AR device 12 for dynamically creating/updating 3D indoor positioning maps. Computing device 18 may be remote to AR device 12 and/or coupled with AR device 12.
The computing device 18 may include a remote program for communicating with a remote device. The remote device may be any remote device with at least one remote sensor for collecting remote data, including environmental data for use with SLAM techniques. This remote program of the computing device 18 could then allow less capable devices (i.e., less capable than AR device 12) to share environmental data to aid in dynamically creating/updating the 3D indoor positioning map. The computing device may determine the location of the remote device on the 3D indoor positioning map by comparing the remote data with the recorded environmental data. In one embodiment, the location of the remote device on the 3D indoor positioning map may include a probability of accuracy based on a number of degrees of freedom of the remote device.
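By way of illustration only, one simple way to compare the remote data with the recorded environmental data is a nearest-neighbor match over RF fingerprints, sketched below; the data layout and scoring are assumptions for the example, not a required matching algorithm.

```python
def match_location(remote_rssi, recorded_samples):
    """Find the recorded map position whose RF fingerprint best matches a remote device's readings.

    remote_rssi: dict mapping RF source identifier -> signal strength (dBm).
    recorded_samples: iterable of (position, fingerprint) pairs from the 3D indoor
    positioning map, where each fingerprint is a dict in the same form.
    """
    best_position, best_score = None, float("inf")
    for position, fingerprint in recorded_samples:
        shared = set(remote_rssi) & set(fingerprint)
        if not shared:
            continue
        # Mean squared difference over RF sources observed by both devices.
        score = sum((remote_rssi[s] - fingerprint[s]) ** 2 for s in shared) / len(shared)
        if score < best_score:
            best_position, best_score = position, score
    return best_position, best_score
```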
Referring to
The processor of computing device 18 may be further configured to create a two dimensional (2D) view of the indoor positioning map, and to show at least one location of the remote device on the 2D view of the indoor positioning map in real time. The remote program may then communicate the 2D view of the indoor positioning map, along with the remote device's location, to the remote device. This 2D feature of the disclosure may show a device where it is positioned at any given moment, as well as show an administrator where all of the administrator's assets are at any given moment.
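A minimal sketch of producing such a 2D view from 3D map positions is shown below, assuming a uniform floor height and hypothetical identifiers; a dashboard could then render each floor's entries as markers on the corresponding floor plan.

```python
def to_2d_view(positions_3d, floor_height_m=3.0):
    """Project 3D map positions of tracked devices onto per-floor 2D views."""
    view = {}
    for device_id, (x, y, z) in positions_3d.items():
        floor = int(z // floor_height_m)                # assumed uniform floor height
        view.setdefault(floor, {})[device_id] = (x, y)  # drop height for the floor plan
    return view
```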
Referring again to
Steps 22 and 23 of creating the 3D indoor positioning map and/or dynamically updating the 3D indoor positioning map may include positioning AR device 12 in the 3D indoor positioning map using SLAM techniques. The SLAM techniques utilized for positioning may be any of the SLAM techniques as known and/or described herein.
The method for creating and/or updating indoor positioning maps may further include step 24 of guiding the AR device to collect missing data and/or data older than a defined refresh period. Step 24 of guiding may include step 25 of creating guidance AR content 15 corresponding to the AR device's perspective view 3 of the environment 4, and step 26 of transmitting the 3D indoor positioning map with the created guidance AR content 15 to the display 14.
To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:
In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.
Number | Name | Date | Kind |
---|---|---|---|
6832725 | Gardiner et al. | Dec 2004 | B2 |
7128266 | Marlton et al. | Oct 2006 | B2 |
7159783 | Walczyk et al. | Jan 2007 | B2 |
7413127 | Ehrhart et al. | Aug 2008 | B2 |
7726575 | Wang et al. | Jun 2010 | B2 |
8294969 | Plesko | Oct 2012 | B2 |
8317105 | Kotlarsky et al. | Nov 2012 | B2 |
8322622 | Suzhou et al. | Dec 2012 | B2 |
8366005 | Kotlarsky et al. | Feb 2013 | B2 |
8371507 | Haggerty et al. | Feb 2013 | B2 |
8376233 | Van Horn et al. | Feb 2013 | B2 |
8381979 | Franz | Feb 2013 | B2 |
8390909 | Plesko | Mar 2013 | B2 |
8408464 | Zhu et al. | Apr 2013 | B2 |
8408468 | Horn et al. | Apr 2013 | B2 |
8408469 | Good | Apr 2013 | B2 |
8424768 | Rueblinger et al. | Apr 2013 | B2 |
8448863 | Xian et al. | May 2013 | B2 |
8457013 | Essinger et al. | Jun 2013 | B2 |
8459557 | Havens et al. | Jun 2013 | B2 |
8469272 | Kearney | Jun 2013 | B2 |
8474712 | Kearney et al. | Jul 2013 | B2 |
8479992 | Kotlarsky et al. | Jul 2013 | B2 |
8490877 | Kearney | Jul 2013 | B2 |
8517271 | Kotlarsky et al. | Aug 2013 | B2 |
8523076 | Good | Sep 2013 | B2 |
8528818 | Ehrhart et al. | Sep 2013 | B2 |
8544737 | Gomez et al. | Oct 2013 | B2 |
8548420 | Grunow et al. | Oct 2013 | B2 |
8550335 | Samek et al. | Oct 2013 | B2 |
8550354 | Gannon et al. | Oct 2013 | B2 |
8550357 | Kearney | Oct 2013 | B2 |
8556174 | Kosecki et al. | Oct 2013 | B2 |
8556176 | Van Horn et al. | Oct 2013 | B2 |
8556177 | Hussey et al. | Oct 2013 | B2 |
8559767 | Barber et al. | Oct 2013 | B2 |
8561895 | Gomez et al. | Oct 2013 | B2 |
8561903 | Sauerwein | Oct 2013 | B2 |
8561905 | Edmonds et al. | Oct 2013 | B2 |
8565107 | Pease et al. | Oct 2013 | B2 |
8571307 | Li et al. | Oct 2013 | B2 |
8579200 | Samek et al. | Nov 2013 | B2 |
8583924 | Caballero et al. | Nov 2013 | B2 |
8584945 | Wang et al. | Nov 2013 | B2 |
8587595 | Wang | Nov 2013 | B2 |
8587697 | Hussey et al. | Nov 2013 | B2 |
8588869 | Sauerwein et al. | Nov 2013 | B2 |
8590789 | Nahill et al. | Nov 2013 | B2 |
8596539 | Havens et al. | Dec 2013 | B2 |
8596542 | Havens et al. | Dec 2013 | B2 |
8596543 | Havens et al. | Dec 2013 | B2 |
8599271 | Havens et al. | Dec 2013 | B2 |
8599957 | Peake et al. | Dec 2013 | B2 |
8600158 | Li et al. | Dec 2013 | B2 |
8600167 | Showering | Dec 2013 | B2 |
8602309 | Longacre et al. | Dec 2013 | B2 |
8608053 | Meier et al. | Dec 2013 | B2 |
8608071 | Liu et al. | Dec 2013 | B2 |
8611309 | Wang et al. | Dec 2013 | B2 |
8615487 | Gomez et al. | Dec 2013 | B2 |
8621123 | Caballero | Dec 2013 | B2 |
8622303 | Meier et al. | Jan 2014 | B2 |
8628013 | Ding | Jan 2014 | B2 |
8628015 | Wang et al. | Jan 2014 | B2 |
8628016 | Winegar | Jan 2014 | B2 |
8629926 | Wang | Jan 2014 | B2 |
8630491 | Longacre et al. | Jan 2014 | B2 |
8635309 | Berthiaume et al. | Jan 2014 | B2 |
8636200 | Kearney | Jan 2014 | B2 |
8636212 | Nahill et al. | Jan 2014 | B2 |
8636215 | Ding et al. | Jan 2014 | B2 |
8636224 | Wang | Jan 2014 | B2 |
8638806 | Wang et al. | Jan 2014 | B2 |
8640958 | Lu et al. | Feb 2014 | B2 |
8640960 | Wang et al. | Feb 2014 | B2 |
8643717 | Li et al. | Feb 2014 | B2 |
8646692 | Meier et al. | Feb 2014 | B2 |
8646694 | Wang et al. | Feb 2014 | B2 |
8657200 | Ren et al. | Feb 2014 | B2 |
8659397 | Vargo et al. | Feb 2014 | B2 |
8668149 | Good | Mar 2014 | B2 |
8678285 | Kearney | Mar 2014 | B2 |
8678286 | Smith et al. | Mar 2014 | B2 |
8682077 | Longacre | Mar 2014 | B1 |
D702237 | Oberpriller et al. | Apr 2014 | S |
8687282 | Feng et al. | Apr 2014 | B2 |
8692927 | Pease et al. | Apr 2014 | B2 |
8695880 | Bremer et al. | Apr 2014 | B2 |
8698949 | Grunow et al. | Apr 2014 | B2 |
8702000 | Barber et al. | Apr 2014 | B2 |
8717494 | Gannon | May 2014 | B2 |
8720783 | Biss et al. | May 2014 | B2 |
8723804 | Fletcher et al. | May 2014 | B2 |
8723904 | Marty et al. | May 2014 | B2 |
8727223 | Wang | May 2014 | B2 |
8740082 | Wilz | Jun 2014 | B2 |
8740085 | Furlong et al. | Jun 2014 | B2 |
8746563 | Hennick et al. | Jun 2014 | B2 |
8750445 | Peake et al. | Jun 2014 | B2 |
8752766 | Xian et al. | Jun 2014 | B2 |
8756059 | Braho et al. | Jun 2014 | B2 |
8757495 | Qu et al. | Jun 2014 | B2 |
8760563 | Koziol et al. | Jun 2014 | B2 |
8736909 | Reed et al. | Jul 2014 | B2 |
8777108 | Coyle | Jul 2014 | B2 |
8777109 | Oberpriller et al. | Jul 2014 | B2 |
8779898 | Havens et al. | Jul 2014 | B2 |
8781520 | Payne et al. | Jul 2014 | B2 |
8783573 | Havens et al. | Jul 2014 | B2 |
8789757 | Barten | Jul 2014 | B2 |
8789758 | Hawley et al. | Jul 2014 | B2 |
8789759 | Xian et al. | Jul 2014 | B2 |
8794520 | Wang et al. | Aug 2014 | B2 |
8794522 | Ehrhart | Aug 2014 | B2 |
8794525 | Amundsen et al. | Aug 2014 | B2 |
8794526 | Wang et al. | Aug 2014 | B2 |
8798367 | Ellis | Aug 2014 | B2 |
8807431 | Wang et al. | Aug 2014 | B2 |
8807432 | Van Horn et al. | Aug 2014 | B2 |
8820630 | Qu et al. | Sep 2014 | B2 |
8822848 | Meagher | Sep 2014 | B2 |
8824692 | Sheerin et al. | Sep 2014 | B2 |
8824696 | Braho | Sep 2014 | B2 |
8842849 | Wahl et al. | Sep 2014 | B2 |
8844822 | Kotlarsky et al. | Sep 2014 | B2 |
8844823 | Fritz et al. | Sep 2014 | B2 |
8849019 | Li et al. | Sep 2014 | B2 |
D716285 | Chaney et al. | Oct 2014 | S |
8851383 | Yeakley et al. | Oct 2014 | B2 |
8854633 | Laffargue | Oct 2014 | B2 |
8866963 | Grunow et al. | Oct 2014 | B2 |
8868421 | Braho et al. | Oct 2014 | B2 |
8868519 | Maloy et al. | Oct 2014 | B2 |
8868802 | Barten | Oct 2014 | B2 |
8868803 | Bremer et al. | Oct 2014 | B2 |
8870074 | Gannon | Oct 2014 | B1 |
8879639 | Sauerwein | Nov 2014 | B2 |
8880426 | Smith | Nov 2014 | B2 |
8881983 | Havens et al. | Nov 2014 | B2 |
8881987 | Wang | Nov 2014 | B2 |
8903172 | Smith | Dec 2014 | B2 |
8908995 | Benos et al. | Dec 2014 | B2 |
8910870 | Li et al. | Dec 2014 | B2 |
8910875 | Ren et al. | Dec 2014 | B2 |
8914290 | Hendrickson et al. | Dec 2014 | B2 |
8914788 | Pettinelli et al. | Dec 2014 | B2 |
8915439 | Feng et al. | Dec 2014 | B2 |
8915444 | Havens et al. | Dec 2014 | B2 |
8916789 | Woodburn | Dec 2014 | B2 |
8918250 | Hollifield | Dec 2014 | B2 |
8918564 | Caballero | Dec 2014 | B2 |
8925818 | Kosecki et al. | Jan 2015 | B2 |
8939374 | Jovanovski et al. | Jan 2015 | B2 |
8942480 | Ellis | Jan 2015 | B2 |
8944313 | Williams et al. | Feb 2015 | B2 |
8944327 | Meier et al. | Feb 2015 | B2 |
8944332 | Harding et al. | Feb 2015 | B2 |
8950678 | Germaine et al. | Feb 2015 | B2 |
8965460 | Rao | Feb 2015 | B1 |
D723560 | Zhou et al. | Mar 2015 | S |
8967468 | Gomez et al. | Mar 2015 | B2 |
8971346 | Sevier | Mar 2015 | B2 |
8976030 | Cunningham et al. | Mar 2015 | B2 |
8976368 | Akel et al. | Mar 2015 | B2 |
8978981 | Guan | Mar 2015 | B2 |
8978983 | Bremer et al. | Mar 2015 | B2 |
8978984 | Hennick et al. | Mar 2015 | B2 |
8985456 | Zhu et al. | Mar 2015 | B2 |
8985457 | Soule et al. | Mar 2015 | B2 |
8985459 | Kearney et al. | Mar 2015 | B2 |
8985461 | Gelay et al. | Mar 2015 | B2 |
8988578 | Showering | Mar 2015 | B2 |
8988590 | Gillet et al. | Mar 2015 | B2 |
8991704 | Hopper et al. | Mar 2015 | B2 |
8996194 | Davis et al. | Mar 2015 | B2 |
8996384 | Funyak et al. | Mar 2015 | B2 |
8998091 | Edmonds et al. | Apr 2015 | B2 |
9002641 | Showering | Apr 2015 | B2 |
9007368 | Laffargue et al. | Apr 2015 | B2 |
9010641 | Qu et al. | Apr 2015 | B2 |
9015513 | Murawski et al. | Apr 2015 | B2 |
9016576 | Brady et al. | Apr 2015 | B2 |
D730357 | Fitch et al. | May 2015 | S |
9022288 | Nahill et al. | May 2015 | B2 |
9030964 | Essinger et al. | May 2015 | B2 |
9033240 | Smith et al. | May 2015 | B2 |
9033242 | Gillet et al. | May 2015 | B2 |
9036054 | Koziol et al. | May 2015 | B2 |
9037344 | Chamberlin | May 2015 | B2 |
9038911 | Xian et al. | May 2015 | B2 |
9038915 | Smith | May 2015 | B2 |
D730901 | Oberpriller et al. | Jun 2015 | S |
D730902 | Fitch et al. | Jun 2015 | S |
D733112 | Chaney et al. | Jun 2015 | S |
9047098 | Barten | Jun 2015 | B2 |
9047359 | Caballero et al. | Jun 2015 | B2 |
9047420 | Caballero | Jun 2015 | B2 |
9047525 | Barber | Jun 2015 | B2 |
9047531 | Showering et al. | Jun 2015 | B2 |
9049640 | Wang et al. | Jun 2015 | B2 |
9053055 | Caballero | Jun 2015 | B2 |
9053378 | Hou et al. | Jun 2015 | B1 |
9053380 | Xian et al. | Jun 2015 | B2 |
9057641 | Amundsen et al. | Jun 2015 | B2 |
9058526 | Powilleit | Jun 2015 | B2 |
9064165 | Havens et al. | Jun 2015 | B2 |
9064167 | Xian et al. | Jun 2015 | B2 |
9064168 | Todeschini et al. | Jun 2015 | B2 |
9064254 | Todeschini et al. | Jun 2015 | B2 |
9066032 | Wang | Jun 2015 | B2 |
9070032 | Corcoran | Jun 2015 | B2 |
D734339 | Zhou et al. | Jul 2015 | S |
D734751 | Oberpriller et al. | Jul 2015 | S |
9082023 | Feng et al. | Jul 2015 | B2 |
9582516 | McKinnon | Feb 2017 | B2 |
9607401 | Roumeliotis | Mar 2017 | B2 |
20070063048 | Havens et al. | Mar 2007 | A1 |
20090134221 | Zhu et al. | May 2009 | A1 |
20100177076 | Essinger et al. | Jul 2010 | A1 |
20100177080 | Essinger et al. | Jul 2010 | A1 |
20100177707 | Essinger et al. | Jul 2010 | A1 |
20100177749 | Essinger et al. | Jul 2010 | A1 |
20110169999 | Grunow et al. | Jul 2011 | A1 |
20110202554 | Powilleit et al. | Aug 2011 | A1 |
20120111946 | Galant | May 2012 | A1 |
20120168512 | Kotlarsky et al. | Jul 2012 | A1 |
20120193423 | Samek | Aug 2012 | A1 |
20120203647 | Smith | Aug 2012 | A1 |
20120223141 | Good et al. | Sep 2012 | A1 |
20130043312 | Van Horn | Feb 2013 | A1 |
20130075168 | Amundsen et al. | Mar 2013 | A1 |
20130175341 | Kearney et al. | Jul 2013 | A1 |
20130175343 | Good | Jul 2013 | A1 |
20130257744 | Daghigh et al. | Oct 2013 | A1 |
20130257759 | Daghigh | Oct 2013 | A1 |
20130270346 | Xian et al. | Oct 2013 | A1 |
20130282345 | McCulloch et al. | Oct 2013 | A1 |
20130287258 | Kearney | Oct 2013 | A1 |
20130292475 | Kotlarsky et al. | Nov 2013 | A1 |
20130292477 | Hennick et al. | Nov 2013 | A1 |
20130293539 | Hunt et al. | Nov 2013 | A1 |
20130293540 | Laffargue et al. | Nov 2013 | A1 |
20130306728 | Thuries et al. | Nov 2013 | A1 |
20130306731 | Pedraro | Nov 2013 | A1 |
20130307964 | Bremer et al. | Nov 2013 | A1 |
20130308625 | Corcoran | Nov 2013 | A1 |
20130313324 | Koziol et al. | Nov 2013 | A1 |
20130313325 | Wilz et al. | Nov 2013 | A1 |
20130342564 | Kinnebrew et al. | Dec 2013 | A1 |
20130342717 | Havens et al. | Dec 2013 | A1 |
20140001267 | Giordano et al. | Jan 2014 | A1 |
20140002828 | Laffargue et al. | Jan 2014 | A1 |
20140008439 | Wang | Jan 2014 | A1 |
20140025584 | Liu et al. | Jan 2014 | A1 |
20140034734 | Sauerwein | Feb 2014 | A1 |
20140036848 | Pease et al. | Feb 2014 | A1 |
20140039693 | Havens et al. | Feb 2014 | A1 |
20140042814 | Kather et al. | Feb 2014 | A1 |
20140049120 | Kohtz et al. | Feb 2014 | A1 |
20140049635 | Laffargue et al. | Feb 2014 | A1 |
20140061306 | Wu et al. | Mar 2014 | A1 |
20140063289 | Hussey et al. | Mar 2014 | A1 |
20140066136 | Sauerwein et al. | Mar 2014 | A1 |
20140067692 | Ye et al. | Mar 2014 | A1 |
20140070005 | Nahill et al. | Mar 2014 | A1 |
20140071840 | Venancio | Mar 2014 | A1 |
20140074746 | Wang | Mar 2014 | A1 |
20140076974 | Havens et al. | Mar 2014 | A1 |
20140078341 | Havens et al. | Mar 2014 | A1 |
20140078342 | Li et al. | Mar 2014 | A1 |
20140078345 | Showering | Mar 2014 | A1 |
20140098792 | Wang et al. | Apr 2014 | A1 |
20140100774 | Showering | Apr 2014 | A1 |
20140100813 | Showering | Apr 2014 | A1 |
20140103115 | Meier et al. | Apr 2014 | A1 |
20140104413 | McCloskey et al. | Apr 2014 | A1 |
20140104414 | McCloskey et al. | Apr 2014 | A1 |
20140104416 | Li et al. | Apr 2014 | A1 |
20140104451 | Todeschini et al. | Apr 2014 | A1 |
20140106594 | Skvoretz | Apr 2014 | A1 |
20140106725 | Sauerwein | Apr 2014 | A1 |
20140108010 | Maltseff et al. | Apr 2014 | A1 |
20140108402 | Gomez et al. | Apr 2014 | A1 |
20140108682 | Caballero | Apr 2014 | A1 |
20140110485 | Toa et al. | Apr 2014 | A1 |
20140114530 | Fitch et al. | Apr 2014 | A1 |
20140121438 | Kearney | May 2014 | A1 |
20140121445 | Ding et al. | May 2014 | A1 |
20140124577 | Wang et al. | May 2014 | A1 |
20140124579 | Ding | May 2014 | A1 |
20140125842 | Winegar | May 2014 | A1 |
20140125853 | Wang | May 2014 | A1 |
20140125999 | Longacre et al. | May 2014 | A1 |
20140129378 | Richardson | May 2014 | A1 |
20140131441 | Nahill et al. | May 2014 | A1 |
20140131443 | Smith | May 2014 | A1 |
20140131444 | Wang | May 2014 | A1 |
20140131448 | Xian et al. | May 2014 | A1 |
20140133379 | Wang et al. | May 2014 | A1 |
20140136208 | Maltseff et al. | May 2014 | A1 |
20140140585 | Wang | May 2014 | A1 |
20140151453 | Meier et al. | Jun 2014 | A1 |
20140152882 | Samek et al. | Jun 2014 | A1 |
20140158770 | Sevier et al. | Jun 2014 | A1 |
20140159869 | Zumsteg et al. | Jun 2014 | A1 |
20140166755 | Liu et al. | Jun 2014 | A1 |
20140166757 | Smith | Jun 2014 | A1 |
20140166759 | Liu et al. | Jun 2014 | A1 |
20140168787 | Wang et al. | Jun 2014 | A1 |
20140175165 | Havens et al. | Jun 2014 | A1 |
20140175172 | Jovanovski et al. | Jun 2014 | A1 |
20140191644 | Chaney | Jul 2014 | A1 |
20140191913 | Ge et al. | Jul 2014 | A1 |
20140197238 | Lui et al. | Jul 2014 | A1 |
20140197239 | Havens et al. | Jul 2014 | A1 |
20140197304 | Feng et al. | Jul 2014 | A1 |
20140203087 | Smith et al. | Jul 2014 | A1 |
20140204268 | Grunow et al. | Jul 2014 | A1 |
20140214631 | Hansen | Jul 2014 | A1 |
20140217166 | Berthiaume et al. | Aug 2014 | A1 |
20140217180 | Liu | Aug 2014 | A1 |
20140231500 | Ehrhart et al. | Aug 2014 | A1 |
20140232930 | Anderson | Aug 2014 | A1 |
20140247315 | Marty et al. | Sep 2014 | A1 |
20140263493 | Amurgis et al. | Sep 2014 | A1 |
20140263645 | Smith et al. | Sep 2014 | A1 |
20140270196 | Braho et al. | Sep 2014 | A1 |
20140270229 | Braho | Sep 2014 | A1 |
20140278387 | DiGregorio | Sep 2014 | A1 |
20140282210 | Bianconi | Sep 2014 | A1 |
20140284384 | Lu et al. | Sep 2014 | A1 |
20140288933 | Braho et al. | Sep 2014 | A1 |
20140297058 | Barker et al. | Oct 2014 | A1 |
20140299665 | Barber et al. | Oct 2014 | A1 |
20140312121 | Lu et al. | Oct 2014 | A1 |
20140315570 | Yun | Oct 2014 | A1 |
20140319220 | Coyle | Oct 2014 | A1 |
20140319221 | Oberpriller et al. | Oct 2014 | A1 |
20140326787 | Barten | Nov 2014 | A1 |
20140332590 | Wang et al. | Nov 2014 | A1 |
20140344943 | Todeschini et al. | Nov 2014 | A1 |
20140346233 | Liu et al. | Nov 2014 | A1 |
20140351317 | Smith et al. | Nov 2014 | A1 |
20140353373 | Van Horn et al. | Dec 2014 | A1 |
20140361073 | Qu et al. | Dec 2014 | A1 |
20140361082 | Xian et al. | Dec 2014 | A1 |
20140362184 | Jovanovski et al. | Dec 2014 | A1 |
20140363015 | Braho | Dec 2014 | A1 |
20140369511 | Sheerin et al. | Dec 2014 | A1 |
20140374483 | Lu | Dec 2014 | A1 |
20140374485 | Xian et al. | Dec 2014 | A1 |
20150001301 | Ouyang | Jan 2015 | A1 |
20150001304 | Todeschini | Jan 2015 | A1 |
20150003673 | Fletcher | Jan 2015 | A1 |
20150009338 | Laffargue et al. | Jan 2015 | A1 |
20150009610 | London et al. | Jan 2015 | A1 |
20150014416 | Kotlarsky et al. | Jan 2015 | A1 |
20150021397 | Rueblinger et al. | Jan 2015 | A1 |
20150028102 | Ren et al. | Jan 2015 | A1 |
20150028103 | Jiang | Jan 2015 | A1 |
20150028104 | Ma et al. | Jan 2015 | A1 |
20150029002 | Yeakley et al. | Jan 2015 | A1 |
20150032709 | Maloy et al. | Jan 2015 | A1 |
20150039309 | Braho et al. | Feb 2015 | A1 |
20150040378 | Saber et al. | Feb 2015 | A1 |
20150048168 | Fritz et al. | Feb 2015 | A1 |
20150049347 | Laffargue et al. | Feb 2015 | A1 |
20150051992 | Smith | Feb 2015 | A1 |
20150053766 | Havens et al. | Feb 2015 | A1 |
20150053768 | Wang et al. | Feb 2015 | A1 |
20150053769 | Thuries et al. | Feb 2015 | A1 |
20150062366 | Liu et al. | Mar 2015 | A1 |
20150063215 | Wang | Mar 2015 | A1 |
20150063676 | Lloyd et al. | Mar 2015 | A1 |
20150069130 | Gannon | Mar 2015 | A1 |
20150071818 | Todeschini | Mar 2015 | A1 |
20150083800 | Li et al. | Mar 2015 | A1 |
20150086114 | Todeschini | Mar 2015 | A1 |
20150088522 | Hendrickson et al. | Mar 2015 | A1 |
20150094952 | Moeglein | Apr 2015 | A1 |
20150096872 | Woodburn | Apr 2015 | A1 |
20150099557 | Pettinelli et al. | Apr 2015 | A1 |
20150100196 | Hollifield | Apr 2015 | A1 |
20150102109 | Huck | Apr 2015 | A1 |
20150115035 | Meier et al. | Apr 2015 | A1 |
20150127791 | Kosecki et al. | May 2015 | A1 |
20150128116 | Chen et al. | May 2015 | A1 |
20150129659 | Feng et al. | May 2015 | A1 |
20150133047 | Smith et al. | May 2015 | A1 |
20150134470 | Hejl et al. | May 2015 | A1 |
20150136851 | Harding et al. | May 2015 | A1 |
20150136854 | Lu et al. | May 2015 | A1 |
20150142492 | Kumar | May 2015 | A1 |
20150144692 | Hejl | May 2015 | A1 |
20150144698 | Teng et al. | May 2015 | A1 |
20150144701 | Xian et al. | May 2015 | A1 |
20150149946 | Benos et al. | May 2015 | A1 |
20150161429 | Xian | Jun 2015 | A1 |
20150169925 | Chang et al. | Jun 2015 | A1 |
20150169929 | Williams et al. | Jun 2015 | A1 |
20150186703 | Chen et al. | Jul 2015 | A1 |
20150193644 | Kearney et al. | Jul 2015 | A1 |
20150193645 | Colavito et al. | Jul 2015 | A1 |
20150199957 | Funyak et al. | Jul 2015 | A1 |
20150204671 | Showering | Jul 2015 | A1 |
20150235447 | Abovitz | Aug 2015 | A1 |
20160035246 | Curtis | Feb 2016 | A1 |
20160140868 | Lovett | May 2016 | A1 |
20160210785 | Balachandreswaran | Jul 2016 | A1 |
20160307328 | Moeglein | Oct 2016 | A1 |
20160349509 | Lanier | Dec 2016 | A1 |
20170035645 | Lydecker | Feb 2017 | A1 |
Number | Date | Country |
---|---|---|
2013163789 | Nov 2013 | WO |
2013173985 | Nov 2013 | WO |
2014019130 | Feb 2014 | WO |
2014110495 | Jul 2014 | WO |
Entry |
---|
S. Ito, F. Endres, M. Kuderer, G. Tipaldi, C. Stachniss and W. Burgard, “W-RGB-D: Floor-Plan-Based Indoor Global Localization Using a Depth Camera and WiFi,” in Proc. 2014 IEEE International Conference on Robotics & Automation, May-Jun. 2014, pp. 417-422. |
J. Carmigniani, B. Furht, M. Anisetti, P. Ceravolo, E. Damiani and M. Ivkovic, “Augmented reality technologies, systems and applications,” Multimed Tools Appl (2011) 51: 341. |
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned. |
U.S. Appl. No. 14/462,801 for Mobile Computing Device With Data Cognition Software, filed Aug. 19, 2014 (Todeschini et al.); 38 pages. |
U.S. Appl. No. 14/596,757 for System and Method for Detecting Barcode Printing Errors filed Jan. 14, 2015 (Ackley); 41 pages. |
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages. |
U.S. Appl. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.); 42 pages. |
U.S. Appl. No. 14/662,922 for Multifunction Point of Sale System filed Mar. 19, 2015 (Van Horn et al.); 41 pages. |
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages. |
U.S. Appl. No. 29/528,165 for In-Counter Barcode Scanner filed May 27, 2015 (Oberpriller et al.); 13 pages. |
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages. |
U.S. Appl. No. 14/614,796 for Cargo Apportionment Techniques filed Feb. 5, 2015 (Morton et al.); 56 pages. |
U.S. Appl. No. 29/516,892 for Table Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages. |
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages. |
U.S. Appl. No. 14/578,627 for Safety System and Method filed Dec. 22, 2014 (Ackley et al.); 32 pages. |
U.S. Appl. No. 14/573,022 for Dynamic Diagnostic Indicator Generation filed Dec. 17, 2014 (Goldsmith); 43 pages. |
U.S. Appl. No. 14/529,857 for Barcode Reader With Security Features filed Oct. 31, 2014 (Todeschini et al.); 32 pages. |
U.S. Appl. No. 14/519,195 for Handheld Dimensioning System With Feedback filed Oct. 21, 2014 (Laffargue et al.); 39 pages. |
U.S. Appl. No. 14/519,211 for System and Method for Dimensioning filed Oct. 21, 2014 (Ackley et al.); 33 pages. |
U.S. Appl. No. 14/519,233 for Handheld Dimensioner With Data-Quality Indication filed Oct. 21, 2014 (Laffargue et al.); 36 pages. |
U.S. Appl. No. 14/533,319 for Barcode Scanning System Using Wearable Device With Embedded Camera filed Nov. 5, 2014 (Todeschini); 29 pages. |
U.S. Appl. No. 14/748,446 for Cordless Indicia Reader With a Multifunction Coil for Wireless Charging and EAS Deactivation, filed Jun. 24, 2015 (Xie et al.); 34 pages. |
U.S. Appl. No. 29/528,590 for Electronic Device filed May 29, 2015 (Fitch et al.); 9 pages. |
U.S. Appl. No. 14/519,249 for Handheld Dimensioning System With Measurement-Conformance Feedback filed Oct. 21, 2014 (Ackley et al.); 36 pages. |
U.S. Appl. No. 29/519,017 for Scanner filed Mar. 2, 2015 (Zhou et al.); 11 pages. |
U.S. Appl. No. 14/398,542 for Portable Electronic Devices Having a Separate Location Trigger Unit for Use in Controlling an Application Unit filed Nov. 3, 2014 (Bian et al.); 22 pages. |
U.S. Appl. No. 14/405,278 for Design Pattern for Secure Store filed Mar. 9, 2015 (Zhu et al.); 23 pages. |
U.S. Appl. No. 14/590,024 for Shelving and Package Locating Systems for Delivery Vehicles filed Jan. 6, 2015 (Payne); 31 pages. |
U.S. Appl. No. 14/568,305 for Auto-Contrast Viewfinder for an Indicia Reader filed Dec. 12, 2014 (Todeschini); 29 pages. |
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages. |
U.S. Appl. No. 14/580,262 for Media Gate for Thermal Transfer Printers filed Dec. 23, 2014 (Bowles); 36 pages. |
Extended Search Report in counterpart European Application No. 16194998.7 dated Apr. 10, 2017, pp. 1-8. |
Mirowski et al., “Depth camera SLAM on a low-cost WiFi mapping robot”, Technologies for Practical Robot Applications (TEPRA), 2012, IEEE International Conference on, IEEE, Apr. 23, 2012, pp. 1-6 [Cited in Search Report]. |
U.S. Appl. No. 14/519,179 for Dimensioning System With Multipath Interference Mitigation filed Oct. 21, 2014 (Thuries et al.); 30 pages. |
U.S. Appl. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014, (Ackley et al.); 39 pages. |
U.S. Appl. No. 14/453,019 for Dimensioning System With Guided Alignment, filed Aug. 6, 2014 (Li et al.); 31 pages. |
U.S. Appl. No. 14/452,697 for Interactive Indicia Reader, filed Aug. 6, 2014, (Todeschini); 32 pages. |
U.S. Appl. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.); 36 pages. |
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages. |
U.S. Appl. No. 14/513,808 for Identifying Inventory Items in a Storage Facility filed Oct. 14, 2014 (Singel et al.); 51 pages. |
U.S. Appl. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.); 22 pages. |
U.S. Appl. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.); 21 pages. |
U.S. Appl. No. 14/483,056 for Variable Depth of Field Barcode Scanner filed Sep. 10, 2014 (McCloskey et al.); 29 pages. |
U.S. Appl. No. 14/531,154 for Directing an Inspector Through an Inspection filed Nov. 3, 2014 (Miller et al.); 53 pages. |
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages. |
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 14 pages. |
U.S. Appl. No. 14/340,627 for an Axially Reinforced Flexible Scan Element, filed Jul. 25, 2014 (Reublinger et al.); 41 pages. |
U.S. Appl. No. 14/676,327 for Device Management Proxy for Secure Devices filed Apr. 1, 2015 (Yeakley et al.); 50 pages. |
U.S. Appl. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering); 31 pages. |
U.S. Appl. No. 14/327,827 for a Mobile-Phone Adapter for Electronic Transactions, filed Jul. 10, 2014 (Hejl); 25 pages. |
U.S. Appl. No. 14/334,934 for a System and Method for Indicia Verification, filed Jul. 18, 2014 (Hejl); 38 pages. |
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages. |
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages. |
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages. |
U.S. Appl. No. 14/619,093 for Methods for Training a Speech Recognition System filed Feb. 11, 2015 (Pecorari); 35 pages. |
U.S. Appl. No. 29/524,186 for Scanner filed Apr. 17, 2015 (Zhou et al.); 17 pages. |
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages. |
U.S. Appl. No. 14/614,706 for Device for Supporting an Electronic Tool on a User's Hand filed Feb. 5, 2015 (Oberpriller et al.); 33 pages. |
U.S. Appl. No. 14/628,708 for Device, System, and Method for Determining the Status of Checkout Lanes filed Feb. 23, 2015 (Todeschini); 37 pages. |
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages. |
U.S. Appl. No. 14/529,563 for Adaptable Interface for a Mobile Computing Device filed Oct. 31, 2014 (Schoon et al.); 36 pages. |
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages. |
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages. |
U.S. Appl. No. 14/695,364 for Medication Management System filed Apr. 24, 2015 (Sewell et al.); 14 pages. |
U.S. Appl. No. 14/664,063 for Method and Application for Scanning a Barcode With a Smart Device While Continuously Running and Displaying an Application on the Smart Device Display filed Mar. 20, 2015 (Todeschini); 37 pages. |
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages. |
U.S. Appl. No. 14/527,191 for Method and System for Recognizing Speech Using Wildcards in an Expected Response filed Oct. 29, 2014 (Braho et al.); 45 pages. |
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages. |
U.S. Appl. No. 14/535,764 for Concatenated Expected Responses for Speech Recognition filed Nov. 7, 2014 (Braho et al.); 51 pages. |
U.S. Appl. No. 14/687,289 for System for Communication via a Peripheral Hub filed Apr. 15, 2015 (Kohtz et al.); 37 pages. |
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages. |
U.S. Appl. No. 14/674,329 for Aimer for Barcode Scanning filed Mar. 31, 2015 (Bidwell); 36 pages. |
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages. |
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages. |
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages. |
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Barndringa); 38 pages. |
U.S. Appl. No. 14/695,923 for Secure Unattended Network Authentication filed Apr. 24, 2015 (Kubler et al.); 52 pages. |
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages. |