Three dimensional aimer for barcode scanning

Information

  • Patent Grant
  • Patent Number
    9,785,814
  • Date Filed
    Friday, September 23, 2016
  • Date Issued
    Tuesday, October 10, 2017
Abstract
A method of assisting in focusing a three dimensional camera system on an object within a field of view is disclosed. The process involves, at the camera system, determining a distance D in a Z direction, within the field of view, to a current focal plane; and rendering to a display an aimer graphic element with a Z direction distance equal to D, in a manner that causes the aimer graphic element to move in the Z direction with changes in the focal plane.
Description
FIELD OF THE INVENTION

The present invention relates to barcode and QR code readers utilizing 3D camera and rendering technology.


BACKGROUND

Smart devices can be an effective means of scanning barcodes. However, the optics on these systems are usually optimized for photography and rely on an autofocus routine to bring the image into focus. When scanning barcodes, a slightly out-of-focus image combined with excessive motion can result in an image that is too blurry to decode. It is therefore desirable to achieve good focus to assure accurate reading of the barcode.


SUMMARY

Accordingly, in one aspect, certain embodiments consistent with the present disclosure relate to a method of assisting in focusing a camera system on an object within a field of view that involves: at the camera system, determining a distance D in a Z direction, within the field of view, to a current focal plane; and rendering to a display an aimer graphic element with a Z direction distance equal to D, in a manner that causes the aimer graphic element to move in the Z direction with changes in the focal plane.


In accord with certain example embodiments, the method further involves determining if the object in the field of view is within a depth of field distance ΔD about the distance D, and if so, modifying the rendering of the aimer graphic element in a manner that signifies that the object is in focus. In accord with certain example embodiments, the method further involves determining if the object in the field of view is closer than the focal plane, and if so, modifying the rendering of the aimer graphic element in a manner that signifies that the aimer graphic element is behind the object. In accord with certain example embodiments, the aimer graphic element is rendered in a first color if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and the aimer object is rendered in a second color if the object is within the depth of field distance ΔD about the distance D. In accord with certain example embodiments, the aimer graphic element is rendered in a first manner if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and the aimer object is rendered in a second manner if the object is within the depth of field distance ΔD about the distance D.


In accord with certain example embodiments, the aimer graphic element is rendered in a third manner if the distance D is further from the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and the aimer graphic element is rendered in a second manner if the object is within the depth of field distance ΔD about the distance D. In accord with certain example embodiments, the aimer graphic element is rendered in a first manner if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and the aimer graphic element is rendered in a second manner if the object is within the depth of field distance ΔD about the distance D; and the aimer graphic element is rendered in a third manner if the distance D is further from the camera than the object and if the object is outside the depth of field distance ΔD about the distance D.


In accord with certain example embodiments, the rendering in the first manner comprises selecting a first aimer graphic; the rendering in the second manner comprises selecting a second aimer graphic; and the rendering in the third manner comprises selecting a third aimer graphic. In accord with certain example embodiments, the distance D is determined by either querying a depth sensor or querying an autofocus system for a current focal depth. In accord with certain example embodiments, the camera system forms a part of an augmented reality headset having a programmed processor that carries out the rendering to a binocular display.


In another example embodiment consistent with the present teachings, a method of assisting in focusing a camera system on an object within a field of view involves: at the camera system, determining a distance D in a Z direction, within the field of view, to a current focal plane; determining if the object in the field of view is within a depth of field distance ΔD about the distance D, and: if so, then rendering the aimer graphic with a Z direction distance equal to D to a display in a manner that signifies that the object is in focus, and if not, then rendering the aimer graphic with a Z direction distance equal to D to the display in a manner that signifies that the object is not in focus.


In accord with certain example embodiments, the method further involves determining if the object in the field of view is closer than the focal plane, and if so, rendering the aimer graphic in a manner that signifies that the aimer graphic is behind the object. In accord with certain example embodiments, the aimer graphic is rendered in a first color if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and the aimer object is rendered in a second color if the object is within the depth of field distance ΔD about the distance D. In accord with certain example embodiments, the distance D is determined by either querying a depth sensor or querying an autofocus system for a current focal depth. In accord with certain example embodiments, the camera system forms a part of an augmented reality headset having a programmed processor that carries out the rendering to a binocular display.


In yet another example embodiment, a method of assisting in focusing a camera system on an object within a field of view involves: at the camera system, determining a distance D in a z direction, within the field of view, to a current focal plane; rendering to a display, an aimer graphic with the Z direction distance equal to D in a manner that causes the aimer graphic to move in the Z direction with changes in the focal plane; where the aimer graphic is rendered in a first manner if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; where the aimer graphic is rendered in a second manner if the object is within the depth of field distance ΔD about the distance D; and where the aimer graphic is rendered in a third manner if the distance D is further from the camera than the object and if the object is outside the depth of field distance ΔD about the distance D.


In accord with certain example embodiments, if the object in the field of view is closer than the focal plane, the aimer graphic is rendered in a manner that signifies that the view of the aimer graphic is occluded by the object. In accord with certain example embodiments, the aimer graphic element is rendered in a first color if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and the aimer object is rendered in a second color if the object is within the depth of field distance ΔD about the distance D. In accord with certain example embodiments, the rendering in the first manner comprises selecting a first aimer graphic; the rendering in the second manner comprises selecting a second aimer graphic; and the rendering in the third manner comprises selecting a third aimer graphic. In accord with certain example embodiments, the distance D is determined by either querying a depth sensor or querying an autofocus system for a current focal depth.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 depicts an example bar code reader system consistent with certain embodiments of the present disclosure.



FIGS. 2-4 depict aimer objects as rendered at three different focal plane distances in a manner consistent with certain embodiments of the present disclosure.



FIGS. 5-9 depict a plurality of images in which the left portion of each image represents a user's view, in a system such as is disclosed, with varying relationships between an object in the field of view and the aimer graphic 40; the right portion of each image is a graphical representation of the distance D and its relationship to object 30.



FIG. 10 is a flow chart of a first process consistent with example embodiments of the present teachings.



FIG. 11 is a flow chart of a further process consistent with example embodiments of the present teachings.





DETAILED DESCRIPTION

The present invention embraces a method for aligning a barcode to be scanned within the viewfinder of an Augmented Reality (AR) display device such as an AR headset, in which a 3D (three-dimensional) aimer graphic operates as a visual cue rendered in the Field of View (FOV) of the user, showing where the camera is focusing. The 3D aimer graphic appears to move closer to or farther from the user based on the focal depth of the camera. With the help of the aimer graphic, the user can more quickly understand where the camera is focusing. This can permit the user to bring a barcode, QR code, or the like to be scanned into the focal plane of the camera for a quicker read operation.


On these types of systems, an aimer graphic is sometimes used to help the user align the barcode within the viewfinder. This aimer graphic may be placed in an area of the image (e.g., the center) where the decode algorithm is tuned to start its search for the barcode, resulting in a shorter time to read. These types of visual cues can help to ensure a timely barcode scan. This disclosure describes a three dimensional aimer graphic that is particularly useful for a binocular augmented reality headset (e.g., the Microsoft Hololens™) or other device that is capable of rendering three dimensional graphics in the user's field of view. The 3D aimer graphic leads to a quicker time to read by moving in and out (closer to and further away from the user) as the camera focus is adjusted. This visual cue helps the user understand where the camera is currently focusing so that they may bring the barcode into the current focal plane.


For purposes of this document, the term “augmented reality” refers to any technology that can superimpose a computer-generated graphic such as an aimer graphic into a user's view or a 3D display view. Microsoft Corporation uses the term “mixed reality” in relation to their Hololens™ AR technology, but for purposes of this document, the terms are considered equivalent. The term “aimer graphic” or “aimer graphic element” or “aimer object” or the like is used to mean a graphic object that is computer generated and placed in a user's field of view in an augmented reality system to assist the user in the process of getting an object into focus.


When an aimer object is said to be rendered in a particular manner, it means that the aimer object is rendered with a particular color, a particular shape or with particular attributes associated with the relative position of the aimer object with respect to an object in view.


Binocular augmented reality headsets such as the Microsoft Hololens™ headset have the ability to render three dimensional graphics in the user's field of view so that it appears as if the objects are actually in the room with them. This is done by rendering two slightly different views of a scene, one for each of the user's eyes. This induced parallax tricks the brain into thinking the rendered objects are three dimensional.


In accord with this disclosure, an aimer object is rendered as if it existed at the current focal plane of the camera system. When the camera is currently focusing in the near field, the aimer object appears larger and closer to the user. When the camera is focused in the far field, the aimer object appears smaller and farther away from the user. The camera focus can be adjusted by the camera's autofocus routine or set at a fixed focal depth, but either way the aimer object moves to reflect the current plane of focus. The user can adjust the focus or the relative positions of the camera and target object to achieve correct focus.


In certain embodiments consistent with the present disclosure, technology similar to the commercially available SwiftDecoder™ Mobile barcode scanning software can be implemented on the Microsoft Hololens™ headset. This arrangement provides access to controlling the camera and the ability to query the current focal depth D of the camera. In operation, the Application Program Interface (API) can be repeatedly queried for the current focal depth of the camera. The aimer object is rendered/moved in 3D space to a Z direction position at this distance D from the camera, which lies at approximately the same depth as the display. If the planes of the display and the camera differ, the renderings are adjusted accordingly. The X (left/right) and Y (up/down) dimensions of the aimer object remain the same, but the Z (in/out) dimension is altered so that the graphical representation of the aimer object appears to be at the same focal depth as the camera. This allows the user to easily see where the camera is currently focused and to present the barcode on this plane.
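By way of illustration only, the following minimal sketch (in Python) shows the positioning step described above: the X and Y placement of the aimer is kept while the Z coordinate is set to the queried focal depth D. The Aimer type and its field names are hypothetical and are not drawn from the patent, the Hololens™ SDK, or the SwiftDecoder™ API.

```python
from dataclasses import dataclass

@dataclass
class Aimer:
    x: float = 0.0  # left/right placement (unchanged by focus)
    y: float = 0.0  # up/down placement (unchanged by focus)
    z: float = 0.0  # in/out depth; tracks the camera's focal plane

def update_aimer_depth(aimer: Aimer, D: float) -> None:
    """Move the aimer graphic to Z = D while leaving X and Y unchanged."""
    aimer.z = D
```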


Referring now to FIG. 1, a system that is arranged to carry out processes consistent with the present teachings is depicted in block diagram form. This system includes a programmed processor 20 that carries out control operations in a manner consistent with the present discussion in accord with programming instructions stored in a memory device (not separately shown). The processor 20 controls a camera 24 which captures an image in a field of view depicted by the dashed lines.


The image captured by camera system 24 may be a 3D binocular image in the case of camera system 24 being embodied as a 3D camera. In other embodiments, however, the camera system 24 may utilize a two dimensional sensor (such as a CCD sensor). Since many autofocus routines work by adjusting the focus of an image until a maximum contrast in the image is reached, such systems have no appreciation of depth. In such cases, a depth sensor can be used to detect objects in the field of view. Depth sensors can also be used in conjunction with a 3D camera without limitation. Those skilled in the art will recognize many variations upon consideration of the present teachings.
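As a minimal sketch of obtaining D from either source, the following function prefers a depth sensor when one is present and otherwise falls back to the focal depth reported by the autofocus system. The attribute and method names are assumptions standing in for whatever the particular device actually exposes, not an actual device API.

```python
def get_focal_distance(camera):
    """Return the current focal distance D from whatever the device offers."""
    if getattr(camera, "depth_sensor", None) is not None:
        return camera.depth_sensor.read_distance()     # assumed accessor
    return camera.autofocus.current_focal_depth()      # assumed accessor
```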


The distance D is a distance, in the Z direction, from the camera's focus plane (hereinafter, just the camera) to the focal plane (shown by the dotted line). In the illustration of FIG. 1, the forward-most surface of an object 30 is depicted within the field of view.


Processor 20 is in communication with the binocular camera 24 to obtain information regarding the current focus distance D or data that are related to the current focus distance from the camera 24. Based upon this information, the processor knows or can calculate the value of D and using a graphic rendering process 34 (either operating on processor 20 or on a separate graphics processing engine), renders an aimer graphic in an AR/3D display system 38 (such as the viewer of an AR headset or other 3D display) at a depth of Z=D for the current value of D.


It is noted that there is a distance about D, shown as ΔD in FIG. 1, representing the depth of field for the current focal distance D. This value of ΔD can be determined from the camera's optics, D, the lens aperture, and the tolerance of the decoding algorithms. For purposes of this teaching, the value of ΔD can be rigorously determined, approximated, or set to a fixed distance or a fixed proportion of D. In any case, when an object is situated approximately at D, within the window defined by ΔD, the object can be considered to be in focus.
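One of several possible ways to approximate this window is the standard thin-lens depth-of-field calculation sketched below; the lens parameters shown are illustrative assumptions, not values from the disclosure, and, as noted above, a fixed distance or fixed proportion of D may be used instead.

```python
def depth_of_field_window(D, focal_length_mm=4.0, f_number=2.2, coc_mm=0.005):
    """Approximate the in-focus window [near, far] about a focal distance D.

    All values are in millimetres; the lens parameters are placeholders.
    """
    f, N, c = focal_length_mm, f_number, coc_mm
    H = f * f / (N * c) + f                      # hyperfocal distance
    near = H * D / (H + (D - f))
    far = H * D / (H - (D - f)) if D < H else float("inf")
    return near, far

def object_in_focus(object_distance, D, window=None):
    """True when the object lies within the depth-of-field window about D."""
    near, far = window if window is not None else depth_of_field_window(D)
    return near <= object_distance <= far
```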


In FIG. 1, the processing may all take place within a system such as an AR headset. In the case of certain AR headsets, depth sensors are used which can be directly queried to determine a Z-direction distance D to an object. Such headsets may additionally incorporate a network interface 26 that can be used to communicate data (such as information about objects scanned) via a network 28 for storage on a server 32. In other embodiments, the server 32 can assist or fully control the processing discussed herein without limitation.


Referring to FIG. 2, an example aimer graphic 40 is depicted which is rendered as a three dimensional graphic with picture elements represented by X and Y coordinates that construct a circle intersected at 0, 90, 180 and 270 degrees by line segments (as an illustrative example). The Z coordinate is shown graphically to the right of aimer graphic 40, with the scale in the Z direction also being shown in FIGS. 3 and 4. In FIG. 2, the focal plane is at distance D, which as shown represents a relatively large distance from the camera. At this distance, the rendering of the aimer graphic 40 to the user appears small and distant (at distance Z=D) from the user.


Referring now to FIG. 3, distance D to the focal plane is now closer than in FIG. 2. The aimer object 40 appears larger and closer to the user (at distance Z=D) since the Z value of the aimer graphic object's picture elements represents a closer distance to the user. Hence the aimer object 40 is being rendered in a manner that appears closer to the user.


Referring now to FIG. 4, the distance D to the focal plane is closer still than in either FIG. 2 or FIG. 3. Hence, the aimer object 40 appears even larger and closer (at distance Z=D) as rendered by the graphic rendering process, with the Z value being the closest to the camera lens of the three examples of FIGS. 2-4.


While the graphic object 40 as depicted is shown to be a circle with lines crossing the circle at 0, 90, 180 and 270 degrees, this is not to be considered limiting. The graphic object can be rendered as any suitable graphic and rendered with varying attributes in any manner. So, rendering a graphic in a particular manner may be interpreted as relating to the graphic's color, shape, or other attributes. Additionally, multiple graphic objects can be used as an aimer graphic depending upon various circumstances such as distance, the Z direction relationship to objects in the field of view, proximity to the focal plane, etc. Moreover, the color, shape and other attributes of the aimer object 40 can be manipulated to provide more information for the user, as will be described. It is further noted that the size of the aimer object 40 is not, by itself, indicative that an object is in focus. The aimer graphic 40 can be scaled for distance when rendered, or its size can be adjusted by use of multiple aimer graphics 40, with the currently displayed aimer graphic 40 being selected based on the distance D.
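By way of example only, the following sketch selects among a few hypothetical pre-built aimer graphics based on D; the thresholds and file names are illustrative assumptions, since the disclosure states only that the displayed graphic may be selected based on the distance D.

```python
# Hypothetical pre-built aimer variants for near, mid and far focal distances.
AIMER_VARIANTS = {"near": "aimer_near.png", "mid": "aimer_mid.png", "far": "aimer_far.png"}

def select_aimer_graphic(D, near_limit=300.0, far_limit=1500.0):
    """Pick a pre-scaled aimer graphic for focal distance D (millimetres)."""
    if D < near_limit:
        return AIMER_VARIANTS["near"]
    if D < far_limit:
        return AIMER_VARIANTS["mid"]
    return AIMER_VARIANTS["far"]
```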



FIGS. 5-9 represent a plurality of images in which the left portion of each image represents a user's view, in a system such as is disclosed, with varying relationships between an object in the field of view and the aimer graphic 40. The right portion of each image is a graphical representation of the distance D and the camera's relationship to object 30. In each of these illustrations, the image on the left represents an example of a user's view of the forward-most surface (the ‘face’) of object 30 (which might be a package or product, for example) that contains a QR code 44 (or barcode, or other graphic symbol) that the system is attempting to recognize. QR code 44 is depicted to the right of center for clarity, but may be anywhere within the field of view of the camera.


First considering FIG. 5, it is noted that the distance D from the camera lens to the focal plane is relatively short, so that the focal plane is close to the camera, with the object 30 being more distant from the camera. Accordingly, the aimer graphic appears large and the object 30 is in the background. (It is noted that the face of object 30 in this case, as well as in others of the examples shown, is actually out of focus, but is rendered in focus in this and other figures for convenience.)



FIG. 6 represents an example of what the user will see when the face of object 30 is at the same distance as in FIG. 5, but the camera is now focused on the face of the object 30. In this instance, the aimer object appears smaller and more distant from the camera than in FIG. 5 because the focal plane is farther away. Since object 30 has not moved, it appears to be the same size and at the same distance as in FIG. 5.


In the example of FIG. 6, the aimer object 40 is shown to be enhanced by now containing two concentric circles and eight radial lines. Such enhancement in this example signifies that the object 30 is in focus (i.e., within the window ΔD). Additionally, the graphic process may render other indicators of focus, such as the word “FOCUSED” in the display at 48. These are disclosed as merely illustrative of several ways in which the aimer object can be enhanced to clearly indicate to the user that the main object in the display is in focus. In other examples, the aimer object 40 of FIG. 5 may be rendered in one color (e.g., red) to signify that the object 30 is not in focus, and the color can change (e.g., to green) to signify that the object 30 is in focus. Such modifications to the aimer graphic, and others, will be apparent as mechanisms to enhance the information conveyed to the user to indicate that the object 30 is within the range ΔD. Those skilled in the art will recognize other techniques that can be used upon consideration of the present teachings.
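As a small sketch of how such a rendering “manner” might be represented in software, the following function maps a focus state to illustrative appearance attributes. The colors and the “FOCUSED” label follow the examples above; the dictionary keys and counts are assumptions.

```python
def aimer_appearance(in_focus):
    """Return illustrative rendering attributes for the aimer graphic."""
    if in_focus:
        return {"color": "green", "circles": 2, "radial_lines": 8, "label": "FOCUSED"}
    return {"color": "red", "circles": 1, "radial_lines": 4, "label": None}
```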


Proceeding to FIG. 7, this illustration depicts an example of what a user might see when the object 30 is in the foreground, but the focal plane is at distance D which is behind the object 30. In this instance, in accord with certain example embodiments, the aimer graphic 40 would be situated at Z=D which is behind object 30 and thus, in certain embodiments, the view of the aimer graphic 40 is occluded by object 30. In certain embodiments the aimer object 40 can be rendered in dashed lines, lighter color or otherwise to signify that the Z direction location of the aimer graphic is behind the object 30. This signifies to the user that the focus of the camera is on a distant focal plane behind the object 30. The user can thus either adjust the focus or move the object 30 further back.



FIG. 8 represents a scenario in which object 30 is situated at the focal plane, but the focal plane is relatively distant. In this instance, the object 30 appears relatively small and distant, as does the aimer object 40. In this case, the aimer object may also be enhanced by color, graphic change, animation or otherwise to signify that the object is in focus, but such further enhancements are not shown in this illustration. It is also possible to simply have the aimer graphic 40 not be rendered if it is fully occluded by the object 30. It is further possible for the image of the aimer graphic to be illustrated as partially occluded if the object is only partially in front of the aimer graphic. Many other variations will occur to those skilled in the art upon consideration of the present teachings.


Finally, FIG. 9 shows a scenario in which the object 30 is relatively close and is situated at the focal plane. In this example, the object 30 appears close and thus larger to the user as does the aimer object 40. Again, in this example, the aimer object may be enhanced by color, graphic change, animation or any other attribute that can be modified to signify that the object is in focus. Such enhancements are not shown in this illustration.


In the present discussion, the aimer object may be said to be “modified” or “changed” under various circumstances. In this context, the term “modify” or the like may actually be implemented as a complete substitution of one graphic object for another to accomplish the prescribed rendering. For example, when an object is out of focus, the aimer object may have one appearance and when focus is achieved, a completely different aimer object with different attributes may be substituted therefor. The user sees that the aimer object has changed and may perceive this change as a modification or change. In this context, the term “modify” can be used to describe an attribute change (e.g., color or intensity) or could be used to signify that a different aimer graphic has been completely substituted for the prior aimer graphic and should be broadly construed without regard for the mechanics of implementation.


Turning attention now to FIG. 10, an example process 100 is depicted consistent with the present teachings, starting at 104, after which a distance to the focal plane D of the camera (the focal distance) is determined at 108. This can be carried out, for example, by querying a depth sensor or by knowledge of the lens positioning in an autofocus system. Once this distance D is known, the process proceeds to 112 where the graphics processing generates an appropriate rendering of the aiming graphic at a depth of Z=D. Control then returns to 108, where the process is repeated over and over to maintain the aimer graphic at the focal distance D. Of course, many variations are possible with this basic process, such as the variations depicted in FIG. 11.
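A minimal sketch of process 100, assuming the helper functions sketched earlier (get_focal_distance and update_aimer_depth), is simply a loop that re-queries D and re-positions the aimer on each pass:

```python
def run_basic_aimer_loop(camera, aimer):
    """Process 100 (FIG. 10): keep the aimer graphic at the focal distance."""
    while True:
        D = get_focal_distance(camera)   # step 108: determine focal distance
        update_aimer_depth(aimer, D)     # step 112: render aimer at Z = D
```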



FIG. 11 illustrates an example process 200 starting at 202. The distance D to the focal plane is determined at 108, after which the system prepares to render the aiming graphic in an unmodified form at 206 by setting the depth of the aimer graphic rendering to Z=D. However, in this example embodiment, prior to actually rendering the aimer graphic, its position with respect to objects that are in the field of view is tested. In this example, the aimer graphic is considered unmodified if it indicates that the focal plane is in front of any object in view.


At 210, the process first determines if the object is within the focus window distance ΔD. If so, then the aiming graphic is modified (or selected) to have attributes that depict an in-focus object, and that modified aimer graphic is rendered at 214. If the object is not within the focus depth at 210, then the process determines if the object is closer than the focus depth at 218. If so, then the aimer graphic is modified (or selected) so as to depict that the aimer graphic is fully or partially occluded by the object in the foreground at 222.


If the distance D is neither at the object nor behind the object, as determined at 210 and 218, then the unmodified aimer graphic is rendered at 226. In all cases, when the rendering is done, the process returns to 108 so as to continuously (repeatedly) adjust the rendering of the aimer graphic to correctly depict the relative Z-direction (depth) position of the aimer graphic with respect to an object or objects in the field of view.
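A minimal sketch of this decision logic, assuming the depth-of-field window helper sketched earlier and hypothetical state names, might look as follows; each returned state corresponds to one of the rendering branches (214, 222, or 226) above.

```python
IN_FOCUS, OCCLUDED, UNMODIFIED = "in_focus", "occluded", "unmodified"

def classify_aimer_state(object_distance, D, window):
    """Decide how the aimer graphic should be rendered for this pass."""
    near, far = window                      # depth-of-field window about D
    if near <= object_distance <= far:      # 210: object within the focus window
        return IN_FOCUS                     # render in-focus graphic (214)
    if object_distance < D:                 # 218: object closer than the focal plane
        return OCCLUDED                     # render aimer as occluded (222)
    return UNMODIFIED                       # render unmodified aimer (226)
```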


Thus, a method of assisting in focusing a three dimensional camera system on an object within a field of view involves: at the camera system, determining a distance D, within the field of view, to a current focal plane; and rendering to a display, an aimer graphic element with a Z direction distance equal to D in a manner that causes the aimer graphic element to move in the Z direction with changes in the focal plane.


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. No. 6,832,725; U.S. Pat. No. 7,128,266;
  • U.S. Pat. No. 7,159,783; U.S. Pat. No. 7,413,127;
  • U.S. Pat. No. 7,726,575; U.S. Pat. No. 8,294,969;
  • U.S. Pat. No. 8,317,105; U.S. Pat. No. 8,322,622;
  • U.S. Pat. No. 8,366,005; U.S. Pat. No. 8,371,507;
  • U.S. Pat. No. 8,376,233; U.S. Pat. No. 8,381,979;
  • U.S. Pat. No. 8,390,909; U.S. Pat. No. 8,408,464;
  • U.S. Pat. No. 8,408,468; U.S. Pat. No. 8,408,469;
  • U.S. Pat. No. 8,424,768; U.S. Pat. No. 8,448,863;
  • U.S. Pat. No. 8,457,013; U.S. Pat. No. 8,459,557;
  • U.S. Pat. No. 8,469,272; U.S. Pat. No. 8,474,712;
  • U.S. Pat. No. 8,479,992; U.S. Pat. No. 8,490,877;
  • U.S. Pat. No. 8,517,271; U.S. Pat. No. 8,523,076;
  • U.S. Pat. No. 8,528,818; U.S. Pat. No. 8,544,737;
  • U.S. Pat. No. 8,548,242; U.S. Pat. No. 8,548,420;
  • U.S. Pat. No. 8,550,335; U.S. Pat. No. 8,550,354;
  • U.S. Pat. No. 8,550,357; U.S. Pat. No. 8,556,174;
  • U.S. Pat. No. 8,556,176; U.S. Pat. No. 8,556,177;
  • U.S. Pat. No. 8,559,767; U.S. Pat. No. 8,599,957;
  • U.S. Pat. No. 8,561,895; U.S. Pat. No. 8,561,903;
  • U.S. Pat. No. 8,561,905; U.S. Pat. No. 8,565,107;
  • U.S. Pat. No. 8,571,307; U.S. Pat. No. 8,579,200;
  • U.S. Pat. No. 8,583,924; U.S. Pat. No. 8,584,945;
  • U.S. Pat. No. 8,587,595; U.S. Pat. No. 8,587,697;
  • U.S. Pat. No. 8,588,869; U.S. Pat. No. 8,590,789;
  • U.S. Pat. No. 8,596,539; U.S. Pat. No. 8,596,542;
  • U.S. Pat. No. 8,596,543; U.S. Pat. No. 8,599,271;
  • U.S. Pat. No. 8,599,957; U.S. Pat. No. 8,600,158;
  • U.S. Pat. No. 8,600,167; U.S. Pat. No. 8,602,309;
  • U.S. Pat. No. 8,608,053; U.S. Pat. No. 8,608,071;
  • U.S. Pat. No. 8,611,309; U.S. Pat. No. 8,615,487;
  • U.S. Pat. No. 8,616,454; U.S. Pat. No. 8,621,123;
  • U.S. Pat. No. 8,622,303; U.S. Pat. No. 8,628,013;
  • U.S. Pat. No. 8,628,015; U.S. Pat. No. 8,628,016;
  • U.S. Pat. No. 8,629,926; U.S. Pat. No. 8,630,491;
  • U.S. Pat. No. 8,635,309; U.S. Pat. No. 8,636,200;
  • U.S. Pat. No. 8,636,212; U.S. Pat. No. 8,636,215;
  • U.S. Pat. No. 8,636,224; U.S. Pat. No. 8,638,806;
  • U.S. Pat. No. 8,640,958; U.S. Pat. No. 8,640,960;
  • U.S. Pat. No. 8,643,717; U.S. Pat. No. 8,646,692;
  • U.S. Pat. No. 8,646,694; U.S. Pat. No. 8,657,200;
  • U.S. Pat. No. 8,659,397; U.S. Pat. No. 8,668,149;
  • U.S. Pat. No. 8,678,285; U.S. Pat. No. 8,678,286;
  • U.S. Pat. No. 8,682,077; U.S. Pat. No. 8,687,282;
  • U.S. Pat. No. 8,692,927; U.S. Pat. No. 8,695,880;
  • U.S. Pat. No. 8,698,949; U.S. Pat. No. 8,717,494;
  • U.S. Pat. No. 8,717,494; U.S. Pat. No. 8,720,783;
  • U.S. Pat. No. 8,723,804; U.S. Pat. No. 8,723,904;
  • U.S. Pat. No. 8,727,223; U.S. Pat. No. D702,237;
  • U.S. Pat. No. 8,740,082; U.S. Pat. No. 8,740,085;
  • U.S. Pat. No. 8,746,563; U.S. Pat. No. 8,750,445;
  • U.S. Pat. No. 8,752,766; U.S. Pat. No. 8,756,059;
  • U.S. Pat. No. 8,757,495; U.S. Pat. No. 8,760,563;
  • U.S. Pat. No. 8,763,909; U.S. Pat. No. 8,777,108;
  • U.S. Pat. No. 8,777,109; U.S. Pat. No. 8,779,898;
  • U.S. Pat. No. 8,781,520; U.S. Pat. No. 8,783,573;
  • U.S. Pat. No. 8,789,757; U.S. Pat. No. 8,789,758;
  • U.S. Pat. No. 8,789,759; U.S. Pat. No. 8,794,520;
  • U.S. Pat. No. 8,794,522; U.S. Pat. No. 8,794,525;
  • U.S. Pat. No. 8,794,526; U.S. Pat. No. 8,798,367;
  • U.S. Pat. No. 8,807,431; U.S. Pat. No. 8,807,432;
  • U.S. Pat. No. 8,820,630; U.S. Pat. No. 8,822,848;
  • U.S. Pat. No. 8,824,692; U.S. Pat. No. 8,824,696;
  • U.S. Pat. No. 8,842,849; U.S. Pat. No. 8,844,822;
  • U.S. Pat. No. 8,844,823; U.S. Pat. No. 8,849,019;
  • U.S. Pat. No. 8,851,383; U.S. Pat. No. 8,854,633;
  • U.S. Pat. No. 8,866,963; U.S. Pat. No. 8,868,421;
  • U.S. Pat. No. 8,868,519; U.S. Pat. No. 8,868,802;
  • U.S. Pat. No. 8,868,803; U.S. Pat. No. 8,870,074;
  • U.S. Pat. No. 8,879,639; U.S. Pat. No. 8,880,426;
  • U.S. Pat. No. 8,881,983; U.S. Pat. No. 8,881,987;
  • U.S. Pat. No. 8,903,172; U.S. Pat. No. 8,908,995;
  • U.S. Pat. No. 8,910,870; U.S. Pat. No. 8,910,875;
  • U.S. Pat. No. 8,914,290; U.S. Pat. No. 8,914,788;
  • U.S. Pat. No. 8,915,439; U.S. Pat. No. 8,915,444;
  • U.S. Pat. No. 8,916,789; U.S. Pat. No. 8,918,250;
  • U.S. Pat. No. 8,918,564; U.S. Pat. No. 8,925,818;
  • U.S. Pat. No. 8,939,374; U.S. Pat. No. 8,942,480;
  • U.S. Pat. No. 8,944,313; U.S. Pat. No. 8,944,327;
  • U.S. Pat. No. 8,944,332; U.S. Pat. No. 8,950,678;
  • U.S. Pat. No. 8,967,468; U.S. Pat. No. 8,971,346;
  • U.S. Pat. No. 8,976,030; U.S. Pat. No. 8,976,368;
  • U.S. Pat. No. 8,978,981; U.S. Pat. No. 8,978,983;
  • U.S. Pat. No. 8,978,984; U.S. Pat. No. 8,985,456;
  • U.S. Pat. No. 8,985,457; U.S. Pat. No. 8,985,459;
  • U.S. Pat. No. 8,985,461; U.S. Pat. No. 8,988,578;
  • U.S. Pat. No. 8,988,590; U.S. Pat. No. 8,991,704;
  • U.S. Pat. No. 8,996,194; U.S. Pat. No. 8,996,384;
  • U.S. Pat. No. 9,002,641; U.S. Pat. No. 9,007,368;
  • U.S. Pat. No. 9,010,641; U.S. Pat. No. 9,015,513;
  • U.S. Pat. No. 9,016,576; U.S. Pat. No. 9,022,288;
  • U.S. Pat. No. 9,030,964; U.S. Pat. No. 9,033,240;
  • U.S. Pat. No. 9,033,242; U.S. Pat. No. 9,036,054;
  • U.S. Pat. No. 9,037,344; U.S. Pat. No. 9,038,911;
  • U.S. Pat. No. 9,038,915; U.S. Pat. No. 9,047,098;
  • U.S. Pat. No. 9,047,359; U.S. Pat. No. 9,047,420;
  • U.S. Pat. No. 9,047,525; U.S. Pat. No. 9,047,531;
  • U.S. Pat. No. 9,053,055; U.S. Pat. No. 9,053,378;
  • U.S. Pat. No. 9,053,380; U.S. Pat. No. 9,058,526;
  • U.S. Pat. No. 9,064,165; U.S. Pat. No. 9,064,167;
  • U.S. Pat. No. 9,064,168; U.S. Pat. No. 9,064,254;
  • U.S. Pat. No. 9,066,032; U.S. Pat. No. 9,070,032;
  • U.S. Design Pat. No. D716,285;
  • U.S. Design Pat. No. D723,560;
  • U.S. Design Pat. No. D730,357;
  • U.S. Design Pat. No. D730,901;
  • U.S. Design Pat. No. D730,902;
  • U.S. Design Pat. No. D733,112;
  • U.S. Design Pat. No. D734,339;
  • International Publication No. 2013/163789;
  • International Publication No. 2013/173985;
  • International Publication No. 2014/019130;
  • International Publication No. 2014/110495;
  • U.S. Patent Application Publication No. 2008/0185432;
  • U.S. Patent Application Publication No. 2009/0134221;
  • U.S. Patent Application Publication No. 2010/0177080;
  • U.S. Patent Application Publication No. 2010/0177076;
  • U.S. Patent Application Publication No. 2010/0177707;
  • U.S. Patent Application Publication No. 2010/0177749;
  • U.S. Patent Application Publication No. 2010/0265880;
  • U.S. Patent Application Publication No. 2011/0202554;
  • U.S. Patent Application Publication No. 2012/0111946;
  • U.S. Patent Application Publication No. 2012/0168511;
  • U.S. Patent Application Publication No. 2012/0168512;
  • U.S. Patent Application Publication No. 2012/0193423;
  • U.S. Patent Application Publication No. 2012/0203647;
  • U.S. Patent Application Publication No. 2012/0223141;
  • U.S. Patent Application Publication No. 2012/0228382;
  • U.S. Patent Application Publication No. 2012/0248188;
  • U.S. Patent Application Publication No. 2013/0043312;
  • U.S. Patent Application Publication No. 2013/0082104;
  • U.S. Patent Application Publication No. 2013/0175341;
  • U.S. Patent Application Publication No. 2013/0175343;
  • U.S. Patent Application Publication No. 2013/0257744;
  • U.S. Patent Application Publication No. 2013/0257759;
  • U.S. Patent Application Publication No. 2013/0270346;
  • U.S. Patent Application Publication No. 2013/0287258;
  • U.S. Patent Application Publication No. 2013/0292475;
  • U.S. Patent Application Publication No. 2013/0292477;
  • U.S. Patent Application Publication No. 2013/0293539;
  • U.S. Patent Application Publication No. 2013/0293540;
  • U.S. Patent Application Publication No. 2013/0306728;
  • U.S. Patent Application Publication No. 2013/0306731;
  • U.S. Patent Application Publication No. 2013/0307964;
  • U.S. Patent Application Publication No. 2013/0308625;
  • U.S. Patent Application Publication No. 2013/0313324;
  • U.S. Patent Application Publication No. 2013/0313325;
  • U.S. Patent Application Publication No. 2013/0342717;
  • U.S. Patent Application Publication No. 2014/0001267;
  • U.S. Patent Application Publication No. 2014/0008439;
  • U.S. Patent Application Publication No. 2014/0025584;
  • U.S. Patent Application Publication No. 2014/0034734;
  • U.S. Patent Application Publication No. 2014/0036848;
  • U.S. Patent Application Publication No. 2014/0039693;
  • U.S. Patent Application Publication No. 2014/0042814;
  • U.S. Patent Application Publication No. 2014/0049120;
  • U.S. Patent Application Publication No. 2014/0049635;
  • U.S. Patent Application Publication No. 2014/0061306;
  • U.S. Patent Application Publication No. 2014/0063289;
  • U.S. Patent Application Publication No. 2014/0066136;
  • U.S. Patent Application Publication No. 2014/0067692;
  • U.S. Patent Application Publication No. 2014/0070005;
  • U.S. Patent Application Publication No. 2014/0071840;
  • U.S. Patent Application Publication No. 2014/0074746;
  • U.S. Patent Application Publication No. 2014/0076974;
  • U.S. Patent Application Publication No. 2014/0078341;
  • U.S. Patent Application Publication No. 2014/0078345;
  • U.S. Patent Application Publication No. 2014/0097249;
  • U.S. Patent Application Publication No. 2014/0098792;
  • U.S. Patent Application Publication No. 2014/0100813;
  • U.S. Patent Application Publication No. 2014/0103115;
  • U.S. Patent Application Publication No. 2014/0104413;
  • U.S. Patent Application Publication No. 2014/0104414;
  • U.S. Patent Application Publication No. 2014/0104416;
  • U.S. Patent Application Publication No. 2014/0104451;
  • U.S. Patent Application Publication No. 2014/0106594;
  • U.S. Patent Application Publication No. 2014/0106725;
  • U.S. Patent Application Publication No. 2014/0108010;
  • U.S. Patent Application Publication No. 2014/0108402;
  • U.S. Patent Application Publication No. 2014/0110485;
  • U.S. Patent Application Publication No. 2014/0114530;
  • U.S. Patent Application Publication No. 2014/0124577;
  • U.S. Patent Application Publication No. 2014/0124579;
  • U.S. Patent Application Publication No. 2014/0125842;
  • U.S. Patent Application Publication No. 2014/0125853;
  • U.S. Patent Application Publication No. 2014/0125999;
  • U.S. Patent Application Publication No. 2014/0129378;
  • U.S. Patent Application Publication No. 2014/0131438;
  • U.S. Patent Application Publication No. 2014/0131441;
  • U.S. Patent Application Publication No. 2014/0131443;
  • U.S. Patent Application Publication No. 2014/0131444;
  • U.S. Patent Application Publication No. 2014/0131445;
  • U.S. Patent Application Publication No. 2014/0131448;
  • U.S. Patent Application Publication No. 2014/0133379;
  • U.S. Patent Application Publication No. 2014/0136208;
  • U.S. Patent Application Publication No. 2014/0140585;
  • U.S. Patent Application Publication No. 2014/0151453;
  • U.S. Patent Application Publication No. 2014/0152882;
  • U.S. Patent Application Publication No. 2014/0158770;
  • U.S. Patent Application Publication No. 2014/0159869;
  • U.S. Patent Application Publication No. 2014/0166755;
  • U.S. Patent Application Publication No. 2014/0166759;
  • U.S. Patent Application Publication No. 2014/0168787;
  • U.S. Patent Application Publication No. 2014/0175165;
  • U.S. Patent Application Publication No. 2014/0175172;
  • U.S. Patent Application Publication No. 2014/0191644;
  • U.S. Patent Application Publication No. 2014/0191913;
  • U.S. Patent Application Publication No. 2014/0197238;
  • U.S. Patent Application Publication No. 2014/0197239;
  • U.S. Patent Application Publication No. 2014/0197304;
  • U.S. Patent Application Publication No. 2014/0214631;
  • U.S. Patent Application Publication No. 2014/0217166;
  • U.S. Patent Application Publication No. 2014/0217180;
  • U.S. Patent Application Publication No. 2014/0231500;
  • U.S. Patent Application Publication No. 2014/0232930;
  • U.S. Patent Application Publication No. 2014/0247315;
  • U.S. Patent Application Publication No. 2014/0263493;
  • U.S. Patent Application Publication No. 2014/0263645;
  • U.S. Patent Application Publication No. 2014/0267609;
  • U.S. Patent Application Publication No. 2014/0270196;
  • U.S. Patent Application Publication No. 2014/0270229;
  • U.S. Patent Application Publication No. 2014/0278387;
  • U.S. Patent Application Publication No. 2014/0278391;
  • U.S. Patent Application Publication No. 2014/0282210;
  • U.S. Patent Application Publication No. 2014/0284384;
  • U.S. Patent Application Publication No. 2014/0288933;
  • U.S. Patent Application Publication No. 2014/0297058;
  • U.S. Patent Application Publication No. 2014/0299665;
  • U.S. Patent Application Publication No. 2014/0312121;
  • U.S. Patent Application Publication No. 2014/0319220;
  • U.S. Patent Application Publication No. 2014/0319221;
  • U.S. Patent Application Publication No. 2014/0326787;
  • U.S. Patent Application Publication No. 2014/0332590;
  • U.S. Patent Application Publication No. 2014/0344943;
  • U.S. Patent Application Publication No. 2014/0346233;
  • U.S. Patent Application Publication No. 2014/0351317;
  • U.S. Patent Application Publication No. 2014/0353373;
  • U.S. Patent Application Publication No. 2014/0361073;
  • U.S. Patent Application Publication No. 2014/0361082;
  • U.S. Patent Application Publication No. 2014/0362184;
  • U.S. Patent Application Publication No. 2014/0363015;
  • U.S. Patent Application Publication No. 2014/0369511;
  • U.S. Patent Application Publication No. 2014/0374483;
  • U.S. Patent Application Publication No. 2014/0374485;
  • U.S. Patent Application Publication No. 2015/0001301;
  • U.S. Patent Application Publication No. 2015/0001304;
  • U.S. Patent Application Publication No. 2015/0003673;
  • U.S. Patent Application Publication No. 2015/0009338;
  • U.S. Patent Application Publication No. 2015/0009610;
  • U.S. Patent Application Publication No. 2015/0014416;
  • U.S. Patent Application Publication No. 2015/0021397;
  • U.S. Patent Application Publication No. 2015/0028102;
  • U.S. Patent Application Publication No. 2015/0028103;
  • U.S. Patent Application Publication No. 2015/0028104;
  • U.S. Patent Application Publication No. 2015/0029002;
  • U.S. Patent Application Publication No. 2015/0032709;
  • U.S. Patent Application Publication No. 2015/0039309;
  • U.S. Patent Application Publication No. 2015/0039878;
  • U.S. Patent Application Publication No. 2015/0040378;
  • U.S. Patent Application Publication No. 2015/0048168;
  • U.S. Patent Application Publication No. 2015/0049347;
  • U.S. Patent Application Publication No. 2015/0051992;
  • U.S. Patent Application Publication No. 2015/0053766;
  • U.S. Patent Application Publication No. 2015/0053768;
  • U.S. Patent Application Publication No. 2015/0053769;
  • U.S. Patent Application Publication No. 2015/0060544;
  • U.S. Patent Application Publication No. 2015/0062366;
  • U.S. Patent Application Publication No. 2015/0063215;
  • U.S. Patent Application Publication No. 2015/0063676;
  • U.S. Patent Application Publication No. 2015/0069130;
  • U.S. Patent Application Publication No. 2015/0071819;
  • U.S. Patent Application Publication No. 2015/0083800;
  • U.S. Patent Application Publication No. 2015/0086114;
  • U.S. Patent Application Publication No. 2015/0088522;
  • U.S. Patent Application Publication No. 2015/0096872;
  • U.S. Patent Application Publication No. 2015/0099557;
  • U.S. Patent Application Publication No. 2015/0100196;
  • U.S. Patent Application Publication No. 2015/0102109;
  • U.S. Patent Application Publication No. 2015/0115035;
  • U.S. Patent Application Publication No. 2015/0127791;
  • U.S. Patent Application Publication No. 2015/0128116;
  • U.S. Patent Application Publication No. 2015/0129659;
  • U.S. Patent Application Publication No. 2015/0133047;
  • U.S. Patent Application Publication No. 2015/0134470;
  • U.S. Patent Application Publication No. 2015/0136851;
  • U.S. Patent Application Publication No. 2015/0136854;
  • U.S. Patent Application Publication No. 2015/0142492;
  • U.S. Patent Application Publication No. 2015/0144692;
  • U.S. Patent Application Publication No. 2015/0144698;
  • U.S. Patent Application Publication No. 2015/0144701;
  • U.S. Patent Application Publication No. 2015/0149946;
  • U.S. Patent Application Publication No. 2015/0161429;
  • U.S. Patent Application Publication No. 2015/0169925;
  • U.S. Patent Application Publication No. 2015/0169929;
  • U.S. Patent Application Publication No. 2015/0178523;
  • U.S. Patent Application Publication No. 2015/0178534;
  • U.S. Patent Application Publication No. 2015/0178535;
  • U.S. Patent Application Publication No. 2015/0178536;
  • U.S. Patent Application Publication No. 2015/0178537;
  • U.S. Patent Application Publication No. 2015/0181093;
  • U.S. Patent Application Publication No. 2015/0181109;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering);
  • U.S. patent application Ser. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/277,337 for MULTIPURPOSE OPTICAL READER, filed May 14, 2014 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/283,282 for TERMINAL HAVING ILLUMINATION AND FOCUS CONTROL filed May 21, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/327,827 for a MOBILE-PHONE ADAPTER FOR ELECTRONIC TRANSACTIONS, filed Jul. 10, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/334,934 for a SYSTEM AND METHOD FOR INDICIA VERIFICATION, filed Jul. 18, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/339,708 for LASER SCANNING CODE SYMBOL READING SYSTEM, filed Jul. 24, 2014 (Xian et al.);
  • U.S. patent application Ser. No. 14/340,627 for an AXIALLY REINFORCED FLEXIBLE SCAN ELEMENT, filed Jul. 25, 2014 (Rueblinger et al.);
  • U.S. patent application Ser. No. 14/446,391 for MULTIFUNCTION POINT OF SALE APPARATUS WITH OPTICAL SIGNATURE CAPTURE filed Jul. 30, 2014 (Good et al.);
  • U.S. patent application Ser. No. 14/452,697 for INTERACTIVE INDICIA READER, filed Aug. 6, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/453,019 for DIMENSIONING SYSTEM WITH GUIDED ALIGNMENT, filed Aug. 6, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/462,801 for MOBILE COMPUTING DEVICE WITH DATA COGNITION SOFTWARE, filed on Aug. 19, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/483,056 for VARIABLE DEPTH OF FIELD BARCODE SCANNER filed Sep. 10, 2014 (McCloskey et al.);
  • U.S. patent application Ser. No. 14/513,808 for IDENTIFYING INVENTORY ITEMS IN A STORAGE FACILITY filed Oct. 14, 2014 (Singel et al.);
  • U.S. patent application Ser. No. 14/519,195 for HANDHELD DIMENSIONING SYSTEM WITH FEEDBACK filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,179 for DIMENSIONING SYSTEM WITH MULTIPATH INTERFERENCE MITIGATION filed Oct. 21, 2014 (Thuries et al.);
  • U.S. patent application Ser. No. 14/519,211 for SYSTEM AND METHOD FOR DIMENSIONING filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/519,233 for HANDHELD DIMENSIONER WITH DATA-QUALITY INDICATION filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,249 for HANDHELD DIMENSIONING SYSTEM WITH MEASUREMENT-CONFORMANCE FEEDBACK filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/527,191 for METHOD AND SYSTEM FOR RECOGNIZING SPEECH USING WILDCARDS IN AN EXPECTED RESPONSE filed Oct. 29, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/529,563 for ADAPTABLE INTERFACE FOR A MOBILE COMPUTING DEVICE filed Oct. 31, 2014 (Schoon et al.);
  • U.S. patent application Ser. No. 14/529,857 for BARCODE READER WITH SECURITY FEATURES filed Oct. 31, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/398,542 for PORTABLE ELECTRONIC DEVICES HAVING A SEPARATE LOCATION TRIGGER UNIT FOR USE IN CONTROLLING AN APPLICATION UNIT filed Nov. 3, 2014 (Bian et al.);
  • U.S. patent application Ser. No. 14/531,154 for DIRECTING AN INSPECTOR THROUGH AN INSPECTION filed Nov. 3, 2014 (Miller et al.);
  • U.S. patent application Ser. No. 14/533,319 for BARCODE SCANNING SYSTEM USING WEARABLE DEVICE WITH EMBEDDED CAMERA filed Nov. 5, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/535,764 for CONCATENATED EXPECTED RESPONSES FOR SPEECH RECOGNITION filed Nov. 7, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/568,305 for AUTO-CONTRAST VIEWFINDER FOR AN INDICIA READER filed Dec. 12, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/573,022 for DYNAMIC DIAGNOSTIC INDICATOR GENERATION filed Dec. 17, 2014 (Goldsmith);
  • U.S. patent application Ser. No. 14/578,627 for SAFETY SYSTEM AND METHOD filed Dec. 22, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/580,262 for MEDIA GATE FOR THERMAL TRANSFER PRINTERS filed Dec. 23, 2014 (Bowles);
  • U.S. patent application Ser. No. 14/590,024 for SHELVING AND PACKAGE LOCATING SYSTEMS FOR DELIVERY VEHICLES filed Jan. 6, 2015 (Payne);
  • U.S. patent application Ser. No. 14/596,757 for SYSTEM AND METHOD FOR DETECTING BARCODE PRINTING ERRORS filed Jan. 14, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/416,147 for OPTICAL READING APPARATUS HAVING VARIABLE SETTINGS filed Jan. 21, 2015 (Chen et al.);
  • U.S. patent application Ser. No. 14/614,706 for DEVICE FOR SUPPORTING AN ELECTRONIC TOOL ON A USER'S HAND filed Feb. 5, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/614,796 for CARGO APPORTIONMENT TECHNIQUES filed Feb. 5, 2015 (Morton et al.);
  • U.S. patent application Ser. No. 29/516,892 for TABLE COMPUTER filed Feb. 6, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/619,093 for METHODS FOR TRAINING A SPEECH RECOGNITION SYSTEM filed Feb. 11, 2015 (Pecorari);
  • U.S. patent application Ser. No. 14/628,708 for DEVICE, SYSTEM, AND METHOD FOR DETERMINING THE STATUS OF CHECKOUT LANES filed Feb. 23, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/630,841 for TERMINAL INCLUDING IMAGING ASSEMBLY filed Feb. 25, 2015 (Gomez et al.);
  • U.S. patent application Ser. No. 14/635,346 for SYSTEM AND METHOD FOR RELIABLE STORE-AND-FORWARD DATA HANDLING BY ENCODED INFORMATION READING TERMINALS filed Mar. 2, 2015 (Sevier);
  • U.S. patent application Ser. No. 29/519,017 for SCANNER filed Mar. 2, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/405,278 for DESIGN PATTERN FOR SECURE STORE filed Mar. 9, 2015 (Zhu et al.);
  • U.S. patent application Ser. No. 14/660,970 for DECODABLE INDICIA READING TERMINAL WITH COMBINED ILLUMINATION filed Mar. 18, 2015 (Kearney et al.);
  • U.S. patent application Ser. No. 14/661,013 for REPROGRAMMING SYSTEM AND METHOD FOR DEVICES INCLUDING PROGRAMMING SYMBOL filed Mar. 18, 2015 (Soule et al.);
  • U.S. patent application Ser. No. 14/662,922 for MULTIFUNCTION POINT OF SALE SYSTEM filed Mar. 19, 2015 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/663,638 for VEHICLE MOUNT COMPUTER WITH CONFIGURABLE IGNITION SWITCH BEHAVIOR filed Mar. 20, 2015 (Davis et al.);
  • U.S. patent application Ser. No. 14/664,063 for METHOD AND APPLICATION FOR SCANNING A BARCODE WITH A SMART DEVICE WHILE CONTINUOUSLY RUNNING AND DISPLAYING AN APPLICATION ON THE SMART DEVICE DISPLAY filed Mar. 20, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/669,280 for TRANSFORMING COMPONENTS OF A WEB PAGE TO VOICE PROMPTS filed Mar. 26, 2015 (Funyak et al.);
  • U.S. patent application Ser. No. 14/674,329 for AIMER FOR BARCODE SCANNING filed Mar. 31, 2015 (Bidwell);
  • U.S. patent application Ser. No. 14/676,109 for INDICIA READER filed Apr. 1, 2015 (Huck);
  • U.S. patent application Ser. No. 14/676,327 for DEVICE MANAGEMENT PROXY FOR SECURE DEVICES filed Apr. 1, 2015 (Yeakley et al.);
  • U.S. patent application Ser. No. 14/676,898 for NAVIGATION SYSTEM CONFIGURED TO INTEGRATE MOTION SENSING DEVICE INPUTS filed Apr. 2, 2015 (Showering);
  • U.S. patent application Ser. No. 14/679,275 for DIMENSIONING SYSTEM CALIBRATION SYSTEMS AND METHODS filed Apr. 6, 2015 (Laffargue et al.);
  • U.S. patent application Ser. No. 29/523,098 for HANDLE FOR A TABLET COMPUTER filed Apr. 7, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/682,615 for SYSTEM AND METHOD FOR POWER MANAGEMENT OF MOBILE DEVICES filed Apr. 9, 2015 (Murawski et al.);
  • U.S. patent application Ser. No. 14/686,822 for MULTIPLE PLATFORM SUPPORT SYSTEM AND METHOD filed Apr. 15, 2015 (Qu et al.);
  • U.S. patent application Ser. No. 14/687,289 for SYSTEM FOR COMMUNICATION VIA A PERIPHERAL HUB filed Apr. 15, 2015 (Kohtz et al.);
  • U.S. patent application Ser. No. 29/524,186 for SCANNER filed Apr. 17, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/695,364 for MEDICATION MANAGEMENT SYSTEM filed Apr. 24, 2015 (Sewell et al.);
  • U.S. patent application Ser. No. 14/695,923 for SECURE UNATTENDED NETWORK AUTHENTICATION filed Apr. 24, 2015 (Kubler et al.);
  • U.S. patent application Ser. No. 29/525,068 for TABLET COMPUTER WITH REMOVABLE SCANNING DEVICE filed Apr. 27, 2015 (Schulte et al.);
  • U.S. patent application Ser. No. 14/699,436 for SYMBOL READING SYSTEM HAVING PREDICTIVE DIAGNOSTICS filed Apr. 29, 2015 (Nahill et al.);
  • U.S. patent application Ser. No. 14/702,110 for SYSTEM AND METHOD FOR REGULATING BARCODE DATA INJECTION INTO A RUNNING APPLICATION ON A SMART DEVICE filed May 1, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/702,979 for TRACKING BATTERY CONDITIONS filed May 4, 2015 (Young et al.);
  • U.S. patent application Ser. No. 14/704,050 for INTERMEDIATE LINEAR POSITIONING filed May 5, 2015 (Charpentier et al.);
  • U.S. patent application Ser. No. 14/705,012 for HANDS-FREE HUMAN MACHINE INTERFACE RESPONSIVE TO A DRIVER OF A VEHICLE filed May 6, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/705,407 for METHOD AND SYSTEM TO PROTECT SOFTWARE-BASED NETWORK-CONNECTED DEVICES FROM ADVANCED PERSISTENT THREAT filed May 6, 2015 (Hussey et al.);
  • U.S. patent application Ser. No. 14/707,037 for SYSTEM AND METHOD FOR DISPLAY OF INFORMATION USING A VEHICLE-MOUNT COMPUTER filed May 8, 2015 (Chamberlin);
  • U.S. patent application Ser. No. 14/707,123 for APPLICATION INDEPENDENT DEX/UCS INTERFACE filed May 8, 2015 (Pape);
  • U.S. patent application Ser. No. 14/707,492 for METHOD AND APPARATUS FOR READING OPTICAL INDICIA USING A PLURALITY OF DATA SOURCES filed May 8, 2015 (Smith et al.);
  • U.S. patent application Ser. No. 14/710,666 for PRE-PAID USAGE SYSTEM FOR ENCODED INFORMATION READING TERMINALS filed May 13, 2015 (Smith);
  • U.S. patent application Ser. No. 29/526,918 for CHARGING BASE filed May 14, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/715,672 for AUGUMENTED REALITY ENABLED HAZARD DISPLAY filed May 19, 2015 (Venkatesha et al.);
  • U.S. patent application Ser. No. 14/715,916 for EVALUATING IMAGE VALUES filed May 19, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/722,608 for INTERACTIVE USER INTERFACE FOR CAPTURING A DOCUMENT IN AN IMAGE SIGNAL filed May 27, 2015 (Showering et al.);
  • U.S. patent application Ser. No. 29/528,165 for IN-COUNTER BARCODE SCANNER filed May 27, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/724,134 for ELECTRONIC DEVICE WITH WIRELESS PATH SELECTION CAPABILITY filed May 28, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 14/724,849 for METHOD OF PROGRAMMING THE DEFAULT CABLE INTERFACE SOFTWARE IN AN INDICIA READING DEVICE filed May 29, 2015 (Barten);
  • U.S. patent application Ser. No. 14/724,908 for IMAGING APPARATUS HAVING IMAGING ASSEMBLY filed May 29, 2015 (Barber et al.);
  • U.S. patent application Ser. No. 14/725,352 for APPARATUS AND METHODS FOR MONITORING ONE OR MORE PORTABLE DATA TERMINALS (Caballero et al.);
  • U.S. patent application Ser. No. 29/528,590 for ELECTRONIC DEVICE filed May 29, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 29/528,890 for MOBILE COMPUTER HOUSING filed Jun. 2, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/728,397 for DEVICE MANAGEMENT USING VIRTUAL INTERFACES CROSS-REFERENCE TO RELATED APPLICATIONS filed Jun. 2, 2015 (Caballero);
  • U.S. patent application Ser. No. 14/732,870 for DATA COLLECTION MODULE AND SYSTEM filed Jun. 8, 2015 (Powilleit);
  • U.S. patent application Ser. No. 29/529,441 for INDICIA READING DEVICE filed Jun. 8, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/735,717 for INDICIA-READING SYSTEMS HAVING AN INTERFACE WITH A USER'S NERVOUS SYSTEM filed Jun. 10, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/738,038 for METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES filed Jun. 12, 2015 (Amundsen et al.);
  • U.S. patent application Ser. No. 14/740,320 for TACTILE SWITCH FOR A MOBILE ELECTRONIC DEVICE filed Jun. 16, 2015 (Bandringa);
  • U.S. patent application Ser. No. 14/740,373 for CALIBRATING A VOLUME DIMENSIONER filed Jun. 16, 2015 (Ackley et al.);
  • U.S. patent application Ser. No. 14/742,818 for INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL filed Jun. 18, 2015 (Xian et al.);
  • U.S. patent application Ser. No. 14/743,257 for WIRELESS MESH POINT PORTABLE DATA TERMINAL filed Jun. 18, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 29/530,600 for CYCLONE filed Jun. 18, 2015 (Vargo et al.);
  • U.S. patent application Ser. No. 14/744,633 for IMAGING APPARATUS COMPRISING IMAGE SENSOR ARRAY HAVING SHARED GLOBAL SHUTTER CIRCUITRY filed Jun. 19, 2015 (Wang);
  • U.S. patent application Ser. No. 14/744,836 for CLOUD-BASED SYSTEM FOR READING OF DECODABLE INDICIA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/745,006 for SELECTIVE OUTPUT OF DECODED MESSAGE DATA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/747,197 for OPTICAL PATTERN PROJECTOR filed Jun. 23, 2015 (Thuries et al.);
  • U.S. patent application Ser. No. 14/747,490 for DUAL-PROJECTOR THREE-DIMENSIONAL SCANNER filed Jun. 23, 2015 (Jovanovski et al.); and
  • U.S. patent application Ser. No. 14/748,446 for CORDLESS INDICIA READER WITH A MULTIFUNCTION COIL FOR WIRELESS CHARGING AND EAS DEACTIVATION, filed Jun. 24, 2015 (Xie et al.).


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A method of assisting in focusing a camera system on an object within a field of view, comprising: at the camera system, determining a distance D in a z direction, within the field of view, to a current focal plane; and rendering to a display, an aimer graphic element with a Z direction distance equal to D in a manner that causes the aimer graphic element to move in the Z direction with changes in the focal plane.
  • 2. The method according to claim 1, further comprising determining if the object in the field of view is within a depth of field distance ΔD about the distance D, and if so, modifying the rendering of the aimer graphic element in a manner that signifies that the object is in focus.
  • 3. The method according to claim 1, further comprising determining if the object in the field of view is closer than the focal plane, and if so, modifying the rendering of the aimer graphic element in a manner that signifies that the aimer graphic element is behind the object.
  • 4. The method according to claim 1, where the aimer graphic element is rendered in a first color if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and where the aimer object is rendered in a second color if the object is within the depth of field distance ΔD about the distance D.
  • 5. The method according to claim 1, where the aimer graphic element is rendered in a first manner if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and where the aimer object is rendered in a second manner if the object is within the depth of field distance ΔD about the distance D.
  • 6. The method according to claim 1, where the aimer graphic element is rendered in a third manner if the distance D is further from the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and where the aimer graphic element is rendered in a second manner if the object is within the depth of field distance ΔD about the distance D.
  • 7. The method according to claim 1, where the aimer graphic element is rendered in a first manner if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and where the aimer graphic element is rendered in a second manner if the object is within the depth of field distance ΔD about the distance D; and where the aimer graphic element is rendered in a third manner if the distance D is further from the camera than the object and if the object is outside the depth of field distance ΔD about the distance D.
  • 8. The method according to claim 7, where the rendering in the first manner comprises selecting a first aimer graphic; where the rendering in the second manner comprises selecting a second aimer graphic; and where the rendering in the third manner comprises selecting a third aimer graphic.
  • 9. The method according to claim 1, where the distance D is determined by either querying a depth sensor or querying an autofocus system for a current focal depth.
  • 10. The method according to claim 1, where the camera system forms a part of an augmented reality headset having a programmed processor that carries out the rendering to a binocular display.
  • 11. A method of assisting in focusing a camera system on an object within a field of view, comprising: at the camera system, determining a distance D in a z direction, within the field of view, to a current focal plane; determining if the object in the field of view is within a depth of field distance ΔD about the distance D, and: if so, then rendering an aimer graphic with the Z direction distance equal to D to a display in a manner that signifies that the object is in focus, and if not, then rendering the aimer graphic with a Z direction distance equal to D to the display in a manner that signifies that the object is not in focus.
  • 12. The method according to claim 11, further comprising determining if the object in the field of view is closer than the focal plane, and if so, rendering the aimer graphic in a manner that signifies that the aimer graphic is behind the object.
  • 13. The method according to claim 11, where the aimer graphic is rendered in a first color if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and where the aimer object is rendered in a second color if the object is within the depth of field distance ΔD about the distance D.
  • 14. The method according to claim 11, where the distance D is determined by either querying a depth sensor or querying an autofocus system for a current focal depth.
  • 15. The method according to claim 11, where the camera system forms a part of an augmented reality headset having a programmed processor that carries out the rendering to a binocular display.
  • 16. A method of assisting in focusing a camera system on an object within a field of view, comprising: at the camera system, determining a distance D in a z direction, within the field of view, to a current focal plane; rendering to a display, an aimer graphic with the Z direction distance equal to D in a manner that causes the aimer graphic to move in the Z direction with changes in the focal plane; where the aimer graphic is rendered in a first manner if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; where the aimer graphic is rendered in a second manner if the object is within the depth of field distance ΔD about the distance D; and where the aimer graphic is rendered in a third manner if the distance D is further from the camera than the object and if the object is outside the depth of field distance ΔD about the distance D.
  • 17. The method according to claim 16, where if the object in the field of view is closer than the focal plane, the aimer graphic is rendered in a manner that signifies that the view of the aimer graphic is occluded by the object.
  • 18. The method according to claim 16, where the aimer graphic element is rendered in a first color if the distance D is closer to the camera than the object and if the object is outside the depth of field distance ΔD about the distance D; and where the aimer object is rendered in a second color if the object is within the depth of field distance ΔD about the distance D.
  • 19. The method according to claim 16, where the rendering in the first manner comprises selecting a first aimer graphic; where the rendering in the second manner comprises selecting a second aimer graphic; and where the rendering in the third manner comprises selecting a third aimer graphic.
  • 20. The method according to claim 16, where the distance D is determined by either querying a depth sensor or querying an autofocus system for a current focal depth.
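By way of illustration only, the focus-assist logic recited in the claims above can be summarized as: determine the focal-plane distance D (claim 9: by querying a depth sensor or the autofocus system), test whether the object lies within the depth of field distance ΔD about D, select a rendering manner accordingly (claims 5-8 and 16-19), and draw the aimer graphic at a Z direction distance equal to D so that it tracks the focal plane (claim 1). The following minimal Python sketch shows one way that selection could be organized; the function names, the specific colors, and the display.render_aimer call are hypothetical placeholders rather than anything defined by this disclosure, and ΔD is treated here as the total depth-of-field width about D, which is an interpretive choice.

```python
# Minimal sketch of the claimed focus-assist selection logic.
# Assumptions: the query functions, style names, colors, and display.render_aimer()
# are hypothetical placeholders, not an API defined by this disclosure.
from dataclasses import dataclass


@dataclass
class AimerStyle:
    color: str      # illustrative only; the claims recite a "first/second color"
    graphic: str    # which aimer graphic is selected (claims 8 and 19)
    occluded: bool  # render the aimer as if hidden behind the object


def focal_plane_distance(depth_sensor=None, autofocus=None):
    """Determine D, the z-direction distance to the current focal plane,
    by querying a depth sensor or an autofocus system (claim 9)."""
    if depth_sensor is not None:
        return depth_sensor()
    if autofocus is not None:
        return autofocus()
    raise ValueError("a depth sensor or autofocus query is required")


def choose_aimer_style(D, object_z, depth_of_field):
    """Pick the rendering 'manner' from the object's position relative to the
    depth of field ΔD about D (claims 5-8 and 16-19)."""
    if abs(object_z - D) <= depth_of_field / 2.0:
        # Object is within ΔD about D: signify that it is in focus.
        return AimerStyle(color="green", graphic="in_focus", occluded=False)
    if D < object_z:
        # Focal plane is closer to the camera than the object (first manner).
        return AimerStyle(color="red", graphic="focus_near", occluded=False)
    # Focal plane is farther from the camera than the object (third manner);
    # the aimer sits behind the object, so render it as occluded (claims 3, 17).
    return AimerStyle(color="blue", graphic="focus_far", occluded=True)


def update_aimer(display, object_z, depth_of_field, depth_sensor=None, autofocus=None):
    """Render the aimer graphic at z = D so it moves with the focal plane (claim 1)."""
    D = focal_plane_distance(depth_sensor, autofocus)
    style = choose_aimer_style(D, object_z, depth_of_field)
    display.render_aimer(z=D, style=style)  # hypothetical rendering call
```

On an augmented reality headset of the kind contemplated in claims 10 and 15, the same selection logic would simply drive rendering of the aimer at depth D to a binocular display.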
US Referenced Citations (456)
Number Name Date Kind
6832725 Gardiner et al. Dec 2004 B2
7090137 Bennett Aug 2006 B1
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851379 Gopalakrishnan et al. Oct 2014 B2
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber et al. Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
9100576 Fan et al. Aug 2015 B2
9202094 Chen et al. Dec 2015 B1
9224022 Ackley et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9250712 Todeschini Feb 2016 B1
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9310609 Rueblinger et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342724 McCloskey May 2016 B2
9375945 Bowles Jun 2016 B1
D760719 Zhou et al. Jul 2016 S
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
D762647 Fitch et al. Aug 2016 S
9412242 Van Horn et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9443123 Hejl Sep 2016 B2
9443222 Singel et al. Sep 2016 B2
9478113 Xie et al. Oct 2016 B2
20030089776 Hennick May 2003 A1
20040182925 Anderson Sep 2004 A1
20070063048 Havens et al. Mar 2007 A1
20070170259 Nunnink Jul 2007 A1
20090134221 Zhu et al. May 2009 A1
20100127081 Kearney May 2010 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20120111946 Golant May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140100813 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197238 Liu et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140203087 Smith et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140312121 Lu et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150169925 Chen et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150193645 Colavito et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150204671 Showering Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150327012 Bian et al. Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160070439 Bostick et al. Mar 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160125873 Braho et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171720 Todeschini Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Wiltz, Sr. et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
Foreign Referenced Citations (4)
Number Date Country
2013163789 Nov 2013 WO
2013173985 Nov 2013 WO
2014019130 Feb 2014 WO
2014110495 Jul 2014 WO
Non-Patent Literature Citations (25)
Entry
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned.
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned.
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages.
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages.
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages.
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages.
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages.
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages.
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages.
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages.
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages.
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned.
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages.
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.) 60 pages.
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages.
U.S. Appl. No. 14/715,672 for Augumented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages.
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages.
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages.
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages.
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages.
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages.
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Bandringa); 38 pages.
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages.
Holmdahl, Todd; “BUILD 2015: A closer look at the Microsoft HoloLens hardware,” dated Apr. 30, 2015; 10 pages; downloaded Aug. 5, 2016 from https://blogs.windows.com/devices/2015/04/30/build-2015-a-closer-look-at-the-microsoft-hololens-hardware/#kUc9X6wRjidwt0Px.97.