The present disclosure relates generally to indicia reading devices, and more specifically, to indicia reading devices and methods for decoding decodable indicia employing stereoscopy or stereoscopic cameras and imagery.
Generally speaking, indicia reading devices, also referred to as scanners, laser scanners, image readers, indicia readers, mobile computers, terminals, etc., typically read data represented by printed or displayed information-bearing indicia, also referred to as symbols, symbologies, bar codes, etc. Barcodes, such as UPC codes, use patterns of thin and thick bars to represent data, while more complex coding systems, known as 2D matrix codes, use intricate arrangements of blocks to store information.
One-dimensional (1D) or linear optical bar code readers are characterized by reading data that is encoded along a single axis, in the presence and/or widths of bars and spaces, so that such symbols can be read from a single scan along that axis.
Two-dimensional (2D) or area optical bar code readers utilize a lens to focus an image of the bar code onto a multiple pixel image sensor array, which often is provided by a CMOS-based or CCD-based image sensor array that converts light signals into electric signals.
Conventional 1D and 2D indicia readers or barcode scanners are known and come in many different shapes and sizes, such as 1D and/or 2D wireless handheld barcode scanners. As should be readily understood by one skilled in the art, the more user friendly and the faster the reader works, the better. As such, there is a clear need or desire for indicia readers or barcode scanners that are more user friendly and/or faster. In addition, the accuracy of the reader or scanner is critical. Many scenarios lead to inaccurate or unreadable indicia or barcodes: for example, geometrical distortion, specular reflections, direct part marking (such as dot peen or laser etch), on-screen reading, etc. may lead to inaccurate or unreadable data. As such, there is an ongoing need and desire to improve the reading and accuracy of indicia readers and barcode scanners.
Barcode scanners can include many different options or features for improving the reading and accuracy of the data. One such feature is error checking, or the ability to verify the barcode or indicia scanned. As an example, a portable wireless 3D imaging handheld barcode reader may scan/read barcodes and may be capable of error correction. However, the standard imaging used in known barcode scanners to scan a 2D barcode and decode its information is based on 2D imagery, including the error checking feature. The 2D imagery used for decoding and error checking is limited to what the 2D images display and, as a result, does not include any 3D imagery or the associated depth information of the images.
Stereoscopy, also known as stereoscopics or 3D imaging, is a technique for creating or enhancing the illusion of depth in an image by means of stereopsis for binocular vision. Most stereoscopic methods present two offset images separately to the left and right eye of the viewer. These 2D images are then combined in the brain to give the perception of 3D depth. As such, stereoscopy creates the illusion of 3D depth from two-dimensional images. Prior to the instant disclosure, there were no known indicia reading devices or barcode scanners that employed stereoscopic imagery to decode indicia or read barcodes and/or to error check the decoded indicia or barcodes based on the 3D imagery produced from stereoscopic images and the associated depth information from such 3D imagery.
Another feature of growing importance for conventional 1D and 2D indicia readers or barcode scanners is the ability not only to read indicia or barcodes in standard printed form, but also to read indicia or barcodes from electronic displays or screens, such as barcodes displayed on cellphones, tablets, etc. For example, in many applications (airport ticket checking, for instance) the user has to read both regular printed barcodes and electronically displayed barcodes (on smartphones, tablets, etc.). Because electronic displays are typically lit to display their contents, no illumination is required to read or decode them. In fact, if illumination is directed at a lit electronic display, decoding becomes difficult because the standard illumination from barcode readers produces glare and/or specular reflections. This gives rise to the need for two different working modes. To handle both with a single image reader, the user must enter a working mode that continuously switches the lighting on and off, resulting in a very unpleasant flickering. Thus, there is clearly a need for a reader that is user friendly and can easily scan both regular printed barcodes and electronically displayed barcodes.
Therefore, a need exists for a user friendly indicia reader and/or barcode scanner that more accurately decodes 2D indicia or barcode information. In addition, a need exists for a barcode reader that can operate in a single mode while handling both illuminated reading of normal printed indicia and non-illuminated reading of electronically displayed indicia.
Accordingly, in one aspect, the present invention embraces an indicia reading device for decoding decodable indicia using stereoscopy. The indicia reading device includes an illumination subsystem, an aimer subsystem, an imaging subsystem, a memory, and a processor. The illumination subsystem is operative for projecting an illumination pattern. The aimer subsystem is operative for projecting an aiming pattern. The imaging subsystem includes a stereoscopic imager. The memory is in communication with the stereoscopic imager and is capable of storing frames of image data representing light incident on the stereoscopic imager. The processor is in communication with the memory and is operative to decode a decodable indicia represented in at least one of the frames of image data. The stereoscopic imager is configured to capture multiple images a baseline distance apart (creating varying viewing angles) to create three-dimensional images with depth information of the decodable indicia.
In another exemplary embodiment, an indicia reading device is provided for decoding decodable indicia in both standard printed form and electronically displayed form in a single mode of operation. This indicia reading device includes an illumination subsystem, an aimer subsystem, an imaging subsystem, a memory, and a processor. The illumination subsystem is operative for projecting an illumination pattern. The aimer subsystem is operative for projecting an aiming pattern. The memory is in communication with the imaging subsystem and is capable of storing frames of image data representing light incident on the imaging subsystem. The processor is in communication with the memory and is operative to decode a decodable indicia represented in at least one of the frames of image data. In this exemplary embodiment, the indicia reading device is configured to simultaneously (or almost simultaneously) take illuminated images for decodable indicia in normal printed form and non-illuminated images for decodable indicia in electronic display form.
In another aspect, the present invention embraces a method of decoding decodable indicia. The method includes the steps of:
The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.
The present invention embraces imaging devices, such as optical readers or indicia reading devices for use in reading decodable indicia, which in various aspects employ stereoscopy or stereoscopic imagery. In select embodiments, the stereoscopic imagery may include capturing two or more images at differing angles. Such data provides additional three-dimensional (“3D”) information, with related depth information, of the scanned indicia, objects, scenes, or barcodes compared to conventional optical readers or indicia reading devices. In various aspects, the imaging devices may be configured to operably process or use one or more portions of the stereoscopic 3D image data for read out and/or for decoding the representation of the decodable indicia. As described in greater detail below, the use of stereoscopy and stereoscopic image data may allow for improved reading of decodable indicia, barcodes, scenes, and objects compared to conventional imaging devices.
With reference still to
For example, device 1000 in one embodiment may include a trigger 1220, a display 1222, a pointer mechanism 1224, and a keyboard 1226 disposed on a common side of a hand held housing 1014. Display 1222 and pointer mechanism 1224 in combination can be regarded as a user interface of device 1000. Display 1222 in one embodiment can incorporate a touch panel for navigation and virtual actuator selection in which case a user interface of device 1000 can be provided by display 1222.
In other embodiments, a hand held housing 1015 of an indicia reading device 1001 may be devoid of a display and a keyboard, and may be in a gun style form factor having a trigger 1220 as shown in
The following description uses nomenclature associated with indicia reading devices, and may generally include hand held indicia reading devices and fixed indicia reading devices; however, those of ordinary skill in the art will recognize that aspects of the present disclosure may be incorporated in other electronic devices having an imager for image capture and/or indicia reading, which may be configured as, for example, mobile phones, cell phones, satellite phones, smart phones, telemetric devices, personal data assistants, cameras, and other devices.
Referring to the indicia reading devices 1000, 1001, 1002 shown in
Stereoscopic imager 2000 may be configured to capture multiple images at baseline distance 2030 (creating varying viewing angles between the images) to create three-dimensional images with depth information of the decodable indicia 15 and/or 115, for accurately determining lengths and widths of the 2D decodable indicia 15 and/or 115, or for extracting absolute barcode dimensions, including bar and space widths. Stereoscopic imager 2000 may include two or more sensors or cameras (such as a tri-camera, quad-camera, etc.) for capturing the multiple images at varying angles to create 3D images with depth information. The resulting 3D images with depth information and more accurate lengths and widths of the 2D images (such as absolute barcode dimensions with bar and space widths) of the decodable indicia 15 and/or 115 may lead to many new uses and benefits compared to conventional readers and scanners that are limited to 2D information with no depth information. As examples, and clearly not limited thereto, processor 1060 may be further operative to decode decodable indicia 15 and/or 115 affected by geometrical distortions, specular reflections, direct part markings, dot peen, laser etch, electronic displays, and combinations thereof by using the three-dimensional images from stereoscopic imager 2000 with depth information of the decodable indicia.
More specifically, processor 1060 may be operative to decode geometrical distortions using the information of the captured three-dimensional images from stereoscopic imager 2000 to get missing length and width information of the decodable indicia 15 and/or 115.
More specifically, processor 1060 may be operative to determine absolute dimensions of bar and space widths of barcodes 15 and/or 115 using the depth information of the captured three-dimensional images from stereoscopic imager 2000.
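By way of illustration only, the sketch below shows the standard pinhole stereo relations that make such absolute measurements possible: depth follows from disparity as Z = f·B/d, and a width measured in pixels scales to an absolute width as W = w·Z/f. The focal length, disparity, and pixel-width values below are hypothetical, and the disclosure does not prescribe this particular implementation.

```python
# Hypothetical sketch: absolute bar widths from a rectified stereo pair
# using Z = f*B/d and W = w_px*Z/f. All numeric values are assumptions.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (meters) of a matched point with the given disparity (pixels)."""
    return focal_px * baseline_m / disparity_px

def absolute_width(width_px: float, depth_m: float, focal_px: float) -> float:
    """Convert a width measured in pixels to meters at the given depth."""
    return width_px * depth_m / focal_px

focal_px = 800.0      # assumed focal length in pixels
baseline_m = 0.02     # 2 cm baseline (see baseline distance 2030 below)
z = depth_from_disparity(focal_px, baseline_m, disparity_px=40.0)   # 0.4 m
narrow_bar_m = absolute_width(width_px=3.2, depth_m=z, focal_px=focal_px)
print(f"{narrow_bar_m * 1000:.2f} mm")   # 1.60 mm absolute narrow-bar width
```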
In another specific example, processor 1060 may be operative to handle specular reflections by using the different viewing angles from stereoscopic imager 2000 to reconstruct a specular-free image from the different viewing angles.
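The disclosure does not specify the reconstruction algorithm; one simple heuristic, sketched below, is a pixel-wise minimum of the two views, assuming the right view has already been registered (warped into the left view's frame using the disparity map) and that highlights are bright and land on different pixels in each view.

```python
# Assumed heuristic (not the disclosure's method): pixel-wise minimum of
# two registered grayscale views. A saturated highlight in one view is
# replaced by the diffuse value seen from the other viewing angle.
import numpy as np

def specular_free(left: np.ndarray, right_registered: np.ndarray) -> np.ndarray:
    """Combine two registered views, keeping the darker (diffuse) pixel."""
    return np.minimum(left, right_registered)
```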
In yet another specific example, processor 1060 may be further operative to filter or deblur decodable indicia 15 and/or 115 based on distances to the decodable indicia determined from the captured three-dimensional images from stereoscopic imager 2000.
In yet another specific example, processor 1060 may be further operative to verify the decodable indicia based on dimensions of the decodable indicia determined from the captured three-dimensional images from stereoscopic imager 2000.
In yet another specific example, processor 1060 may be further operative to decode dot peen markings (a series of indentations in metal) or laser etch marks (which leave a raised surface) using 3D depth information from the captured 3D images from stereoscopic imager 2000. In these embodiments, the dot peen or laser etch markings may be decoded based on 3D depth rather than light intensity alone. This feature may reduce or eliminate the difficulties associated with decoding direct part markings such as dot peen or laser etch marks, especially those with a rough or noisy background.
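As a minimal sketch of the depth-based approach, assuming a per-pixel depth map has already been computed from the stereo pair, candidate dot-peen indentations can be segmented as pixels lying measurably below a smoothed estimate of the part surface; the function name, filter size, and 0.1 mm threshold are illustrative assumptions only.

```python
# Illustrative assumption: segment dot-peen indentations from a depth map
# instead of light intensity, so a rough or noisy surface finish does not
# masquerade as marks. depth_mm holds depth per pixel (larger = farther).
import numpy as np
from scipy.ndimage import median_filter

def dot_candidates(depth_mm: np.ndarray, min_depth_mm: float = 0.1) -> np.ndarray:
    """Boolean mask of pixels sitting measurably below the local surface."""
    surface = median_filter(depth_mm, size=15)    # smoothed nominal part surface
    return (depth_mm - surface) > min_depth_mm    # indentations are farther away
```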
In yet another specific example, processor 1060 may be further configured for three-dimensional scanning of a scene based on scene images from stereoscopic imager 2000.
In yet another specific example, processor 1060 may be further configured for object recognition based on the scanned three-dimensional scene images and for verifying that the object is consistent with the decoded indicia 15 and/or 115.
In yet another specific example, processor 1060 may be further configured for anti-counterfeiting by recognizing object texture and/or specific tags, including, but not limited to, random indentations/protrusions (like BubbleTag™), random microchips of metal embedded in a polymer, stereo views of security holograms (which will look different from the differing angles of the stereoscopic imagery), and the like.
In yet another specific embodiment, processor 1060 may be configured for modeling and/or dimensioning small objects in the scene S.
Referring again to the indicia reading devices 1000, 1001, 1002 shown in
Stereoscopic imager 2000 may be any type of imager utilizing stereoscopy for capturing and creating 3D images with depth information. In select embodiments, stereoscopic imager 2000 may include left sensor 2020L (with lens 2021L) and right sensor 2020R (with lens 2021R) separated by baseline distance 2030. Baseline distance 2030 may be set to any desired distance for varying the angle between left sensor 2020L and right sensor 2020R. For example, baseline distance 2030 may be approximately equal to 2 cm. In this example, the 3D accuracy at a scan angle of 36 degrees horizontal may have a depth accuracy of approximately:
The barcode reading may thus have the following characteristics: a resolution of approximately 0.1 mm or 4 mils; a depth of field (“DOF”) of approximately 100% UPC up to 34 cm, and/or greater than 40 cm with specialized decoders (e.g., a Vesta™ decoder); and a motion tolerance of approximately 2.5 m/s (approximately 100 inches per second). However, the invention is not limited to these exact 3D accuracies or barcode reading characteristics, and other results may be obtained with various setups, including, but not limited to, varying baseline distance 2030 and/or the scan angle.
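For context, the first-order stereo depth-error relation dZ ≈ Z²·Δd/(f·B) shows how such figures arise from the baseline and scan angle. The sketch below evaluates it for the 2 cm baseline and 36-degree horizontal scan angle above, assuming a 752-pixel row width (matching the MT9V022 sensor discussed later) and a quarter-pixel disparity accuracy; neither assumption is a figure taken from the disclosure.

```python
# Worked example under stated assumptions: first-order stereo depth error
# dZ = Z^2 * delta_d / (f * B). Sensor width and disparity accuracy are
# assumptions, not figures from the disclosure.
import math

baseline_m = 0.02     # 2 cm baseline distance 2030
fov_deg = 36.0        # horizontal scan angle
sensor_px = 752       # assumed row width (e.g., MT9V022)
focal_px = (sensor_px / 2) / math.tan(math.radians(fov_deg / 2))

def depth_error_m(depth_m: float, disparity_err_px: float = 0.25) -> float:
    """First-order depth uncertainty at the given working distance."""
    return depth_m ** 2 * disparity_err_px / (focal_px * baseline_m)

print(f"{depth_error_m(0.30) * 1000:.1f} mm at 30 cm")   # ~1.0 mm
```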
Referring again to the indicia reading devices 1000, 1001, 1002 shown in
Referring now to
In select embodiments, method 5000 of decoding decodable indicia 15 and/or 115 may further include step 5010 of capturing multiple images at baseline distance 2030 with stereoscopic imager 2000, and step 5012 of creating three-dimensional images with depth information of the decodable indicia 15 and/or 115.
In other select embodiments, step 5008 of decoding the decodable indicia may further include decoding indicia 15 and/or 115, wherein the indicia may include geometrical distortions, specular reflections, direct part markings (like dot peen or laser etch), electronic displays, and combinations thereof of the decodable indicia using the three-dimensional images from stereoscopic imager 2000 with depth information of the decodable indicia 15 and/or 115.
In yet further embodiments, method 5000 may further include step 5014 of verifying the decodable indicia 15 and/or 115 based on dimensions of the decodable indicia determined from the captured three-dimensional images from stereoscopic imager 2000.
In other select embodiments, step 5004 of capturing stereoscopic images of the illuminated decodable indicia with stereoscopic imager 2000 may include step 5016 of simultaneously capturing an illuminated image and a non-illuminated image with global shutter sensors 2010 working in conjunction with stereoscopic imager 2000. In these embodiments, step 5008 of decoding the decodable indicia from the image data may include the steps of: step 5018 of decoding normal print decodable indicia 15 from the illuminated images; and step 5020 of decoding electronically displayed decodable indicia 115 from the non-illuminated images. In addition, step 5002 of illuminating the decodable indicia 15 and/or 115 may include pulsed LED illumination 500 controlled by LED driver 1206.
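A minimal sketch of that capture flow appears below. The sensor, LED, and decoder objects and their methods are hypothetical placeholders rather than a real SDK; the point is the timing, in which a short pulse from LED driver 1206 falls inside one global-shutter exposure while the second exposure begins just after the pulse ends, yielding a nearly simultaneous lit/unlit frame pair from a single trigger pull.

```python
# Hypothetical driver API (start_exposure, read_frame, pulse, try_decode
# are placeholder callables, not a real SDK): one exposure overlaps the
# LED pulse, the other does not, so printed and electronically displayed
# indicia are both covered in a single working mode.

def capture_and_decode(lit_sensor, dark_sensor, led, try_decode, pulse_us=200):
    lit_sensor.start_exposure()      # global-shutter exposure spans the pulse
    led.pulse(pulse_us)              # short pulse from LED driver 1206
    dark_sensor.start_exposure()     # begins once the pulse has ended
    lit, dark = lit_sensor.read_frame(), dark_sensor.read_frame()
    # Printed indicia decode from the illuminated frame; indicia on
    # emissive displays decode from the glare-free non-illuminated frame.
    return try_decode(lit) or try_decode(dark)
```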
Other embodiments may include devices that have no aimer, no projected illumination, or neither an aimer nor projected illumination, thereby relying on screen feedback and/or ambient lighting to create the images. Typical devices that operate without an aimer and/or projected illumination include some bar code scanners, cell phones, tablets, personal data assistants, and the like.
Referring now specifically to
With reference again to
In one example, image sensor integrated circuit 1040 can be provided, e.g., by an MT9V022 (752×480 pixel array) or an MT9V023 (752×480 pixel array) image sensor integrated circuit available from Micron Technology, Inc. In one example, image sensor array 1033 can be a hybrid monochrome and color image sensor array having a first subset of monochrome pixels without color filter elements and a second subset of color pixels having color sensitive filter elements. In one example, image sensor integrated circuit 1040 can incorporate a Bayer pattern filter, so that defined at image sensor array 1033 are red pixels at red pixel positions, green pixels at green pixel positions, and blue pixels at blue pixel positions. Frames provided utilizing such an image sensor array incorporating a Bayer pattern can include red pixel values at red pixel positions, green pixel values at green pixel positions, and blue pixel values at blue pixel positions. In an embodiment incorporating a Bayer pattern image sensor array, processor 1060, prior to subjecting a frame to further processing, can interpolate pixel values at frame pixel positions intermediate of green pixel positions utilizing green pixel values for development of a monochrome frame of image data. Alternatively, processor 1060, prior to subjecting a frame to further processing, can interpolate pixel values intermediate of red pixel positions utilizing red pixel values, or pixel values intermediate of blue pixel positions utilizing blue pixel values, for development of a monochrome frame of image data. An imaging subsystem of the devices can include image sensor 1032 and a lens assembly 200 for projecting an image of the target onto image sensor array 1033 of image sensor 1032.
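The green-channel interpolation just described can be sketched as follows; the RGGB layout and the 4-neighbor average are assumptions for illustration, since the disclosure does not fix the mosaic orientation or the interpolation kernel.

```python
# Sketch under an assumed RGGB Bayer layout: positions without a green
# sample receive the average of their four green neighbors, producing a
# monochrome frame suitable for decode attempts.
import numpy as np

def green_monochrome(bayer: np.ndarray) -> np.ndarray:
    h, w = bayer.shape
    ys, xs = np.mgrid[0:h, 0:w]
    is_green = (ys + xs) % 2 == 1           # G sites in an RGGB mosaic
    mono = bayer.astype(np.float32)
    pad = np.pad(mono, 1, mode="edge")      # edge pixels reuse their own values
    # For any non-green center, all four 4-neighbors are green samples.
    neighbor_avg = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
                    pad[1:-1, :-2] + pad[1:-1, 2:]) / 4
    mono[~is_green] = neighbor_avg[~is_green]
    return mono
```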
In the course of operation of the devices, image signals can be read out of image sensor 1032, converted, and stored into a system memory such as RAM 1080. Memory 1085 of the devices can include RAM 1080, a nonvolatile memory such as EPROM 1082, and a storage memory device 1084, such as may be provided by a flash memory or a hard drive memory. In one embodiment, the devices can include processor 1060, which can be adapted to read out image data stored in memory 1080 and subject such image data to various image processing algorithms. The devices can include a direct memory access unit (DMA) 1070 for routing image information read out from image sensor 1032, after conversion, to RAM 1080. In another embodiment, the devices can employ a system bus providing a bus arbitration mechanism (e.g., a PCI bus), thus eliminating the need for a central DMA controller. A skilled artisan would appreciate that other embodiments of the system bus architecture and/or direct memory access components providing for efficient data transfer between image sensor 1032 and RAM 1080 are within the scope and the spirit of the disclosure.
With reference still to
The devices may include illumination subsystem 800 for illumination of a target and projection of illumination pattern 1260. Illumination pattern 1260, in the embodiment shown, can be projected to be proximate to but larger than an area defined by field of view 1240, but can also be projected in an area smaller than an area defined by field of view 1240. Illumination subsystem 800 can include a light source bank 500 comprising one or more light sources, and may further include one or more additional light source banks, each comprising one or more light sources. Such light sources can illustratively include light emitting diodes (LEDs). LEDs with any of a wide variety of wavelengths and filters, or combinations of wavelengths and filters, may be used in various embodiments, and other types of light sources may also be used in other embodiments. The light sources may illustratively be mounted to a printed circuit board, which may be the same printed circuit board on which an image sensor integrated circuit 1040 having an image sensor array 1033 may illustratively be mounted.
The devices can also include an aiming subsystem 600 for projecting an aiming pattern (not shown). Aiming subsystem 600 which can comprise a light source bank can be coupled to aiming light source bank power input unit 1208 for providing electrical power to a light source bank of aiming subsystem 600. Power input unit 1208 can be coupled to system bus 1500 via interface 1108 for communication with processor 1060.
In one embodiment, illumination subsystem 800 may include, in addition to light source bank 500, an illumination lens assembly 300. In addition to or in place of illumination lens assembly 300, illumination subsystem 800 can include alternative light shaping optics, e.g., one or more diffusers, mirrors, and prisms. In use, the devices, such as devices 1000, 1001, and 1002, can be oriented by an operator with respect to a target (e.g., a piece of paper, a package, another type of substrate, a screen, etc.) bearing decodable indicia 15 in such manner that illumination pattern 1260 is projected on decodable indicia 15. In the example of
In another aspect, the devices can include a power supply 1402 that supplies power to a power grid 1404 to which electrical components of device 1000 can be connected. Power supply 1402 can be coupled to various power sources, e.g., a battery 1406, a serial interface 1408 (e.g., USB, RS232), and/or AC/DC transformer 1410.
Further regarding power input unit 1206, power input unit 1206 can include a charging capacitor that is continually charged by power supply 1402. Power input unit 1206 can be configured to output energy within a range of energization levels. An average energization level of illumination subsystem 800 during exposure periods with the first illumination and exposure control configuration active can be higher than an average energization level during exposure periods with the second illumination and exposure control configuration active.
The devices can also include a number of peripheral devices including, for example, a trigger 1220 which may be used to make active a trigger signal for activating frame readout and/or certain decoding processes. The devices can be adapted so that activation of trigger 1220 activates a trigger signal and initiates a decode attempt. Specifically, device 1000 can be operative so that in response to activation of a trigger signal, a succession of frames can be captured by way of read out of image information from image sensor array 1033 (typically in the form of analog signals) and then storage of the image information after conversion into memory 1080 (which can buffer one or more of the succession of frames at a given time). Processor 1060 can be operative to subject one or more of the succession of frames to a decode attempt.
For attempting to decode a barcode symbol, e.g., a one dimensional barcode symbol, processor 1060 can process image data of a frame corresponding to a line of pixel positions (e.g., a row, a column, or a diagonal set of pixel positions) to determine a spatial pattern of dark and light cells and can convert each light and dark cell pattern determined into a character or character string via table lookup. Where a decodable indicia representation is a 2D barcode symbology, a decode attempt can comprise the steps of locating a finder pattern using a feature detection algorithm, locating matrix lines intersecting the finder pattern according to a predetermined relationship with the finder pattern, determining a pattern of dark and light cells along the matrix lines, and converting each light and dark cell pattern into a character or character string via table lookup.
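A minimal sketch of the scanline step is shown below: one row of pixels is binarized and reduced to (value, run-length) pairs, which a symbology-specific table lookup would then map to characters. The global mean threshold is a simplification of the adaptive binarizers used in practice.

```python
# Simplified sketch of scanline processing for a 1D decode attempt; a
# real decoder would use adaptive thresholding and symbology tables.
import numpy as np

def scanline_runs(row: np.ndarray) -> list[tuple[int, int]]:
    """(value, length) runs along a grayscale scanline; 0 = bar, 1 = space."""
    bits = (row > row.mean()).astype(np.int8)      # binarize the scanline
    edges = np.flatnonzero(np.diff(bits)) + 1      # indices where runs change
    starts = np.concatenate(([0], edges))
    ends = np.concatenate((edges, [bits.size]))
    return [(int(bits[s]), int(e - s)) for s, e in zip(starts, ends)]
```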
The devices can include various interface circuits for coupling various peripheral devices to system address/data bus (system bus) 1500, for communication with processor 1060, which is also coupled to system bus 1500. The devices can include an interface circuit 1028 for coupling image sensor timing and control circuit 1038 to system bus 1500, an interface circuit 1106 for coupling illumination light source bank power input unit 1206 to system bus 1500, and an interface circuit 1120 for coupling trigger 1220 to system bus 1500. The devices can also include display 1222 coupled to system bus 1500 and in communication with processor 1060 via an interface 1122, as well as pointer mechanism 1224 in communication with processor 1060 via an interface 1124 connected to system bus 1500. The devices can also include keyboard 1226 coupled to system bus 1500 and in communication with processor 1060 via an interface 1126. The devices can also include range detector unit 1210 coupled to system bus 1500 via interface 1110. In one embodiment, range detector unit 1210 can be an acoustic range detector unit. Various interface circuits of the devices can share circuit components. For example, a common microcontroller providing control inputs to circuit 1038 and to power input unit 1206 can be provided to coordinate timing between image sensor array controls and illumination subsystem controls.
A succession of frames of image data that can be captured and subject to the described processing can be full frames (including pixel values corresponding to each pixel of image sensor array 1033 or a maximum number of pixels read out from image sensor array 1033 during operation of the devices). A succession of frames of image data that can be captured and subject to the described processing can also be “windowed frames” comprising pixel values corresponding to less than a full frame of pixels of image sensor array 1033. A succession of frames of image data that can be captured and subject to the above described processing can also comprise a combination of full frames and windowed frames. A full frame can be read out for capture by selectively addressing pixels of image sensor 1032 having image sensor array 1033 corresponding to the full frame. A windowed frame can be read out for capture by selectively addressing pixels or ranges of pixels of image sensor 1032 having image sensor array 1033 corresponding to the windowed frame. In one embodiment, the number of pixels subject to addressing and readout determines the picture size of a frame. Accordingly, a full frame can be regarded as having a relatively larger picture size, and a windowed frame can be regarded as having a relatively smaller picture size relative to a picture size of a full frame. A picture size of a windowed frame can vary depending on the number of pixels subject to addressing and readout for capture of a windowed frame.
The devices can capture frames of image data at a rate known as a frame rate. A typical frame rate is 60 frames per second (FPS), which translates to a frame time (frame period) of 16.6 ms. Another typical frame rate is 30 FPS, which translates to a frame time (frame period) of 33.3 ms per frame. A frame rate of device 1000 can be increased (and frame time decreased) by decreasing a frame picture size.
To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:
In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.
This application is a continuation of U.S. application Ser. No. 15/138,358, filed Apr. 26, 2016, the entire contents of which are incorporated herein by reference.
Number | Name | Date | Kind |
---|---|---|---|
6832725 | Gardiner et al. | Dec 2004 | B2 |
7013040 | Shiratani | Mar 2006 | B2 |
7128266 | Zhu et al. | Oct 2006 | B2 |
7159783 | Walczyk et al. | Jan 2007 | B2 |
7219843 | Havens et al. | May 2007 | B2 |
7413127 | Ehrhart et al. | Aug 2008 | B2 |
7726575 | Wang et al. | Jun 2010 | B2 |
8294969 | Plesko | Oct 2012 | B2 |
8317105 | Kotlarsky et al. | Nov 2012 | B2 |
8322622 | Liu | Dec 2012 | B2 |
8366005 | Kotlarsky et al. | Feb 2013 | B2 |
8371507 | Haggerty et al. | Feb 2013 | B2 |
8376233 | Horn et al. | Feb 2013 | B2 |
8381979 | Franz | Feb 2013 | B2 |
8390909 | Plesko | Mar 2013 | B2 |
8408464 | Zhu et al. | Apr 2013 | B2 |
8408468 | Van et al. | Apr 2013 | B2 |
8408469 | Good | Apr 2013 | B2 |
8424768 | Rueblinger et al. | Apr 2013 | B2 |
8448863 | Xian et al. | May 2013 | B2 |
8457013 | Essinger et al. | Jun 2013 | B2 |
8459557 | Havens et al. | Jun 2013 | B2 |
8469272 | Kearney | Jun 2013 | B2 |
8474712 | Kearney et al. | Jul 2013 | B2 |
8479992 | Kotlarsky et al. | Jul 2013 | B2 |
8490877 | Kearney | Jul 2013 | B2 |
8517271 | Kotlarsky et al. | Aug 2013 | B2 |
8523076 | Good | Sep 2013 | B2 |
8528818 | Ehrhart et al. | Sep 2013 | B2 |
8544737 | Gomez et al. | Oct 2013 | B2 |
8548420 | Grunow et al. | Oct 2013 | B2 |
8550335 | Samek et al. | Oct 2013 | B2 |
8550354 | Gannon et al. | Oct 2013 | B2 |
8550357 | Kearney | Oct 2013 | B2 |
8556174 | Kosecki et al. | Oct 2013 | B2 |
8556176 | Van et al. | Oct 2013 | B2 |
8556177 | Hussey et al. | Oct 2013 | B2 |
8559767 | Barber et al. | Oct 2013 | B2 |
8561895 | Gomez et al. | Oct 2013 | B2 |
8561903 | Sauerwein, Jr. | Oct 2013 | B2 |
8561905 | Edmonds et al. | Oct 2013 | B2 |
8565107 | Pease et al. | Oct 2013 | B2 |
8571307 | Li et al. | Oct 2013 | B2 |
8579200 | Samek et al. | Nov 2013 | B2 |
8583924 | Caballero et al. | Nov 2013 | B2 |
8584945 | Wang et al. | Nov 2013 | B2 |
8587595 | Wang | Nov 2013 | B2 |
8587697 | Hussey et al. | Nov 2013 | B2 |
8588869 | Sauerwein et al. | Nov 2013 | B2 |
8590789 | Nahill et al. | Nov 2013 | B2 |
8596539 | Havens et al. | Dec 2013 | B2 |
8596542 | Havens et al. | Dec 2013 | B2 |
8596543 | Havens et al. | Dec 2013 | B2 |
8599271 | Havens et al. | Dec 2013 | B2 |
8599957 | Peake et al. | Dec 2013 | B2 |
8600158 | Li et al. | Dec 2013 | B2 |
8600167 | Showering | Dec 2013 | B2 |
8602309 | Longacre et al. | Dec 2013 | B2 |
8608053 | Meier et al. | Dec 2013 | B2 |
8608071 | Liu et al. | Dec 2013 | B2 |
8611309 | Wang et al. | Dec 2013 | B2 |
8615487 | Gomez et al. | Dec 2013 | B2 |
8621123 | Caballero | Dec 2013 | B2 |
8622303 | Meier et al. | Jan 2014 | B2 |
8628013 | Ding | Jan 2014 | B2 |
8628015 | Wang et al. | Jan 2014 | B2 |
8628016 | Winegar | Jan 2014 | B2 |
8629926 | Wang | Jan 2014 | B2 |
8630491 | Longacre et al. | Jan 2014 | B2 |
8635309 | Berthiaume et al. | Jan 2014 | B2 |
8636200 | Kearney | Jan 2014 | B2 |
8636212 | Nahill et al. | Jan 2014 | B2 |
8636215 | Ding et al. | Jan 2014 | B2 |
8636224 | Wang | Jan 2014 | B2 |
8638806 | Wang et al. | Jan 2014 | B2 |
8640958 | Lu et al. | Feb 2014 | B2 |
8640960 | Wang et al. | Feb 2014 | B2 |
8643717 | Li et al. | Feb 2014 | B2 |
8646692 | Meier et al. | Feb 2014 | B2 |
8646694 | Wang et al. | Feb 2014 | B2 |
8657200 | Ren et al. | Feb 2014 | B2 |
8659397 | Vargo et al. | Feb 2014 | B2 |
8668149 | Good | Mar 2014 | B2 |
8678285 | Kearney | Mar 2014 | B2 |
8678286 | Smith et al. | Mar 2014 | B2 |
8682077 | Longacre, Jr. | Mar 2014 | B1 |
D702237 | Oberpriller et al. | Apr 2014 | S |
8687282 | Feng et al. | Apr 2014 | B2 |
8692927 | Pease et al. | Apr 2014 | B2 |
8695880 | Bremer et al. | Apr 2014 | B2 |
8698949 | Grunow et al. | Apr 2014 | B2 |
8702000 | Barber et al. | Apr 2014 | B2 |
8717494 | Gannon | May 2014 | B2 |
8720783 | Biss et al. | May 2014 | B2 |
8723804 | Fletcher et al. | May 2014 | B2 |
8723904 | Marty et al. | May 2014 | B2 |
8727223 | Wang | May 2014 | B2 |
8740082 | Wilz, Sr. | Jun 2014 | B2 |
8740085 | Furlong et al. | Jun 2014 | B2 |
8746563 | Hennick et al. | Jun 2014 | B2 |
8750445 | Peake et al. | Jun 2014 | B2 |
8752766 | Xian et al. | Jun 2014 | B2 |
8756059 | Braho et al. | Jun 2014 | B2 |
8757495 | Qu et al. | Jun 2014 | B2 |
8760563 | Koziol et al. | Jun 2014 | B2 |
8763909 | Reed et al. | Jul 2014 | B2 |
8777108 | Coyle | Jul 2014 | B2 |
8777109 | Oberpriller et al. | Jul 2014 | B2 |
8779898 | Havens et al. | Jul 2014 | B2 |
8781520 | Payne et al. | Jul 2014 | B2 |
8783573 | Havens et al. | Jul 2014 | B2 |
8789757 | Barten | Jul 2014 | B2 |
8789758 | Hawley et al. | Jul 2014 | B2 |
8789759 | Xian et al. | Jul 2014 | B2 |
8794520 | Wang et al. | Aug 2014 | B2 |
8794522 | Ehrhart | Aug 2014 | B2 |
8794525 | Amundsen et al. | Aug 2014 | B2 |
8794526 | Wang et al. | Aug 2014 | B2 |
8798367 | Ellis | Aug 2014 | B2 |
8807431 | Wang et al. | Aug 2014 | B2 |
8807432 | Van et al. | Aug 2014 | B2 |
8820630 | Qu et al. | Sep 2014 | B2 |
8822848 | Meagher | Sep 2014 | B2 |
8824692 | Sheerin et al. | Sep 2014 | B2 |
8824696 | Braho | Sep 2014 | B2 |
8842849 | Wahl et al. | Sep 2014 | B2 |
8844822 | Kotlarsky et al. | Sep 2014 | B2 |
8844823 | Fritz et al. | Sep 2014 | B2 |
8849019 | Li et al. | Sep 2014 | B2 |
D716285 | Chaney et al. | Oct 2014 | S |
8851383 | Yeakley et al. | Oct 2014 | B2 |
8854633 | Laffargue et al. | Oct 2014 | B2 |
8866963 | Grunow et al. | Oct 2014 | B2 |
8868421 | Braho et al. | Oct 2014 | B2 |
8868519 | Maloy et al. | Oct 2014 | B2 |
8868802 | Barten | Oct 2014 | B2 |
8868803 | Caballero | Oct 2014 | B2 |
8870074 | Gannon | Oct 2014 | B1 |
8879639 | Sauerwein, Jr. | Nov 2014 | B2 |
8880426 | Smith | Nov 2014 | B2 |
8881983 | Havens et al. | Nov 2014 | B2 |
8881987 | Wang | Nov 2014 | B2 |
8903172 | Smith | Dec 2014 | B2 |
8908995 | Benos et al. | Dec 2014 | B2 |
8910870 | Li et al. | Dec 2014 | B2 |
8910875 | Ren et al. | Dec 2014 | B2 |
8914290 | Hendrickson et al. | Dec 2014 | B2 |
8914788 | Pettinelli et al. | Dec 2014 | B2 |
8915439 | Feng et al. | Dec 2014 | B2 |
8915444 | Havens et al. | Dec 2014 | B2 |
8916789 | Woodburn | Dec 2014 | B2 |
8918250 | Hollifield | Dec 2014 | B2 |
8918564 | Caballero | Dec 2014 | B2 |
8925818 | Kosecki et al. | Jan 2015 | B2 |
8939374 | Jovanovski et al. | Jan 2015 | B2 |
8942480 | Ellis | Jan 2015 | B2 |
8944313 | Williams et al. | Feb 2015 | B2 |
8944327 | Meier et al. | Feb 2015 | B2 |
8944332 | Harding et al. | Feb 2015 | B2 |
8950678 | Germaine et al. | Feb 2015 | B2 |
D723560 | Zhou et al. | Mar 2015 | S |
8967468 | Gomez et al. | Mar 2015 | B2 |
8971346 | Sevier | Mar 2015 | B2 |
8976030 | Cunningham et al. | Mar 2015 | B2 |
8976368 | El et al. | Mar 2015 | B2 |
8978981 | Guan | Mar 2015 | B2 |
8978983 | Bremer et al. | Mar 2015 | B2 |
8978984 | Hennick et al. | Mar 2015 | B2 |
8985456 | Zhu et al. | Mar 2015 | B2 |
8985457 | Soule et al. | Mar 2015 | B2 |
8985459 | Kearney et al. | Mar 2015 | B2 |
8985461 | Gelay et al. | Mar 2015 | B2 |
8988578 | Showering | Mar 2015 | B2 |
8988590 | Gillet et al. | Mar 2015 | B2 |
8991704 | Hopper et al. | Mar 2015 | B2 |
8996194 | Davis et al. | Mar 2015 | B2 |
8996384 | Funyak et al. | Mar 2015 | B2 |
8998091 | Edmonds et al. | Apr 2015 | B2 |
9002641 | Showering | Apr 2015 | B2 |
9007368 | Laffargue et al. | Apr 2015 | B2 |
9010641 | Qu et al. | Apr 2015 | B2 |
9015513 | Murawski et al. | Apr 2015 | B2 |
9016576 | Brady et al. | Apr 2015 | B2 |
D730357 | Fitch et al. | May 2015 | S |
9022288 | Nahill et al. | May 2015 | B2 |
9030964 | Essinger et al. | May 2015 | B2 |
9033240 | Smith et al. | May 2015 | B2 |
9033242 | Gillet et al. | May 2015 | B2 |
9036054 | Koziol et al. | May 2015 | B2 |
9037344 | Chamberlin | May 2015 | B2 |
9038911 | Xian et al. | May 2015 | B2 |
9038915 | Smith | May 2015 | B2 |
D730901 | Oberpriller et al. | Jun 2015 | S |
D730902 | Fitch et al. | Jun 2015 | S |
D733112 | Chaney et al. | Jun 2015 | S |
9047098 | Barten | Jun 2015 | B2 |
9047359 | Caballero et al. | Jun 2015 | B2 |
9047420 | Caballero | Jun 2015 | B2 |
9047525 | Barber et al. | Jun 2015 | B2 |
9047531 | Showering et al. | Jun 2015 | B2 |
9049640 | Wang et al. | Jun 2015 | B2 |
9053055 | Caballero | Jun 2015 | B2 |
9053378 | Hou et al. | Jun 2015 | B1 |
9053380 | Xian et al. | Jun 2015 | B2 |
9057641 | Amundsen et al. | Jun 2015 | B2 |
9058526 | Powilleit | Jun 2015 | B2 |
9064165 | Havens et al. | Jun 2015 | B2 |
9064167 | Xian et al. | Jun 2015 | B2 |
9064168 | Todeschini et al. | Jun 2015 | B2 |
9064254 | Todeschini et al. | Jun 2015 | B2 |
9066032 | Wang | Jun 2015 | B2 |
9070032 | Corcoran | Jun 2015 | B2 |
D734339 | Zhou et al. | Jul 2015 | S |
D734751 | Oberpriller et al. | Jul 2015 | S |
9082023 | Feng et al. | Jul 2015 | B2 |
9224022 | Ackley et al. | Dec 2015 | B2 |
9224027 | Van et al. | Dec 2015 | B2 |
D747321 | London et al. | Jan 2016 | S |
9230140 | Ackley | Jan 2016 | B1 |
9250712 | Todeschini | Feb 2016 | B1 |
9258033 | Showering | Feb 2016 | B2 |
9262633 | Todeschini et al. | Feb 2016 | B1 |
9310609 | Rueblinger et al. | Apr 2016 | B2 |
D757009 | Oberpriller et al. | May 2016 | S |
9342724 | McCloskey et al. | May 2016 | B2 |
9375945 | Bowles | Jun 2016 | B1 |
D760719 | Zhou et al. | Jul 2016 | S |
9390596 | Todeschini | Jul 2016 | B1 |
D762604 | Fitch et al. | Aug 2016 | S |
D762647 | Fitch et al. | Aug 2016 | S |
9412242 | Van et al. | Aug 2016 | B2 |
D766244 | Zhou et al. | Sep 2016 | S |
9443123 | Hejl | Sep 2016 | B2 |
9443222 | Singel et al. | Sep 2016 | B2 |
9478113 | Xie et al. | Oct 2016 | B2 |
9536186 | Nash et al. | Jan 2017 | B2 |
9779276 | Todeschini et al. | Oct 2017 | B2 |
10185906 | Thuries | Jan 2019 | B2 |
20060043194 | Barkan et al. | Mar 2006 | A1 |
20070063048 | Havens et al. | Mar 2007 | A1 |
20090134221 | Zhu et al. | May 2009 | A1 |
20100177076 | Essinger et al. | Jul 2010 | A1 |
20100177080 | Essinger et al. | Jul 2010 | A1 |
20100177707 | Essinger et al. | Jul 2010 | A1 |
20100177749 | Essinger et al. | Jul 2010 | A1 |
20110169999 | Grunow et al. | Jul 2011 | A1 |
20110202554 | Powilleit et al. | Aug 2011 | A1 |
20120111946 | Golant | May 2012 | A1 |
20120168512 | Kotlarsky et al. | Jul 2012 | A1 |
20120193423 | Samek | Aug 2012 | A1 |
20120203647 | Smith | Aug 2012 | A1 |
20120223141 | Good et al. | Sep 2012 | A1 |
20130043312 | Van Horn | Feb 2013 | A1 |
20130075168 | Amundsen et al. | Mar 2013 | A1 |
20130175341 | Kearney et al. | Jul 2013 | A1 |
20130175343 | Good | Jul 2013 | A1 |
20130257744 | Daghigh et al. | Oct 2013 | A1 |
20130257759 | Daghigh | Oct 2013 | A1 |
20130270346 | Xian et al. | Oct 2013 | A1 |
20130287258 | Kearney | Oct 2013 | A1 |
20130292475 | Kotlarsky et al. | Nov 2013 | A1 |
20130292477 | Hennick et al. | Nov 2013 | A1 |
20130293539 | Hunt et al. | Nov 2013 | A1 |
20130293540 | Laffargue et al. | Nov 2013 | A1 |
20130306728 | Thuries et al. | Nov 2013 | A1 |
20130306731 | Pedrao | Nov 2013 | A1 |
20130307964 | Bremer et al. | Nov 2013 | A1 |
20130308625 | Park et al. | Nov 2013 | A1 |
20130313324 | Koziol et al. | Nov 2013 | A1 |
20130313325 | Wilz et al. | Nov 2013 | A1 |
20130342651 | Zatloukal et al. | Dec 2013 | A1 |
20130342717 | Havens et al. | Dec 2013 | A1 |
20140001267 | Giordano et al. | Jan 2014 | A1 |
20140002828 | Laffargue et al. | Jan 2014 | A1 |
20140008439 | Wang | Jan 2014 | A1 |
20140025584 | Liu et al. | Jan 2014 | A1 |
20140034734 | Sauerwein, Jr. | Feb 2014 | A1 |
20140036848 | Pease et al. | Feb 2014 | A1 |
20140039693 | Havens et al. | Feb 2014 | A1 |
20140042814 | Kather et al. | Feb 2014 | A1 |
20140049120 | Kohtz et al. | Feb 2014 | A1 |
20140049635 | Laffargue et al. | Feb 2014 | A1 |
20140061306 | Wu et al. | Mar 2014 | A1 |
20140063289 | Hussey et al. | Mar 2014 | A1 |
20140066136 | Sauerwein et al. | Mar 2014 | A1 |
20140067692 | Ye et al. | Mar 2014 | A1 |
20140070005 | Nahill et al. | Mar 2014 | A1 |
20140071840 | Venancio | Mar 2014 | A1 |
20140074746 | Wang | Mar 2014 | A1 |
20140076974 | Havens et al. | Mar 2014 | A1 |
20140078341 | Havens et al. | Mar 2014 | A1 |
20140078342 | Li et al. | Mar 2014 | A1 |
20140078345 | Showering | Mar 2014 | A1 |
20140098792 | Wang et al. | Apr 2014 | A1 |
20140100774 | Showering | Apr 2014 | A1 |
20140100813 | Showering | Apr 2014 | A1 |
20140103115 | Meier et al. | Apr 2014 | A1 |
20140104413 | McCloskey et al. | Apr 2014 | A1 |
20140104414 | McCloskey et al. | Apr 2014 | A1 |
20140104416 | Giordano et al. | Apr 2014 | A1 |
20140104451 | Todeschini et al. | Apr 2014 | A1 |
20140106594 | Skvoretz | Apr 2014 | A1 |
20140106725 | Sauerwein, Jr. | Apr 2014 | A1 |
20140108010 | Maltseff et al. | Apr 2014 | A1 |
20140108402 | Gomez et al. | Apr 2014 | A1 |
20140108682 | Caballero | Apr 2014 | A1 |
20140110485 | Toa et al. | Apr 2014 | A1 |
20140114530 | Fitch et al. | Apr 2014 | A1 |
20140124577 | Wang et al. | May 2014 | A1 |
20140124579 | Ding | May 2014 | A1 |
20140125842 | Winegar | May 2014 | A1 |
20140125853 | Wang | May 2014 | A1 |
20140125999 | Longacre et al. | May 2014 | A1 |
20140129378 | Richardson | May 2014 | A1 |
20140131438 | Kearney | May 2014 | A1 |
20140131441 | Nahill et al. | May 2014 | A1 |
20140131443 | Smith | May 2014 | A1 |
20140131444 | Wang | May 2014 | A1 |
20140131445 | Ding et al. | May 2014 | A1 |
20140131448 | Xian et al. | May 2014 | A1 |
20140133379 | Wang et al. | May 2014 | A1 |
20140136208 | Maltseff et al. | May 2014 | A1 |
20140140585 | Wang | May 2014 | A1 |
20140151453 | Meier et al. | Jun 2014 | A1 |
20140152882 | Samek et al. | Jun 2014 | A1 |
20140158770 | Sevier et al. | Jun 2014 | A1 |
20140159869 | Zumsteg et al. | Jun 2014 | A1 |
20140166755 | Liu et al. | Jun 2014 | A1 |
20140166757 | Smith | Jun 2014 | A1 |
20140166759 | Liu et al. | Jun 2014 | A1 |
20140168787 | Wang et al. | Jun 2014 | A1 |
20140172363 | Deichmann et al. | Jun 2014 | A1 |
20140175165 | Havens et al. | Jun 2014 | A1 |
20140175172 | Jovanovski et al. | Jun 2014 | A1 |
20140191644 | Chaney | Jul 2014 | A1 |
20140191913 | Ge et al. | Jul 2014 | A1 |
20140197238 | Liu et al. | Jul 2014 | A1 |
20140197239 | Havens et al. | Jul 2014 | A1 |
20140197304 | Feng et al. | Jul 2014 | A1 |
20140203087 | Smith et al. | Jul 2014 | A1 |
20140204268 | Grunow et al. | Jul 2014 | A1 |
20140214631 | Hansen | Jul 2014 | A1 |
20140217166 | Berthiaume et al. | Aug 2014 | A1 |
20140217180 | Liu | Aug 2014 | A1 |
20140231500 | Ehrhart et al. | Aug 2014 | A1 |
20140232930 | Anderson | Aug 2014 | A1 |
20140247315 | Marty et al. | Sep 2014 | A1 |
20140263493 | Amurgis et al. | Sep 2014 | A1 |
20140263645 | Smith et al. | Sep 2014 | A1 |
20140270196 | Braho et al. | Sep 2014 | A1 |
20140270229 | Braho | Sep 2014 | A1 |
20140278387 | Digregorio | Sep 2014 | A1 |
20140282210 | Bianconi | Sep 2014 | A1 |
20140284384 | Lu et al. | Sep 2014 | A1 |
20140288933 | Braho et al. | Sep 2014 | A1 |
20140297058 | Barker et al. | Oct 2014 | A1 |
20140299665 | Barber et al. | Oct 2014 | A1 |
20140312121 | Lu et al. | Oct 2014 | A1 |
20140319220 | Coyle | Oct 2014 | A1 |
20140319221 | Oberpriller et al. | Oct 2014 | A1 |
20140326787 | Barten | Nov 2014 | A1 |
20140332590 | Wang et al. | Nov 2014 | A1 |
20140344943 | Todeschini et al. | Nov 2014 | A1 |
20140346233 | Liu et al. | Nov 2014 | A1 |
20140351317 | Smith et al. | Nov 2014 | A1 |
20140353373 | Van et al. | Dec 2014 | A1 |
20140361073 | Qu et al. | Dec 2014 | A1 |
20140361082 | Xian et al. | Dec 2014 | A1 |
20140362184 | Jovanovski et al. | Dec 2014 | A1 |
20140363015 | Braho | Dec 2014 | A1 |
20140369511 | Sheerin et al. | Dec 2014 | A1 |
20140374483 | Lu | Dec 2014 | A1 |
20140374485 | Xian et al. | Dec 2014 | A1 |
20150001301 | Ouyang | Jan 2015 | A1 |
20150001304 | Todeschini | Jan 2015 | A1 |
20150003673 | Fletcher | Jan 2015 | A1 |
20150009338 | Laffargue et al. | Jan 2015 | A1 |
20150009610 | London et al. | Jan 2015 | A1 |
20150014416 | Kotlarsky et al. | Jan 2015 | A1 |
20150021397 | Rueblinger et al. | Jan 2015 | A1 |
20150028102 | Ren et al. | Jan 2015 | A1 |
20150028103 | Jiang | Jan 2015 | A1 |
20150028104 | Ma et al. | Jan 2015 | A1 |
20150029002 | Yeakley et al. | Jan 2015 | A1 |
20150032709 | Maloy et al. | Jan 2015 | A1 |
20150039309 | Braho et al. | Feb 2015 | A1 |
20150040378 | Saber et al. | Feb 2015 | A1 |
20150048168 | Fritz et al. | Feb 2015 | A1 |
20150049347 | Laffargue et al. | Feb 2015 | A1 |
20150051992 | Smith | Feb 2015 | A1 |
20150053766 | Havens et al. | Feb 2015 | A1 |
20150053768 | Wang et al. | Feb 2015 | A1 |
20150053769 | Thuries et al. | Feb 2015 | A1 |
20150062366 | Liu et al. | Mar 2015 | A1 |
20150063215 | Wang | Mar 2015 | A1 |
20150063676 | Lloyd et al. | Mar 2015 | A1 |
20150069130 | Gannon | Mar 2015 | A1 |
20150071819 | Todeschini | Mar 2015 | A1 |
20150083800 | Li et al. | Mar 2015 | A1 |
20150086114 | Todeschini | Mar 2015 | A1 |
20150088522 | Hendrickson et al. | Mar 2015 | A1 |
20150096872 | Woodburn | Apr 2015 | A1 |
20150099557 | Pettinelli et al. | Apr 2015 | A1 |
20150100196 | Hollifield | Apr 2015 | A1 |
20150102109 | Huck | Apr 2015 | A1 |
20150115035 | Meier et al. | Apr 2015 | A1 |
20150127791 | Kosecki et al. | May 2015 | A1 |
20150128116 | Chen et al. | May 2015 | A1 |
20150129659 | Feng et al. | May 2015 | A1 |
20150133047 | Smith et al. | May 2015 | A1 |
20150134470 | Hejl et al. | May 2015 | A1 |
20150136851 | Harding et al. | May 2015 | A1 |
20150136854 | Lu et al. | May 2015 | A1 |
20150142492 | Kumar | May 2015 | A1 |
20150144692 | Hejl | May 2015 | A1 |
20150144698 | Teng et al. | May 2015 | A1 |
20150144701 | Xian et al. | May 2015 | A1 |
20150149946 | Benos et al. | May 2015 | A1 |
20150161429 | Xian | Jun 2015 | A1 |
20150169925 | Chen et al. | Jun 2015 | A1 |
20150169929 | Williams et al. | Jun 2015 | A1 |
20150186703 | Chen et al. | Jul 2015 | A1 |
20150193644 | Kearney et al. | Jul 2015 | A1 |
20150193645 | Colavito et al. | Jul 2015 | A1 |
20150199957 | Funyak et al. | Jul 2015 | A1 |
20150204671 | Showering | Jul 2015 | A1 |
20150210199 | Payne | Jul 2015 | A1 |
20150220753 | Zhu et al. | Aug 2015 | A1 |
20150254485 | Feng et al. | Sep 2015 | A1 |
20150327012 | Bian et al. | Nov 2015 | A1 |
20160014251 | Hejl | Jan 2016 | A1 |
20160040982 | Li et al. | Feb 2016 | A1 |
20160042241 | Todeschini | Feb 2016 | A1 |
20160057230 | Todeschini et al. | Feb 2016 | A1 |
20160104019 | Todeschini et al. | Apr 2016 | A1 |
20160109219 | Ackley et al. | Apr 2016 | A1 |
20160109220 | Laffargue et al. | Apr 2016 | A1 |
20160109224 | Thuries et al. | Apr 2016 | A1 |
20160112631 | Ackley et al. | Apr 2016 | A1 |
20160112643 | Laffargue et al. | Apr 2016 | A1 |
20160124516 | Schoon et al. | May 2016 | A1 |
20160125217 | Todeschini | May 2016 | A1 |
20160125342 | Miller et al. | May 2016 | A1 |
20160125873 | Braho et al. | May 2016 | A1 |
20160133253 | Braho et al. | May 2016 | A1 |
20160171720 | Todeschini | Jun 2016 | A1 |
20160178479 | Goldsmith | Jun 2016 | A1 |
20160180678 | Ackley et al. | Jun 2016 | A1 |
20160189087 | Morton et al. | Jun 2016 | A1 |
20160227912 | Oberpriller et al. | Aug 2016 | A1 |
20160232891 | Pecorari | Aug 2016 | A1 |
20160292477 | Bidwell | Oct 2016 | A1 |
20160294779 | Yeakley et al. | Oct 2016 | A1 |
20160306769 | Kohtz et al. | Oct 2016 | A1 |
20160314276 | Wilz et al. | Oct 2016 | A1 |
20160314294 | Kubler et al. | Oct 2016 | A1 |
Number | Date | Country |
---|---|---|
3007096 | Apr 2016 | EP |
2013163789 | Nov 2013 | WO |
2013173985 | Nov 2013 | WO |
2014019130 | Feb 2014 | WO |
2014110495 | Jul 2014 | WO |
Entry |
---|
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al); 16 pages. |
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages. |
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages. |
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages. |
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages. |
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages. |
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages. |
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages. |
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned. |
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned. |
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned. |
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages. |
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages. |
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Bamdringa); 38 pages. |
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages. |
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages. |
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages. |
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages. |
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages. |
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages. |
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages. |
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages. |
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages. |
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Page); 47 pages. |
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned. |
Search Report in related European Application No. 17166725.6 dated Sep. 12, 2017, pp. 1-8. |
Guangzhou Winson Information Technology Co., Ltd, web page titled “2.4G Wni-3002 Wireless Handheld Barcode Scanner Customized Portable 3D Imaging Tablet PC Programmable”, downloaded from: http://www.made-in-china.com/showroom/winson-barcodescan/product-detailJoixsvzKfyrN/Cina-2-4G-Wni-3002-Wireless-Handheld-Barcode-Scanner, Dec. 4, 2014, pp. 1-7. |
Decision to Grant for European Application No. 17166725.6 dated Feb. 6, 2020, 2 pages. |
Intention to Grant for European Application No. 17166725.6 dated Sep. 30, 2019, 8 pages. |
Non-Final Rejection dated Jan. 12, 2018 for U.S. Appl. No. 15/138,358. |
Notice of Allowance and Fees Due (PTOL-85) dated Sep. 6, 2018 for U.S. Appl. No. 15/138,358. |
Extended European Search Report for European Application No. 20152999.7 dated Apr. 16, 2020, 7 pages. |
Number | Date | Country | |
---|---|---|---|
20190122087 A1 | Apr 2019 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 15138358 | Apr 2016 | US |
Child | 16220895 | US |