Technical Field
This disclosure generally relates to the field of automated package handling.
Description of the Related Art
Package handling efficiency is increased by automating aspects of the process. Two tasks that are particularly time consuming are determining package dimensions and determining dimensional weight. Package dimensions are important for optimizing the loading process. In addition, knowing the remaining available space is also useful.
The concept of dimensional weight has been widely adopted by the transportation industry in most parts of the world as a standard way of determining charges for the shipment of packaged goods. Determining the dimensional weight of an object involves measuring the cubic space occupied by the object, or dimensioning. Dimensional weight is widely used because shipping costs calculated on the weight of the goods alone would be unprofitable for carriers when the shipped goods have low density, e.g., goods that weigh little but occupy a large space. By using dimensional weight in calculating shipping costs, carriers can instead charge based on either the actual weight or the dimensional weight of the shipped goods, typically whichever is greater. Moreover, by dimensioning objects such as parcels, packages, and pallets, carriers, warehouses, shipping retailers, postal companies, and the like may optimally utilize their storage space and charge for the service accordingly.
Dimensional weight involves the volumetric weight of an object, and, more specifically, the cubic space the object occupies. Typically, the dimensional weight of an object is calculated as the multiplicative product of the object's length, width, and height divided by a constant. For example, in the United States, the dimensional weight of an object is calculated by domestic air carriers as (length×width×height)/166, with all dimensions in inches. A parcel weighing 25 pounds and having volumetric dimensions of 20×15×15 inches would have, using the above formula, a dimensional weight of 27.1 pounds. In this example, the shipping charge would be determined based on the dimensional weight of 27.1 pounds, because it is greater than the actual weight of 25 pounds.
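The calculation above can be illustrated with a short sketch. This is a minimal illustration only, assuming the U.S. domestic air divisor of 166 and dimensions in inches; the function names are not part of this disclosure.

```python
def dimensional_weight(length_in, width_in, height_in, divisor=166.0):
    """Dimensional (volumetric) weight: length x width x height divided by a carrier divisor."""
    return (length_in * width_in * height_in) / divisor


def billable_weight(actual_lb, length_in, width_in, height_in, divisor=166.0):
    """Carriers typically charge on the greater of the actual and dimensional weights."""
    return max(actual_lb, dimensional_weight(length_in, width_in, height_in, divisor))


# The example from the text: a 25 pound parcel measuring 20 x 15 x 15 inches.
dim_wt = dimensional_weight(20, 15, 15)   # 4500 / 166, approximately 27.1 pounds
basis = billable_weight(25, 20, 15, 15)   # 27.1 pounds, since 27.1 > 25
```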
To expedite the dimensioning process and to facilitate accurate dimensioning, companies have invested in various automatic dimensioning systems. One type of dimensioning system, such as a volume dimensioning application, performs volumetric dimensioning of objects by first capturing an image of the objects and then finding those objects in the image. For instance, an image capturing device may be utilized to capture an image of a number of parcels waiting to be dimensioned. Afterwards, a computing device may select one of the parcels represented in the image for which to calculate the dimensional weight. To do so, the computing device may need to estimate the boundary of the selected parcel to determine its approximate length, width, and height for the calculation. However, it can be very difficult at times to discern a particular object or objects in an image due to insufficient lighting or the presence of numerous objects in the same image. Although such a volume dimensioning application may be designed as a standalone, automatic application, issues such as those mentioned above may cause inaccuracy in the dimensioning process and ultimately result in delay and extra operational costs.
A method of operating a dimensioning system to determine dimensional information for objects may be summarized as including acquiring a number of images; computationally identifying objects in at least one of the acquired images; computationally initially selecting one object represented in the at least one of the acquired images as a candidate for processing; providing an indication to a user indicative of the initially selected object; receiving at least one user input indicative of an object selected for processing; and computationally determining dimensional data for the object indicated by the received user input.
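A minimal sketch of this semi-automatic flow is given below for illustration; the interfaces and names (imager, display, user_input, identify, dimension) are assumptions made for the example and are not defined by this disclosure.

```python
def semi_automatic_dimensioning(imager, display, user_input, identify, dimension):
    """Acquire an image, propose an object, let the user confirm or correct, then dimension."""
    image = imager.acquire()                     # acquire a number of images (one shown here)
    objects = identify(image)                    # computationally identify candidate objects
    proposed = objects[0]                        # initial, automatic selection
    display.show(image, highlight=proposed)      # indicate the initially selected object
    response = user_input.get_selection(objects, proposed)   # confirm, re-select, or edit
    chosen = response.selected_object            # object indicated by the user input
    perimeter = response.perimeter or chosen.estimated_perimeter
    return dimension(image, chosen, perimeter)   # dimensional data for the chosen object
```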
Receiving at least one user input indicative of an object selected for processing may include receiving a user selection that confirms the initially selected one object as the object for processing. Computationally determining dimensional data for the object indicated by the received user input may include determining a dimensional weight based on an estimated perimeter of the initially selected one object as represented in the acquired image. Receiving at least one user input indicative of an object selected for processing may include receiving a user selection that indicates an object other than the initially selected one object as the object for processing. Computationally determining dimensional data for the object indicated by the received user input may include determining a dimensional weight based on an estimated perimeter of the object indicated by the received user selection as the object is represented in the acquired image. The method may further include providing an indication to the user indicative of a currently selected object, the indication visually distinguishing the currently selected object in a display of the acquired image from any other object represented in the display of the acquired image. Receiving at least one user input indicative of an object selected for processing may include receiving a user selection that indicates at least a portion of a new perimeter for the object for processing. Computationally determining dimensional data for the object indicated by the received user input may include computationally determining dimensional data based on the new perimeter of the object represented in the acquired image. Providing an indication to a user indicative of the initially selected object may include displaying the acquired image and visually distinguishing the initially selected object in the display of the acquired image from any other objects represented in the display of the acquired image. Visually distinguishing the initially selected object in the display of the acquired image from any other objects represented in the display of the acquired image may include displaying a border about at least a portion of the initially selected object in the display of the acquired image. Receiving at least one user input indicative of an object selected for processing may include receiving at least one signal representing a position in the image that indicates a position of at least a portion of a new perimeter for the object for processing. Visually distinguishing the initially selected object in the display of the acquired image from any other objects represented in the display of the acquired image may include displaying a draggable border about at least a portion of the initially selected object in the display of the acquired image. Receiving at least one user input indicative of an object selected for processing may include receiving at least one signal representing a dragging of the draggable border to a new position that indicates at least a portion of a new perimeter for the object for processing. Computationally determining dimensional data for the object indicated by the received user input may include computationally determining a dimension of at least one of a box, a package, a parcel, a pallet or a document represented in the acquired image.
A method of operating a dimensioning system to determine dimensional information for objects may be summarized as including acquiring a number of images; computationally identifying objects or spaces in at least one of the acquired images; determining dimensional data for at least one object or space; and receiving at least one user input indicative of an object or space selected for processing.
The method may further include computationally determining dimensional data for the object or space selected by the received user input. The method may further include computationally revising the determined dimensional data for the at least one object or space in response to the received user input. Receiving at least one user input may include receiving at least one user input in the form of at least one of a keyboard entry, a computer mouse entry, a touch-screen device entry, a voice command, an audible command, and a bar code reading. Receiving at least one user input indicative of an object or space selected for processing may include receiving a user selection that confirms the initially selected one object or space as the object for processing. Computationally determining dimensional data for the object or space selected by the received user input may include determining a dimensional weight based on an estimated perimeter of the initially selected one object as represented in the acquired image. Computationally determining dimensional data for the object or space selected by the received user input may include determining a dimensional data based on an estimated perimeter of the initially selected one space as represented in the acquired image. Receiving at least one user input indicative of an object or space selected for processing may include receiving a user selection that indicates an object or space other than the initially selected one object or space as the object or space for processing. Computationally determining dimensional data for the object or space selected by the received user input may include determining a dimensional weight based on an estimated perimeter of the object selected by the received user selection as the object is represented in the acquired image. Computationally determining dimensional data for the object or space selected by the received user input may include determining a dimensional data based on an estimated perimeter of the space selected by the received user selection as the space is represented in the acquired image. The method may further include providing an indication to the user indicative of a currently selected object or space, the indication visually distinguishing the currently selected object or space in a display of the acquired image from any other object or space represented in the display of the acquired image. Receiving at least one user input indicative of an object or space selected for processing may include receiving a user selection that indicates at least a portion of a new perimeter for the object or space for processing. Computationally determining dimensional data for the object or space selected by the received user input may include computationally determining dimensional data based on the new perimeter of the object or space represented in the acquired image in response to the received user input.
A dimensioning system to determine dimensional information for objects may be summarized as including an imager configured to acquire images; a user input/output system configured to display images and to receive user input; and a processor configured to identify objects in the acquired images, initially select one of the identified objects for processing, cause the acquired images to be displayed via the user input/output system along with an indication indicative of the initially selected one object, and computationally determine dimensional data for an object indicated by at least one user input received via the user input/output system.
The processor may be configured to determine a dimensional weight based on an estimated perimeter of the initially selected one object as represented in the acquired image in response to at least one user input confirming the initially selected one object as the object to be processed. The processor may be configured to computationally determine a dimensional weight based on a new perimeter of the initially selected one object represented in the acquired image in response to at least one user input indicative of the new perimeter. The processor may be configured to determine a dimensional weight based on an estimated perimeter of an object represented in the acquired image other than the initially selected one object in response to at least one user input selecting the other object as the object to be processed. The processor may be configured to determine a dimensional weight based on a user identified perimeter of an object represented in the acquired image other than the initially selected one object in response to at least one user input selecting the other object as the object to be processed and identifying at least a portion of the user identified perimeter. The processor may be configured to cause acquired images to be displayed via the user input/output system along with an indication indicative of the initially selected one object by displaying a draggable border about at least a portion of the initially selected object in the display of the acquired image. The processor may be further configured to cause acquired images to be displayed via the user input/output system along with an indication indicative of a user selected object by displaying a draggable border about at least a portion of a user selected object in the display of the acquired image. The user input/output system may include a touch-sensitive display. The processor may be further configured to cause the user input/output system to display dimensional data for one or more objects in the acquired images.
A dimensioning system to determine dimensional information for confined empty spaces may be summarized as including an imager to acquire images; a user input/output system to display images and to receive user input; and a processor configured to identify spaces in the acquired images, initially select one of the identified spaces for processing, cause the acquired images to be displayed via the user input/output system along with an indication indicative of selection of the initially selected space, and computationally determine dimensional data for a space indicated by at least one user input received via the user input/output system.
The processor may be configured to computationally determine the dimension data based on an estimated perimeter of the initially selected space as represented in the acquired image in response to at least one user input confirming the initially selected space as the space to be processed. The processor may be configured to computationally determine the dimensional data based on a new perimeter of the initially selected space represented in the acquired image in response to at least one user input indicative of the new perimeter. The processor may be configured to computationally determine the dimensional data based on an estimated perimeter of a space represented in the acquired image other than the initially selected space in response to at least one user input selecting the other space as the space to be processed. The processor may be configured to computationally determine the dimensional data based on a user identified perimeter of a space represented in the acquired image other than the initially selected space in response to at least one user input selecting the other space as the space to be processed and identifying at least a portion of the user identified perimeter. The processor may be configured to cause acquired images to be displayed via the user input/output system along with an indication indicative of the initially selected space by displaying a draggable border about at least a portion of the initially selected space in the display of the acquired image. The processor may be further configured to cause acquired images to be displayed via the user input/output system along with an indication indicative of a user selected space by displaying a draggable border about at least a portion of a user selected space in the display of the acquired image. The user input/output system may include a touch-sensitive display. The processor may be further configured to cause the user input/output system to display dimensional data related to one or more objects.
A computer-readable medium storing therein instructions to cause a computer to execute a process related to determining dimensional information for objects may be summarized as including displaying an image; identifying objects represented in the displayed image; initially selecting one object of the objects represented in the displayed image for processing; causing the displayed image and an indication indicative of the initially selected one object to be displayed; receiving user input; and determining dimensional data for an object indicated by at least one user input.
Determining dimensional data for an object indicated by at least one user input may include determining a dimensional weight based on an estimated perimeter of the initially selected one object as represented in the displayed image in response to at least one user input confirming the initially selected one object as the object to be processed. Determining dimensional data for an object indicated by at least one user input may include determining a dimensional weight based on a new perimeter of the initially selected one object represented in the displayed image in response to at least one user input indicative of the new perimeter. Determining dimensional data for an object indicated by at least one user input may include determining a dimensional weight based on an estimated perimeter of an object represented in the displayed image other than the initially selected one object in response to at least one user input selecting the other object as the object to be processed. Determining dimensional data for an object indicated by the at least one user input may include determining a dimensional weight based on a user identified perimeter of an object represented in the displayed image other than the initially selected one object in response to at least one user input selecting the other object as the object to be processed and identifying at least a portion of the user identified perimeter. Causing the displayed image and an indication indicative of the initially selected one object to be displayed may include causing the displayed image to be displayed and causing a draggable border about at least a portion of the initially selected one object to be displayed in the displayed image. Causing the displayed image and an indication indicative of the initially selected one object to be displayed may include causing the displayed image to be displayed and causing a draggable border about at least a portion of a user selected object to be displayed in the displayed image.
A computer-readable medium storing therein instructions to cause a computing system to execute a process related to determining dimensional information for objects may be summarized as including displaying an image; identifying objects or spaces represented in the displayed image; providing an indication to a user; receiving user input; and determining dimensional data for an object or space in response to the user input.
Providing an indication to a user may include indicating a problem related to an object or space of the objects or spaces in the displayed image to the user. Indicating a problem related to an object or space of the objects or spaces in the displayed image to the user may include indicating a problem in determining dimensional data for an object or space of the objects or spaces in the displayed image to the user. Receiving user input may include receiving the user input in the form of at least one of a keyboard entry, a computer mouse entry, a touch-screen device entry, a voice command, an audible command, and a bar code reading. The process may further include displaying a second image after receiving the user input; identifying objects or spaces represented in the second image; and receiving a second user input. Determining dimensional data for an object or space in response to the user input may include determining dimensional data for an object or space identified in the second image in response to the second user input. The process may further include determining dimensional data for one of the identified objects or spaces in the displayed image prior to receiving the user input. The process may further include displaying a dimensional data for an object or space.
A processor-implemented method of selecting an object from at least one object in an image to process information about the selected object may be summarized as including providing an image of the at least one object; selecting a first object of the at least one object in the image; updating the image to indicate the selection of the first object; receiving an input related to the selection of the first object; updating the image to indicate the input; and computationally determining dimensional data related to one of the at least one object using the input.
Updating the image to indicate the selection of the first object may include updating the image to indicate an estimated perimeter around the first object. Receiving an input related to the selection of the first object may include receiving the input selecting a second object of the at least one object that is different than the first object. Receiving an input related to the selection of the first object may include receiving the input to modify an aspect related to the indication of the selection of the first object. Receiving the input to modify an aspect related to the indication of the selection of the first object may include receiving the input to modify an estimated perimeter of the first object. Receiving an input related to the selection of the first object may include receiving the input as a user selection on a portion of a touch-screen device to select a second object of the at least one object. Receiving an input related to the selection of the first object may include receiving the input as a boundary drawn on a touch-screen device around an image of a second object of the at least one object to select the second object. Receiving an input related to the selection of the first object may include detecting a number of contacts at a number of positions on a touch-screen device, the contacts indicative of a number of corners of the first object. Receiving an input related to the selection of the first object may include receiving at least one user input indicative of a new position of a corner of the first object in the image displayed on a touch-screen device. Receiving an input related to the selection of the first object may include receiving at least one user input indicative of a perimeter of one of the at least one object on a touch-screen device indicative of a selection of the one of the at least one object. Determining dimensional data related to one of the at least one object using the input may include determining a dimensional weight of the one of the at least one object based on a computationally determined estimated perimeter of the one of the at least one object. Determining dimensional data related to one of the at least one object using the input may include determining a dimensional weight of the one of the at least one object based on a user identified perimeter of the one of the at least one object.
A processor-implemented method of selecting an object from at least one object in an image to process information about the selected object may be summarized as including displaying the image of the at least one object; selecting a first object of the at least one object in the image; updating the image to indicate the selection of the first object; receiving an input related to the selection of the first object; and updating the image to indicate the input.
Updating the image to indicate the selection of the first object may include updating the image to indicate an estimated perimeter around the first object. Receiving an input related to the selection of the first object may include receiving the input selecting a second object of the at least one object that is different than the first object. Receiving an input related to the selection of the first object may include receiving the input to modify an aspect related to the indication of the selection of the first object. Receiving the input to modify an aspect related to the indication of the selection of the first object may include receiving the input to modify an estimated perimeter of the first object. Receiving an input related to the selection of the first object may include receiving the input as a user selection on a portion of a touch-screen device to select a second object of the at least one object. Receiving an input related to the selection of the first object may include receiving the input as a boundary drawn on a touch-screen device around an image of a second object of the at least one object to select the second object. Receiving an input related to the selection of the first object may include detecting a number of contacts at a number of positions on a touch-screen device, the contacts indicative of a number of corners of the first object. Receiving an input related to the selection of the first object may include receiving at least one user input indicative of a new position of a corner of the first object in the image displayed on a touch-screen device. Receiving an input related to the selection of the first object may include receiving at least one user input indicative of a perimeter of one of the at least one object on a touch-screen device indicative of a selection of the one of the at least one object. Receiving an input may include receiving an audible command from a user. Receiving an input may include receiving a verbal command from a user. The method may further include computationally determining dimensional data related to one of the at least one object using the input. Determining dimensional data related to one of the at least one object using the input may include determining a dimensional weight of the one of the at least one object based on a computationally determined estimated perimeter of the one of the at least one object. Determining dimensional data related to one of the at least one object using the input may include determining a dimensional weight of the one of the at least one object based on a user identified perimeter of the one of the at least one object.
In the drawings, identical reference numbers identify similar elements or acts. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale. For example, the shapes of various elements and angles are not drawn to scale, and some of these elements are arbitrarily enlarged and positioned to improve drawing legibility. Further, the particular shapes of the elements as drawn are not intended to convey any information regarding the actual shape of the particular elements, and have been solely selected for ease of recognition in the drawings.
In the following description, certain specific details are set forth in order to provide a thorough understanding of various disclosed embodiments. However, one skilled in the relevant art will recognize that embodiments may be practiced without one or more of these specific details, or with other methods, components, materials, etc. In other instances, well-known structures associated with computing systems, imagers (e.g., cameras), and/or transport mechanisms (e.g., conveyors) have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments.
Unless the context requires otherwise, throughout the specification and claims which follow, the word “comprise” and variations thereof, such as “comprises” and “comprising,” are to be construed in an open, inclusive sense, that is, as “including, but not limited to.”
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, the appearances of the phrases “in one embodiment” or “in an embodiment” in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
The headings and Abstract of the Disclosure provided herein are for convenience only and do not interpret the scope or meaning of the embodiments.
In some embodiments, the computing system 12 may be, for example, a desktop computer, a notebook computer, a handheld computer, a PDA, a workstation, a mainframe computer, or a processor in any type of the aforementioned computers or devices. The user input device 14 may be, for example, a keyboard, a computer mouse, a touch-screen device, a voice recognition device, a bar code reader, or any combination thereof. The user output device 16 may be, for example, a standalone monitor (e.g., a liquid-crystal display monitor or a cathode-ray tube monitor), a display screen, an auditory device, or a touch-screen device. In one embodiment, the user input device 14 and the user output device 16 may each be a part of a touch-screen device, which, as known in the art, is a display device that can detect the presence and location of a touch within a display area of the display device. For example, a touch-screen device including both the user input device 14 and the user output device 16 may have a screen that is operable to display an image and detect a contact to the screen by a user's finger, hand, or a writing tool such as a stylus. The data storage device 18 is preferably operable to store digital data that includes textual and numerical data, digitized images, and data input by a user of the dimensioning system 10, etc. The data storage device 18 may comprise a memory device such as, for example, a hard drive (whether as an integral part of the dimensioning system 10 or as a standalone device), a recording medium, or an integrated-circuit memory device (e.g., memory chip). The imager 15 may be, for example, a charge-coupled device (CCD), a complementary metal-oxide-semiconductor (CMOS) active-pixel sensor, or any similar image sensing or capture device that converts an optical image to a signal representative of the image.
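Purely as an illustration of how these components fit together, the grouping below mirrors the elements identified above; it is a sketch for orientation, not an implementation from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class DimensioningSystem10:
    """Illustrative composition of the components referenced as 12, 14, 15, 16, and 18."""
    computing_system: object   # 12: desktop, notebook, handheld computer, PDA, workstation, ...
    user_input: object         # 14: keyboard, mouse, touch-screen device, voice recognition, bar code reader
    imager: object             # 15: CCD or CMOS sensor converting an optical image to a signal
    user_output: object        # 16: monitor, display screen, auditory device, or touch-screen device
    data_storage: object       # 18: hard drive, recording medium, or integrated-circuit memory
```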
In operation, the dimensioning system 10 of
A user of the dimensioning system 10 viewing the displayed image of the one or more objects may provide user input through the user input device 14. If the user agrees that the selected object is the one on which further computation is to be performed, the user input may simply be one form of validation, such as, for example, a click of a computer mouse, a touch on a "Yes" button or user-selectable icon on the user input device 14 in the form of a touch-screen device, pressing and releasing of a key on a keyboard, a check mark entered in an input field displayed on the user input device 14 in the form of a touch-screen device, or an audible or verbal command such as "Object 3," for example. If the user agrees with the selection but wishes to make some modification to the selection (e.g., to correct the estimated perimeter of the selected object before dimensional weight is calculated based on the estimated perimeter), the user input may include both a validation and a modification, or simply a modification. For example, the perimeter estimated by the computing system 12 may be entirely or partially incorrect due to insufficient lighting in the image or too many or overlapping objects in the image making it difficult to discern the perimeter of the selected object. In such case, the user may modify all or a portion of the perimeter of the selected object as estimated by the computing system 12 and shown in the image on the output device 16. If, however, the user disagrees with the selection by the computing system 12 and wishes to select a different object among the objects represented in the displayed image, the user input may include a selection of a different object in the image. For instance, when an object A of two objects represented in the displayed image is selected by the computing system 12 and the user wishes to select an object B instead, the user may enter his/her selection of object B in any of various ways. In another situation, the image may be inadequate for certain packages; for example, when a package is viewed straight on, only two of the three dimensions are visible. Accordingly, in one embodiment, the computing system 12 may request that the user perform a task, such as by issuing the prompt "Please move right or left and re-image Object 3," for example.
The method by which the user enters his/her input may include, but is not limited to, one of the following: selecting or indicating at least a portion of the representation of object B in the displayed image; drawing or otherwise indicating a boundary around the representation of object B in the displayed image displayed on a touch-screen device; drawing a mark on or otherwise marking object B in the displayed image; and/or pointing out or otherwise indicating or selecting the corners or other specific features of object B in the displayed image. The user may take such action by, for example, manipulating a cursor or pointer icon displayed in the image using a pointer device (e.g., computer mouse, trackball, joystick, rocker switch, or arrow keys), or by using audible or verbal commands. The user may also take such action by touching one or more portions of a touch-sensitive screen on which the image is displayed, for example, to select portions of object B or user-selectable icons.
Whether the user input validates or modifies the selection of the computing system 12, or selects a new object in the image, the user output device 16 may display the user input along with the representation of the at least one object in the image. For example, the user may validate the selection of a first object of the objects represented in the image yet at the same time modify the estimated perimeter of the representation of the first object by tracing or otherwise indicating the actual perimeter (i.e., indicated perimeter) of the representation of the first object on the display of the user output device 16. For instance, the user may select a portion of the perimeter of the object and drag the selected portion using a pointer device (e.g., mouse, trackball, joystick, etc.), a finger, or a stylus (e.g., on a touch screen). In such case, the user output device 16 may show the traced line on the display as drawn by the user. If the user selects a second object of the objects represented in the image (e.g., by drawing a cross or a check mark on the second object), the user output device 16 may then represent the cross or check mark on the display. This provides an interaction between the dimensioning system 10 and the user in that the user provides user input as a part of the overall process of determining a dimensional value of the selected object (e.g., volume dimensioning), and the dimensioning system 10 provides an indication or feedback to the user of the user's input and performs computation based on the user input.
After the user provides input to the dimensioning system 10 through the user input device 14, the computing system 12 performs computation related to the selected object based on the user input. In the case of volume dimensioning where the dimensional weight of an object is computed, the computing system 12 computes the dimensional weight of the selected object based on an estimated or indicated perimeter of the selected object. More specifically, in one embodiment, the computing system 12 is able to estimate a length, width, and height of the selected object, for example, by using the estimated or indicated perimeter of the selected object. Once the dimensional weight of the selected object has been determined, the charge for shipment of the selected object may then be determined.
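The disclosure does not prescribe a particular way of recovering length, width, and height from the estimated or indicated perimeter. The sketch below assumes the simplest case for illustration: a calibrated pixel-to-inch scale is known, and three edges of the (estimated or user-corrected) perimeter correspond to the object's length, width, and height.

```python
def edge_length_inches(p0, p1, pixels_per_inch):
    """Euclidean length of one perimeter edge, converted from image pixels to inches."""
    return ((p1[0] - p0[0]) ** 2 + (p1[1] - p0[1]) ** 2) ** 0.5 / pixels_per_inch


def dimensional_weight_from_perimeter(length_edge, width_edge, height_edge,
                                      pixels_per_inch, divisor=166.0):
    """Estimate L, W, and H from three perimeter edges, then apply (L x W x H) / divisor."""
    length = edge_length_inches(*length_edge, pixels_per_inch)
    width = edge_length_inches(*width_edge, pixels_per_inch)
    height = edge_length_inches(*height_edge, pixels_per_inch)
    return (length * width * height) / divisor
```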
Thus, by allowing the user to validate the object selection made by the computing system 12, or by allowing the user to select an object for the computing system 12 to perform volume dimensioning on, or both, inaccuracies caused by selection of a wrong object or by an erroneous estimation of the object's perimeter (and thus its dimensions) due to insufficient lighting, the presence of numerous objects in the image, or other issues may be avoided. The user interaction serves as a check in the dimensioning process, ensuring that the correct object is selected and that computation is based on dimensions derived from a correct perimeter of the selected object.
In one embodiment, the processing component 22 is operable to determine from an image captured by the imager 25 an approximate perimeter of a first object of at least one object in the image. The processing component 22 may cause the display component 26 to display the captured image and an indicator that indicates the approximate perimeter of the first object. Upon receiving at least one input from the user via the user input component 24, the processing component 22 determines a dimensional value of one of the objects in the displayed image based on the user input. For example, the processing component 22 may perform computation for volume dimensioning on the first object if the user input validates or modifies the approximate perimeter of the first object. If the user input modifies the approximate perimeter of the first object, the computation will be based on the modified perimeter. Otherwise, if the user input indicates validation, the computation will be based on the approximate perimeter determined by the processing component 22. Alternatively, if the user selects a second object different than the first object from objects represented in the displayed image, the processing component 22 may perform volume dimensioning on the second object to determine the dimensional weight of the second object.
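The branching just described can be summarized in a short sketch. The response kinds and field names are illustrative assumptions, not terminology from the disclosure.

```python
def resolve_selection(proposed, response):
    """Decide which object and which perimeter should drive the volume-dimensioning step.

    `proposed` is the automatically selected object carrying its approximate perimeter;
    `response` carries a validation, a modified perimeter, or the selection of another object.
    """
    if response.kind == "validate":
        return proposed, proposed.approximate_perimeter     # use the estimate as-is
    if response.kind == "modify":
        return proposed, response.perimeter                 # use the user-corrected perimeter
    if response.kind == "select_other":
        other = response.selected_object
        return other, response.perimeter or other.approximate_perimeter
    raise ValueError("unrecognized user response: %r" % response.kind)
```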
In one embodiment, the at least one user input received may include a user selection that confirms the initially selected one object as the object for processing. In another embodiment, the at least one user input received may include a user selection that indicates an object other than the initially selected one object as the object for processing. For example, when the user desires to determine dimensional data for an object that is different than the object initially selected by an automatic process executed in a computing system, the user manually selects the user-selected object before the computing system proceeds further. When the user input confirms the initially selected one object as the object for processing, in one embodiment, a dimensional weight of the initially selected one object is determined based on an estimated perimeter of the initially selected one object as represented in the acquired image. Alternatively, when the user input selects a different object, a dimensional weight of the user-selected object is determined based on an estimated perimeter of the user-selected object as represented in the acquired image. Further, when the user input selects a different object, process 30a may additionally include (not shown) providing an indication to the user indicative of a currently selected object to visually distinguish the currently selected object in a display of the acquired image from any other object represented in the display of the acquired image.
In one embodiment, the at least one user input received indicates at least a portion of a new perimeter for the object for processing. In such case, in one embodiment, the dimensional data is computationally determined based on the new perimeter, as indicated by the user input, of the object represented in the acquired image. For instance, when an estimated perimeter of the selected object, as represented in the acquired image on a display, is partially or entirely incorrect (e.g., due to insufficient lighting or the presence of numerous objects when the image is acquired), the user may modify the estimated perimeter so that dimensional data for the selected object is computed not based on incorrect information (e.g., incorrect estimated perimeter) but based on modified information.
In one embodiment, the user is notified of the initial selection of the initially selected one object by a display of the acquired image, where the initially selected one object is visually distinguished in the display of the acquired image from any other objects represented in the display of the acquired image. In one embodiment, the initially selected one object is visually distinguished from any other objects represented in the display of the acquired image with a display of a border about at least a portion of the initially selected object in the display of the acquired image. In such a case, in an embodiment, the at least one user input received may include at least one signal representing a position in the image that indicates a position of at least a portion of a new perimeter for the object for processing. In an alternative embodiment, the initially selected one object is visually distinguished from any other objects represented in the display of the acquired image with a display of a draggable border about at least a portion of the initially selected object in the display of the acquired image. For example, the at least one user input received may include at least one signal representing a dragging of the draggable border to a new position that indicates at least a portion of a new perimeter for the object for processing.
It should be appreciated by one skilled in the art that the process 30a may be implemented in one integrated device or in multiple standalone devices. For example, the process 30a may be implemented in any of the computing system 10a, 10b, and 10c of
The process 30b may further computationally determine dimensional data for the object or space selected by the received user input at 35b. For example, in one embodiment, this may be due to the user input selecting an object or space that is different from the object or space for which dimensional data has already been determined. The process 30b may further computationally revise the determined dimensional data for the at least one object or space in response to the received user input at 36b. For example, in one embodiment, the user may agree with the selection of the object or space but disagree with a certain aspect of the selection (e.g., the border of the selected object or space which is used to determine the volume of the object or space). In such case, the user input may be a modification to that aspect of the selection of the object or space, such as, for example, a change in the selected object's or space's border.
The user input may come in different forms. For example, the user input may be a keyboard entry on a keyboard, a click or “click and drag” using a computer mouse, entry through a touch-sensitive screen by the user's finger or a stylus or similar tool, a voice command including at least one command word, an audible command such as a clap or some recognizable sound, or entry by a bar code reader.
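For illustration only, the sketch below normalizes these different input forms into a small set of selection commands; the event shapes and command names are assumptions made for the example.

```python
def normalize_user_input(event):
    """Map heterogeneous input events onto common validation/selection commands."""
    kind = event.get("type")
    if kind == "key" and event.get("key") in ("Enter", "y"):
        return {"command": "validate"}
    if kind in ("mouse_click", "touch_tap"):
        return {"command": "select_at", "position": event["position"]}
    if kind in ("mouse_drag", "touch_drag"):
        return {"command": "adjust_border", "path": event["path"]}
    if kind in ("voice", "audible"):
        return {"command": "spoken", "utterance": event.get("text", "")}
    if kind == "barcode":
        return {"command": "select_by_code", "code": event["data"]}
    return {"command": "ignore"}
```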
As with the process 30a, the process 30b may be implemented in one integrated device or in multiple standalone devices. For example, the process 30b may be implemented in any of the computing system 10a, 10b, and 10c of
In one embodiment, the image is updated to indicate an estimated perimeter around the first object when updating the image to indicate the selection of the first object. In one embodiment, the input (e.g., a user input manually entered by a user) selects a second object different than the first object. In another embodiment, the input modifies an aspect related to the indication of the selection of the first object. For example, in an embodiment, one aspect related to the indication of the selection of the first object may be an estimated perimeter of the first object as shown in the image, and accordingly the input may modify the estimated perimeter of the first object.
In some embodiments, the input may be a mark or a line drawn on, for example, a touch-screen device by a user to either validate the selection of the first object or to select a second object different than the first object. The input may also be a user input to point out corners of the first object in the image on, for example, a touch-screen device. When an estimated perimeter of the first object is also indicated in the image, the input may be a user input to correct a corner position of the estimated perimeter by moving a corner point of the first object in the image on, for example, a touch-screen device. The estimated perimeter of the first object may be a draggable perimeter displayed on a display device and modifiable by a user dragging at least a portion of the estimated perimeter (e.g., in a click-and-drag or point-and-drag fashion) to change the estimated perimeter into a modified boundary that more closely resembles the real perimeter of the first object. Alternatively, the input may be a boundary line drawn by the user to indicate the selection of an object approximately surrounded by the line drawn by the user. As indicated previously, the input may be provided using the user's finger, a stylus or similar tool, a keyboard, or a computer mouse.
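As a sketch of the corner-correction just described (the coordinate handling and snap radius are assumptions for illustration, not details from the disclosure):

```python
def move_nearest_corner(perimeter, drag_start, drag_end, snap_radius=20):
    """Replace the perimeter corner nearest to the start of a drag with the drop position."""
    def dist_sq(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    nearest = min(range(len(perimeter)), key=lambda i: dist_sq(perimeter[i], drag_start))
    if dist_sq(perimeter[nearest], drag_start) <= snap_radius ** 2:
        corrected = list(perimeter)
        corrected[nearest] = drag_end   # move the corner to where the user dropped it
        return corrected
    return perimeter                    # drag did not start near a corner; leave unchanged
```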
Accordingly, in one embodiment, the input received may be a user selection on a portion of a touch-screen device to select a second object of the at least one object. In another embodiment, the input received may be a boundary drawn on a touch-screen device around an image of a second object of the at least one object to select the second object that is different than the first object. In one embodiment, receiving an input may include detecting a number of contacts at a number of positions on a touch-screen device where the contacts indicate a number of corners of the first object. In a different embodiment, the received input may include at least one user input indicative of a new position of a corner of the first object in the image displayed on a touch-screen device. Alternatively, the received input may include at least one user input indicative of a perimeter of one of the at least one object on a touch-screen device indicative of a selection of the selected object.
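Selecting an object by touching it presupposes a hit test against the identified perimeters. A standard ray-casting point-in-polygon check, shown below as an illustrative sketch, is one way such a tap could be resolved; the object structure is assumed for the example.

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: True if an (x, y) point lies inside a polygon of (x, y) vertices."""
    x, y = point
    inside = False
    for i in range(len(polygon)):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % len(polygon)]
        if (y1 > y) != (y2 > y):   # edge straddles the horizontal ray through the point
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def object_at_tap(tap_position, identified_objects):
    """Return the first identified object whose estimated perimeter contains the tap, if any."""
    for candidate in identified_objects:
        if point_in_polygon(tap_position, candidate.estimated_perimeter):
            return candidate
    return None
```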
In one embodiment, determining dimensional data related to one of the at least one object using the input may include determining a dimensional weight of the one of the at least one object based on a computationally determined estimated perimeter of the one of the at least one object. For instance, when the user input confirms the initial selection of the first object by a computing system, the computing system will determine the dimensional weight of the first object based on the estimated perimeter as determined by the computing system. In an alternative embodiment, determining dimensional data related to one of the at least one object using the input may include determining a dimensional weight of the one of the at least one object based on a user identified perimeter of the one of the at least one object. For example, when the user input modifies an estimated perimeter of the first object as determined by the computing system, the computing system determines the dimensional weight of the first object based on the modified perimeter. If the user input instead selects a second object that is not the first object, the computing system may determine the dimensional weight of the user-selected second object based on an estimated perimeter of the second object as determined by the computing system or based on a user-identified perimeter of the second object.
It should be appreciated by one skilled in the art that the process 40a may be implemented in one integrated device or in multiple standalone devices. For example, the process 40a may be implemented in any of the computing system 10a, 10b, and 10c of
As with the process 40a, the process 40b may be implemented in one integrated device or in multiple standalone devices. For example, the process 40b may be implemented in any of the computing system 10a, 10b, and 10c of
In one embodiment, determining dimensional data for the object indicated by at least one user input may include determining a dimensional weight based on an estimated perimeter of the initially selected one object as represented in the acquired image in response to at least one user input confirming the initially selected one object as the object to be processed. For example, when a user confirms the estimated perimeter of initially selected object A, the dimensional weight of object A is determined based on the estimated perimeter. Alternatively, determining dimensional data for the object indicated by at least one user input may include determining a dimensional weight based on a new perimeter of the initially selected one object represented in the acquired image in response to at least one user input indicative of the new perimeter. For example, when a user modifies an estimated perimeter of initially selected object A to form a new perimeter, the dimensional weight of object A is determined based on the new perimeter.
In one embodiment, determining dimensional data for the object indicated by at least one user input may include determining a dimensional weight based on an estimated perimeter of an object represented in the acquired image other than the initially selected one object in response to at least one user input selecting the other object as the object to be processed. For example, when the user selects object B, which is different than object A as initially selected by the computer-executable program, the dimensional weight of object B may be determined based on an estimated perimeter of object B as determined by the program. In another embodiment, determining dimensional data for the object indicated by at least one user input may include determining a dimensional weight based on a user identified perimeter of an object represented in the acquired image other than the initially selected one object in response to at least one user input selecting the other object as the object to be processed and identifying at least a portion of the user identified perimeter. For instance, when the user selects object B and identifies a user-identified perimeter of object B, the dimensional weight of object B may be determined based on the user-identified perimeter. The user-identified perimeter may also be displayed as a feedback to the user on a display of the acquired image to acknowledge the user input.
In one embodiment, causing the acquired image and an indication indicative of the initially selected one object to be displayed may include causing the acquired image to be displayed and causing a draggable border about at least a portion of the initially selected one object to be displayed in a display of the acquired image. In another embodiment, causing the acquired image and an indication indicative of the initially selected one object to be displayed may include causing the acquired image to be displayed and causing a draggable border about at least a portion of a user selected object to be displayed in a display of the acquired image. In either case, with the draggable border displayed, a user may drag the draggable border to make modifications to correct error in the displayed border.
In one embodiment, the process 50b may further determine dimensional data for one of the identified objects or spaces in the displayed image prior to receiving the user input at 56b. In another embodiment, the process 50b may further display a dimensional data for an object or space. For example, a dimensional data, such as length, width, height, area, or volume, of the automatically selected object or space may be displayed before and/or after the user input is received. In yet another embodiment, at 57b, the process 50b may display a second image after receiving the user input. Objects or spaces represented in the second image are identified at 58b. Another user input is received at 59b.
Thus, systems and methods to allow user interaction in volume dimensioning an object are disclosed herein and should greatly improve upon the inaccuracy problem described above. For instance, when a dimensioning application selects an incorrect object for dimensioning or when the estimated perimeter of the selected object is erroneous, a user can intervene by selecting the correct object for dimensioning or by modifying the estimated perimeter. This user interaction provides a way for the user to validate or modify selections made by the application, and thereby avoid inaccuracies that might arise if the process is fully automated.
The above description of illustrated embodiments, including what is described in the Abstract, is not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Although specific embodiments and examples are described herein for illustrative purposes, various equivalent modifications can be made without departing from the spirit and scope of the disclosure, as will be recognized by those skilled in the relevant art. The teachings provided herein of the various embodiments can be applied to other contexts, not necessarily the exemplary context of volume dimensioning generally described above. It will be understood by those skilled in the art that, although the embodiments described above and shown in the figures are generally directed to the context of volume dimensioning, applications for determining other values related to objects, such as parcels and packages, may also benefit from the concepts described herein. Further, although the embodiments described above and shown in the figures are directed to volume dimensioning using a portable electronic device, the concepts and the embodiments described herein are equally applicable to non-portable devices or to a system having multiple standalone devices coupled to one another.
These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.
Number | Date | Country | Kind |
---|---|---|---|
09368001 | Jan 2009 | EP | regional |
The present application claims the benefit of U.S. patent application Ser. No. 12/685,816 for Semi-Automatic Dimensioning with Imager on a Portable Device filed Jan. 12, 2010 (and published Aug. 12, 2010 as U.S. Patent Application Publication No. 2010/0202702), now U.S. Pat. No. 8,908,995, which claims the benefit of U.S. Patent Application No. 61/149,912 for Semi-Automatic Dimensioning with Imager on a Portable Device filed Feb. 4, 2009. Each of the foregoing patent applications, patent publication, and patent is hereby incorporated by reference in its entirety.
8320621 | McEldowney | Nov 2012 | B2 |
8322622 | Suzhou et al. | Dec 2012 | B2 |
8339462 | Stec et al. | Dec 2012 | B2 |
8350959 | Topliss et al. | Jan 2013 | B2 |
8351670 | Ijiri et al. | Jan 2013 | B2 |
8366005 | Kotlarsky et al. | Feb 2013 | B2 |
8371507 | Haggerty et al. | Feb 2013 | B2 |
8374498 | Pastore | Feb 2013 | B2 |
8376233 | Van Horn et al. | Feb 2013 | B2 |
8381976 | Mohideen et al. | Feb 2013 | B2 |
8381979 | Franz | Feb 2013 | B2 |
8390909 | Plesko | Mar 2013 | B2 |
8408464 | Zhu et al. | Apr 2013 | B2 |
8408468 | Horn et al. | Apr 2013 | B2 |
8408469 | Good | Apr 2013 | B2 |
8424768 | Rueblinger et al. | Apr 2013 | B2 |
8437539 | Komatsu et al. | May 2013 | B2 |
8441749 | Brown et al. | May 2013 | B2 |
8448863 | Xian et al. | May 2013 | B2 |
8457013 | Essinger et al. | Jun 2013 | B2 |
8459557 | Havens et al. | Jun 2013 | B2 |
8463079 | Ackley et al. | Jun 2013 | B2 |
8469272 | Kearney | Jun 2013 | B2 |
8474712 | Kearney et al. | Jul 2013 | B2 |
8479992 | Kotlarsky et al. | Jul 2013 | B2 |
8490877 | Kearney | Jul 2013 | B2 |
8517271 | Kotlarsky et al. | Aug 2013 | B2 |
8523076 | Good | Sep 2013 | B2 |
8528818 | Ehrhart et al. | Sep 2013 | B2 |
8544737 | Gomez et al. | Oct 2013 | B2 |
8548420 | Grunow et al. | Oct 2013 | B2 |
8550335 | Samek et al. | Oct 2013 | B2 |
8550354 | Gannon et al. | Oct 2013 | B2 |
8550357 | Kearney | Oct 2013 | B2 |
8556174 | Kosecki et al. | Oct 2013 | B2 |
8556176 | Van Horn et al. | Oct 2013 | B2 |
8556177 | Hussey et al. | Oct 2013 | B2 |
8559767 | Barber et al. | Oct 2013 | B2 |
8561895 | Gomez et al. | Oct 2013 | B2 |
8561903 | Sauerwein | Oct 2013 | B2 |
8561905 | Edmonds et al. | Oct 2013 | B2 |
8565107 | Pease et al. | Oct 2013 | B2 |
8570343 | Halstead | Oct 2013 | B2 |
8571307 | Li et al. | Oct 2013 | B2 |
8576390 | Nunnink | Nov 2013 | B1 |
8579200 | Samek et al. | Nov 2013 | B2 |
8583924 | Caballero et al. | Nov 2013 | B2 |
8584945 | Wang et al. | Nov 2013 | B2 |
8587595 | Wang | Nov 2013 | B2 |
8587697 | Hussey et al. | Nov 2013 | B2 |
8588869 | Sauerwein et al. | Nov 2013 | B2 |
8590789 | Nahill et al. | Nov 2013 | B2 |
8596539 | Havens et al. | Dec 2013 | B2 |
8596542 | Havens et al. | Dec 2013 | B2 |
8596543 | Havens et al. | Dec 2013 | B2 |
8599271 | Havens et al. | Dec 2013 | B2 |
8599957 | Peake et al. | Dec 2013 | B2 |
8600158 | Li et al. | Dec 2013 | B2 |
8600167 | Showering | Dec 2013 | B2 |
8602309 | Longacre et al. | Dec 2013 | B2 |
8608053 | Meier et al. | Dec 2013 | B2 |
8608071 | Liu et al. | Dec 2013 | B2 |
8611309 | Wang et al. | Dec 2013 | B2 |
8615487 | Gomez et al. | Dec 2013 | B2 |
8621123 | Caballero | Dec 2013 | B2 |
8622303 | Meier et al. | Jan 2014 | B2 |
8628013 | Ding | Jan 2014 | B2 |
8628015 | Wang et al. | Jan 2014 | B2 |
8628016 | Winegar | Jan 2014 | B2 |
8629926 | Wang | Jan 2014 | B2 |
8630491 | Longacre et al. | Jan 2014 | B2 |
8635309 | Berthiaume et al. | Jan 2014 | B2 |
8636200 | Kearney | Jan 2014 | B2 |
8636212 | Nahill et al. | Jan 2014 | B2 |
8636215 | Ding et al. | Jan 2014 | B2 |
8636224 | Wang | Jan 2014 | B2 |
8638806 | Wang et al. | Jan 2014 | B2 |
8640958 | Lu et al. | Feb 2014 | B2 |
8640960 | Wang et al. | Feb 2014 | B2 |
8643717 | Li et al. | Feb 2014 | B2 |
8646692 | Meier et al. | Feb 2014 | B2 |
8646694 | Wang et al. | Feb 2014 | B2 |
8657200 | Ren et al. | Feb 2014 | B2 |
8659397 | Vargo et al. | Feb 2014 | B2 |
8668149 | Good | Mar 2014 | B2 |
8678285 | Kearney | Mar 2014 | B2 |
8678286 | Smith et al. | Mar 2014 | B2 |
8682077 | Longacre | Mar 2014 | B1 |
D702237 | Oberpriller et al. | Apr 2014 | S |
8687282 | Feng et al. | Apr 2014 | B2 |
8692927 | Pease et al. | Apr 2014 | B2 |
8695880 | Bremer et al. | Apr 2014 | B2 |
8698949 | Grunow et al. | Apr 2014 | B2 |
8702000 | Barber et al. | Apr 2014 | B2 |
8717494 | Gannon | May 2014 | B2 |
8720783 | Biss et al. | May 2014 | B2 |
8723804 | Fletcher et al. | May 2014 | B2 |
8723904 | Marty et al. | May 2014 | B2 |
8727223 | Wang | May 2014 | B2 |
8740082 | Wilz | Jun 2014 | B2 |
8740085 | Furlong et al. | Jun 2014 | B2 |
8746563 | Hennick et al. | Jun 2014 | B2 |
8750445 | Peake et al. | Jun 2014 | B2 |
8752766 | Xian et al. | Jun 2014 | B2 |
8756059 | Braho et al. | Jun 2014 | B2 |
8757495 | Qu et al. | Jun 2014 | B2 |
8760563 | Koziol et al. | Jun 2014 | B2 |
8736909 | Reed et al. | Jul 2014 | B2 |
8777108 | Coyle | Jul 2014 | B2 |
8777109 | Oberpriller et al. | Jul 2014 | B2 |
8779898 | Havens et al. | Jul 2014 | B2 |
8781520 | Payne et al. | Jul 2014 | B2 |
8783573 | Havens et al. | Jul 2014 | B2 |
8789757 | Barten | Jul 2014 | B2 |
8789758 | Hawley et al. | Jul 2014 | B2 |
8789759 | Xian et al. | Jul 2014 | B2 |
8792688 | Unsworth | Jul 2014 | B2 |
8794520 | Wang et al. | Aug 2014 | B2 |
8794522 | Ehrhart | Aug 2014 | B2 |
8794525 | Amundsen et al. | Aug 2014 | B2 |
8794526 | Wang et al. | Aug 2014 | B2 |
8798367 | Ellis | Aug 2014 | B2 |
8807431 | Wang et al. | Aug 2014 | B2 |
8807432 | Van Horn et al. | Aug 2014 | B2 |
8810779 | Hilde | Aug 2014 | B1 |
8820630 | Qu et al. | Sep 2014 | B2 |
8822848 | Meagher | Sep 2014 | B2 |
8824692 | Sheerin et al. | Sep 2014 | B2 |
8824696 | Braho | Sep 2014 | B2 |
8842849 | Wahl et al. | Sep 2014 | B2 |
8844822 | Kotlarsky et al. | Sep 2014 | B2 |
8844823 | Fritz et al. | Sep 2014 | B2 |
8849019 | Li et al. | Sep 2014 | B2 |
D716285 | Chaney et al. | Oct 2014 | S |
8851383 | Yeakley et al. | Oct 2014 | B2 |
8854633 | Laffargue | Oct 2014 | B2 |
8866963 | Grunow et al. | Oct 2014 | B2 |
8868421 | Braho et al. | Oct 2014 | B2 |
8868519 | Maloy et al. | Oct 2014 | B2 |
8868802 | Barten | Oct 2014 | B2 |
8868803 | Bremer et al. | Oct 2014 | B2 |
8870074 | Gannon | Oct 2014 | B1 |
8879639 | Sauerwein | Nov 2014 | B2 |
8880426 | Smith | Nov 2014 | B2 |
8881983 | Havens et al. | Nov 2014 | B2 |
8881987 | Wang | Nov 2014 | B2 |
8897596 | Passmore et al. | Nov 2014 | B1 |
8903172 | Smith | Dec 2014 | B2 |
8908277 | Pesach et al. | Dec 2014 | B2 |
8908995 | Benos | Dec 2014 | B2 |
8910870 | Li et al. | Dec 2014 | B2 |
8910875 | Ren et al. | Dec 2014 | B2 |
8914290 | Hendrickson et al. | Dec 2014 | B2 |
8914788 | Pettinelli et al. | Dec 2014 | B2 |
8915439 | Feng et al. | Dec 2014 | B2 |
8915444 | Havens et al. | Dec 2014 | B2 |
8916789 | Woodburn | Dec 2014 | B2 |
8918250 | Hollifield | Dec 2014 | B2 |
8918564 | Caballero | Dec 2014 | B2 |
8925818 | Kosecki et al. | Jan 2015 | B2 |
8939374 | Jovanovski et al. | Jan 2015 | B2 |
8942480 | Ellis | Jan 2015 | B2 |
8944313 | Williams et al. | Feb 2015 | B2 |
8944327 | Meier et al. | Feb 2015 | B2 |
8944332 | Harding et al. | Feb 2015 | B2 |
8950678 | Germaine et al. | Feb 2015 | B2 |
D723560 | Zhou et al. | Mar 2015 | S |
8967468 | Gomez et al. | Mar 2015 | B2 |
8971346 | Sevier | Mar 2015 | B2 |
8976030 | Cunningham et al. | Mar 2015 | B2 |
8976368 | Akel et al. | Mar 2015 | B2 |
8978981 | Guan | Mar 2015 | B2 |
8978983 | Bremer et al. | Mar 2015 | B2 |
8978984 | Hennick et al. | Mar 2015 | B2 |
8985456 | Zhu et al. | Mar 2015 | B2 |
8985457 | Soule et al. | Mar 2015 | B2 |
8985459 | Kearney et al. | Mar 2015 | B2 |
8985461 | Gelay et al. | Mar 2015 | B2 |
8988578 | Showering | Mar 2015 | B2 |
8988590 | Gillet et al. | Mar 2015 | B2 |
8991704 | Hopper et al. | Mar 2015 | B2 |
8993974 | Goodwin | Mar 2015 | B2 |
8996194 | Davis et al. | Mar 2015 | B2 |
8996384 | Funyak et al. | Mar 2015 | B2 |
8998091 | Edmonds et al. | Apr 2015 | B2 |
9002641 | Showering | Apr 2015 | B2 |
9007368 | Laffargue et al. | Apr 2015 | B2 |
9010641 | Qu et al. | Apr 2015 | B2 |
9014441 | Truyen | Apr 2015 | B2 |
9015513 | Murawski et al. | Apr 2015 | B2 |
9016576 | Brady et al. | Apr 2015 | B2 |
D730357 | Fitch et al. | May 2015 | S |
9022288 | Nahill et al. | May 2015 | B2 |
9030964 | Essinger et al. | May 2015 | B2 |
9033240 | Smith et al. | May 2015 | B2 |
9033242 | Gillet et al. | May 2015 | B2 |
9036054 | Koziol et al. | May 2015 | B2 |
9037344 | Chamberlin | May 2015 | B2 |
9038911 | Xian et al. | May 2015 | B2 |
9038915 | Smith | May 2015 | B2 |
D730901 | Oberpriller et al. | Jun 2015 | S |
D730902 | Fitch et al. | Jun 2015 | S |
D733112 | Chaney et al. | Jun 2015 | S |
9047098 | Barten | Jun 2015 | B2 |
9047359 | Caballero et al. | Jun 2015 | B2 |
9047420 | Caballero | Jun 2015 | B2 |
9047525 | Barber | Jun 2015 | B2 |
9047531 | Showering et al. | Jun 2015 | B2 |
9049640 | Wang et al. | Jun 2015 | B2 |
9053055 | Caballero | Jun 2015 | B2 |
9053378 | Hou et al. | Jun 2015 | B1 |
9053380 | Xian et al. | Jun 2015 | B2 |
9057641 | Amundsen et al. | Jun 2015 | B2 |
9058526 | Powilleit | Jun 2015 | B2 |
9064165 | Havens et al. | Jun 2015 | B2 |
9064167 | Xian et al. | Jun 2015 | B2 |
9064168 | Todeschini et al. | Jun 2015 | B2 |
9064254 | Todeschini et al. | Jun 2015 | B2 |
9066032 | Wang | Jun 2015 | B2 |
9066087 | Shpunt | Jun 2015 | B2 |
9070032 | Corcoran | Jun 2015 | B2 |
D734339 | Zhou et al. | Jul 2015 | S |
D734751 | Oberpriller et al. | Jul 2015 | S |
9082023 | Feng et al. | Jul 2015 | B2 |
9082195 | Holeva et al. | Jul 2015 | B2 |
9142035 | Rotman et al. | Sep 2015 | B1 |
9233470 | Bradski et al. | Jan 2016 | B1 |
9273846 | Rossi et al. | Mar 2016 | B1 |
9299013 | Curlander et al. | Mar 2016 | B1 |
9366861 | Johnson | Jun 2016 | B1 |
9424749 | Reed et al. | Aug 2016 | B1 |
9486921 | Straszheim et al. | Nov 2016 | B1 |
9736459 | Mor et al. | Aug 2017 | B2 |
9828223 | Svensson et al. | Nov 2017 | B2 |
20010027995 | Patel et al. | Oct 2001 | A1 |
20010032879 | He et al. | Oct 2001 | A1 |
20020036765 | McCaffrey | Mar 2002 | A1 |
20020054289 | Thibault et al. | May 2002 | A1 |
20020067855 | Chiu et al. | Jun 2002 | A1 |
20020105639 | Roelke | Aug 2002 | A1 |
20020109835 | Goetz | Aug 2002 | A1 |
20020113946 | Kitaguchi et al. | Aug 2002 | A1 |
20020118874 | Chung | Aug 2002 | A1 |
20020158873 | Williamson | Oct 2002 | A1 |
20020167677 | Okada et al. | Nov 2002 | A1 |
20020179708 | Zhu et al. | Dec 2002 | A1 |
20020196534 | Lizotte et al. | Dec 2002 | A1 |
20030038179 | Tsikos et al. | Feb 2003 | A1 |
20030053513 | Vatan et al. | Mar 2003 | A1 |
20030063086 | Baumberg | Apr 2003 | A1 |
20030091227 | Chang et al. | May 2003 | A1 |
20030156756 | Gokturk et al. | Aug 2003 | A1 |
20030197138 | Pease et al. | Oct 2003 | A1 |
20030225712 | Cooper et al. | Dec 2003 | A1 |
20030235331 | Kawaike et al. | Dec 2003 | A1 |
20040019274 | Galloway et al. | Jan 2004 | A1 |
20040024754 | Mane et al. | Feb 2004 | A1 |
20040066329 | Leitfuss et al. | Apr 2004 | A1 |
20040073359 | Ichijo et al. | Apr 2004 | A1 |
20040083025 | Yamanouchi et al. | Apr 2004 | A1 |
20040089482 | Ramsden et al. | May 2004 | A1 |
20040098146 | Katae et al. | May 2004 | A1 |
20040105580 | Hager et al. | Jun 2004 | A1 |
20040118928 | Patel et al. | Jun 2004 | A1 |
20040122779 | Stickler et al. | Jun 2004 | A1 |
20040132297 | Baba et al. | Jul 2004 | A1 |
20040155975 | Hart et al. | Aug 2004 | A1 |
20040165090 | Ning | Aug 2004 | A1 |
20040184041 | Schopp | Sep 2004 | A1 |
20040211836 | Patel et al. | Oct 2004 | A1 |
20040214623 | Takahashi et al. | Oct 2004 | A1 |
20040008259 | Gokturk et al. | Nov 2004 | A1 |
20040233461 | Armstrong | Nov 2004 | A1 |
20040258353 | Gluckstad et al. | Dec 2004 | A1 |
20050006477 | Patel | Jan 2005 | A1 |
20050117215 | Lange | Jun 2005 | A1 |
20050128193 | Popescu et al. | Jun 2005 | A1 |
20050128196 | Popescu et al. | Jun 2005 | A1 |
20050168488 | Montague | Aug 2005 | A1 |
20050211782 | Martin | Sep 2005 | A1 |
20050240317 | Kienzle-Lietl | Oct 2005 | A1 |
20050257748 | Kriesel et al. | Nov 2005 | A1 |
20050264867 | Cho et al. | Dec 2005 | A1 |
20060047704 | Gopalakrishnan | Mar 2006 | A1 |
20060078226 | Zhou | Apr 2006 | A1 |
20060108266 | Bowers et al. | May 2006 | A1 |
20060109105 | Varner et al. | May 2006 | A1 |
20060112023 | Horhann | May 2006 | A1 |
20060151604 | Zhu et al. | Jul 2006 | A1 |
20060159307 | Anderson et al. | Jul 2006 | A1 |
20060159344 | Shao et al. | Jul 2006 | A1 |
20060232681 | Okada | Oct 2006 | A1 |
20060255150 | Longacre | Nov 2006 | A1 |
20060269165 | Viswanathan | Nov 2006 | A1 |
20060276709 | Khamene et al. | Dec 2006 | A1 |
20060291719 | Ikeda et al. | Dec 2006 | A1 |
20070003154 | Sun et al. | Jan 2007 | A1 |
20070025612 | Iwasaki et al. | Feb 2007 | A1 |
20070031064 | Zhao et al. | Feb 2007 | A1 |
20070063048 | Havens et al. | Mar 2007 | A1 |
20070116357 | Dewaele | May 2007 | A1 |
20070127022 | Cohen et al. | Jun 2007 | A1 |
20070143082 | Degnan | Jun 2007 | A1 |
20070153293 | Gruhlke et al. | Jul 2007 | A1 |
20070165013 | Goulanian et al. | Jul 2007 | A1 |
20070171220 | Kriveshko | Jul 2007 | A1 |
20070177011 | Lewin et al. | Aug 2007 | A1 |
20070181685 | Zhu et al. | Aug 2007 | A1 |
20070237356 | Dwinell et al. | Oct 2007 | A1 |
20070291031 | Konev et al. | Dec 2007 | A1 |
20070299338 | Stevick et al. | Dec 2007 | A1 |
20080013793 | Hillis et al. | Jan 2008 | A1 |
20080035390 | Wurz | Feb 2008 | A1 |
20080047760 | Georgitsis | Feb 2008 | A1 |
20080050042 | Zhang et al. | Feb 2008 | A1 |
20080056536 | Hildreth et al. | Mar 2008 | A1 |
20080062164 | Bassi et al. | Mar 2008 | A1 |
20080065509 | Williams | Mar 2008 | A1 |
20080077265 | Boyden | Mar 2008 | A1 |
20080079955 | Storm | Apr 2008 | A1 |
20080164074 | Wurz | Jun 2008 | A1 |
20080204476 | Montague | Aug 2008 | A1 |
20080212168 | Olmstead et al. | Sep 2008 | A1 |
20080247635 | Davis et al. | Oct 2008 | A1 |
20080273191 | Kim et al. | Nov 2008 | A1 |
20080273210 | Hilde | Nov 2008 | A1 |
20080278790 | Boesser et al. | Nov 2008 | A1 |
20090046296 | Kilpatrick et al. | Feb 2009 | A1 |
20090059004 | Bochicchio | Mar 2009 | A1 |
20090095047 | Patel et al. | Apr 2009 | A1 |
20090114818 | Casares et al. | May 2009 | A1 |
20090134221 | Zhu et al. | May 2009 | A1 |
20090161090 | Campbell et al. | Jun 2009 | A1 |
20090189858 | Lev et al. | Jul 2009 | A1 |
20090195790 | Zhu et al. | Aug 2009 | A1 |
20090225333 | Bendall et al. | Sep 2009 | A1 |
20090237411 | Gossweiler et al. | Sep 2009 | A1 |
20090268023 | Hsieh | Oct 2009 | A1 |
20090272724 | Gubler | Nov 2009 | A1 |
20090273770 | Bauhahn et al. | Nov 2009 | A1 |
20090313948 | Buckley et al. | Dec 2009 | A1 |
20090318815 | Barnes et al. | Dec 2009 | A1 |
20090323084 | Dunn et al. | Dec 2009 | A1 |
20090323121 | Valkenburg | Dec 2009 | A1 |
20100035637 | Varanasi et al. | Feb 2010 | A1 |
20100060604 | Zwart et al. | Mar 2010 | A1 |
20100091104 | Sprigle | Apr 2010 | A1 |
20100113153 | Yen et al. | May 2010 | A1 |
20100118200 | Gelman et al. | May 2010 | A1 |
20100128109 | Banks | May 2010 | A1 |
20100161170 | Siris | Jun 2010 | A1 |
20100171740 | Andersen et al. | Jul 2010 | A1 |
20100172567 | Prokoski | Jul 2010 | A1 |
20100177076 | Essinger et al. | Jul 2010 | A1 |
20100177080 | Essinger et al. | Jul 2010 | A1 |
20100177707 | Essinger et al. | Jul 2010 | A1 |
20100177749 | Essinger et al. | Jul 2010 | A1 |
20100202702 | Benos | Aug 2010 | A1 |
20100208039 | Stettner | Aug 2010 | A1 |
20100211355 | Horst et al. | Aug 2010 | A1 |
20100217678 | Goncalves | Aug 2010 | A1 |
20100220849 | Colbert et al. | Sep 2010 | A1 |
20100220894 | Ackley et al. | Sep 2010 | A1 |
20100223276 | Al-Shameri et al. | Sep 2010 | A1 |
20100245850 | Lee et al. | Sep 2010 | A1 |
20100254611 | Amz | Oct 2010 | A1 |
20100274728 | Kugelman | Oct 2010 | A1 |
20100303336 | Abraham | Dec 2010 | A1 |
20100315413 | Izadi et al. | Dec 2010 | A1 |
20100321482 | Cleveland | Dec 2010 | A1 |
20110019155 | Daniel et al. | Jan 2011 | A1 |
20110040192 | Brenner et al. | Feb 2011 | A1 |
20110040407 | Lim | Feb 2011 | A1 |
20110043609 | Choi et al. | Feb 2011 | A1 |
20110075936 | Deaver | Mar 2011 | A1 |
20110081044 | Peeper | Apr 2011 | A1 |
20110099474 | Grossman et al. | Apr 2011 | A1 |
20110169999 | Grunow et al. | Jul 2011 | A1 |
20110180695 | Li et al. | Jul 2011 | A1 |
20110188054 | Petronius et al. | Aug 2011 | A1 |
20110188741 | Sones et al. | Aug 2011 | A1 |
20110202554 | Powilleit et al. | Aug 2011 | A1 |
20110234389 | Mellin | Sep 2011 | A1 |
20110235854 | Berger et al. | Sep 2011 | A1 |
20110249864 | Venkatesan et al. | Oct 2011 | A1 |
20110254840 | Halstead | Oct 2011 | A1 |
20110260965 | Kim et al. | Oct 2011 | A1 |
20110279916 | Brown et al. | Nov 2011 | A1 |
20110286007 | Pangrazio et al. | Nov 2011 | A1 |
20110286628 | Goncalves et al. | Nov 2011 | A1 |
20110288818 | Thierman | Nov 2011 | A1 |
20110297590 | Ackley et al. | Dec 2011 | A1 |
20110301994 | Tieman | Dec 2011 | A1 |
20110303748 | Lemma et al. | Dec 2011 | A1 |
20110310227 | Konertz | Dec 2011 | A1 |
20120024952 | Chen | Feb 2012 | A1 |
20120056982 | Katz et al. | Mar 2012 | A1 |
20120057345 | Kuchibhotla | Mar 2012 | A1 |
20120067955 | Rowe | Mar 2012 | A1 |
20120074227 | Ferren et al. | Mar 2012 | A1 |
20120081714 | Pangrazio et al. | Apr 2012 | A1 |
20120111946 | Golant | May 2012 | A1 |
20120113223 | Hilliges et al. | May 2012 | A1 |
20120126000 | Kunzig et al. | May 2012 | A1 |
20120140300 | Freeman | Jun 2012 | A1 |
20120168509 | Nunnink et al. | Jul 2012 | A1 |
20120168512 | Kotlarsky et al. | Jul 2012 | A1 |
20120179665 | Baarman et al. | Jul 2012 | A1 |
20120185094 | Rosenstein et al. | Jul 2012 | A1 |
20120190386 | Anderson | Jul 2012 | A1 |
20120193423 | Samek | Aug 2012 | A1 |
20120197464 | Wang et al. | Aug 2012 | A1 |
20120201288 | Kolze et al. | Aug 2012 | A1 |
20120203647 | Smith | Aug 2012 | A1 |
20120218436 | Rodriguez et al. | Sep 2012 | A1 |
20120223141 | Good et al. | Sep 2012 | A1 |
20120224026 | Bayer et al. | Sep 2012 | A1 |
20120224060 | Gurevich et al. | Sep 2012 | A1 |
20120236212 | Itoh et al. | Sep 2012 | A1 |
20120236288 | Stanley | Sep 2012 | A1 |
20120242852 | Hayward et al. | Sep 2012 | A1 |
20120113250 | Farlotti et al. | Oct 2012 | A1 |
20120256901 | Bendall | Oct 2012 | A1 |
20120262558 | Boger et al. | Oct 2012 | A1 |
20120280908 | Rhoads et al. | Nov 2012 | A1 |
20120282905 | Owen | Nov 2012 | A1 |
20120282911 | Davis et al. | Nov 2012 | A1 |
20120284012 | Rodriguez et al. | Nov 2012 | A1 |
20120284122 | Brandis | Nov 2012 | A1 |
20120284339 | Rodriguez | Nov 2012 | A1 |
20120284593 | Rodriguez | Nov 2012 | A1 |
20120293610 | Doepke et al. | Nov 2012 | A1 |
20120293625 | Schneider et al. | Nov 2012 | A1 |
20120294549 | Doepke | Nov 2012 | A1 |
20120299961 | Ramkumar et al. | Nov 2012 | A1 |
20120300991 | Mikio | Nov 2012 | A1 |
20120313848 | Galor et al. | Dec 2012 | A1 |
20120314030 | Datta | Dec 2012 | A1 |
20120314058 | Bendall et al. | Dec 2012 | A1 |
20120316820 | Nakazato et al. | Dec 2012 | A1 |
20130019278 | Sun | Jan 2013 | A1 |
20130038881 | Pesach et al. | Feb 2013 | A1 |
20130038941 | Pesach et al. | Feb 2013 | A1 |
20130043312 | Van Horn | Feb 2013 | A1 |
20130050426 | Sarmast et al. | Feb 2013 | A1 |
20130075168 | Amundsen et al. | Mar 2013 | A1 |
20130076857 | Kurashige et al. | Mar 2013 | A1 |
20130093895 | Palmer et al. | Apr 2013 | A1 |
20130094069 | Lee et al. | Apr 2013 | A1 |
20130101158 | Lloyd et al. | Apr 2013 | A1 |
20130156267 | Muraoka et al. | Jun 2013 | A1 |
20130175341 | Kearney et al. | Jul 2013 | A1 |
20130175343 | Good | Jul 2013 | A1 |
20130200150 | Reynolds et al. | Aug 2013 | A1 |
20130208164 | Cazier et al. | Aug 2013 | A1 |
20130211790 | Loveland et al. | Aug 2013 | A1 |
20130222592 | Gieseke | Aug 2013 | A1 |
20130223673 | Davis et al. | Aug 2013 | A1 |
20130257744 | Daghigh et al. | Oct 2013 | A1 |
20130257759 | Daghigh | Oct 2013 | A1 |
20130270346 | Xian et al. | Oct 2013 | A1 |
20130287258 | Kearney | Oct 2013 | A1 |
20130291998 | Konnerth | Nov 2013 | A1 |
20130292475 | Kotlarsky et al. | Nov 2013 | A1 |
20130292477 | Hennick et al. | Nov 2013 | A1 |
20130293539 | Hunt et al. | Nov 2013 | A1 |
20130293540 | Laffargue et al. | Nov 2013 | A1 |
20130306728 | Thuries et al. | Nov 2013 | A1 |
20130306731 | Pedraro | Nov 2013 | A1 |
20130307964 | Bremer et al. | Nov 2013 | A1 |
20130308013 | Li et al. | Nov 2013 | A1 |
20130308625 | Corcoran | Nov 2013 | A1 |
20130313324 | Koziol et al. | Nov 2013 | A1 |
20130313325 | Wilz et al. | Nov 2013 | A1 |
20130317642 | Asaria | Nov 2013 | A1 |
20130329012 | Bartos | Dec 2013 | A1 |
20130329013 | Metois et al. | Dec 2013 | A1 |
20130342342 | Sabre et al. | Dec 2013 | A1 |
20130342717 | Havens et al. | Dec 2013 | A1 |
20140001267 | Giordano et al. | Jan 2014 | A1 |
20140002828 | Laffargue et al. | Jan 2014 | A1 |
20140008439 | Wang | Jan 2014 | A1 |
20140009586 | McNamer et al. | Jan 2014 | A1 |
20140019005 | Lee et al. | Jan 2014 | A1 |
20140021259 | Moed et al. | Jan 2014 | A1 |
20140025584 | Liu et al. | Jan 2014 | A1 |
20140031665 | Pinto et al. | Jan 2014 | A1 |
20140034731 | Gao et al. | Feb 2014 | A1 |
20140034734 | Sauerwein | Feb 2014 | A1 |
20140036848 | Pease et al. | Feb 2014 | A1 |
20140039674 | Motoyama et al. | Feb 2014 | A1 |
20140039693 | Havens et al. | Feb 2014 | A1 |
20140042814 | Kather et al. | Feb 2014 | A1 |
20140049120 | Kohtz et al. | Feb 2014 | A1 |
20140049635 | Laffargue et al. | Feb 2014 | A1 |
20140058612 | Wong et al. | Feb 2014 | A1 |
20140061306 | Wu et al. | Mar 2014 | A1 |
20140062709 | Hyer | Mar 2014 | A1 |
20140063289 | Hussey et al. | Mar 2014 | A1 |
20140066136 | Sauerwein et al. | Mar 2014 | A1 |
20140067104 | Osterhout | Mar 2014 | A1 |
20140067692 | Ye et al. | Mar 2014 | A1 |
20140070005 | Nahill et al. | Mar 2014 | A1 |
20140071430 | Hansen et al. | Mar 2014 | A1 |
20140071840 | Venancio | Mar 2014 | A1 |
20140074746 | Wang | Mar 2014 | A1 |
20140076974 | Havens et al. | Mar 2014 | A1 |
20140078341 | Havens et al. | Mar 2014 | A1 |
20140078342 | Li et al. | Mar 2014 | A1 |
20140078345 | Showering | Mar 2014 | A1 |
20140079297 | Tadayon et al. | Mar 2014 | A1 |
20140091147 | Evans et al. | Apr 2014 | A1 |
20140097238 | Ghazizadeh | Apr 2014 | A1 |
20140097252 | He et al. | Apr 2014 | A1 |
20140098091 | Hori | Apr 2014 | A1 |
20140098243 | Ghazizadeh | Apr 2014 | A1 |
20140098792 | Wang et al. | Apr 2014 | A1 |
20140100774 | Showering | Apr 2014 | A1 |
20140100813 | Showering | Apr 2014 | A1 |
20140103115 | Meier et al. | Apr 2014 | A1 |
20140104413 | McCloskey et al. | Apr 2014 | A1 |
20140104414 | McCloskey et al. | Apr 2014 | A1 |
20140104416 | Li et al. | Apr 2014 | A1 |
20140104451 | Todeschini et al. | Apr 2014 | A1 |
20140104664 | Lee | Apr 2014 | A1 |
20140106594 | Skvoretz | Apr 2014 | A1 |
20140106725 | Sauerwein | Apr 2014 | A1 |
20140108010 | Maltseff et al. | Apr 2014 | A1 |
20140108402 | Gomez et al. | Apr 2014 | A1 |
20140108682 | Caballero | Apr 2014 | A1 |
20140110485 | Toa et al. | Apr 2014 | A1 |
20140114530 | Fitch et al. | Apr 2014 | A1 |
20140121438 | Kearney | May 2014 | A1 |
20140121445 | Ding et al. | May 2014 | A1 |
20140124577 | Wang et al. | May 2014 | A1 |
20140124579 | Ding | May 2014 | A1 |
20140125842 | Winegar | May 2014 | A1 |
20140125853 | Wang | May 2014 | A1 |
20140125999 | Longacre et al. | May 2014 | A1 |
20140129378 | Richardson | May 2014 | A1 |
20140131441 | Nahill et al. | May 2014 | A1 |
20140131443 | Smith | May 2014 | A1 |
20140131444 | Wang | May 2014 | A1 |
20140131448 | Xian et al. | May 2014 | A1 |
20140133379 | Wang et al. | May 2014 | A1 |
20140135984 | Hirata | May 2014 | A1 |
20140136208 | Maltseff et al. | May 2014 | A1 |
20140139654 | Takahashi | May 2014 | A1 |
20140140585 | Wang | May 2014 | A1 |
20140142398 | Patil et al. | May 2014 | A1 |
20140151453 | Meier et al. | Jun 2014 | A1 |
20140152882 | Samek et al. | Jun 2014 | A1 |
20140152975 | Ko | Jun 2014 | A1 |
20140158468 | Adami | Jun 2014 | A1 |
20140158770 | Sevier et al. | Jun 2014 | A1 |
20140159869 | Zumsteg et al. | Jun 2014 | A1 |
20140166755 | Liu et al. | Jun 2014 | A1 |
20140166757 | Smith | Jun 2014 | A1 |
20140166759 | Liu et al. | Jun 2014 | A1 |
20140168380 | Heidemann et al. | Jun 2014 | A1 |
20140168787 | Wang et al. | Jun 2014 | A1 |
20140175165 | Havens et al. | Jun 2014 | A1 |
20140175172 | Jovanovski et al. | Jun 2014 | A1 |
20140177931 | Kocherscheidt et al. | Jun 2014 | A1 |
20140191644 | Chaney | Jul 2014 | A1 |
20140191913 | Ge et al. | Jul 2014 | A1 |
20140192187 | Atwell et al. | Jul 2014 | A1 |
20140192551 | Masaki | Jul 2014 | A1 |
20140197238 | Lui et al. | Jul 2014 | A1 |
20140197239 | Havens et al. | Jul 2014 | A1 |
20140197304 | Feng et al. | Jul 2014 | A1 |
20140201126 | Zadeh et al. | Jul 2014 | A1 |
20140203087 | Smith et al. | Jul 2014 | A1 |
20140204268 | Grunow et al. | Jul 2014 | A1 |
20140205150 | Ogawa | Jul 2014 | A1 |
20140214631 | Hansen | Jul 2014 | A1 |
20140217166 | Berthiaume et al. | Aug 2014 | A1 |
20140217180 | Liu | Aug 2014 | A1 |
20140225918 | Mittal et al. | Aug 2014 | A1 |
20140225985 | Klusza et al. | Aug 2014 | A1 |
20140231500 | Ehrhart et al. | Aug 2014 | A1 |
20140232930 | Anderson | Aug 2014 | A1 |
20140240454 | Lee | Aug 2014 | A1 |
20140247279 | Nicholas et al. | Sep 2014 | A1 |
20140247280 | Nicholas et al. | Sep 2014 | A1 |
20140247315 | Marty et al. | Sep 2014 | A1 |
20140263493 | Amurgis et al. | Sep 2014 | A1 |
20140263645 | Smith et al. | Sep 2014 | A1 |
20140267609 | Laffargue | Sep 2014 | A1 |
20140268093 | Tohme et al. | Sep 2014 | A1 |
20140270196 | Braho et al. | Sep 2014 | A1 |
20140270229 | Braho | Sep 2014 | A1 |
20140270361 | Amma et al. | Sep 2014 | A1 |
20140278387 | DiGregorio | Sep 2014 | A1 |
20140282210 | Bianconi | Sep 2014 | A1 |
20140284384 | Lu et al. | Sep 2014 | A1 |
20140288933 | Braho et al. | Sep 2014 | A1 |
20140297058 | Barker et al. | Oct 2014 | A1 |
20140299665 | Barber et al. | Oct 2014 | A1 |
20140306833 | Ricci | Oct 2014 | A1 |
20140307855 | Withagen et al. | Oct 2014 | A1 |
20140312121 | Lu et al. | Oct 2014 | A1 |
20140313527 | Askan | Oct 2014 | A1 |
20140319219 | Liu et al. | Oct 2014 | A1 |
20140319220 | Coyle | Oct 2014 | A1 |
20140319221 | Oberpriller et al. | Oct 2014 | A1 |
20140320408 | Zagorsek et al. | Oct 2014 | A1 |
20140326787 | Barten | Nov 2014 | A1 |
20140332590 | Wang et al. | Nov 2014 | A1 |
20140344943 | Todeschini et al. | Nov 2014 | A1 |
20140346233 | Liu et al. | Nov 2014 | A1 |
20140347533 | Ovsiannikov et al. | Nov 2014 | A1 |
20140350710 | Gopalkrishnan et al. | Nov 2014 | A1 |
20140351317 | Smith et al. | Nov 2014 | A1 |
20140353373 | Van Horn et al. | Dec 2014 | A1 |
20140361073 | Qu et al. | Dec 2014 | A1 |
20140361082 | Xian et al. | Dec 2014 | A1 |
20140362184 | Jovanovski et al. | Dec 2014 | A1 |
20140363015 | Braho | Dec 2014 | A1 |
20140369511 | Sheerin et al. | Dec 2014 | A1 |
20140374483 | Lu | Dec 2014 | A1 |
20140374485 | Xian et al. | Dec 2014 | A1 |
20140379613 | Nishitani et al. | Dec 2014 | A1 |
20150001301 | Ouyang | Jan 2015 | A1 |
20150001304 | Todeschini | Jan 2015 | A1 |
20150003673 | Fletcher | Jan 2015 | A1 |
20150009100 | Haneda et al. | Jan 2015 | A1 |
20150009301 | Ribnick et al. | Jan 2015 | A1 |
20150009338 | Laffargue et al. | Jan 2015 | A1 |
20150009610 | London et al. | Jan 2015 | A1 |
20150014416 | Kotlarsky et al. | Jan 2015 | A1 |
20150021397 | Rueblinger et al. | Jan 2015 | A1 |
20150028102 | Ren et al. | Jan 2015 | A1 |
20150028103 | Jiang | Jan 2015 | A1 |
20150028104 | Ma et al. | Jan 2015 | A1 |
20150029002 | Yeakley et al. | Jan 2015 | A1 |
20150032709 | Maloy et al. | Jan 2015 | A1 |
20150036876 | Marrion et al. | Feb 2015 | A1 |
20150039309 | Braho et al. | Feb 2015 | A1 |
20150040378 | Saber et al. | Feb 2015 | A1 |
20150048168 | Fritz et al. | Feb 2015 | A1 |
20150049347 | Laffargue et al. | Feb 2015 | A1 |
20150051992 | Smith | Feb 2015 | A1 |
20150053766 | Havens et al. | Feb 2015 | A1 |
20150053768 | Wang et al. | Feb 2015 | A1 |
20150053769 | Thuries et al. | Feb 2015 | A1 |
20150062366 | Liu et al. | Mar 2015 | A1 |
20150062369 | Gehring et al. | Mar 2015 | A1 |
20150063215 | Wang | Mar 2015 | A1 |
20150063676 | Lloyd et al. | Mar 2015 | A1 |
20150069130 | Gannon | Mar 2015 | A1 |
20150070158 | Hayasaka | Mar 2015 | A1 |
20150071818 | Todeschini | Mar 2015 | A1 |
20150083800 | Li et al. | Mar 2015 | A1 |
20150086114 | Todeschini | Mar 2015 | A1 |
20150088522 | Hendrickson et al. | Mar 2015 | A1 |
20150096872 | Woodburn | Apr 2015 | A1 |
20150099557 | Pettinelli et al. | Apr 2015 | A1 |
20150100196 | Hollifield | Apr 2015 | A1 |
20150102109 | Huck | Apr 2015 | A1 |
20150115035 | Meier et al. | Apr 2015 | A1 |
20150116498 | Vartiainen et al. | Apr 2015 | A1 |
20150117749 | Chen et al. | Apr 2015 | A1 |
20150127791 | Kosecki et al. | May 2015 | A1 |
20150128116 | Chen et al. | May 2015 | A1 |
20150129659 | Feng et al. | May 2015 | A1 |
20150133047 | Smith et al. | May 2015 | A1 |
20150134470 | Hejl et al. | May 2015 | A1 |
20150136851 | Harding et al. | May 2015 | A1 |
20150136854 | Lu et al. | May 2015 | A1 |
20150142492 | Kumar | May 2015 | A1 |
20150144692 | Hejl | May 2015 | A1 |
20150144698 | Teng et al. | May 2015 | A1 |
20150144701 | Xian et al. | May 2015 | A1 |
20150149946 | Benos et al. | May 2015 | A1 |
20150161429 | Xian | Jun 2015 | A1 |
20150163474 | You | Jun 2015 | A1 |
20150169925 | Chang et al. | Jun 2015 | A1 |
20150169929 | Williams et al. | Jun 2015 | A1 |
20150178900 | Kim et al. | Jun 2015 | A1 |
20150186703 | Chen et al. | Jul 2015 | A1 |
20150193644 | Kearney et al. | Jul 2015 | A1 |
20150193645 | Colavito et al. | Jul 2015 | A1 |
20150199957 | Funyak et al. | Jul 2015 | A1 |
20150204662 | Kobayashi et al. | Jul 2015 | A1 |
20150204671 | Showering | Jul 2015 | A1 |
20150213647 | Laffargue et al. | Jul 2015 | A1 |
20150219748 | Hyatt | Aug 2015 | A1 |
20150229838 | Hakim et al. | Aug 2015 | A1 |
20150253469 | Le Gros et al. | Sep 2015 | A1 |
20150260830 | Ghosh et al. | Sep 2015 | A1 |
20150269403 | Lei et al. | Sep 2015 | A1 |
20150201181 | Herschbach | Oct 2015 | A1 |
20150276379 | Ni | Oct 2015 | A1 |
20150308816 | Laffargue et al. | Oct 2015 | A1 |
20150316368 | Moench et al. | Nov 2015 | A1 |
20150325036 | Lee | Nov 2015 | A1 |
20150332463 | Galera et al. | Nov 2015 | A1 |
20150355470 | Herschbach | Dec 2015 | A1 |
20160169665 | Deschenes et al. | Jan 2016 | A1 |
20160048725 | Holz et al. | Feb 2016 | A1 |
20160063429 | Varley et al. | Mar 2016 | A1 |
20160065912 | Peterson | Mar 2016 | A1 |
20160090283 | Svensson et al. | Mar 2016 | A1 |
20160090284 | Svensson et al. | Mar 2016 | A1 |
20160094016 | Beach et al. | Mar 2016 | A1 |
20160138247 | Conway et al. | May 2016 | A1 |
20160138248 | Conway et al. | May 2016 | A1 |
20160138249 | Svensson et al. | May 2016 | A1 |
20160164261 | Warren | Jun 2016 | A1 |
20160178915 | Mor et al. | Jun 2016 | A1 |
20160187186 | Coleman et al. | Jun 2016 | A1 |
20160187187 | Coleman et al. | Jun 2016 | A1 |
20160187210 | Coleman et al. | Jun 2016 | A1 |
20160191801 | Sivan | Jun 2016 | A1 |
20160202478 | Masson et al. | Jul 2016 | A1 |
20160203641 | Bostick et al. | Jul 2016 | A1 |
20160223474 | Tang et al. | Aug 2016 | A1 |
20170115490 | Hsieh et al. | Apr 2017 | A1 |
20170115497 | Chen et al. | Apr 2017 | A1 |
20170121158 | Wong | May 2017 | A1 |
20170139213 | Schmidtlin | May 2017 | A1 |
20107018294 | Hardy et al. | Jun 2017 | |
20170309108 | Sadovsky et al. | Oct 2017 | A1 |
20170336870 | Everett et al. | Nov 2017 | A1 |
Number | Date | Country |
---|---|---|
2004212587 | Apr 2005 | AU |
201139117 | Oct 2008 | CN |
3335760 | Apr 1985 | DE |
10210813 | Oct 2003 | DE |
102007037282 | Mar 2008 | DE |
3007096 | Apr 2016 | EP |
1111435 | Jun 2001 | EP |
1443312 | Aug 2004 | EP |
1112483 | May 2006 | EP |
1232480 | May 2006 | EP |
2013117 | Jan 2009 | EP |
2286932 | Feb 2011 | EP |
2372648 | Oct 2011 | EP |
2381421 | Oct 2011 | EP |
2533009 | Dec 2012 | EP |
2562715 | Feb 2013 | EP |
2722656 | Apr 2014 | EP |
2779027 | Sep 2014 | EP |
2833323 | Feb 2015 | EP |
2843590 | Mar 2015 | EP |
2845170 | Mar 2015 | EP |
2966595 | Jan 2016 | EP |
3006893 | Apr 2016 | EP |
3012601 | Apr 2016 | EP |
2503978 | Jan 2014 | GB |
2525053 | Oct 2015 | GB |
2531928 | May 2016 | GB |
H04129902 | Apr 1992 | JP |
200696457 | Apr 2006 | JP |
2007084162 | Apr 2007 | JP |
2008210276 | Sep 2008 | JP |
2014210646 | Nov 2014 | JP |
2015174705 | Oct 2015 | JP |
20100020115 | Feb 2010 | KR |
20110013200 | Feb 2011 | KR |
20110117020 | Oct 2011 | KR |
20120028109 | Mar 2012 | KR |
9640452 | Dec 1996 | WO |
0077726 | Dec 2000 | WO |
0114836 | Mar 2001 | WO |
2006095110 | Sep 2006 | WO |
2007015059 | Feb 2007 | WO |
200712554 | Nov 2007 | WO |
2011017241 | Feb 2011 | WO |
2012175731 | Dec 2012 | WO |
2013021157 | Feb 2013 | WO |
2013033442 | Mar 2013 | WO |
2013163789 | Nov 2013 | WO |
2013166368 | Nov 2013 | WO |
2013173985 | Nov 2013 | WO |
20130184340 | Dec 2013 | WO |
2014019130 | Feb 2014 | WO |
2014102341 | Jul 2014 | WO |
2014110495 | Jul 2014 | WO |
2014149702 | Sep 2014 | WO |
2014151746 | Sep 2014 | WO |
2015006865 | Jan 2015 | WO |
2016020038 | Feb 2016 | WO |
2016061699 | Apr 2016 | WO |
Entry |
---|
U.S. Appl. No. 14/519,179 for Dimensioning System With Multipath Interference Mitigation filed Oct. 21, 2014 (Thuries et al.); 30 pages. |
U.S. Appl. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014, (Ackley et al.); 39 pages. |
U.S. Appl. No. 14/453,019 for Dimensioning System With Guided Alignment, filed Aug. 6, 2014 (Li et al.); 31 pages. |
U.S. Appl. No. 14/452,697 for Interactive Indicia Reader, filed Aug. 6, 2014 (Todeschini); 32 pages. |
U.S. Appl. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.); 36 pages. |
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages. |
U.S. Appl. No. 14/513,808 for Identifying Inventory Items in a Storage Facility filed Oct. 14, 2014 (Singel et al.); 51 pages. |
U.S. Appl. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.); 22 pages. |
U.S. Appl. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.); 21 pages. |
U.S. Appl. No. 14/483,056 for Variable Depth of Field Barcode Scanner filed Sep. 10, 2014 (McCloskey et al.); 29 pages. |
U.S. Appl. No. 14/531,154 for Directing an Inspector Through an Inspection filed Nov. 3, 2014 (Miller et al.); 53 pages. |
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages. |
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages. |
U.S. Appl. No. 14/340,627 for an Axially Reinforced Flexible Scan Element, filed Jul. 25, 2014 (Reublinger et al.); 41 pages. |
U.S. Appl. No. 14/676,327 for Device Management Proxy for Secure Devices filed Apr. 1, 2015 (Yeakley et al.); 50 pages. |
U.S. Appl. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering); 31 pages. |
U.S. Appl. No. 14/327,827 for a Mobile-Phone Adapter for Electronic Transactions, filed Jul. 10, 2014 (Hejl); 25 pages. |
U.S. Appl. No. 14/334,934 for a System and Method for Indicia Verification, filed Jul. 18, 2014 (Hejl); 38 pages. |
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al); 16 pages. |
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages. |
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages. |
U.S. Appl. No. 14/619,093 for Methods for Training a Speech Recognition System filed Feb. 11, 2015 (Pecorari); 35 pages. |
U.S. Appl. No. 29/524,186 for Scanner filed Apr. 17, 2015 (Zhou et al.); 17 pages. |
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages. |
U.S. Appl. No. 14/614,706 for Device for Supporting an Electronic Tool on a User's Hand filed Feb. 5, 2015 (Oberpriller et al.); 33 pages. |
U.S. Appl. No. 14/628,708 for Device, System, and Method for Determining the Status of Checkout Lanes filed Feb. 23, 2015 (Todeschini); 37 pages. |
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages. |
U.S. Appl. No. 14/529,563 for Adaptable Interface for a Mobile Computing Device filed Oct. 31, 2014 (Schoon et al.); 36 pages. |
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages. |
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages. |
U.S. Appl. No. 14/695,364 for Medication Management System filed Apr. 24, 2015 (Sewell et al.); 44 pages. |
U.S. Appl. No. 14/664,063 for Method and Application for Scanning a Barcode With a Smart Device While Continuously Running and Displaying an Application on the Smart Device Display filed Mar. 20, 2015 (Todeschini); 37 pages. |
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages. |
U.S. Appl. No. 14/527,191 for Method and System for Recognizing Speech Using Wildcards in an Expected Response filed Oct. 29, 2014 (Braho et al.); 45 pages. |
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages. |
U.S. Appl. No. 14/535,764 for Concatenated Expected Responses for Speech Recognition filed Nov. 7, 2014 (Braho et al.); 51 pages. |
U.S. Appl. No. 14/687,289 for System for Communication via a Peripheral Hub filed Apr. 15, 2015 (Kohtz et al.); 37 pages. |
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages. |
U.S. Appl. No. 14/674,329 for Aimer for Barcode Scanning filed Mar. 31, 2015 (Bidwell); 36 pages. |
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages. |
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages. |
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages. |
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Barndringa); 38 pages. |
U.S. Appl. No. 14/695,923 for Secure Unattended Network Authentication filed Apr. 24, 2015 (Kubler et al.); 52 pages. |
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages. |
U.S. Appl. No. 14/800,757, Eric Todeschini, filed Jul. 16, 2015, not published yet, Dimensioning and Imaging Items, 80 pages. |
Proesmans, Marc et al. “Active Acquisition of 3D Shape for Moving Objects” 0-7803-3258-X/96 1996 IEEE; 4 pages. |
U.S. Appl. No. 14/747,197, Serge Thuries et al., filed Jun. 23, 2015, not published yet, Optical Pattern Projector; 33 pages. |
U.S. Appl. No. 14/747,490, Brian L Jovanovski et al., filed Jun. 23, 2015, not published yet, Dual-Projector Three-Dimensional Scanner; 40 pages. |
U.S. Appl. No. 14/715,916, H. Sprague Ackley, filed May 19, 2015, not published yet, Evaluating Image Values; 54 pages. |
U.S. Appl. No. 14/793,149, H. Sprague Ackley, filed Jul. 7, 2015, not published yet, Mobile Dimensioner Apparatus for Use in Commerce; 57 pages. |
U.S. Appl. No. 14/740,373, H. Sprague Ackley et al., filed Jun. 16, 2015, not published yet, Calibrating a Volume Dimensioner; 63 pages. |
U.S. Appl. No. 14/801,023, Tyler Doornenbal et al., filed Jul. 16, 2015, not published yet, Adjusting Dimensioning Results Using Augmented Reality, 39 pages. |
Leotta, Matthew, Generic, Deformable Models for 3-D Vehicle Surveillance, May 2010, Doctoral Dissertation, Brown University, Providence RI, 248 pages. |
Ward, Benjamin, Interactive 3D Reconstruction from Video, Aug. 2012, Doctoral Thesis, University of Adelaide, Adelaide, South Australia, 157 pages. |
Hood, Frederick W.; William A. Hoff, Robert King, Evaluation of an Interactive Technique for Creating Site Models from Range Data, Apr. 27-May 1, 1997 Proceedings of the ANS 7th Topical Meeting on Robotics & Remote Systems, Augusta GA, 9 pages. |
Gupta, Alok; Range Image Segmentation for 3-D Objects Recognition, May 1988, Technical Reports (CIS), Paper 736, University of Pennsylvania Department of Computer and Information Science, retrieved from Http://repository.upenn.edu/cis_reports/736, Accessed May 31, 2015, 157 pages. |
Reisner-Kollmann,Irene; Anton L Fuhrmann, Werner Purgathofer, Interactive Reconstruction of Industrial Sites Using Parametric Models, May 2010, Proceedings of the 26th Spring Conference of Computer Graphics SCCG 10, 8 pages. |
Drummond, Tom; Roberto Cipolla, Real-Time Visual Tracking of Complex Structures, Jul. 2002, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, No. 7; 15 pages. |
Zhang, Zhaoxiang; Tieniu Tan, Kaiqi Huang, Yunhong Wang; Three-Dimensional Deformable-Model-based Localization and Recognition of Road Vehicles; IEEE Transactions on Image Processing, vol. 21, No. 1, Jan. 2012, 13 pages. |
Leotta, Matthew J.; Joseph L. Mundy; Predicting High Resolution Image Edges with a Generic, Adaptive, 3-D Vehicle Model; IEEE Conference on Computer Vision and Pattern Recognition, 2009; 8 pages. |
Spiller, Jonathan; Object Localization Using Deformable Templates, Master's Dissertation, University of the Witwatersrand, Johannesburg, South Africa, 2007; 74 pages. |
EP Search and Written Opinion Report in related matter EP Application No. 14181437.6, dated Mar. 26, 2015, 7 pages. |
Hetzel, Gunter et al.; “3D Object Recognition from Range Images using Local Feature Histograms,” Proceedings 2001 IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2001. Kauai, Hawaii, Dec. 8-14, 2001; pp. 394-399, XP010584149, ISBN: 978-0-7695-1272-3. |
Intention to Grant in counterpart European Application No. 14157971.4 dated Apr. 14, 2015, pp. 1-8. |
Decision to Grant in counterpart European Application No. 14157971.4 dated Aug. 6, 2015, pp. 1-2. |
Salvi, Joaquim et al. “Pattern Codification Strategies in Structured Light Systems” published in Pattern Recognition; The Journal of the Pattern Recognition Society, Received Mar. 6, 2003; Accepted Oct. 2, 2003; 23 pages. |
Office Action in counterpart European Application No. 13186043.9 dated Sep. 30, 2015, pp. 1-7. |
Lloyd et al., “System for Monitoring the Condition of Packages Throughout Transit”, U.S. Appl. No. 14/865,575, filed Sep. 25, 2015, 59 pages, not yet published. |
James Chamberlin, “System and Method for Picking Validation”, U.S. Appl. No. 14/865,797, filed Sep. 25, 2015, 44 pages, not yet published. |
Jovanovski et al., “Image-Stitching for Dimensioning”, U.S. Appl. No. 14/870,488, filed Sep. 30, 2015, 45 pages, not yet published. |
Todeschini et al.; “Depth Sensor Based Auto-Focus System for an Indicia Scanner,” U.S. Appl. No. 14/872,176, filed Oct. 1, 2015, 44 pages, not yet published. |
Wikipedia, “3D projection” Downloaded on Nov. 25, 2015 from www.wikipedia.com, 4 pages. |
McCloskey et al., “Methods for Improving the Accuracy of Dimensioning-System Measurements,” U.S. Appl. No. 14/873,613, filed Sep. 2, 2015, 47 pages, not yet published. |
Search Report in counterpart European Application No. 15182675.7, dated Dec. 4, 2015, 10 pages. |
McCloskey et al., “Image Transformation for Indicia Reading,” U.S. Appl. No. 14/982,032, filed Oct. 30, 2015, 48 pages, not yet published. |
Search Report and Opinion in related GB Application No. 1517112.7, dated Feb. 19, 2016, 6 Pages (GB2503978 is a commonly owned now abandoned application and not cited above). |
Lloyd, Ryan and Scott McCloskey, “Recognition of 3D Package Shapes for Single Camera Metrology,” IEEE Winter Conference on Applications of Computer Vision, IEEE, Mar. 24, 2014, pp. 99-106, {retrieved on Jun. 16, 2014}, Authors are employees of common Applicant. |
European Search Report for Related EP Application No. 15189214.8, dated Mar. 3, 2016, 9 pages. |
European Partial Search Report for related EP Application No. 15190306.9, dated May 6, 2016, 8 pages. |
Mike Stensvold, “Get the Most Out of Variable Aperture Lenses”, published on www.OutdoorPhotogrpaher.com; dated Dec. 7, 2010; 4 pages, [As noted on search report retrieved from URL: http;//www.outdoorphotographer.com/gear/lenses/get-the-most-out-ofvariable-aperture-lenses.html on Feb. 9, 2016]. |
European Search Report for related EP Application No. 16152477.2, dated May 24, 2016, 8 pages [New Reference cited herein; Reference DE102007037282 A1 and its US Counterparts have been previously cited.]. |
Second Chinese Office Action in related CN Application No. 201520810685.6, dated Mar. 22, 2016, 5 pages, no references. |
European Search Report in related EP Application No. 15190315.0, dated Apr. 1, 2016, 7 pages [Commonly owned Reference 2014/0104416 has been previously cited]. |
Second Chinese Office Action in related CN Application No. 2015220810562.2, dated Mar. 22, 2016, 5 pages. English Translation provided [No references]. |
European Search Report for related Application EP 15190249.1, dated Mar. 22, 2016, 7 pages. |
Second Chinese Office Action in related CN Application No. 201520810313.3, dated Mar. 22, 2016, 5 pages. English Translation provided [No references]. |
Search Report and Opinion in Related EP Application 15176943.7, dated Jan. 8, 2016, 8 pages, (US Application 2014/0049635 has been previously cited). |
European Search Report for related EP Application No. 15188440.0, dated Mar. 8, 2016, 8 pages. |
United Kingdom Search Report in related application GB1517842.9, dated Apr. 8, 2016, 8 pages. |
Great Britain Search Report for related Application No. GB1517843.7, dated Feb. 23, 2016; 8 pages. |
M. Zahid Gurbuz, Selim Akyokus, Ibrahim Emiroglu, Aysun Guran, An Efficient Algorithm for 3D Rectangular Box Packing, 2009, Applied Automatic Systems: Proceedings of Selected AAS 2009 Papers, pp. 131-134 [Examiner cited art in related US matter with Notice of Allowance dated Aug. 11, 2016]. |
U.S. Appl. No. 15/182,636; H. Sprague Ackley et al., Automatic Mode Switching in a Volume Dimensioner, not yet published, filed Jun. 15, 2016, 53 pages. |
European Extended search report in related EP Application No. 15190306.9, dated Sep. 9, 2016, 15 pages [only new references are cited; remaining references were cited with partial search report in same application dated May 6, 2016]. |
Collings et al., “The Applications and Technology of Phase-Only Liquid Crystal on Silicon Devices”, Journal of Display Technology, IEEE Service Center, New York, NY, US, vol. 7, No. 3, Mar. 1, 2011 (Mar. 1, 2011), pp. 112-119. |
European extended Search report in related EP Application 13785171.3, dated Sep. 19, 2016, 8 pages. |
El-Hakim et al., “Multicamera vision-based approach to flexible feature measurement for inspection and reverse engineering”, published in Optical Engineering, Society of Photo-Optical Instrumentation Engineers, vol. 32, No. 9, Sep. 1, 1993, 15 pages. |
El-Hakim et al., “A Knowledge-based Edge/Object Measurement Technique”, Retrieved from the Internet: URL: https://www.researchgate.net/profile/Sabry_E1-Hakim/publication/44075058_A_Knowledge_Based_EdgeObject_Measurement_Technique/links/00b4953b5faa7d3304000000.pdf [retrieved on Jul. 15, 2016] dated Jan. 1, 1993, 9 pages. |
United Kingdom combined Search and Examination Report in related GB Application No. 1607394.2, dated Oct. 19, 2016, 7 pages. |
European Extended Search Report in Related EP Application No. 16172995.9, dated Aug. 22, 2016, 11 pages (Only new references have been cited; U.S. Pat. No. 8,463,079 (formerly U.S. Publication 2010/0220894) and U.S. Publication 2001/0027955 have been previously cited.). |
European Search Report from related EP Application No. 16168216.6, dated Oct. 20, 2016, 8 pages [New reference cited above; U.S. Publication 2014/0104413 has been previously cited]. |
Peter Clarke, Actuator Developer Claims Anti-Shake Breakthrough for Smartphone Cams, Electronic Engineering Times, p. 24, May 16, 2011. |
U.S. Appl. No. 14/055,234, not yet published, Hand Held Products, Inc. filed Oct. 16, 2013; Dimensioning System; 26 pages. |
U.S. Appl. No. 13/912,262, not yet published, filed Jun. 7, 2013, Hand Held Products Inc., Method Error Correction for 3D Imaging Device; 33 pages. |
European Search Report for application No. EP13186043 dated Feb. 26, 2014 (now EP2722656 (Apr. 23, 2014)); 7 pages. |
International Search Report for PCT/US2013/039438 (WO2013166368), dated Oct. 1, 2013, 7 pages. |
U.S. Appl. No. 14/453,019, not yet published, filed Aug. 6, 2014, Hand Held Products Inc., Dimensioning System With Guided Alignment: 31 pages. |
European Office Action for application EP 13186043, dated Jun. 12, 2014 (now EP2722656 (Apr. 23, 2014)); 6 pages. |
U.S. Appl. No. 14/461,524, not yet published, filed Aug. 18, 2014, Hand Held Products Inc., System and Method for Package Dimensioning: 21 pages. |
U.S. Appl. No. 14/801,023, Tyler Doornenbal et al., filed Jul. 16, 2015, not published yet, Adjusting Dimensioning Results Using Augmented Reality, 39 pages. |
Wikipedia, YUV description and definition, downloaded from http://www.wikipeida.org/wiki/YUV on Jun. 29, 2012, 10 pages. |
YUV Pixel Format, downloaded from http://www.fource.org/yuv.php on Jun. 29, 2012; 13 pages. |
YUV to RGB Conversion, downloaded from http://www.fource.org/fccyvrgb.php on Jun. 29, 2012; 5 pages. |
Benos et al., “Semi-Automatic Dimensioning with Imager of a Portable Device,” U.S. Appl. No. 61/149,912, filed Feb. 4, 2009 (now expired), 56 pages. |
Dimensional Weight—Wikipedia, The Free Encyclopedia, URL=http://en.wikipedia.org/wiki/Dimensional_weight, download date Aug. 1, 2008, 2 pages. |
Dimensioning—Wikipedia, the Free Encyclopedia, URL=http://en.wikipedia.org/wiki/Dimensioning, download date Aug. 1, 2008, 1 page. |
European Patent Office Action for Application No. 14157971.4-1906, dated Jul. 16, 2014, 5 pages. |
European Patent Search Report for Application No. 14157971.4-1906, dated Jun. 30, 2014, 6 pages. |
Caulier, Yannick et al., “A New Type of Color-Coded Light Structures for an Adapted and Rapid Determination of Point Correspondences for 3D Reconstruction.” Proc. of SPIE, vol. 8082 808232-3; 2011; 8 pages. |
Kazantsev, Aleksei et al. “Robust Pseudo-Random Coded Colored Structured Light Techniques for 3D Object Model Recovery”; ROSE 2008 IEEE International Workshop on Robotic and Sensors Environments (Oct. 17-18, 2008), 6 pages. |
Mouaddib E. et al. “Recent Progress in Structured Light in order to Solve the Correspondence Problem in Stereo Vision” Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Apr. 1997; 7 pages. |
Proesmans, Marc et al. “Active Acquisition of 3D Shape for Moving Objects” 0-7803-3258-X/96 1996 IEEE; 4 pages. |
Hetzel, Gunter et al.; “3D Object Recognition from Range Images using Local Feature Histograms,” Proceedings 2001 IEEE Conference on Computer Vision and Pattern Recognition. CVPR 2001. Kauai, Hawaii, Dec. 8-14, 2001; pp. 394-399, XP010584149, ISBN: 978-0-7695-1272-3. |
U.S. Appl. No. 14/519,179, Serge Thuries et al., filed Oct. 21, 2014, not published yet. Dimensioning System With Multipath Interference Mitigation; 40 pages. |
U.S. Appl. No. 14/519,249, H. Sprague Ackley et al., filed Oct. 21, 2014, not published yet. Handheld Dimensioning System With Measurement-Conformance Feedback; 36 pages. |
U.S. Appl. No. 14/519,233, Franck Laffargue et al., filed Oct. 21, 2014, not published yet. Handheld Dimensioner With Data-Quality Indication; 34 pages. |
U.S. Appl. No. 14/519,211, H. Sprague Ackley et al., filed Oct. 21, 2014, System and Method for Dimensioning; not published yet. 33 pages. |
U.S. Appl. No. 14/519,195, Franck Laffargue et al., filed Oct. 21, 2014, not published yet. Handheld Dimensioning System With Feedback; 35 pages. |
U.S. Appl. No. 14/795,332, Franck Laffargue et al., filed Jul. 9, 2015, not published yet, Systems and Methods for Enhancing Dimensioning; 55 pages. |
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned. |
U.S. Appl. No. 14/462,801 for Mobile Computing Device With Data Cognition Software, filed Aug. 19, 2014 (Todeschini et al.); 38 pages. |
U.S. Appl. No. 14/596,757 for System and Method for Detecting Barcode Printing Errors filed Jan. 14, 2015 (Ackley); 41 pages. |
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages. |
U.S. Appl. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.); 42 pages. |
U.S. Appl. No. 14/662,922 for Multifunction Point of Sale System filed Mar. 19, 2015 (Van Horn et al.); 41 pages. |
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages. |
U.S. Appl. No. 29/528,165 for In-Counter Barcode Scanner filed May 27, 2015 (Oberpriller et al.); 13 pages. |
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages. |
U.S. Appl. No. 14/614,796 for Cargo Apportionment Techniques filed Feb. 5, 2015 (Morton et al.); 56 pages. |
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages. |
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages. |
U.S. Appl. No. 14/578,627 for Safety System and Method filed Dec. 22, 2014 (Ackley et al.); 32 pages. |
U.S. Appl. No. 14/573,022 for Dynamic Diagnostic Indicator Generation filed Dec. 17, 2014 (Goldsmith); 43 pages. |
U.S. Appl. No. 14/529,857 for Barcode Reader With Security Features filed Oct. 31, 2014 (Todeschini et al.); 32 pages. |
U.S. Appl. No. 14/519,195 for Handheld Dimensioning System With Feedback filed Oct. 21, 2014 (Laffargue et al.); 39 pages. |
U.S. Appl. No. 14/519,211 for System and Method for Dimensioning filed Oct. 21, 2014 (Ackley et al.); 33 pages. |
U.S. Appl. No. 14/519,233 for Handheld Dimensioner With Data-Quality Indication filed Oct. 21, 2014 (Laffargue et al.); 36 pages. |
U.S. Appl. No. 14/533,319 for Barcode Scanning System Using Wearable Device With Embedded Camera filed Nov. 5, 2014 (Todeschini); 29 pages. |
U.S. Appl. No. 14/748,446 for Cordless Indicia Reader With a Multifunction Coil for Wireless Charging and EAS Deactivation, filed Jun. 24, 2015 (Xie et al.); 34 pages. |
U.S. Appl. No. 29/528,590 for Electronic Device filed May 29, 2015 (Fitch et al.); 9 pages. |
U.S. Appl. No. 14/519,249 for Handheld Dimensioning System With Measurement-Conformance Feedback filed Oct. 21, 2014 (Ackley et al.); 36 pages. |
U.S. Appl. No. 29/519,017 for Scanner filed Mar. 2, 2015 (Zhou et al.); 11 pages. |
U.S. Appl. No. 14/398,542 for Portable Electronic Devices Having a Separate Location Trigger Unit for Use in Controlling an Application Unit filed Nov. 3, 2014 (Bian et al.); 22 pages. |
U.S. Appl. No. 14/405,278 for Design Pattern for Secure Store filed Mar. 9, 2015 (Zhu et al.); 23 pages. |
U.S. Appl. No. 14/590,024 for Shelving and Package Locating Systems for Delivery Vehicles filed Jan. 6, 2015 (Payne); 31 pages. |
U.S. Appl. No. 14/568,305 for Auto-Contrast Viewfinder for an Indicia Reader filed Dec. 12, 2014 (Todeschini); 29 pages. |
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages. |
U.S. Appl. No. 14/580,262 for Media Gate for Thermal Transfer Printers filed Dec. 23, 2014 (Bowles); 36 pages. |
Chinese Notice of Reexamination in related Chinese Application 201520810313.3, dated Mar. 14, 2017, English Computer Translation provided, 7 pages [No new art cited]. |
Extended European search report in related EP Application 16199707.7, dated Apr. 10, 2017, 15 pages. |
Ulusoy et al., One-Shot Scanning using De Bruijn Spaced Grids, 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops, 7 pages [Cited in EP Extended search report dated Apr. 10, 2017]. |
European Exam Report in related EP Application No. 15176943.7, dated Apr. 12, 2017, 6 pages [Art previously cited in this matter]. |
European Exam Report in related EP Application No. 15188440.0, dated Apr. 21, 2017, 4 pages [No new art to cite]. |
European Examination report in related EP Application No. 14181437.6, dated Feb. 8, 2017, 5 pages [References have been previously cited]. |
Wikipedia, “Microlens”, downloaded from https://en.wikipedia.org/wiki/Microlens, 3 pages [Cited by Examiner in Feb. 9, 2017 Final Office Action in related matter]. |
Fukaya et al., “Characteristics of Speckle Random Pattern and Its Applications”, pp. 317-327, Nouv. Rev. Optique, t.6, n.6 (1975) [Cited by Examiner in Feb. 9, 2017 Final Office Action in related matter; downloaded Mar. 2, 2017 from http://iopscience.iop.org]. |
European extended search report in related EP Application 16190833.0, dated Mar. 9, 2017, 8 pages [only new art has been cited; US Publication 2014/0034731 was previously cited]. |
United Kingdom Combined Search and Examination Report in related Application No. GB1620676.5, dated Mar. 8, 2017, 6 pages [References have been previously cited; WO2014/151746, WO2012/175731, US 2014/0313527, GB2503978]. |
European Exam Report in related EP Application No. 16168216.6, dated Feb. 27, 2017, 5 pages [References have been previously cited; WO2011/017241 and US 2014/0104413]. |
European Exam Report in related EP Application No. 16152477.2, dated Jun. 20, 2017, 4 pages [No art to be cited]. |
European Exam Report in related EP Application 16172995.9, dated Jul. 6, 2017, 9 pages [No new art to be cited]. |
United Kingdom Search Report in related Application No. GB1700338.5, dated Jun. 30, 2017, 5 pages. |
European Search Report in related EP Application No. 17175357.7, dated Aug. 17, 2017, pp. 1-7 [No new art to be cited]. |
Ralph Grabowski, “Smoothing 3D Mesh Objects,” New Commands in AutoCAD 2010: Part 11, Examiner Cited art in related matter Non Final Office Action dated May 19, 2017; 6 pages. |
Thorlabs, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, downloaded from https://www.thorlabs.com/newgrouppage9.cfm?objectgroup_id=6430, 4 pages. |
EKSMA Optics, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, downloaded from http://eksmaoptics.com/optical-systems/f-theta-lenses/f-theta-lens-for-1064-nm/, 2 pages. |
Sill Optics, Examiner Cited NPL in Advisory Action dated Apr. 12, 2017 in related commonly owned application, http://www.silloptics.de/1/products/sill-encyclopedia/laser-optics/f-theta-lenses/, 4 pages. |
European Extended Search Report in related EP Application No. 16190017.0, dated Jan. 4, 2017, 6 pages. |
European Extended Search Report in related EP Application No. 16173429.8, dated Dec. 1, 2016, 8 pages [Only new references cited: US 2013/0038881 was previously cited]. |
Extended European Search Report in related EP Application No. 16175410.0, dated Dec. 13, 2016, 5 pages. |
Padzensky, Ron; “Augmera; Gesture Control”, Dated Apr. 18, 2015, 15 pages [in Office Action dated Jan. 20, 2017 in related Application.]. |
Grabowski, Ralph; “New Commands in AutoCAD 2010: Part 11 Smoothing 3D Mesh Objects” Dated 2011, 6 pages, [in Office Action dated Jan. 20, 2017 in related Application]. |
Theodoropoulos, Gabriel; “Using Gesture Recognizers to Handle Pinch, Rotate, Pan, Swipe, and Tap Gestures” dated Aug. 25, 2014, 34 pages, [in Office Action dated Jan. 20, 2017 in related Application]. |
EP Search Report in related EP Application No. 17171844 dated Sep. 18, 2017; 4 pages [Only new art cited herein]. |
EP Extended Search Report in related EP Application No. 17174843.7 dated Oct. 17, 2017, 5 pages [Only new art cited herein]. |
UK Further Exam Report in related UK Application No. GB1517842.9, dated Sep. 1, 2017, 5 pages (only new art cited herein). |
Ulusoy, Ali Osman et al.; “One-Shot Scanning using De Bruijn Spaced Grids”, Brown University; 2009 IEEE 12th International Conference on Computer Vision Workshops, ICCV Workshops, pp. 1786-1792 [Cited in EPO Search Report dated Dec. 5, 2017]. |
Extended European Search report in related EP Application No. 17189496.7 dated Dec. 5, 2017; 9 pages. |
Extended European Search report in related EP Application No. 17190323.0 dated Jan. 19, 2018; 6 pages [Only new art cited herein]. |
Examination Report in related GB Application No. GB1517843.7, dated Jan. 19, 2018, 4 pages [Only new art cited herein]. |
Examination Report in related EP Application No. 15190315, dated Jan. 26, 2018, 6 pages [Only new art cited herein]. |
Boavida et al., “Dam monitoring using combined terrestrial imaging systems”, 2009 Civil Engineering Survey Dec/Jan. 2009, pp. 33-38 [Cited in Notice of Allowance dated Sep. 15, 2017 in related matter]. |
European Extended Search Report in related EP Application No. 17201794.9, dated Mar. 16, 2018, 10 pages [Only new art cited herein]. |
European Extended Search Report in related EP Application 17205030.4, dated Mar. 22, 2018, 8 pages. |
European Exam Report in related EP Application 16172995.9, dated Mar. 15, 2018, 7 pages (Only new art cited herein). |
United Kingdom Combined Search and Examination Report dated Mar. 21, 2018, 5 pages (Art has been previously cited). |
European extended Search Report in related Application No. 17207882.6 dated Apr. 26, 2018, 10 pages. |
United Kingdom Further Examination Report in related GB Patent Application No. 1517842.9 dated Jul. 26, 2018; 5 pages [Cited art has been previously cited in this matter]. |
United Kingdom Further Examination Report in related GB Patent Application No. 1517112.7 dated Jul. 17, 2018; 4 pages [No art cited]. |
United Kingdom Further Examination Report in related GB Patent Application No. 1620676.5 dated Jul. 17, 2018; 4 pages [No art cited]. |
Number | Date | Country
---|---|---
20150149946 A1 | May 2015 | US

Number | Date | Country
---|---|---
61149912 | Feb 2009 | US

 | Number | Date | Country
---|---|---|---
Parent | 12685816 | Jan 2010 | US
Child | 14561367 | | US