Method and system for inputting products into an inventory system

Information

  • Patent Grant
  • Patent Number
    10,984,374
  • Date Filed
    Wednesday, January 3, 2018
  • Date Issued
    Tuesday, April 20, 2021
Abstract
A method of characterizing inventory items to a planogram includes receiving images of a view of a scene containing inventory items and corresponding indicia using an imaging device; receiving utterances (such as spoken descriptions of inventory items) from a user using a voice recognition system; identifying inventory items in the scene and a corresponding identification code based at least in part on the images and at least in part on the utterances; identifying a plurality of attributes corresponding to the inventory items; and characterizing the inventory items to a planogram based on the respective identification code and plurality of attributes.
Description
FIELD OF THE INVENTION

The present invention relates to inventory systems.


BACKGROUND

Generally speaking, entering product information into a newly installed retail system relies heavily on importing data from an external database or entering all of the information manually. A more efficient solution is needed for situations in which such information is missing or incomplete.


Some attempts have been made to resolve this issue. For example, U.S. Patent Application Publication No. 2013/0173435 by Cozad discloses systems and methods for managing product location information. The application focuses on scanning item identifiers corresponding to items on the shelves of a store and creating a planogram of the store when necessary. U.S. Patent Application Publication No. 2013/0037613 by Soldate discloses an apparatus, system, and method to stock product and maintain inventory, and focuses on providing instructions to correctly stock a product, providing a next task upon completion of the stocking, and updating an inventory database based on completion of the stocking. A user provides stocking confirmation with a scanning device or through voice input, and the inventory management database is updated as product is stocked to a particular location using scanners or other handheld devices. U.S. Patent Application Publication No. 2014/0003727 by Lortz et al. discloses image-augmented inventory management and wayfinding. The application focuses on receiving a query regarding an establishment, retrieving an output image from an image database in response to the query, and transmitting the output image to a mobile device. However, none of these references explicitly mentions a workflow that allows a user to direct an employee to describe the location of retail inventory, especially using verbal commands on the go, while configuring or updating a database of the inventory management system.


Therefore, a need exists for a method and system for efficiently entering product information into an inventory system.


SUMMARY

Accordingly, the present invention embraces methods for inputting products into an inventory system.


In an exemplary embodiment, a method of characterizing one or more inventory items to a planogram includes receiving images using an imaging device, the images including a view of a scene with inventory items and an indicia for the inventory items, each indicia comprising an identification code configured to identify a respective inventory item; receiving utterances (such as spoken description of inventory items) from a user using a voice recognition system; identifying inventory items in the scene and an identification code corresponding to each of the identified inventory items; identifying a plurality of attributes corresponding to the inventory items; and characterizing the inventory items to a planogram based at least in part on the respective identification code and the respective plurality of attributes.


In another exemplary embodiment, a method of characterizing inventory items to a planogram includes receiving images using an imaging device, the images including a view of a scene; receiving utterances from a user using a voice recognition system; identifying at least one of the inventory items in the scene based at least in part on the images and at least in part on the utterances; identifying a plurality of attributes corresponding to the inventory items, the plurality of attributes identified based at least in part on the images and at least in part on the utterances; and characterizing the inventory items to a planogram based at least in part on a respective identification code and the respective plurality of attributes.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 schematically depicts a method of characterizing one or more inventory items to a planogram, according to an embodiment.



FIG. 2 schematically depicts a method of characterizing one or more inventory items to a planogram, according to another embodiment.





DETAILED DESCRIPTION

The present invention embraces methods for inputting products into an inventory system.


When an inventory system is initially installed at a business or warehouse (e.g., a retail business), the installer typically must manually enter all of the product information into the inventory system or import the data from an external system. In many real-life situations, the customer (i.e., the business or warehouse at which the inventory system is being installed) does not have the product information or only has incomplete product information.


The present invention embraces a method and system for entering product information into an inventory system (e.g., a retail inventory system). In typical embodiments, the product information is entered via a workflow, which may be voice-directed. The method and system permit a user to describe the location of a product (e.g., by speaking into a microphone or entering text via a keyboard or touchscreen input device), scan a product-identifying symbol (e.g., an indicia, barcode, UPC symbol, 2D barcode, QR code, or text on the packaging), input a shelf label (e.g., via scanning, speech input, or text input via a keyboard or touchscreen input device), and capture an image of the product (e.g., with the device used to capture the location of the product, with the device used to scan the product-identifying symbol, with another device, and/or with a head-mounted display device that includes an image sensor). Each item of information captured by the user can be stored on one of the devices used to capture the information and later transmitted to the inventory system, and/or transmitted directly to the inventory system. In exemplary embodiments, the method and system direct a user through a workflow to achieve the input of the product information into the inventory system. The workflow may be visually directed and/or voice-directed, for example, using one or more devices available from Vocollect and/or Honeywell, such as the Dolphin CT50 and the Dolphin 75.
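
By way of illustration only, the information captured by the user might be gathered into a record like the following sketch; all field and function names here are hypothetical and are not drawn from the patent:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical record for one captured product; the fields mirror the
# items of information listed above (location, symbol, shelf label,
# images, inventory count).
@dataclass
class ProductCapture:
    spoken_location: Optional[str] = None  # e.g. "aisle five, third shelf"
    identifier: Optional[str] = None       # decoded barcode/UPC/QR payload
    shelf_label: Optional[str] = None      # scanned or typed shelf label
    image_paths: List[str] = field(default_factory=list)
    current_inventory: Optional[int] = None

# Captures can be stored on the device and transmitted later, or sent
# directly to the inventory system as they are made.
pending: List[ProductCapture] = []

def record(capture: ProductCapture) -> None:
    pending.append(capture)

def flush(send) -> None:
    """Transmit all stored captures via a caller-supplied send function."""
    for capture in pending:
        send(capture)
    pending.clear()
```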


In exemplary embodiments, the workflow, which can be visually directed and/or voice-directed, allows a user to direct an employee to describe the location, scan the barcode and shelf label, take a picture with the device, and record the current inventory of each product that needs to be entered into an inventory management system. In situations where the location information for an inventory item is not available from an external resource, such as a database, the user can describe the location using verbal commands while moving through the inventory, and the database of the inventory management system is configured or updated accordingly.



FIG. 1 shows a method 100 of characterizing one or more inventory items to a planogram, according to an embodiment. At 102, one or more images are received using an imaging device, the one or more images comprising a view of a scene including one or more inventory items and an indicia for at least one of the one or more inventory items, each indicia comprising an identification code configured to identify a respective one of the one or more inventory items. At 104, one or more utterances are received from a user using a voice recognition system, the one or more utterances comprising a spoken description of at least one of the one or more inventory items. At 106, at least one of the one or more inventory items in the scene is identified, along with an identification code corresponding to each of the at least one identified inventory items, wherein the identifying is based at least in part on the one or more images and at least in part on the one or more utterances. At 108, a plurality of attributes corresponding to the at least one of the one or more inventory items is identified, the plurality of attributes identified based at least in part on the one or more images and at least in part on the one or more utterances. And at 110, the one or more inventory items are characterized to a planogram based at least in part on the respective identification code and the respective plurality of attributes.
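
The following is a minimal sketch of how steps 106 through 110 might be realized, assuming the indicia have already been decoded to identification codes and the utterances transcribed to text; the naive pattern matching stands in for whatever recognition components an actual implementation would use:

```python
import re

# Hypothetical sketch of steps 106-110; not the patent's implementation.
def identify_items(decoded_codes, utterances):
    """Step 106: pair each code decoded from the images with any
    utterance that mentions it."""
    return [{"code": code,
             "utterances": [u for u in utterances if code in u]}
            for code in decoded_codes]

def identify_attributes(item, utterances):
    """Step 108: pull a quantity and a location phrase out of the
    utterances with naive pattern matching."""
    attrs = {}
    for u in item["utterances"] or utterances:
        qty = re.search(r"(\d+)\s+(?:units|items|facings)", u)
        if qty:
            attrs["quantity"] = int(qty.group(1))
        loc = re.search(r"(aisle\s+\w+|shelf\s+\w+)", u, re.IGNORECASE)
        if loc:
            attrs["location"] = loc.group(1)
    return attrs

def characterize(decoded_codes, utterances):
    """Step 110: map each identification code to its attributes."""
    return {item["code"]: identify_attributes(item, utterances)
            for item in identify_items(decoded_codes, utterances)}

# One scanned code plus one spoken description of the same item.
print(characterize(["012345678905"],
                   ["012345678905 aisle five shelf two, 12 units"]))
```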


In an embodiment, the one or more images comprising the view of the scene can include one or more of: a location corresponding to at least one of the one or more inventory items in the scene; a quantity corresponding to at least one of the one or more inventory items in the scene; a facing arrangement corresponding to at least one of the one or more inventory items in the scene; a floor layout corresponding to at least one of the one or more inventory items in the scene; a fixture attribute corresponding to at least one of the one or more inventory items in the scene; and an identification code corresponding to at least one of the one or more inventory items in the scene.


In an embodiment, the one or more utterances can include one or more of: a location corresponding to at least one of the one or more inventory items in the scene; a quantity corresponding to at least one of the one or more inventory items in the scene; a facing arrangement corresponding to at least one of the one or more inventory items in the scene; a floor layout corresponding to at least one of the one or more inventory items in the scene; a fixture attribute corresponding to at least one of the one or more inventory items in the scene; and an identification code corresponding to at least one of the one or more inventory items.


In the method 100, the plurality of attributes can include one or more of: a location corresponding to a respective inventory item; a quantity corresponding to a respective inventory item; a facing arrangement corresponding to a respective inventory item; a floor layout corresponding to a respective inventory item; and a fixture attribute corresponding to a respective inventory item.
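
For illustration, these five attribute types could be carried in a simple structure such as the hypothetical sketch below:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical attribute bundle for one inventory item; the fields
# mirror the five attribute types listed above.
@dataclass
class ItemAttributes:
    location: Optional[str] = None           # e.g. "aisle 5, bay 3, shelf 2"
    quantity: Optional[int] = None           # units observed on the shelf
    facing_arrangement: Optional[str] = None # e.g. "4 facings, front-aligned"
    floor_layout: Optional[str] = None       # e.g. "end cap near entrance"
    fixture_attribute: Optional[str] = None  # e.g. "gondola shelf, 120 cm wide"
```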


In an embodiment, the planogram can include: a visual representation of the scene, the visual representation based at least in part on the one or more images and at least in part on the one or more utterances; a description of the at least one of the one or more inventory items in the scene, the description comprising a visual representation of the inventory item and/or a textual description of the inventory item, wherein the description is based at least in part on the one or more utterances; and a description of the plurality of attributes corresponding to the at least one of the one or more inventory items, the description comprising a visual representation of at least some of the attributes and a textual description of at least some of the attributes.


In the method 100, the description of the at least one of the one or more inventory items in the scene can include an identification code corresponding to at least one of the one or more inventory items in the scene. The description of the plurality of attributes corresponding to the at least one of the one or more inventory items can include a description of one or more of: a location corresponding to at least one of the one or more inventory items in the scene, the description of the location comprising a visual representation of the location and/or a textual description of the location; a quantity corresponding to at least one of the one or more inventory items in the scene, the description of the quantity comprising a visual representation of the quantity and/or a textual description of the quantity; a facing arrangement corresponding to at least one of the one or more inventory items in the scene, the description of the facing arrangement comprising a visual representation of the facing arrangement and/or a textual description of the facing arrangement; a floor layout corresponding to at least one of the one or more inventory items in the scene, the description of the layout comprising a visual representation of the layout and/or a textual description of the layout; and a fixture attribute corresponding to at least one of the one or more inventory items in the scene, the description of the fixture attributes comprising a visual representation of the fixture attribute and/or a textual description of the fixture attribute.
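
A hypothetical data model for such a planogram, pairing a visual representation with a textual description at each level as enumerated above, might look like this sketch:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

# Hypothetical planogram data model; names are illustrative only.
@dataclass
class Description:
    image_ref: Optional[str] = None  # visual representation (e.g. a file path)
    text: Optional[str] = None       # textual description

@dataclass
class PlanogramEntry:
    identification_code: str
    item: Description
    attributes: Dict[str, Description] = field(default_factory=dict)

@dataclass
class Planogram:
    scene: Description = field(default_factory=Description)
    entries: List[PlanogramEntry] = field(default_factory=list)
```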


In an embodiment, the method 100 can further include identifying a first inventory item based at least in part on the one or more images and at least in part on the one or more utterances. Additionally or alternatively, the method 100 can include identifying a first inventory item based at least in part on the one or more images and identifying a second inventory item based at least in part on the one or more utterances. Additionally or alternatively, the method 100 can include identifying a first inventory item based at least in part on the one or more images and identifying an identification code corresponding to the first inventory item based at least in part on the one or more utterances. Furthermore, the method 100 can include identifying a first inventory item based at least in part on the one or more utterances and identifying an identification code corresponding to the first inventory item based at least in part on the one or more images. The method 100 can also include identifying a first inventory item based at least in part on the one or more images and at least in part on the one or more utterances, and identifying an identification code corresponding to the first inventory item based at least in part on the one or more images and at least in part on the one or more utterances.
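
These combinations amount to a cross-modal merge: either modality may contribute the item, the identification code, or both. A minimal sketch, assuming each modality's output has already been reduced to a small record:

```python
# Hypothetical cross-modal merge. Each argument is a dict that may
# carry 'item' and/or 'code' recovered from that modality.
def identify(image_result, utterance_result):
    item = image_result.get("item") or utterance_result.get("item")
    code = image_result.get("code") or utterance_result.get("code")
    if item is None or code is None:
        return None  # neither modality supplied the missing piece
    return {"item": item, "code": code}

# Item and code both from the image; then a spoken item paired with a
# scanned code.
print(identify({"item": "cereal", "code": "0123"}, {}))
print(identify({"code": "0456"}, {"item": "oat milk"}))
```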


In an embodiment, the identification code can include a shelf identification code and/or a SKU code. The one or more utterances can include a spoken name for an inventory item, a spoken identification code corresponding to an inventory item, and/or a spoken description of one or more of the plurality of attributes. The spoken description of the one or more of the plurality of attributes can include one or more of: a spoken description of a location corresponding to a respective inventory item; a spoken description of a quantity corresponding to a respective inventory item; a spoken description of a facing arrangement corresponding to a respective inventory item; a spoken description of a floor layout corresponding to a respective inventory item; and a spoken description of a fixture attribute corresponding to a respective inventory item.



FIG. 2 shows a method 200 of characterizing one or more inventory items to a planogram, according to an embodiment. At 202, one or more images are received using an imaging device. The one or more images include a view of a scene including one or more inventory items and one or more of: an indicia for at least one of the one or more inventory items, each indicia comprising an identification code configured to identify a respective one of the one or more inventory items; a location corresponding to at least one of the one or more inventory items in the scene; a quantity corresponding to at least one of the one or more inventory items in the scene; a facing arrangement corresponding to at least one of the one or more inventory items in the scene; a floor layout corresponding to at least one of the one or more inventory items in the scene; and a fixture attribute corresponding to at least one of the one or more inventory items in the scene. At 204, one or more utterances from a user are received using a voice recognition system. The one or more utterances include a spoken description of one or more of: at least one of the one or more inventory items; a location corresponding to at least one of the one or more inventory items in the scene; a quantity corresponding to at least one of the one or more inventory items in the scene; a facing arrangement corresponding to at least one of the one or more inventory items in the scene; a floor layout corresponding to at least one of the one or more inventory items in the scene; a fixture attribute corresponding to at least one of the one or more inventory items in the scene; and an identification code corresponding to at least one of the one or more inventory items. At 206, at least one of the one or more inventory items in the scene is identified based at least in part on the one or more images and at least in part on the one or more utterances. At 208, a plurality of attributes corresponding to the at least one of the one or more inventory items is identified. The plurality of attributes is identified based at least in part on the one or more images and at least in part on the one or more utterances. The plurality of attributes includes one or more of: a location corresponding to a respective inventory item; a quantity corresponding to a respective inventory item; a facing arrangement corresponding to a respective inventory item; a floor layout corresponding to a respective inventory item; and a fixture attribute corresponding to a respective inventory item. And at 210, the one or more inventory items are characterized to a planogram based at least in part on the respective identification code and the respective plurality of attributes. The planogram includes: a visual representation of the scene, the visual representation based at least in part on the one or more images and at least in part on the one or more utterances; a description of the at least one of the one or more inventory items in the scene and of the corresponding identification codes, the description comprising a visual representation of the inventory item and/or a textual description of the inventory item, wherein the description is based at least in part on the one or more utterances; and a description of placement of the at least one of the one or more inventory items in the scene.
The placement is in respect of one or more of the following attributes respectively corresponding to the at least one of the one or more inventory items: a location corresponding to at least one of the one or more inventory items in the scene, the description of the location comprising a visual representation of the location and/or a textual description of the location; a quantity corresponding to at least one of the one or more inventory items in the scene, the description of the quantity comprising a visual representation of the quantity and/or a textual description of the quantity; a facing arrangement corresponding to at least one of the one or more inventory items in the scene, the description of the facing arrangement comprising a visual representation of the facing arrangement and/or a textual description of the facing arrangement; a floor layout corresponding to at least one of the one or more inventory items in the scene, the description of the layout comprising a visual representation of the layout and/or a textual description of the layout; and a fixture attribute corresponding to at least one of the one or more inventory items in the scene, the description of the fixture attributes comprising a visual representation of the fixture attribute and/or a textual description of the fixture attribute.


In an embodiment, the method 200 can further include identifying an identification code corresponding to at least one of the one or more inventory items based at least in part on the one or more images and at least in part on the one or more utterances. Additionally or alternatively, the method 200 can include initiating the method 200 at least in part by providing a prompt configured to instruct the user to characterize the one or more inventory items. The prompt can include an audible prompt, the audible prompt including an identification of at least one inventory item.


Additionally, the method 200 can include providing a prompt configured to instruct the user to characterize the one or more inventory items upon identifying at least one inventory item in the scene based at least in part on the one or more images. The method 200 can also include capturing the one or more images using the imaging device upon receiving the one or more utterances from the user using the voice recognition system.


In an embodiment, the method 200 can further include receiving coordinates corresponding to one or more of: the location for at least one of the one or more inventory items; the facing arrangement for at least one of the one or more inventory items; and the floor layout for at least one of the one or more inventory items. The coordinates can include geoposition coordinates. The coordinates can be generated using a Wi-Fi positioning system, and/or using the one or more images comprising the view of the scene.
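
As one illustration of how such coordinates might be generated from Wi-Fi signals, the sketch below estimates a position as a signal-strength-weighted centroid of known access-point locations; real Wi-Fi positioning systems use calibrated fingerprinting or trilateration, and the access-point coordinates and signal values here are assumptions:

```python
# Hypothetical Wi-Fi positioning sketch; values are illustrative only.
ACCESS_POINTS = {            # assumed AP floor coordinates, in meters
    "ap1": (0.0, 0.0),
    "ap2": (20.0, 0.0),
    "ap3": (10.0, 15.0),
}

def estimate_position(rssi_dbm):
    """rssi_dbm maps an AP id to received signal strength in dBm;
    stronger (less negative) signals pull the estimate closer."""
    weights = {ap: 10 ** (rssi / 10) for ap, rssi in rssi_dbm.items()}
    total = sum(weights.values())
    x = sum(w * ACCESS_POINTS[ap][0] for ap, w in weights.items()) / total
    y = sum(w * ACCESS_POINTS[ap][1] for ap, w in weights.items()) / total
    return (x, y)

# A reading taken near ap1 yields coordinates close to ap1's position.
print(estimate_position({"ap1": -40.0, "ap2": -60.0, "ap3": -55.0}))
```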


Identifying the one or more inventory items in the scene at 206 can include: identifying the one or more indicia in the one or more images comprising the view of the scene; decoding the one or more indicia to obtain the corresponding one or more identification codes; and searching a database to identify the respective one or more inventory items corresponding to the one or more identification codes.
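
A minimal sketch of these three sub-steps, with the indicia decoding stubbed out (a real system would run a barcode scan engine over the images) and an in-memory SQLite table standing in for the product database:

```python
import sqlite3

def decode_indicia(image):
    """Hypothetical stand-in for locating and decoding indicia."""
    return image.get("embedded_codes", [])

def lookup_items(codes, db):
    """Search the database for the items matching the decoded codes."""
    if not codes:
        return {}
    placeholders = ",".join("?" * len(codes))
    cur = db.execute(
        f"SELECT code, name FROM products WHERE code IN ({placeholders})",
        codes)
    return dict(cur.fetchall())

# In-memory demo database with one known product.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE products (code TEXT PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO products VALUES ('012345678905', 'Granola 500g')")

codes = decode_indicia({"embedded_codes": ["012345678905"]})
print(lookup_items(codes, db))  # {'012345678905': 'Granola 500g'}
```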


In an embodiment, identifying the quantity can include identifying a current quantity and/or a stocking quantity based at least in part on the one or more images, and/or based at least in part on the one or more utterances.


The method 200 can include generating the planogram in a two-dimensional graphic corresponding to the view of the scene and/or in a three-dimensional graphic corresponding to the view of the scene. Additionally or alternatively, the method 200 can include transposing the one or more images to provide a view of the scene from a different orientation, wherein the planogram comprises a visual representation corresponding to the different orientation.
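
As a rough illustration of the two-dimensional case, the sketch below lays item codes out on a hypothetical shelf-by-bay grid and prints it as text; an actual implementation would render a 2D (or 3D) graphic of the scene:

```python
# Hypothetical two-dimensional planogram rendering.
def render_planogram(items, shelves, bays):
    """items maps an identification code to a (shelf, bay) cell."""
    grid = [["........" for _ in range(bays)] for _ in range(shelves)]
    for code, (shelf, bay) in items.items():
        grid[shelf][bay] = code[:8].ljust(8, ".")
    return "\n".join(" | ".join(row) for row in grid)

print(render_planogram({"01234567": (0, 1), "89012345": (1, 0)},
                       shelves=2, bays=3))
```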


In an embodiment, the planogram can include instructions for positioning the one or more inventory items on the one or more fixture attributes. The method 200 can include storing the planogram in a database. Additionally or alternatively, the method 200 can include storing at least one of the following aspects of information pertaining to the one or more inventory items in one or more nascent database entries: the visual representation of the scene; the description of the at least one of the one or more inventory items in the scene and of the corresponding identification codes; the description of the location corresponding to at least one of the one or more inventory items in the scene; the description of the quantity corresponding to at least one of the one or more inventory items in the scene; the description of the facing arrangement corresponding to at least one of the one or more inventory items in the scene; the description of the floor layout corresponding to at least one of the one or more inventory items in the scene; the description of the fixture attribute corresponding to at least one of the one or more inventory items in the scene; and the description of the placement of the at least one of the one or more inventory items in the scene in respect of the one or more attributes.
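
The nascent entries could be realized as an insert-or-replace keyed on the identification code, so that items unknown to the database get fresh rows, as in this sketch (table and column names are hypothetical):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE planogram_items (
    code TEXT PRIMARY KEY, location TEXT, quantity INTEGER)""")

def store_entry(db, code, location, quantity):
    # Creates a nascent row for a new code, or refreshes an existing one.
    db.execute("INSERT OR REPLACE INTO planogram_items VALUES (?, ?, ?)",
               (code, location, quantity))

store_entry(db, "012345678905", "aisle 5, shelf 2", 12)
print(db.execute("SELECT * FROM planogram_items").fetchall())
```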


In an embodiment, at least one of the following aspects of information pertaining to the one or more inventory items may have been unavailable prior to receiving the one or more images and/or prior to receiving the one or more utterances: the visual representation of the scene; the description of the at least one of the one or more inventory items in the scene and of the corresponding identification codes; the description of the location corresponding to at least one of the one or more inventory items in the scene; the description of the quantity corresponding to at least one of the one or more inventory items in the scene; the description of the facing arrangement corresponding to at least one of the one or more inventory items in the scene; the description of the floor layout corresponding to at least one of the one or more inventory items in the scene; the description of the fixture attribute corresponding to at least one of the one or more inventory items in the scene; and the description of the placement of the at least one of the one or more inventory items in the scene in respect of the one or more attributes.


In an embodiment, the description of the location and/or the description of the quantity corresponding to the at least one of the one or more inventory items in the scene can include coordinates corresponding to one or more of: the location corresponding to the at least one of the one or more inventory items; the facing arrangement corresponding to the at least one of the one or more inventory items; the floor layout corresponding to the at least one of the one or more inventory items; and the fixture attribute corresponding to the at least one of the one or more inventory items. Additionally or alternatively, the description of the location and/or the quantity corresponding to the at least one of the one or more inventory items can include a current quantity and/or a stocking quantity.


In an embodiment, the one or more utterances from the user can be received contemporaneously with the one or more images from the imaging device. Alternatively, the one or more utterances from the user can be received separately in time from the one or more images from the imaging device. The method 200 can include receiving the one or more utterances from a first user operating the voice recognition system, and receiving the one or more images from a second user operating the imaging device.


In an embodiment, the view of the scene can include a retail environment selected from the group consisting of: a softline retailer; a grocery retailer; a food retailer; a convenience retailer; a hardline retailer; and a specialty retailer. Additionally or alternatively, the view of the scene can include a retail environment selected from the group consisting of: a department store; a clothing store; a footwear store; a toiletries store; a cosmetics store; a pharmacy; an office-supply store; a discount outlet; a grocery store; a supermarket; a hypermarket; a convenience store; a big-box store; a restaurant; a fruit stand; a bakery; a coffee shop; a farmer's market; a home-improvement store; a hardware store; a warehouse club; an electronics store; an automobile dealership; an appliance store; a furniture store; a sporting goods store; a lumber yard; a bookstore; an art gallery; a craft store; a music store; a musical instrument store; a boutique; a jewelry store; a gift shop; an arcade; a bazaar; a toy store; a category killer; a chain store; a concept store; a co-operative store; a destination store; a general store; a mall; a kiosk; a pop-up retail store; and a retail market.


The method 200 can include providing stocking instructions based at least in part on the planogram. The planogram may characterize a customized store layout. The customized store layout can be selected based at least in part on localized customer desires and/or localized demand. The planogram may incorporate corporate-level business rules and/or best practices pertaining to product placement.


Additionally, the method 200 can include obtaining sales data corresponding to at least one of the one or more inventory items, the sales data stored in a database; calculating a performance metric corresponding to at least one of the one or more inventory items; and characterizing the at least one of the one or more inventory items to the planogram based at least in part on the performance metric. The planogram can include a modification to one or more of: the location corresponding to the at least one of the one or more inventory items in the scene; the quantity corresponding to the at least one of the one or more inventory items in the scene; the facing arrangement corresponding to the at least one of the one or more inventory items in the scene; the floor layout corresponding to the at least one of the one or more inventory items in the scene; and the fixture attribute corresponding to the at least one of the one or more inventory items. In an embodiment, the performance metric can include one or more of: sales value; gross margin; profit margin; inventory turn; customer conversion ratio; shelf space; and items per purchase.
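
By way of example only, a very simple version of such a metric-driven modification pass might compute sales value per unit of shelf space and flag items for more or less facing; the threshold, field names, and suggestions below are illustrative assumptions:

```python
# Hypothetical metric-driven modification pass.
def suggest_modifications(items, threshold=10.0):
    """items maps a code to (sales_value, shelf_space_cm)."""
    changes = {}
    for code, (sales_value, shelf_space_cm) in items.items():
        metric = sales_value / shelf_space_cm  # value per cm of shelf
        changes[code] = ("expand facing" if metric > threshold
                         else "reduce facing")
    return changes

# The first item earns more per centimeter of shelf than the second.
print(suggest_modifications({"0123": (1500.0, 60.0), "0456": (200.0, 90.0)}))
```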


In an embodiment, the method 200 can include the user capturing the one or more images using the imaging device while working in a retail environment in which the one or more inventory items are sold, the working at least in part comprising a task customarily performed by a worker in the retail environment. The imaging device can include a camera and/or a scanner.


Additionally, the method 200 can include receiving one or more manual inputs from a hand-operated input device, wherein the planogram is based at least in part on the one or more manual inputs. The method 200 can also include providing a workflow to the user while the user is working in a retail environment in which the one or more inventory items are sold. The workflow can be configured to direct the user to: capture the one or more images using the imaging device while working in the retail environment; and/or provide the one or more utterances using the voice recognition system; wherein the working at least in part comprises a task customarily performed by a worker in the retail environment. The workflow can include a voice-directed workflow, the voice-directed workflow provided by an audio headset. Additionally or alternatively, the workflow can include a visually directed workflow, the visually directed workflow provided on a screen of a mobile device.
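
A sketch of such a voice-directed workflow loop is shown below; the prompt and listen callables are placeholders for the headset's speech output and input, and the step wording is illustrative only:

```python
# Hypothetical voice-directed workflow loop.
WORKFLOW_STEPS = [
    "Say the product location",
    "Scan the product barcode",
    "Scan the shelf label",
    "Take a picture of the product",
    "Say the current inventory count",
]

def run_workflow(prompt, listen):
    responses = {}
    for step in WORKFLOW_STEPS:
        prompt(step)                # speak (or display) the instruction
        responses[step] = listen()  # collect the utterance/scan/image
    return responses

# Console stand-ins for the headset interface.
demo = iter(["aisle 5 shelf 2", "012345678905", "SHELF-A5-2",
             "img_0001.jpg", "12 units"])
print(run_workflow(print, lambda: next(demo)))
```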


In an embodiment, the location corresponding to the at least one of the one or more inventory items in the scene may be unavailable from any database resource prior to receiving the one or more images and/or prior to receiving the one or more utterances. The workflow may be prepared in advance based at least in part on known attributes of the retail environment. Additionally or alternatively, the workflow may be generated in real-time while the user is working in a retail environment in which the one or more inventory items are sold. The method 200 can include using the one or more images and/or the one or more utterances to generate a workflow in real-time while the user is working in a retail environment in which the one or more inventory items are sold.


Device and method components are meant to show only those specific details that are pertinent to understanding the embodiments of the present disclosure, so as not to obscure the disclosure with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein. In various embodiments, the sequence in which the elements appear in the exemplary embodiments disclosed herein may vary. Two or more method steps may be performed simultaneously or in a different order than the sequence in which the elements appear in the exemplary embodiments.


To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. No. 6,832,725; U.S. Pat. No. 7,128,266; U.S. Pat. No. 7,159,783; U.S. Pat. No. 7,413,1279; U.S. Pat. No. 7,726,575; U.S. Pat. No. 8,294,969; U.S. Pat. No. 8,317,105; U.S. Pat. No. 8,322,622; U.S. Pat. No. 8,366,005; U.S. Pat. No. 8,371,507; U.S. Pat. No. 8,376,233; U.S. Pat. No. 8,381,979; U.S. Pat. No. 8,390,909; U.S. Pat. No. 8,408,464; U.S. Pat. No. 8,408,468; U.S. Pat. No. 8,408,469; U.S. Pat. No. 8,424,768; U.S. Pat. No. 8,448,863; U.S. Pat. No. 8,457,013; U.S. Pat. No. 8,459,557; U.S. Pat. No. 8,469,272; U.S. Pat. No. 8,474,712; U.S. Pat. No. 8,479,992; U.S. Pat. No. 8,490,877; U.S. Pat. No. 8,517,271; U.S. Pat. No. 8,523,076; U.S. Pat. No. 8,528,818; U.S. Pat. No. 8,544,737; U.S. Pat. No. 8,548,242; U.S. Pat. No. 8,548,420; U.S. Pat. No. 8,550,335; U.S. Pat. No. 8,550,354; U.S. Pat. No. 8,550,357; U.S. Pat. No. 8,556,174; U.S. Pat. No. 8,556,176; U.S. Pat. No. 8,556,177; U.S. Pat. No. 8,559,767; U.S. Pat. No. 8,599,957; U.S. Pat. No. 8,561,895; U.S. Pat. No. 8,561,903; U.S. Pat. No. 8,561,905; U.S. Pat. No. 8,565,107; U.S. Pat. No. 8,571,307; U.S. Pat. No. 8,579,200; U.S. Pat. No. 8,583,924; U.S. Pat. No. 8,584,945; U.S. Pat. No. 8,587,595; U.S. Pat. No. 8,587,697; U.S. Pat. No. 8,588,869; U.S. Pat. No. 8,590,789; U.S. Pat. No. 8,596,539; U.S. Pat. No. 8,596,542; U.S. Pat. No. 8,596,543; U.S. Pat. No. 8,599,271; U.S. Pat. No. 8,599,957; U.S. Pat. No. 8,600,158; U.S. Pat. No. 8,600,167; U.S. Pat. No. 8,602,309; U.S. Pat. No. 8,608,053; U.S. Pat. No. 8,608,071; U.S. Pat. No. 8,611,309; U.S. Pat. No. 8,615,487; U.S. Pat. No. 8,616,454; U.S. Pat. No. 8,621,123; U.S. Pat. No. 8,622,303; U.S. Pat. No. 8,628,013; U.S. Pat. No. 8,628,015; U.S. Pat. No. 8,628,016; U.S. Pat. No. 8,629,926; U.S. Pat. No. 8,630,491; U.S. Pat. No. 8,635,309; U.S. Pat. No. 8,636,200; U.S. Pat. No. 8,636,212; U.S. Pat. No. 8,636,215; U.S. Pat. No. 8,636,224; U.S. Pat. No. 8,638,806; U.S. Pat. No. 8,640,958; U.S. Pat. No. 8,640,960; U.S. Pat. No. 8,643,717; U.S. Pat. No. 8,646,692; U.S. Pat. No. 8,646,694; U.S. Pat. No. 8,657,200; U.S. Pat. No. 8,659,397; U.S. Pat. No. 8,668,149; U.S. Pat. No. 8,678,285; U.S. Pat. No. 8,678,286; U.S. Pat. No. 8,682,077; U.S. Pat. No. 8,687,282; U.S. Pat. No. 8,692,927; U.S. Pat. No. 8,695,880; U.S. Pat. No. 8,698,949; U.S. Pat. No. 8,717,494; U.S. Pat. No. 8,717,494; U.S. Pat. No. 8,720,783; U.S. Pat. No. 8,723,804; U.S. Pat. No. 8,723,904; U.S. Pat. No. 8,727,223; U.S. Pat. No. 8,740,082; U.S. Pat. No. 8,740,085; U.S. Pat. No. 8,746,563; U.S. Pat. No. 8,750,445; U.S. Pat. No. 8,752,766; U.S. Pat. No. 8,756,059; U.S. Pat. No. 8,757,495; U.S. Pat. No. 8,760,563; U.S. Pat. No. 8,763,909; U.S. Pat. No. 8,777,108; U.S. Pat. No. 8,777,109; U.S. Pat. No. 8,779,898; U.S. Pat. No. 8,781,520; U.S. Pat. No. 8,783,573; U.S. Pat. No. 8,789,757; U.S. Pat. No. 8,789,758; U.S. Pat. No. 8,789,759; U.S. Pat. No. 8,794,520; U.S. Pat. No. 8,794,522; U.S. Pat. No. 8,794,525; U.S. Pat. No. 8,794,526; U.S. Pat. No. 8,798,367; U.S. Pat. No. 8,807,431; U.S. Pat. No. 8,807,432; U.S. Pat. No. 8,820,630; U.S. Pat. No. 8,822,848; U.S. Pat. No. 8,824,692; U.S. Pat. No. 8,824,696; U.S. Pat. No. 8,842,849; U.S. Pat. No. 8,844,822; U.S. Pat. No. 8,844,823; U.S. Pat. No. 8,849,019; U.S. Pat. No. 8,851,383; U.S. Pat. No. 8,854,633; U.S. Pat. No. 8,866,963; U.S. Pat. No. 8,868,421; U.S. Pat. No. 8,868,519; U.S. Pat. No. 8,868,802; U.S. Pat. No. 8,868,803; U.S. Pat. No. 8,870,074; U.S. Pat. No. 8,879,639; U.S. Pat. No. 8,880,426; U.S. Pat. No. 8,881,983; U.S. Pat. No. 8,881,987; U.S. Pat. No. 
8,903,172; U.S. Pat. No. 8,908,995; U.S. Pat. No. 8,910,870; U.S. Pat. No. 8,910,875; U.S. Pat. No. 8,914,290; U.S. Pat. No. 8,914,788; U.S. Pat. No. 8,915,439; U.S. Pat. No. 8,915,444; U.S. Pat. No. 8,916,789; U.S. Pat. No. 8,918,250; U.S. Pat. No. 8,918,564; U.S. Pat. No. 8,925,818; U.S. Pat. No. 8,939,374; U.S. Pat. No. 8,942,480; U.S. Pat. No. 8,944,313; U.S. Pat. No. 8,944,327; U.S. Pat. No. 8,944,332; U.S. Pat. No. 8,950,678; U.S. Pat. No. 8,967,468; U.S. Pat. No. 8,971,346; U.S. Pat. No. 8,976,030; U.S. Pat. No. 8,976,368; U.S. Pat. No. 8,978,981; U.S. Pat. No. 8,978,983; U.S. Pat. No. 8,978,984; U.S. Pat. No. 8,985,456; U.S. Pat. No. 8,985,457; U.S. Pat. No. 8,985,459; U.S. Pat. No. 8,985,461; U.S. Pat. No. 8,988,578; U.S. Pat. No. 8,988,590; U.S. Pat. No. 8,991,704; U.S. Pat. No. 8,996,194; U.S. Pat. No. 8,996,384; U.S. Pat. No. 9,002,641; U.S. Pat. No. 9,007,368; U.S. Pat. No. 9,010,641; U.S. Pat. No. 9,015,513; U.S. Pat. No. 9,016,576; U.S. Pat. No. 9,022,288; U.S. Pat. No. 9,030,964; U.S. Pat. No. 9,033,240; U.S. Pat. No. 9,033,242; U.S. Pat. No. 9,036,054; U.S. Pat. No. 9,037,344; U.S. Pat. No. 9,038,911; U.S. Pat. No. 9,038,915; U.S. Pat. No. 9,047,098; U.S. Pat. No. 9,047,359; U.S. Pat. No. 9,047,420; U.S. Pat. No. 9,047,525; U.S. Pat. No. 9,047,531; U.S. Pat. No. 9,053,055; U.S. Pat. No. 9,053,378; U.S. Pat. No. 9,053,380; U.S. Pat. No. 9,058,526; U.S. Pat. No. 9,064,165; U.S. Pat. No. 9,064,165; U.S. Pat. No. 9,064,167; U.S. Pat. No. 9,064,168; U.S. Pat. No. 9,064,254; U.S. Pat. No. 9,066,032; U.S. Pat. No. 9,070,032; U.S. Pat. No. 9,076,459; U.S. Pat. No. 9,079,423; U.S. Pat. No. 9,080,856; U.S. Pat. No. 9,082,023; U.S. Pat. No. 9,082,031; U.S. Pat. No. 9,084,032; U.S. Pat. No. 9,087,250; U.S. Pat. No. 9,092,681; U.S. Pat. No. 9,092,682; U.S. Pat. No. 9,092,683; U.S. Pat. No. 9,093,141; U.S. Pat. No. 9,098,763; U.S. Pat. No. 9,104,929; U.S. Pat. No. 9,104,934; U.S. Pat. No. 9,107,484; U.S. Pat. No. 9,111,159; U.S. Pat. No. 9,111,166; U.S. Pat. No. 9,135,483; U.S. Pat. No. 9,137,009; U.S. Pat. No. 9,141,839; U.S. Pat. No. 9,147,096; U.S. Pat. No. 9,148,474; U.S. Pat. No. 9,158,000; U.S. Pat. No. 9,158,340; U.S. Pat. No. 9,158,953; U.S. Pat. No. 9,159,059; U.S. Pat. No. 9,165,174; U.S. Pat. No. 9,171,543; U.S. Pat. No. 9,183,425; U.S. Pat. No. 9,189,669; U.S. Pat. No. 9,195,844; U.S. Pat. No. 9,202,458; U.S. Pat. No. 9,208,366; U.S. Pat. No. 9,208,367; U.S. Pat. No. 9,219,836; U.S. Pat. No. 9,224,024; U.S. Pat. No. 9,224,027; U.S. Pat. No. 9,230,140; U.S. Pat. No. 9,235,553; U.S. Pat. No. 9,239,950; U.S. Pat. No. 9,245,492; U.S. Pat. No. 9,248,640; U.S. Pat. No. 9,250,652; U.S. Pat. No. 9,250,712; U.S. Pat. No. 9,251,411; U.S. Pat. No. 9,258,033; U.S. Pat. No. 9,262,633; U.S. Pat. No. 9,262,660; U.S. Pat. No. 9,262,662; U.S. Pat. No. 9,269,036; U.S. Pat. No. 9,270,782; U.S. Pat. No. 9,274,812; U.S. Pat. No. 9,275,388; U.S. Pat. No. 9,277,668; U.S. Pat. No. 9,280,693; U.S. Pat. No. 9,286,496; U.S. Pat. No. 9,298,964; U.S. Pat. No. 9,301,427; U.S. Pat. No. 9,313,377; U.S. Pat. No. 9,317,037; U.S. Pat. No. 9,319,548; U.S. Pat. No. 9,342,723; U.S. Pat. No. 9,361,882; U.S. Pat. No. 9,365,381; U.S. Pat. No. 9,373,018; U.S. Pat. No. 9,375,945; U.S. Pat. No. 9,378,403; U.S. Pat. No. 9,383,848; U.S. Pat. No. 9,384,374; U.S. Pat. No. 9,390,304; U.S. Pat. No. 9,390,596; U.S. Pat. No. 9,411,386; U.S. Pat. No. 9,412,242; U.S. Pat. No. 9,418,269; U.S. Pat. No. 9,418,270; U.S. Pat. No. 9,465,967; U.S. Pat. No. 9,423,318; U.S. Pat. No. 9,424,454; U.S. Pat. No. 9,436,860; U.S. Pat. No. 
9,443,123; U.S. Pat. No. 9,443,222; U.S. Pat. No. 9,454,689; U.S. Pat. No. 9,464,885; U.S. Pat. No. 9,465,967; U.S. Pat. No. 9,478,983; U.S. Pat. No. 9,481,186; U.S. Pat. No. 9,487,113; U.S. Pat. No. 9,488,986; U.S. Pat. No. 9,489,782; U.S. Pat. No. 9,490,540; U.S. Pat. No. 9,491,729; U.S. Pat. No. 9,497,092; U.S. Pat. No. 9,507,974; U.S. Pat. No. 9,519,814; U.S. Pat. No. 9,521,331; U.S. Pat. No. 9,530,038; U.S. Pat. No. 9,572,901; U.S. Pat. No. 9,558,386; U.S. Pat. No. 9,606,581; U.S. Pat. No. 9,646,189; U.S. Pat. No. 9,646,191; U.S. Pat. No. 9,652,648; U.S. Pat. No. 9,652,653; U.S. Pat. No. 9,656,487; U.S. Pat. No. 9,659,198; U.S. Pat. No. 9,680,282; U.S. Pat. No. 9,697,401; U.S. Pat. No. 9,701,140; U.S. Design Pat. No. D702,237; U.S. Design Pat. No. D716,285; U.S. Design Pat. No. D723,560; U.S. Design Pat. No. D730,357; U.S. Design Pat. No. D730,901; U.S. Design Pat. No. D730,902; U.S. Design Pat. No. D734,339; U.S. Design Pat. No. D737,321; U.S. Design Pat. No. D754,205; U.S. Design Pat. No. D754,206; U.S. Design Pat. No. D757,009; U.S. Design Pat. No. D760,719; U.S. Design Pat. No. D762,604; U.S. Design Pat. No. D766,244; U.S. Design Pat. No. D777,166; U.S. Design Pat. No. D771,631; U.S. Design Pat. No. D783,601; U.S. Design Pat. No. D785,617; U.S. Design Pat. No. D785,636; U.S. Design Pat. No. D790,505; U.S. Design Pat. No. D790,546; International Publication No. 2013/163789; U.S. Patent Application Publication No. 2008/0185432; U.S. Patent Application Publication No. 2009/0134221; U.S. Patent Application Publication No. 2010/0177080; U.S. Patent Application Publication No. 2010/0177076; U.S. Patent Application Publication No. 2010/0177707; U.S. Patent Application Publication No. 2010/0177749; U.S. Patent Application Publication No. 2010/0265880; U.S. Patent Application Publication No. 2011/0202554; U.S. Patent Application Publication No. 2012/0111946; U.S. Patent Application Publication No. 2012/0168511; U.S. Patent Application Publication No. 2012/0168512; U.S. Patent Application Publication No. 2012/0193423; U.S. Patent Application Publication No. 2012/0194692; U.S. Patent Application Publication No. 2012/0203647; U.S. Patent Application Publication No. 2012/0223141; U.S. Patent Application Publication No. 2012/0228382; U.S. Patent Application Publication No. 2012/0248188; U.S. Patent Application Publication No. 2013/0043312; U.S. Patent Application Publication No. 2013/0082104; U.S. Patent Application Publication No. 2013/0175341; U.S. Patent Application Publication No. 2013/0175343; U.S. Patent Application Publication No. 2013/0257744; U.S. Patent Application Publication No. 2013/0257759; U.S. Patent Application Publication No. 2013/0270346; U.S. Patent Application Publication No. 2013/0292475; U.S. Patent Application Publication No. 2013/0292477; U.S. Patent Application Publication No. 2013/0293539; U.S. Patent Application Publication No. 2013/0293540; U.S. Patent Application Publication No. 2013/0306728; U.S. Patent Application Publication No. 2013/0306731; U.S. Patent Application Publication No. 2013/0307964; U.S. Patent Application Publication No. 2013/0308625; U.S. Patent Application Publication No. 2013/0313324; U.S. Patent Application Publication No. 2013/0332996; U.S. Patent Application Publication No. 2014/0001267; U.S. Patent Application Publication No. 2014/0025584; U.S. Patent Application Publication No. 2014/0034734; U.S. Patent Application Publication No. 2014/0036848; U.S. Patent Application Publication No. 2014/0039693; U.S. Patent Application Publication No. 
2014/0049120; U.S. Patent Application Publication No. 2014/0049635; U.S. Patent Application Publication No. 2014/0061306; U.S. Patent Application Publication No. 2014/0063289; U.S. Patent Application Publication No. 2014/0066136; U.S. Patent Application Publication No. 2014/0067692; U.S. Patent Application Publication No. 2014/0070005; U.S. Patent Application Publication No. 2014/0071840; U.S. Patent Application Publication No. 2014/0074746; U.S. Patent Application Publication No. 2014/0076974; U.S. Patent Application Publication No. 2014/0097249; U.S. Patent Application Publication No. 2014/0098792; U.S. Patent Application Publication No. 2014/0100813; U.S. Patent Application Publication No. 2014/0103115; U.S. Patent Application Publication No. 2014/0104413; U.S. Patent Application Publication No. 2014/0104414; U.S. Patent Application Publication No. 2014/0104416; U.S. Patent Application Publication No. 2014/0106725; U.S. Patent Application Publication No. 2014/0108010; U.S. Patent Application Publication No. 2014/0108402; U.S. Patent Application Publication No. 2014/0110485; U.S. Patent Application Publication No. 2014/0125853; U.S. Patent Application Publication No. 2014/0125999; U.S. Patent Application Publication No. 2014/0129378; U.S. Patent Application Publication No. 2014/0131443; U.S. Patent Application Publication No. 2014/0133379; U.S. Patent Application Publication No. 2014/0136208; U.S. Patent Application Publication No. 2014/0140585; U.S. Patent Application Publication No. 2014/0152882; U.S. Patent Application Publication No. 2014/0158770; U.S. Patent Application Publication No. 2014/0159869; U.S. Patent Application Publication No. 2014/0166759; U.S. Patent Application Publication No. 2014/0168787; U.S. Patent Application Publication No. 2014/0175165; U.S. Patent Application Publication No. 2014/0191684; U.S. Patent Application Publication No. 2014/0191913; U.S. Patent Application Publication No. 2014/0197304; U.S. Patent Application Publication No. 2014/0214631; U.S. Patent Application Publication No. 2014/0217166; U.S. Patent Application Publication No. 2014/0231500; U.S. Patent Application Publication No. 2014/0247315; U.S. Patent Application Publication No. 2014/0263493; U.S. Patent Application Publication No. 2014/0263645; U.S. Patent Application Publication No. 2014/0270196; U.S. Patent Application Publication No. 2014/0270229; U.S. Patent Application Publication No. 2014/0278387; U.S. Patent Application Publication No. 2014/0288933; U.S. Patent Application Publication No. 2014/0297058; U.S. Patent Application Publication No. 2014/0299665; U.S. Patent Application Publication No. 2014/0332590; U.S. Patent Application Publication No. 2014/0351317; U.S. Patent Application Publication No. 2014/0362184; U.S. Patent Application Publication No. 2014/0363015; U.S. Patent Application Publication No. 2014/0369511; U.S. Patent Application Publication No. 2014/0374483; U.S. Patent Application Publication No. 2014/0374485; U.S. Patent Application Publication No. 2015/0001301; U.S. Patent Application Publication No. 2015/0001304; U.S. Patent Application Publication No. 2015/0009338; U.S. Patent Application Publication No. 2015/0014416; U.S. Patent Application Publication No. 2015/0021397; U.S. Patent Application Publication No. 2015/0028104; U.S. Patent Application Publication No. 2015/0029002; U.S. Patent Application Publication No. 2015/0032709; U.S. Patent Application Publication No. 2015/0039309; U.S. Patent Application Publication No. 2015/0039878; U.S. 
Patent Application Publication No. 2015/0040378; U.S. Patent Application Publication No. 2015/0049347; U.S. Patent Application Publication No. 2015/0051992; U.S. Patent Application Publication No. 2015/0053769; U.S. Patent Application Publication No. 2015/0062366; U.S. Patent Application Publication No. 2015/0063215; U.S. Patent Application Publication No. 2015/0088522; U.S. Patent Application Publication No. 2015/0096872; U.S. Patent Application Publication No. 2015/0100196; U.S. Patent Application Publication No. 2015/0102109; U.S. Patent Application Publication No. 2015/0115035; U.S. Patent Application Publication No. 2015/0127791; U.S. Patent Application Publication No. 2015/0128116; U.S. Patent Application Publication No. 2015/0133047; U.S. Patent Application Publication No. 2015/0134470; U.S. Patent Application Publication No. 2015/0136851; U.S. Patent Application Publication No. 2015/0142492; U.S. Patent Application Publication No. 2015/0144692; U.S. Patent Application Publication No. 2015/0144698; U.S. Patent Application Publication No. 2015/0149946; U.S. Patent Application Publication No. 2015/0161429; U.S. Patent Application Publication No. 2015/0178523; U.S. Patent Application Publication No. 2015/0178537; U.S. Patent Application Publication No. 2015/0178685; U.S. Patent Application Publication No. 2015/0181109; U.S. Patent Application Publication No. 2015/0199957; U.S. Patent Application Publication No. 2015/0210199; U.S. Patent Application Publication No. 2015/0212565; U.S. Patent Application Publication No. 2015/0213647; U.S. Patent Application Publication No. 2015/0220753; U.S. Patent Application Publication No. 2015/0220901; U.S. Patent Application Publication No. 2015/0227189; U.S. Patent Application Publication No. 2015/0236984; U.S. Patent Application Publication No. 2015/0239348; U.S. Patent Application Publication No. 2015/0242658; U.S. Patent Application Publication No. 2015/0248572; U.S. Patent Application Publication No. 2015/0254485; U.S. Patent Application Publication No. 2015/0261643; U.S. Patent Application Publication No. 2015/0264624; U.S. Patent Application Publication No. 2015/0268971; U.S. Patent Application Publication No. 2015/0269402; U.S. Patent Application Publication No. 2015/0288689; U.S. Patent Application Publication No. 2015/0288896; U.S. Patent Application Publication No. 2015/0310243; U.S. Patent Application Publication No. 2015/0310244; U.S. Patent Application Publication No. 2015/0310389; U.S. Patent Application Publication No. 2015/0312780; U.S. Patent Application Publication No. 2015/0327012; U.S. Patent Application Publication No. 2016/0014251; U.S. Patent Application Publication No. 2016/0025697; U.S. Patent Application Publication No. 2016/0026838; U.S. Patent Application Publication No. 2016/0026839; U.S. Patent Application Publication No. 2016/0040982; U.S. Patent Application Publication No. 2016/0042241; U.S. Patent Application Publication No. 2016/0057230; U.S. Patent Application Publication No. 2016/0062473; U.S. Patent Application Publication No. 2016/0070944; U.S. Patent Application Publication No. 2016/0092805; U.S. Patent Application Publication No. 2016/0101936; U.S. Patent Application Publication No. 2016/0104019; U.S. Patent Application Publication No. 2016/0104274; U.S. Patent Application Publication No. 2016/0109219; U.S. Patent Application Publication No. 2016/0109220; U.S. Patent Application Publication No. 2016/0109224; U.S. Patent Application Publication No. 2016/0112631; U.S. Patent Application Publication No. 
2016/0112643; U.S. Patent Application Publication No. 2016/0117627; U.S. Patent Application Publication No. 2016/0124516; U.S. Patent Application Publication No. 2016/0125217; U.S. Patent Application Publication No. 2016/0125342; U.S. Patent Application Publication No. 2016/0125873; U.S. Patent Application Publication No. 2016/0133253; U.S. Patent Application Publication No. 2016/0171597; U.S. Patent Application Publication No. 2016/0171666; U.S. Patent Application Publication No. 2016/0171720; U.S. Patent Application Publication No. 2016/0171775; U.S. Patent Application Publication No. 2016/0171777; U.S. Patent Application Publication No. 2016/0174674; U.S. Patent Application Publication No. 2016/0178479; U.S. Patent Application Publication No. 2016/0178685; U.S. Patent Application Publication No. 2016/0178707; U.S. Patent Application Publication No. 2016/0179132; U.S. Patent Application Publication No. 2016/0179143; U.S. Patent Application Publication No. 2016/0179368; U.S. Patent Application Publication No. 2016/0179378; U.S. Patent Application Publication No. 2016/0180130; U.S. Patent Application Publication No. 2016/0180133; U.S. Patent Application Publication No. 2016/0180136; U.S. Patent Application Publication No. 2016/0180594; U.S. Patent Application Publication No. 2016/0180663; U.S. Patent Application Publication No. 2016/0180678; U.S. Patent Application Publication No. 2016/0180713; U.S. Patent Application Publication No. 2016/0185136; U.S. Patent Application Publication No. 2016/0185291; U.S. Patent Application Publication No. 2016/0186926; U.S. Patent Application Publication No. 2016/0188861; U.S. Patent Application Publication No. 2016/0188939; U.S. Patent Application Publication No. 2016/0188940; U.S. Patent Application Publication No. 2016/0188941; U.S. Patent Application Publication No. 2016/0188942; U.S. Patent Application Publication No. 2016/0188943; U.S. Patent Application Publication No. 2016/0188944; U.S. Patent Application Publication No. 2016/0189076; U.S. Patent Application Publication No. 2016/0189087; U.S. Patent Application Publication No. 2016/0189088; U.S. Patent Application Publication No. 2016/0189092; U.S. Patent Application Publication No. 2016/0189284; U.S. Patent Application Publication No. 2016/0189288; U.S. Patent Application Publication No. 2016/0189366; U.S. Patent Application Publication No. 2016/0189443; U.S. Patent Application Publication No. 2016/0189447; U.S. Patent Application Publication No. 2016/0189489; U.S. Patent Application Publication No. 2016/0192051; U.S. Patent Application Publication No. 2016/0202951; U.S. Patent Application Publication No. 2016/0202958; U.S. Patent Application Publication No. 2016/0202959; U.S. Patent Application Publication No. 2016/0203021; U.S. Patent Application Publication No. 2016/0203429; U.S. Patent Application Publication No. 2016/0203797; U.S. Patent Application Publication No. 2016/0203820; U.S. Patent Application Publication No. 2016/0204623; U.S. Patent Application Publication No. 2016/0204636; U.S. Patent Application Publication No. 2016/0204638; U.S. Patent Application Publication No. 2016/0227912; U.S. Patent Application Publication No. 2016/0232891; U.S. Patent Application Publication No. 2016/0292477; U.S. Patent Application Publication No. 2016/0294779; U.S. Patent Application Publication No. 2016/0306769; U.S. Patent Application Publication No. 2016/0314276; U.S. Patent Application Publication No. 2016/0314294; U.S. Patent Application Publication No. 2016/0316190; U.S. 
Patent Application Publication No. 2016/0323310; U.S. Patent Application Publication No. 2016/0325677; U.S. Patent Application Publication No. 2016/0327614; U.S. Patent Application Publication No. 2016/0327930; U.S. Patent Application Publication No. 2016/0328762; U.S. Patent Application Publication No. 2016/0330218; U.S. Patent Application Publication No. 2016/0343163; U.S. Patent Application Publication No. 2016/0343176; U.S. Patent Application Publication No. 2016/0364914; U.S. Patent Application Publication No. 2016/0370220; U.S. Patent Application Publication No. 2016/0372282; U.S. Patent Application Publication No. 2016/0373847; U.S. Patent Application Publication No. 2016/0377414; U.S. Patent Application Publication No. 2016/0377417; U.S. Patent Application Publication No. 2017/0010141; U.S. Patent Application Publication No. 2017/0010328; U.S. Patent Application Publication No. 2017/0010780; U.S. Patent Application Publication No. 2017/0016714; U.S. Patent Application Publication No. 2017/0018094; U.S. Patent Application Publication No. 2017/0046603; U.S. Patent Application Publication No. 2017/0047864; U.S. Patent Application Publication No. 2017/0053146; U.S. Patent Application Publication No. 2017/0053147; U.S. Patent Application Publication No. 2017/0053647; U.S. Patent Application Publication No. 2017/0055606; U.S. Patent Application Publication No. 2017/0060316; U.S. Patent Application Publication No. 2017/0061961; U.S. Patent Application Publication No. 2017/0064634; U.S. Patent Application Publication No. 2017/0083730; U.S. Patent Application Publication No. 2017/0091502; U.S. Patent Application Publication No. 2017/0091706; U.S. Patent Application Publication No. 2017/0091741; U.S. Patent Application Publication No. 2017/0091904; U.S. Patent Application Publication No. 2017/0092908; U.S. Patent Application Publication No. 2017/0094238; U.S. Patent Application Publication No. 2017/0098947; U.S. Patent Application Publication No. 2017/0100949; U.S. Patent Application Publication No. 2017/0108838; U.S. Patent Application Publication No. 2017/0108895; U.S. Patent Application Publication No. 2017/0118355; U.S. Patent Application Publication No. 2017/0123598; U.S. Patent Application Publication No. 2017/0124369; U.S. Patent Application Publication No. 2017/0124396; U.S. Patent Application Publication No. 2017/0124687; U.S. Patent Application Publication No. 2017/0126873; U.S. Patent Application Publication No. 2017/0126904; U.S. Patent Application Publication No. 2017/0139012; U.S. Patent Application Publication No. 2017/0140329; U.S. Patent Application Publication No. 2017/0140731; U.S. Patent Application Publication No. 2017/0147847; U.S. Patent Application Publication No. 2017/0150124; U.S. Patent Application Publication No. 2017/0169198; U.S. Patent Application Publication No. 2017/0171035; U.S. Patent Application Publication No. 2017/0171703; U.S. Patent Application Publication No. 2017/0171803; U.S. Patent Application Publication No. 2017/0180359; U.S. Patent Application Publication No. 2017/0180577; U.S. Patent Application Publication No. 2017/0181299; U.S. Patent Application Publication No. 2017/0190192; U.S. Patent Application Publication No. 2017/0193432; U.S. Patent Application Publication No. 2017/0193461; U.S. Patent Application Publication No. 2017/0193727; U.S. Patent Application Publication No. 2017/0199266; U.S. Patent Application Publication No. 2017/0200108; and U.S. Patent Application Publication No. 2017/0200275.


The following represent Exemplary Embodiments:


A1. A method of characterizing one or more inventory items to a planogram, the method comprising:


receiving one or more images using an imaging device, the one or more images comprising a view of a scene including one or more inventory items and an indicia for at least one of the one or more inventory items, each indicia comprising an identification code configured to identify a respective one of the one or more inventory items;


receiving one or more utterances from a user using a voice recognition system, the one or more utterances comprising a spoken description of at least one of the one or more inventory items;


identifying at least one of the one or more inventory items in the scene and an identification code corresponding to each of the at least one identified inventory items, wherein the identifying is based at least in part on the one or more images and at least in part on the one or more utterances;


identifying a plurality of attributes corresponding to the at least one of the one or more inventory items, the plurality of attributes identified based at least in part on the one or more images and at least in part on the one or more utterances; and


characterizing the one or more inventory items to a planogram based at least in part on the respective identification code and the respective plurality of attributes.
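By way of illustration only, the following Python sketch shows one way the steps of A1 might fit together. The helper names (decode_indicia, parse_utterance) and the PlanogramEntry fields are hypothetical stand-ins for a barcode decoder and a speech/NLU front end; they are not part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class PlanogramEntry:
    identification_code: str
    attributes: dict = field(default_factory=dict)

def decode_indicia(image_bytes: bytes) -> str:
    """Placeholder: a real system would run a barcode/indicia decoder here."""
    return "SKU-00042"

def parse_utterance(text: str) -> dict:
    """Placeholder: a real system would use speech-to-text plus NLU."""
    attrs = {}
    if "aisle" in text:
        attrs["location"] = text  # keep the raw spoken description
    return attrs

def characterize(image_bytes: bytes, utterance: str) -> PlanogramEntry:
    code = decode_indicia(image_bytes)   # identification code from the image
    attrs = parse_utterance(utterance)   # attributes from the utterance
    return PlanogramEntry(identification_code=code, attributes=attrs)

entry = characterize(b"...", "three facings on aisle 5, shelf 2")
print(entry)
```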


A2. The method of A1, wherein the one or more images comprising the view of the scene comprise one or more of:


a location corresponding to at least one of the one or more inventory items in the scene;


a quantity corresponding to at least one of the one or more inventory items in the scene;


a facing arrangement corresponding to at least one of the one or more inventory items in the scene;


a floor layout corresponding to at least one of the one or more inventory items in the scene;


a fixture attribute corresponding to at least one of the one or more inventory items in the scene; and


an identification code corresponding to at least one of the one or more inventory items in the scene.


A3. The method of A1, wherein the one or more utterances comprise one or more of:


a location corresponding to at least one of the one or more inventory items in the scene;


a quantity corresponding to at least one of the one or more inventory items in the scene;


a facing arrangement corresponding to at least one of the one or more inventory items in the scene;


a floor layout corresponding to at least one of the one or more inventory items in the scene;


a fixture attribute corresponding to at least one of the one or more inventory items in the scene; and


an identification code corresponding to at least one of the one or more inventory items.


A4. The method of A1, wherein the plurality of attributes comprises one or more of:


a location corresponding to a respective inventory item;


a quantity corresponding to a respective inventory item;


a facing arrangement corresponding to a respective inventory item;


a floor layout corresponding to a respective inventory item; and


a fixture attribute corresponding to a respective inventory item.


A5. The method of A1, wherein the planogram comprises:


a visual representation of the scene, the visual representation based at least in part on the one or more images and at least in part on the one or more utterances;


a description of the at least one of the one or more inventory items in the scene, the description comprising a visual representation of the inventory item and/or a textual description of the inventory item, wherein the description is based at least in part on the one or more utterances; and


a description of the plurality of attributes corresponding to the at least one of the one or more inventory items, the description comprising a visual representation of at least some of the attributes and a textual description of at least some of the attributes.
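By way of illustration only, one plausible in-memory representation of such a planogram is sketched below in Python; the class and field names are assumptions chosen to mirror A5, not a prescribed schema.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ItemDescription:
    identification_code: str
    text: Optional[str] = None          # textual description (from utterances)
    image_crop: Optional[bytes] = None  # visual representation of the item

@dataclass
class ItemAttributes:
    location: Optional[str] = None
    quantity: Optional[int] = None
    facing_arrangement: Optional[str] = None
    floor_layout: Optional[str] = None
    fixture_attribute: Optional[str] = None

@dataclass
class Planogram:
    scene_image: Optional[bytes] = None  # visual representation of the scene
    items: List[ItemDescription] = field(default_factory=list)
    attributes: List[ItemAttributes] = field(default_factory=list)

plan = Planogram(
    items=[ItemDescription("SKU-00042", text="1L sparkling water")],
    attributes=[ItemAttributes(location="aisle 5, shelf 2", quantity=12)],
)
print(len(plan.items))
```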


A6. The method of A5, wherein the description of the at least one of the one or more inventory items in the scene comprises an identification code corresponding to at least one of the one or more inventory items in the scene.


A7. The method of A5, wherein the description of the plurality of attributes corresponding to the at least one of the one or more inventory items comprises a description of one or more of:


a location corresponding to at least one of the one or more inventory items in the scene, the description of the location comprising a visual representation of the location and/or a textual description of the location;


a quantity corresponding to at least one of the one or more inventory items in the scene, the description of the quantity comprising a visual representation of the quantity and/or a textual description of the quantity;


a facing arrangement corresponding to at least one of the one or more inventory items in the scene, the description of the facing arrangement comprising a visual representation of the facing arrangement and/or a textual description of the facing arrangement;


a floor layout corresponding to at least one of the one or more inventory items in the scene, the description of the layout comprising a visual representation of the layout and/or a textual description of the layout; and


a fixture attribute corresponding to at least one of the one or more inventory items in the scene, the description of the fixture attributes comprising a visual representation of the fixture attribute and/or a textual description of the fixture attribute.


A8. The method of A1, comprising identifying a first inventory item based at least in part on the one or more images and at least in part on the one or more utterances.


A9. The method of A1, comprising identifying a first inventory item based at least in part on the one or more images and identifying a second inventory item based at least in part on the one or more utterances.


A10. The method of A1, comprising identifying a first inventory item based at least in part on the one or more images and identifying an identification code corresponding to the first inventory item based at least in part on the one or more utterances.


A11. The method of A1, comprising identifying a first inventory item based at least in part on the one or more utterances and identifying an identification code corresponding to the first inventory item based at least in part on the one or more images.


A12. The method of A1, comprising identifying a first inventory item based at least in part on the one or more images and at least in part on the one or more utterances, and identifying an identification code corresponding to the first inventory item based at least in part on the one or more images and at least in part on the one or more utterances.


A13. The method of A1, wherein the identification code comprises a shelf identification code.


A14. The method of A1, wherein the identification code comprises an SKU code.


A15. The method of A1, wherein the one or more utterances comprise a spoken name for an inventory item.


A16. The method of A1, wherein the one or more utterances comprise a spoken identification code corresponding to an inventory item.


A17. The method of A1, wherein the one or more utterances comprise a spoken description of one or more of the plurality of attributes.


A18. The method of A17, wherein the spoken description of the one or more of the plurality of attributes comprises one or more of:


a spoken description of a location corresponding to a respective inventory item;


a spoken description of a quantity corresponding to a respective inventory item;


a spoken description of a facing arrangement corresponding to a respective inventory item;


a spoken description of a floor layout corresponding to a respective inventory item; and


a spoken description of a fixture attribute corresponding to a respective inventory item.
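By way of illustration only, a minimal keyword-pattern parser for such spoken attribute descriptions is sketched below; the phrase patterns are illustrative assumptions, and a production system would more likely rely on a trained speech/NLU model.

```python
import re

# Hypothetical phrase patterns for transcribed utterances (see A18).
PATTERNS = {
    "quantity": re.compile(r"(\d+)\s+(?:units|items)", re.IGNORECASE),
    "location": re.compile(r"(aisle\s+\d+(?:,?\s*shelf\s+\d+)?)", re.IGNORECASE),
    "facing_arrangement": re.compile(r"(\d+)\s+facings?\s+(?:wide|deep)", re.IGNORECASE),
}

def parse_spoken_attributes(utterance: str) -> dict:
    """Extract attribute values from a transcribed spoken description."""
    found = {}
    for name, pattern in PATTERNS.items():
        match = pattern.search(utterance)
        if match:
            found[name] = match.group(1)
    return found

print(parse_spoken_attributes("place 12 units on aisle 5, shelf 2, 4 facings wide"))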


B19. A method of characterizing one or more inventory items to a planogram, the method comprising:


receiving one or more images using an imaging device, the one or more images comprising a view of a scene including one or more inventory items and one or more of:

    • an indicia for at least one of the one or more inventory items, each indicia comprising an identification code configured to identify a respective one of the one or more inventory items;
    • a location corresponding to at least one of the one or more inventory items in the scene;
    • a quantity corresponding to at least one of the one or more inventory items in the scene;
    • a facing arrangement corresponding to at least one of the one or more inventory items in the scene;
    • a floor layout corresponding to at least one of the one or more inventory items in the scene; and
    • a fixture attribute corresponding to at least one of the one or more inventory items in the scene;


receiving one or more utterances from a user using a voice recognition system, the one or more utterances comprising a spoken description of one or more of:

    • at least one of the one or more inventory items;
    • a location corresponding to at least one of the one or more inventory items in the scene;
    • a quantity corresponding to at least one of the one or more inventory items in the scene;
    • a facing arrangement corresponding to at least one of the one or more inventory items in the scene;
    • a floor layout corresponding to at least one of the one or more inventory items in the scene;
    • a fixture attribute corresponding to at least one of the one or more inventory items in the scene; and
    • an identification code corresponding to at least one of the one or more inventory items;


identifying at least one of the one or more inventory items in the scene based at least in part on the one or more images and at least in part on the one or more utterances;


identifying a plurality of attributes corresponding to the at least one of the one or more inventory items, the plurality of attributes identified based at least in part on the one or more images and at least in part on the one or more utterances, wherein the plurality of attributes comprises one or more of:

    • a location corresponding to a respective inventory item;
    • a quantity corresponding to a respective inventory item;
    • a facing arrangement corresponding to a respective inventory item;
    • a floor layout corresponding to a respective inventory item; and
    • a fixture attribute corresponding to a respective inventory item; and


characterizing the one or more inventory items to a planogram based at least in part on the respective identification code and the respective plurality of attributes, the planogram comprising:

    • a visual representation of the scene, the visual representation based at least in part on the one or more images and at least in part on the one or more utterances;
    • a description of the at least one of the one or more inventory items in the scene and of the corresponding identification codes, the description comprising a visual representation of the inventory item and/or a textual description of the inventory item, wherein the description is based at least in part on the one or more utterances; and
    • a description of placement of the at least one of the one or more inventory items in the scene, the placement being in respect of one or more of the following attributes respectively corresponding to the at least one of the one or more inventory items:
      • a location corresponding to at least one of the one or more inventory items in the scene, the description of the location comprising a visual representation of the location and/or a textual description of the location;
      • a quantity corresponding to at least one of the one or more inventory items in the scene, the description of the quantity comprising a visual representation of the quantity and/or a textual description of the quantity;
      • a facing arrangement corresponding to at least one of the one or more inventory items in the scene, the description of the facing arrangement comprising a visual representation of the facing arrangement and/or a textual description of the facing arrangement;
      • a floor layout corresponding to at least one of the one or more inventory items in the scene, the description of the layout comprising a visual representation of the layout and/or a textual description of the layout; and
      • a fixture attribute corresponding to at least one of the one or more inventory items in the scene, the description of the fixture attributes comprising a visual representation of the fixture attribute and/or a textual description of the fixture attribute.


B20. The method of B19, further comprising identifying an identification code corresponding to at least one of the one or more inventory items based at least in part on the one or more images and at least in part on the one or more utterances.


B21. The method of B19, further comprising initiating the method at least in part by providing a prompt configured to instruct the user to characterize the one or more inventory items.


B22. The method of B21, wherein the prompt comprises an audible prompt, the audible prompt comprising an identification of at least one inventory item.


B23. The method of B19, further comprising providing a prompt configured to instruct the user to characterize the one or more inventory items upon identifying at least one inventory item in the scene based at least in part on the one or more images.


B24. The method of B19, further comprising capturing the one or more images using the imaging device upon receiving the one or more utterances from the user using the voice recognition system.


B25. The method of B19, further comprising receiving coordinates corresponding to one or more of:


the location for at least one of the one or more inventory items;


the facing arrangement for at least one of the one or more inventory items; and


the floor layout for at least one of the one or more inventory items.


B26. The method of B25, wherein the coordinates comprise geoposition coordinates.


B27. The method of B25, wherein the coordinates are generated using a Wi-Fi positioning system.


B28. The method of B25, wherein the coordinates are generated using the one or more images comprising the view of the scene.
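By way of illustration only, the sketch below shows a crude weighted-centroid variant of Wi-Fi positioning as contemplated in B27. The access-point coordinates and the RSSI-to-weight mapping are illustrative assumptions, not a disclosed algorithm.

```python
# Known access-point positions in store coordinates (illustrative values).
ACCESS_POINTS = {
    "ap-front": (0.0, 0.0),
    "ap-middle": (20.0, 0.0),
    "ap-back": (10.0, 30.0),
}

def estimate_position(rssi_dbm: dict) -> tuple:
    """Estimate (x, y) from per-AP signal strength readings."""
    # Stronger (less negative) RSSI -> larger weight; 1/|rssi| is a crude proxy.
    weights = {ap: 1.0 / abs(dbm) for ap, dbm in rssi_dbm.items()}
    total = sum(weights.values())
    x = sum(ACCESS_POINTS[ap][0] * w for ap, w in weights.items()) / total
    y = sum(ACCESS_POINTS[ap][1] * w for ap, w in weights.items()) / total
    return (x, y)

print(estimate_position({"ap-front": -40, "ap-middle": -70, "ap-back": -60}))
```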


B29. The method of B19, wherein identifying the one or more inventory items in the scene comprises:


identifying the one or more indicia in the one or more images comprising the view of the scene;


decoding the one or more indicia to obtain the corresponding one or more identification codes; and


searching a database to identify the respective one or more inventory items corresponding to the one or more identification codes.
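By way of illustration only, the three steps of B29 might be realized as sketched below, here with an in-memory SQLite table standing in for the inventory database and a stubbed decoder standing in for an indicia-reading library.

```python
import sqlite3

# Set up an in-memory items table purely for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE items (code TEXT PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO items VALUES ('0123456789012', 'Sparkling Water 1L')")

def decode_indicia(image_bytes: bytes) -> str:
    """Placeholder for locating and decoding an indicia in the image."""
    return "0123456789012"

def identify_item(image_bytes: bytes):
    code = decode_indicia(image_bytes)  # steps 1-2: find and decode the indicia
    row = db.execute("SELECT name FROM items WHERE code = ?", (code,)).fetchone()
    return (code, row[0] if row else None)  # step 3: database lookup

print(identify_item(b"..."))
```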


B30. The method of B19, wherein identifying the quantity comprises identifying a current quantity and/or a stocking quantity based at least in part on the one or more images.


B31. The method of B19, wherein identifying the quantity comprises identifying a current quantity and/or a stocking quantity based at least in part on the one or more utterances.


B32. The method of B19, further comprising generating the planogram in a two-dimensional graphic corresponding to the view of the scene and/or in a three-dimensional graphic corresponding to the view of the scene.


B33. The method of B19, further comprising transposing the one or more images to provide a view of the scene from a different orientation, wherein the visual representation comprised by the planogram corresponds to the different orientation.
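By way of illustration only, such a transposition can be approximated with a perspective (homography) warp; the sketch below uses OpenCV and NumPy, and the corner correspondences are illustrative assumptions.

```python
import cv2
import numpy as np

# Hypothetical correspondences: where the shelf corners appear in the captured
# image (src) and where they should land in a straight-on view (dst).
src = np.float32([[120, 80], [980, 60], [1010, 700], [90, 720]])
dst = np.float32([[0, 0], [1000, 0], [1000, 750], [0, 750]])

def transpose_view(image: np.ndarray) -> np.ndarray:
    """Re-project the scene to a fronto-parallel orientation."""
    homography = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(image, homography, (1000, 750))

frame = np.zeros((768, 1024, 3), dtype=np.uint8)  # stand-in for a captured image
straightened = transpose_view(frame)
print(straightened.shape)  # (750, 1000, 3)
```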


B34. The method of B19, wherein the planogram comprises instructions for positioning the one or more inventory items in accordance with the one or more fixture attributes.


B35. The method of B19, further comprising storing the planogram in a database.


B36. The method of B19, further comprising storing at least one of the following aspects of information pertaining to the one or more inventory items in one or more nascent database entries:


the visual representation of the scene;


the description of the at least one of the one or more inventory items in the scene and of the corresponding identification codes;


the description of the location corresponding to at least one of the one or more inventory items in the scene;


the description of the quantity corresponding to at least one of the one or more inventory items in the scene;


the description of the facing arrangement corresponding to at least one of the one or more inventory items in the scene;


the description of the floor layout corresponding to at least one of the one or more inventory items in the scene;


the description of the fixture attribute corresponding to at least one of the one or more inventory items in the scene; and


the description of the placement of the at least one of the one or more inventory items in the scene in respect of the one or more attributes.


B37. The method of B19, wherein at least one of the following aspects of information pertaining to the one or more inventory items was unavailable prior to receiving the one or more images and/or prior to receiving the one or more utterances:


the visual representation of the scene;


the description of the at least one of the one or more inventory items in the scene and of the corresponding identification codes;


the description of the location corresponding to at least one of the one or more inventory items in the scene;


the description of the quantity corresponding to at least one of the one or more inventory items in the scene;


the description of the facing arrangement corresponding to at least one of the one or more inventory items in the scene;


the description of the floor layout corresponding to at least one of the one or more inventory items in the scene;


the description of the fixture attribute corresponding to at least one of the one or more inventory items in the scene; and


the description of the placement of the at least one of the one or more inventory items in the scene in respect of the one or more attributes.


B38. The method of B19, wherein the description of the location and/or the description of the quantity corresponding to the at least one of the one or more inventory items in the scene comprises coordinates corresponding to one or more of:


the location corresponding to the at least one of the one or more inventory items;


the facing arrangement corresponding to the at least one of the one or more inventory items;


the floor layout corresponding to the at least one of the one or more inventory items; and


the fixture attribute corresponding to the at least one of the one or more inventory items.


B39. The method of B19, wherein the description of the location and/or the quantity corresponding to the at least one of the one or more inventory items comprises a current quantity and/or a stocking quantity.


B40. The method of B19, wherein the one or more utterances from the user are received contemporaneously with the one or more images from the imaging device.


B41. The method of B19, wherein the one or more utterances from the user are received separately in time from the one or more images from the imaging device.


B42. The method of B19, comprising receiving the one or more utterances from a first user operating the voice recognition system, and receiving the one or more images from a second user operating the imaging device.


B43. The method of B19, wherein the view of the scene comprises a retail environment selected from the group consisting of: a softline retailer; a grocery retailer; a food retailer; a convenience retailer; a hardline retailer; and a specialty retailer.


B44. The method of B19, wherein the view of the scene comprises a retail environment selected from the group consisting of: a department store; a clothing store; a footwear store; a toiletries store; a cosmetics store; a pharmacy; an office-supply store; a discount outlet; a grocery store; a supermarket; a hypermarket; a convenience store; a big-box store; a restaurant; a fruit stand; a bakery; a coffee shop; a farmer's market; a home-improvement store; a hardware store; a warehouse club; an electronics store; an automobile dealership; an appliance store; a furniture store; a sporting goods store; a lumber yard; a bookstore; an art gallery; a craft store; a music store; a musical instrument store; a boutique; a jewelry store; a gift shop; an arcade; a bazaar; a toy store; a category killer; a chain store; a concept store; a co-operative store; a destination store; a general store; a mall; a kiosk; a pop-up retail store; and a retail market.


B45. The method of B19, further comprising providing stocking instructions based at least in part on the planogram.


B46. The method of B19, wherein the planogram characterizes a customized store layout.


B47. The method of B46, wherein the customized store layout is selected based at least in part on localized customer desires and/or localized demand.


B48. The method of B19, wherein the planogram incorporates corporate-level business rules and/or best practices pertaining to product placement.


B49. The method of B19, further comprising:


obtaining sales data corresponding to at least one of the one or more inventory items, the sales data stored in a database;


calculating a performance metric corresponding to at least one of the one or more inventory items; and


characterizing the at least one of the one or more inventory items to the planogram based at least in part on the performance metric, the planogram comprising a modification to one or more of:

    • the location corresponding to the at least one of the one or more inventory items in the scene;
    • the quantity corresponding to the at least one of the one or more inventory items in the scene;
    • the facing arrangement corresponding to the at least one of the one or more inventory items in the scene;
    • the floor layout corresponding to the at least one of the one or more inventory items in the scene; and
    • the fixture attribute corresponding to the at least one of the one or more inventory items.


B50. The method of B49, wherein the metric comprises one or more of: sales value; gross margin; profit margin; inventory turn; customer conversion ratio; shelf space; and items per purchase.
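By way of illustration only, the sketch below computes two of the named metrics and applies a crude facing-adjustment rule of the kind B49 contemplates; the thresholds and field names are assumptions, not disclosed values.

```python
from dataclasses import dataclass

@dataclass
class SalesRecord:
    units_sold: int
    revenue: float
    cost_of_goods: float
    average_inventory_units: float

def gross_margin(rec: SalesRecord) -> float:
    return (rec.revenue - rec.cost_of_goods) / rec.revenue

def inventory_turn(rec: SalesRecord) -> float:
    return rec.units_sold / rec.average_inventory_units

def suggest_facings(current_facings: int, rec: SalesRecord) -> int:
    """Crude rule: fast turners gain a facing, slow turners lose one."""
    turn = inventory_turn(rec)
    if turn > 4.0:
        return current_facings + 1
    if turn < 1.0:
        return max(1, current_facings - 1)
    return current_facings

rec = SalesRecord(units_sold=480, revenue=960.0, cost_of_goods=600.0,
                  average_inventory_units=90.0)
print(gross_margin(rec), inventory_turn(rec), suggest_facings(3, rec))
```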


B51. The method of B19, wherein the user captures the one or more images using the imaging device while working in a retail environment in which the one or more inventory items are sold, the working at least in part comprising a task customarily performed by a worker in the retail environment.


B52. The method of B19, wherein the imaging device comprises a camera and/or a scanner.


B53. The method of B19, further comprising receiving one or more manual inputs from a hand-operated input device, and wherein the planogram is based at least in part on the one or more manual inputs.


B54. The method of B19, further comprising providing a workflow to the user while the user is working in a retail environment in which the one or more inventory items are sold, the workflow configured to direct the user to:


capture the one or more images using the imaging device while working in the retail environment; and/or


provide the one or more utterances using the voice recognition system;


wherein the working at least in part comprises a task customarily performed by a worker in the retail environment.


B55. The method of B54, wherein the workflow comprises a voice directed workflow, the voice directed workflow provided by an audio headset.


B56. The method of B54, wherein the workflow comprises a visually directed workflow, the visually directed workflow provided on a screen of a mobile device.
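By way of illustration only, a voice-directed workflow of the B54/B55 kind might be driven by a simple prompt generator, as sketched below; the prompt wording and step order are illustrative assumptions.

```python
def stocking_workflow(item_names):
    """Yield the next spoken prompt directing the user through B54's steps."""
    for name in item_names:
        yield f"Walk to the location of {name}."
        yield f"Scan or photograph the shelf label for {name}."
        yield f"Say the quantity and facing arrangement for {name}."

for prompt in stocking_workflow(["Sparkling Water 1L"]):
    print(prompt)  # in practice, sent to an audio headset via text-to-speech
```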


B57. The method of B19, wherein the location corresponding to the at least one of the one or more inventory items in the scene is unavailable from any database resource prior to receiving the one or more images and/or prior to receiving the one or more utterances.


B58. The method of B54, wherein the workflow is prepared in advance based at least in part on known attributes of the retail environment.


B59. The method of B54, wherein the workflow is generated in real-time while the user is working in a retail environment in which the one or more inventory items are sold.


B60. The method of B19, further comprising using the one or more images and/or the one or more utterances to generate a workflow in real-time while the user is working in a retail environment in which the one or more inventory items are sold.


In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. A method of characterizing an inventory item to a planogram, the method comprising:
    receiving an image using an imaging device, the image comprising a view of a scene including an inventory item and an indicia for the inventory item, the indicia comprising an identification code configured to identify the inventory item;
    receiving an utterance from a user using a voice recognition system, the utterance comprising a spoken description of the inventory item;
    identifying the inventory item in the scene and the identification code corresponding to the identified inventory item, wherein the identifying is based on the image and on the utterance;
    identifying an attribute corresponding to the inventory item, the attribute identified based on the image and the utterance;
    calculating a performance metric corresponding to the inventory item, wherein the performance metric comprises at least one of a gross margin, a profit margin, an inventory turn, a customer conversion ratio, a shelf space and an item per purchase;
    characterizing the inventory item to generate a planogram based on the performance metric, the identification code and the attribute;
    wherein the identification code is identified from the image during planogram generation; and
    wherein the attribute of the inventory item identified from the image and the user utterance received are used for characterizing the inventory item during the planogram generation.
  • 2. The method of claim 1, wherein the image comprising the view of the scene comprises:
    a location corresponding to the inventory item in the scene;
    a quantity corresponding to the inventory item in the scene;
    a facing arrangement corresponding to the inventory item in the scene;
    a floor layout corresponding to the inventory item in the scene;
    a fixture attribute corresponding to the inventory item in the scene; and
    an identification code corresponding to the inventory item in the scene.
  • 3. The method of claim 1, wherein the utterance comprises at least one of:
    a location corresponding to the inventory item in the scene;
    a quantity corresponding to the inventory item in the scene;
    a facing arrangement corresponding to the inventory item in the scene;
    a floor layout corresponding to the inventory item in the scene;
    a fixture attribute corresponding to the inventory item in the scene; and
    an identification code corresponding to the inventory item.
  • 4. The method of claim 1, wherein the attribute comprises at least one of:
    a location corresponding to the inventory item;
    a quantity corresponding to the inventory item;
    a facing arrangement corresponding to the inventory item;
    a floor layout corresponding to the inventory item; and
    a fixture attribute corresponding to the inventory item.
  • 5. The method of claim 1, wherein the planogram comprises:
    a visual representation of the scene, the visual representation based on the image and the utterance;
    a description of the inventory item in the scene, the description comprising a visual representation of the inventory item and/or a textual description of the inventory item, wherein the description is based on the utterance; and
    a description of the attribute corresponding to the inventory item, the description comprising a visual representation of the attribute and a textual description of the attribute.
  • 6. The method of claim 1, wherein a description of the inventory item in the scene comprises an identification code corresponding to the inventory item in the scene.
  • 7. The method of claim 1, wherein a description of the attribute corresponding to the inventory item comprises the description of:
    a location corresponding to the inventory item in the scene, the description of the location comprising a visual representation of the location and/or a textual description of the location;
    a quantity corresponding to the inventory item in the scene, the description of the quantity comprising a visual representation of the quantity and/or a textual description of the quantity;
    a facing arrangement corresponding to the inventory item in the scene, the description of the facing arrangement comprising a visual representation of the facing arrangement and/or a textual description of the facing arrangement;
    a floor layout corresponding to the inventory item in the scene, the description of the floor layout comprising a visual representation of the floor layout and/or a textual description of the floor layout; and
    a fixture attribute corresponding to the inventory item, the description of the fixture attribute comprising a visual representation of the fixture attribute and/or a textual description of the fixture attribute.
  • 8. The method of claim 1, comprising identifying a first inventory item based on the image and the utterance.
  • 9. The method of claim 1, comprising identifying a first inventory item based on the image and identifying a second inventory item based on the utterance.
  • 10. The method of claim 1, comprising identifying a first inventory item based on the image and identifying an identification code corresponding to the first inventory item based on the utterance.
  • 11. A method of characterizing one or more inventory items to a planogram, the method comprising:
    receiving an image using an imaging device, the image comprising a view of a scene including an inventory item and at least one of:
      an indicia for the inventory item, each indicia comprising an identification code configured to identify a respective inventory item;
      a location corresponding to the inventory item in the scene;
      a quantity corresponding to the inventory item in the scene;
      a facing arrangement corresponding to the inventory item in the scene;
      a floor layout corresponding to the inventory item in the scene; and
      a fixture attribute corresponding to the inventory item in the scene;
    receiving an utterance from a user using a voice recognition system, the utterance comprising a spoken description of:
      at least the inventory item;
      a location corresponding to the inventory item in the scene;
      a quantity corresponding to the inventory item in the scene;
      a facing arrangement corresponding to the inventory item in the scene;
      a floor layout corresponding to the inventory item in the scene;
      a fixture attribute corresponding to the inventory item in the scene; and
      an identification code corresponding to the inventory item;
    identifying the inventory item in the scene based on the image and the utterance;
    identifying an attribute corresponding to the inventory item, the attribute identified based on the image and the utterance, wherein the attribute comprises at least one of:
      a location corresponding to a respective inventory item;
      a quantity corresponding to a respective inventory item;
      a facing arrangement corresponding to a respective inventory item;
      a floor layout corresponding to a respective inventory item; and
      a fixture attribute corresponding to a respective inventory item; and
    calculating a performance metric corresponding to the inventory item, wherein the performance metric comprises at least one of a gross margin, a profit margin, an inventory turn, a customer conversion ratio, a shelf space and an item per purchase;
    characterizing the inventory item to generate a planogram based on the performance metric, the identification code and the respective attribute, wherein the identification code is identified from the image during planogram generation, and the generated planogram comprises:
      a visual representation of the scene, the visual representation based on the image and on the utterance;
      a description of the inventory item in the scene and of the corresponding identification code, the description comprising a visual representation of the inventory item and/or a textual description of the inventory item, wherein the description is based on the utterance; and
      a description of placement of the inventory item in the scene, the placement being in respect of at least one of the following attributes respectively corresponding to the inventory item:
        a location corresponding to the inventory item in the scene, the description of the location comprising a visual representation of the location and/or a textual description of the location;
        a quantity corresponding to the inventory item in the scene, the description of the quantity comprising a visual representation of the quantity and/or a textual description of the quantity;
        a facing arrangement corresponding to the inventory item in the scene, the description of the facing arrangement comprising a visual representation of the facing arrangement and/or a textual description of the facing arrangement;
        a floor layout corresponding to the inventory item in the scene, the description of the floor layout comprising a visual representation of the floor layout and/or a textual description of the floor layout; and
        a fixture attribute corresponding to the inventory item in the scene, the description of the fixture attribute comprising a visual representation of the fixture attribute and/or a textual description of the fixture attribute.
  • 12. The method of claim 11, further comprising identifying a first inventory item based on the image and identifying an identification code corresponding to the first inventory item based on the utterance.
  • 13. The method of claim 11, further comprising initiating the method by providing a prompt configured to instruct the user to characterize the inventory item.
  • 14. The method of claim 13, wherein the prompt comprises an audible prompt, the audible prompt comprising an identification of the inventory item.
  • 15. The method of claim 11, further comprising providing a prompt configured to instruct the user to characterize the inventory item upon identifying the inventory item in the scene based on the image.
  • 16. The method of claim 11, further comprising capturing the image using the imaging device upon receiving the utterance from the user using the voice recognition system.
  • 17. The method of claim 11, further comprising receiving coordinates corresponding to one or more of:
    the location for the inventory item;
    the facing arrangement for the inventory item; and
    the floor layout for the inventory item.
  • 18. The method of claim 17, wherein the coordinates comprise geoposition coordinates.
  • 19. The method of claim 17, wherein the coordinates are generated using a Wi-Fi positioning system.
  • 20. The method of claim 17, wherein the coordinates are generated using the image comprising the view of the scene.
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims the benefit of U.S. Patent Application No. 62/457,242 for a Method and System for Inputting Products into an Inventory System filed Feb. 10, 2017, which is hereby incorporated by reference in its entirety.

US Referenced Citations (675)
Number Name Date Kind
6341269 Dulaney Jan 2002 B1
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
7750817 Teller Jul 2010 B2
7801778 Fox Sep 2010 B2
8032392 Brennan Oct 2011 B2
8214313 Puskorius Jul 2012 B1
8249955 Gross Aug 2012 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu Dec 2012 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8934923 Golden Jan 2015 B1
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9061527 Tobin et al. Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9076459 Braho et al. Jul 2015 B2
9079423 Bouverie et al. Jul 2015 B2
9080856 Laffargue Jul 2015 B2
9082023 Feng et al. Jul 2015 B2
9084032 Rautiola et al. Jul 2015 B2
9087250 Coyle Jul 2015 B2
9092681 Havens et al. Jul 2015 B2
9092682 Wilz et al. Jul 2015 B2
9092683 Koziol et al. Jul 2015 B2
9093141 Liu Jul 2015 B2
D737321 Lee Aug 2015 S
9098763 Lu et al. Aug 2015 B2
9104929 Todeschini Aug 2015 B2
9104934 Li et al. Aug 2015 B2
9107484 Chaney Aug 2015 B2
9111159 Liu et al. Aug 2015 B2
9111166 Cunningham Aug 2015 B2
9135483 Liu et al. Sep 2015 B2
9137009 Gardiner Sep 2015 B1
9141839 Xian et al. Sep 2015 B2
9147096 Wang Sep 2015 B2
9148474 Skvoretz Sep 2015 B2
9158000 Sauerwein Oct 2015 B2
9158340 Reed et al. Oct 2015 B2
9158953 Gillet et al. Oct 2015 B2
9159059 Daddabbo et al. Oct 2015 B2
9165174 Huck Oct 2015 B2
9171543 Emerick et al. Oct 2015 B2
9183425 Wang Nov 2015 B2
9189669 Zhu et al. Nov 2015 B2
9195844 Todeschini et al. Nov 2015 B2
9202458 Braho et al. Dec 2015 B2
9208366 Liu Dec 2015 B2
9208367 Wang Dec 2015 B2
9219836 Bouverie et al. Dec 2015 B2
9224022 Ackley et al. Dec 2015 B2
9224024 Bremer et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9235553 Fitch et al. Jan 2016 B2
9239950 Fletcher Jan 2016 B2
9245492 Ackley et al. Jan 2016 B2
9443123 Hejl Jan 2016 B2
9248640 Heng Feb 2016 B2
9250652 London et al. Feb 2016 B2
9250712 Todeschini Feb 2016 B1
9251411 Todeschini Feb 2016 B2
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9262660 Lu et al. Feb 2016 B2
9262662 Chen et al. Feb 2016 B2
9269036 Bremer Feb 2016 B2
9270782 Hala et al. Feb 2016 B2
9274812 Doren et al. Mar 2016 B2
9275388 Havens et al. Mar 2016 B2
9277668 Feng et al. Mar 2016 B2
9280693 Feng et al. Mar 2016 B2
9286496 Smith Mar 2016 B2
9297900 Jiang Mar 2016 B2
9298964 Li et al. Mar 2016 B2
9301427 Feng et al. Mar 2016 B2
D754205 Nguyen et al. Apr 2016 S
D754206 Nguyen et al. Apr 2016 S
9304376 Anderson Apr 2016 B2
9310609 Rueblinger et al. Apr 2016 B2
9313377 Todeschini et al. Apr 2016 B2
9317037 Byford et al. Apr 2016 B2
9319548 Showering et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342723 Liu et al. May 2016 B2
9342724 McCloskey May 2016 B2
9361882 Ressler et al. Jun 2016 B2
9365381 Colonel et al. Jun 2016 B2
9373018 Colavito et al. Jun 2016 B2
9375945 Bowles Jun 2016 B1
9378403 Wang et al. Jun 2016 B2
D760719 Zhou et al. Jul 2016 S
9360304 Chang et al. Jul 2016 B2
9383848 Daghigh Jul 2016 B2
9384374 Bianconi Jul 2016 B2
9390304 Chang et al. Jul 2016 B2
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
9411386 Sauerwein Aug 2016 B2
9412242 Van Horn et al. Aug 2016 B2
9418269 Havens et al. Aug 2016 B2
9418270 Van Volkinburg et al. Aug 2016 B2
9423318 Lui et al. Aug 2016 B2
9424598 Kraft Aug 2016 B1
D766244 Zhou et al. Sep 2016 S
9443222 Singel et al. Sep 2016 B2
9454689 McCloskey et al. Sep 2016 B2
9464885 Lloyd et al. Oct 2016 B2
9465967 Xian et al. Oct 2016 B2
9478113 Xie et al. Oct 2016 B2
9478983 Kather et al. Oct 2016 B2
D771631 Fitch et al. Nov 2016 S
9481186 Bouverie et al. Nov 2016 B2
9487113 Schukalski Nov 2016 B2
9488986 Solanki Nov 2016 B1
9489782 Payne et al. Nov 2016 B2
9490540 Davies et al. Nov 2016 B1
9491729 Rautiola et al. Nov 2016 B2
9497092 Gomez et al. Nov 2016 B2
9507974 Todeschini Nov 2016 B1
9519814 Cudzilo Dec 2016 B2
9521331 Bessettes et al. Dec 2016 B2
9530038 Xian et al. Dec 2016 B2
D777166 Bidwell et al. Jan 2017 S
9558386 Yeakley Jan 2017 B2
9572901 Todeschini Feb 2017 B2
9606581 Howe et al. Mar 2017 B1
D783601 Schulte et al. Apr 2017 S
D785617 Bidwell et al. May 2017 S
D785636 Oberpriller et al. May 2017 S
9646189 Lu et al. May 2017 B2
9646191 Unemyr et al. May 2017 B2
9652648 Ackley et al. May 2017 B2
9652653 Todeschini et al. May 2017 B2
9656487 Ho et al. May 2017 B2
9659198 Giordano et al. May 2017 B2
D790505 Vargo et al. Jun 2017 S
D790546 Zhou et al. Jun 2017 S
9680282 Hanenburg Jun 2017 B2
9697401 Feng et al. Jul 2017 B2
9701140 Alaganchetty et al. Jul 2017 B1
9805402 Maurer Oct 2017 B1
10423923 Harsha Sep 2019 B2
20030158796 Balent Aug 2003 A1
20030212643 Steele Nov 2003 A1
20070063048 Havens et al. Mar 2007 A1
20080147475 Gruttadauria Jun 2008 A1
20090134221 Zhu et al. May 2009 A1
20090276317 Dixon Nov 2009 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20110169999 Grunow et al. Jul 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20120111946 Golant May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120194692 Mers et al. Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20130018696 Meldrum Jan 2013 A1
20130037613 Soldate Feb 2013 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130076726 Ferrara Mar 2013 A1
20130173435 Cozad, Jr. Jul 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130332524 Fiala et al. Dec 2013 A1
20130332996 Fiala et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140003727 Lortz et al. Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140100813 Showering Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140191684 Valois Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140304123 Schwartz Oct 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150046299 Yan Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150049902 Moraleda Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150088703 Yan Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150178523 Gelay et al. Jun 2015 A1
20150178537 El et al. Jun 2015 A1
20150178685 Krumel et al. Jun 2015 A1
20150181109 Gillet et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150212565 Murawski et al. Jul 2015 A1
20150213647 Laffargue et al. Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150220901 Gomez et al. Aug 2015 A1
20150227189 Davis et al. Aug 2015 A1
20150236984 Sevier Aug 2015 A1
20150239348 Chamberlin Aug 2015 A1
20150242658 Nahill et al. Aug 2015 A1
20150248572 Soule et al. Sep 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150261643 Caballero et al. Sep 2015 A1
20150264624 Wang et al. Sep 2015 A1
20150268971 Barten Sep 2015 A1
20150269402 Barber et al. Sep 2015 A1
20150288689 Todeschini et al. Oct 2015 A1
20150288896 Wang Oct 2015 A1
20150310243 Ackley Oct 2015 A1
20150310244 Xian et al. Oct 2015 A1
20150310389 Crimm et al. Oct 2015 A1
20150312780 Wang et al. Oct 2015 A1
20150327012 Bian et al. Nov 2015 A1
20160014251 Hejl Jan 2016 A1
20160025697 Alt et al. Jan 2016 A1
20160026838 Gillet et al. Jan 2016 A1
20160026839 Qu et al. Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160062473 Bouchat et al. Mar 2016 A1
20160073886 Connor Mar 2016 A1
20160092805 Geisler et al. Mar 2016 A1
20160101936 Chamberlin Apr 2016 A1
20160102975 McCloskey et al. Apr 2016 A1
20160104019 Todeschini et al. Apr 2016 A1
20160104274 Jovanovski et al. Apr 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160117627 Raj et al. Apr 2016 A1
20160119540 Wu Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171597 Todeschini Jun 2016 A1
20160171666 McCloskey Jun 2016 A1
20160171720 Todeschini Jun 2016 A1
20160171775 Todeschini et al. Jun 2016 A1
20160171777 Todeschini et al. Jun 2016 A1
20160174674 Oberpriller et al. Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160178685 Young et al. Jun 2016 A1
20160178707 Young et al. Jun 2016 A1
20160179132 Harr et al. Jun 2016 A1
20160179143 Bidwell et al. Jun 2016 A1
20160179368 Roeder Jun 2016 A1
20160179378 Kent et al. Jun 2016 A1
20160180130 Bremer Jun 2016 A1
20160180133 Oberpriller et al. Jun 2016 A1
20160180136 Meier et al. Jun 2016 A1
20160180594 Todeschini Jun 2016 A1
20160180663 McMahan et al. Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160180713 Bernhardt et al. Jun 2016 A1
20160185136 Ng et al. Jun 2016 A1
20160185291 Chamberlin Jun 2016 A1
20160186926 Oberpriller et al. Jun 2016 A1
20160188861 Todeschini Jun 2016 A1
20160188939 Sailors et al. Jun 2016 A1
20160188940 Lu et al. Jun 2016 A1
20160188941 Todeschini et al. Jun 2016 A1
20160188942 Good et al. Jun 2016 A1
20160188943 Linwood Jun 2016 A1
20160188944 Wilz et al. Jun 2016 A1
20160189076 Mellott et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160189088 Pecorari et al. Jun 2016 A1
20160189092 George Jun 2016 A1
20160189284 Mellott Jun 2016 A1
20160189288 Todeschini Jun 2016 A1
20160189366 Chamberlin et al. Jun 2016 A1
20160189443 Smith Jun 2016 A1
20160189447 Valenzuela Jun 2016 A1
20160189489 Au et al. Jun 2016 A1
20160191684 DiPiazza et al. Jun 2016 A1
20160192051 DiPiazza et al. Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160202951 Pike et al. Jul 2016 A1
20160202958 Zabel et al. Jul 2016 A1
20160202959 Doubleday et al. Jul 2016 A1
20160203021 Pike et al. Jul 2016 A1
20160203429 Mellott et al. Jul 2016 A1
20160203797 Pike et al. Jul 2016 A1
20160203820 Zabel et al. Jul 2016 A1
20160204623 Haggert et al. Jul 2016 A1
20160204636 Allen et al. Jul 2016 A1
20160204638 Miraglia et al. Jul 2016 A1
20160316190 McCloskey et al. Jul 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
20160323310 Todeschini et al. Nov 2016 A1
20160325677 Fitch et al. Nov 2016 A1
20160327614 Young et al. Nov 2016 A1
20160327930 Charpentier et al. Nov 2016 A1
20160328762 Pape Nov 2016 A1
20160330218 Hussey et al. Nov 2016 A1
20160343163 Venkatesha et al. Nov 2016 A1
20160343176 Ackley Nov 2016 A1
20160350708 Jones Dec 2016 A1
20160364914 Todeschini Dec 2016 A1
20160370220 Ackley et al. Dec 2016 A1
20160372282 Bandringa Dec 2016 A1
20160373847 Vargo et al. Dec 2016 A1
20160377414 Thuries et al. Dec 2016 A1
20160377417 Jovanovski et al. Dec 2016 A1
20170010141 Ackley Jan 2017 A1
20170010328 Mullen et al. Jan 2017 A1
20170010780 Waldron et al. Jan 2017 A1
20170011333 Greiner Jan 2017 A1
20170016714 Laffargue et al. Jan 2017 A1
20170018094 Todeschini Jan 2017 A1
20170046603 Lee et al. Feb 2017 A1
20170047864 Stang et al. Feb 2017 A1
20170053146 Liu et al. Feb 2017 A1
20170053147 Geramine et al. Feb 2017 A1
20170053647 Nichols et al. Feb 2017 A1
20170055606 Xu et al. Mar 2017 A1
20170060316 Larson Mar 2017 A1
20170061961 Nichols et al. Mar 2017 A1
20170064634 Van Horn et al. Mar 2017 A1
20170083730 Feng et al. Mar 2017 A1
20170091502 Furlong et al. Mar 2017 A1
20170091706 Lloyd et al. Mar 2017 A1
20170091741 Todeschini Mar 2017 A1
20170091904 Ventress Mar 2017 A1
20170092908 Chaney Mar 2017 A1
20170094238 Germaine et al. Mar 2017 A1
20170098947 Wolski Apr 2017 A1
20170100949 Celinder et al. Apr 2017 A1
20170108838 Todeschini et al. Apr 2017 A1
20170108895 Chamberlin et al. Apr 2017 A1
20170118355 Wong et al. Apr 2017 A1
20170123598 Phan et al. May 2017 A1
20170124369 Rueblinger et al. May 2017 A1
20170124396 Todeschini et al. May 2017 A1
20170124687 McCloskey et al. May 2017 A1
20170126873 McGary et al. May 2017 A1
20170126904 d'Armancourt et al. May 2017 A1
20170139012 Smith May 2017 A1
20170140329 Bernhardt et al. May 2017 A1
20170140731 Smith May 2017 A1
20170147847 Berggren et al. May 2017 A1
20170150124 Thuries May 2017 A1
20170169198 Nichols Jun 2017 A1
20170171035 Lu et al. Jun 2017 A1
20170171703 Maheswaranathan Jun 2017 A1
20170171803 Maheswaranathan Jun 2017 A1
20170180359 Wolski et al. Jun 2017 A1
20170180577 Nguon et al. Jun 2017 A1
20170181299 Shi et al. Jun 2017 A1
20170190192 Delano et al. Jul 2017 A1
20170193432 Bernhardt Jul 2017 A1
20170193461 Jonas et al. Jul 2017 A1
20170193727 Van Horn et al. Jul 2017 A1
20170199266 Rice et al. Jul 2017 A1
20170200108 Au et al. Jul 2017 A1
20170200275 McCloskey et al. Jul 2017 A1
20170206547 Vise Jul 2017 A1
Foreign Referenced Citations (2)
Number Date Country
2013163789 Nov 2013 WO
2015195415 Dec 2015 WO
Non-Patent Literature Citations (3)
Entry
Extended Search Report in related European Application No. 18156172.1 dated Mar. 22, 2018, pp. 1-7.
Office Action for European Application No. 18156172.1 dated Jul. 29, 2019, 6 pages.
Summons to Attend Oral Proceedings for European Application No. 18156172.1, dated Feb. 6, 2020, 7 pages.
Related Publications (1)
Number Date Country
20180232688 A1 Aug 2018 US
Provisional Applications (1)
Number Date Country
62457242 Feb 2017 US