Product identification using electroencephalography

Information

  • Patent Grant
  • Patent Number
    10,373,143
  • Date Filed
    Thursday, September 24, 2015
  • Date Issued
    Tuesday, August 6, 2019
Abstract
An EEG POS system has an EEG device that detects electrical signals representing brain waves. A database of brain wave profiles represents a plurality of items to be identified. A live signal analyzer compares electrical signals from the EEG device with stored brain wave profiles in the database to identify entries in the database representing items that match the electrical signals from the EEG device, where items whose stored brain wave profiles match the electrical signals are considered identified items. A POS terminal is coupled to the live signal analyzer in order to log and tally items for a transaction.
Description
FIELD OF THE INVENTION

The present invention relates to product identification, as for example in point of sale terminals using Electroencephalography (EEG).


BACKGROUND

Generally speaking, barcode scanning, in particular two dimensional barcode scanning, requires a great deal of image processing, an image in sharp focus, and adequate lighting conditions. Accurate scanning can be hindered when these conditions are not met or when there is excessive motion. In addition, the barcode should ideally be properly positioned within the field of view of an imager, which may be unintuitive to aim. If any of these preconditions is not met, the barcode often cannot be deciphered. Also, the barcode itself is sometimes printed poorly or becomes damaged during the life of the product, leaving codes that are hard or impossible to read. Some items, such as produce, often do not even have barcodes and have to be keyed in manually by a cashier, which is time consuming and prone to human error. It is also quite simple for a thief to swap the barcode of an expensive item with the barcode of a much cheaper product. These issues can cause major problems and be very expensive for businesses.


Therefore, a need exists for a system that enhances barcode reading or replaces it with another identification method.


SUMMARY

Accordingly, in one aspect, the present invention embraces use of EEG data to aid in identification of items in order to process a Point-Of-Sale transaction.


In an example embodiment, an EEG POS system has an EEG device that detects electrical signals representing brain waves. A database of brain wave profiles represents a plurality of items to be identified. A live signal analyzer compares electrical signals from the EEG device with stored brain wave profiles in the database to identify entries in the database representing items that match the electrical signals from the EEG device, where items whose stored brain wave profiles match the electrical signals are considered identified items. A POS terminal is coupled to the live signal analyzer in order to log and tally items for a transaction.


In another example embodiment, an electroencephalograph (EEG) point of sale system has an EEG device that is configured to detect a plurality of electrical signals representing brain waves. A database of brain wave profiles represents a plurality of items to be identified. A live signal analyzer compares electrical signals from the EEG device with stored brain wave profiles in the database to identify entries in the database representing items that match the electrical signals from the EEG device, where items whose stored brain wave profiles match the electrical signals are considered identified items.


In certain example implementations, the system also has a point of sale terminal coupled to the live signal analyzer that logs and tallies items for a transaction, where the live signal analyzer provides item identification and price data to the point of sale terminal for identified items. In certain example implementations, the live signal analyzer comprises a programmed processor coupled to the EEG device, the point of sale terminal, and the database. In certain example implementations, the EEG device is configured as headgear that is to be worn by a user. In certain example implementations, a brain response profiler generates a brain response profile for an item from the EEG device and generates a database entry for the item.


In certain example implementations, the brain response profiler is implemented using a programmed processor coupled to the EEG device and the database. In certain example implementations, the live signal analyzer identifies entries in the database representing items that match the electrical signals from the EEG device by cross correlating the plurality of electrical signals from the EEG device with stored database entries representing a plurality of items. In certain example implementations, the electrical signals from the EEG device are converted to frequency domain signals and where the brain response profile is a frequency domain profile.


In yet another example embodiment, a method involves receiving electroencephalograph (EEG) data generated when a user is exposed to an item that is to be identified; at a programmed processor, comparing the EEG data to a plurality of brain wave profiles stored in a database, the stored brain wave profiles corresponding to identifiable items; ascertaining that a match exists between the EEG data and a stored brain wave profile for a particular identifiable item; retrieving information from the database associated with the particular identifiable item; and passing the information retrieved from the database to a point of sale terminal.


In certain example implementations, the information retrieved from the database comprises item identification and price data. In certain example implementations, the comparing comprises calculating a cross correlation between the EEG data and a plurality of the stored brain wave profiles. In certain example implementations, the method further involves processing the EEG data using a fast Fourier transform. In certain example implementations, the EEG data and the brain wave profiles are represented in the frequency domain.


In another aspect, the present invention involves a training method, including receiving EEG training data from an EEG device that is generated when a user is exposed to a training item; generating an item brain wave profile for the training item that characterizes the EEG training data along with data identifying the training item; and storing the item brain wave profile in a database.


In certain example implementations, the method further involves receiving electroencephalograph (EEG) data generated when a user is exposed to an item that is to be identified; at a programmed processor, comparing the EEG data to a plurality of item brain wave profiles stored in the database; ascertaining that a match exists between the EEG data and a stored item brain wave profile for a particular identifiable item; retrieving information from the database associated with the particular identifiable item; and passing the information retrieved from the database to a point of sale terminal.


In certain example implementations, the information retrieved from the database includes item identification and price data. In certain example implementations, the comparing involves calculating a cross correlation between the EEG data and a plurality of the stored brain wave profiles. In certain example implementations, processing the EEG data involves using a fast Fourier transform. In certain example implementations, the EEG data and the brain wave profiles are represented in the frequency domain.


The foregoing illustrative summary, as well as other exemplary objectives and/or advantages of the invention, and the manner in which the same are accomplished, are further explained within the following detailed description and its accompanying drawings.





BRIEF DESCRIPTION OF THE DRAWINGS


FIG. 1 depicts a logical block diagram of a system consistent with certain example embodiments of the present invention.



FIG. 2 is an example of a flow chart depicting a training process as used with certain example embodiments consistent with the present invention.



FIG. 3 is an example of a flow chart depicting an operational process as used in certain example embodiments consistent with the present invention.



FIG. 4 is an example system block diagram for an illustrative system consistent with the present teachings.



FIG. 5 is an example of a flow chart of an overall process including training and operation of an example system consistent with the present invention.





DETAILED DESCRIPTION

The present invention embraces methods and apparatus using electroencephalograph data to scan items to be checked out at a point of sale terminal, for example, at a retail establishment.


As previously noted, barcode scanning, in particular two dimensional barcode scanning, requires a great deal of image processing, minimal motion, sharp focus, proper placement in a scanner's field of view, clear barcode printing and adequate lighting conditions. When these conditions are not met, accurate scanning is hindered and the barcode often cannot be deciphered.


As the form factors of our products continue to evolve, so too will the way in which we interact with them. Today there are several ways of interfacing with smart devices other than traditional hardware buttons. Touch screen gesturing, voice recognition, inertial-sensor motion detection, 3D-sensor gesture recognition and other methods have become commonplace for interfacing with a computer. More recently, many advances have been made in the brain-computer interface (BCI). All BCI devices on the market today use electroencephalography (EEG) as their core technology. This involves the placement of an array of electrodes on the head, which measure voltage fluctuations resulting from ionic current flows within the neurons of the brain. These electromagnetic signals are recorded and processed, and the source of the activity is isolated.


Embodiments consistent with the present invention can address the above problems by removing most of the preconditions necessary to successfully identify a product. The approach removes the need for image processing and mechanical autofocus routines, can operate in extremely low light conditions, and is motion tolerant. It also removes the need for a barcode altogether, thus eliminating the issues of print quality, code damage, and theft. It also facilitates the identification of items like produce, which typically do not carry barcodes. This is done by replacing or supplementing the current method of product identification using a barcode scanner with the human brain in cooperation with EEG technology.


Recent advances in Electroencephalography (EEG) have taken the ability to read electronic signals produced by the brain out of the lab and into more mainstream applications. Relatively inexpensive EEG devices have been brought to market that do not require shaving of the subject's head or gels of any kind. Such devices can be easily worn by the user and are quite unobtrusive and even stylish.


In accord with the present discussion, an EEG device is used to identify a product that is being viewed without the use of barcode technology and unreliable image processing (or as a supplement thereto). Certain embodiments utilize the extremely efficient object recognition algorithms ingrained within the human brain to determine what object is currently being looked at. Such embodiments also capitalize on the focusing and light sensitivity powers of the human eye to make sure the object is always in focus and perfectly exposed. Real-time readings from the EEG may be compared to known readings that have been previously cataloged for the current user. In other words, certain embodiments of this invention aim to heavily leverage what hundreds of thousands of years of evolution have given humans.


In accord with certain embodiments, an Electroencephalography (EEG) device, e.g., a device similar to the Emotiv EPOC product (commercially available from Emotiv, Inc., 490 Post St. Suite 824, San Francisco, Calif. 94102 USA), is used to characterize brain waves for purposes of identifying products at checkout. In one example, a training process is conducted in which a retail store clerk is shown all or a portion of the products available for sale in a store while wearing the EEG device. The employee's brain response to the sight of each object can be recorded and cataloged for that particular employee. After training is complete, there exists a catalog of unique brain responses (one per employee) paired with an identifier for each product within the system.


In the example of a grocery store, when products are placed on a checkout conveyor belt, the store clerk wearing the EEG device simply needs to look at each item individually while bagging them. The EEG device produces an EEG representation of the clerk's brain responses. This EEG representation is then compared to the pre-cataloged list of responses acquired during employee training to identify the products that are currently being looked at. When a match is found between the current EEG signal and the pre-cataloged list of responses, visual and/or audible feedback can be produced to confirm to the clerk that the product has been identified. The system can record the associated item and its price for checkout.


This technology can be used standalone, combined with existing barcode reader technology, or paired with eye-direction detection so that one barcode in a field of view can be selected to trigger the scan of the desired code.


An exemplary embodiment is depicted from a logical block diagram perspective as system 10 of FIG. 1, in which an EEG device 14 produces signals that represent the brain waves of a user viewing an object. While training this device to the brain wave signals of a particular user, a brain response profiler 18 receives signals from each of a plurality (e.g., sixteen) of electrodes affixed to the user's scalp. This placement of the electrodes can be accomplished using headgear equipped with EEG electrodes as the EEG device 14.


The brain wave signals represented by voltages picked up at each of the electrodes are processed by the brain response profiler to generate a profile of the user's brain waves when the user is presented with a visual (and possibly tactile) exposure to an item that is to be profiled. Such a profile is generated for each of a plurality of items representing inventory in a retail establishment. The profile of the brain waves is associated with an identifier of the item and a price to be charged for that item to complete a profile record for each of the items to be processed. The profile can then be stored to a database to produce stored brain response profiles 22.
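

By way of a non-limiting illustration, a brain response profile record of the kind described above might be represented as in the following Python sketch; the ItemProfile class, its field names, and the array shapes are hypothetical and are chosen only to show how a profile can be paired with an item identifier and price.

    # Illustrative sketch only; the record layout is an assumption, not the patented format.
    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class ItemProfile:
        item_id: str          # identifier of the item (e.g., a SKU)
        item_name: str        # human-readable description
        price: float          # price to be charged for the item
        profile: np.ndarray   # N electrodes x K samples matrix of evoked potentials

    def build_profile_record(item_id, item_name, price, electrode_samples):
        """Assemble a profile record from per-electrode sample sequences."""
        profile = np.asarray(electrode_samples, dtype=float)  # shape (N, K)
        return ItemProfile(item_id, item_name, price, profile)

    # Example: a synthetic 16-electrode, 256-sample response for one item.
    record = build_profile_record("0001", "Watermelon", 6.95,
                                  np.random.randn(16, 256))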


When the brain response profiles 22 are completed for each item, the training process is complete. The brain wave signals during normal operation of a user (e.g., a retail clerk) from EEG device 14 are then passed to a live signal analyzer 26. The live signal analyzer 26 receives the EEG signals as the user views items that are to be checked out one at a time. The live signal analyzer 26 generates a profile in a manner similar to the brain response profiler 18 and conducts a comparison of the live brain response profile with the stored brain response profiles stored at 22 in order to identify a close match. This can be accomplished using any number of techniques, including cross-correlation of the profiles to seek the highest correlation. When a match is achieved, the item is identified by the live signal analyzer, and the identity of the item and its price are transferred to a point of sale (POS) application for tallying and logging for use in completing the retail transaction.
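

A minimal Python sketch of such a cross-correlation comparison follows; the normalized-correlation scoring, the dictionary of stored profiles, and the match threshold are illustrative assumptions rather than requirements of the system.

    # Illustrative sketch; assumes profiles are fixed-size N x K matrices.
    import numpy as np

    def correlation_score(live, stored):
        """Normalized correlation between two N x K profile matrices."""
        a = (live - live.mean()) / (live.std() + 1e-12)
        b = (stored - stored.mean()) / (stored.std() + 1e-12)
        return float(np.mean(a * b))

    def identify_item(live_profile, stored_profiles, threshold=0.8):
        """Return the best-matching item id, or None if no profile clears the threshold."""
        best_id, best_score = None, threshold
        for item_id, profile in stored_profiles.items():
            score = correlation_score(live_profile, profile)
            if score > best_score:
                best_id, best_score = item_id, score
        return best_id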


Referring now to FIG. 2, a training process is described in connection with example block diagram 50, starting at 54 where the user being trained views items while wearing an EEG device that is communicatively coupled to a host computer system. As an item is being viewed, the brain wave signals associated with the item are sent from each of the EEG electrodes to the host system for brain response profiling, and the brain response profile is stored at 58. If the last item to be trained has not been reached at 62, the user then proceeds to the next item at 54 and the process is repeated. When the last item has been reached at 62, an identification of each product and the brain response profile associated therewith can be stored (e.g., to cloud storage) for later retrieval during the operational process.
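

The training loop of FIG. 2 can be sketched as follows, assuming a hypothetical read_eeg_response() helper that returns the electrode samples captured while the user views an item, and modeling the profile database as a simple dictionary; the item attribute names are likewise assumptions made for illustration.

    def train_profiles(items, read_eeg_response):
        """Build a brain response profile for every item the user is shown."""
        database = {}
        for item in items:                       # block 54: user views each item
            response = read_eeg_response(item)   # signals from the EEG electrodes
            database[item.item_id] = {           # block 58: profile and store
                "name": item.item_name,
                "price": item.price,
                "profile": response,
            }
        return database                          # block 62 reached: training complete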


Referring now to FIG. 3, an operational process for use of the EEG technology for item identification in a retail environment by a user such as a store clerk is depicted in example flow chart 100 starting at 102. At 102, a user occupies the checkout area wearing an EEG device that is wired or wirelessly coupled to a host processor. A transaction is initiated and the user is presented with one or more items for checkout in a normal manner at 106. One by one, the user views each item at 110 to produce a brain wave response that is represented by the signals from each of the electrodes of the EEG device and sent to a host system's live signal analyzer at 118.


The host system's live signal analyzer compares the live EEG signals with stored brain response profiles for the user at 122 in order to identify a matching item in the brain response/product profiles. If no match is achieved at 126, a failure routine is entered at 130 to allow for other data entry techniques. Moreover, excessive failures to identify an item may be indicative of an improperly installed EEG device or a need for further training to better characterize the items that are to be identified.


When a match is achieved at 126, the item is considered “scanned” and identified at 134 and it can be logged to the current transaction along with an identifier of the item and a price to be charged. Feedback can be generated at 138 upon completion of a scan to let the user know that the item has been successfully scanned and the next item (if any) can be viewed. If the item is not the last item at 142, the next item is retrieved at 146 and control passes to 110 to repeat the process. When the last item is reached at 142, the transaction can enter a final stage in which payment is processed at 150 and the transaction can be deemed completed.
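

The operational loop of FIG. 3 can be sketched in the same style; identify_item() is the cross-correlation matcher sketched earlier, the database is the dictionary produced by the training sketch, and the point-of-sale helpers (manual_entry, beep, process_payment) are hypothetical stand-ins for whatever the POS application provides.

    def checkout(items_to_scan, read_eeg_response, database, pos_terminal):
        """Identify each presented item from live EEG data and log it to a transaction."""
        transaction = []
        for item in items_to_scan:                       # block 110: clerk views each item
            live = read_eeg_response(item)               # block 118: live EEG signals
            profiles = {k: v["profile"] for k, v in database.items()}
            item_id = identify_item(live, profiles)      # block 122: compare to stored profiles
            if item_id is None:                          # blocks 126/130: no match, fall back
                item_id = pos_terminal.manual_entry(item)
            else:
                pos_terminal.beep()                      # block 138: feedback to the clerk
            entry = database.get(item_id)
            if entry:
                transaction.append((entry["name"], entry["price"]))  # block 134: log the item
        return pos_terminal.process_payment(transaction)             # block 150: payment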


Turning now to FIG. 4, an example system 200 is depicted. A user 204 is connected to an EEG device 14 that is suitably interfaced to a host processor 208 either with a wired connection such as a universal serial bus (USB) connection or via a wireless connection. The connection of EEG device 14 is represented functionally in this illustration as a connection to bus 212 of the system 200.


The host processor 208 may be locally or remotely situated or cloud based without limitation. Moreover, processor 208 can be made up of one processor or a plurality of processors. The processor 208 is coupled to memory 216 that includes routines or modules that correspond to the functions of brain response profiler, live signal analyzer, and Point-Of-Sale applications. The processor 208 is further communicatively coupled to EEG database 220 that contains entries corresponding to brain wave profiles and other data for a plurality of identifiable items. Processor 208 is further communicatively coupled to a Point-Of-Sale terminal 224 that is used to carry out a financial transaction with a customer.


During the training process discussed above, the EEG device 14 measures brain wave signals from the user as the user views and/or is otherwise exposed to an item that could be purchased. The brain response profiler operation is carried out by processor 208 to create a profile of each item and the profile is then stored in the database 220.


During live operation, the user 204 is exposed to items that are being purchased and the associated live EEG data is produced by EEG device 14. This live EEG data is analyzed by the processor 208 using the live signal analyzer process to conduct a comparison between the live EEG data and the EEG database entries to identify the item. Once the item is identified, information such as an item name or description can be retrieved from the database along with price for the item. This information is then transferred to the POS application for use at the POS terminal to complete a transaction by adding the item to a list of items being purchased and adding the price to a transaction tally.


An example of the operation of the system while carrying out a live transaction is depicted by the flow chart 250 of FIG. 5, starting at 252 where the user is exposed to a product to be identified. Electroencephalograph (EEG) data is generated when the user is exposed to an item that is to be identified at 256. The raw data from the EEG can be used, or the EEG data can be converted to a characterization of the EEG response at 260. The processor then compares the EEG data to a plurality of brain wave profiles stored in a database at 264, where the stored brain wave profiles correspond to identifiable items in the database. At 268, the process ascertains that a match exists between the EEG data and a stored brain wave profile for a particular identifiable item. Information is retrieved from the database entry associated with the particular identifiable item at 272 (e.g., an identification of the item and a cost). The retrieved information can then be transferred at 276 from the database to the point of sale terminal so that the price and item identifier can be entered into the POS transaction record. The system is then ready to process the next product at 280.


EEG technology is used to measure brain activity, which is generally classified by frequency bands: Delta (δ, below 4 Hz), Theta (θ, 4-7 Hz), Alpha (α, 8-12 Hz), Beta (β, 13-30 Hz) and Gamma (γ, above 30 Hz). The output of an EEG device is a collection of signals picked up by sensors placed about a user's head, which pick up brain activity, in the form of measured voltage fluctuations resulting from ionic current within the neurons of the brain, in the above frequency bands. These signals can represent the intensity and frequency of brain waves as received by each of the sensors. In accord with the present teachings, a collection of such signals received by a plurality of sensors can be used as a "signature" that identifies a user's brain activity when visually stimulated by viewing a particular object.
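

A trivial sketch of mapping a frequency to one of the bands listed above follows; the handling of frequencies that fall between the listed band edges (for example, between 7 and 8 Hz) is an assumption made for illustration, not something specified by the present teachings.

    def eeg_band(freq_hz):
        """Map a frequency in Hz to its conventional EEG band name."""
        if freq_hz < 4:
            return "delta"
        if freq_hz <= 7:
            return "theta"
        if freq_hz <= 12:
            return "alpha"
        if freq_hz <= 30:
            return "beta"
        return "gamma"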


An evoked potential is the electrical response of the brain to a stimulus. In the present case, the EEG device measures electrical potentials at the electrodes that are evoked in response to visual stimulation of the brain when the user is exposed to an item that is to be identified. In general, N sensors (e.g., N=16) in an EEG device will produce N output signals which may, in certain implementations, be represented as a sequence of K samples of the electrical potential present at each of the electrodes. So, for N electrodes, one form of EEG output can be represented as a matrix as shown:


    Sample 1    V1(1)    V2(1)    V3(1)    V4(1)  . . .  VN(1)
    Sample 2    V1(2)    V2(2)    V3(2)    V4(2)  . . .  VN(2)
      . . .
    Sample K    V1(K)    V2(K)    V3(K)    V4(K)  . . .  VN(K)

In one embodiment consistent with the present teachings, this matrix of sample values (or a normalized version thereof) can be used directly as a brain wave profile for storage in the database along with other information such as the following simple record for an example item:


    Item Name      Price    Per     Profile
    Watermelon     $6.95    Each    [watermelon profile matrix]


It will be appreciated that when items are sold by weight, after recognizing the item, the actual cost that is tallied for checkout will factor in the weight of the product. It is further noted that the above example profile is somewhat minimalist, since the database can also store additional information such as inventory-related data, manufacturer or supplier, and rebate information.
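

A small sketch of the by-weight tally noted above is shown here, using a record layout modeled on the example table (the "Per" field being either "Each" or a weight unit); the second record and the scale reading are hypothetical values used only for illustration.

    def line_item_cost(record, weight=None):
        """Compute the cost tallied at checkout for an identified item."""
        if record["per"] == "Each":
            return record["price"]
        # Sold by weight: the stored price is per unit weight (e.g., per pound).
        return record["price"] * weight

    print(line_item_cost({"name": "Watermelon", "price": 6.95, "per": "Each"}))        # 6.95
    print(line_item_cost({"name": "Grapes", "price": 2.99, "per": "lb"}, weight=1.6))  # 4.784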


During operation, the live EEG data as read by the EEG device can be arranged in a matrix in the same manner as that used in the profile and then compared to the profile matrices stored in the database using any suitable comparison technique to identify an item in the database that can be considered a match for the item represented by the live EEG data.


Many variations are possible. For example, the profile data for a particular item (as well as the live EEG data representation) can be processed to either simplify calculations or enhance accuracy. In one example, data averaging techniques can be used. In other examples, the time domain sample data can be processed by a fast Fourier transform (FFT) to convert the data to a sequence of samples representing the brain waves in the frequency domain. In yet other examples, the frequency domain data can be added together and possibly normalized in order to construct a composite set of data for the set of N electrodes. This may reduce the volume of data and allow the data to be classified, speeding up the comparison operations.
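

One way to sketch the frequency-domain variation described above is shown below: each electrode's time-domain samples are converted with a fast Fourier transform, the magnitude spectra are summed across the N electrodes to form a composite, and the composite is normalized. The array shapes and the choice of magnitude spectra are illustrative assumptions.

    # Illustrative sketch of an FFT-based composite profile; not the only possible variation.
    import numpy as np

    def frequency_domain_profile(samples):
        """samples: N x K matrix of time-domain EEG samples -> normalized composite spectrum."""
        spectra = np.abs(np.fft.rfft(samples, axis=1))   # per-electrode magnitude spectra
        composite = spectra.sum(axis=0)                  # add across the N electrodes
        norm = np.linalg.norm(composite)
        return composite / norm if norm > 0 else composite

    # Example: reduce a 16 x 256 sample matrix to a single 129-bin profile.
    profile = frequency_domain_profile(np.random.randn(16, 256))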


Those skilled in the art will appreciate that other techniques for manipulation and interpretation of EEG signals can be utilized including those which involve averaging the EEG activity time-locked to the presentation of a stimulus and other techniques known in the field of signal processing and EEG interpretation for cognitive science, cognitive psychology, and psychophysiology. Algorithms to determine what is considered a match and what type of tolerance is acceptable can be determined experimentally.


* * *

To supplement the present disclosure, this application incorporates entirely by reference the following commonly assigned patents, patent application publications, and patent applications:

  • U.S. Pat. Nos. 6,832,725; 7,128,266; 7,159,783; 7,413,127; 7,726,575; 8,294,969; 8,317,105; 8,322,622; 8,366,005; 8,371,507; 8,376,233; 8,381,979; 8,390,909; 8,408,464; 8,408,468; 8,408,469; 8,424,768; 8,448,863; 8,457,013; 8,459,557; 8,469,272; 8,474,712; 8,479,992; 8,490,877; 8,517,271; 8,523,076; 8,528,818; 8,544,737; 8,548,242; 8,548,420; 8,550,335; 8,550,354; 8,550,357; 8,556,174; 8,556,176; 8,556,177; 8,559,767; 8,599,957; 8,561,895; 8,561,903; 8,561,905; 8,565,107; 8,571,307; 8,579,200; 8,583,924; 8,584,945; 8,587,595; 8,587,697; 8,588,869; 8,590,789; 8,596,539; 8,596,542; 8,596,543; 8,599,271; 8,599,957; 8,600,158; 8,600,167; 8,602,309; 8,608,053; 8,608,071; 8,611,309; 8,615,487; 8,616,454; 8,621,123; 8,622,303; 8,628,013; 8,628,015; 8,628,016; 8,629,926; 8,630,491; 8,635,309; 8,636,200; 8,636,212; 8,636,215; 8,636,224; 8,638,806; 8,640,958; 8,640,960; 8,643,717; 8,646,692; 8,646,694; 8,657,200; 8,659,397; 8,668,149; 8,678,285; 8,678,286; 8,682,077; 8,687,282; 8,692,927; 8,695,880; 8,698,949; 8,717,494; 8,717,494; 8,720,783; 8,723,804; 8,723,904; 8,727,223; D702,237; 8,740,082; 8,740,085; 8,746,563; 8,750,445; 8,752,766; 8,756,059; 8,757,495; 8,760,563; 8,763,909; 8,777,108; 8,777,109; 8,779,898; 8,781,520; 8,783,573; 8,789,757; 8,789,758; 8,789,759; 8,794,520; 8,794,522; 8,794,525; 8,794,526; 8,798,367; 8,807,431; 8,807,432; 8,820,630; 8,822,848; 8,824,692; 8,824,696; 8,842,849; 8,844,822; 8,844,823; 8,849,019; 8,851,383; 8,854,633; 8,866,963; 8,868,421; 8,868,519; 8,868,802; 8,868,803; 8,870,074; 8,879,639; 8,880,426; 8,881,983; 8,881,987; 8,903,172; 8,908,995; 8,910,870; 8,910,875; 8,914,290; 8,914,788; 8,915,439; 8,915,444; 8,916,789; 8,918,250; 8,918,564; 8,925,818; 8,939,374; 8,942,480; 8,944,313; 8,944,327; 8,944,332; 8,950,678; 8,967,468; 8,971,346; 8,976,030; 8,976,368; 8,978,981; 8,978,983; 8,978,984; 8,985,456; 8,985,457; 8,985,459; 8,985,461; 8,988,578; 8,988,590; 8,991,704; 8,996,194; 8,996,384; 9,002,641; 9,007,368; 9,010,641; 9,015,513; 9,016,576; 9,022,288; 9,030,964; 9,033,240; 9,033,242; 9,036,054; 9,037,344; 9,038,911; 9,038,915; 9,047,098; 9,047,359; 9,047,420; 9,047,525; 9,047,531; 9,053,055; 9,053,378; 9,053,380; 9,058,526; 9,064,165; 9,064,167; 9,064,168; 9,064,254; 9,066,032; 9,070,032;
  • U.S. Design Pat. Nos. D716,285; D723,560; D730,357; D730,901; D730,902; D733,112; D734,339;
  • International Publication No. 2013/163789; International Publication No. 2013/173985; International Publication No. 2014/019130; International Publication No. 2014/110495;
  • U.S. Patent Application Publication No. 2008/0185432; U.S. Patent Application Publication No. 2009/0134221; U.S. Patent Application Publication No. 2010/0177080; U.S. Patent Application Publication No. 2010/0177076; U.S. Patent Application Publication No. 2010/0177707; U.S. Patent Application Publication No. 2010/0177749; U.S. Patent Application Publication No. 2010/0265880; U.S. Patent Application Publication No. 2011/0202554; U.S. Patent Application Publication No. 2012/0111946; U.S. Patent Application Publication No. 2012/0168511; U.S. Patent Application Publication No. 2012/0168512; U.S. Patent Application Publication No. 2012/0193423; U.S. Patent Application Publication No. 2012/0203647; U.S. Patent Application Publication No. 2012/0223141; U.S. Patent Application Publication No. 2012/0228382; U.S. Patent Application Publication No. 2012/0248188; U.S. Patent Application Publication No. 2013/0043312; U.S. Patent Application Publication No. 2013/0082104; U.S. Patent Application Publication No. 2013/0175341; U.S. Patent Application Publication No. 2013/0175343; U.S. Patent Application Publication No. 2013/0257744; U.S. Patent Application Publication No. 2013/0257759; U.S. Patent Application Publication No. 2013/0270346; U.S. Patent Application Publication No. 2013/0287258; U.S. Patent Application Publication No. 2013/0292475; U.S. Patent Application Publication No. 2013/0292477; U.S. Patent Application Publication No. 2013/0293539; U.S. Patent Application Publication No. 2013/0293540; U.S. Patent Application Publication No. 2013/0306728; U.S. Patent Application Publication No. 2013/0306731; U.S. Patent Application Publication No. 2013/0307964; U.S. Patent Application Publication No. 2013/0308625; U.S. Patent Application Publication No. 2013/0313324; U.S. Patent Application Publication No. 2013/0313325; U.S. Patent Application Publication No. 2013/0342717; U.S. Patent Application Publication No. 2014/0001267; U.S. Patent Application Publication No. 2014/0008439; U.S. Patent Application Publication No. 2014/0025584; U.S. Patent Application Publication No. 2014/0034734; U.S. Patent Application Publication No. 2014/0036848; U.S. Patent Application Publication No. 2014/0039693; U.S. Patent Application Publication No. 2014/0042814; U.S. Patent Application Publication No. 2014/0049120; U.S. Patent Application Publication No. 2014/0049635; U.S. Patent Application Publication No. 2014/0061306; U.S. Patent Application Publication No. 2014/0063289; U.S. Patent Application Publication No. 2014/0066136; U.S. Patent Application Publication No. 2014/0067692; U.S. Patent Application Publication No. 2014/0070005; U.S. Patent Application Publication No. 2014/0071840; U.S. Patent Application Publication No. 2014/0074746; U.S. Patent Application Publication No. 2014/0076974; U.S. Patent Application Publication No. 2014/0078341; U.S. Patent Application Publication No. 2014/0078345; U.S. Patent Application Publication No. 2014/0097249; U.S. Patent Application Publication No. 2014/0098792; U.S. Patent Application Publication No. 2014/0100813; U.S. Patent Application Publication No. 2014/0103115; U.S. Patent Application Publication No. 2014/0104413; U.S. Patent Application Publication No. 2014/0104414; U.S. Patent Application Publication No. 2014/0104416; U.S. Patent Application Publication No. 2014/0104451; U.S. Patent Application Publication No. 2014/0106594; U.S. Patent Application Publication No. 2014/0106725; U.S. Patent Application Publication No. 2014/0108010; U.S. Patent Application Publication No. 
2014/0108402; U.S. Patent Application Publication No. 2014/0110485; U.S. Patent Application Publication No. 2014/0114530; U.S. Patent Application Publication No. 2014/0124577; U.S. Patent Application Publication No. 2014/0124579; U.S. Patent Application Publication No. 2014/0125842; U.S. Patent Application Publication No. 2014/0125853; U.S. Patent Application Publication No. 2014/0125999; U.S. Patent Application Publication No. 2014/0129378; U.S. Patent Application Publication No. 2014/0131438; U.S. Patent Application Publication No. 2014/0131441; U.S. Patent Application Publication No. 2014/0131443; U.S. Patent Application Publication No. 2014/0131444; U.S. Patent Application Publication No. 2014/0131445; U.S. Patent Application Publication No. 2014/0131448; U.S. Patent Application Publication No. 2014/0133379; U.S. Patent Application Publication No. 2014/0136208; U.S. Patent Application Publication No. 2014/0140585; U.S. Patent Application Publication No. 2014/0151453; U.S. Patent Application Publication No. 2014/0152882; U.S. Patent Application Publication No. 2014/0158770; U.S. Patent Application Publication No. 2014/0159869; U.S. Patent Application Publication No. 2014/0166755; U.S. Patent Application Publication No. 2014/0166759; U.S. Patent Application Publication No. 2014/0168787; U.S. Patent Application Publication No. 2014/0175165; U.S. Patent Application Publication No. 2014/0175172; U.S. Patent Application Publication No. 2014/0191644; U.S. Patent Application Publication No. 2014/0191913; U.S. Patent Application Publication No. 2014/0197238; U.S. Patent Application Publication No. 2014/0197239; U.S. Patent Application Publication No. 2014/0197304; U.S. Patent Application Publication No. 2014/0214631; U.S. Patent Application Publication No. 2014/0217166; U.S. Patent Application Publication No. 2014/0217180; U.S. Patent Application Publication No. 2014/0231500; U.S. Patent Application Publication No. 2014/0232930; U.S. Patent Application Publication No. 2014/0247315; U.S. Patent Application Publication No. 2014/0263493; U.S. Patent Application Publication No. 2014/0263645; U.S. Patent Application Publication No. 2014/0267609; U.S. Patent Application Publication No. 2014/0270196; U.S. Patent Application Publication No. 2014/0270229; U.S. Patent Application Publication No. 2014/0278387; U.S. Patent Application Publication No. 2014/0278391; U.S. Patent Application Publication No. 2014/0282210; U.S. Patent Application Publication No. 2014/0284384; U.S. Patent Application Publication No. 2014/0288933; U.S. Patent Application Publication No. 2014/0297058; U.S. Patent Application Publication No. 2014/0299665; U.S. Patent Application Publication No. 2014/0312121; U.S. Patent Application Publication No. 2014/0319220; U.S. Patent Application Publication No. 2014/0319221; U.S. Patent Application Publication No. 2014/0326787; U.S. Patent Application Publication No. 2014/0332590; U.S. Patent Application Publication No. 2014/0344943; U.S. Patent Application Publication No. 2014/0346233; U.S. Patent Application Publication No. 2014/0351317; U.S. Patent Application Publication No. 2014/0353373; U.S. Patent Application Publication No. 2014/0361073; U.S. Patent Application Publication No. 2014/0361082; U.S. Patent Application Publication No. 2014/0362184; U.S. Patent Application Publication No. 2014/0363015; U.S. Patent Application Publication No. 2014/0369511; U.S. Patent Application Publication No. 2014/0374483; U.S. Patent Application Publication No. 2014/0374485; U.S. 
Patent Application Publication No. 2015/0001301; U.S. Patent Application Publication No. 2015/0001304; U.S. Patent Application Publication No. 2015/0003673; U.S. Patent Application Publication No. 2015/0009338; U.S. Patent Application Publication No. 2015/0009610; U.S. Patent Application Publication No. 2015/0014416; U.S. Patent Application Publication No. 2015/0021397; U.S. Patent Application Publication No. 2015/0028102; U.S. Patent Application Publication No. 2015/0028103; U.S. Patent Application Publication No. 2015/0028104; U.S. Patent Application Publication No. 2015/0029002; U.S. Patent Application Publication No. 2015/0032709; U.S. Patent Application Publication No. 2015/0039309; U.S. Patent Application Publication No. 2015/0039878; U.S. Patent Application Publication No. 2015/0040378; U.S. Patent Application Publication No. 2015/0048168; U.S. Patent Application Publication No. 2015/0049347; U.S. Patent Application Publication No. 2015/0051992; U.S. Patent Application Publication No. 2015/0053766; U.S. Patent Application Publication No. 2015/0053768; U.S. Patent Application Publication No. 2015/0053769; U.S. Patent Application Publication No. 2015/0060544; U.S. Patent Application Publication No. 2015/0062366; U.S. Patent Application Publication No. 2015/0063215; U.S. Patent Application Publication No. 2015/0063676; U.S. Patent Application Publication No. 2015/0069130; U.S. Patent Application Publication No. 2015/0071819; U.S. Patent Application Publication No. 2015/0083800; U.S. Patent Application Publication No. 2015/0086114; U.S. Patent Application Publication No. 2015/0088522; U.S. Patent Application Publication No. 2015/0096872; U.S. Patent Application Publication No. 2015/0099557; U.S. Patent Application Publication No. 2015/0100196; U.S. Patent Application Publication No. 2015/0102109; U.S. Patent Application Publication No. 2015/0115035; U.S. Patent Application Publication No. 2015/0127791; U.S. Patent Application Publication No. 2015/0128116; U.S. Patent Application Publication No. 2015/0129659; U.S. Patent Application Publication No. 2015/0133047; U.S. Patent Application Publication No. 2015/0134470; U.S. Patent Application Publication No. 2015/0136851; U.S. Patent Application Publication No. 2015/0136854; U.S. Patent Application Publication No. 2015/0142492; U.S. Patent Application Publication No. 2015/0144692; U.S. Patent Application Publication No. 2015/0144698; U.S. Patent Application Publication No. 2015/0144701; U.S. Patent Application Publication No. 2015/0149946; U.S. Patent Application Publication No. 2015/0161429; U.S. Patent Application Publication No. 2015/0169925; U.S. Patent Application Publication No. 2015/0169929; U.S. Patent Application Publication No. 2015/0178523; U.S. Patent Application Publication No. 2015/0178534; U.S. Patent Application Publication No. 2015/0178535; U.S. Patent Application Publication No. 2015/0178536; U.S. Patent Application Publication No. 2015/0178537; U.S. Patent Application Publication No. 2015/0181093; U.S. Patent Application Publication No. 2015/0181109;
  • U.S. patent application Ser. No. 13/367,978 for a Laser Scanning Module Employing an Elastomeric U-Hinge Based Laser Scanning Assembly, filed Feb. 7, 2012 (Feng et al.);
  • U.S. patent application Ser. No. 29/458,405 for an Electronic Device, filed Jun. 19, 2013 (Fitch et al.);
  • U.S. patent application Ser. No. 29/459,620 for an Electronic Device Enclosure, filed Jul. 2, 2013 (London et al.);
  • U.S. patent application Ser. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/150,393 for Indicia-reader Having Unitary Construction Scanner, filed Jan. 8, 2014 (Colavito et al.);
  • U.S. patent application Ser. No. 14/200,405 for Indicia Reader for Size-Limited Applications filed Mar. 7, 2014 (Feng et al.);
  • U.S. patent application Ser. No. 14/231,898 for Hand-Mounted Indicia-Reading Device with Finger Motion Triggering filed Apr. 1, 2014 (Van Horn et al.);
  • U.S. patent application Ser. No. 29/486,759 for an Imaging Terminal, filed Apr. 2, 2014 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/257,364 for Docking System and Method Using Near Field Communication filed Apr. 21, 2014 (Showering);
  • U.S. patent application Ser. No. 14/264,173 for Autofocus Lens System for Indicia Readers filed Apr. 29, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/277,337 for MULTIPURPOSE OPTICAL READER, filed May 14, 2014 (Jovanovski et al.);
  • U.S. patent application Ser. No. 14/283,282 for TERMINAL HAVING ILLUMINATION AND FOCUS CONTROL filed May 21, 2014 (Liu et al.);
  • U.S. patent application Ser. No. 14/327,827 for a MOBILE-PHONE ADAPTER FOR ELECTRONIC TRANSACTIONS, filed Jul. 10, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/334,934 for a SYSTEM AND METHOD FOR INDICIA VERIFICATION, filed Jul. 18, 2014 (Hejl);
  • U.S. patent application Ser. No. 14/339,708 for LASER SCANNING CODE SYMBOL READING SYSTEM, filed Jul. 24, 2014 (Xian et al.);
  • U.S. patent application Ser. No. 14/340,627 for an AXIALLY REINFORCED FLEXIBLE SCAN ELEMENT, filed Jul. 25, 2014 (Rueblinger et al.);
  • U.S. patent application Ser. No. 14/446,391 for MULTIFUNCTION POINT OF SALE APPARATUS WITH OPTICAL SIGNATURE CAPTURE filed Jul. 30, 2014 (Good et al.);
  • U.S. patent application Ser. No. 14/452,697 for INTERACTIVE INDICIA READER, filed Aug. 6, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/453,019 for DIMENSIONING SYSTEM WITH GUIDED ALIGNMENT, filed Aug. 6, 2014 (Li et al.);
  • U.S. patent application Ser. No. 14/462,801 for MOBILE COMPUTING DEVICE WITH DATA COGNITION SOFTWARE, filed on Aug. 19, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/483,056 for VARIABLE DEPTH OF FIELD BARCODE SCANNER filed Sep. 10, 2014 (McCloskey et al.);
  • U.S. patent application Ser. No. 14/513,808 for IDENTIFYING INVENTORY ITEMS IN A STORAGE FACILITY filed Oct. 14, 2014 (Singel et al.);
  • U.S. patent application Ser. No. 14/519,195 for HANDHELD DIMENSIONING SYSTEM WITH FEEDBACK filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,179 for DIMENSIONING SYSTEM WITH MULTIPATH INTERFERENCE MITIGATION filed Oct. 21, 2014 (Thuries et al.);
  • U.S. patent application Ser. No. 14/519,211 for SYSTEM AND METHOD FOR DIMENSIONING filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/519,233 for HANDHELD DIMENSIONER WITH DATA-QUALITY INDICATION filed Oct. 21, 2014 (Laffargue et al.);
  • U.S. patent application Ser. No. 14/519,249 for HANDHELD DIMENSIONING SYSTEM WITH MEASUREMENT-CONFORMANCE FEEDBACK filed Oct. 21, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/527,191 for METHOD AND SYSTEM FOR RECOGNIZING SPEECH USING WILDCARDS IN AN EXPECTED RESPONSE filed Oct. 29, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/529,563 for ADAPTABLE INTERFACE FOR A MOBILE COMPUTING DEVICE filed Oct. 31, 2014 (Schoon et al.);
  • U.S. patent application Ser. No. 14/529,857 for BARCODE READER WITH SECURITY FEATURES filed Oct. 31, 2014 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/398,542 for PORTABLE ELECTRONIC DEVICES HAVING A SEPARATE LOCATION TRIGGER UNIT FOR USE IN CONTROLLING AN APPLICATION UNIT filed Nov. 3, 2014 (Bian et al.);
  • U.S. patent application Ser. No. 14/531,154 for DIRECTING AN INSPECTOR THROUGH AN INSPECTION filed Nov. 3, 2014 (Miller et al.);
  • U.S. patent application Ser. No. 14/533,319 for BARCODE SCANNING SYSTEM USING WEARABLE DEVICE WITH EMBEDDED CAMERA filed Nov. 5, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/535,764 for CONCATENATED EXPECTED RESPONSES FOR SPEECH RECOGNITION filed Nov. 7, 2014 (Braho et al.);
  • U.S. patent application Ser. No. 14/568,305 for AUTO-CONTRAST VIEWFINDER FOR AN INDICIA READER filed Dec. 12, 2014 (Todeschini);
  • U.S. patent application Ser. No. 14/573,022 for DYNAMIC DIAGNOSTIC INDICATOR GENERATION filed Dec. 17, 2014 (Goldsmith);
  • U.S. patent application Ser. No. 14/578,627 for SAFETY SYSTEM AND METHOD filed Dec. 22, 2014 (Ackley et al.);
  • U.S. patent application Ser. No. 14/580,262 for MEDIA GATE FOR THERMAL TRANSFER PRINTERS filed Dec. 23, 2014 (Bowles);
  • U.S. patent application Ser. No. 14/590,024 for SHELVING AND PACKAGE LOCATING SYSTEMS FOR DELIVERY VEHICLES filed Jan. 6, 2015 (Payne);
  • U.S. patent application Ser. No. 14/596,757 for SYSTEM AND METHOD FOR DETECTING BARCODE PRINTING ERRORS filed Jan. 14, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/416,147 for OPTICAL READING APPARATUS HAVING VARIABLE SETTINGS filed Jan. 21, 2015 (Chen et al.);
  • U.S. patent application Ser. No. 14/614,706 for DEVICE FOR SUPPORTING AN ELECTRONIC TOOL ON A USER'S HAND filed Feb. 5, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/614,796 for CARGO APPORTIONMENT TECHNIQUES filed Feb. 5, 2015 (Morton et al.);
  • U.S. patent application Ser. No. 29/516,892 for TABLE COMPUTER filed Feb. 6, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/619,093 for METHODS FOR TRAINING A SPEECH RECOGNITION SYSTEM filed Feb. 11, 2015 (Pecorari);
  • U.S. patent application Ser. No. 14/628,708 for DEVICE, SYSTEM, AND METHOD FOR DETERMINING THE STATUS OF CHECKOUT LANES filed Feb. 23, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/630,841 for TERMINAL INCLUDING IMAGING ASSEMBLY filed Feb. 25, 2015 (Gomez et al.);
  • U.S. patent application Ser. No. 14/635,346 for SYSTEM AND METHOD FOR RELIABLE STORE-AND-FORWARD DATA HANDLING BY ENCODED INFORMATION READING TERMINALS filed Mar. 2, 2015 (Sevier);
  • U.S. patent application Ser. No. 29/519,017 for SCANNER filed Mar. 2, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/405,278 for DESIGN PATTERN FOR SECURE STORE filed Mar. 9, 2015 (Zhu et al.);
  • U.S. patent application Ser. No. 14/660,970 for DECODABLE INDICIA READING TERMINAL WITH COMBINED ILLUMINATION filed Mar. 18, 2015 (Kearney et al.);
  • U.S. patent application Ser. No. 14/661,013 for REPROGRAMMING SYSTEM AND METHOD FOR DEVICES INCLUDING PROGRAMMING SYMBOL filed Mar. 18, 2015 (Soule et al.);
  • U.S. patent application Ser. No. 14/662,922 for MULTIFUNCTION POINT OF SALE SYSTEM filed Mar. 19, 2015 (Van Horn et al.);
  • U.S. patent application Ser. No. 14/663,638 for VEHICLE MOUNT COMPUTER WITH CONFIGURABLE IGNITION SWITCH BEHAVIOR filed Mar. 20, 2015 (Davis et al.);
  • U.S. patent application Ser. No. 14/664,063 for METHOD AND APPLICATION FOR SCANNING A BARCODE WITH A SMART DEVICE WHILE CONTINUOUSLY RUNNING AND DISPLAYING AN APPLICATION ON THE SMART DEVICE DISPLAY filed Mar. 20, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/669,280 for TRANSFORMING COMPONENTS OF A WEB PAGE TO VOICE PROMPTS filed Mar. 26, 2015 (Funyak et al.);
  • U.S. patent application Ser. No. 14/674,329 for AIMER FOR BARCODE SCANNING filed Mar. 31, 2015 (Bidwell);
  • U.S. patent application Ser. No. 14/676,109 for INDICIA READER filed Apr. 1, 2015 (Huck);
  • U.S. patent application Ser. No. 14/676,327 for DEVICE MANAGEMENT PROXY FOR SECURE DEVICES filed Apr. 1, 2015 (Yeakley et al.);
  • U.S. patent application Ser. No. 14/676,898 for NAVIGATION SYSTEM CONFIGURED TO INTEGRATE MOTION SENSING DEVICE INPUTS filed Apr. 2, 2015 (Showering);
  • U.S. patent application Ser. No. 14/679,275 for DIMENSIONING SYSTEM CALIBRATION SYSTEMS AND METHODS filed Apr. 6, 2015 (Laffargue et al.);
  • U.S. patent application Ser. No. 29/523,098 for HANDLE FOR A TABLET COMPUTER filed Apr. 7, 2015 (Bidwell et al.);
  • U.S. patent application Ser. No. 14/682,615 for SYSTEM AND METHOD FOR POWER MANAGEMENT OF MOBILE DEVICES filed Apr. 9, 2015 (Murawski et al.);
  • U.S. patent application Ser. No. 14/686,822 for MULTIPLE PLATFORM SUPPORT SYSTEM AND METHOD filed Apr. 15, 2015 (Qu et al.);
  • U.S. patent application Ser. No. 14/687,289 for SYSTEM FOR COMMUNICATION VIA A PERIPHERAL HUB filed Apr. 15, 2015 (Kohtz et al.);
  • U.S. patent application Ser. No. 29/524,186 for SCANNER filed Apr. 17, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/695,364 for MEDICATION MANAGEMENT SYSTEM filed Apr. 24, 2015 (Sewell et al.);
  • U.S. patent application Ser. No. 14/695,923 for SECURE UNATTENDED NETWORK AUTHENTICATION filed Apr. 24, 2015 (Kubler et al.);
  • U.S. patent application Ser. No. 29/525,068 for TABLET COMPUTER WITH REMOVABLE SCANNING DEVICE filed Apr. 27, 2015 (Schulte et al.);
  • U.S. patent application Ser. No. 14/699,436 for SYMBOL READING SYSTEM HAVING PREDICTIVE DIAGNOSTICS filed Apr. 29, 2015 (Nahill et al.);
  • U.S. patent application Ser. No. 14/702,110 for SYSTEM AND METHOD FOR REGULATING BARCODE DATA INJECTION INTO A RUNNING APPLICATION ON A SMART DEVICE filed May 1, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/702,979 for TRACKING BATTERY CONDITIONS filed May 4, 2015 (Young et al.);
  • U.S. patent application Ser. No. 14/704,050 for INTERMEDIATE LINEAR POSITIONING filed May 5, 2015 (Charpentier et al.);
  • U.S. patent application Ser. No. 14/705,012 for HANDS-FREE HUMAN MACHINE INTERFACE RESPONSIVE TO A DRIVER OF A VEHICLE filed May 6, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/705,407 for METHOD AND SYSTEM TO PROTECT SOFTWARE-BASED NETWORK-CONNECTED DEVICES FROM ADVANCED PERSISTENT THREAT filed May 6, 2015 (Hussey et al.);
  • U.S. patent application Ser. No. 14/707,037 for SYSTEM AND METHOD FOR DISPLAY OF INFORMATION USING A VEHICLE-MOUNT COMPUTER filed May 8, 2015 (Chamberlin);
  • U.S. patent application Ser. No. 14/707,123 for APPLICATION INDEPENDENT DEX/UCS INTERFACE filed May 8, 2015 (Pape);
  • U.S. patent application Ser. No. 14/707,492 for METHOD AND APPARATUS FOR READING OPTICAL INDICIA USING A PLURALITY OF DATA SOURCES filed May 8, 2015 (Smith et al.);
  • U.S. patent application Ser. No. 14/710,666 for PRE-PAID USAGE SYSTEM FOR ENCODED INFORMATION READING TERMINALS filed May 13, 2015 (Smith);
  • U.S. patent application Ser. No. 29/526,918 for CHARGING BASE filed May 14, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/715,672 for AUGMENTED REALITY ENABLED HAZARD DISPLAY filed May 19, 2015 (Venkatesha et al.);
  • U.S. patent application Ser. No. 14/715,916 for EVALUATING IMAGE VALUES filed May 19, 2015 (Ackley);
  • U.S. patent application Ser. No. 14/722,608 for INTERACTIVE USER INTERFACE FOR CAPTURING A DOCUMENT IN AN IMAGE SIGNAL filed May 27, 2015 (Showering et al.);
  • U.S. patent application Ser. No. 29/528,165 for IN-COUNTER BARCODE SCANNER filed May 27, 2015 (Oberpriller et al.);
  • U.S. patent application Ser. No. 14/724,134 for ELECTRONIC DEVICE WITH WIRELESS PATH SELECTION CAPABILITY filed May 28, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 14/724,849 for METHOD OF PROGRAMMING THE DEFAULT CABLE INTERFACE SOFTWARE IN AN INDICIA READING DEVICE filed May 29, 2015 (Barten);
  • U.S. patent application Ser. No. 14/724,908 for IMAGING APPARATUS HAVING IMAGING ASSEMBLY filed May 29, 2015 (Barber et al.);
  • U.S. patent application Ser. No. 14/725,352 for APPARATUS AND METHODS FOR MONITORING ONE OR MORE PORTABLE DATA TERMINALS (Caballero et al.);
  • U.S. patent application Ser. No. 29/528,590 for ELECTRONIC DEVICE filed May 29, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 29/528,890 for MOBILE COMPUTER HOUSING filed Jun. 2, 2015 (Fitch et al.);
  • U.S. patent application Ser. No. 14/728,397 for DEVICE MANAGEMENT USING VIRTUAL INTERFACES CROSS-REFERENCE TO RELATED APPLICATIONS filed Jun. 2, 2015 (Caballero);
  • U.S. patent application Ser. No. 14/732,870 for DATA COLLECTION MODULE AND SYSTEM filed Jun. 8, 2015 (Powilleit);
  • U.S. patent application Ser. No. 29/529,441 for INDICIA READING DEVICE filed Jun. 8, 2015 (Zhou et al.);
  • U.S. patent application Ser. No. 14/735,717 for INDICIA-READING SYSTEMS HAVING AN INTERFACE WITH A USER'S NERVOUS SYSTEM filed Jun. 10, 2015 (Todeschini);
  • U.S. patent application Ser. No. 14/738,038 for METHOD OF AND SYSTEM FOR DETECTING OBJECT WEIGHING INTERFERENCES filed Jun. 12, 2015 (Amundsen et al.);
  • U.S. patent application Ser. No. 14/740,320 for TACTILE SWITCH FOR A MOBILE ELECTRONIC DEVICE filed Jun. 16, 2015 (Bandringa);
  • U.S. patent application Ser. No. 14/740,373 for CALIBRATING A VOLUME DIMENSIONER filed Jun. 16, 2015 (Ackley et al.);
  • U.S. patent application Ser. No. 14/742,818 for INDICIA READING SYSTEM EMPLOYING DIGITAL GAIN CONTROL filed Jun. 18, 2015 (Xian et al.);
  • U.S. patent application Ser. No. 14/743,257 for WIRELESS MESH POINT PORTABLE DATA TERMINAL filed Jun. 18, 2015 (Wang et al.);
  • U.S. patent application Ser. No. 29/530,600 for CYCLONE filed Jun. 18, 2015 (Vargo et al);
  • U.S. patent application Ser. No. 14/744,633 for IMAGING APPARATUS COMPRISING IMAGE SENSOR ARRAY HAVING SHARED GLOBAL SHUTTER CIRCUITRY filed Jun. 19, 2015 (Wang);
  • U.S. patent application Ser. No. 14/744,836 for CLOUD-BASED SYSTEM FOR READING OF DECODABLE INDICIA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/745,006 for SELECTIVE OUTPUT OF DECODED MESSAGE DATA filed Jun. 19, 2015 (Todeschini et al.);
  • U.S. patent application Ser. No. 14/747,197 for OPTICAL PATTERN PROJECTOR filed Jun. 23, 2015 (Thuries et al.);
  • U.S. patent application Ser. No. 14/747,490 for DUAL-PROJECTOR THREE-DIMENSIONAL SCANNER filed Jun. 23, 2015 (Jovanovski et al.); and
  • U.S. patent application Ser. No. 14/748,446 for CORDLESS INDICIA READER WITH A MULTIFUNCTION COIL FOR WIRELESS CHARGING AND EAS DEACTIVATION, filed Jun. 24, 2015 (Xie et al.).


* * *

In the specification and/or figures, typical embodiments of the invention have been disclosed. The present invention is not limited to such exemplary embodiments. The use of the term “and/or” includes any and all combinations of one or more of the associated listed items. The figures are schematic representations and so are not necessarily drawn to scale. Unless otherwise noted, specific terms have been used in a generic and descriptive sense and not for purposes of limitation.

Claims
  • 1. An electroencephalograph (EEG) point of sale system, comprising: an EEG device that is configured to detect a plurality of electrical signals representing brain waves; a database of brain wave profiles representing a plurality of items to be identified, wherein the brain wave profiles comprise electrical signals corresponding to the brain waves recorded during a training of the EEG point of sale system; and a live signal analyzer that compares electrical signals from the EEG device with stored brain wave profiles in the database to identify entries in the database representing items that match the electrical signals from the EEG device, where items whose stored brain wave profiles match the electrical signals are considered identified items.
  • 2. The system according to claim 1, further comprising a point of sale terminal coupled to the live signal analyzer that logs and tallies items for a transaction; and where the live signal analyzer provides item identification and price data to the point of sale terminal for identified items.
  • 3. The system according to claim 2, where the live signal analyzer comprises a programmed processor coupled to the EEG device, the point of sale terminal, and the database.
  • 4. The system according to claim 1, where the EEG device is configured as headgear that is to be worn by a user.
  • 5. The system according to claim 1, further comprising a brain response profiler that generates a brain response profile for an item from the EEG device and generates a database entry for the item.
  • 6. The system according to claim 5, where the brain response profiler comprises a programmed processor coupled to the EEG device and the database.
  • 7. The system according to claim 1, where the live signal analyzer identifies entries in the database representing items that match the electrical signals from the EEG device by cross correlating the plurality of electrical signals from the EEG device with stored database entries representing a plurality of items.
  • 8. The system according to claim 1, where the electrical signals from the EEG device are converted to frequency domain signals and where the brain response profile is a frequency domain profile.
  • 9. A method, comprising: receiving, at a programmed processor, electroencephalograph (EEG) data generated by an EEG device when a user is exposed to an item that is to be identified; comparing, via the programmed processor, the EEG data to a plurality of brain wave profiles stored in a database, the stored brain wave profiles corresponding to identifiable items, wherein the brain wave profiles comprise electrical signals corresponding to the brain waves recorded during a training using the EEG device; ascertaining, based upon the results of the comparison, that a match exists between the EEG data and a stored brain wave profile for a particular identifiable item; retrieving, via the programmed processor, information from the database associated with the particular identifiable item; and passing the information retrieved from the database to a point of sale terminal.
  • 10. The method according to claim 9, where the information retrieved from the database comprises item identification and price data.
  • 11. The method according to claim 9, where the comparing comprises calculating a cross correlation between the EEG data and a plurality of the stored brain wave profiles.
  • 12. The method according to claim 9, further comprising processing the EEG data using a fast Fourier transform.
  • 13. The method according to claim 9, where the EEG data and the brain wave profiles are represented in the frequency domain.
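
By way of non-limiting illustration of the matching recited in claims 7, 8, and 11-13, the Python sketch below shows one way a live signal analyzer might convert an EEG epoch to the frequency domain and score it against stored brain wave profiles. The function name, the profile dictionary layout, and the decision threshold are illustrative assumptions rather than elements of the claims, and the zero-lag normalized correlation used here is a simplified stand-in for a full cross correlation.

    import numpy as np

    def match_item(eeg_epoch, profiles, threshold=0.8):
        # Compare one EEG epoch against stored frequency-domain profiles.
        # profiles maps item_id -> {"spectrum": 1-D array saved during training,
        #                           "name": str, "price": float}.
        # Returns (item_id, score) for the best match, or (None, score) if no
        # stored profile exceeds the decision threshold.

        # Convert the live signal to the frequency domain (claims 8, 12 and 13).
        windowed = np.asarray(eeg_epoch, dtype=float) * np.hanning(len(eeg_epoch))
        spectrum = np.abs(np.fft.rfft(windowed))
        spectrum /= np.linalg.norm(spectrum) + 1e-12   # unit-normalize

        best_id, best_score = None, -1.0
        for item_id, entry in profiles.items():
            ref = np.asarray(entry["spectrum"], dtype=float)
            ref = ref / (np.linalg.norm(ref) + 1e-12)
            n = min(len(spectrum), len(ref))
            # Zero-lag normalized correlation of aligned, unit-norm spectra
            # (a simplified form of the cross correlation in claims 7 and 11).
            score = float(np.dot(spectrum[:n], ref[:n]))
            if score > best_score:
                best_id, best_score = item_id, score

        return (best_id, best_score) if best_score >= threshold else (None, best_score)

In use, the name and price of an identified item would be passed to the point of sale terminal (claims 2 and 10), for example:

    item_id, score = match_item(live_samples, profile_db)
    if item_id is not None:
        pos_terminal.add_line_item(profile_db[item_id]["name"],
                                   profile_db[item_id]["price"])

Here profile_db, live_samples, pos_terminal, and add_line_item are assumed names used for illustration only, not features of any particular point of sale product.
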
US Referenced Citations (488)
Number Name Date Kind
6832725 Gardiner et al. Dec 2004 B2
7128266 Zhu et al. Oct 2006 B2
7159783 Walczyk et al. Jan 2007 B2
7413127 Ehrhart et al. Aug 2008 B2
7726575 Wang et al. Jun 2010 B2
8294969 Plesko Oct 2012 B2
8317105 Kotlarsky et al. Nov 2012 B2
8322622 Liu et al. Dec 2012 B2
8366005 Kotlarsky et al. Feb 2013 B2
8371507 Haggerty et al. Feb 2013 B2
8376233 Van Horn et al. Feb 2013 B2
8381979 Franz Feb 2013 B2
8390909 Plesko Mar 2013 B2
8408464 Zhu et al. Apr 2013 B2
8408468 Horn et al. Apr 2013 B2
8408469 Good Apr 2013 B2
8424768 Rueblinger et al. Apr 2013 B2
8448863 Xian et al. May 2013 B2
8457013 Essinger et al. Jun 2013 B2
8459557 Havens et al. Jun 2013 B2
8469272 Kearney Jun 2013 B2
8474712 Kearney et al. Jul 2013 B2
8479992 Kotlarsky et al. Jul 2013 B2
8490877 Kearney Jul 2013 B2
8517271 Kotlarsky et al. Aug 2013 B2
8523076 Good Sep 2013 B2
8528818 Ehrhart et al. Sep 2013 B2
8544737 Gomez et al. Oct 2013 B2
8548420 Grunow et al. Oct 2013 B2
8550335 Samek et al. Oct 2013 B2
8550354 Gannon et al. Oct 2013 B2
8550357 Kearney Oct 2013 B2
8556174 Kosecki et al. Oct 2013 B2
8556176 Van Horn et al. Oct 2013 B2
8556177 Hussey et al. Oct 2013 B2
8559767 Barber et al. Oct 2013 B2
8561895 Gomez et al. Oct 2013 B2
8561903 Sauerwein Oct 2013 B2
8561905 Edmonds et al. Oct 2013 B2
8565107 Pease et al. Oct 2013 B2
8571307 Li et al. Oct 2013 B2
8579200 Samek et al. Nov 2013 B2
8583924 Caballero et al. Nov 2013 B2
8584945 Wang et al. Nov 2013 B2
8587595 Wang Nov 2013 B2
8587697 Hussey et al. Nov 2013 B2
8588869 Sauerwein et al. Nov 2013 B2
8590789 Nahill et al. Nov 2013 B2
8596539 Havens et al. Dec 2013 B2
8596542 Havens et al. Dec 2013 B2
8596543 Havens et al. Dec 2013 B2
8599271 Havens et al. Dec 2013 B2
8599957 Peake et al. Dec 2013 B2
8600158 Li et al. Dec 2013 B2
8600167 Showering Dec 2013 B2
8602309 Longacre et al. Dec 2013 B2
8608053 Meier et al. Dec 2013 B2
8608071 Liu et al. Dec 2013 B2
8611309 Wang et al. Dec 2013 B2
8615487 Gomez et al. Dec 2013 B2
8621123 Caballero Dec 2013 B2
8622303 Meier et al. Jan 2014 B2
8628013 Ding Jan 2014 B2
8628015 Wang et al. Jan 2014 B2
8628016 Winegar Jan 2014 B2
8629926 Wang Jan 2014 B2
8630491 Longacre et al. Jan 2014 B2
8635309 Berthiaume et al. Jan 2014 B2
8636200 Kearney Jan 2014 B2
8636212 Nahill et al. Jan 2014 B2
8636215 Ding et al. Jan 2014 B2
8636224 Wang Jan 2014 B2
8638806 Wang et al. Jan 2014 B2
8640958 Lu et al. Feb 2014 B2
8640960 Wang et al. Feb 2014 B2
8643717 Li et al. Feb 2014 B2
8646692 Meier et al. Feb 2014 B2
8646694 Wang et al. Feb 2014 B2
8657200 Ren et al. Feb 2014 B2
8659397 Vargo et al. Feb 2014 B2
8668149 Good Mar 2014 B2
8678285 Kearney Mar 2014 B2
8678286 Smith et al. Mar 2014 B2
8682077 Longacre Mar 2014 B1
D702237 Oberpriller et al. Apr 2014 S
8687282 Feng et al. Apr 2014 B2
8692927 Pease et al. Apr 2014 B2
8695880 Bremer et al. Apr 2014 B2
8698949 Grunow et al. Apr 2014 B2
8702000 Barber et al. Apr 2014 B2
8717494 Gannon May 2014 B2
8720783 Biss et al. May 2014 B2
8723804 Fletcher et al. May 2014 B2
8723904 Marty et al. May 2014 B2
8727223 Wang May 2014 B2
8740082 Wilz Jun 2014 B2
8740085 Furlong et al. Jun 2014 B2
8746563 Hennick et al. Jun 2014 B2
8750445 Peake et al. Jun 2014 B2
8752766 Xian et al. Jun 2014 B2
8756059 Braho et al. Jun 2014 B2
8757495 Qu et al. Jun 2014 B2
8760563 Koziol et al. Jun 2014 B2
8763909 Reed et al. Jul 2014 B2
8777108 Coyle Jul 2014 B2
8777109 Oberpriller et al. Jul 2014 B2
8779898 Havens et al. Jul 2014 B2
8781520 Payne et al. Jul 2014 B2
8783573 Havens et al. Jul 2014 B2
8789757 Barten Jul 2014 B2
8789758 Hawley et al. Jul 2014 B2
8789759 Xian et al. Jul 2014 B2
8794520 Wang et al. Aug 2014 B2
8794522 Ehrhart Aug 2014 B2
8794525 Amundsen et al. Aug 2014 B2
8794526 Wang et al. Aug 2014 B2
8798367 Ellis Aug 2014 B2
8807431 Wang et al. Aug 2014 B2
8807432 Van Horn et al. Aug 2014 B2
8820630 Qu et al. Sep 2014 B2
8822848 Meagher Sep 2014 B2
8824692 Sheerin et al. Sep 2014 B2
8824696 Braho Sep 2014 B2
8842849 Wahl et al. Sep 2014 B2
8844822 Kotlarsky et al. Sep 2014 B2
8844823 Fritz et al. Sep 2014 B2
8849019 Li et al. Sep 2014 B2
D716285 Chaney et al. Oct 2014 S
8851383 Yeakley et al. Oct 2014 B2
8854633 Laffargue Oct 2014 B2
8866963 Grunow et al. Oct 2014 B2
8868421 Braho et al. Oct 2014 B2
8868519 Maloy et al. Oct 2014 B2
8868802 Barten Oct 2014 B2
8868803 Caballero Oct 2014 B2
8870074 Gannon Oct 2014 B1
8879639 Sauerwein Nov 2014 B2
8880426 Smith Nov 2014 B2
8881983 Havens et al. Nov 2014 B2
8881987 Wang Nov 2014 B2
8903172 Smith Dec 2014 B2
8908995 Benos et al. Dec 2014 B2
8910870 Li et al. Dec 2014 B2
8910875 Ren et al. Dec 2014 B2
8914290 Hendrickson et al. Dec 2014 B2
8914788 Pettinelli et al. Dec 2014 B2
8915439 Feng et al. Dec 2014 B2
8915444 Havens et al. Dec 2014 B2
8916789 Woodburn Dec 2014 B2
8918250 Hollifield Dec 2014 B2
8918564 Caballero Dec 2014 B2
8925818 Kosecki et al. Jan 2015 B2
8939374 Jovanovski et al. Jan 2015 B2
8942480 Ellis Jan 2015 B2
8944313 Williams et al. Feb 2015 B2
8944327 Meier et al. Feb 2015 B2
8944332 Harding et al. Feb 2015 B2
8950678 Germaine et al. Feb 2015 B2
D723560 Zhou et al. Mar 2015 S
8967468 Gomez et al. Mar 2015 B2
8971346 Sevier Mar 2015 B2
8976030 Cunningham et al. Mar 2015 B2
8976368 Akel et al. Mar 2015 B2
8978981 Guan Mar 2015 B2
8978983 Bremer et al. Mar 2015 B2
8978984 Hennick et al. Mar 2015 B2
8985456 Zhu et al. Mar 2015 B2
8985457 Soule et al. Mar 2015 B2
8985459 Kearney et al. Mar 2015 B2
8985461 Gelay et al. Mar 2015 B2
8988578 Showering Mar 2015 B2
8988590 Gillet et al. Mar 2015 B2
8991704 Hopper et al. Mar 2015 B2
8996194 Davis et al. Mar 2015 B2
8996384 Funyak et al. Mar 2015 B2
8998091 Edmonds et al. Apr 2015 B2
9002641 Showering Apr 2015 B2
9007368 Laffargue et al. Apr 2015 B2
9010641 Qu et al. Apr 2015 B2
9015513 Murawski et al. Apr 2015 B2
9016576 Brady et al. Apr 2015 B2
D730357 Fitch et al. May 2015 S
9022288 Nahill et al. May 2015 B2
9030964 Essinger et al. May 2015 B2
9033240 Smith et al. May 2015 B2
9033242 Gillet et al. May 2015 B2
9036054 Koziol et al. May 2015 B2
9037344 Chamberlin May 2015 B2
9038911 Xian et al. May 2015 B2
9038915 Smith May 2015 B2
D730901 Oberpriller et al. Jun 2015 S
D730902 Fitch et al. Jun 2015 S
D733112 Chaney et al. Jun 2015 S
9047098 Barten Jun 2015 B2
9047359 Caballero et al. Jun 2015 B2
9047420 Caballero Jun 2015 B2
9047525 Barber Jun 2015 B2
9047531 Showering et al. Jun 2015 B2
9049640 Wang et al. Jun 2015 B2
9053055 Caballero Jun 2015 B2
9053378 Hou et al. Jun 2015 B1
9053380 Xian et al. Jun 2015 B2
9057641 Amundsen et al. Jun 2015 B2
9058526 Powilleit Jun 2015 B2
9064165 Havens et al. Jun 2015 B2
9064167 Xian et al. Jun 2015 B2
9064168 Todeschini et al. Jun 2015 B2
9064254 Todeschini et al. Jun 2015 B2
9066032 Wang Jun 2015 B2
9070032 Corcoran Jun 2015 B2
D734339 Zhou et al. Jul 2015 S
D734751 Oberpriller et al. Jul 2015 S
9082023 Feng et al. Jul 2015 B2
9224022 Ackley et al. Dec 2015 B2
9224027 Van Horn et al. Dec 2015 B2
D747321 London et al. Jan 2016 S
9230140 Ackley Jan 2016 B1
9443123 Hejl Jan 2016 B2
9250712 Todeschini Feb 2016 B1
9258033 Showering Feb 2016 B2
9262633 Todeschini et al. Feb 2016 B1
9310609 Rueblinger et al. Apr 2016 B2
D757009 Oberpriller et al. May 2016 S
9342724 McCloskey May 2016 B2
9375945 Bowles Jun 2016 B1
D760719 Zhou et al. Jul 2016 S
9390596 Todeschini Jul 2016 B1
D762604 Fitch et al. Aug 2016 S
D762647 Fitch et al. Aug 2016 S
9412242 Van Horn et al. Aug 2016 B2
D766244 Zhou et al. Sep 2016 S
9443222 Singel et al. Sep 2016 B2
9478113 Xie et al. Oct 2016 B2
9507974 Todeschini Nov 2016 B1
20060258408 Tuomela et al. Nov 2006 A1
20070010756 Viertio-Oja Jan 2007 A1
20070063048 Havens et al. Mar 2007 A1
20070123350 Soderlund May 2007 A1
20070124027 Betziza et al. May 2007 A1
20070168461 Moore Jul 2007 A1
20080228365 White et al. Sep 2008 A1
20090040054 Wang et al. Feb 2009 A1
20090134221 Zhu et al. May 2009 A1
20090227965 Wijesiriwardana Sep 2009 A1
20100094502 Ito Apr 2010 A1
20100145218 Adachi et al. Jun 2010 A1
20100177076 Essinger et al. Jul 2010 A1
20100177080 Essinger et al. Jul 2010 A1
20100177707 Essinger et al. Jul 2010 A1
20100177749 Essinger et al. Jul 2010 A1
20100324936 Vishnubhatla Dec 2010 A1
20110169999 Grunow et al. Jul 2011 A1
20110187640 Jacobsen et al. Aug 2011 A1
20110202554 Powilleit et al. Aug 2011 A1
20110213511 Visconti et al. Sep 2011 A1
20110247027 Davis Oct 2011 A1
20120046531 Hua Feb 2012 A1
20120108995 Pradeep et al. May 2012 A1
20120111946 Golant May 2012 A1
20120168512 Kotlarsky et al. Jul 2012 A1
20120172744 Kato Jul 2012 A1
20120193423 Samek Aug 2012 A1
20120203647 Smith Aug 2012 A1
20120223141 Good et al. Sep 2012 A1
20130043312 Van Horn Feb 2013 A1
20130075168 Amundsen et al. Mar 2013 A1
20130130799 Van Hulle et al. May 2013 A1
20130175341 Kearney et al. Jul 2013 A1
20130175343 Good Jul 2013 A1
20130204153 Buzhardt Aug 2013 A1
20130226408 Fung et al. Aug 2013 A1
20130239187 Leddy Sep 2013 A1
20130257744 Daghigh et al. Oct 2013 A1
20130257759 Daghigh Oct 2013 A1
20130270346 Xian et al. Oct 2013 A1
20130287258 Kearney Oct 2013 A1
20130292475 Kotlarsky et al. Nov 2013 A1
20130292477 Hennick et al. Nov 2013 A1
20130293539 Hunt et al. Nov 2013 A1
20130293540 Laffargue et al. Nov 2013 A1
20130296731 Kidmose et al. Nov 2013 A1
20130306728 Thuries et al. Nov 2013 A1
20130306731 Pedraro Nov 2013 A1
20130307964 Bremer et al. Nov 2013 A1
20130308625 Park et al. Nov 2013 A1
20130313324 Koziol et al. Nov 2013 A1
20130313325 Wilz et al. Nov 2013 A1
20130342717 Havens et al. Dec 2013 A1
20140001267 Giordano et al. Jan 2014 A1
20140002828 Laffargue et al. Jan 2014 A1
20140008439 Wang Jan 2014 A1
20140025584 Liu et al. Jan 2014 A1
20140100813 Showering Jan 2014 A1
20140034734 Sauerwein Feb 2014 A1
20140036848 Pease et al. Feb 2014 A1
20140039693 Havens et al. Feb 2014 A1
20140042814 Kather et al. Feb 2014 A1
20140049120 Kohtz et al. Feb 2014 A1
20140049635 Laffargue et al. Feb 2014 A1
20140061306 Wu et al. Mar 2014 A1
20140063289 Hussey et al. Mar 2014 A1
20140066136 Sauerwein et al. Mar 2014 A1
20140067692 Ye et al. Mar 2014 A1
20140070005 Nahill et al. Mar 2014 A1
20140071840 Venancio Mar 2014 A1
20140074746 Wang Mar 2014 A1
20140076974 Havens et al. Mar 2014 A1
20140078341 Havens et al. Mar 2014 A1
20140078342 Li et al. Mar 2014 A1
20140078345 Showering Mar 2014 A1
20140098792 Wang et al. Apr 2014 A1
20140100774 Showering Apr 2014 A1
20140103115 Meier et al. Apr 2014 A1
20140104413 McCloskey et al. Apr 2014 A1
20140104414 McCloskey et al. Apr 2014 A1
20140104416 Giordano et al. Apr 2014 A1
20140104451 Todeschini et al. Apr 2014 A1
20140106594 Skvoretz Apr 2014 A1
20140106725 Sauerwein Apr 2014 A1
20140108010 Maltseff et al. Apr 2014 A1
20140108402 Gomez et al. Apr 2014 A1
20140108682 Caballero Apr 2014 A1
20140110485 Toa et al. Apr 2014 A1
20140114530 Fitch et al. Apr 2014 A1
20140124577 Wang et al. May 2014 A1
20140124579 Ding May 2014 A1
20140125842 Winegar May 2014 A1
20140125853 Wang May 2014 A1
20140125999 Longacre et al. May 2014 A1
20140129378 Richardson May 2014 A1
20140131438 Kearney May 2014 A1
20140131441 Nahill et al. May 2014 A1
20140131443 Smith May 2014 A1
20140131444 Wang May 2014 A1
20140131445 Ding et al. May 2014 A1
20140131448 Xian et al. May 2014 A1
20140133379 Wang et al. May 2014 A1
20140136208 Maltseff et al. May 2014 A1
20140140585 Wang May 2014 A1
20140151453 Meier et al. Jun 2014 A1
20140152882 Samek et al. Jun 2014 A1
20140158770 Sevier et al. Jun 2014 A1
20140159869 Zumsteg et al. Jun 2014 A1
20140166755 Liu et al. Jun 2014 A1
20140166757 Smith Jun 2014 A1
20140166759 Liu et al. Jun 2014 A1
20140168787 Wang et al. Jun 2014 A1
20140175165 Havens et al. Jun 2014 A1
20140175172 Jovanovski et al. Jun 2014 A1
20140191644 Chaney Jul 2014 A1
20140191913 Ge et al. Jul 2014 A1
20140197238 Lui et al. Jul 2014 A1
20140197239 Havens et al. Jul 2014 A1
20140197304 Feng et al. Jul 2014 A1
20140203087 Smith et al. Jul 2014 A1
20140204268 Grunow et al. Jul 2014 A1
20140206323 Scorcioni Jul 2014 A1
20140214631 Hansen Jul 2014 A1
20140217166 Berthiaume et al. Aug 2014 A1
20140217180 Liu Aug 2014 A1
20140231500 Ehrhart et al. Aug 2014 A1
20140232930 Anderson Aug 2014 A1
20140247315 Marty et al. Sep 2014 A1
20140263493 Amurgis et al. Sep 2014 A1
20140263645 Smith et al. Sep 2014 A1
20140270196 Braho et al. Sep 2014 A1
20140270229 Braho Sep 2014 A1
20140278387 DiGregorio Sep 2014 A1
20140282210 Bianconi Sep 2014 A1
20140284384 Lu et al. Sep 2014 A1
20140285404 Takano et al. Sep 2014 A1
20140288933 Braho et al. Sep 2014 A1
20140297058 Barker et al. Oct 2014 A1
20140299665 Barber et al. Oct 2014 A1
20140312121 Lu et al. Oct 2014 A1
20140319220 Coyle Oct 2014 A1
20140319221 Oberpriller et al. Oct 2014 A1
20140326787 Barten Nov 2014 A1
20140332590 Wang et al. Nov 2014 A1
20140334083 Bailey Nov 2014 A1
20140344943 Todeschini et al. Nov 2014 A1
20140346233 Liu et al. Nov 2014 A1
20140351317 Smith et al. Nov 2014 A1
20140353373 Van Horn et al. Dec 2014 A1
20140361073 Qu et al. Dec 2014 A1
20140361082 Xian et al. Dec 2014 A1
20140362184 Jovanovski et al. Dec 2014 A1
20140363015 Braho Dec 2014 A1
20140369511 Sheerin et al. Dec 2014 A1
20140374483 Lu Dec 2014 A1
20140374485 Xian et al. Dec 2014 A1
20150001301 Ouyang Jan 2015 A1
20150001304 Todeschini Jan 2015 A1
20150003673 Fletcher Jan 2015 A1
20150009338 Laffargue et al. Jan 2015 A1
20150009610 London et al. Jan 2015 A1
20150012426 Purves Jan 2015 A1
20150014416 Kotlarsky et al. Jan 2015 A1
20150021397 Rueblinger et al. Jan 2015 A1
20150028102 Ren et al. Jan 2015 A1
20150028103 Jiang Jan 2015 A1
20150028104 Ma et al. Jan 2015 A1
20150029002 Yeakley et al. Jan 2015 A1
20150032709 Maloy et al. Jan 2015 A1
20150039309 Braho et al. Feb 2015 A1
20150040378 Saber et al. Feb 2015 A1
20150048168 Fritz et al. Feb 2015 A1
20150049347 Laffargue et al. Feb 2015 A1
20150051992 Smith Feb 2015 A1
20150053766 Havens et al. Feb 2015 A1
20150053768 Wang et al. Feb 2015 A1
20150053769 Thuries et al. Feb 2015 A1
20150062366 Liu et al. Mar 2015 A1
20150063215 Wang Mar 2015 A1
20150063676 Lloyd et al. Mar 2015 A1
20150069130 Gannon Mar 2015 A1
20150071819 Todeschini Mar 2015 A1
20150073907 Purves Mar 2015 A1
20150083800 Li et al. Mar 2015 A1
20150086114 Todeschini Mar 2015 A1
20150088522 Hendrickson et al. Mar 2015 A1
20150096872 Woodburn Apr 2015 A1
20150099557 Pettinelli et al. Apr 2015 A1
20150100196 Hollifield Apr 2015 A1
20150102109 Huck Apr 2015 A1
20150115035 Meier et al. Apr 2015 A1
20150127791 Kosecki et al. May 2015 A1
20150128116 Chen et al. May 2015 A1
20150129659 Feng et al. May 2015 A1
20150133047 Smith et al. May 2015 A1
20150134470 Hejl et al. May 2015 A1
20150136851 Harding et al. May 2015 A1
20150136854 Lu et al. May 2015 A1
20150141529 Hargrove May 2015 A1
20150142492 Kumar May 2015 A1
20150144692 Hejl May 2015 A1
20150144698 Teng et al. May 2015 A1
20150144701 Xian et al. May 2015 A1
20150149946 Benos et al. May 2015 A1
20150161429 Xian Jun 2015 A1
20150169925 Chang et al. Jun 2015 A1
20150169929 Williams et al. Jun 2015 A1
20150186703 Chen et al. Jul 2015 A1
20150193644 Kearney et al. Jul 2015 A1
20150193645 Colavito et al. Jul 2015 A1
20150199957 Funyak et al. Jul 2015 A1
20150204671 Showering Jul 2015 A1
20150210199 Payne Jul 2015 A1
20150220753 Zhu et al. Aug 2015 A1
20150248651 Akutagawa Sep 2015 A1
20150254485 Feng et al. Sep 2015 A1
20150257673 Lawrence et al. Sep 2015 A1
20150272465 Ishii Oct 2015 A1
20150282760 Badower et al. Oct 2015 A1
20150313496 Connor Nov 2015 A1
20150313497 Chang et al. Nov 2015 A1
20150313539 Connor Nov 2015 A1
20150327012 Bian et al. Nov 2015 A1
20150374255 Vasapollo Dec 2015 A1
20160004820 Moore Jan 2016 A1
20160014251 Hejl Jan 2016 A1
20160040982 Li et al. Feb 2016 A1
20160042241 Todeschini Feb 2016 A1
20160057230 Todeschini et al. Feb 2016 A1
20160103487 Crawford et al. Apr 2016 A1
20160109219 Ackley et al. Apr 2016 A1
20160109220 Laffargue Apr 2016 A1
20160109224 Thuries et al. Apr 2016 A1
20160112631 Ackley et al. Apr 2016 A1
20160112643 Laffargue et al. Apr 2016 A1
20160124516 Schoon et al. May 2016 A1
20160125217 Todeschini May 2016 A1
20160125342 Miller et al. May 2016 A1
20160132707 Lindbo et al. May 2016 A1
20160133253 Braho et al. May 2016 A1
20160171720 Todeschini Jun 2016 A1
20160178479 Goldsmith Jun 2016 A1
20160180678 Ackley et al. Jun 2016 A1
20160188944 Wilz et al. Jun 2016 A1
20160189087 Morton et al. Jun 2016 A1
20160125873 Braho et al. Jul 2016 A1
20160227912 Oberpriller et al. Aug 2016 A1
20160232891 Pecorari Aug 2016 A1
20160292477 Bidwell Oct 2016 A1
20160294779 Yeakley et al. Oct 2016 A1
20160306769 Kohtz et al. Oct 2016 A1
20160314276 Sewell et al. Oct 2016 A1
20160314294 Kubler et al. Oct 2016 A1
Foreign Referenced Citations (4)
Number Date Country
2013163789 Nov 2013 WO
2013173985 Nov 2013 WO
2014019130 Feb 2014 WO
2014110495 Jul 2014 WO
Non-Patent Literature Citations (30)
Entry
Combined Search and Examination Report in counterpart GB Application No. 1615457.7 dated Feb. 21, 2017, pp. 1-7.
Kapoor et al., “Combining brain computer interfaces with vision for object categorization”, IEEE Conference on Computer Vision and Pattern Recognition, 2008, accessible online at http://ieeexplore.ieee.org/document/4587618/, pp. 1-8 [Cited in GB Search Report].
Behroozi et al., “EEG phase patterns reflect the representation of semantic categories of objects”, Med. Biol. Eng. Comput. (2016) 54:205, Sep. 23, 2015, accessible online at http://rd.springer.com/article/10.1007%2Fs11517-015-1391-7, pp. 1-28 [Cited in GB Search Report].
U.S. Appl. No. 13/367,978, filed Feb. 7, 2012, (Feng et al.); now abandoned.
U.S. Appl. No. 14/277,337 for Multipurpose Optical Reader, filed May 14, 2014 (Jovanovski et al.); 59 pages; now abandoned.
U.S. Appl. No. 14/446,391 for Multifunction Point of Sale Apparatus With Optical Signature Capture filed Jul. 30, 2014 (Good et al.); 37 pages; now abandoned.
U.S. Appl. No. 29/516,892 for Tablet Computer filed Feb. 6, 2015 (Bidwell et al.); 13 pages.
U.S. Appl. No. 29/523,098 for Handle for a Tablet Computer filed Apr. 7, 2015 (Bidwell et al.); 17 pages.
U.S. Appl. No. 29/528,890 for Mobile Computer Housing filed Jun. 2, 2015 (Fitch et al.); 61 pages.
U.S. Appl. No. 29/526,918 for Charging Base filed May 14, 2015 (Fitch et al.); 10 pages.
U.S. Appl. No. 14/715,916 for Evaluating Image Values filed May 19, 2015 (Ackley); 60 pages.
U.S. Appl. No. 29/525,068 for Tablet Computer With Removable Scanning Device filed Apr. 27, 2015 (Schulte et al.); 19 pages.
U.S. Appl. No. 29/468,118 for an Electronic Device Case, filed Sep. 26, 2013 (Oberpriller et al.); 44 pages.
U.S. Appl. No. 29/530,600 for Cyclone filed Jun. 18, 2015 (Vargo et al.); 16 pages.
U.S. Appl. No. 14/707,123 for Application Independent DEX/UCS Interface filed May 8, 2015 (Pape); 47 pages.
U.S. Appl. No. 14/283,282 for Terminal Having Illumination and Focus Control filed May 21, 2014 (Liu et al.); 31 pages; now abandoned.
U.S. Appl. No. 14/705,407 for Method and System to Protect Software-Based Network-Connected Devices From Advanced Persistent Threat filed May 6, 2015 (Hussey et al.); 42 pages.
U.S. Appl. No. 14/704,050 for Intermediate Linear Positioning filed May 5, 2015 (Charpentier et al.); 60 pages.
U.S. Appl. No. 14/705,012 for Hands-Free Human Machine Interface Responsive to a Driver of a Vehicle filed May 6, 2015 (Fitch et al.); 44 pages.
U.S. Appl. No. 14/715,672 for Augmented Reality Enabled Hazard Display filed May 19, 2015 (Venkatesha et al.); 35 pages.
U.S. Appl. No. 14/735,717 for Indicia-Reading Systems Having an Interface With a User's Nervous System filed Jun. 10, 2015 (Todeschini); 39 pages.
U.S. Appl. No. 14/702,110 for System and Method for Regulating Barcode Data Injection Into a Running Application on a Smart Device filed May 1, 2015 (Todeschini et al.); 38 pages.
U.S. Appl. No. 14/747,197 for Optical Pattern Projector filed Jun. 23, 2015 (Thuries et al.); 33 pages.
U.S. Appl. No. 14/702,979 for Tracking Battery Conditions filed May 4, 2015 (Young et al.); 70 pages.
U.S. Appl. No. 29/529,441 for Indicia Reading Device filed Jun. 8, 2015 (Zhou et al.); 14 pages.
U.S. Appl. No. 14/747,490 for Dual-Projector Three-Dimensional Scanner filed Jun. 23, 2015 (Jovanovski et al.); 40 pages.
U.S. Appl. No. 14/740,320 for Tactile Switch for a Mobile Electronic Device filed Jun. 16, 2015 (Barndringa); 38 pages.
U.S. Appl. No. 14/740,373 for Calibrating a Volume Dimensioner filed Jun. 16, 2015 (Ackley et al.); 63 pages.
Wikipedia, “Evoked potential” downloaded from: https://en.wikipedia.org/wiki/Evoked_potential, Sep. 17, 2015, pp. 1-9.
Combined Search and Examination Report in related GB Application No. 1721791.0, dated Feb. 19, 2018, pp. 1-9 [All references previously cited.].
Related Publications (1)
Number Date Country
20170091741 A1 Mar 2017 US