This application is related in some aspects to the commonly owned and co-pending application entitled "Portable Device-Based Shopping Checkout," U.S. patent application Ser. No. 11/756,382, filed May 31, 2007, the entire contents of which are herein incorporated by reference.
The present invention generally relates to a smart scanning system. Specifically, the present invention relates to an integrated system in which a barcode scanner and an image capture device (e.g., camera) are commonly contained/housed/positioned.
Marketplace security has become a rising concern in recent years. Security and anti-theft concerns have only increased with the pervasiveness of scanners at checkout stations. It has become increasingly common for perpetrators to switch and/or alter barcodes so that an item can be obtained at a lower price. In addition, many retailers utilize image capture devices to catch shoplifters. Unfortunately, current placement of image capture devices is either awkward or of little use during the checkout process. Current placement of image capture devices near checkout stations inevitably puts them in the way of customers' arms, heads, or bodies, creating opportunities for collisions, with damage to the person and/or the image capture devices. It also creates problems for the store with regard to cleaning, theft, camera occlusions, etc.
In view of the foregoing, there exists a need for a solution that solves at least one of the above-referenced deficiencies in the related art.
In general, the present invention provides a smart scanning system comprising an integrated scanning and image capture system in which one or more image capture device(s) (e.g., still camera, video camera, etc.) and a barcode scanner are positioned within a common enclosure that is a component of a checkout station. The barcode of an item is scanned and an image of the item is recorded. It is then determined whether the identity of the item, as determined based on the barcode, is consistent with its appearance, as determined from the image. If not, a discrepancy is registered. It is then determined whether the discrepancy is due to fraud (e.g., theft) or device error. In the case of the latter, the system can be updated to prevent a repeat of the error.
A first aspect of the present invention provides a smart scanning method, comprising: receiving a scan of a barcode of an item via an integrated scanning and image capture system; determining an identity of the item based on the barcode; capturing an image of the item via the integrated scanning and image capture system; and determining whether the identity is consistent with an appearance of the item as determined from the image.
A second aspect of the present invention provides a smart scanning system, comprising: a module for receiving a scan of a barcode of an item, the barcode being scanned via an integrated scanning and image capture system; a module for determining an identity of the item based on the barcode; a module for receiving an image of the item, the image being captured via the integrated scanning and image capture system; a module for determining an appearance of the item based on the image; and a module for determining whether the identity is consistent with the appearance.
A third aspect of the present invention provides an integrated scanning and image capture system, comprising: a barcode scanner for scanning a barcode of an item; an image capture device for capturing an image of the item, the barcode scanner and the image capture device both being positioned within a common enclosure.
A fourth aspect of the present invention provides a program product stored on a computer readable medium for smart scanning, the computer readable medium comprising program code for causing a computer system to: receive a scan of a barcode of an item via an integrated scanning and image capture system; determine an identity of the item based on the barcode; capture an image of the item via the integrated scanning and image capture system; and determine whether the identity is consistent with an appearance of the item as determined from the image.
A fifth aspect of the present invention provides a method for deploying a smart scanning system, comprising: providing a computer infrastructure operable to: receive a scan of a barcode of an item via an integrated scanning and image capture system; determine an identity of the item based on the barcode; capture an image of the item via the integrated scanning and image capture system; and determine whether the identity is consistent with an appearance of the item as determined from the image.
A sixth aspect of the present invention provides computer software embodied in a propagated signal for smart scanning, the computer software comprising instructions for causing a computer system to: receive a scan of a barcode of an item via an integrated scanning and image capture system; determine an identity of the item based on the barcode; capture an image of the item via the integrated scanning and image capture system; and determine whether the identity is consistent with an appearance of the item as determined from the image.
A seventh aspect of the present invention provides a data processing system for smart scanning, comprising: a memory medium having instructions; a bus coupled to the memory medium; and a processor coupled to the bus that, when executing the instructions, causes the data processing system to: receive a scan of a barcode of an item via an integrated scanning and image capture system; determine an identity of the item based on the barcode; capture an image of the item via the integrated scanning and image capture system; and determine whether the identity is consistent with an appearance of the item as determined from the image.
An eighth aspect of the present invention provides a computer-implemented business method for smart scanning, comprising: receiving a scan of a barcode of an item via an integrated scanning and image capture system; determining an identity of the item based on the barcode; capturing an image of the item via the integrated scanning and image capture system; and determining whether the identity is consistent with an appearance of the item as determined from the image.
Any of these aspects can further include one or more of the following: register a discrepancy if the identity is inconsistent with the appearance; determine whether the discrepancy is the result of error or fraud; update the integrated scanning and image capture system to prevent the error from being repeated; and provide a notification of the discrepancy, the notification being at least one of a visual notification and an audible notification.
These and other features of this invention will be more readily understood from the following detailed description of the various aspects of the invention taken in conjunction with the accompanying drawings in which:
The drawings are not necessarily to scale. The drawings are merely schematic representations, not intended to portray specific parameters of the invention. The drawings are intended to depict only typical embodiments of the invention, and therefore should not be considered as limiting the scope of the invention. In the drawings, like numbering represents like elements.
For convenience, the Detailed Description of the Invention has the following Sections:
I. General Description
II. Computerized Implementation
I. General Description
As indicated above, the present invention provides a smart scanning system comprising an integrated scanning and image capture system in which one or more image capture device(s) (e.g., camera) and a barcode scanner are positioned within a common enclosure that is a component of a checkout station. The barcode of an item is scanned and an image of the item is recorded. The identity of the item, as determined based on the barcode, is compared to an appearance of the item, as determined based on its image. If the two are inconsistent, a discrepancy is registered. It is then determined whether the discrepancy is due to fraud (e.g., theft) or device error. In the case of the latter, the system can be updated to prevent a repeat of the error.
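The scan-compare-register flow described above can be illustrated with a minimal sketch. All names here (`process_scan`, `catalog`, `classifier`) are hypothetical; the invention does not prescribe a particular API or recognition technique.

```python
# Illustrative sketch of the checkout flow: look up the barcode-derived
# identity, classify the image-derived appearance, and register a
# discrepancy when the two disagree. Names are hypothetical.

def process_scan(barcode, image, catalog, classifier):
    """Compare barcode-derived identity against image-derived appearance."""
    identity = catalog[barcode]     # identity per the (possibly swapped) barcode
    appearance = classifier(image)  # identity per the captured image
    if identity != appearance:
        # Inconsistency: register a discrepancy for later fraud/error triage.
        return {"discrepancy": True, "identity": identity,
                "appearance": appearance}
    return {"discrepancy": False, "identity": identity}
```

In practice the classifier would be a trained appearance model; a plain callable stands in for it here.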
The integrated device of the present invention need not increase the volume of the enclosure used by previous systems that house only a barcode scanner. The enclosure of the present invention protects the image capture device(s) from collision, theft, dust and dirt, water and similar fluids, etc. The image capture device(s) are generally located and oriented so that: (1) they fit inside the small, already crowded space; (2) they are pointed such that they can usefully see the desired field of view; and (3) they do not interfere with the optical paths required by the pre-existing barcode scanner light source, mirrors, and reflected-light intensity sensor. In addition, they do not require increasing the dimensions of the glass used, so that no additional risk of glass breakage is incurred.
Referring now to
Image capture device(s) 14 is placed behind shield 17 (glass, plastic, etc.) as shown. Where multiple image capture devices 14 are used, one is placed so that it looks out along a path more or less normal to the vertical glass surface of the barcode scanner, and the other looks more or less straight up from below the horizontal surface of the scanner. The latter image capture device is also placed off to one side, so that it is actually under the (e.g., steel) rim surrounding the existing shield 17 in that surface. As such, a hole (e.g., 1″ in diameter in an illustrative embodiment) is cut in the steel rim and covered with a separate, transparent plate. Regardless (as stated above), smart scanning system 10 also includes barcode scanning and image processing software to perform the functions described herein. This software is stored on a memory medium that may or may not be positioned within the enclosure.
Referring now to
Before, after, or simultaneously with the scan of the barcode, an image 20 of item 16 will be captured by image capture device 14 (shown in
Thereafter, smart scanning system 10 will compare the identity of item 16, as determined based on the scan of barcode 18, to the appearance, as determined based on the image, to determine whether the two are consistent with one another. If not, smart scanning system 10 will register a discrepancy and provide a notification of the discrepancy (at least one of a visual notification and an audible notification). Where there is a discrepancy, smart scanning system 10 will then determine whether the discrepancy is the result of error or fraud. For example, was the barcode tampered with or changed, or was the discrepancy due to device error? Where device error caused the discrepancy, smart scanning system 10 can be updated (e.g., in response to an operator's input) to reflect the true identity of the item and its association with the image just captured.
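The fraud-versus-error triage described above can be sketched as follows. The operator-confirmation signal and all function names are assumptions for illustration; the invention leaves the triage mechanism open.

```python
# Hypothetical triage of a registered discrepancy. If an operator visually
# confirms that the item matches the barcode's identity, the appearance
# model mis-recognized it (device error) and can be updated; otherwise the
# barcode itself did not match the item (possible tampering/fraud).

def handle_discrepancy(identity, appearance, operator_confirmed_item):
    """Classify a discrepancy as device error or possible fraud."""
    if operator_confirmed_item == identity:
        return "device_error"    # update system: associate image with identity
    return "possible_fraud"      # barcode may have been switched or altered

def update_appearance_model(model, identity, image_feature):
    """Record the true identity/image pairing so the error is not repeated."""
    model.setdefault(identity, []).append(image_feature)
    return model
```

For example, if the barcode says "bananas" but the image classifier says "steak" and the operator confirms the item really is bananas, the mismatch is treated as device error and the appearance model is updated.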
II. Computerized Implementation
Referring now to
As shown, smart scanning system 10 includes a processing unit 106, a memory 108, a bus 110, and device interfaces 112. Further, smart scanning system 10 is shown having barcode scanner 12, image capture device 14, and storage system 116, which communicate with bus 110 via device interfaces 112 (although barcode scanner 12 and/or image capture device 14 alternatively could communicate directly with bus 110). In general, processing unit 106 executes computer program code, such as image processing program 118 and barcode scanning program 122, which are stored in memory 108 and/or storage system 116. While executing computer program code, processing unit 106 can read and/or write data to/from memory 108, storage system 116, and/or device interfaces 112. Bus 110 provides a communication link between each of the components in smart scanning system 10. Although not shown, smart scanning system 10 could also include I/O interfaces that communicate with: one or more external devices (e.g., a cash register, a keyboard, a pointing device, a display, etc.); one or more devices that enable a user to interact with smart scanning system 10; and/or any devices (e.g., network card, modem, etc.) that enable smart scanning system 10 to communicate with one or more other computing devices.
Computer infrastructure 102 is only illustrative of various types of computer infrastructures for implementing the invention. For example, in one embodiment, computer infrastructure 102 comprises two or more computing devices (e.g., a server cluster) that communicate over a network to perform the various processes of the invention. Moreover, smart scanning system 10 is only representative of various possible computer systems that can include numerous combinations of hardware. To this extent, in other embodiments, smart scanning system 10 can comprise any specific purpose computing article of manufacture comprising hardware and/or computer program code for performing specific functions, any computing article of manufacture that comprises a combination of specific purpose and general purpose hardware/software, or the like. In each case, the program code and hardware can be created using standard programming and engineering techniques, respectively. Moreover, processing unit 106 may comprise a single processing unit, or be distributed across one or more processing units in one or more locations, e.g., on a client and server. Similarly, memory 108 and/or storage system 116 can comprise any combination of various types of data storage and/or transmission media that reside at one or more physical locations. Further, device interfaces 112 can comprise any module for exchanging information with one or more external devices 114. Still further, it is understood that one or more additional components (e.g., system software, math co-processing unit, etc.) not shown in
Storage system 116 can be any type of system capable of providing storage for information under the present invention. To this extent, storage system 116 could include one or more storage devices, such as a magnetic disk drive or an optical disk drive. In another embodiment, storage system 116 includes data distributed across, for example, a local area network (LAN), wide area network (WAN) or a storage area network (SAN) (not shown). In addition, although not shown, additional components, such as cache memory, communication systems, system software, etc., may be incorporated into smart scanning system 10.
Shown in memory 108 of smart scanning system 10 is image processing program 118, which contains a set (at least one) of modules 120. The modules generally provide the functions of the present invention as described herein. Specifically (among other things), set of modules 120 is configured to: receive a scan of a barcode of item 16 via barcode scanner 12; determine an identity of the item based on the barcode; capture an image of item 16 via image capture device 14; process the image (e.g., segment item 16 from the background, extract visual feature(s) of item 16); determine an appearance of item 16 based on the image; determine whether the identity is consistent with the appearance; register a discrepancy if the identity is inconsistent with the appearance; determine whether the discrepancy is the result of error or fraud; update the integrated scanning and image capture system to prevent the error from being repeated; and provide a notification of the discrepancy, the notification being at least one of a visual notification and an audible notification.
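The image-processing steps named above (segment, extract feature, determine appearance) can be sketched with deliberately toy stand-ins. The thresholding segmentation, mean-intensity feature, and nearest-neighbor match below are assumptions chosen for brevity; a deployed system would use substantially richer segmentation and features.

```python
# Toy sketch of the module pipeline: segment the item from the background,
# extract a visual feature, and match the feature against known appearances.

def segment(pixels, background_value=0):
    """Drop background pixels (a naive stand-in for real segmentation)."""
    return [p for p in pixels if p != background_value]

def extract_feature(item_pixels):
    """Mean intensity as a toy appearance feature."""
    return sum(item_pixels) / len(item_pixels)

def determine_appearance(feature, known_appearances):
    """Nearest known appearance by absolute feature distance."""
    return min(known_appearances,
               key=lambda name: abs(known_appearances[name] - feature))
```

Chaining the three calls yields the appearance label that is then checked against the barcode-derived identity.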
While shown and described herein as a smart scanning system, method, and program product, it is understood that the invention further provides various alternative embodiments. For example, in one embodiment, the invention provides a computer-readable/useable medium that includes computer program code to enable a computer infrastructure to provide smart scanning. To this extent, the computer-readable/useable medium includes program code that implements each of the various processes of the invention. It is understood that the terms computer-readable medium or computer useable medium comprise one or more of any type of physical embodiment of the program code. In particular, the computer-readable/useable medium can comprise program code embodied on one or more portable storage articles of manufacture (e.g., a compact disc, a magnetic disk, a tape, etc.), on one or more data storage portions of a computing device, such as memory 108 (
In another embodiment, the invention provides a business method that performs the process of the invention on a subscription, advertising, and/or fee basis. That is, a service provider, such as a Solution Integrator, could offer to provide smart scanning. In this case, the service provider can create, maintain, and support, etc., a computer infrastructure, such as computer infrastructure 102 (
In still another embodiment, the invention provides a computer-implemented method for smart scanning. In this case, a computer infrastructure, such as computer infrastructure 102 (
As used herein, it is understood that the terms "program code" and "computer program code" are synonymous and mean any expression, in any language, code or notation, of a set of instructions intended to cause a computing device having an information processing capability to perform a particular function either directly or after either or both of the following: (a) conversion to another language, code or notation; and/or (b) reproduction in a different material form. To this extent, program code can be embodied as one or more of: an application/software program, component software/a library of functions, an operating system, a basic device system/driver for a particular computing and/or I/O device, and the like.
A data processing system suitable for storing and/or executing program code can be provided hereunder and can include at least one processor communicatively coupled, directly or indirectly, to memory element(s) through a system bus. The memory elements can include, but are not limited to, local memory employed during actual execution of the program code, bulk storage, and cache memories that provide temporary storage of at least some program code in order to reduce the number of times code must be retrieved from bulk storage during execution. Input/output or I/O devices (including, but not limited to, keyboards, displays, pointing devices, etc.) can be coupled to the system either directly or through intervening device controllers.
Network adapters also may be coupled to the system to enable the data processing system to become coupled to other data processing systems, remote printers, storage devices, and/or the like, through any combination of intervening private or public networks. Illustrative network adapters include, but are not limited to, modems, cable modems and Ethernet cards.
The foregoing description of various aspects of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and obviously, many modifications and variations are possible. Such modifications and variations that may be apparent to a person skilled in the art are intended to be included within the scope of the invention as defined by the accompanying claims.
Number | Date | Country
---|---|---
20080296382 A1 | Dec 2008 | US