In general, the present invention relates to barcodes. Specifically, the present invention relates to time-varying barcodes in an active display for information exchange.
A barcode is a static, optical machine-readable representation (image) of data. The bars are read by measuring variances in reflected light. Barcode readers are relatively inexpensive and more accurate than key entry. However, issues arise when the barcode image is low resolution or has become obstructed or damaged in some way, causing a failed read or a misread of the barcode information. Barcodes are also susceptible to the limitations of the printer and reader. For example, barcodes printed on dark backgrounds like corrugated cardboard may be difficult to read. Heretofore, several unsuccessful attempts have been made to address these shortcomings.
U.S. Patent Application 20110000958 discloses a method and system for communicating encoded information through “animated” barcodes, wherein a single bar code area on an electronic device's display or television is scanned multiple times while the bar code area changes from one bar code image to another.
U.S. Patent Application 20100020970 discloses a system and method for creating a camera imaging data channel by encoding a sequence of bar codes that is displayed on a display screen, captured by a camera, and then decoded by software on a cell phone or similar device.
U.S. Patent Application 20060054695 discloses a dynamic bar code display apparatus that includes a storage medium and means for displaying at least two or more bar codes continuously.
U.S. Pat. No. 7,360,706 and U.S. Pat. No. 7,273,180 disclose a hand-supportable digital image-based bar code symbol reading device.
U.S. Pat. No. 5,591,952 discloses a bar code reader that utilizes a CCD imager device to capture the image; the memory data from the imager device is then analyzed to recognize and decode any symbols included within the image.
U.S. Pat. No. 5,278,397 discloses a multi-resolution bar code reader in which the bar code reader's optics and sensing elements are organized to send two channels of data derived from a bar code scan.
U.S. Pat. No. 5,073,954 discloses a bar code location and recognition processing system in which a bar code is optically scanned and a digital video processor converts the scan to binary data and determines the location and pattern of the bar code in the scan image.
U.S. Patent Application 20080277475 discloses a digital image capture and processing system that combines video and snapshot image captures into a single bar code data capture cycle.
U.S. Patent Application 20070199993 and U.S. Patent Application 20070187509 disclose a hand-supportable digital bar code reader that has multiple modes of image processing capabilities that include reading both 1D and 2D bar code symbols.
None of these references, however, teach the use of an error-identifying or two-way communication feed-back loop in a dual electronic device apparatus that uses image display and image capturing devices to communicate between the devices via bar codes in at least one direction. Furthermore, none of these references teach the use of bar codes that have varying sections of bar code image pattern resolution within a single bar code pattern.
In general, embodiments of the present invention provide barcode sequences in an active display for information exchange. Specifically, embodiments of the present invention provide a system and method for communicating information between electronic devices via a barcode image sequence. In a typical embodiment, a barcode image sequence is displayed on the display screen of a first electronic device. A second electronic device reads and decodes the barcode image sequence. The second electronic device displays an acknowledgement on the display screen of the second electronic device. The acknowledgement is read by the first electronic device.
A first aspect of the present invention provides a data communication system for communicating information between electronic devices via a barcode image sequence, said system comprising: an electronic device, comprising: a camera configured to read a barcode image sequence from another electronic device; a barcode decoding component configured to decode the barcode image sequence; and a display component configured to display an acknowledgement on a screen of the electronic device, wherein the electronic device determines a camera location of the other electronic device relative to the screen of the electronic device by performing an alignment detection algorithm.
A second aspect of the present invention provides a method for communicating information between electronic devices via a barcode image sequence, comprising: reading a barcode image sequence displayed on a screen using a camera of an electronic device; decoding the barcode image sequence; determining a camera location of another electronic device relative to a display screen of the electronic device; and displaying an acknowledgement on the display screen of the electronic device based on the determined camera location.
A third aspect of the present invention provides a method for providing a data communication system for communicating information between electronic devices via a barcode image sequence, said system comprising: providing an electronic device, comprising: a camera configured to read a barcode image sequence from another electronic device; a barcode decoding component configured to decode the barcode image sequence; and a display component configured to display an acknowledgement on a screen of the electronic device, wherein the electronic device determines a camera location of the other electronic device relative to the screen of the electronic device by performing an alignment detection algorithm.
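Read together, these aspects describe one device built from a camera that captures a barcode image sequence, a barcode decoding component, a display component for the acknowledgement, and an alignment-detection step that locates the other device's camera relative to the screen. The following Python skeleton is only an illustrative sketch of that structure; every class and method name in it (ReceiverDevice, decode_sequence, detect_camera_location, and so on) is an assumption introduced here for illustration, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class Frame:
    """One image captured by the camera (placeholder for raw pixel data)."""
    pixels: bytes


class ReceiverDevice:
    """Hypothetical device combining the camera, decoder, and display components."""

    def capture_frames(self, count: int) -> List[Frame]:
        # Camera component: capture `count` frames of the other device's screen.
        raise NotImplementedError("camera access is device-specific")

    def decode_sequence(self, frames: List[Frame]) -> Optional[bytes]:
        # Barcode decoding component: decode each frame and reassemble the payload,
        # returning None if the sequence could not be read.
        raise NotImplementedError("plug in any 1D/2D barcode decoder here")

    def detect_camera_location(self) -> Tuple[int, int]:
        # Alignment-detection algorithm: estimate where the other device's camera
        # sits relative to this device's screen, in screen coordinates.
        raise NotImplementedError

    def display_acknowledgement(self, received_ok: bool, at: Tuple[int, int]) -> None:
        # Display component: render the acknowledgement near the detected location
        # so the other device's camera can read it.
        raise NotImplementedError
```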
These and other features of this invention will be more readily understood from the following detailed description of the various aspects of the invention taken in conjunction with the accompanying drawings.
The drawings are not necessarily to scale. The drawings are merely schematic representations, not intended to portray specific parameters of the invention. The drawings are intended to depict only typical embodiments of the invention, and therefore should not be considered as limiting the scope of the invention. In the drawings, like numbering represents like elements.
Illustrative embodiments will now be described more fully herein with reference to the accompanying drawings, in which exemplary embodiments are shown. This disclosure may, however, be embodied in many different forms and should not be construed as limited to the exemplary embodiments set forth herein. Rather, these exemplary embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of this disclosure to those skilled in the art. In the description, details of well-known features and techniques may be omitted to avoid unnecessarily obscuring the presented embodiments.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of this disclosure. As used herein, the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, the use of the terms “a”, “an”, etc., does not denote a limitation of quantity, but rather denotes the presence of at least one of the referenced items. It will be further understood that the terms “comprises” and/or “comprising”, or “includes” and/or “including”, when used in this specification, specify the presence of stated features, regions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, components, and/or groups thereof.
Hereinafter, a camera includes any electronic device or component configured to capture and provide signals and/or data representative of video images. Video images include, but are not limited to, barcode images.
The barcode can be traced back to 1948, but it did not have a large impact until the 1970s, when it became a tool for alleviating manual inventories. Grocery store owners began to see it as a way to save time and money in tracking product levels within the store. Throughout the 1970s, barcode scanning systems became more affordable and practical as barcode readers continued to fall in price and shrink in size.
A barcode reader, however, usually needs a fairly good picture of the symbol in order to decode it. Barcodes can store a large amount of data, but they grow in size and complexity relative to the amount of data stored. The bigger and more complex the barcode, the better the picture required to decode it.
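As a concrete illustration of that growth, the short Python snippet below (purely illustrative; it uses the open-source `qrcode` package, which is not part of this disclosure) encodes payloads of increasing length and prints the size of the smallest QR symbol that holds each one.

```python
import qrcode  # third-party package: pip install qrcode

for payload_len in (25, 250, 1000):
    qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_M)
    qr.add_data("x" * payload_len)
    qr.make(fit=True)  # choose the smallest QR version that fits the data
    side = 17 + 4 * qr.version  # a version-n QR symbol is (17 + 4n) modules wide
    print(f"{payload_len} bytes -> QR version {qr.version}, {side}x{side} modules")
```

Larger payloads force a higher QR version, i.e., a denser symbol that in turn demands a sharper camera image to decode, which is the trade-off the barcode image sequence described below is meant to relieve.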
As indicated above, embodiments of the present invention provide a system and method for communicating information between electronic devices via a barcode image sequence. In a typical embodiment, a barcode image sequence is displayed on the display screen of a first electronic device. A second electronic device reads and decodes the barcode image sequence. The second electronic device displays an acknowledgement on the display screen of the second electronic device. The acknowledgement is read by the first electronic device.
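The ordering of those steps can be sketched as follows; the device objects and their methods (show_barcodes, capture_and_decode, show_ack, read_ack) are hypothetical placeholders for the display and camera hardware of the two devices, so this is a sequence sketch rather than an implementation.

```python
def exchange(first_device, second_device, payload: bytes) -> bool:
    """Sketch of one display/read/acknowledge cycle between two devices."""
    # 1. The first device renders the payload as a barcode image sequence
    #    on its display screen.
    first_device.show_barcodes(payload)

    # 2. The second device reads that sequence with its camera and decodes it.
    received = second_device.capture_and_decode()

    # 3. The second device displays an acknowledgement on its own screen.
    second_device.show_ack(received is not None)

    # 4. The first device reads the acknowledgement with its camera and
    #    learns whether the transfer succeeded or must be retried.
    return first_device.read_ack()
```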
Referring now to
Each data packet is coded as a barcode image with additional error-correcting code. For example, T1 displays an image on its display screen. The camera on T2 reads the image. Any errors introduced in transmission are detected and corrected based on the error-correcting code. The display screen of T2 then displays a checksum, which is read by the camera of T1. T1 verifies the transmission results. If the error rate increases, the image resolution may be decreased or the cameras may need to be realigned.
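A minimal sketch of that per-packet coding and of the sender's feedback decision is given below. It assumes a CRC-32 checksum from the Python standard library and a simple byte-repetition code with majority voting as a stand-in for a real error-correcting code; the disclosure does not prescribe any of these particular choices.

```python
import zlib
from collections import Counter
from typing import Optional


def encode_packet(data: bytes, repeat: int = 3) -> bytes:
    """Append a CRC-32 checksum, then repeat each byte as a toy error-correcting code."""
    framed = data + zlib.crc32(data).to_bytes(4, "big")
    return bytes(b for b in framed for _ in range(repeat))


def decode_packet(coded: bytes, repeat: int = 3) -> Optional[bytes]:
    """Majority-vote each byte group, then verify the checksum; None means a failed read."""
    groups = [coded[i:i + repeat] for i in range(0, len(coded), repeat)]
    framed = bytes(Counter(group).most_common(1)[0][0] for group in groups)
    data, crc = framed[:-4], framed[-4:]
    return data if zlib.crc32(data).to_bytes(4, "big") == crc else None


def needs_adjustment(failed_packets: int, total_packets: int, limit: float = 0.1) -> bool:
    """Sender-side (T1) feedback rule: too many failures suggests the image
    resolution should be lowered or the cameras realigned."""
    return total_packets > 0 and failed_packets / total_packets > limit
```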
A similar camera resolution check sequence is also run to determine the resolution capability of the camera on T1. T2 displays a sequence of images on its display screen, beginning with the simplest image and progressing in order of increasing complexity. The camera on T1 reads each image as it is displayed and responds through its display screen with a spatial and temporal code. T2 determines the resolution capability of the camera on T1 based on the codes read from the display screen of T1.
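A hypothetical version of that resolution probe is sketched below; show_pattern(level) and response_decoded() are assumed, device-specific callables standing in for T2's display of a test image and for T2 reading the spatial and temporal code that T1 puts on its own screen.

```python
from typing import Callable


def probe_camera_resolution(
    show_pattern: Callable[[int], None],    # T2 displays a test image of this complexity
    response_decoded: Callable[[], bool],   # T2 reads T1's on-screen response code
    levels: int = 8,
) -> int:
    """Return the most complex pattern level that T1's camera still resolved."""
    best = 0
    for level in range(1, levels + 1):      # simplest image first, then more complex
        show_pattern(level)
        if response_decoded():
            best = level                    # T1 read this level successfully
        else:
            break                           # first failure bounds the camera's capability
    return best
```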
T2 repeats the process to determine the location of T1's camera relative to the screen of T2. T2 displays an image or multiple images on its screen. The camera of T1 reads part of the display screen of T2 and generates a whole-screen response on the display screen of T1. T2 reads the response from the display of T1 to determine the camera location of T1 on the screen of T2.
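One simple way to realize that location step, offered here only as an assumed example (the disclosure does not fix a particular algorithm), is a grid scan: T2 lights one region of its screen at a time, and T1 answers across its whole screen whether its camera saw the lit region.

```python
from typing import Callable, Optional, Tuple


def locate_camera(
    light_cell: Callable[[int, int], None],  # T2 displays an image in grid cell (row, col)
    camera_saw_it: Callable[[], bool],       # T2 reads T1's whole-screen response
    grid: int = 4,
) -> Optional[Tuple[int, int]]:
    """Return the grid cell of T2's screen that T1's camera is pointed at, if any."""
    for row in range(grid):
        for col in range(grid):
            light_cell(row, col)
            if camera_saw_it():              # T1 reported seeing this cell
                return row, col
    return None                              # the camera is not aimed at the screen
```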
The foregoing description of various aspects of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed and, obviously, many modifications and variations are possible. Such modifications and variations that may be apparent to a person skilled in the art are intended to be included within the scope of the invention as defined by the accompanying claims.
This application is a continuation of commonly-owned co-pending application Ser. No. 13/661,443 filed Oct. 26, 2012. This earlier filed application is itself a continuation application of commonly-owned application Ser. No. 13/556,831, filed Jul. 24, 2012 (now U.S. Pat. No. 8,408,462, issued Apr. 2, 2013). That earlier filed application is itself a continuation application of commonly-owned application Ser. No. 13/106,514, filed on May 12, 2011 (now U.S. Pat. No. 8,256,673, issued Sep. 4, 2012).
Number | Name | Date | Kind |
---|---|---|---|
5073954 | Van Tyne et al. | Dec 1991 | A |
5278397 | Barkan et al. | Jan 1994 | A |
5591952 | Krichever et al. | Jan 1997 | A |
7089420 | Durst et al. | Aug 2006 | B1 |
7162035 | Durst et al. | Jan 2007 | B1 |
7273180 | Zhu et al. | Sep 2007 | B2 |
7360706 | Zhu et al. | Apr 2008 | B2 |
7578436 | Kiliccote | Aug 2009 | B1 |
7946493 | Havens et al. | May 2011 | B2 |
8231054 | Kim | Jul 2012 | B1 |
8256673 | Kim | Sep 2012 | B1 |
8408462 | Kim | Apr 2013 | B2 |
8418922 | Kim | Apr 2013 | B1 |
20020067865 | Stutzman | Jun 2002 | A1 |
20020099942 | Gohl | Jul 2002 | A1 |
20050005102 | Meggitt et al. | Jan 2005 | A1 |
20050038756 | Nagel | Feb 2005 | A1 |
20050199699 | Sato et al. | Sep 2005 | A1 |
20050246536 | Roberts | Nov 2005 | A1 |
20060002610 | Suomela et al. | Jan 2006 | A1 |
20060052058 | Lai et al. | Mar 2006 | A1 |
20060054695 | Owada | Mar 2006 | A1 |
20060071077 | Suomela et al. | Apr 2006 | A1 |
20060101280 | Sakai | May 2006 | A1 |
20070019616 | Rantapuska et al. | Jan 2007 | A1 |
20070021065 | Sengupta et al. | Jan 2007 | A1 |
20070109262 | Oshima | May 2007 | A1 |
20070187509 | Kotlarsky et al. | Aug 2007 | A1 |
20070199993 | Kotlarsky et al. | Aug 2007 | A1 |
20070211148 | Lev | Sep 2007 | A1 |
20070242883 | Kruppa | Oct 2007 | A1 |
20080099561 | Douma | May 2008 | A1 |
20080203167 | Soule et al. | Aug 2008 | A1 |
20080230615 | Read et al. | Sep 2008 | A1 |
20080244714 | Kulakowski et al. | Oct 2008 | A1 |
20080277475 | Kotlarsky et al. | Nov 2008 | A1 |
20090176505 | Van Deventer | Jul 2009 | A1 |
20090308927 | Longacre et al. | Dec 2009 | A1 |
20100020970 | Liu et al. | Jan 2010 | A1 |
20100030695 | Chen et al. | Feb 2010 | A1 |
20100112279 | McIntosh | May 2010 | A1 |
20100125497 | Arguello | May 2010 | A1 |
20100210287 | De Vries | Aug 2010 | A1 |
20110000958 | Herzig | Jan 2011 | A1 |
20110070829 | Griffin et al. | Mar 2011 | A1 |
20110081860 | Brown et al. | Apr 2011 | A1 |
20120045059 | Fujinami | Feb 2012 | A1 |
20120077433 | Walker et al. | Mar 2012 | A1 |
20120141660 | Fiedler | Jun 2012 | A1 |
20120198531 | Ort et al. | Aug 2012 | A1 |
20120264401 | Hwang | Oct 2012 | A1 |
20120292392 | Kim | Nov 2012 | A1 |
20120298752 | Kim | Nov 2012 | A1 |
20130031261 | Suggs | Jan 2013 | A1 |
20130031623 | Sanders | Jan 2013 | A1 |
20130133086 | Liberman | May 2013 | A1 |
20130221083 | Doss et al. | Aug 2013 | A1 |
20130240621 | Everett | Sep 2013 | A1 |
20140004793 | Bandyopadhyay et al. | Jan 2014 | A1 |
20140113550 | Li | Apr 2014 | A1 |
20140117074 | Kim | May 2014 | A1 |
20140330993 | Raz | Nov 2014 | A1 |
20140334665 | Quinn et al. | Nov 2014 | A1 |
20150054917 | Coon | Feb 2015 | A1 |
20150138608 | Turner et al. | May 2015 | A1 |
20160267369 | Picard et al. | Sep 2016 | A1 |
Entry |
---|
Sandiford, U.S. Appl. No. 13/412,792, Office Action, dated Feb. 25, 2014, 43 pages. |
Sandiford, U.S. Appl. No. 13/412,792, Final Office Action, dated Sep. 5, 2014, 28 pages. |
Sandiford, U.S. Appl. No. 13/412,792, Office Action, dated Dec. 3, 2014, 38 pages. |
Sandiford, U.S. Appl. No. 13/412,792, Notice of Allowance, dated May 15, 2015, 10 pages. |
Trail, U.S. Appl. No. 13/106,514, Notice of Allowance, dated Feb. 3, 2012, 8 pages. |
Trail, U.S. Appl. No. 13/556,831, Office Action, dated Aug. 29, 2012, 14 pages. |
Trail, U.S. Appl. No. 13/556,831, Notice of Allowance, dated Jan. 9, 2013, 14 pages. |
Tardif, U.S. Appl. No. 13/661,443, Office Action, dated Sep. 12, 2016, 13 pages. |
Trail, U.S. Appl. No. 13/113,205, Office Action, dated Oct. 7, 2011, 12 pages. |
Trail, U.S. Appl. No. 13/113,205, Notice of Allowance, dated Mar. 30, 2012, 8 pages. |
Trail, U.S. Appl. No. 13/556,737, Office Action, dated Aug. 29, 2012, 13 pages. |
Trail, U.S. Appl. No. 13/556,737, Notice of Allowance, dated Jun. 20, 2013, 19 pages. |
Trail, U.S. Appl. No. 13/626,119, Office Action, dated Nov. 7, 2012, 8 pages. |
Trail, U.S. Appl. No. 13/626,119, Notice of Allowance, dated Mar. 11, 2013, 15 pages. |
Sandiford, U.S. Appl. No. 14/798,787, Notice of Allowance, dated Dec. 6, 2016, 13 pages. |
Trail, U.S. Appl. No. 13/106,514, Office Action, dated Oct. 7, 2011, 12 pages. |
Tardif, U.S. Appl. No. 13/661,443, Office Action, dated Oct. 7, 2013, 12 pages. |
Tardif, U.S. Appl. No. 13/661,443, Office Action, dated Mar. 13, 2014, 15 pages. |
Tardif, U.S. Appl. No. 13/661,443, Final Office Action, dated Oct. 22, 2014, 8 pages. |
Tardif, U.S. Appl. No. 13/661,443, Office Action, dated Apr. 10, 2015, 8 pages. |
Sandiford, U.S. Appl. No. 14/798,787, Office Action, dated Sep. 29, 2016, 23 pages. |
Number | Date | Country | |
---|---|---|---|
20160292476 A1 | Oct 2016 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13661443 | Oct 2012 | US |
Child | 15174167 | US | |
Parent | 13556831 | Jul 2012 | US |
Child | 13661443 | US | |
Parent | 13106514 | May 2011 | US |
Child | 13556831 | US |