This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2010-030541, filed on Feb. 15, 2010; the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to a technique of reading a code symbol such as a barcode attached to an article, using a camera such as a charge coupled device (CCD) camera.
A technique of detecting and slicing off a barcode from image data including characters, patterns and the like is already known. Thus, using this technique, a code symbol reading apparatus has recently been developed which reads a code symbol such as a barcode or two-dimensional data code attached to an article.
For example, there is a code symbol reading apparatus having a camera, an image display unit, and a decoder. The camera at least picks up an image of a code symbol and outputs the image data of the code symbol. The image display unit displays the image data outputted from the camera, in real time as a dynamic image. The decoder decodes the code symbol based on the image data outputted from the camera.
The code symbol reading apparatus can allow an operator to recognize the reading state of the code symbol. Therefore, the operator can adjust the direction and position of the code symbol so that the code symbol can be securely read.
However, an operator who is unfamiliar with the operation cannot determine the direction in which the code symbol should be moved even when viewing the dynamic image, and the adjustment often takes a long time.
In general, according to one embodiment, a code symbol reading apparatus includes a decoder, a candidate area detection unit, a direction determination unit, and a direction notification unit. The decoder decodes a code symbol attached to an article based on an image picked up by a camera, the image including the article. The candidate area detection unit detects an image area to be a candidate of the code symbol from the image of the article picked up by the camera. The direction determination unit determines a direction in which the decoding rate of the code symbol becomes higher, based on the image area detected by the candidate area detection unit, when the decoder cannot decode the code symbol. The direction notification unit notifies of the direction determined by the direction determination unit.
Hereinafter, an embodiment of the code symbol reading apparatus will be described using the drawings. In this embodiment, the code symbol reading apparatus is applied to a barcode reading apparatus 8 incorporated in a self-scanning type checkout terminal 1.
A weight measuring unit to measure the weight of an article placed on the receiving surface 2a is provided on the receiving surface 2a of the unregistered article placing table 2. A weight measuring unit to measure the weight of an article placed on the receiving surface 3a is also provided on the receiving surface 3a of the registered article placing table 3. The weight measured by these weight measuring units is used for weight check in order to prevent failure to register an article or false registration.
A display device 6 is attached on top of the terminal body 4. The display device 6 is a cathode ray tube (CRT) display, liquid crystal display, organic electro-luminescence (organic EL) display or the like. A touch panel 6b is arranged on a screen 6a of the display device 6.
An electronic settlement terminal 7 is attached to a lateral side of the terminal body 4. The electronic settlement terminal 7 carries out wireless communication with an electronic money medium and performs electronic settlement of the price in a commercial transaction.
A barcode reading apparatus 8 and a receipt printer 9 are installed inside the terminal body 4. Also, a barcode reading window 10 and a receipt issue port 11 are formed on the front side of the terminal body 4. The barcode reading apparatus 8 reads a barcode symbol attached to an article held over a glass surface of the barcode reading window 10. The receipt printer 9 prints a receipt on which the content of a commercial transaction is recorded, and issues the receipt via the receipt issue port 11.
The controller 21 serves as a control center of the barcode reading apparatus 8 and mainly includes a central processing unit (CPU). The program storage unit 22 stores a control program to operate the controller 21.
The camera 23 includes a CCD image pickup element as an area image sensor, a driving circuit for the CCD image pickup element, and an image pickup lens to form an image in an image pickup area on the CCD image pickup element. The area of an image formed in the area of the CCD image pickup element through the image pickup lens from the barcode reading window 10 is the image pickup area. The camera 23 outputs the image in the image pickup area to the controller 21 on a frame basis. In this embodiment, a frame-based image is called a frame image.
The image memory 24 sequentially unfolds and stores the frame image outputted from the camera 23. The decoder 25 decodes barcode data, based on image data in an area sliced as a barcode candidate from the frame image unfolded in the image memory 24.
The interface 26 outputs the barcode data decoded by the decoder 25 to a main CPU of the checkout terminal 1. The buzzer 27 outputs a predetermined reading completion sound in response to the output of the barcode data via the interface 26.
The message table 28 stores data of a guide message to the operator. The image display area 29 displays the frame image picked up by the camera 23, in real time.
The controller 21 realizes the functions of an image display unit 31, a candidate area detection unit 32, a direction determination unit 33 and a direction notification unit 34 according to the control program stored in the program storage unit 22. These functions will be described with reference to the flowcharts of
As the control program is started, the controller 21 starts the processing shown in
The image display area 29 is formed in a part of an article registration waiting screen 40 displayed on the display device 6 of the checkout terminal 1.
Here, the image display unit 31 which displays the image picked up by the camera 23 on the display device 6 is realized by each processing of ACT 1, ACT 2 and ACT 3. The image displayed by the image display unit 31 is a mirror image acquired by reversing the image picked up by the camera 23 in the left-right direction.
Next, the controller 21 analyzes the frame image stored in the image memory 24 and detects an image area that is assumed to include a barcode symbol, that is, a so-called barcode candidate image area (ACT 4). This processing uses, for example, the technique disclosed in JP-A-2005-266907 laid open in Japan.
The controller 21 determines whether a barcode candidate image area is successfully detected or not (ACT 5). When a barcode candidate image area is not successfully detected (NO in ACT 5), the controller 21 returns to the processing of ACT 1. That is, the controller 21 takes in the next frame image from the camera 23 and executes each processing of ACTs 2, 3 and 4 again.
When a barcode candidate image area is successfully detected (YES in ACT 5), the controller 21 identifies and displays the barcode candidate image area from the picked-up image displayed in the image display area 29. Specifically, the controller 21 encloses the barcode candidate image area with a frame (ACT 6).
When plural areas are simultaneously detected as barcode candidate image areas, the controller 21 decides priority of these areas as barcode candidates. The priority is decided based on a determination condition such as the proportion of the size of the barcode candidate image area to the flat area of the article, or the direction of the longitudinal side of the barcode candidate image area to the contour shape of the article. The controller 21 selects, identifies and displays the barcode candidate image area with the highest priority. That is, the controller 21 encloses the selected barcode candidate image area with a frame. Alternatively, the controller 21 encloses all the detected barcode candidate image areas with frames and changes the color of only the frame of the barcode candidate image area with the highest priority. The identification and display method is not limited to enclosing of the area with a frame.
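The priority decision described above can be expressed as a short sketch. Python is used here only for illustration; the function and field names, and the equal weighting of the two determination conditions, are assumptions, since the embodiment does not specify concrete formulas.

```python
# Hypothetical sketch of the priority decision among plural barcode
# candidate image areas. The two determination conditions named in the
# embodiment are scored and combined with equal weight (an assumption).

def rank_candidates(candidates, article):
    """Order candidate areas from highest to lowest priority.

    candidates: list of dicts with 'width', 'height' and 'angle' keys,
        where 'angle' is the direction of the longitudinal side in degrees.
    article: dict with 'area' (flat area of the article) and
        'contour_angle' (direction of the article's contour) keys.
    """
    def score(c):
        # Proportion of the candidate's size to the flat area of the article.
        size_ratio = (c['width'] * c['height']) / article['area']
        # Agreement between the candidate's longitudinal side and the
        # contour shape of the article (closer angles score higher).
        alignment = 1.0 - abs(c['angle'] - article['contour_angle']) / 90.0
        return size_ratio + alignment  # equal weighting is an assumption
    return sorted(candidates, key=score, reverse=True)
```

The highest-priority area, `rank_candidates(...)[0]`, would then be the one enclosed with a frame (or given the differently colored frame) in ACT 6.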
Here, the candidate area detection unit 32 which detects an image area to be a candidate of a barcode from the image picked up by the camera 23 is realized by each processing of ACT 4, ACT 5 and ACT 6. The processing of ACT 6 can be omitted from the candidate area detection unit 32.
After identifying and displaying the barcode candidate image area, the controller 21 decodes barcode data using the decoder 25 (ACT 7). When plural areas are simultaneously detected as barcode candidate image areas, the controller 21 decodes barcode data in order from the area with the highest priority.
Generally, when a barcode symbol is held over a central part of the barcode reading window 10 at a short distance from the glass surface of the barcode reading window 10, an image of the barcode with a large size is situated at the center of an image picked up by the camera 23. Therefore, the decoder 25 can accurately decode the barcode data from the picked-up image. However, when the barcode symbol is held away from the glass surface of the barcode reading window 10, the image of the barcode is small relative to the picked-up image. When the barcode symbol is held out of the central part of the barcode reading window 10, the image of the barcode is situated at an edge of the picked-up image. Therefore, the decoder 25 may not be able to accurately decode the barcode data from the picked-up image.
The controller 21 determines whether the barcode data is successfully decoded by the decoder 25 (ACT 8). When the barcode data is successfully decoded (YES in ACT 8), the controller 21 fills the inside of the frame of the barcode candidate image area where the barcode data is successfully decoded, with a predetermined color (ACT 9). The controller 21 also outputs the barcode data decoded by the decoder 25 to the main CPU via the interface 26 (ACT 10). The main CPU performs registration of article information based on the barcode data inputted from the barcode reading apparatus 8.
Meanwhile, when the barcode data is not successfully decoded by the decoder 25 (NO in ACT 8), the controller 21 executes selection of a guide message (ACT 11). That is, the controller 21 determines a direction in which the decoding rate of the barcode becomes higher, based on the current barcode candidate image area.
As described above, when the image of the barcode with a large size is situated at the center of the image picked up by the camera 23, the decoding rate of the barcode data is high. However, even if the image of the barcode is situated at the center of the picked-up image, the decoding rate of the barcode data is low if the image of the barcode is small. Moreover, when the image of the barcode is situated at an edge of the picked-up image, the decoding rate of the barcode data is low.
Thus, the controller 21 determines a direction in which the decoding rate of the barcode becomes higher, based on the size of the barcode candidate image area relative to the picked-up image and the position of the barcode candidate image area in the picked-up image.
Once the direction in which the decoding rate of the barcode becomes higher is determined, the controller 21 selects a guide message to guide the barcode candidate image area in that direction from the message table 28. The controller 21 displays the selected guide message in the image display area 29 (ACT 12).
Here, the direction determination unit 33, which determines a direction in which the decoding rate of barcode data becomes higher based on the barcode candidate image area, is realized by the processing of ACT 11. The direction notification unit 34, which notifies of the direction determined by the direction determination unit 33, is realized by the processing of ACT 12.
When the barcode data is outputted to the main CPU in the processing of ACT 10 or the guide message is displayed in the image display area 29 in the processing of ACT 12, the controller 21 returns to the processing of ACT 1. The controller 21 then takes in the next frame image from the camera 23 and executes the processing of ACT 2 and the subsequent processing again.
Generally, the decoding rate of barcode data by the decoder 25 is high when a large barcode image is within the central area P. However, when the barcode image is small, the decoding rate of barcode data is lowered even if the barcode image is within the central area P. In the peripheral areas A to H, the decoding rate of barcode data is low because of the reduction in the quantity of light cast from the light source, the reduction in the quantity of light passing through the optical system including the lens, distortion of the image, and the like. Thus, the barcode reading apparatus 8 has the message table 28 having the data content shown in
That is, the message table 28 stores, as data of message number “1”, a guide message “Move the article closer to the glass surface” to guide the user to move the barcode closer to the glass surface of the barcode reading window 10 in order to raise the decoding rate of the barcode candidate image. The message table 28 also stores, as data of message numbers “2” to “9”, guide messages “Move the article to XX (direction)” to guide the user to move each of the barcode candidate image areas situated in the peripheral areas A to H into the central area P of the picked-up image, together with information of the peripheral areas A to H.
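The data content of the message table 28 can be sketched as a simple mapping. Only the wording of message number "1" is quoted above; the embodiment abbreviates the texts for numbers "2" to "9" as "Move the article to XX (direction)", so the entries below carry that placeholder, and the pairing of numbers with the peripheral areas A to H follows the order given above.

```python
# Sketch of the data content of the message table 28. The wording of
# messages "2" to "9" is a placeholder, as the embodiment abbreviates
# the concrete directions.

MESSAGE_TABLE = {
    1: {'area': 'P', 'text': 'Move the article closer to the glass surface'},
    **{number: {'area': area, 'text': 'Move the article to XX (direction)'}
       for number, area in zip(range(2, 10), 'ABCDEFGH')},
}
```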
In the processing of ACT 21, the controller 21 detects which of the divided areas A to H and P of the picked-up image the barcode candidate image area exists in. When the barcode candidate image area covers plural divided areas, the controller 21 detects the divided area that contains the largest portion of the barcode candidate image area.
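The detection of ACT 21 can be sketched as follows, assuming the picked-up image is divided into a 3x3 grid whose center cell is the central area P and whose eight surrounding cells are the peripheral areas A to H; the grid layout, the row-by-row labeling, and the pixel coordinates (origin at the top left) are assumptions.

```python
# Sketch of ACT 21: find the divided area that contains the largest
# portion of the barcode candidate image area. The 3x3 layout with the
# peripheral areas labeled A to H row by row is an assumption.

AREAS = [['A', 'B', 'C'],
         ['D', 'P', 'E'],
         ['F', 'G', 'H']]

def locate_candidate(x, y, w, h, img_w, img_h):
    """Return the divided area holding the largest part of the
    candidate rectangle (x, y, w, h), in pixel coordinates."""
    overlaps = {}
    for row in range(3):
        for col in range(3):
            # Bounds of this divided area.
            ax0, ax1 = col * img_w / 3, (col + 1) * img_w / 3
            ay0, ay1 = row * img_h / 3, (row + 1) * img_h / 3
            # Rectangle overlap between the candidate and the area.
            ow = max(0.0, min(x + w, ax1) - max(x, ax0))
            oh = max(0.0, min(y + h, ay1) - max(y, ay0))
            overlaps[AREAS[row][col]] = ow * oh
    return max(overlaps, key=overlaps.get)
```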
When the barcode candidate image area exists within the central area P (YES in ACT 22), the controller 21 selects the guide message data of message number “1” from the message table 28 (ACT 23).
Meanwhile, when the barcode candidate image area exists within peripheral areas A to H (NO in ACT 22), the controller 21 selects from the message table 28 the guide message data of message numbers “2” to “9” corresponding to the peripheral areas A to H where the barcode candidate image area exists.
That is, when the barcode candidate image area exists in the peripheral area A (YES in ACT 24), the controller 21 selects the guide message data of message number “2” (ACT 25). When the barcode candidate image area exists in the peripheral area B (YES in ACT 26), the controller 21 selects the guide message data of message number “3” (ACT 27). When the barcode candidate image area exists in the peripheral area C (YES in ACT 28), the controller 21 selects the guide message data of message number “4” (ACT 29). When the barcode candidate image area exists in the peripheral area D (YES in ACT 30), the controller 21 selects the guide message data of message number “5” (ACT 31). When the barcode candidate image area exists in the peripheral area E (YES in ACT 32), the controller 21 selects the guide message data of message number “6” (ACT 33). When the barcode candidate image area exists in the peripheral area F (YES in ACT 34), the controller 21 selects the guide message data of message number “7” (ACT 35). When the barcode candidate image area exists in the peripheral area G (YES in ACT 36), the controller 21 selects the guide message data of message number “8” (ACT 37). When the barcode candidate image area exists in the peripheral area H (NO in ACT 22 to ACT 36), the controller 21 selects the guide message data of message number “9” (ACT 38).
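The area-by-area branching of ACTs 22 to 38 amounts to a lookup from the detected divided area to a message number; a minimal sketch, assuming the area names P and A to H from the description above:

```python
# Sketch of the guide message selection of ACTs 22 to 38 as a lookup
# table: central area P maps to message "1", peripheral areas A to H
# map to messages "2" to "9" in order.

AREA_TO_MESSAGE_NUMBER = {'P': 1, 'A': 2, 'B': 3, 'C': 4, 'D': 5,
                          'E': 6, 'F': 7, 'G': 8, 'H': 9}

def select_guide_message_number(detected_area):
    """detected_area: 'P' or one of 'A' to 'H' from the ACT 21 detection."""
    return AREA_TO_MESSAGE_NUMBER[detected_area]
```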
The one guide message thus selected is displayed in the image display area 29 by the processing of ACT 12. Then, the guide message selection ends.
The self-scanning type checkout terminal 1 having the barcode reading apparatus 8 of such configuration is installed, for example, at a checkout counter of a supermarket. A customer who carries out accounting of purchased articles using the checkout terminal 1 first places articles with unregistered article information on the receiving surface 2a of the unregistered article placing table 2. Next, the customer takes out the articles one by one from the receiving surface 2a and holds the barcode symbol attached to the articles over the barcode reading window 10.
When the barcode data is consequently read by the barcode reading apparatus 8, a reading completion sound is outputted from the buzzer 27. Also, the registration information of the articles is displayed in the details display section 41 of the article registration waiting screen 40 displayed on the display device 6. Then, the customer puts the articles which the customer holds in hand into a shopping bag spread on the receiving surface 3a of the registered article placing table 3.
Meanwhile, when the barcode data is not read even if the barcode symbol of the article is held over the barcode reading window 10, a mirror image of the picked-up image and a predetermined guide message are displayed in the image display area 29 of the article registration waiting screen 40.
In the example of
Thus, the operator moving the article according to the guide message can recognize that the barcode data is read, as the barcode candidate image area 52 is filled with a predetermined color.
In this manner, simply by the operator moving an article held over the barcode reading window 10 according to a guide message, the data of the barcode symbol attached to the article is securely read. Therefore, even when the operator is a customer who is unfamiliar with the operation of the self-scanning type checkout terminal 1, the operator can adjust the direction and position of the barcode symbol in a short time so that the barcode data can be securely read.
Moreover, a guide message is displayed on the display device 6 together with a picked-up image in the form of a mirror image in the image display area 29 provided substantially immediately above the barcode reading window 10. Therefore, the operation is easy since it suffices for the operator to move the article toward the center of the image display area 29 while viewing the picked-up image displayed in the image display area 29. Thus, the time required for reading the barcode data can be reduced and processing efficiency can be improved. Moreover, stress on the operator can be reduced, too.
The invention is not limited to the embodiment. In practical implementations, components can be embodied in modified manners without departing from the scope of the invention.
In the embodiment, the direction notification unit 34 displays a guide message and thereby notifies of the direction in which the decoding rate of the code symbol becomes higher, based on an image area detected by the candidate area detection unit. However, the direction notification unit 34 is not limited to this example. For example, the direction notification unit 34 may notify of a guide message via an audio guide using a voice synthesizer. In this case, the image display unit 31 which displays an image picked up by the camera 23 on the display device 6 is not necessarily required.
In the embodiment, the invention is applied to the barcode reading apparatus 8 in the self-scanning type checkout terminal 1. However, the application target of the invention is not limited to this example. The invention can also be applied to a reading apparatus for other code symbols than barcodes, for example, two-dimensional data codes.
Moreover, in the embodiment, it is assumed that the control program to realize the functions of the invention is recorded in advance in the program storage unit 22 within the apparatus. However, without being limited to this example, a similar program may be downloaded to the apparatus from a network. Alternatively, a similar program recorded in a recording medium may be installed in the apparatus. The recording medium may be of any form as long as the recording medium can store a program and can be read by the apparatus, like CD-ROM. The functions that can be acquired by installation or download of the program may also be realized in cooperation with the operating system (OS) within the apparatus or the like.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---|
2010-030541 | Feb 2010 | JP | national |