This application is based upon and claims the benefit of priority from the prior Japanese Patent Application No. 2017-150754, filed on Aug. 3, 2017, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to a dimension measurement apparatus.
When a transportation object is accepted in a home delivery business or the like, work is performed in which the weight of the transportation object is measured with a scale, the length, width, and height of the transportation object are each measured with a tape measure or the like, and a transportation charge is determined based on the combination of the measured weight and dimensions. For this reason, a home delivery agent has to perform the dimension measurement and the weight measurement separately on the transportation object, which results in poor working efficiency.
In contrast, an apparatus is conceivable which photographs a transportation object using a distance image sensor (camera) to acquire a distance image, and measures dimensions of the transportation object based on the acquired distance image. Using such an apparatus makes it possible to reduce the work of manually measuring the dimensions and the weight of the transportation object.
On the other hand, when measuring the dimensions of a transportation object at an acceptance site, the measurement must be completed quickly and accurately. However, to enable quick and accurate measurement, dimension measurement based on the distance image requires an arithmetic unit with high processing performance, which might increase the installation cost. Accordingly, it is desirable to reduce the processing load of the dimension measurement based on the distance image so that the measurement can be completed quickly and accurately without increasing the installation cost.
According to one embodiment, a dimension measurement apparatus has a camera and a processing device. The camera photographs an object to be measured to generate a distance image of the object to be measured. The processing device has a memory and a controller, and generates dimension data indicating a length, a width, and a height of the object to be measured based on the distance image generated by the camera. The memory stores a control program for generating the dimension data. The controller acquires the distance image of the object to be measured. The controller divides the acquired distance image into an X coordinate image, a Y coordinate image, and a Z coordinate image in a three-dimensional space. The controller removes, in each of the divided X coordinate image, Y coordinate image, and Z coordinate image, coordinate points not corresponding to one reference surface of the object to be measured, to detect an X coordinate image, a Y coordinate image, and a Z coordinate image of the reference surface. Further, the controller generates the dimension data indicating the length, the width, and the height of the object to be measured, based on the respective detected coordinate images of the reference surface.
Hereinafter, the present embodiment will be described with reference to the drawings. In the drawings, the same symbols indicate the same or similar portions.
The dimension measurement apparatus 10 is installed and used at an acceptance place for transportation objects OB of a home delivery agent, for example. The dimension measurement apparatus 10 measures the dimensions of the length, the width, and the height (depth, width, and height) and the weight of the transportation object OB, in order to determine a transportation charge for the transportation object OB.
As shown in
The distance image generated by the camera 22 is expressed by point group data (an XYZ coordinate image) including the positions (XYZ coordinates) of the respective pixels in an XYZ coordinate space. The XYZ coordinate space is defined by an orthogonal coordinate system referenced to an origin set at an arbitrary position within the space photographed by the camera 22, for example. In
The camera 22 may be, for example, a stereo camera which outputs a distance image based on the parallax between images captured by two cameras, or a distance image camera (sensor) of a TOF (Time Of Flight) system which measures distance from the time required for projected laser light to travel to the photographic subject and back. In addition, the camera 22 may be a camera which generates a distance image by another system.
In addition, the weight measurement device 24 is provided in the measurement area 18. The weight measurement device 24 measures a weight of the transportation object OB placed in the measurement area 18.
The processing device 20 inputs (acquires) the distance image (point group data) generated by the camera 22. The processing device 20 executes dimension measurement processing to generate dimension data indicating the dimensions of the length, the width, and the height of the transportation object OB. In addition, the processing device 20 inputs the weight data measured by the weight measurement device 24. The processing device 20 executes processing to calculate a transportation charge for the transportation object OB, using the inputted weight data and the generated dimension data.
The controller 20A is a CPU (Central Processing Unit), for example. Hereinafter, the controller 20A may be called the CPU 20A. The CPU 20A executes a control program to control the whole dimension measurement apparatus 10. The control program includes a dimension measurement program and so on. The CPU 20A executes the dimension measurement program to realize the function modules shown in the block diagram of
The distance image acquisition module 30 acquires the point group data (also called the XYZ coordinate image) composed of the positions (XYZ coordinates) of the respective pixels, in the XYZ coordinate space, of the distance image of the transportation object OB photographed by the camera 22. The image processing module 40 generates upper surface coordinate images corresponding to a reference surface of the transportation object OB, based on the distance image (XYZ coordinate image). In the present embodiment, the upper surface of the transportation object OB serves as the reference surface. The image processing module 40 divides the distance image into an X coordinate image, a Y coordinate image, and a Z coordinate image, and removes coordinate points not corresponding to the upper surface (reference surface) of the transportation object OB in the respective coordinate images, to detect the respective upper surface coordinate images corresponding to the upper surface (refer to
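For illustration only (not part of the claimed embodiment), the division into coordinate images and the removal of non-reference-surface points might be sketched as follows, assuming the distance image arrives as an H x W x 3 array holding the (X, Y, Z) coordinate of each pixel — the array layout and the NaN marker for removed points are assumptions of this sketch.

```python
import numpy as np

def split_coordinate_images(point_cloud):
    """Divide an H x W x 3 XYZ coordinate image into single-channel
    X, Y, and Z coordinate images (the channel order is an assumption)."""
    return point_cloud[..., 0], point_cloud[..., 1], point_cloud[..., 2]

def mask_to_surface(coord_image, surface_mask):
    """Remove coordinate points not on the reference surface by marking
    them invalid (NaN), yielding an upper surface coordinate image."""
    return np.where(surface_mask, coord_image, np.nan)
```

Each of the three coordinate images can then be cleaned independently, which is the point of the per-coordinate processing described below.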
The memory 20B stores various data associated with the execution of various processings, in addition to the respective control programs to be executed by the CPU 20A. The storage device 20C is a nonvolatile storage medium (a hard disk or the like), and stores various programs and data.
The input device 20D inputs an instruction for controlling an operation of the dimension measurement apparatus 10. The input device 20D includes a touch panel, a keyboard, a button and so on, for example. The input device 20D detects an input of an instruction to the touch panel, the keyboard, the button and so on, and outputs (notifies) the instruction to the CPU 20A. For example, the input device 20D is installed (not shown) in the vicinity of the measurement table 12 shown in
The display 20E displays an operation state and a processing result of the dimension measurement apparatus 10 under the control of the CPU 20A. The display 20E is installed (not shown) in the vicinity of the measurement table 12, for example, and presents the operation state and the processing result to a home delivery agent (receptionist) working at the measurement table 12, or a customer. The printer 20F prints a charge and so on determined based on the measured dimensions and weight of the transportation object OB.
The input/output interface 20G is an interface to which the camera 22 and the weight measurement device 24 are to be connected. Another external device may be connected to the input/output interface 20G.
The X coordinate image processing module 40x removes X coordinate points in ranges not corresponding to the upper surface of the transportation object OB from the X coordinate image, to generate an upper surface X coordinate image. To do so, the X coordinate image processing module 40x executes, for example, the respective processings of X coordinate image generation, existence range limitation, smoothing, Z range limitation, closing, and x range limitation.
The Y coordinate image processing module 40y removes Y coordinate points in ranges not corresponding to the upper surface of the transportation object OB from the Y coordinate image, to generate an upper surface Y coordinate image. To do so, the Y coordinate image processing module 40y executes, for example, the respective processings of Y coordinate image generation, existence range limitation, smoothing, Z range limitation, closing, and y range limitation.
The Z coordinate image processing module 40z removes Z coordinate points in ranges not corresponding to the upper surface of the transportation object OB from the Z coordinate image, to generate an upper surface Z coordinate image. To do so, the Z coordinate image processing module 40z executes, for example, the respective processings of Z coordinate image generation, existence range limitation, smoothing, Z range limitation, closing, and narrow region exclusion.
By the processing of the existence range limitation, each of the X coordinate image processing module 40x, the Y coordinate image processing module 40y, and the Z coordinate image processing module 40z limits the object to be processed to the range to be measured in the distance image generated by the camera 22 (corresponding, for example, to the range of the measurement area 18). Since each of the X coordinate image processing module 40x, the Y coordinate image processing module 40y, and the Z coordinate image processing module 40z processes not the three-dimensional coordinate data but only the coordinate data of its own coordinate system, the processing procedure is simplified and the processing efficiency can be improved.
In addition, by the processing of the Z range limitation, each of the X coordinate image processing module 40x, the Y coordinate image processing module 40y, and the Z coordinate image processing module 40z limits the object to be processed to the coordinate values of pixels having Z coordinate values in the range that can correspond to the transportation object OB. For example, in the dimension measurement apparatus 10, the coordinate value of a pixel having a Z coordinate value exceeding the upper limit for a transportation object OB whose dimensions are to be measured is removed from the object to be processed.
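As a rough sketch of the Z range limitation (for illustration only), assuming the camera looks straight down so that a pixel's Z value is its distance from the camera, pixels are kept only if their Z value lies between the bottom surface and the tallest measurable object; `bottom_z` and `max_height` are illustrative parameters, not values from the document.

```python
import numpy as np

def z_range_limitation(z_image, bottom_z, max_height):
    """Return a boolean mask of pixels whose Z coordinate (distance from
    the camera) can belong to the object: no farther than the bottom
    surface, no nearer than the tallest measurable object's top."""
    return (z_image <= bottom_z) & (z_image >= bottom_z - max_height)
```

The resulting mask is what each coordinate image processing module would then apply to its own coordinate image.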
The X coordinate image processing module 40x, the Y coordinate image processing module 40y, and the Z coordinate image processing module 40z respectively perform the processing of the Z range limitation on the X coordinate image, the Y coordinate image, and the Z coordinate image corresponding to the XYZ coordinate image shown in
In addition, each of the X coordinate image processing module 40x, the Y coordinate image processing module 40y, and the Z coordinate image processing module 40z removes noise in the relevant coordinate image, such as isolated points and thin lines, by the closing processing.
For example, the X coordinate image processing module 40x executes the closing processing on the X coordinate image shown in
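Morphological closing is dilation followed by erosion. The sketch below (illustrative only, not the document's implementation) applies it to a boolean mask of upper-surface pixels with a 4-neighborhood structuring element; a real implementation would likely use a library routine and a tuned structuring element, and whether closing is applied to the valid-pixel mask or its complement depends on which artifacts are being removed.

```python
import numpy as np

def dilate(mask):
    """Binary dilation with a 4-neighborhood (cross) structuring element."""
    out = mask.copy()
    out[1:, :] |= mask[:-1, :]
    out[:-1, :] |= mask[1:, :]
    out[:, 1:] |= mask[:, :-1]
    out[:, :-1] |= mask[:, 1:]
    return out

def erode(mask):
    """Binary erosion, the dual of dilation (out-of-bounds treated as True)."""
    return ~dilate(~mask)

def closing(mask):
    """Dilation followed by erosion: fills pinholes and thin gaps in the
    detected surface region without greatly changing its overall extent."""
    return erode(dilate(mask))
```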
The X coordinate image processing module 40x limits the object to be processed to X coordinate data in the range corresponding to the upper surface of the transportation object OB, by the processing of the x range limitation. In addition, the Y coordinate image processing module 40y limits the object to be processed to Y coordinate data in the range corresponding to the upper surface of the transportation object OB, by the processing of the y range limitation. The Z coordinate image processing module 40z, when a Z coordinate data group (point group data) indicating a narrow region determined in advance not to correspond to the upper surface of the transportation object OB is present, removes the relevant coordinate data by the processing of the narrow region exclusion.
Regarding a pixel position whose coordinate point has been removed as not corresponding to the upper surface in any one of the X coordinate image processing module 40x, the Y coordinate image processing module 40y, and the Z coordinate image processing module 40z, the image processing module 40 in the present embodiment makes the other coordinate image processing modules operate so as to exclude that pixel position from their coordinate point removal processing. For example, a pixel position removed by the processing based on the Z coordinate image in the Z coordinate image processing module 40z is placed outside the object to be processed in the X coordinate image processing module 40x and the Y coordinate image processing module 40y, since the relevant pixel position would have to be removed from the X coordinate image and the Y coordinate image as well. When the coordinate value of one pixel is expressed by XYZ coordinates, the pixel cannot be discriminated as an object to be removed unless it is examined in each of the X coordinate, the Y coordinate, and the Z coordinate. However, at the point when the pixel is discriminated as an object to be removed in any one of the X coordinate, the Y coordinate, and the Z coordinate, the image processing module 40 can discriminate the relevant pixel as an object to be removed. By this means, the whole processing load in the image processing module 40 can be reduced.
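This early-exit removal can be pictured as a shared validity mask: once one coordinate image processing module invalidates a pixel, the other modules never examine it again. The sketch below is a simplification for illustration; the actual bookkeeping between the modules is not specified in this form.

```python
import numpy as np

def apply_removal(shared_valid, module_valid):
    """Intersect one module's per-pixel keep/remove decision into the
    shared mask; pixels already removed by another module stay removed."""
    shared_valid &= module_valid
    return shared_valid

def pixels_to_process(shared_valid):
    """Indices a later module still needs to examine: only valid pixels."""
    return np.flatnonzero(shared_valid)
```

Running the Z pass first and handing its survivors to the X and Y passes is one ordering consistent with the example in the text.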
In addition, the timings at which the processings in the X coordinate image processing module 40x, the Y coordinate image processing module 40y, and the Z coordinate image processing module 40z are respectively executed are not particularly limited. For example, the processings in the X coordinate image processing module 40x, the Y coordinate image processing module 40y, and the Z coordinate image processing module 40z may be executed in series; alternatively, after the processing in the Z coordinate image processing module 40z has been executed, the processing of the X coordinate image processing module 40x and the processing of the Y coordinate image processing module 40y may be executed in parallel. In addition, when any image processing module finishes the processing of a certain stage, the subsequent image processing modules may execute the processings using that result in parallel, in a pipeline manner.
The 3D modeling module 51 generates an upper surface 3D model indicating the upper surface of the transportation object OB, based on the upper surface X coordinate image, the upper surface Y coordinate image, and the upper surface Z coordinate image obtained by the processing of the image processing module 40. The minimum inclusion rectangular solid determination module 52 determines, based on the upper surface 3D model, a rectangular solid having the upper surface indicated by the upper surface 3D model, that is, a rectangular solid corresponding to the upper surface of the transportation object OB photographed by the camera 22.
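One generic way to obtain the smallest rectangle enclosing the upper surface points — from which the width and depth of the minimum inclusion rectangular solid follow — is a brute-force search over rotation angles. This is a sketch under that assumption, not the document's algorithm; the 1-degree step is an illustrative choice.

```python
import numpy as np

def min_enclosing_rectangle(xy):
    """Search rotations of 0-89 degrees for the orientation whose
    axis-aligned bounding box of the rotated points has minimum area;
    return the (shorter, longer) side lengths of that rectangle."""
    best_area, best_sides = None, None
    for deg in range(90):
        t = np.deg2rad(deg)
        rot = np.array([[np.cos(t), -np.sin(t)],
                        [np.sin(t),  np.cos(t)]])
        p = xy @ rot.T
        extent = p.max(axis=0) - p.min(axis=0)
        area = extent[0] * extent[1]
        if best_area is None or area < best_area:
            best_area, best_sides = area, tuple(np.sort(extent))
    return best_sides
```

A production implementation would more likely use a rotating-calipers routine or a library function over the convex hull of the points.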
The histogram distribution generation module 56 generates a histogram distribution indicating the number of pixels for each Z coordinate value, from the upper surface Z coordinate image obtained by the processing of the image processing module 40.
The distance determination module 57 determines the distance from the camera 22 to the position regarded as the height of the upper surface of the transportation object OB, based on the histogram distribution of the upper surface Z coordinate image generated by the histogram distribution generation module 56, determines the dimension of the height of the transportation object OB from that distance, and outputs height data. The distance determination module 57 determines the distance from the camera 22 to the position regarded as the height of the upper surface of the transportation object OB, based on a maximum value (MaxPZ) of a peak zone of the histogram distribution of the upper surface Z coordinate image, a minimum value (MinPZ) of the peak zone, and previously stored distance data indicating the distance from the camera 22 to the bottom surface of the transportation object OB (the upper surface of the measurement area 18).
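The height determination can be illustrated roughly as follows. The bin width, the 10% threshold for growing the peak zone, and the use of the zone midpoint as the surface distance are all assumptions of this sketch; the document states only that MaxPZ, MinPZ, and the stored camera-to-bottom distance are used.

```python
import numpy as np

def height_from_histogram(upper_z, bottom_distance, bin_width=5.0):
    """Build a histogram of upper-surface Z values (distance from the
    camera), find its peak zone, and convert the zone's bounds
    (MinPZ, MaxPZ) into a height via the stored camera-to-bottom
    distance. Units (e.g. mm) are an assumption."""
    z = upper_z[~np.isnan(upper_z)]          # drop removed pixels
    edges = np.arange(z.min(), z.max() + 2 * bin_width, bin_width)
    counts, edges = np.histogram(z, bins=edges)
    peak = int(np.argmax(counts))
    lo = hi = peak
    thresh = counts[peak] * 0.1              # assumed zone-growing rule
    while lo > 0 and counts[lo - 1] >= thresh:
        lo -= 1
    while hi < len(counts) - 1 and counts[hi + 1] >= thresh:
        hi += 1
    min_pz, max_pz = edges[lo], edges[hi + 1]
    surface_distance = (min_pz + max_pz) / 2.0   # assumed: zone midpoint
    return bottom_distance - surface_distance
```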
The upper surface of the transportation object OB is not a plane, due to an expansion, a dent, an appendage or the like as described above; however, as shown in
Next, an operation of the dimension measurement apparatus 10 in the present embodiment will be described. For example, in order to determine a transportation charge for the transportation object OB, a dimension measurement and a weight measurement of the transportation object OB are performed using the dimension measurement apparatus 10. The transportation object OB is placed inside the measurement area 18 provided on the upper surface of the measurement table 12. When the start of the dimension measurement is instructed by an operation on the input device 20D, the processing device 20 instructs the camera 22 to photograph the transportation object OB, and instructs the weight measurement device 24 to execute the weight measurement.
The processing device 20 inputs the distance image (point group data) generated by the camera 22; the image processing module 40 divides the distance image into the X coordinate image, the Y coordinate image, and the Z coordinate image, removes coordinate points not corresponding to the upper surface (reference surface) of the transportation object OB in the respective divided coordinate images, and thereby executes the processing to detect the respective upper surface coordinate images corresponding to the upper surface. The dimension data generation module 50 outputs the width data (lateral dimension data), the depth data (longitudinal dimension data), and the height data, as described above, by processing based on the result of the image processing by the image processing module 40.
The processing device 20 calculates a transportation charge based on the sum of the dimensions of the length, the width, and the height of the transportation object OB indicated by the data outputted from the dimension data generation module 50, the weight of the transportation object OB indicated by the weight data inputted from the weight measurement device 24, a delivery destination (transportation distance) separately inputted through the input device 20D or the like, and a transportation mode (service contents). The calculated transportation charge is printed at a prescribed position on an invoice by the printer 20F, for example.
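The charge determination might look like the following simplified lookup, where the size class is the sum of the three dimensions. All size limits, weight limits, and charge amounts below are invented placeholders, and real tariff tables also factor in the destination and transportation mode, which are omitted here.

```python
def transportation_charge(length, width, height, weight_kg):
    """Pick the smallest size class whose size (sum of the three
    dimensions, in cm) and weight limits both accommodate the object.
    All limits and charges are illustrative, not from the document."""
    size = length + width + height
    classes = [  # (size limit cm, weight limit kg, charge)
        (60, 2, 800),
        (80, 5, 1000),
        (100, 10, 1200),
        (120, 15, 1400),
    ]
    for size_limit, weight_limit, charge in classes:
        if size <= size_limit and weight_kg <= weight_limit:
            return charge
    return None  # oversize or overweight: handled outside this sketch
```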
Since the dimension measurement apparatus 10 in the present embodiment concurrently performs the dimension measurement and the weight measurement on the transportation object OB placed in the measurement area 18 in this manner, the working load for determining the transportation charge can be reduced. In the dimension measurement, the shape of the transportation object OB is specified and its dimensions are measured from only the distance image obtained by photographing the upper surface of the transportation object OB; accordingly, the dimensions can be measured simply and with high accuracy, without photographing the transportation object OB a plurality of times while changing the position and angle.
In addition, in the processing in the processing device 20, the distance image of the upper surface is divided into the X coordinate image, the Y coordinate image, and the Z coordinate image, and the processings are performed individually on the respective divided coordinate images, so that the processing load of the dimension measurement based on the distance image can be reduced. Accordingly, since an arithmetic unit with high processing performance is not required, an increase in the cost of the dimension measurement apparatus 10 can be avoided.
In addition, in the above description, the camera 22 is provided at a position immediately above the measurement area 18; however, since it suffices that at least the upper surface (reference surface) of the transportation object OB can be photographed by the camera 22, the upper surface of the transportation object OB may be photographed obliquely from above, for example.
Further, in the above description, the camera 22 is installed above the measurement area 18 so that the distance image using the upper surface of the transportation object OB (the object to be measured) as the reference surface is acquired; however, the transportation object OB may instead be photographed from the lateral direction or from below, so that a distance image using a side surface or the bottom surface as the reference surface is acquired. In this case, the camera 22 is installed at a position from which the side surface or the bottom surface of the transportation object OB can be photographed, and the transportation object OB is photographed from the lateral direction or from below. When the transportation object OB is photographed from the lateral direction, the transportation object OB is placed in the measurement area 18 with the side surface opposite to the surface to be photographed matched to a reference position (a wall formed vertically on the measurement area 18, for example). In addition, data indicating the distance from the camera 22 to the reference position (data corresponding to the above-described distance data BZ from the camera 22 to the bottom surface) is stored in advance, in the same manner as in the above-described case in which the measurement area 18 is installed on the measurement table 12. Similarly, when the transportation object OB is photographed from below, data indicating the distance from the camera 22 provided below the measurement area 18 to the upper surface of the transportation object OB is stored in advance (in this case, the placing surface of the measurement area 18 is formed of a transparent member).
In addition, in the above description, the dimension data of the transportation object OB is generated based on the distance image of one reference surface acquired by photographing the transportation object OB from one direction; however, the dimension data may be generated based on distance images of a plurality of reference surfaces photographed from a plurality of directions (for example, distance images obtained by photographing the upper surface and a side surface of the transportation object OB). For example, an average of the dimension data generated from the respective distance images may be used as the final dimension data, or any effective dimension data may be selected. By this means, it becomes possible to generate the dimension data with higher accuracy.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---|
2017-150754 | Aug 2017 | JP | national |