The technical field relates to a measuring apparatus and a measuring method, and specifically relates to a measuring apparatus and a measuring method for measuring the volume of boxes.
Generally speaking, shipping companies calculate their transportation fees based on the volume and the weight of the transported items, and warehousing companies calculate their storage fees based on the volume and the weight of the stored items. For the sake of such calculations, it is critical for these companies to effectively measure the volume and/or the weight of the processed items.
Some shipping companies have built a measuring system inside their facilities. The measuring system at least includes a conveyor for conveying the items, and also includes a measuring apparatus and a weight scale arranged in a measuring region. When an item is conveyed to the measuring region, the measuring system measures the size of the item, such as its length, width, and height, through the measuring apparatus, and also measures the weight of the item through the weight scale.
However, this kind of measuring system is usually bulky and hard to relocate, which is inconvenient for the staff of these companies.
Moreover, some logistics companies prefer not to consider the weight of the items in order to speed up their processing. To do so, they directly produce and sell multiple types of boxes, each having a pre-determined size (such as a big size, a middle size, a small size, etc.). When processing the items, these logistics companies directly charge the transportation fee and the processing fee according to the size of the box carrying the items. As a result, the measuring procedure can be omitted. However, the sizes of such boxes are limited, so it is hard to satisfy every user's demand.
Accordingly, a novel volume measuring approach is needed in the market for assisting the companies in measuring the items' volume quickly and conveniently.
The disclosure is directed to a volume measuring apparatus and a volume measuring method for a box, which generate a depth graph containing a target box based on the images respectively obtained by two cameras, and then compute volume related data of the target box directly according to the depth graph.
In one of the exemplary embodiments, the volume measuring apparatus of the present invention includes a first camera, a second camera, a structure light emitting unit, and a processing unit. When the processing unit is triggered, it controls the structure light emitting unit to emit invisible structure light onto a target box, controls the first camera and the second camera to respectively capture a left image and a right image both containing an image of the target box, and then generates a depth graph based on the left image and the right image. The processing unit scans the depth graph through multiple scanning lines for retrieving a middle line, a bottom line, a left-side line, and a right-side line of the target box in the depth graph, and scans within a range of the middle line, the bottom line, the left-side line, and the right-side line through the multiple scanning lines for obtaining multiple widths, heights, and lengths of the target box. Therefore, the processing unit may compute volume related data of the target box based on the multiple widths, heights, and lengths.
The present invention captures the images of a target box through two separated cameras arranged on a portable volume measuring apparatus, and directly computes the volume related data of the target box based on the images. In comparison with related arts, the volume measuring apparatus of the present invention may assist the staff of warehousing companies and shipping companies in measuring the volume of different kinds of boxes quickly and conveniently.
In cooperation with the attached drawings, the technical contents and detailed description of the present invention are described hereinafter according to multiple embodiments, which are not intended to limit its scope. Any equivalent variation and modification made according to the appended claims is covered by the claims of the present invention.
The present invention discloses a volume measuring apparatus for measuring the volume of a box, which may be used to precisely measure volume related data, such as width information, height information, length information, etc., of a rectangular box. By using the volume measuring apparatus of the present invention, it is convenient for the users to record and to calculate (or compute) the volume of the box, as well as to calculate the processing costs of the box, such as the packaging fee, the transportation fee, etc.
Please refer to
As disclosed in
As shown in
One of the technical features of the present invention is that, when the left image and the right image are obtained, the processing unit 10 performs a calculation (or computation) based on the left image and the right image (and the reference information) through a depth transforming algorithm to generate a corresponding depth graph. The depth graph at least contains depth information of the target box and depth information of the background around the target box (as shown in
While performing the first scanning procedure, the processing unit 10 uses the multiple vertical scanning lines and the multiple horizontal scanning lines to statistically gather the depth information of each point in the depth graph, distinguishes the target box from the background according to the depth differences between adjacent points, and likewise distinguishes the boundaries and faces of the target box according to those depth differences. Therefore, the processing unit 10 may obtain the middle line, the bottom line, the left-side line, and the right-side line of the target box. After obtaining the middle line, the bottom line, the left-side line, and the right-side line, the processing unit 10 may determine the specific position of the target box in the depth graph.
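For illustration only, a minimal sketch of such a first scanning procedure, assuming the depth graph is a 2-D NumPy array of per-point depths in meters; the threshold value, the function name, and the use of the outermost depth jumps as the four boundary lines are assumptions for this sketch, not details taken from the disclosure:

```python
import numpy as np

def find_box_bounds(depth, jump=0.15):
    """Locate candidate boundary lines of the box in a depth graph.

    Walks every column (a vertical scanning line) and every row
    (a horizontal scanning line), and treats adjacent points whose
    depths differ by more than `jump` meters as a boundary between
    the box and the background.
    """
    dv = np.abs(np.diff(depth, axis=0))  # depth jumps along vertical lines
    dh = np.abs(np.diff(depth, axis=1))  # depth jumps along horizontal lines

    jump_rows = np.where(dv > jump)[0]   # rows containing a vertical jump
    jump_cols = np.where(dh > jump)[1]   # columns containing a horizontal jump
    if jump_rows.size == 0 or jump_cols.size == 0:
        return None  # nothing distinguishable from the background

    return {
        "middle": int(jump_rows.min()),  # top-most horizontal boundary
        "bottom": int(jump_rows.max()),  # bottom-most horizontal boundary
        "left":   int(jump_cols.min()),  # left-side line
        "right":  int(jump_cols.max()),  # right-side line
    }
```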
The processing unit 10 may perform a second scanning procedure on the target box in the depth graph, so as to obtain the volume related data such as a width, a height, a length, etc. of the target box. In particular, while performing the second scanning procedure, the processing unit 10 defines (or uses) the middle line, the bottom line, the left-side line, and the right-side line as a scanning range, and directly scans the target box in the depth graph within the scanning range through the multiple scanning lines (including the vertical scanning lines and the horizontal scanning lines), so as to obtain multiple width information, height information, and length information (as shown in
The processing unit 10 may compute (or calculate) the volume related data, such as an actual width, an actual height, and an actual length, of the target box respectively according to the multiple width information, the multiple height information, and the multiple length information. For example, the processing unit 10 may calculate an average of the multiple width information for obtaining the actual width of the target box, calculate an average of the multiple height information for obtaining the actual height of the target box, and calculate an average of the multiple length information for obtaining the actual length of the target box. The above description is only one of the exemplary embodiments of the present invention, which is not limited thereto.
It should be noted that the measuring apparatus 1 of the present invention may further include a structure light emitting unit 14 electrically connected with the processing unit 10. In the embodiment, the structure light emitting unit 14, the first camera 11, and the second camera 12 are arranged on the same side (or surface) of the shell 2 (such as the first surface of the shell 2). The structure light emitting unit 14 is a light emitter which may be regarded as a light source and may provide light to an external environment of the measuring apparatus 1. In one of the exemplary embodiments, when the structure light emitting unit 14 is triggered (for example, triggered by a triggering unit 13, or activated by the processing unit 10), the structure light emitting unit 14 emits invisible structure light to the external environment, and one or multiple sets of reference pattern may be formed by the emitted structure light. With respect to the one or multiple sets of reference pattern, the processing unit 10 may generate the depth graph quickly and accurately.
Please refer to
In this embodiment, the processing unit 10 performs a computation (or calculation) based on the images of the target box 3 and the reference patterns 141 in the left image 41 and the right image 42 through the depth transforming algorithm, so as to generate a depth graph 5 corresponding to the target box 3.
In particular, the one or multiple sets of reference pattern 141 are constituted by multiple identifiable elements such as points, shapes, pictures, texts, symbols, etc. While generating the depth graph 5, the processing unit 10 searches for the same element in the left image 41 and the right image 42, and calculates the corresponding depth information of each element according to the position difference of that element between the left image 41 and the right image 42. The processing unit 10 may then generate the depth graph 5 according to the corresponding depth information of each element. As a result, the depth graph 5 in this embodiment may be more accurate than a depth graph generated without the reference pattern 141.
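This element-matching step amounts to stereo triangulation: the nearer an element is, the larger its horizontal position difference (disparity) between the two images. A minimal sketch under a pinhole-camera assumption; the focal length and camera baseline values here are illustrative, not parameters from the disclosure:

```python
def depth_from_disparity(x_left, x_right, focal_px=900.0, baseline_m=0.06):
    """Triangulate the depth of one matched pattern element.

    x_left / x_right are the element's horizontal pixel coordinates in
    the left and right images; the same element appears shifted between
    the two views, and the shift (disparity) shrinks with distance.
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("element must appear further left in the left image")
    return focal_px * baseline_m / disparity  # depth in meters

# Example: an element at x=640 in the left image and x=604 in the right
# image (36 px of disparity) lies roughly 1.5 m from the cameras.
print(depth_from_disparity(640, 604))  # ~1.5
```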
As shown in
In more detail, when the triggering unit 13 receives an external operation from the user (for example, being pressed by the user), the processing unit 10 is triggered to control the structure light emitting unit 14 to emit the invisible structure light for forming the reference pattern 141. The processing unit 10 is also triggered by the triggering unit 13 to control the first camera 11 and the second camera 12 to respectively capture the left image 41 and the right image 42. Hence, the processing unit 10 may perform the aforementioned computation (or calculation) based on the left image 41 and the right image 42 for obtaining the volume related data of the target box contained in the left image 41 and the right image 42.
As shown in
Please refer to
More specifically, a manufacturer of the measuring apparatus 1 may preset parameters of the measuring apparatus 1 during manufacturing, such that parameters like the focal length, FoV, and resolution of the first camera 11 and the second camera 12 may be related to the size and shape of the guiding object 151. When the user moves while holding the measuring apparatus 1, the user may align a horizontal line of the guiding object 151 emitted by the guiding unit 15 with the middle line of the target box 3, and may adjust the distance between the measuring apparatus 1 and the target box 3 so that the horizontal line of the guiding object 151 is equal to or greater in length than the width of the target box 3, thereby locating the target box 3 within an effective image capturing range of the first camera 11 and the second camera 12. In this situation, the first camera 11 and the second camera 12 may respectively capture an effective left image and an effective right image.
In another embodiment, the processing unit 10 may use the relation between the guiding object 151 and the target box 3 as a triggering basis. In particular, the processing unit 10 in this embodiment may control the guiding unit 15 to emit the guiding object 151, and continually determine whether the horizontal line of the guiding object 151 is aligned with the middle line of the target box 3, and whether the horizontal line of the guiding object 151 is equal to or greater in length than the width of the target box 3. The processing unit 10 in this embodiment may automatically control the structure light emitting unit 14 to emit the reference pattern 141 and control the first camera 11 and the second camera 12 to capture the left image and the right image when the horizontal line of the guiding object 151 is aligned with the middle line of the target box 3 and is equal to or greater in length than the width of the target box 3. The above description is only one of the exemplary embodiments of the present invention, which is not limited thereto.
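A sketch of this automatic triggering loop, for illustration only; the `detect_guide_line` and `detect_box_middle_line` helpers and the pixel tolerance are hypothetical stand-ins for detection logic the disclosure leaves unspecified:

```python
import time

def auto_capture(apparatus, y_tolerance_px=3):
    """Poll until the guiding object lines up with the target box,
    then fire the structure light and both cameras automatically."""
    apparatus.guiding_unit.emit()                    # project guiding object 151
    while True:
        guide = apparatus.detect_guide_line()        # hypothetical detector
        middle = apparatus.detect_box_middle_line()  # hypothetical detector
        if guide is not None and middle is not None:
            aligned = abs(guide.y - middle.y) <= y_tolerance_px
            wide_enough = guide.length >= middle.length  # covers the box width
            if aligned and wide_enough:
                apparatus.structure_light.emit()     # form reference pattern 141
                return (apparatus.camera_left.capture(),
                        apparatus.camera_right.capture())
        time.sleep(0.05)                             # continue checking
```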
As shown in
As shown in
As shown in
As shown in
As shown in
It should be noted that, in addition to the above procedures, the measuring method of the present invention may further include a data verifying procedure optionally executed by the processing unit 10 for verifying the volume related data (for example, the processing unit 10 may calculate the volume related data of the target box 3 multiple times, and compare the similarity of the multiple results in order to verify the accuracy of the volume related data), a data displaying procedure for displaying the calculated volume related data (for example, displaying the volume related data through the displaying unit 16 or the external electronic device), a data printing procedure for printing the volume related data (for example, printing the volume related data through the printing unit 18), etc., but is not limited thereto.
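The verifying procedure could be as simple as repeating the measurement and checking that the results agree within a tolerance. A minimal sketch under that assumption; the run count and tolerance are illustrative:

```python
def verify(measure, runs=3, tolerance=0.01):
    """Repeat the volume measurement and compare the results.

    `measure` returns a (width, height, length) tuple in meters; the
    data is accepted only if every dimension stays within `tolerance`
    meters across all runs.
    """
    results = [measure() for _ in range(runs)]
    for dim in range(3):
        values = [r[dim] for r in results]
        if max(values) - min(values) > tolerance:
            return None  # verification failed
    # Return the per-dimension averages as the verified data.
    return tuple(sum(r[dim] for r in results) / runs for dim in range(3))
```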
In particular, the measuring method of the present invention relates to a technical solution which determines the volume related data of the target box 3 based on the depth information of the captured images. In the embodiment, the aforementioned image obtaining procedure generates the depth graph 5 as a calculation basis for the processing unit 10, the aforementioned position pre-determining procedure determines whether a rectangular object is located in the depth graph 5 and retrieves multiple boundary lines of the rectangular object when such an object is present, and the measuring procedure calculates the volume related data, including the width, the height, and the length of the rectangular object, based on the multiple boundary lines of the rectangular object.
Please refer to
In one embodiment, the processing unit 10 may control the structure light emitting unit 14 to emit invisible structure light to the external environment after being triggered, and the reference pattern 141 is formed on the target box 3 through the invisible structure light. The processing unit 10 controls the first camera 11 and the second camera 12 to respectively capture the left image 41 and the right image 42. In other words, both the left image 41 and the right image 42 in this embodiment contain the image of the entire target box 3 and the image of the reference pattern 141 around the target box 3.
The processing unit 10 performs a calculation on the left image 41 and the right image 42 through a depth transforming algorithm to correspondingly generate a depth graph 5 (step S12). As shown in
In the position pre-determining procedure (step S30), the processing unit 10 generates multiple scanning lines through executing an algorithm (such as the depth transforming algorithm). The multiple scanning lines may be, for example, the multiple vertical scanning lines 61 as shown in
It should be noted that the bottom line 32 may be a lower-bottom line of the target box 3 or an upper-bottom line of the target box 3. In another embodiment, the processing unit 10 may scan the target box 3 to retrieve the lower-bottom line and the upper-bottom line of the target box 3 simultaneously, but the present invention is not limited thereto.
After the step S31, the processing unit 10 determines whether the middle line, the bottom line, the left-side line, and the right-side line of the target box 3 are obtained (step S32), and determines that a position pre-determination for determining the position of the target box 3 fails if any one of the middle line, the bottom line, the left-side line, and the right-side line is not obtained (step S33). If the position pre-determination fails, it indicates that the target box 3 is not located in the depth graph 5, or that the position of the target box 3 is inappropriate for measurement (for example, the target box 3 is placed close to a wall and has a depth similar to that of the wall), so the processing unit 10 stops performing the following measuring procedure. On the other hand, the processing unit 10 determines that the position pre-determination succeeds when the middle line, the bottom line (one of the upper-bottom line and the lower-bottom line), the left-side line, and the right-side line of the target box 3 are obtained, and proceeds to perform the measuring procedure to measure the volume related data of the target box 3.
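The success test of step S32 reduces to checking that all four boundary lines were found; a minimal sketch, with `None` standing in for a line that could not be retrieved:

```python
def position_predetermination(middle, bottom, left, right):
    """Step S32: succeed only when all four boundary lines were found."""
    if None in (middle, bottom, left, right):
        # Step S33: box absent, or indistinguishable from the background.
        return False
    return True  # safe to continue with the measuring procedure (step S50)
```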
In the measuring procedure (step S50), the processing unit 10 scans the target box 3 in the depth graph 5 within a range of the middle line, the bottom line, the left-side line, and the right-side line through the multiple scanning lines to obtain multiple width information, multiple height information, and multiple length information of the target box 3 (step S51). The processing unit 10 calculates the actual volume related data of the target box 3 according to the multiple width information, the multiple height information and the multiple length information (step S52). In this embodiment, the volume related data at least includes the width, the height, and the length of the target box 3.
After the step S52, the processing unit 10 outputs the calculated width, height, and length of the target box 3 (step S70). For example, the processing unit 10 displays the data through the displaying unit 16, or prints the data through the printing unit 18, or transmits the data to an external electronic device for displaying, not limited thereto.
It should be noted that, when the width, the height, and the length of the target box 3 are calculated, the processing unit 10 may further query the look-up table 191 stored in the storage unit 19 according to the volume related data, so as to obtain a corresponding fee, such as a packaging fee, a transportation fee, or a mailing fee, of the target box 3, and display the corresponding fee through the displaying unit 16 or the external electronic device, or print the corresponding fee through the printing unit 18. Therefore, the user may easily use the measuring apparatus 1 and the measuring method provided by the present invention.
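The look-up table 191 could, for example, map dimension ranges to fees. A sketch with invented size tiers and fee values; the real contents of table 191 are not specified in the disclosure:

```python
# Hypothetical fee table: (max girth in cm, fee). Illustrative values only.
FEE_TABLE = [(60, 5.0), (90, 8.0), (120, 12.0), (150, 18.0)]

def lookup_fee(width_cm, height_cm, length_cm):
    """Return the first fee tier whose girth limit covers the box."""
    girth = width_cm + height_cm + length_cm
    for limit, fee in FEE_TABLE:
        if girth <= limit:
            return fee
    return None  # oversized: no tier applies
```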
As disclosed in
When the triggering unit 13 is triggered, the processing unit 10 controls the structure light emitting unit 14 to emit the invisible structure light to the external environment for forming the reference pattern 141 (step S23). The reference pattern 141 is projected onto the target box 3. The processing unit 10 controls the first camera 11 to capture the left image 41 (step S24), and controls the second camera 12 to capture the right image 42 (step S25). In the embodiment, the left image 41 contains both the image of the entire target box 3 and the image of the reference pattern 141, and the right image 42 contains both the image of the entire target box 3 and the image of the reference pattern 141. The image of the reference pattern 141 covers the image of the target box 3 (as shown in
After the step S25, the processing unit 10 may generate a depth graph 5 correspondingly through performing a calculation to the images of the target box 3 and the reference pattern 141 in the left image 41 and the right image 42 (step S26). The depth information of the target box 3 and the depth information of the background around the target box 3 are recorded in the depth graph 5.
In one of the exemplary embodiments, the user may operate the measuring apparatus 1 and trigger the triggering unit 13 when a horizontal line of the guiding object 151 is aligned with the middle line of the target box 3 and the length of the horizontal line is equal to or greater than the width of the target box 3. When the triggering unit 13 is triggered, the processing unit 10 controls the structure light emitting unit 14 to emit the reference pattern 141, and controls the first camera 11 and the second camera 12 to respectively capture the left image 41 and the right image 42.
In another embodiment, the processing unit 10 may continually check (or determine) the distance and the relative positions between the guiding object 151 and the target box 3, and automatically control the structure light emitting unit 14, the first camera 11, and the second camera 12 when the horizontal line of the guiding object 151 is aligned with the middle line of the target box 3 and the length of the horizontal line is equal to or greater than the width of the target box 3. The above descriptions are only a few embodiments of the present invention, which is not limited thereto.
Please refer to
Refer to
It is worth saying that, as shown in
In particular, the boundary lines of the target box 3 have specific characteristics, so the processing unit 10 may determine multiple points which match these characteristics as the characteristic points 310, 320 mentioned above. For example, the processing unit 10 may determine the shallowest points compared to the adjacent points on each vertical scanning line 61 as the characteristic points 310, determine the deepest points compared to the adjacent points on each vertical scanning line 61 as the characteristic points 320, or determine the points on each vertical scanning line 61 having the greatest depth difference from the adjacent points as the characteristic points of the upper-bottom line. The above description is only one of the exemplary embodiments, and the present invention is not limited thereto.
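A sketch of the three characteristic-point tests described above, run on a single vertical scanning line treated as one column of the depth graph; the function name and return format are illustrative:

```python
import numpy as np

def characteristic_points(column):
    """Find candidate boundary points on one vertical scanning line.

    `column` is a 1-D array of depths along the line, top to bottom.
    Returns indices usable as characteristic points: the shallowest
    point (310), the deepest point (320), and the point with the
    greatest depth difference from its neighbor (upper-bottom line).
    """
    diffs = np.abs(np.diff(column))
    return {
        "shallowest": int(np.argmin(column)),  # characteristic point 310
        "deepest": int(np.argmax(column)),     # characteristic point 320
        "max_jump": int(np.argmax(diffs)),     # upper-bottom line candidate
    }
```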
As shown in
Similar to the disclosure of
After the step S41, the step S42, and the step S43, the processing unit 10 determines whether any of the middle line 31, the bottom line 32, the left-side line 33, and the right-side line 34 of the target box 3 has not been successfully obtained (step S44), determines that the position pre-determining procedure fails if any one of the boundary lines of the target box 3 is not obtained (step S45), and determines that the position pre-determining procedure succeeds when every boundary line of the target box 3 is obtained (step S46). If the position pre-determining procedure succeeds, it means that the position of the target box 3 in the depth graph 5 is determined.
If the position pre-determining procedure fails, the processing unit 10 in this embodiment stops the determination and does not perform the following measuring procedure. On the other hand, if the position pre-determining procedure succeeds, the processing unit 10 proceeds to perform the measuring procedure for calculating the volume related data of the target box 3 according to the retrieved middle line 31, bottom line 32, left-side line 33, and right-side line 34 of the target box 3.
Please refer to
As shown in
In one of the exemplary embodiments, the processing unit 10 generates and applies multiple horizontal scanning lines 62 to scan the target box 3 within a range of the left-side line 33 and the right-side line 34, so as to obtain multiple width information of the target box 3 (step S61).
As shown in
Similarly, the processing unit 10 may generate and apply multiple vertical scanning lines 61 to scan the target box 3 within a range of the middle line 31 and the bottom line 32, so as to obtain multiple height information and multiple length information of the target box 3 (step S62).
As shown in
As shown in
It should be noted that each characteristic point on the upper-bottom line of the target box 3 has a huge depth difference compared to the adjacent points in the background; hence, even if the processing unit 10 does not retrieve the position of the upper-bottom line of the target box 3 during the above position pre-determining procedure, the algorithm may still determine the position of each characteristic point on the upper-bottom line and calculate the multiple length information in the step S62.
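Putting steps S61 and S62 together, a simplified sketch that samples scanning lines inside the retrieved boundary range and converts pixel spans to metric sizes using the local depth; the pinhole-model conversion, the focal length, and the sampling step are assumptions, and the length measurement along the depth axis is omitted for brevity:

```python
import numpy as np

def measure_spans(depth, bounds, focal_px=900.0, step=8):
    """Steps S61/S62 (simplified): gather many width and height samples.

    `bounds` holds the row/column indices of the middle, bottom,
    left-side, and right-side lines from the pre-determination. Each
    sampled horizontal scanning line contributes one width and each
    sampled vertical scanning line one height. A fuller version would
    re-detect the box edges on every line instead of reusing the
    global boundary lines.
    """
    widths, heights = [], []
    for row in range(bounds["middle"], bounds["bottom"], step):
        z = float(np.median(depth[row, bounds["left"]:bounds["right"]]))
        widths.append((bounds["right"] - bounds["left"]) * z / focal_px)
    for col in range(bounds["left"], bounds["right"], step):
        z = float(np.median(depth[bounds["middle"]:bounds["bottom"], col]))
        heights.append((bounds["bottom"] - bounds["middle"]) * z / focal_px)
    return widths, heights  # lists of measurements in meters
```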
In the present invention, the processing unit 10 may calculate the width information, height information, and length information of the target box 3 in any order depending on the demand; a strict executing order is not imposed on the step S61 and the step S62.
When the multiple width information, height information, and length information are obtained, the processing unit 10 randomly groups the multiple width information for generating multiple width groups, randomly groups the multiple height information for generating multiple height groups, and randomly groups the multiple length information for generating multiple length groups (step S63). The processing unit 10 then determines whether the width groups (i.e., multiple widths) match with each other, whether the height groups (i.e., multiple heights) match with each other, and whether the length groups (i.e., multiple lengths) match with each other (step S64).
For instance, the processing unit 10 may select a first, a fourth, a seventh, and a tenth width information of the multiple width information as a first width group, select a second, a fifth, an eighth, and an eleventh width information as a second width group, select a third, a sixth, a ninth, and a twelfth width information as a third width group, and select the fourth, the seventh, the tenth, and a thirteenth width information as a fourth width group. The processing unit 10 calculates a sum or an average of the widths in each of the first, second, third, and fourth width groups. After that, the processing unit 10 compares these sums or averages with each other, and determines that the width groups match with each other if the differences among these sums or averages are smaller than a pre-determined tolerance value.
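A sketch of this grouping check of steps S63 and S64; the group count, group size, and tolerance value are illustrative:

```python
import random

def groups_match(values, n_groups=4, group_size=4, tolerance=0.01):
    """Steps S63/S64: randomly group measurements and compare the groups.

    Each group average is taken over `group_size` randomly chosen
    samples; the measurements are considered consistent when all group
    averages agree within `tolerance`.
    """
    groups = [random.sample(values, group_size) for _ in range(n_groups)]
    averages = [sum(g) / group_size for g in groups]
    return max(averages) - min(averages) <= tolerance

# Example: thirteen width samples around 0.40 m pass the check.
print(groups_match([0.400, 0.402, 0.399, 0.401, 0.398, 0.400, 0.403,
                    0.399, 0.401, 0.400, 0.402, 0.398, 0.400]))
```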
If the processing unit 10 determines that any of the multiple width groups, the multiple height groups, or the multiple length groups do not match, the processing unit 10 determines that the measuring procedure fails (step S65). On the contrary, if the processing unit 10 determines that the multiple width groups all match with each other, the multiple height groups all match with each other, and the multiple length groups all match with each other, the processing unit 10 determines that the measuring procedure succeeds (step S66).
If the measuring procedure succeeds, the processing unit 10 respectively calculates an average of the multiple width information, calculates an average of the multiple height information, and calculates an average of the multiple length information, and the processing unit 10 uses these averages respectively as the volume related data of the target box 3 (step S67). In particular, the processing unit 10 uses the average of the multiple width information as a width of the target box 3, uses the average of the multiple height information as a height of the target box 3, and uses the average of the multiple length information as a length of the target box 3.
By way of the measuring apparatus 1 and the measuring method of the present invention, the staff of warehousing companies and transportation companies may measure the volume of any kind of box anytime and anywhere, which is convenient for the users.
As the skilled person will appreciate, various changes and modifications can be made to the described embodiment. It is intended to include all such variations, modifications and equivalents which fall within the scope of the present invention, as defined in the accompanying claims.