The present disclosure relates to a measurement device and a measurement method which measure dimensions of an object.
Patent Literature (PTL) 1 discloses a dimension measurement device that measures dimensions of a cargo placed on a platform or a cargo attached with a platform. This dimension measurement device transmits a measurement wave, receives the measurement wave that is reflected, and generates a distance image. The dimension measurement device individually generates a cargo distance image that includes the cargo and the platform and a background distance image that does not include the cargo or the platform. Based on a difference between the cargo distance image and the background distance image, the dimension measurement device generates a distance image showing a shape of the cargo or the cargo attached with the platform. This makes it possible for the dimension measurement device to appropriately measure the dimensions of the cargo or the cargo attached with the platform.
PTL 1 is WO 2016/199366.
The present disclosure provides a measurement device and a measurement method which accurately measure an object including a platform and a load placed on the platform.
A measurement device according to the present disclosure is a measurement device that measures a size of an outer shape of an object, the object including a platform present on a floor surface and a load placed on the platform. The measurement device includes: an acquisition unit that acquires depth information indicating distances from a reference position to the floor surface and the object; a storage that stores standard dimension information indicating a standard size of the platform; a controller that measures width, depth, and height dimensions of the object by identifying the platform based on the depth information and the standard dimension information, and generates measurement information indicating the measured width, depth, and height dimensions; and an output unit that outputs the measurement information.
These general and specific aspects may be achieved by a system, a method, and a computer program, and any combination of these.
A measurement method according to the present disclosure is a measurement method for measuring a size of an outer shape of an object, the object including a platform present on a floor surface and a load placed on the platform. The measurement method includes: a step of acquiring depth information indicating distances from a reference position to the floor surface and the object; a step of acquiring standard dimension information indicating a standard size of the platform; a step of measuring width, depth, and height dimensions of the object by identifying the platform based on the depth information and the standard dimension information, and generating measurement information indicating the width, depth, and height dimensions measured in the measuring; and a step of outputting the measurement information generated in the generating.
The measurement device and the measurement method in the present disclosure measure the width, depth, and height dimensions of the object by identifying the platform based on the depth information indicating the distance from the reference position to the object including the platform present on the floor surface and the load placed on the platform and based on the standard dimension information indicating the standard size of the platform. Thus, the outer shape of the object including the platform and the load placed on the platform can be accurately measured.
Exemplary embodiments will be described below in detail with appropriate reference to the drawings. However, detailed descriptions more than necessary may be omitted. For example, a detailed description of a matter which is already well-known, or a repetitive description for a substantially identical configuration may be omitted. Such omissions are made in order to avoid unnecessary redundancy of the following description and to facilitate the understanding of those skilled in the art. The inventors provide the accompanying drawings and the following description to help those skilled in the art sufficiently understand the present disclosure, and therefore have no intention to put any limitation by those drawings and description on subject matters described in claims.
A pallet is used as a platform for placing a load thereon in logistics and the like. A region occupied by the load placed on the pallet in a warehouse, a truck, or the like has dimensions in which the bottom area is the pallet size and the total height is the sum of the heights of the pallet and the load placed on the pallet. Hence, it is necessary to measure not only the dimensions of the load placed on the pallet but also the entirety of the pallet and the load placed on the pallet. In order to photograph the entirety of the pallet and the load with a depth camera, it is necessary to photograph them from a distance at which the entirety fits within the field of view. However, photographing from a long distance increases noise in the depth information obtained from the depth camera and decreases the accuracy of the depth information. In particular, some pallets for use in logistics and the like are slatted, and the pallets are provided with insertion holes for forklifts. In addition, the load on such a pallet comes in various colors and materials. For example, in the case of using the infrared active stereo method, it is difficult to detect a depth in a gap, an uneven portion, a black material, and the like, and data loss is likely to occur in the depth information. Hence, it has been impossible to obtain accurate depth information, and accordingly impossible to accurately measure the dimensions of the pallet and the load.
A measurement device of the present disclosure accurately measures a width and depth of the pallet and a height from a floor surface to a highest point of the load. Specifically, the measurement device of the present disclosure uses standard dimension information, which indicates a standard size of the pallet, together with the depth information obtained by photographing the pallet and the load. The size of the pallet for use in logistics and the like is standardized to several types in each country, region, or the like. Hence, by using the standard dimension information together with the depth information, the type of pallet can be identified accurately, and it is possible to accurately measure the width and depth of the pallet and the height from the floor surface to the highest point of the load. Thus, even when the accuracy of the depth information is not good, it is possible to perform the accurate measurement. Moreover, even when the load placed on the pallet is smaller than a bottom area of the pallet, it is possible to accurately measure the region occupied by the load placed on the pallet. The measurement device of the present disclosure will be described below in detail.
A first exemplary embodiment will be described below with reference to the drawings.
A configuration of a measurement device of the present exemplary embodiment will be described with reference to
Touch screen 110 includes display unit 111 and operation unit 112. Display unit 111 is configured with, for example, a liquid crystal display or an organic electroluminescence (EL) display. Operation unit 112 is a user interface that receives a variety of operations by a user. In the present exemplary embodiment, operation unit 112 is a touch panel provided on the surface of display unit 111. Operation unit 112 detects a touch operation by a user's finger or a pointing device such as a pen. Operation unit 112 includes, for example, an electrode film. For example, controller 130 measures a change in voltage or a change in electrostatic capacity, which is caused by the fact that the finger or the pointing device comes into contact with operation unit 112, and can thereby identify a contact position of the finger or the pointing device. Note that operation unit 112 may be configured with a keyboard, buttons, switches, or any combination of these as well as the touch panel.
Depth camera 120 generates depth information indicating a distance from a reference position to a subject. Specifically, depth camera 120 measures the distance to the subject, and generates a depth image in which the measured distance is indicated by a depth value for each pixel. Depth camera 120 is, for example, an infrared active stereo camera. In the present exemplary embodiment, the subject includes a floor surface, a pallet put on the floor surface, and a load placed on the pallet. Depth camera 120 is configured by implementing various known techniques such as an active stereo system and a time of flight (TOF) system. For example, measurement device 100 may include two depth cameras 120, in which case the distance may be calculated based on a parallax of two images. Measurement device 100 may include one depth camera 120, in which case the distance may be calculated from a time taken for emitted infrared rays to hit an object and for the reflected light to return. Depth camera 120 corresponds to an acquisition unit that acquires depth information indicating distances from the reference position to the floor surface and the object.
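In the two-camera case mentioned above, the distance may be recovered from the parallax (disparity) between the two images. The following is a minimal illustrative sketch, not part of the embodiment, assuming a rectified pinhole stereo pair; the focal length and baseline values are assumptions for illustration:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate distance (m) from stereo disparity (pixels).

    Assumes a rectified pinhole stereo pair: Z = f * B / d.
    """
    if disparity_px <= 0:
        return float("inf")  # no parallax: invalid or infinitely far point
    return focal_px * baseline_m / disparity_px

# e.g. 600 px focal length, 7.5 cm baseline, 30 px disparity -> 1.5 m
```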
Controller 130 is configurable with a semiconductor element or the like. Controller 130 can be configured with, for example, a microcomputer, a central processing unit (CPU), a micro processing unit (MPU), a graphics processing unit (GPU), a digital signal processor (DSP), a field programmable gate array (FPGA), or an application specific integrated circuit (ASIC). Functions of controller 130 may be implemented only by hardware or may be implemented by a combination of hardware and software. Controller 130 reads out data and programs stored in storage 140 to perform various arithmetic processing, and thus implements predetermined functions.
Storage 140 is a storage medium that stores a program and data necessary to achieve the functions of measurement device 100. Storage 140 can be configured with, for example, a hard disk drive (HDD), a solid state drive (SSD), a random access memory (RAM), a dynamic RAM (DRAM), a ferroelectric memory, a flash memory, a magnetic disk, or any combination of these.
Communication unit 150 includes a circuit that communicates with an external device in accordance with a predetermined communication standard. The predetermined communication standard is, for example, a local area network (LAN), Wi-Fi (registered trademark), Bluetooth (registered trademark), a universal serial bus (USB), or HDMI (registered trademark).
Measurement device 100 calculates width W200, depth D200, and height H200 of object 200 with reference to the depth image. Width W200 and depth D200 of object 200 are a width and depth of pallet 210. Height H200 of object 200 is a height of a highest point of load 220 from the floor surface.
Depth information 141 indicating depth image 141p generated by depth camera 120 is stored in storage 140. Coordinate convertor 131 converts the two-dimensional coordinates and depth value of depth information 141 into three-dimensional coordinates with depth camera 120 taken as an origin, and generates three-dimensional coordinate information 142.
Floor surface estimation unit 132 estimates a region of the floor surface based on depth information 141 and three-dimensional coordinate information 142, and generates a plane equation of the floor surface. Hereinafter, the plane equation of the floor surface will be also referred to as a “floor surface equation”.
Standard dimension information 143 indicating a standard size of pallet 210 is stored in storage 140 in advance.
Pallet estimation unit 133 estimates the width, depth, and position of pallet 210 based on the floor surface equation, depth information 141, three-dimensional coordinate information 142, and standard dimension information 143.
Highest point estimation unit 134 estimates the height from the floor surface to the highest point of the load based on the floor surface equation, the width, depth, and position of pallet 210, and three-dimensional coordinate information 142.
Controller 130 generates measurement information 144 including the width and depth of pallet 210, which are estimated by pallet estimation unit 133, and the height of the load estimated by highest point estimation unit 134.
An operation of measurement device 100 of the present exemplary embodiment will be described with reference to
Coordinate convertor 131 converts the two-dimensional coordinates and depth value of depth information 141 into three-dimensional coordinates with depth camera 120 taken as an origin, and generates three-dimensional coordinate information 142 (Step S2). Thus, the pixel containing information on the depth value is converted into a point in a three-dimensional coordinate system.
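The conversion of Step S2 may be sketched as follows, assuming a pinhole camera model whose intrinsic parameters fx, fy, cx, cy are obtained from camera calibration. This is an illustrative sketch, not part of the embodiment:

```python
def pixel_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with its depth value into camera coordinates.

    Standard pinhole back-projection; the intrinsics (fx, fy, cx, cy)
    are assumed to come from camera calibration. The camera center is
    the origin of the resulting three-dimensional coordinate system.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

Applying this to every pixel of the depth image yields the point set held in three-dimensional coordinate information 142.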
Floor surface estimation unit 132 estimates the floor surface in depth image 141p (Step S3). Specifically, floor surface estimation unit 132 generates the plane equation of the floor surface based on three-dimensional coordinate information 142.
Pallet estimation unit 133 estimates the width and depth of pallet 210 reflected in depth image 141p and the position of pallet 210 (Step S4).
Highest point estimation unit 134 estimates the height of the highest point of load 220 reflected in depth image 141p (Step S5).
Controller 130 generates and outputs measurement information 144 including the width and depth of pallet 210, which are estimated in Steps S4 and S5, and the height of the highest point of load 220 (Step S6).
The generation of the floor surface equation will be described with reference to
Floor surface estimation unit 132 estimates that lower side proximity region 31 of depth image 141p is a region of the floor surface, and selects at least three points from the pixels in lower side proximity region 31 (Step S301). Lower side proximity region 31 is, for example, a region of “20×20” pixels near the lower side of depth image 141p. A size of lower side proximity region 31 may be changed in response to resolution of depth camera 120.
Floor surface estimation unit 132 calculates a normal vector (a, b, c) based on three-dimensional coordinates of three selected points (Step S302). For example, floor surface estimation unit 132 generates two vectors each of which connects two of the three points to each other, and calculates the normal vector from a cross product of the two vectors. At this time, floor surface estimation unit 132 may calculate a plurality of normal vectors from three or more different points in lower side proximity region 31, or may calculate a normal vector for each of a plurality of lower side proximity regions 31. In this case, floor surface estimation unit 132 may determine the normal vector of the floor surface by averaging such a plurality of the calculated normal vectors. This improves accuracy of the normal vector.
Floor surface estimation unit 132 calculates constant d of the plane equation of the floor surface with a height of the floor surface taken as zero based on three-dimensional coordinates of any point in lower side proximity region 31, for example, of the points selected in Step S301 and based on the normal vector calculated in Step S302 (Step S303). Floor surface estimation unit 132 may determine constant d by one point, or may determine constant d by averaging constants d calculated from a plurality of points in lower side proximity region 31. By the averaging, accuracy of constant d is improved.
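The calculation of Steps S302 and S303 may be sketched as follows. This is an illustrative Python sketch, not part of the embodiment; the averaging over a plurality of normal vectors and constants is omitted for brevity:

```python
import math

def plane_from_points(p0, p1, p2):
    """Fit a plane a*x + b*y + c*z + d = 0 through three 3-D points.

    The normal (a, b, c) is the cross product of two edge vectors
    connecting the points, normalized to unit length; constant d then
    follows from substituting any one of the points into the equation.
    """
    ux, uy, uz = (p1[i] - p0[i] for i in range(3))
    vx, vy, vz = (p2[i] - p0[i] for i in range(3))
    a = uy * vz - uz * vy
    b = uz * vx - ux * vz
    c = ux * vy - uy * vx
    norm = math.sqrt(a * a + b * b + c * c)
    if norm == 0:
        raise ValueError("points are collinear")
    a, b, c = a / norm, b / norm, c / norm
    d = -(a * p0[0] + b * p0[1] + c * p0[2])
    return a, b, c, d
```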
Estimation of the pallet will be described with reference to
Pallet estimation unit 133 reads out standard dimension information 143 from storage 140, and thereby acquires standard dimension information 143 (Step S401).
Pallet estimation unit 133 detects points near the height of the pallet, which are indicated by standard dimension information 143, from among the points indicated by three-dimensional coordinate information 142, and identifies nearest point A closest to depth camera 120 among the detected points (Step S402). The proximity of the height of the pallet corresponds to a range from “standard height×α (for example, α=0.8)” to the standard height. Specifically, the proximity of the height of the pallet includes the vicinity of a height of girder plate 211 and deck board 212 which are illustrated in
[Equation 1]
√(a² + b² + c²) = 1  (1)
[Equation 2]
h0 = |ax0 + by0 + cz0 + d|  (2)
Based on depth information 141, pallet estimation unit 133 identifies, as nearest point A, a point with a smallest depth value from among a plurality of points where height h0 from the floor surface, which is calculated by Equation (2), is in the proximity of the height indicated by standard dimension information 143.
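The identification of nearest point A in Step S402 may be sketched as follows. This is an illustrative sketch, not part of the embodiment; the point lists, the standard height value, and the tolerance α = 0.8 mirror the description above but are assumptions for illustration:

```python
def point_height(p, plane):
    """Height of point p above the floor plane (a, b, c, d), with |n| = 1."""
    a, b, c, d = plane
    return abs(a * p[0] + b * p[1] + c * p[2] + d)

def nearest_pallet_point(points, depths, plane, std_height, alpha=0.8):
    """Among points whose floor height lies in [alpha * H, H] for the
    standard pallet height H, return the one with the smallest depth value.

    `points` are three-dimensional coordinates and `depths` the matching
    depth values from the depth image.
    """
    best = None
    for p, z in zip(points, depths):
        h = point_height(p, plane)
        if alpha * std_height <= h <= std_height:
            if best is None or z < best[1]:
                best = (p, z)
    return None if best is None else best[0]
```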
Pallet estimation unit 133 searches for, on a straight line, points which continue with one another from nearest point A in the proximity of the pallet height indicated by standard dimension information 143, that is, points with the same height as that of nearest point A, and identifies both ends of the searched points as left end point B and right end point C of pallet 210 (Step S403).
Pallet estimation unit 133 compares a distance between A and B and a distance between A and C with the width and the depth which are indicated by standard dimension information 143, and identifies a type of pallet 210 (Step S404). For example, pallet estimation unit 133 individually calculates the distance between A and B and the distance between A and C based on three-dimensional coordinate information 142. If the distance between A and B is within a range of “80 cm±α” (for example, α=9 cm) and the distance between A and C is within a range of “120 cm±β” (for example, β=9 cm), then pallet estimation unit 133 determines that the type of pallet 210 is “pallet I”. Based on a result of this determination, pallet estimation unit 133 estimates “AB=80 cm” and “AC=120 cm”. Thus, dimensions of width W200 and depth D200 of object 200 as illustrated in
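The type identification of Step S404 may be sketched as follows. This is an illustrative Python sketch, not part of the embodiment; the table of standard sizes and the tolerance value are assumptions standing in for standard dimension information 143:

```python
# Candidate standard pallet sizes (width, depth) in cm -- illustrative
# values; real entries would come from standard dimension information 143.
PALLET_TYPES = {
    "pallet I": (80.0, 120.0),
    "pallet II": (100.0, 120.0),
    "pallet III": (110.0, 110.0),
}

def identify_pallet(dist_ab, dist_ac, tol=9.0):
    """Match measured edge lengths AB and AC (cm) against standard sizes.

    Returns (type, standard_width, standard_depth), or None if no type
    matches; tol mirrors the +/- 9 cm tolerance in the description above.
    """
    for name, (w, d) in PALLET_TYPES.items():
        if abs(dist_ab - w) <= tol and abs(dist_ac - d) <= tol:
            return name, w, d
    return None
```

Once a type is matched, the standard width and depth are adopted in place of the noisy measured distances, which is what makes the measurement robust to errors in depth information 141.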
When there are three types of pallets illustrated in
Pallet estimation unit 133 specifies a position of point D based on the identified type of pallet, and estimates a region of pallet 210 (Step S405). Specifically, a parallelogram including nearest point A, left end point B, and right end point C is estimated as the region of pallet 210. Thus, the position of pallet 210 in three-dimensional coordinate information 142 is estimated.
The height estimation of the load will be described with reference to
Highest point estimation unit 134 calculates a height of a point from the floor surface, the point being present in three-dimensional space 400 with the estimated region of pallet 210 taken as a bottom plane (Step S501). Three-dimensional space 400 is a space that takes, as side planes, plane P1 including side AB and a normal of the floor surface, plane P2 including side CD and a normal of the floor surface, plane P3 including side AC and a normal of the floor surface, and plane P4 including side BD and a normal of the floor surface. For example, highest point estimation unit 134 calculates plane equations of planes P1, P2, P3, and P4. Highest point estimation unit 134 considers, as points present on pallet 210, points having coordinates between plane P1 and plane P2 and between plane P3 and plane P4. Note that a bottom plane region of three-dimensional space 400 may be made larger than the estimated region of pallet 210. This makes it possible to eliminate an influence of an error of depth information 141. The height of each point from the floor surface can be calculated by above-mentioned Equation (2).
Highest point estimation unit 134 determines a highest point, of which height is the highest among the calculated heights, as a height of the highest point of the load from the floor surface (Step S502). Thus, the height of the point in top plane T220 of load 220 is estimated as height H200 of object 200. Note that continuity of such heights calculated in three-dimensional space 400 may be verified, and points without continuity may be excluded from the determination of the highest point. Thus, the influence of the noise of depth information 141 can be removed.
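Steps S501 and S502 may be sketched as follows. This is a simplified illustrative sketch, not part of the embodiment: the floor is taken as the plane z = 0 and the pallet region as an axis-aligned rectangle, whereas the embodiment bounds the space with the four side planes P1 to P4; the continuity check is omitted:

```python
def highest_point(points, plane, footprint):
    """Height of the load: the maximum floor height among points whose
    projection falls inside the pallet footprint.

    `plane` is the unit-normal floor plane (a, b, c, d); `footprint` is
    a simplified axis-aligned rectangle (x0, x1, y0, y1) standing in for
    the region bounded by side planes P1..P4 in the embodiment.
    """
    a, b, c, d = plane
    x0, x1, y0, y1 = footprint
    best = 0.0
    for x, y, z in points:
        if x0 <= x <= x1 and y0 <= y <= y1:
            h = abs(a * x + b * y + c * z + d)  # Equation (2)
            best = max(best, h)
    return best
```

Points outside the footprint, such as walls or other cargo behind the pallet, are excluded before the maximum is taken.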
Measurement device 100 of the present exemplary embodiment measures a size of an outer shape of object 200 including pallet 210 present on the floor surface and load 220 placed on pallet 210. Measurement device 100 includes: an acquisition unit that acquires depth information 141 indicating the distances from the reference position to the floor surface and object 200; storage 140 that stores standard dimension information 143 indicating the standard size of pallet 210; controller 130 that measures the width, depth and height dimensions of object 200 by identifying pallet 210 based on depth information 141 and standard dimension information 143, and generates measurement information 144 indicating the measured width, depth, and height dimensions of object 200; and the output unit that outputs measurement information 144.
Since measurement device 100 uses both depth information 141 and standard dimension information 143, measurement device 100 can accurately measure object 200.
The acquisition unit includes depth camera 120 that photographs the floor surface and object 200 as depth information 141, and generates depth image 141p indicating the distances to the floor surface and object 200 as a depth value for each pixel.
The output unit is, for example, display unit 111 that outputs measurement information 144 to the screen. The output unit may be controller 130 that outputs measurement information 144 to storage 140. The output unit may be communication unit 150 that outputs measurement information 144 to the outside.
Controller 130 generates three-dimensional coordinate information 142 obtained by converting each pixel of depth image 141p into a point in the three-dimensional coordinate system, and calculates the plane equation of the floor surface based on three-dimensional coordinate information 142.
This makes it possible to calculate the height of the point with the floor surface taken as a reference.
Controller 130 estimates that lower side proximity region 31 of a predetermined size in depth image 141p is a region of the floor surface, and calculates the plane equation of the floor surface from the points in lower side proximity region 31. Since photography is performed such that the floor is reflected at a lower portion of the screen, controller 130 can calculate the floor surface equation from lower side proximity region 31.
Standard dimension information 143 includes the standard height of pallet 210. Controller 130 calculates the height of each point based on the three-dimensional coordinates of each point and the plane equation of the floor surface, and estimates the contour of pallet 210 based on the point where the calculated height is in the proximity of the standard height. Specifically, controller 130 detects nearest point A, searches for a point at the same height as that of nearest point A on the straight line, and detects left end point B and right end point C. Since controller 130 estimates side AB and side AC, which are the contour, based on the standard height, controller 130 can accurately estimate the contour.
Standard dimension information 143 includes the standard width and standard depth of pallet 210. Controller 130 calculates the width and depth of pallet 210 from the estimated contour of pallet 210 based on three-dimensional coordinate information 142, compares the calculated width and depth with the standard width and the standard depth, identifies the type of pallet 210, and estimates the width and depth of pallet 210. Controller 130 can accurately estimate the width and depth of object 200 by comparing the same with the standard width and the standard depth.
Controller 130 estimates, as the height of load 220, the highest point where the height of the point calculated by three-dimensional coordinate information 142 and the plane equation of the floor surface is the highest in the three-dimensional space with estimated pallet 210 taken as the bottom plane. The position of pallet 210 is accurately estimated, whereby the three-dimensional space in which load 220 is present can be accurately estimated. Hence, the highest point can be accurately estimated.
A measurement method of the present exemplary embodiment is a method by which controller 130 of the computer measures the size of the outer shape of object 200 including pallet 210 present on the floor surface and load 220 placed on pallet 210. The measurement method includes: Step S1 of acquiring, from the acquisition unit, depth information 141 indicating the distances from the reference position to the floor surface and object 200; Step S401 of acquiring, from storage 140, standard dimension information 143 indicating the standard size of pallet 210; Steps S4 to S6 of identifying pallet 210 based on depth information 141 and standard dimension information 143, measuring the width, depth, and height dimensions of object 200, and generating measurement information 144 indicating the measured width, depth, and height dimensions; and Step S6 of outputting measurement information 144 to the output unit. Since the measurement method uses both depth information 141 and standard dimension information 143, the measurement method can accurately measure object 200.
In the present exemplary embodiment, in Step S301 of
In identifying nearest point A in Step S402 of
In identifying left end point B and right end point C in Step S403 of
All types of pallets to be detected may be assumed, and with regard to spots where there are no insertion holes 215 as illustrated in
The points on left side AB and right side AC that are currently being searched for may be defined as point B′ and point C′, respectively, and points may be added along the extension lines of segment AB′ and segment AC′; it may then be evaluated whether or not there is a point on the straight line that is orthogonal to segment AC′ and passes through nearest point A and on the straight line that is orthogonal to segment AB′ and passes through nearest point A. When such a point is detected, it may be considered that there is a point at the position with the height of the pallet, and the search may be continued. Thus, false recognition can be reduced even when the accuracy of the depth information is low or there is a data loss.
A second exemplary embodiment is different from the first exemplary embodiment in the estimation method of the pallet. In the first exemplary embodiment, the points of which heights calculated based on three-dimensional coordinate information 142 are close to the standard height are detected, whereby nearest point A, left end point B and right end point C are detected. In the present exemplary embodiment, nearest point A, left end point B, and right end point C are detected based on a plane in which orientations of normal vectors are the same.
Such estimation of the pallet in the second exemplary embodiment will be described with reference to
Pallet estimation unit 133 calculates normal vectors corresponding to the respective pixels in depth image 141p based on three-dimensional coordinate information 142, and detects a plane in which orientations of the calculated normal vectors are the same (Step S411).
Pallet estimation unit 133 reads out standard dimension information 143 from storage 140, and thereby acquires standard dimension information 143 (Step S412).
Pallet estimation unit 133 extracts a region in the detected plane where height h0 of the point from the floor surface is close to the pallet height indicated by standard dimension information 143 (Step S413). The extracted region corresponds to two straight lines. The extracted linear regions are estimated as the contour of pallet 210.
Pallet estimation unit 133 identifies nearest point A, left end point B, and right end point C from the two extracted linear regions (Step S414). For example, pallet estimation unit 133 identifies an end of a left line as left end point B and an end of a right line as right end point C. Pallet estimation unit 133 identifies an intersection of the two lines as nearest point A.
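The plane detection of Step S411 may be sketched in part as follows. This is an illustrative sketch, not part of the embodiment; the per-point unit normals are assumed to be already estimated (in practice, from neighboring pixels of the depth image), and the angle threshold is an assumption:

```python
import math

def same_orientation(n1, n2, max_angle_deg=10.0):
    """True if two unit normal vectors point within max_angle_deg of each other."""
    dot = sum(x * y for x, y in zip(n1, n2))
    dot = max(-1.0, min(1.0, dot))  # clamp against rounding error
    return math.degrees(math.acos(dot)) <= max_angle_deg

def plane_points(normals, reference):
    """Indices of points whose normal matches the reference orientation.

    Grouping points with matching normal orientations yields the plane
    detected in Step S411.
    """
    return [i for i, n in enumerate(normals) if same_orientation(n, reference)]
```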
Processing (Step S415 and Step S416) after identifying nearest point A, left end point B, and right end point C is the same as that of the first exemplary embodiment (Step S404 and Step S405 illustrated in
As described above, standard dimension information 143 includes the standard height of pallet 210. Controller 130 calculates the normal vectors of the points corresponding to the respective pixels in depth image 141p based on three-dimensional coordinate information 142. Controller 130 detects the plane in which the orientations of the calculated normal vectors are in the same direction. Moreover, controller 130 calculates the height of each point based on the three-dimensional coordinates of each point and the plane equation of the floor surface. Controller 130 estimates, as the contour of pallet 210, a portion in which the height calculated in the detected plane is close to the standard height. Thus, the contour of pallet 210 can be accurately estimated.
In the first and second exemplary embodiments, the measurement was performed based on depth information 141 obtained from depth camera 120. In a third exemplary embodiment, the measurement is performed using, in addition to depth information 141, color information obtained from a visible light camera.
Controller 130 acquires the color information from visible light camera 160 (Step S23). The color information includes an RGB value for each pixel identified by two-dimensional coordinates. The two-dimensional coordinates of the color information and the two-dimensional coordinates of the depth information are associated with each other according to positions of depth camera 120 and visible light camera 160. For example, when depth camera 120 and visible light camera 160 are achieved by one camera, the two-dimensional coordinates of the color information and the two-dimensional coordinates of the depth information are the same. That is, each pixel of the depth image and each pixel of the color image have the same coordinate value.
For example, controller 130 performs image processing for the color image based on the color information, and thereby detects contour 230 of object 200 as illustrated in
Controller 130 refers to contour 230 detected based on the color information in addition to depth information 141 and three-dimensional coordinate information 142, and performs estimation of the floor surface (Step S25), estimation of the pallet (Step S26), and height estimation of the load (Step S27). A description will be given below of details of the estimation of the floor surface (Step S25), the estimation of the pallet (Step S26), and the height estimation of the load (Step S27).
Controller 130 selects at least three points from the pixels in the estimated floor surface region (Step S2502). Calculation of a normal vector (Step S2503) and calculation of constant d (Step S2504) after the points are selected are the same as those in the first exemplary embodiment (Steps S302, S303).
The region estimation of pallet 210 (Step S2602), which is based on comparison between the calculated dimensions and standard dimension information 143, is the same as that of the first exemplary embodiment. For example, the region estimation of pallet 210 (Step S2602) corresponds to the identification of the type of pallet 210 (Step S404) and the identification of point D (Step S405), which are based on standard dimension information 143.
As described above, measurement device 103 of the present exemplary embodiment further includes visible light camera 160 that photographs object 200 and generates color information indicating a color image. Controller 130 estimates the region of the floor surface based on the color information, and calculates the plane equation of the floor surface from the points in the estimated region of the floor surface. Controller 130 extracts the contour of pallet 210 by performing image processing for the color image. Controller 130 detects the contour of load 220 by performing the image processing for the color information, and estimates, as the height of load 220, the highest point where the height of the point calculated by three-dimensional coordinate information 142 and the plane equation of the floor surface is the highest in the inside of the detected contour. The measurement accuracy is improved by using the depth information and the color information in combination.
A measurement device of a fourth exemplary embodiment will be described with reference to
As described above, measurement device 104 further includes acceleration sensor 170 that detects the gravitational acceleration. Controller 130 calculates the plane equation of the floor surface based on the orientation of the normal vector of the floor surface, which is estimated from the gravitational acceleration, and on three-dimensional coordinate information 142. Thus, even if depth camera 120 is tilted horizontally or vertically with respect to the ground, the plane equation of the floor surface can be accurately generated.
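This gravity-based variant can be sketched as follows: the measured gravity vector fixes the floor normal (a, b, c), and constant d is then fitted from any one floor point in three-dimensional coordinate information 142. The function name and values are illustrative assumptions.

```python
# Sketch of the fourth embodiment: the floor normal is taken as the
# direction opposite to the measured gravitational acceleration, so
# the plane equation no longer depends on the camera's tilt.
import math

def plane_from_gravity(gravity, floor_point):
    gx, gy, gz = gravity
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    # Unit floor normal (a, b, c) points opposite to gravity
    a, b, c = -gx / norm, -gy / norm, -gz / norm
    x, y, z = floor_point
    # Constant d chosen so the known floor point lies on the plane
    d = -(a * x + b * y + c * z)
    return a, b, c, d

# Gravity along -z (camera held level); one floor point 1.5 m below origin
print(plane_from_gravity((0.0, 0.0, -9.8), (0.0, 0.0, -1.5)))
```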
A measurement device of a fifth exemplary embodiment will be described with reference to
Controller 130 sets a provisional virtual normal (Step S351). Controller 130 calculates local normal vectors based on three-dimensional coordinate information 142 (Step S352). Controller 130 estimates, as a horizontal plane, points having normals within a certain angle range with respect to the virtual normal (Step S353). Controller 130 calculates, as a normal vector (a′, b′, c′), an average of the normal vectors in the estimated horizontal plane (Step S354). Controller 130 calculates relative heights of the points from three-dimensional coordinate information 142 and the normal vector (a′, b′, c′), and estimates, as the floor surface, a region in which the relative heights are lower than a predetermined value (Step S355). Controller 130 calculates an average of the normal vectors in the estimated floor surface as a normal vector (a, b, c) of the floor surface (Step S356). Controller 130 recalculates the heights of the points from three-dimensional coordinate information 142 and the normal vector (a, b, c) of the floor surface, and calculates constant d while taking, as the floor surface, a region with a height lower than a predetermined value (Step S357).
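Steps S351 through S357 can be condensed into the following sketch, assuming the per-point coordinates and local normal vectors have already been computed from three-dimensional coordinate information 142. The angle and height thresholds are hypothetical values.

```python
# Condensed sketch of Steps S351-S357: select near-horizontal points,
# average their normals, keep the lowest band as the floor, then
# refine the normal (a, b, c) and fit constant d from that band.
import math

def angle_between(n1, n2):
    dot = sum(p * q for p, q in zip(n1, n2))
    norm = (math.sqrt(sum(p * p for p in n1))
            * math.sqrt(sum(q * q for q in n2)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def average(vectors):
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(3))

def estimate_floor(points, normals, virtual_normal=(0, 0, 1),
                   max_angle=20.0, height_margin=0.05):
    # S353: points whose normals lie near the virtual normal
    horiz = [i for i, n in enumerate(normals)
             if angle_between(n, virtual_normal) < max_angle]
    # S354: provisional normal (a', b', c')
    n1 = average([normals[i] for i in horiz])
    # S355: relative height along n1; the lowest band is the floor
    h = [sum(p[k] * n1[k] for k in range(3)) for p in points]
    low = min(h[i] for i in horiz)
    floor = [i for i in horiz if h[i] < low + height_margin]
    # S356: refined floor normal (a, b, c)
    n2 = average([normals[i] for i in floor])
    # S357: constant d fitted so the floor band lies on the plane
    d = -sum(sum(points[i][k] * n2[k] for k in range(3))
             for i in floor) / len(floor)
    return n2 + (d,)

# Three floor points and one point on top of a load, all facing up
points = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0.5, 0.5, 0.5)]
normals = [(0, 0, 1)] * 4
print(estimate_floor(points, normals))
```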
As described above, controller 130 calculates the normal vectors of the points corresponding to the pixels in depth image 141p based on three-dimensional coordinate information 142, and estimates, as the horizontal plane, the points whose calculated normal vectors are within a certain angle range with respect to a predetermined virtual normal vector. Controller 130 estimates the region of the floor surface based on the normal vectors in the estimated horizontal plane and three-dimensional coordinate information 142, and calculates the plane equation of the floor surface from the points in the estimated region of the floor surface.
According to the present exemplary embodiment, the normal vectors are calculated from entire depth image 141p without depending on lower side proximity region 31, and accordingly, the normal vectors can be accurately calculated even when the noise around the lower portion of depth image 141p is strong. The measurement device of the present exemplary embodiment is particularly effective when it is possible to estimate an approximate holding angle at the time of photography.
As above, the first to fifth exemplary embodiments have been described as exemplifications of the technique disclosed in the present application. However, the technique in the present disclosure is not limited to the exemplary embodiments and is applicable to exemplary embodiments appropriately subjected to changes, replacements, additions, omissions, and the like. Moreover, a new exemplary embodiment can be made by combining the respective constituent elements described in the above first to fifth exemplary embodiments.
In the above exemplary embodiment, depth camera 120 is built in the measurement device, but depth camera 120 does not have to be built in the measurement device. The measurement device may acquire, via communication unit 150, depth information 141 generated by depth camera 120. In this case, communication unit 150 corresponds to the acquisition unit that acquires depth information 141. Likewise, visible light camera 160 does not have to be built into the measurement device. Acceleration sensor 170 does not have to be built into the measurement device. The measurement device may acquire color information and/or gravitational acceleration information together with depth information 141 via communication unit 150.
The measurement device of the present disclosure can be achieved through the cooperation of hardware resources such as a processor, a memory, and a program.
As above, the exemplary embodiments have been described as exemplifications of the technique in the present disclosure. For that purpose, the accompanying drawings and detailed descriptions have been provided. Hence, the constituent elements described in the accompanying drawings and the detailed description may include not only the constituent elements essential for solving the problem but also constituent elements that are inessential for solving the problem and are included merely to illustrate the technique. Therefore, the mere fact that such inessential constituent elements appear in the accompanying drawings and the detailed description should not be taken to mean that they are essential.
Moreover, since the above exemplary embodiments illustrate the technique in the present disclosure, various modifications, substitutions, additions, omissions and the like can be performed within the scope of claims and equivalent scope of claims.
The present disclosure is applicable to a measurement device and a measurement method which measure the width and depth of the pallet and the height of the load in a state in which the load is placed on the pallet.
| Number | Date | Country | Kind |
|---|---|---|---|
| 2018-185332 | Sep 2018 | JP | national |
| | Number | Date | Country |
|---|---|---|---|
| Parent | PCT/JP2019/036811 | Sep 2019 | US |
| Child | 17211921 | | US |