This application claims priority of Taiwanese Invention Patent Application No. 107140428, filed on Nov. 14, 2018.
The disclosure relates to a method and a system for inspecting a shoe, and more particularly to a method and a system for inspecting an upper of a shoe.
Referring to the accompanying drawings, a conventional process of making a shoe includes a step of pulling an upper 12 of the shoe around a shoe last 13 so as to mold the upper 12.
Conventionally, the step of pulling the upper 12 around the shoe last 13 is performed manually or with the assistance of a machine (not shown), and the quality of molding the upper 12 in this step is crucial to the comfort of wearing the shoe. Therefore, a well-experienced craftsman is required to check the quality of molding the upper 12. However, it takes a significant amount of time for a craftsman to build up such experience, and checking the quality is time-consuming and labor-intensive. Moreover, individual differences in experience among craftsmen make quality control difficult.
Therefore, an object of the disclosure is to provide a method and a system for inspecting an upper of a shoe that can alleviate at least one of the drawbacks of the prior art.
According to one aspect of the disclosure, a system for inspecting an upper of a shoe is provided. The upper is pulled around a last by a shoe lasting machine, and the upper and the last cooperatively serve as a work piece. The system includes an optical sub-system and a processor. The optical sub-system is configured to capture at least one image related to the work piece so as to output image data representing the image thus captured. The processor is configured to receive the image data, to establish a work piece model that is a three-dimensional model of the work piece based on the image data, to obtain a cross section of the work piece model along an imaginary cutting plane passing through an imaginary section line, to obtain an entry of section data related to the cross section of the work piece model, and to compare the entry of section data and an entry of predetermined standard data so as to generate a result of inspection indicating whether the upper is normal.
According to another aspect of the disclosure, the method is to be implemented by the processor that is previously described. The method includes steps of: receiving the image data from the optical sub-system; establishing a work piece model that is a three-dimensional model of the work piece based on the image data; obtaining a cross section of the work piece model along an imaginary cutting plane passing through an imaginary section line; obtaining an entry of section data related to the cross section of the work piece model; and comparing the entry of section data and an entry of predetermined standard data so as to generate a result of inspection indicating whether the upper is normal.
Other features and advantages of the disclosure will become apparent in the following detailed description of the embodiments with reference to the accompanying drawings, of which:
Before the disclosure is described in greater detail, it should be noted that where considered appropriate, reference numerals or terminal portions of reference numerals have been repeated among the figures to indicate corresponding or analogous elements, which may optionally have similar characteristics.
Referring to the accompanying drawings, a first embodiment of the system for inspecting an upper 32 of a shoe according to the disclosure is adapted for use with a shoe lasting machine 2 that pulls the upper 32 around a last, where the upper 32 and the last cooperatively serve as a work piece 3. The system includes an optical sub-system and a processor 5.
Specifically speaking, referring to the accompanying drawings, the optical sub-system of the first embodiment includes two image capturing devices 41 that are spaced apart from each other, and each of the image capturing devices 41 is configured to capture an image 91 of the work piece 3 and to output image data (O) representing the image 91 thus captured to the processor 5.
The processor 5 is configured to compare the two images 91 of the work piece 3 to obtain plural pairs of corresponding image points (p′) in the two images 91 (only one pair is shown). For each of the pairs of corresponding image points (p′), the processor 5 is configured to determine a disparity of the pair of corresponding image points (p′), and to calculate a set of spatial coordinates of a common point (p) in a three-dimensional space based on the disparity and according to triangulation (such as the computer vision triangulation technique), where the pair of corresponding image points (p′) are projections of the common point (p) in the two images 91. In this embodiment, the set of spatial coordinates of the common point (p) is represented as P(x,y,z) in a Cartesian coordinate system, but implementation of representation of the set of spatial coordinates of the common point (p) is not limited to the disclosure herein and may vary in other embodiments. The processor 5 is configured to obtain a point cloud associated with the work piece 3, where the point cloud is formed by multiple common points (p) each correlating to a respective one of the pairs of corresponding image points (p′) in the three-dimensional space, and to establish the work piece model 3′ based on the set of spatial coordinates of the common points (p) in the point cloud.
It is worth noting that implementations of calculating the set of spatial coordinates of the common point (p) and of establishing the work piece model 3′ are well known to one skilled in the relevant art, e.g., from technologies of computer stereo vision, so a detailed explanation of the same is omitted herein for the sake of brevity.
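For illustration only, the following is a minimal sketch of how such a point cloud could be computed with the OpenCV library, assuming a pair of rectified images from calibrated cameras; the file names, the reprojection matrix Q, and the matcher parameters are hypothetical and do not form part of the disclosure.

```python
import cv2
import numpy as np

# Load the two images 91 captured by the spaced-apart image capturing devices 41.
# The file names are hypothetical placeholders.
left = cv2.imread("workpiece_left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("workpiece_right.png", cv2.IMREAD_GRAYSCALE)

# Block matcher that finds pairs of corresponding image points (p') and
# returns their disparity; the parameters are illustrative only.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM returns fixed-point values

# Q is the 4x4 reprojection matrix obtained from a prior stereo calibration
# (e.g., cv2.stereoRectify); shown here as an assumed placeholder file.
Q = np.load("stereo_Q.npy")

# Triangulation: reproject each disparity into a common point p = P(x, y, z),
# yielding the point cloud associated with the work piece 3.
points_3d = cv2.reprojectImageTo3D(disparity, Q)
valid = disparity > 0            # keep only pixels with a valid correspondence
point_cloud = points_3d[valid]   # (N, 3) array of common points (p)
```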
Referring to the accompanying drawings, a first embodiment of the method for inspecting the upper 32 according to the disclosure is to be implemented by the processor 5 of the first embodiment of the system that is previously described, and includes a model establishment procedure 6 and an inspection procedure 7.
Referring to the accompanying drawings, the model establishment procedure 6 includes the following steps.
In step 601, the processor 5 controls each of the two image capturing devices 41 that are spaced apart from each other to capture an image 91 of the work piece 3, and to output image data (O) representing the image 91 of the work piece 3 to the processor 5.
In step 602, the processor 5 obtains the two images 91 of the work piece 3 from the image data (O) received from the two image capturing devices 41.
In step 603, the processor 5 compares the two images 91 of the work piece 3 to obtain plural pairs of corresponding image points (p′) in the two images 91. It is noted that only one pair of corresponding image points (p′) is illustrated in the accompanying drawings for the sake of clarity.
In step 604, for each of the pairs of corresponding image points (p′), the processor 5 determines a disparity of the pair of corresponding image points (p′), and calculates a set of spatial coordinates of a common point (p) corresponding to the pair of corresponding image points (p′) in three-dimensional space based on the disparity and according to computer vision triangulation techniques, where the pair of corresponding image points (p′) are projections of the common point (p) in the two images 91.
In step 605, the processor 5 obtains a point cloud associated with the work piece 3, where the point cloud is formed by the common points (p) of the respective pairs of corresponding image points (p′) in the three-dimensional space.
In step 606, the processor 5 establishes the work piece model 3′ based on the set of spatial coordinates of the common points (p) in the point cloud. Then, the flow of procedure proceeds to the inspection procedure 7.
Referring to the accompanying drawings, the inspection procedure 7 includes the following steps.
In step 701, after the system performs optical scanning on the work piece 3 so as to establish the work piece model 3′ that is the three-dimensional model of the work piece 3 in the model establishment procedure 6, the processor 5 obtains the cross section 30 of the work piece model 3′ along the imaginary cutting plane passing through the imaginary section line (L) on an X-Z plane as shown in the accompanying drawings.
In step 702, the processor 5 obtains an entry of section data (D) related to the cross section 30 of the work piece model 3′.
In step 703, the processor 5 compares the entry of section data (D) and an entry of predetermined standard data (T) so as to determine whether the entry of section data (D) matches the entry of predetermined standard data (T). When it is determined that the entry of section data (D) matches the entry of predetermined standard data (T), the flow of procedure of this method proceeds to step 704. Otherwise, the flow of procedure proceeds to step 705.
In step 704, the processor 5 generates the result of inspection indicating that the upper 32 is normal. Based on the result of inspection, the shoe lasting machine 2 is permitted to proceed to procedures of glue spreading and sole attaching.
In step 705, the processor 5 generates the result of inspection indicating that the upper 32 is abnormal. Furthermore, the processor 5 generates the warning signal (W) to control the warning device 22 to output sound, light, and/or a text/graphics message. Alternatively, the warning signal (W) may be a digital signal to be transmitted to an electronic device for notification of the result of inspection.
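By way of illustration, a minimal sketch of steps 701 through 705 is given below, assuming the work piece model 3′ is available as an N×3 array of point coordinates; the function names, the use of the geometric center as the entry of section data (D), and the tolerance value are hypothetical choices rather than the required implementation.

```python
import numpy as np

def cross_section(model_points, y0, tol=0.5):
    # Step 701: collect points of the work piece model 3' lying near the
    # imaginary cutting plane y = y0 (an X-Z plane); y0 and tol are
    # hypothetical values expressed in the model's length unit.
    mask = np.abs(model_points[:, 1] - y0) < tol
    return model_points[mask][:, [0, 2]]  # cross section 30 as (x, z) points

def inspect(model_points, standard_center, y0):
    xs = cross_section(model_points, y0)                  # step 701
    center = xs.mean(axis=0)                              # step 702: entry of section data D
    deviation = np.linalg.norm(center - standard_center)  # step 703: compare D with standard data T
    if deviation < 2.0:                                   # hypothetical tolerance
        return "normal"    # step 704: lasting machine may proceed to glue spreading
    warn()                 # step 705: drive the warning device 22
    return "abnormal"

def warn():
    # Hypothetical stand-in for generating the warning signal W.
    print("Warning: upper 32 is abnormal")
```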
Referring to the accompanying drawings, in a first implementation of the comparison, the entry of section data (D) is related to a geometric center (C) of the cross section 30, and the entry of predetermined standard data (T) is related to a reference geometric center for comparison therewith.
In a second implementation of the comparison, the entry of section data (D) is related to a moment of inertia (I) of the cross section 30.
In a third implementation of the comparison, the entry of section data (D) is related to a slope (M) associated with the cross section 30.
In a fourth implementation of the comparison, the entry of section data (D) is related to a contour 301 of the cross section 30.
It is worth noting that implementation of the comparison between the entry of section data (D) and the entry of predetermined standard data (T) is not limited to comparison in a single aspect, i.e., taking only one of the geometric center (C), the moment of inertia (I), the slope (M) and the contour 301 into account. In some embodiments, the comparison may be performed in multiple aspects, i.e., two or more of the geometric center (C), the moment of inertia (I), the slope (M) and the contour 301 may be taken into account for inspection of a single upper 32. Alternatively, multiple cross sections respectively on multiple X-Z planes along a Y-axis may be extracted from a single work piece model 3′ and subjected to respective comparison processes, such that the result of inspection may be more accurate.
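As a sketch of how such a multi-aspect comparison could be carried out, the following functions compute point-set approximations of the four quantities named above together with a simple contour distance; the formulas, the line-fitting choice for the slope (M), and all tolerances are assumptions for illustration, not the disclosed method.

```python
import numpy as np

def section_data(xs):
    # xs: (N, 2) array of (x, z) points of cross section 30.
    c = xs.mean(axis=0)                          # geometric center C
    i = np.sum((xs - c) ** 2)                    # polar second moment, standing in for I
    top = xs[np.argsort(xs[:, 1])[-20:]]         # 20 highest points (hypothetical choice)
    m = np.polyfit(top[:, 0], top[:, 1], 1)[0]   # slope M of a line fitted to them
    return {"C": c, "I": i, "M": m, "contour": xs}

def contour_distance(a, b):
    # Mean nearest-neighbour distance between two contours 301.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return d.min(axis=1).mean()

def matches(entry, standard, tol=(2.0, 0.05, 0.1, 1.0)):
    # Multi-aspect comparison of D against T; all tolerances are hypothetical.
    return (np.linalg.norm(entry["C"] - standard["C"]) < tol[0]
            and abs(entry["I"] - standard["I"]) / standard["I"] < tol[1]
            and abs(entry["M"] - standard["M"]) < tol[2]
            and contour_distance(entry["contour"], standard["contour"]) < tol[3])
```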
Referring to the accompanying drawings, a second embodiment of the system for inspecting the upper 32 according to the disclosure is similar to the first embodiment, except that the optical sub-system of the second embodiment includes a light source 43 and an image capturing device 44.
The light source 43 is configured to emit a structured light beam 431 of a predefined pattern toward the upper 32 to form a projected pattern (P) on the work piece 3. The projected pattern (P) appears to be a distortion of the predefined pattern from viewpoints other than that of the light source 43 due to the three-dimensionally varying surface of the work piece 3. The predefined pattern may consist of multiple stripes, or be a single light stripe, in which case the light source 43 is configured to scan the work piece 3 with the single-stripe structured light beam 431, but implementation of the predefined pattern of the structured light beam 431 is not limited to the disclosure herein and may vary in other embodiments. The light source 43 may include one of a laser projector and a digital light processing (DLP) projector, but implementation is not limited to the disclosure herein and may vary in other embodiments. In this embodiment, the light source 43 includes a DLP projector for emitting the structured light beam 431 with the predefined pattern consisting of multiple stripes and covering the whole work piece 3, and an angle of projection of the structured light beam 431 emitted by the DLP projector is adjustable without having to physically move the light source 43 around.
The image capturing device 44 is configured to obtain an image 92 of the projected pattern (P) formed on the work piece 3, and to output image data (O) representing the image 92 of the projected pattern (P) to the processor 5 so as to enable the processor 5 to obtain the image 92 of the projected pattern (P) from the image data (O).
The processor 5 is configured to calculate spatial coordinates of each element (e.g., each stripe) of the projected pattern (P) in the image 92 based on a degree of deformation of the projected pattern (P) in the image 92 (with respect to the non-distorted predefined pattern of the structured light beam 431) and a positional relationship among the light source 43, the image capturing device 44 and the work piece 3, and to establish the work piece model 3′ as shown in
It should be noted that since implementation of establishing the work piece model 3′ in the three-dimensional space based on technologies of structured light (e.g., structured light 3D scanning) has been well known to one skilled in the relevant art, detailed explanation of the same is omitted herein for the sake of brevity.
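Again for illustration only, the following sketch triangulates a single-stripe structured light beam, assuming a camera with known intrinsic matrix K and a calibrated light plane n·X = d in camera coordinates; the calibration files and the brightest-pixel stripe detection are hypothetical simplifications of structured light 3D scanning.

```python
import cv2
import numpy as np

# K is the camera intrinsic matrix; (n, d) describe the light plane n . X = d
# in camera coordinates. Both come from a prior calibration and are assumed
# placeholder files here.
K = np.load("camera_K.npy")
n, d = np.load("light_plane_n.npy"), float(np.load("light_plane_d.npy"))

img = cv2.imread("projected_pattern.png", cv2.IMREAD_GRAYSCALE)
K_inv = np.linalg.inv(K)

points = []
for v in range(img.shape[0]):
    row = img[v]
    u = int(np.argmax(row))        # brightest pixel: the stripe of the projected pattern P
    if row[u] < 128:               # skip rows where the stripe is absent
        continue
    ray = K_inv @ np.array([u, v, 1.0])  # back-projected viewing ray of pixel (u, v)
    t = d / (n @ ray)                    # ray-plane intersection parameter
    points.append(t * ray)               # spatial coordinates of this element of the pattern

point_cloud = np.array(points)           # sampled surface of the work piece 3
```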
Next, a second embodiment of the method according to the disclosure is discussed. The second embodiment of the method is to be implemented by the processor 5 of the second embodiment of the system that is previously described. As shown in the accompanying drawings, the second embodiment of the method differs from the first embodiment mainly in the model establishment procedure 6.
Referring to the accompanying drawings, the model establishment procedure 6 of the second embodiment includes the following steps.
In step 611, the processor 5 controls the light source 43 to emit the structured light beam 431 onto the upper 32 to form a projected pattern (P) on the work piece 3.
In step 612, the processor 5 controls the image capturing device 44 to capture an image 92 of the projected pattern (P) formed on the work piece 3, and to output image data (O) representing the image 92 of the projected pattern (P) to the processor 5.
In step 613, the processor 5 obtains, from the image data (O), the image 92 of the projected pattern (P) formed on the work piece 3.
In step 614, the processor 5 calculates spatial coordinates of each element of the projected pattern (P) based on a degree of deformation of the projected pattern (P) in the image 92 and the positional relationship among the light source 43, the image capturing device 44 and the work piece 3.
In step 615, the processor 5 establishes the work piece model 3′ based on the spatial coordinates of the elements of the projected pattern (P). Subsequently, the inspection procedure 7 that is previously described is performed.
To sum up, the system and the method for inspecting an upper of a shoe according to the disclosure utilize an optical sub-system to capture an image related to a work piece that includes the upper and a last, and utilize a processor to establish a work piece model based on the image, to obtain a cross section of the work piece model, and to generate a result of inspection based on a comparison between an entry of section data related to the cross section and an entry of predetermined standard data. Therefore, when the result of inspection indicates that the upper is abnormal, the position of the upper relative to the last can be adjusted in time, reducing the failure rate of the step of pulling the upper around the last and improving the quality of shoe production. Accordingly, materials of the upper may be used efficiently. Moreover, inspection of the upper performed by the system and the method according to the disclosure is automatic and more efficient than the conventional manual approach, and may generate a result of inspection that is more consistent and precise than that of the conventional manual approach. Consequently, labor cost is reduced and quality control may be enhanced.
In the description above, for the purposes of explanation, numerous specific details have been set forth in order to provide a thorough understanding of the embodiments. It will be apparent, however, to one skilled in the art, that one or more other embodiments may be practiced without some of these specific details. It should also be appreciated that reference throughout this specification to “one embodiment,” “an embodiment,” an embodiment with an indication of an ordinal number and so forth means that a particular feature, structure, or characteristic may be included in the practice of the disclosure. It should be further appreciated that in the description, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of various inventive aspects, and that one or more features or specific details from one embodiment may be practiced together with one or more features or specific details from another embodiment, where appropriate, in the practice of the disclosure.
While the disclosure has been described in connection with what are considered the exemplary embodiments, it is understood that this disclosure is not limited to the disclosed embodiments but is intended to cover various arrangements included within the spirit and scope of the broadest interpretation so as to encompass all such modifications and equivalent arrangements.
Number | Date | Country | Kind
---|---|---|---
107140428 | Nov. 14, 2018 | TW | national