The present invention relates to an inspection program, an information processing apparatus, and an inspection method.
As a technique for inspecting various members manufactured at a manufacturing site and structures or the like assembled from those members, there has been known, for example, a technique that images an inspection object and superimposes and displays three-dimensional CAD data (a three-dimensional image) of the inspection object on the captured image by using an augmented reality (AR) technique. According to this inspection technique, errors in a dimension, an angle, and the like can be inspected by measuring a deviation amount between the inspection object and the three-dimensional CAD data or the like.
Patent Document 1: Japanese Laid-open Patent Publication No. 2017-091078, Patent Document 2: Japanese Laid-open Patent Publication No. 2020-003995.
According to an aspect of the embodiments, a non-transitory computer-readable storage medium stores an inspection program that causes at least one computer to execute a process, the process including: acquiring a captured image that includes an inspection object to which a jig is coupled, the jig providing a feature line that has a certain positional relationship with the inspection object; acquiring a three-dimensional image of a state where the jig is coupled to the inspection object; detecting a plurality of feature lines from the captured image; and displaying the three-dimensional image superimposed on the inspection object which is included in the captured image and to which the jig is coupled, by using the plurality of feature lines detected from the captured image and line segments of the three-dimensional image which correspond to the plurality of feature lines.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention.
Here, to superimpose and display the three-dimensional CAD data by the AR technique, it is necessary to detect appropriate feature lines (for example, feature lines that are three-dimensionally distributed and have a sufficient length) from a captured image including the inspection object.
However, depending on the shape and the imaging environment of the inspection object, there may be a situation where the appropriate feature lines cannot be detected and the three-dimensional CAD data cannot be superimposed and displayed.
In one aspect, an inspection program, an information processing apparatus, and an inspection method are provided that superimpose and display a three-dimensional image on an inspection object included in a captured image and inspect the inspection object.
An inspection program, an information processing apparatus, and an inspection method that superimpose and display a three-dimensional image on an inspection object included in a captured image and inspect the inspection object can be provided.
First, a relationship between superimposed display of an inspection object and three-dimensional CAD data performed by an information processing apparatus according to each embodiment below and deviation information indicating a deviation amount between the inspection object and the three-dimensional CAD data or the like will be briefly described with reference to
As illustrated in 12a of
Of these, 12b of
Specifically, 12b of
In the case of the superimposition method illustrated in 12b of
Therefore, an inspector can easily recognize that the inspection object 1210 and the three-dimensional image 1220 deviate from each other as the deviation information. However, it is not possible for the inspector to recognize a cause of the deviation, for example, which part of the inspection object 1210 has a defect that causes the deviation, as the deviation information.
In contrast, the information processing apparatus according to each embodiment below specifies a reference line (one of edge lines) or a reference point (a point on the edge line) defined by a manufacturer for the inspection object 1210 and performs superimposed display so that the reference line (or the reference point) is matched.
12c of
In this way, in the case where superimposed display is performed so that the reference line is matched, the inspector compares each edge line of the inspection object 1210 other than the reference line with the corresponding line segment among the line segments forming the three-dimensional image 1220. As a result, the inspector can recognize, as the deviation information, the deviation amount and the cause of the deviation in addition to whether or not the deviation occurs.
Hereinafter, each embodiment will be described with reference to the accompanying drawings. However, in this specification and the drawings regarding each embodiment below, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
<Application Example of Information Processing Apparatus>
First, an application example of an information processing apparatus according to a first embodiment will be described.
As illustrated in
Subsequently, the manufacturer or the like manufactures various members included in the product based on the three-dimensional CAD data in a manufacturing process, and assembles the manufactured various members to generate a structure in a member assembly process. Subsequently, the manufacturer welds the structures together in a welding process and executes finishing processing in a finishing process, and thereby completes the product. Thereafter, the manufacturer or the like ships the completed product in a shipping process.
Here, an information processing apparatus (for example, a tablet terminal) 110 according to the first embodiment acquires captured image data 122 obtained by imaging an inspection object (various members, a structure, or the like) in each process, the captured image data 122 including an “inspection object to which an inspection jig is coupled”. Furthermore, the information processing apparatus 110 according to the first embodiment displays, to an inspector 130, superimposed data 123 in which corresponding coupled three-dimensional CAD data 121 is superimposed on the acquired captured image data 122.
The coupled three-dimensional CAD data here is three-dimensional CAD data (a three-dimensional image) of a state where the inspection jig is coupled to the inspection object and is generated based on three-dimensional CAD data of the inspection jig and three-dimensional CAD data of the inspection object.
As a result, the inspector 130 can inspect whether or not the inspection object in each process matches design content.
<Hardware Configuration of Information Processing Apparatus>
Next, a hardware configuration of the information processing apparatus 110 will be described.
The information processing apparatus 110 includes a central processing unit (CPU) 201, a read only memory (ROM) 202, and a random access memory (RAM) 203. Furthermore, the information processing apparatus 110 includes an auxiliary storage device 204, a UI device 205, an image capturing device 206, a communication device 207, and a drive device 208. Note that the individual pieces of hardware of the information processing apparatus 110 are connected to each other via a bus 209.
The CPU 201 is a device that executes various programs (for example, an inspection program to be described below or the like) installed in the auxiliary storage device 204.
The ROM 202 is a nonvolatile memory. The ROM 202 functions as a main storage device that stores various programs, data, and the like needed for the CPU 201 to execute the various programs installed in the auxiliary storage device 204. Specifically, the ROM 202 functions as a main storage device that stores, for example, a boot program such as a basic input/output system (BIOS) or an extensible firmware interface (EFI).
The RAM 203 is a volatile memory such as a dynamic random access memory (DRAM) or a static random access memory (SRAM). The RAM 203 functions as a main storage device that provides a work area expanded when the various programs installed in the auxiliary storage device 204 are executed by the CPU 201.
The auxiliary storage device 204 is an auxiliary storage device that stores various programs and information used when the various programs are executed. For example, a three-dimensional CAD data storage unit (to be described below) that stores coupled three-dimensional CAD data 121 or the like is implemented by the auxiliary storage device 204.
The UI device 205 provides the inspector 130 with an inspection screen to display the coupled three-dimensional CAD data 121, the captured image data 122, and the superimposed data 123. Furthermore, the UI device 205 receives various instructions from the inspector 130 via the inspection screen.
The image capturing device 206 is an imaging device that images the inspection object to which the inspection jig is coupled in each process and generates the captured image data 122. The communication device 207 is a communication device that is connected to a network and performs communication.
The drive device 208 is a device to which a recording medium 210 is set. The recording medium 210 mentioned here includes a medium that optically, electrically, or magnetically records information, such as a compact disc read only memory (CD-ROM), a flexible disk, or a magneto-optical disk. Alternatively, the recording medium 210 may include a semiconductor memory or the like that electrically records information, such as a ROM or a flash memory.
Note that the various programs to be installed in the auxiliary storage device 204 are installed, for example, when the distributed recording medium 210 is set to the drive device 208, and the various programs recorded in the recording medium 210 are read by the drive device 208. Alternatively, the various programs to be installed in the auxiliary storage device 204 may be installed by being downloaded from a network via the communication device 207.
<Functional Configuration of Information Processing Apparatus>
Next, a functional configuration of the information processing apparatus 110 will be described.
The inspection unit 300 includes a coupled three-dimensional CAD data generation unit 301, a coupled three-dimensional CAD data acquisition unit 302, a line segment identification unit 303, a captured image data acquisition unit 311, a feature line detection unit 312, a superimposition unit 321, a specifying unit 322, and a display unit 323.
The coupled three-dimensional CAD data generation unit 301 reads three-dimensional CAD data of the inspection jig and three-dimensional CAD data of the inspection object stored in a three-dimensional CAD data storage unit 330 in advance. Furthermore, the coupled three-dimensional CAD data generation unit 301 generates three-dimensional CAD data of a state where the inspection jig is coupled to the inspection object based on the read three-dimensional CAD data and stores the generated data in the three-dimensional CAD data storage unit 330 as coupled three-dimensional CAD data.
The coupled three-dimensional CAD data acquisition unit 302 is an example of a second acquisition unit and reads the coupled three-dimensional CAD data 121 (coupled three-dimensional CAD data corresponding to the inspection object to which the inspection jig is coupled) specified by the inspector 130 from the three-dimensional CAD data storage unit 330.
The line segment identification unit 303 identifies each line segment in the read coupled three-dimensional CAD data 121. Moreover, the line segment identification unit 303 notifies the superimposition unit 321 of the read coupled three-dimensional CAD data (including information regarding each identified line segment).
The captured image data acquisition unit 311 is an example of a first acquisition unit. Based on an instruction from the inspector 130, the captured image data acquisition unit 311 controls the image capturing device 206 to image the inspection object to which the inspection jig is coupled in each process, and acquires the captured image data 122.
The feature line detection unit 312 is an example of a detection unit and detects an edge line from the captured image data 122. Specifically, the feature line detection unit 312 detects an edge line of the inspection object to which the inspection jig is coupled, the inspection object being included in the captured image data. Furthermore, the feature line detection unit 312 notifies the superimposition unit 321 of the acquired captured image data 122 (including information regarding the detected edge line).
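The embodiment does not specify which edge detection operator the feature line detection unit 312 uses. As a minimal, self-contained illustration, the following sketch marks edge pixels by thresholding the image gradient magnitude; a practical implementation would typically use an operator such as Canny followed by a straight-line detector (for example, a Hough transform). The function name and threshold value are illustrative assumptions.

```python
import numpy as np

def detect_edge_pixels(image, threshold=0.4):
    """Return a boolean mask of edge pixels based on gradient magnitude.

    Minimal stand-in for the feature line detection unit 312; the
    threshold and operator are illustrative assumptions.
    """
    gy, gx = np.gradient(image.astype(float))  # gradients along rows, columns
    magnitude = np.hypot(gx, gy)
    return magnitude > threshold

# Synthetic "captured image": dark background with a bright square object.
img = np.zeros((16, 16))
img[4:12, 4:12] = 1.0
edges = detect_edge_pixels(img)
```

In an actual system, the resulting edge pixels would then be grouped into straight edge lines before being passed to the superimposition unit 321.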
The superimposition unit 321 notifies the display unit 323 of the notified coupled three-dimensional CAD data 121 and captured image data 122 so as to display the coupled three-dimensional CAD data 121 and the captured image data 122 to the inspector 130 via the UI device 205.
Furthermore, in response to the display of the coupled three-dimensional CAD data 121 and the captured image data 122 via the UI device 205, the superimposition unit 321 receives notifications regarding
Furthermore, the superimposition unit 321 estimates the position and posture of the image capturing device 206 based on the specification of the reference line, the specification of the corresponding line segment, the specification of the edge line other than the reference line, and the specification of the corresponding line segment. Furthermore, the superimposition unit 321 scales, moves, or rotates the coupled three-dimensional CAD data based on the estimated position and posture of the image capturing device 206 and superimposes the coupled three-dimensional CAD data on the captured image data 122 to generate superimposed data 123. Moreover, the superimposition unit 321 notifies the display unit 323 of the generated superimposed data 123.
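Estimating the position and posture of the image capturing device 206 from line correspondences is, in general, a three-dimensional pose estimation problem. As a simplified, self-contained sketch of the idea of fitting a scaling, rotation, and translation to specified correspondences, the following estimates a 2D similarity transform from corresponding endpoints in least squares. This is an assumption-laden stand-in, not the embodiment's actual 3D method, and all names are illustrative.

```python
import numpy as np

def fit_similarity_2d(src, dst):
    """Least-squares 2D similarity transform (scale, rotation, translation)
    mapping src points onto dst points, via the complex-number formulation.
    Simplified 2D stand-in for the camera pose estimation of the
    superimposition unit 321."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    mu_s, mu_d = src.mean(0), dst.mean(0)
    s_c, d_c = src - mu_s, dst - mu_d
    # Represent each centered point (x, y) as the complex number x + iy.
    zs = s_c[:, 0] + 1j * s_c[:, 1]
    zd = d_c[:, 0] + 1j * d_c[:, 1]
    a = (np.conj(zs) @ zd) / (np.conj(zs) @ zs)   # a = scale * e^{i*theta}
    scale, theta = abs(a), np.angle(a)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    t = mu_d - scale * R @ mu_s
    return scale, R, t
```

For example, mapping the unit square onto a version scaled by 2, rotated 90 degrees, and translated recovers exactly those parameters.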
In a case where the inspector 130 specifies the reference line, the corresponding line segment, the edge line other than the reference line, and the corresponding line segment, the specifying unit 322 notifies the superimposition unit 321 of the specified lines and line segments in response to the specification.
The display unit 323 generates and displays an inspection screen that displays the coupled three-dimensional CAD data 121, the captured image data 122, and the superimposed data 123 that have been notified from the superimposition unit 321. Note that it is assumed that the inspection screen generated by the display unit 323 includes operation buttons used to specify the reference line, the corresponding line segment, the edge line other than the reference line, and the corresponding line segment.
<Coupling Example in a Case Where Inspection Jig Is Coupled to Inspection Object and Edge Line Detection Example>
Next, a coupling example in a case where the inspection jig is coupled to the inspection object and a detection example of an edge line detected from the captured image data 122 including the inspection object to which the inspection jig is coupled, will be described.
As illustrated in
Furthermore, as illustrated in
However, the shape of the inspection jig 420 is not limited to the rectangular parallelepiped shape and may be a shape other than the rectangular parallelepiped shape as long as the inspection jig 420 can provide the edge lines described above.
Furthermore, as illustrated in
Moreover, as illustrated in
<Specific Example of Coupled Three-Dimensional CAD Data and Identification Example of Line Segment>
Next, a specific example of the coupled three-dimensional CAD data and an identification example of a line segment forming the coupled three-dimensional CAD data will be described.
In
Furthermore, in
As illustrated in
Furthermore, as illustrated in
<Specific Example of Inspection Screen>
Next, a specific example of an inspection screen in a case where the inspection object 430 is inspected by superimposing and displaying the coupled three-dimensional CAD data 121 on the inspection object 430, to which the inspection jig 420 is coupled, included in the captured image data 122 will be described.
Of these, the upper part of
are displayed on an inspection screen 600. Note that the coupled three-dimensional CAD data 121 is displayed to have a posture similar to a posture of the inspection object 430 included in the captured image data 122. At this time, a hidden line of the coupled three-dimensional CAD data 121 is deleted.
Furthermore, the lower part of
Furthermore, the upper part of
Moreover, the lower part of
The example in the lower part of
Note that a return button 605 on each of the inspection screens 600 in
<Flow of Inspection Processing>
Next, a flow of inspection processing executed by the inspection unit 300 of the information processing apparatus 110 according to the first embodiment will be described.
In step S701, the coupled three-dimensional CAD data generation unit 301 generates coupled three-dimensional CAD data of a state where the inspection jig is coupled to the inspection object and stores the generated data in the three-dimensional CAD data storage unit 330.
In step S702, the captured image data acquisition unit 311 images the inspection object to which the inspection jig is coupled and acquires captured image data. Furthermore, the display unit 323 displays the acquired captured image data on the inspection screen.
In step S703, a feature line detection unit 312 detects a feature line (an edge line) from the acquired captured image data.
In step S704, the coupled three-dimensional CAD data acquisition unit 302 reads coupled three-dimensional CAD data specified by the inspector 130 that is coupled three-dimensional CAD data corresponding to the inspection object included in the captured image data from the three-dimensional CAD data storage unit 330. Furthermore, the display unit 323 displays the read coupled three-dimensional CAD data on the inspection screen. Moreover, the line segment identification unit 303 identifies each line segment forming the read coupled three-dimensional CAD data.
In step S705, when the inspector 130 selects a feature line and a corresponding line segment, the specifying unit 322 receives the selected line and line segment. Specifically, first, the specifying unit 322 receives specification of a reference line in the captured image data. Next, the specifying unit 322 receives specification of a line segment corresponding to the reference line in the coupled three-dimensional CAD data. Next, the specifying unit 322 receives specification of an edge line other than the reference line in the captured image data. Next, the specifying unit 322 receives specification of a line segment corresponding to the edge line other than the reference line in the coupled three-dimensional CAD data.
In step S706, the superimposition unit 321 estimates a position and a posture of the image capturing device 206 when the inspection object to which the inspection jig is coupled is imaged (refer to Reference Documents 1 and 2).
In step S707, the superimposition unit 321 scales, moves, or rotates the coupled three-dimensional CAD data based on the estimated position and posture of the image capturing device 206. As a result, the superimposition unit 321 superimposes the coupled three-dimensional CAD data after the scaling, movement, or rotation on the inspection object to which the inspection jig is coupled, the inspection object being included in the captured image data, to generate superimposed data. Furthermore, the display unit 323 displays the generated superimposed data on the inspection screen.
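Under the same simplifying assumption of a 2D similarity transform, applying the estimated scaling, movement, and rotation to the identified line segments before drawing them over the captured image data could look like the following sketch (the function name and data layout are illustrative assumptions).

```python
import numpy as np

def transform_segments(segments, scale, R, t):
    """Apply an estimated scale/rotation/translation to CAD line segments,
    each given as a pair of 2D endpoints, so they can be drawn over the
    captured image data. Sketch of the superimposition in step S707 under
    a simplified 2D model."""
    R = np.asarray(R, float)
    t = np.asarray(t, float)
    out = []
    for a, b in segments:
        out.append((tuple(scale * R @ np.asarray(a, float) + t),
                    tuple(scale * R @ np.asarray(b, float) + t)))
    return out
```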
In step S708, the superimposition unit 321 visualizes a deviation amount between the inspection object and the coupled three-dimensional CAD data in the superimposed data and presents the calculated error, and thereafter, terminates the inspection processing.
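The embodiment does not specify how the deviation amount in step S708 is quantified. One plausible measure is the mean perpendicular distance between sampled points on a detected edge line and the line through the corresponding CAD line segment, sketched below; the metric and function names are assumptions.

```python
import numpy as np

def point_to_line_distance(p, a, b):
    """Perpendicular distance from point p to the infinite line through a and b."""
    p, a, b = (np.asarray(v, float) for v in (p, a, b))
    d = b - a
    # 2D cross product gives twice the signed triangle area; divide by base length.
    cross = d[0] * (p - a)[1] - d[1] * (p - a)[0]
    return abs(cross) / np.linalg.norm(d)

def mean_deviation(edge_points, seg_a, seg_b):
    """Mean perpendicular distance of sampled edge points from a CAD line
    segment: one plausible deviation measure (an assumption; the actual
    metric used in the embodiment is not specified)."""
    return float(np.mean([point_to_line_distance(p, seg_a, seg_b)
                          for p in edge_points]))
```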
As is obvious from the above description, the information processing apparatus according to the first embodiment acquires the captured image data including the inspection object to which the jig is coupled, the jig being capable of providing the edge line having a predetermined positional relationship with the reference line of the inspection object. Furthermore, the information processing apparatus according to the first embodiment acquires the coupled three-dimensional CAD data of a state where the jig is coupled to the inspection object. Furthermore, the information processing apparatus according to the first embodiment detects a plurality of edge lines from the acquired captured image data. Moreover, the information processing apparatus according to the first embodiment superimposes and displays the coupled three-dimensional CAD data on the inspection object to which the jig is coupled, the inspection object being included in the captured image data, using the plurality of edge lines detected from the captured image data and the corresponding line segments in the coupled three-dimensional CAD data.
In this way, according to the information processing apparatus according to the first embodiment, it is possible to detect the plurality of edge lines that
As a result, according to the first embodiment, it is possible to superimpose and display the three-dimensional CAD data on the inspection object included in the captured image data and inspect the inspection object regardless of the imaging environment of the inspection object.
In the above-described first embodiment, a case where the inspection jig has a rectangular parallelepiped shape has been described. However, the shape of the inspection jig is not limited to a rectangular parallelepiped shape and may be a shape suitable for a shape of an inspection object. Hereinafter, in a second embodiment, jigs having various shapes suitable for the shape of the inspection object will be described.
<Example of Inspection Jig Suitable for Two-Hole Inspection Object>
First, an inspection jig suitable for an inspection object in which two machined holes are provided will be described.
As illustrated in
Of these, the cylindrical portion 802 is fixed to the base portion 803. On the other hand, the cylindrical portion 801 is attached to the base portion 803 to be movable in a direction of an arrow 811. As a result, a distance between the cylindrical portion 801 and the cylindrical portion 802 can be changed according to a distance between the two machined holes provided in the inspection object.
Furthermore, the cross portion 804 is formed on the plane of the base portion 803 by combining two rectangular parallelepipeds that are parallel to the x axis and the y axis, respectively, of the x axis, the y axis, and the z axis defined as illustrated in
Because machining accuracy of the positions of the machined holes provided in the inspection object is generally high, in the case of an inspection jig 800 in which the cylindrical portions 801 and 802 are respectively fitted into the two machined holes, it is possible to position the cross portion 804 with respect to the reference line with high accuracy.
Note that it is assumed that the plurality of rectangular parallelepipeds forming the cross portion 804 have a sufficient length so as not to interfere with the inspection object when the inspection jig 800 is coupled to the inspection object.
Furthermore, in the example in
<Example of Inspection Jig Suitable for One-Hole Inspection Object>
Next, an inspection jig suitable for an inspection object in which a single machined hole is provided will be described.
As illustrated in
Of these, the cylindrical portion 901 is fixed to the base portion 902. Furthermore, in a case where the cylindrical portion 901 is fitted into the single machined hole, the base portion 902 plays a role for avoiding a wobbling movement of the inspection jig 900 with respect to the inspection object.
The cylindrical portion 903 has the same axis as the cylindrical portion 901 and provides two boundary lines (an example of feature lines) when the cylindrical portion 901 is fitted into the single machined hole provided in the inspection object. Note that, because the cylindrical portion 903 has a cylindrical shape, the two boundary lines are constantly detected at the same interval regardless of an imaging direction around the axis of the machined hole. Therefore, by calculating the center line of the two boundary lines, it is possible to use the center line as the feature line.
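Calculating the center line of the two detected boundary lines can be sketched as taking midpoints of corresponding endpoints, assuming the two lines are supplied with consistently ordered endpoints along the cylinder axis (the function name and representation are illustrative assumptions).

```python
import numpy as np

def center_line(line1, line2):
    """Return the center line of the two silhouette boundary lines of the
    cylindrical portion, each line given as a pair of endpoints ordered
    consistently along the cylinder axis."""
    (a1, b1), (a2, b2) = line1, line2
    def mid(p, q):
        return tuple((np.asarray(p, float) + np.asarray(q, float)) / 2)
    return mid(a1, a2), mid(b1, b2)
```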
Note that, because the machining accuracy of the machined hole provided in the inspection object is generally high, in the case of the inspection jig 900 including the cylindrical portion 903 that has the same axis as the cylindrical portion 901 fitted into the single machined hole, it is possible to position the cylindrical portion 903 with respect to the reference line with high accuracy.
<Example of Inspection Jig Suitable for Non-Hole Inspection Object>
Next, an inspection jig suitable for an inspection object with no machined hole will be described.
As illustrated in
Of these, a magnet is attached on a rear surface of the base portion 1001 so that the base portion 1001 is surface-bonded to the plane of the inspection object.
Furthermore, the cross portion 1002 is formed on the plane of the base portion 1001 by combining two rectangular parallelepipeds that are parallel to the x axis and the y axis, respectively, of the x axis, the y axis, and the z axis defined as illustrated in
Note that it is assumed that the plurality of rectangular parallelepipeds forming the cross portion 1002 have a sufficient length so as not to interfere with the inspection object when the inspection jig 1000 is surface-bonded to the inspection object.
Furthermore, in the example in
Note that, in the case of the inspection jig 1000, it is difficult to position the cross portion 1002 with respect to the reference line of the inspection object with high accuracy. Therefore, in the case of the inspection jig 1000, for example, the inspection jig 1000 is used to inspect a deviation amount of an angle of the inspection object with respect to the coupled three-dimensional CAD data (for example, a deviation amount in a normal direction of the plane of the inspection object).
<Flow of Inspection Processing>
Next, a flow of inspection processing executed by an inspection unit 300 of an information processing apparatus 110 according to the second embodiment will be described.
In step S1101, a coupled three-dimensional CAD data generation unit 301 generates coupled three-dimensional CAD data of a state where the inspection jig is coupled to the inspection object and stores the generated data in a three-dimensional CAD data storage unit 330.
In step S1102, a captured image data acquisition unit 311 images the inspection object to which the inspection jig is coupled and acquires captured image data. Furthermore, a display unit 323 displays the acquired captured image data on an inspection screen.
In step S1103, a feature line detection unit 312 detects a feature line (an edge line or a boundary line) from the acquired captured image data.
In step S1104, in a case where an inspector 130 selects a type of the jig coupled to the inspection object, a coupled three-dimensional CAD data acquisition unit 302 determines which type has been selected in response to the selection. Note that the inspector 130 selects a type of a jig according to the number of machined holes of the inspection object.
In a case where it is determined in step S1104 that a jig coupled to a one-hole inspection object is selected, the procedure proceeds to step S1111.
In step S1111, the coupled three-dimensional CAD data acquisition unit 302 reads coupled three-dimensional CAD data corresponding to the inspection object to which the selected jig (an inspection jig suitable for a one-hole inspection object) is coupled from the three-dimensional CAD data storage unit 330. Furthermore, a display unit 323 displays the read coupled three-dimensional CAD data on an inspection screen. Moreover, a line segment identification unit 303 identifies a line segment forming the read coupled three-dimensional CAD data.
On the other hand, in a case where it is determined in step S1104 that a jig coupled to a two-hole inspection object is selected, the procedure proceeds to step S1121.
In step S1121, the coupled three-dimensional CAD data acquisition unit 302 reads coupled three-dimensional CAD data corresponding to the inspection object to which the selected jig (an inspection jig suitable for a two-hole inspection object) is coupled from the three-dimensional CAD data storage unit 330. Furthermore, a display unit 323 displays the read coupled three-dimensional CAD data on an inspection screen. Moreover, the line segment identification unit 303 identifies a line segment forming the read coupled three-dimensional CAD data.
In step S1122, when the inspector 130 selects a feature line and a corresponding line segment, the specifying unit 322 receives the selected line and line segment. Specifically, first, the specifying unit 322 receives specification of a reference line in the captured image data. Next, the specifying unit 322 receives specification of a line segment corresponding to the reference line in the coupled three-dimensional CAD data. Furthermore, the specifying unit 322 receives specification of an edge line or a boundary line other than the reference line in the captured image data. Moreover, the specifying unit 322 receives specification of a line segment corresponding to the edge line or the boundary line other than the reference line in the coupled three-dimensional CAD data.
In step S1123, a superimposition unit 321 estimates a position and a posture of an image capturing device 206 when the inspection object to which the inspection jig is coupled is imaged.
In step S1124, the superimposition unit 321 scales, moves, or rotates the coupled three-dimensional CAD data based on the estimated position and posture of the image capturing device 206. As a result, the superimposition unit 321 superimposes the coupled three-dimensional CAD data after the scaling, movement, or rotation on the inspection object (the inspection object coupled to the inspection jig suitable for a one-hole or two-hole inspection object) included in the captured image data to generate superimposed data. Furthermore, the display unit 323 displays the generated superimposed data on the inspection screen.
In step S1125, the superimposition unit 321 visualizes a deviation amount between the coupled three-dimensional CAD data and the inspection object in the superimposed data and presents the calculated error.
On the other hand, in a case where it is determined in step S1104 that the inspection jig coupled to the non-hole inspection object is selected, the procedure proceeds to step S1131.
In step S1131, the coupled three-dimensional CAD data acquisition unit 302 reads coupled three-dimensional CAD data corresponding to the inspection object coupled to the selected jig (an inspection jig suitable for a non-hole inspection object) from the three-dimensional CAD data storage unit 330. Furthermore, a display unit 323 displays the read coupled three-dimensional CAD data on an inspection screen. Moreover, the line segment identification unit 303 identifies a line segment forming the read coupled three-dimensional CAD data.
In step S1132, when the inspector 130 selects a feature line and a corresponding line segment, the specifying unit 322 receives the selected line and line segment. Specifically, first, the specifying unit 322 receives specification of a reference line in the captured image data. Furthermore, the specifying unit 322 receives specification of a line segment corresponding to the reference line in the coupled three-dimensional CAD data. Furthermore, the specifying unit 322 receives specification of an edge line other than the reference line in the captured image data. Moreover, the specifying unit 322 receives specification of a line segment corresponding to the edge line other than the reference line in the coupled three-dimensional CAD data.
In step S1133, the superimposition unit 321 estimates a position and a posture of the image capturing device 206 when the inspection object to which the inspection jig is coupled is imaged.
In step S1134, the superimposition unit 321 scales, moves, or rotates the coupled three-dimensional CAD data based on the estimated position and posture of the image capturing device 206. As a result, the superimposition unit 321 superimposes the coupled three-dimensional CAD data after the scaling, movement, or rotation on the inspection object (the inspection object coupled to the inspection jig suitable for a non-hole inspection object) included in the captured image data to generate superimposed data. Furthermore, the display unit 323 displays the generated superimposed data on the inspection screen.
In step S1135, the superimposition unit 321 calculates the angle, in the z-axis direction (the normal direction of the plane in surface contact with the inspection jig suitable for a non-hole inspection object), of the edge line other than the reference line whose specification was received in step S1132.
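The angle calculation in step S1135 is a standard vector computation: the angle between the edge line's direction and the z axis taken as the contact-plane normal. A minimal sketch, in which the function and argument names are illustrative:

```python
import math

def angle_to_z_axis(edge_start, edge_end):
    """Angle (degrees) between an edge line and the z axis, taken here as
    the normal of the plane in surface contact with the jig.
    Endpoints are 3D coordinates; names are illustrative."""
    d = [e - s for s, e in zip(edge_start, edge_end)]
    norm = math.sqrt(sum(c * c for c in d))
    # cos(theta) = |d . z| / |d|, with z = (0, 0, 1)
    cos_theta = abs(d[2]) / norm
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))
```

An edge line parallel to the z axis yields 0 degrees; one lying in the contact plane yields 90 degrees, so the deviation of the angle from its CAD counterpart can be presented directly as the error.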
In step S1136, the superimposition unit 321 visualizes a deviation amount between the coupled three-dimensional CAD data and the inspection object (a deviation amount of an angle in the z axis direction of the edge line) in the superimposed data and presents the calculated error of the angle.
As is obvious from the above description, in the second embodiment, a plurality of jigs having shapes suitable for different inspection object shapes is prepared, and the appropriate jig is selected according to the shape of the inspection object. As a result, the information processing apparatus according to the second embodiment can detect a plurality of appropriate feature lines regardless of the shape of the inspection object.
As a result, according to the second embodiment, it is possible to superimpose and display the three-dimensional CAD data on the inspection object included in the captured image data and inspect the inspection object regardless of the shape of the inspection object.
In the above-described first embodiment, the description has been made while assuming that the inspector 130 presses the reference line specification button 602, the edge line specification button 603, or the like, and then specifies the reference line, the edge line other than the reference line, and the corresponding line segments. However, the method of specifying the reference line by the inspector 130 is not limited to this. For example, first, the plurality of edge lines and corresponding line segments may be specified, and a pair of the reference line and the corresponding line segment may be specified from among pairs of the plurality of specified edge lines and the corresponding line segments.
Furthermore, in the above-described first embodiment, the description has been made while assuming that the inspector 130 specifies the line segment corresponding to the edge line other than the reference line. However, the inspector 130 may specify only the edge line other than the reference line of the inspection object included in the captured image data 122, and the specifying unit 322 may automatically associate the corresponding line segment of the coupled three-dimensional CAD data 121 with that edge line. Alternatively, the edge line other than the reference line and the corresponding line segment may be automatically selected by, for example, the superimposition unit 321 from the captured image data 122 and the coupled three-dimensional CAD data 121, respectively, and associated with each other.
Furthermore, in the above-described first and second embodiments, the description has been made while assuming that the inspector 130 specifies the reference line of the inspection object included in the captured image data 122 and the line segment corresponding to the reference line in the coupled three-dimensional CAD data 121 so as to associate the reference line with the line segment. However, the inspector 130 may specify the reference line of the inspection object included in the captured image data 122, and may not specify the line segment corresponding to the reference line of the coupled three-dimensional CAD data 121. That is, the specifying unit 322 may automatically perform association (pairing) on the line segment corresponding to the reference line in the coupled three-dimensional CAD data 121 based on the specified reference line of the inspection object. In this case, the specifying unit 322 functions as a pairing unit.
Furthermore, in the above-described first and second embodiments, the description has been made while assuming that, at the time of superimposition and display, the coupled three-dimensional CAD data 121 is scaled, moved, or rotated and superimposed so that the reference lines match. However, at the time of superimposition and display, the reference lines do not need to match completely. For example, the coupled three-dimensional CAD data 121 may be scaled, moved, or rotated and superimposed so that the degree of coincidence between the reference lines is equal to or higher than a predetermined degree of coincidence.
Furthermore, in the above-described first and second embodiments, the description has been made while assuming that superimposed display is performed such that the reference lines match. However, superimposed display may instead be performed such that reference points match.
Furthermore, in the above-described first and second embodiments, the description has been made while assuming that the coupled three-dimensional CAD data is generated before the captured image data is acquired. However, the coupled three-dimensional CAD data may be generated after the captured image data is acquired.
In the above-described first embodiment, the description has been made while assuming that, in superimposing and displaying the coupled three-dimensional CAD data on the inspection object to which the inspection jig included in the captured image data is coupled, the inspector 130 specifies the reference line, the edge line other than the reference line, the corresponding line segments, and the like. In contrast, in a fourth embodiment, the operation load on an inspector 130 when performing superimposed display is reduced by reducing the items the inspector 130 must specify. Hereinafter, the fourth embodiment will be described focusing on differences from each of the above-described embodiments.
<Description of Inspection Jig>
First, an inspection jig used when an information processing apparatus according to the fourth embodiment executes inspection processing will be described.
As illustrated in
Furthermore, by arranging colors different from one another on the respective surfaces of the triangular dipyramid, a color region extraction unit, which will be described below, can extract surfaces in two or more different types of colors from the captured image data even in the case where the jig 1310 is imaged from an arbitrary position.
Note that, as in the above-described first embodiment, in a case where the shape of the jig is a rectangular parallelepiped, even if colors different from one another are arranged on the respective surfaces, the color region extraction unit to be described below cannot extract surfaces in two or more different types of colors from the captured image data when the image is captured from the front. In this case, the top and bottom and the right and left of the rectangular parallelepiped cannot be distinguished. In contrast, in the case of the jig 1310, even if the image is captured from an arbitrary position, the color region extraction unit to be described below can extract surfaces in two or more different types of colors, and thus can distinguish the top and bottom and the right and left.
Note that, in
An identifier for identifying each surface of the jig 1310 is stored in the "surface ID". As described above, the jig 1310 is formed as a triangular dipyramid and has six surfaces, so "1" to "6" are stored in the "surface ID".
The color arranged on each surface of the jig 1310 is stored in the “color”. The example of
The colors arranged on the surfaces adjacent to the surface identified by the corresponding “surface ID” are arranged and stored in counterclockwise order in the “adjacent colors”. In the case of a triangular dipyramid, the shape of each surface is a triangle and each surface is adjacent to three surfaces. Therefore, three types of colors are arranged and stored in the “adjacent colors”.
Furthermore, in
An identifier for identifying each side of the jig 1310 is stored in the “side ID”. Since the jig 1310 has nine sides, “1” to “9” are stored in the “side ID”.
The colors arranged on the two surfaces adjacent to the side identified by the corresponding “side ID” are stored in the “adjacent colors”.
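The two tables described above (the surface table and the side table of the jig 1310) can be modeled as plain dictionaries. The sketch below encodes only the color adjacencies that appear explicitly in this description; the dipyramid has six surfaces and nine sides in total, and the remaining entries are omitted rather than guessed. The helper name is illustrative:

```python
# Partial reconstruction of the jig 1310's tables, using only adjacencies
# stated in the description (the remaining entries are intentionally omitted).
SURFACE_ADJACENT_COLORS = {
    "light blue": ["red", "black", "blue"],    # counterclockwise order
    "blue": ["light blue", "green", "yellow"], # counterclockwise order
}

# side ID -> colors of the two surfaces adjacent to that side
SIDE_ADJACENT_COLORS = {
    1: {"blue", "yellow"},
    2: {"yellow", "red"},
    5: {"blue", "green"},
    6: {"red", "black"},
    7: {"light blue", "black"},
}

def side_id_for(color_a, color_b):
    """Look up the side ID from the unordered pair of adjacent colors."""
    pair = {color_a, color_b}
    for side_id, colors in SIDE_ADJACENT_COLORS.items():
        if colors == pair:
            return side_id
    return None  # pair not present in this partial table
```

With such a table, specifying the two colors adjacent to a detected edge line directly yields the side ID, which is the mechanism the corresponding pair determination unit relies on below.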
<Functional Configuration of Information Processing Apparatus>
Next, a functional configuration of an information processing apparatus according to the fourth embodiment will be described.
The line segment identification unit 1401 identifies each line segment in the read coupled three-dimensional CAD data. Note that the line segment identification unit 1401 identifies the line segment of each side of the jig 1310 in the coupled three-dimensional CAD data. At this time, the line segment identification unit 1401 acquires the colors arranged on the two surfaces adjacent to each identified line segment. Furthermore, the line segment identification unit 1401 notifies the corresponding pair determination unit 1421 of each identified line segment and the colors arranged on the two surfaces adjacent to each identified line segment.
The color region extraction unit 1411 extracts a color region from captured image data 122. Specifically, the color region extraction unit 1411 extracts a region in which the color of each surface of the inspection jig 1310 is arranged, included in the captured image data. Furthermore, the color region extraction unit 1411 notifies the feature line detection unit 1412 of the extracted color region.
The feature line detection unit 1412 extracts a contour side of each color region notified from the color region extraction unit 1411, and identifies a boundary line of each color of the triangular dipyramid and adjacent colors adjacent to the boundary line of each color. Note that, in the present embodiment, the contour side refers to a side of a polygonal contour formed by each color region in the captured image data 122.
Furthermore, the feature line detection unit 1412 extracts edge lines from the captured image data 122 and associates each extracted edge line with the boundary line of each color, thereby specifying adjacent colors adjacent to each edge line. Moreover, the feature line detection unit 1412 notifies the corresponding pair determination unit 1421 of each edge line and the adjacent colors adjacent to each edge line.
The corresponding pair determination unit 1421 specifies a corresponding pair of the line segment and the edge line based on
The superimposition unit 1422 estimates the position and posture of the image capturing device 206 based on the corresponding pair specified by the corresponding pair determination unit 1421.
By estimating the position and posture of the image capturing device 206, the superimposition unit 1422 scales, moves, or rotates the coupled three-dimensional CAD data 121 and automatically superimposes the coupled three-dimensional CAD data 121 on the captured image data 122 to generate superimposed data 123.
As described above, according to the information processing apparatus 110 according to the fourth embodiment, the inspector 130 does not need to specify
Note that, in the case of the information processing apparatus 110 according to the fourth embodiment, the reference line is not specified and thus superimposed display is performed such that the edge line of the jig 1310 matches the corresponding line segment, instead of performing superimposed display such that the reference line matches the corresponding line segment. Therefore, according to the fourth embodiment, the inspector 130 grasps a deviation amount and a cause of the deviation by comparing the edge line other than the edge line of the jig 1310 (that is, the edge line of an inspection object 1510) with the corresponding line segment. That is, in the fourth embodiment, the edge line of the jig 1310 serves as a reference line.
<Coupling Example in a Case where Inspection Jig is Coupled to Inspection Object and Edge Line Detection Example>
Next, a coupling example in a case where the inspection jig 1310 is coupled to the inspection object and a detection example of an edge line detected from the captured image data including the inspection object to which the inspection jig 1310 is coupled will be described.
As illustrated in
Furthermore, as illustrated in
Furthermore, as illustrated in
Note that the example of
<Specific Example of Coupled Three-Dimensional CAD Data and Identification Example of Line Segment>
Next, a specific example of the coupled three-dimensional CAD data and an identification example of a line segment forming the coupled three-dimensional CAD data will be described.
In
Furthermore, in
As illustrated in
Furthermore, as illustrated in
<Specific Example of Corresponding Pair Determination Processing>
Next, a specific example of corresponding pair determination processing by the color region extraction unit 1411, the feature line detection unit 1412, and the corresponding pair determination unit 1421 will be described.
As described above, when the captured image data 122 is notified from the captured image data acquisition unit 311, the color region extraction unit 1411 extracts the color regions (the regions where the colors of the surfaces of the inspection jig 1310 are arranged) included in the captured image data 122. Note that the color region extraction unit 1411 converts, for example, the pixel values (R value, G value, and B value) of each pixel of the captured image data 122 into HSV values and compares the converted values with table 1710. Thereby, the color region extraction unit 1411 identifies the color of each pixel of the captured image data 122 and extracts each cluster of pixels of the same color as a color region.
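The pixel classification performed by the color region extraction unit 1411 can be sketched with the standard library's `colorsys` module. The hue thresholds below are illustrative placeholders for table 1710, not values from the embodiment; a real implementation would calibrate them to the jig's paint and the lighting:

```python
import colorsys

# Illustrative hue ranges standing in for table 1710 (hue in [0, 1)).
HUE_RANGES = {
    "red":    (0.95, 0.05),  # wraps around hue 0
    "yellow": (0.12, 0.20),
    "green":  (0.28, 0.40),
    "blue":   (0.58, 0.70),
}

def classify_pixel(r, g, b):
    """Convert an RGB pixel (0-255 per channel) to HSV and name its color."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if v < 0.2:
        return "black"  # dark pixels classified by value, not hue
    for name, (lo, hi) in HUE_RANGES.items():
        if lo <= hi:
            if lo <= h <= hi:
                return name
        elif h >= lo or h <= hi:  # range wrapping around hue 0 (red)
            return name
    return None

def extract_color_regions(pixels):
    """Group pixel coordinates by identified color: a stand-in for the
    cluster extraction performed by the color region extraction unit 1411.
    pixels: dict mapping (x, y) -> (r, g, b)."""
    regions = {}
    for (x, y), rgb in pixels.items():
        color = classify_pixel(*rgb)
        if color is not None:
            regions.setdefault(color, []).append((x, y))
    return regions
```

Each returned cluster corresponds to one visible surface of the jig, from which the contour sides are then extracted.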
In
Furthermore, as described above, the feature line detection unit 1412 extracts the contour sides from each color region (reference numeral 1720) notified by the color region extraction unit 1411. Reference numeral 1740 represents the contour sides extracted from each color region (reference numeral 1720), and reference numeral 1750 represents details of the extracted contour sides.
As represented by reference numeral 1750, three contour sides are extracted from each of red, light blue, and blue regions. Note that any method can be used to extract the contour sides but the feature line detection unit 1412 extracts the contour sides according to the following procedure, for example.
Next, the feature line detection unit 1412 specifies the boundary line of each color region and the adjacent colors adjacent to the boundary line of each color. Specifically, the feature line detection unit 1412 first calculates coordinates of a start point and coordinates of an end point in the captured image data 122 of each contour side included in reference numeral 1750. Next, the feature line detection unit 1412 determines contour sides that match each other among the contour sides included in reference numeral 1750. Specifically, the feature line detection unit 1412 determines that two contour sides match in a case of satisfying both of the conditions:
Then, in the case of determining that the two contour sides match, the feature line detection unit 1412 registers the contour sides as the boundary line of corresponding adjacent colors, and registers the coordinates of the start point and the coordinates of the end point of the boundary line in the captured image data 122.
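The embodiment's exact match conditions are not reproduced above. One plausible test, stated here purely as an assumption, is that two contour sides from different color regions match when their endpoints nearly coincide in either orientation:

```python
def sides_match(side_a, side_b, tol=3.0):
    """Hypothetical match test: two contour sides are treated as the same
    physical boundary line when their endpoints nearly coincide, in either
    orientation. Each side is ((x1, y1), (x2, y2)); tol is in pixels."""
    def close(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= tol ** 2
    (a1, a2), (b1, b2) = side_a, side_b
    return (close(a1, b1) and close(a2, b2)) or (close(a1, b2) and close(a2, b1))
```

When the test succeeds, the matched pair is registered as the boundary line of the corresponding adjacent colors, together with its start and end coordinates.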
Reference numeral 1810 in
Meanwhile, reference numeral 1820 in
The feature line detection unit 1412 selects the edge line closest to each boundary line indicated by reference numeral 1810 from the table indicated by reference numeral 1830 and matches the lines. Any matching method can be used by the feature line detection unit 1412. For example, a slope and an intercept of the boundary line and a slope and an intercept of the edge line are parameterized, respectively, and the edge line with the closest slope and intercept is selected.
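The slope-and-intercept matching described above can be sketched as follows. The weights and function names are illustrative (a real implementation would normalize the two parameters to comparable scales, and vertical lines would need special handling):

```python
def line_params(p1, p2):
    """Slope and intercept of the line through two image points.
    (Vertical lines would need special handling; omitted for brevity.)"""
    slope = (p2[1] - p1[1]) / (p2[0] - p1[0])
    intercept = p1[1] - slope * p1[0]
    return slope, intercept

def closest_edge_line(boundary, edge_lines, w_slope=1.0, w_intercept=1.0):
    """Pick the edge line whose (slope, intercept) pair is nearest to the
    boundary line's, as described for the feature line detection unit 1412.
    boundary and each edge line are pairs of 2D endpoints."""
    bs, bi = line_params(*boundary)
    def distance(edge):
        es, ei = line_params(*edge)
        return w_slope * abs(bs - es) + w_intercept * abs(bi - ei)
    return min(edge_lines, key=distance)
```

The selected edge line inherits the boundary line's adjacent colors, which in turn determine its side ID.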
Reference numeral 1840 in
Note that, as described in the first embodiment, in principle, four pairs of line segments and edge lines are needed for superimposed display. Therefore, the feature line detection unit 1412 and the corresponding pair determination unit 1421 also match the edge lines with other contour sides and specify the adjacent colors to specify the side IDs.
Specifically, the feature line detection unit 1412 extracts the contour sides whose adjacent colors are not specified from among the contour sides indicated by reference numeral 1740, and searches for the edge lines corresponding to the respective contour sides (the edge lines other than the matched edge lines, of the edge lines of reference numeral 1830).
In the case of
As described above, "edge line 1" and "edge line 2" have already been matched with contour sides and their side IDs have been specified. For this reason, it is sufficient to specify the side IDs by matching the contour sides with two more of the edge lines; here, however, a case of matching all five edge lines with the contour sides to specify the side IDs will be described. Note that, in a case where only two of the five edge lines are matched with the contour sides to specify the side IDs, for example, the two longest edge lines are matched with the contour sides and their side IDs are specified.
Reference numeral 1850 in
Next, as illustrated in
Therefore, the corresponding pair determination unit 1421 refers to table 1320 (
That is, the adjacent colors of “boundary line 3 (red_line 2)” can be specified as red and black, and the adjacent colors of “boundary line 4 (red_line 3)” can be specified as yellow and red. Therefore, the corresponding pair determination unit 1421 can specify the side ID of “edge line 3” as “6” and the side ID of “edge line 4” as “2” by referring to table 1330 (see reference numeral 1910).
Similarly, “boundary line 5 (light blue_line 1)” is the contour side of the light blue surface, and according to table 1320, the colors of the surfaces adjacent to the light blue surface are red, black, and blue, counterclockwise. Here, according to reference numeral 1740 in
That is, the adjacent colors of “boundary line 5 (light blue_line 1)” can be specified as light blue and black. Therefore, the corresponding pair determination unit 1421 can specify the side ID of “edge line 5” as “7” by referring to table 1330 (see reference numeral 1910).
Similarly, “boundary line 6 (blue_line 3)” and “boundary line 7 (blue_line 1)” are the contour side of the blue surface, and according to table 1320, the colors of the surfaces adjacent to the blue surface are light blue, green, and yellow, counterclockwise. Here, according to reference numeral 1740 in
That is, the adjacent colors of “boundary line 6 (blue_line 3)” can be specified as blue and yellow, and the adjacent colors of “boundary line 7 (blue_line 1)” can be specified as blue and green. Therefore, the corresponding pair determination unit 1421 can specify the side ID of “edge line 6” as “1” and the side ID of “edge line 7” as “5” by referring to table 1330 (see reference numeral 1910).
<Specific Example of Inspection Screen>
Next, a specific example of an inspection screen in the information processing apparatus 110 according to the fourth embodiment will be described.
In
Moreover, the lower part of
As described above, while in the above-described first embodiment the inspector 130 needs to specify the reference line, the edge lines other than the reference line, the corresponding line segments, and the like in order to display the superimposed data 123, in the fourth embodiment only the superimposition button 604 needs to be pressed. That is, according to the fourth embodiment, it is possible to reduce the operation load on the inspector 130 when performing superimposed display.
<Flow of Inspection Processing>
Next, a flow of inspection processing executed by an inspection unit 1400 of the information processing apparatus 110 according to the fourth embodiment will be described.
Note that a difference from the first flowchart illustrated in
In step S2101, the color region extraction unit 1411 extracts the color region from the acquired captured image data, and extracts the contour side of each color region.
In step S2102, the feature line detection unit 1412 detects a feature line (edge line) from the acquired captured image data.
In step S2103, the feature line detection unit 1412 searches the detected feature lines (edge lines) for the feature line (edge line) corresponding to the extracted contour side. Furthermore, the corresponding pair determination unit 1421 specifies the colors of the surfaces adjacent to the contour side for which the feature line (edge line) has been searched.
In step S2104, the corresponding pair determination unit 1421 specifies each line segment of the coupled three-dimensional CAD data corresponding to each feature line (edge line) based on the adjacent colors.
As is clear from the above description, in the fourth embodiment, as information for identifying the line segment in the three-dimensional CAD data of the jig, color information of each of two surfaces adjacent to the line segment is associated. Furthermore, in the fourth embodiment, the colors of the two surfaces adjacent to the edge line detected from the captured image data are specified, and the coupled three-dimensional CAD data is superimposed and displayed based on the line segment associated with the color information matching the specified adjacent colors and the edge line.
As a result, according to the fourth embodiment, it is possible to obtain similar effects to the above-described first embodiment, and to reduce the operation load on the inspector 130 when performing superimposed display.
In the above-described fourth embodiment, the case where different colors (that is, six colors) are arranged on the respective surfaces of the jig 1310 has been described. Meanwhile, since one of the six surfaces of the jig 1310 is attached to the inspection object 1510, the color of that surface may be the same as the color of one of the other five surfaces. That is, the surfaces of the jig 1310 may be colored with six or fewer colors.
Note that the present invention is not limited to the configurations described here, and may include combinations of the configurations described in the above-described embodiments with other elements, and the like. These configurations can be changed without departing from the spirit of the present invention, and can be appropriately determined according to their application modes.
The present application claims the benefit of priority based on Japanese Patent Application No. 2020-128279, filed on Jul. 29, 2020, which is incorporated by reference herein in its entirety.
All examples and conditional language provided herein are intended for the pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventor to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although one or more embodiments of the present invention have been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.
This application is a continuation application of International Application PCT/JP2021/015291 filed on Apr. 13, 2021 and designated the U.S., the entire contents of which are incorporated herein by reference. The International Application PCT/JP2021/015291 is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2020-128279, filed on Jul. 29, 2020, the entire contents of which are incorporated herein by reference.
Parent application: PCT/JP2021/015291, Apr 2021, US. Child application: 18152652, US.