The present application claims priority under 35 U.S.C. § 119 to Japanese Patent Application No. 2019-050516, filed on Mar. 18, 2019, the contents of which are incorporated herein by reference in their entirety.
The present invention relates to a measuring device and a shaping device.
Shaping devices, also called three-dimensional printer devices (3D printer devices), that shape three-dimensional objects based on input data are currently known in the art. Also known are 3D printer devices that include a three-dimensional shape measurement function for acquiring the shape of the shaped object. Because the shape measurement should exert minimal effect on the shaping process, non-contact three-dimensional shape measurement functions are often utilized. Known non-contact three-dimensional shape measurement methods include the optical cutting method and the pattern projection method.
Japanese Unexamined Patent Application Publication No. 2017-15456 discloses a measurement system capable of simultaneously irradiating the object for measurement with a plurality of slit laser lights and measuring the three-dimensional coordinates with good efficiency and high accuracy.
However, when acquiring the shape of the object for shaping, the measurement system of Japanese Unexamined Patent Application Publication No. 2017-15456 must shorten the measurement time to avoid affecting the shaping process.
According to an aspect of the present invention, a measurement device is configured to measure a three-dimensional shape of an object for measurement. The measurement device includes a projector, an imager, an identifier, and a calculator. The projector is configured to project a plurality of light lines onto the object for measurement. The imager is configured to capture an image of the object for measurement on which the plurality of light lines are projected. The identifier is configured to identify a projection condition of the light lines based on shaping information of the object for measurement. The calculator is configured to calculate a plurality of line shapes from the image captured by the imager, based on the projection condition.
The accompanying drawings are intended to depict exemplary embodiments of the present invention and should not be interpreted to limit the scope thereof. Identical or similar reference numerals designate identical or similar components throughout the various drawings.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the present invention.
As used herein, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
In describing preferred embodiments illustrated in the drawings, specific terminology may be employed for the sake of clarity. However, the disclosure of this patent specification is not intended to be limited to the specific terminology so selected, and it is to be understood that each specific element includes all technical equivalents that have the same function, operate in a similar manner, and achieve a similar result.
An embodiment of the present invention will be described in detail below with reference to the drawings.
An object of an embodiment is to provide a shaping device and a measurement device capable of measuring the three-dimensional shape of the object for measurement at high speed and with high accuracy.
Embodiments of a 3D shaping system, serving as embodiments of the measurement device and the shaping device, are hereinafter described with reference to the accompanying drawings.
System Structure
First of all,
As illustrated in
The information processing terminal 150 may operate as a controller that controls the operation of the shaping device 100, or the functions of the information processing terminal 150 may be embedded into the shaping device 100. Namely, the shaping device 100 and the information processing terminal 150 may operate individually as physically separate devices, or may operate integrally as the shaping device 100.
As illustrated in
When shaping of one shaping layer is complete, the shaping device 100 lowers the shaping plate 330 by the height of one layer (the lamination pitch) along the z-axis direction. The shaping device 100 subsequently drives the shaper 210 in the same way as for the first layer to shape the second shaping layer. The shaping device 100 repeats these operations to laminate the layers and shape the 3D object for shaping.
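The layer-by-layer lamination described above can be sketched as a simple control loop. The function names below (`shape_object`, `shape_layer`) are hypothetical stand-ins for the commands driving the shaper 210 and the shaping plate 330, not the device's actual interface:

```python
def shape_object(layer_count, lamination_pitch):
    """Shape a 3D object by laminating layers one at a time.

    layer_count: number of shaping layers in the input shape data.
    lamination_pitch: height of one layer along the z axis (mm).
    Returns the total z-axis travel of the shaping plate (negative = lowered).
    """
    plate_z = 0.0
    for layer in range(layer_count):
        # Drive the shaper across the xy plane to shape this layer
        # (placeholder for the actual shaping commands).
        shape_layer(layer)
        # Lower the shaping plate by exactly one lamination pitch so the
        # next layer is shaped on top of the previous one.
        plate_z -= lamination_pitch
    return plate_z

def shape_layer(layer):
    pass  # hypothetical stand-in for the shaper 210 shaping commands
```

After ten layers at a 0.1 mm pitch, for example, the plate has been lowered by 1.0 mm in total.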
In the example described above, the shaping device 100 moves the shaper 210 along the xy plane and moves the shaping plate 330 along the z axis. However, this structure does not limit the present embodiment, and other structures may be utilized.
As illustrated in
An infrared sensor, a camera device, or a 3D measurement sensor (e.g., optical cutting profile sensor) may for example be utilized as the shape sensor 310. In other words, the shaping device 100 also functions as a scanning device. In the first embodiment, an example of the 3D measurement sensor (e.g., optical cutting profile sensor) utilized as the shape sensor 310 is described.
As subsequently described using
As illustrated in
Hardware Structure of the Information Processing Terminal
Function of the Information Processing Terminal
As illustrated in
The shaping controller 200 controls the shaping operation of the shaping device 100 based on the shape data. Specifically, the shaping controller 200 generates shaping commands based on the shape data and supplies the shaping commands to the shaper 210 of the shaping device 100. The shaping commands are commands stipulated by the shaper 210 and are generally commands that control the shaping process information for supplemental shaping.
The shape measurement controller 220 controls the shape sensor 310. Specifically, the shape measurement controller 220 first supplies irradiation commands to the projector 300 of the shape sensor 310 to irradiate light lines onto the object for measurement. The projector 300 generates light rays in a linear form and projects them onto the object for measurement. In the 3D shaping system 1 according to the first embodiment, as one example, a plurality of light lines are projected. The plurality of light lines may be produced by changing the irradiation angle of a single light line, by installing a plurality of light sources that each project a single light line, or by utilizing a planar projector function.
The shape measurement controller 220 supplies imaging commands to the imager 301 of the shape sensor 310 to capture images of the object for measurement on which the light lines are projected. The imager 301 captures images of the object for measurement on which the light lines are projected and supplies the image data to the shape data calculator 250. To capture the projected light lines in a single image, the shape measurement controller 220 generates imaging commands for the imager 301 that are synchronized with the irradiation commands to the projector 300.
The shape data calculator 250 calculates the shape data, and is one example of an identifier and a calculator. The identifier identifies projection conditions of the light lines based on shaping information of the object for measurement. The calculator calculates a plurality of line shapes from the image captured by the imager based on the projection conditions. The shape data calculator 250 identifies patterns of the light lines in the image data by utilizing the shape data and calculates, for example by the optical cutting method, 3D data (shape data) of the object for measurement.
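As a rough illustration of the optical cutting principle on which the shape data calculator relies, the height of a surface point can be recovered from the lateral displacement of the projected line in the captured image. The simple triangulation model below (fixed projection angle, uniform pixel pitch) is an assumption for illustration, not the device's actual calibration:

```python
import math

def height_from_line_shift(shift_px, pixel_pitch_mm, projection_angle_deg):
    """Optical cutting method, simplified: a light line projected at an
    oblique angle shifts sideways in the image in proportion to the
    surface height at that point.

    shift_px: observed displacement of the line from its zero-height position.
    pixel_pitch_mm: size of one image pixel on the measured surface.
    projection_angle_deg: angle between the projection and viewing directions.
    Returns the surface height in mm.
    """
    lateral_shift_mm = shift_px * pixel_pitch_mm
    # height = lateral shift / tan(angle) for this triangulation geometry
    return lateral_shift_mm / math.tan(math.radians(projection_angle_deg))
```

With a 45-degree projection angle, a 10-pixel shift at 0.1 mm per pixel corresponds to a height of about 1 mm.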
Calculation Operation for Shape Data
The imager 301 captures images of the object for measurement 320 on which the light lines 340, 341 are projected, and can in this way generate image data 350 as illustrated for example in
In contrast, in the 3D shaping system 1 according to the first embodiment, an image of the projected light lines 340, 341 is acquired and the distance data of the light lines 340, 341 is calculated in one batch. This calculation therefore requires identifying the irradiation conditions of the light lines from the projection patterns of the light lines contained in the image data 350. The shape data calculator 250 utilizes the shape data to identify the projection conditions of the light lines 340, 341 projected onto the object for measurement 320.
In other words, because the object for measurement 320 is an object for shaping that is shaped by the shaping device 100, the shape data calculator 250 can acquire the shape data for the ideal shape in advance of measurement. The shape data calculator 250 also calculates the projected line pattern image (predictive pattern data) that would be acquired when a light line is projected onto that ideal shape. The shape data calculator 250 further calculates the predictive pattern data while adding an "assumed range of shape error" determined by the shaping performance of the shaping device 100.
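The predictive pattern computation can be sketched as follows: from the ideal shape data, the expected image position of each point along a projected line is computed, then widened on both sides by the assumed range of shape error. The function name and the linear camera model are assumptions for illustration:

```python
def predictive_pattern(ideal_heights, baseline_px, px_per_mm, shape_error_mm):
    """Predict line positions (with tolerance band) in image space.

    ideal_heights: ideal surface height (mm) at each sample along the line,
        taken from the shaping data.
    baseline_px: image column of the line where the surface height is zero.
    px_per_mm: image shift per mm of surface height (triangulation gain).
    shape_error_mm: assumed range of shape error of the shaping device.
    Returns a list of (min_px, expected_px, max_px) tuples, one per sample.
    """
    band_px = shape_error_mm * px_per_mm
    pattern = []
    for h in ideal_heights:
        expected = baseline_px + h * px_per_mm
        # Widen the prediction by the assumed shaping error on both sides,
        # so a real (slightly imperfect) shaped surface still matches.
        pattern.append((expected - band_px, expected, expected + band_px))
    return pattern
```

A line observed inside a sample's band is consistent with that predicted projection; a line outside every band cannot belong to that projection condition.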
Identifying Operation for Projection Conditions of Light Line
The operation for separating the projection information of the light lines contained in the image data 350 and identifying the projection conditions of the light lines is described next.
By projecting the two light lines 340, 341 on the object for measurement 320, the predictive patterns 361, 362 for two projection conditions can be obtained as illustrated in
In other words, the light lines 340 projected onto the object for measurement 320 under the first projection condition observed in the area 361 illustrated in
In this way, because the object for measurement 320 is shaped by the shaping device 100 based on the shaping data, the projection conditions of the light lines 340, 341 projected onto the object for measurement 320 can be isolated using the shaping data. The shape data (z-direction height data) of the lines can therefore be obtained in one batch from one round of imaging data.
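The isolation step above can be sketched as matching each detected line position against the predicted band of each projection condition and attributing it to whichever prediction it falls inside. The one-dimensional matcher below is a hypothetical illustration, not the device's actual algorithm:

```python
def identify_projection_conditions(detected_px, predicted_bands):
    """Attribute each detected line to a projection condition.

    detected_px: observed image positions of the detected light lines.
    predicted_bands: {condition_name: (min_px, max_px)} derived from the
        predictive pattern data, already widened by the assumed shape error.
    Returns {condition_name: detected position} for every matched line.
    """
    assignment = {}
    for pos in detected_px:
        for condition, (lo, hi) in predicted_bands.items():
            # A line belongs to a condition when it falls inside that
            # condition's predicted tolerance band.
            if lo <= pos <= hi:
                assignment[condition] = pos
                break
    return assignment
```

Because every detected line is attributed in a single pass over one captured image, the per-line distances can then be computed in one batch.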
As is clear from the above description, the 3D shaping system 1 according to the first embodiment can measure two line shapes from one set of image data by using the light lines 340, 341, without having to install new hardware for identifying the projection conditions of the light lines. High-speed, low-cost shape measurement can therefore be achieved. The shape data calculator 250 can calculate the distance data of the light lines 340, 341 in one batch and can therefore shorten the measurement time.
The 3D shaping system according to a second embodiment is described next. In the example of the 3D shaping system 1 according to the first embodiment, one projector 300 projects a plurality of light lines 340, 341. In contrast, in the example of the 3D shaping system according to the second embodiment, a plurality of projectors each project one light line. The first embodiment described above and the second embodiment described below differ only in this point. Therefore, only the differences between the two embodiments are described, and redundant descriptions are omitted.
The shape data calculator 250 can identify line L2, line L4, and line L6 as projection results from projecting the light line 340 under a first projection condition by comparing the above-described predictive pattern data with the image data illustrated in
Like the first embodiment described above, the 3D shaping system according to the second embodiment can isolate the projection conditions of the light lines 340 and 341 projected onto the object for measurement 320 by utilizing the shaping data. The shape data (z-direction height data) of the lines can therefore be obtained in one batch from one round of imaging data, providing the same effect as the first embodiment.
The 3D shaping system according to a third embodiment is described next. In the example of the 3D shaping system according to the second embodiment described above, a plurality of projectors each project one light line. In contrast, in the 3D shaping system according to the third embodiment, a plurality of projectors each project a plurality of light lines. The second embodiment described above and the third embodiment described below differ only in this point. Therefore, only the differences between the two embodiments are described, and redundant descriptions are omitted.
The shape data calculator 250 can identify line L4, line L8, and line L12 as projection results from projecting the light line 340 under a first projection condition by comparing the predictive pattern data with the image data illustrated in
The shape data calculator 250 can in the same way identify line L1, line L5, and line L9 as projection results from projecting the light line 341 under a second projection condition, and identify line L2, line L6, and line L10 as projection results from projecting the light line 343 under a second projection condition.
Like the second embodiment described above, the 3D shaping system according to the third embodiment can isolate the projection conditions of the light lines 340 through 343 projected onto the object for measurement 320 by utilizing the shaping data. The shape data (z-direction height data) of the lines can therefore be obtained in one batch from one round of image data, providing the same effect as each of the embodiments described above.
Finally, each of the above-described embodiments is presented as an example only and is not intended to limit the scope of the present invention. Each of these novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes may be made without departing from the scope and spirit of the present invention. Various three-dimensional shaping methods are known in the art, for example the fused filament fabrication (FFF) method, the material jetting method, the binder jetting method, the selective laser sintering (SLS) method (or selective laser melting (SLM) method), and stereolithography (the laser method or the digital light projector (DLP) method); the present invention applies to all of these 3D shaping methods, and the above-described effects can be obtained in all of these cases. The embodiments and their modifications are included within the scope and intent of the invention, and are also included within the scope of the claims of the invention and their equivalents.
An embodiment renders the advantageous effect that the three-dimensional shape of the object for measurement can be measured at high speed and with good accuracy.
The above-described embodiments are illustrative and do not limit the present invention. Thus, numerous additional modifications and variations are possible in light of the above teachings. For example, at least one element of different illustrative and exemplary embodiments herein may be combined with each other or substituted for each other within the scope of this disclosure and appended claims. Further, features of components of the embodiments, such as the number, the position, and the shape, are not limited to those of the embodiments and may be set as appropriate. It is therefore to be understood that within the scope of the appended claims, the disclosure of the present invention may be practiced otherwise than as specifically described herein.
The method steps, processes, or operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance or clearly identified through the context. It is also to be understood that additional or alternative steps may be employed.
Further, any of the above-described apparatus, devices or units can be implemented as a hardware apparatus, such as a special-purpose circuit or device, or as a hardware/software combination, such as a processor executing a software program.
Further, as described above, any one of the above-described and other methods of the present invention may be embodied in the form of a computer program stored in any kind of storage medium. Examples of storage media include, but are not limited to, flexible disks, hard disks, optical discs, magneto-optical discs, magnetic tapes, nonvolatile memory, semiconductor memory, read-only memory (ROM), etc.
Alternatively, any one of the above-described and other methods of the present invention may be implemented by an application specific integrated circuit (ASIC), a digital signal processor (DSP) or a field programmable gate array (FPGA), prepared by interconnecting an appropriate network of conventional component circuits or by a combination thereof with one or more conventional general purpose microprocessors or signal processors programmed accordingly.
Each of the functions of the described embodiments may be implemented by one or more processing circuits or circuitry. Processing circuitry includes a programmed processor, as a processor includes circuitry. A processing circuit also includes devices such as an application specific integrated circuit (ASIC), digital signal processor (DSP), field programmable gate array (FPGA) and conventional circuit components arranged to perform the recited functions.
Number | Date | Country | Kind |
---|---|---|---|
JP2019-050516 | Mar 2019 | JP | national |
Number | Date | Country |
---|---|---|
102007063041 | Jul 2009 | DE |
2004-020536 | Jan 2004 | JP |
5746529 | May 2015 | JP |
2017-013288 | Jan 2017 | JP |
2017-015456 | Jan 2017 | JP |
2020-114630 | Jul 2020 | JP |
2020-138535 | Sep 2020 | JP |
Entry |
---|
Graebling et al., “Optical high-precision three-dimensional vision-based quality control of manufactured parts by use of synthetic images and knowledge for image-data evaluation and interpretation,” 2002, Applied Optics, vol. 41, No. 14, pp. 2627-2643 (Year: 2002). |
Machine translation of DE102007063041A1, 2009, pp. 1-40. (Year: 2009). |
Extended Search Report for European Application No. 20161460.9 dated Aug. 18, 2020. |
Number | Date | Country | |
---|---|---|---|
20200300616 A1 | Sep 2020 | US |