The disclosed embodiments relate generally to additive and subtractive manufacturing and more particularly, but not exclusively, to locating additively manufactured structures and to methods for post-processing the same.
Three-dimensional (3D) printing, also known as additive manufacturing, is a technique that deposits material only where needed, resulting in significantly less material wastage than traditional manufacturing techniques, which typically form parts by reducing or removing material from a bulk workpiece. While the first 3D printed articles were generally models, the industry is quickly advancing by creating 3D printed articles that can serve as functional parts in more complex systems, such as hinges, tools, and structural elements.
In a typical additive manufacturing process, a 3D object is created by forming layers of material under computer control. Computer-aided manufacturing (CAM) includes the use of software to control machine tools in 3D space. For some 3D objects, post-processing to further refine the object can include subtractive manufacturing techniques such as drilling, milling, or turning to modify the printed geometry of the 3D object. For example, a milling process using one or more rotary cutters can be used to remove material from the printed 3D object by feeding the cutter into the object in a certain direction.
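For illustration only, the kind of machine code a CAM program emits for a simple milling pass can be sketched as follows. The pass length, feed rate, and coordinates are assumed values for the sketch, not parameters of any particular machine or of this disclosure.

```python
# Sketch of CAM output: generating G-code for a back-and-forth facing
# pass over a rectangular region. The fixed 100 mm pass length and the
# default feed rate are illustrative assumptions.

def facing_pass(width_mm, step_mm, depth_mm, feed_mm_min=800):
    """Yield G-code lines for a back-and-forth facing pass."""
    lines = [
        "G21 ; millimeter units",
        "G90 ; absolute positioning",
        f"G1 Z{-depth_mm:.3f} F{feed_mm_min}",  # plunge to cutting depth
    ]
    y, direction = 0.0, 1
    while y <= width_mm:
        x_end = 100.0 if direction > 0 else 0.0
        lines.append(f"G1 X{x_end:.3f} F{feed_mm_min}")  # cutting move
        y += step_mm
        if y <= width_mm:
            lines.append(f"G1 Y{y:.3f}")  # step over to the next pass
        direction = -direction
    lines.append("G0 Z5.000 ; retract")
    return lines

for line in facing_pass(width_mm=20, step_mm=5, depth_mm=0.5):
    print(line)
```

The sketch only serializes toolpath moves; feeding these lines to a controller would additionally require tool, spindle, and work-offset setup.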
In order to post-process the printed 3D object, it must first be moved from the 3D printer to a separate hardware device, such as a milling machine or a five-axis router. Alternatively, a cutting tool can be moved to the 3D object, such as by rotating a cutting/turning tool into position relative to the 3D object. In either case, the computer-controlled system must identify where the 3D object is relative to the tool to avoid even minor deviations in cuts and other processing. Particularly for larger, arbitrarily shaped objects, it can be difficult for the computer-controlled system to locate an object that either has been moved, for example, onto the milling machine or is disposed near a newly placed cutting edge of a turning tool. Furthermore, there is no good way to force a large 3D printed object against a corner or a predefined datum (e.g., placing an item on a specific corner of a photocopier) to know the object is in the correct position for post-processing.
Accordingly, in some conventional systems, after fixing the printed 3D object onto the milling table, an operator provides the exact location of the object to the computer-controlled system using the cutting tools themselves. In other words, the operator assigns a program zero (or starting point) for the cutting tools on turning centers, a process known as "touching off" with the tools. However, this approach is not precise and can introduce human error, particularly where the 3D object does not have prominent geometry. If a vehicle-sized part is even slightly out of position, the entire part can be damaged or require significant rework.
In view of the foregoing, there is a need for improvements and/or alternative or additional solutions to improve conventional additive and/or subtractive manufacturing processes for locating structures for any post-processing of a 3D object.
It should be noted that the figures are not drawn to scale and that elements of similar structures or functions are generally represented by like reference numerals for illustrative purposes throughout the figures. It also should be noted that the figures are only intended to facilitate the description of the preferred embodiments. The figures do not illustrate every aspect of the described embodiments and do not limit the scope of the present disclosure.
Since currently-available methods and systems cannot dynamically locate printed parts in machine space (e.g., on a computer-controlled mill and/or router), additive and/or subtractive manufacturing processes for scanning and locating structures for any post-processing of a 3D object can prove desirable and provide a basis for a wide range of applications, such as additive and subtractive manufacturing for vehicles and/or architectural structures.
Although
Additive manufacturing for making a 3D article on a large scale (i.e., typically with at least one dimension greater than 5 feet) can be referred to as large-scale additive manufacturing. A system (or technique) for large-scale additive manufacturing can be referred to as a large-scale additive manufacturing system (or technique). Exemplary large-scale additive manufacturing systems include, for example, the Big Area Additive Manufacturing (BAAM) 100 ALPHA available from Cincinnati Incorporated, located in Harrison, Ohio, and the Large Scale Additive Manufacturing (LSAM) machine available from Thermwood Corporation, located in Dale, Ind. An exemplary system 100 that uses extrusion deposition for large-scale additive manufacturing includes the BAAM 100 ALPHA.
Large-scale additive manufacturing has recently become an area of greater research, use, and technology advancement because of improvements in material properties and increased needs of customized large structures. For example, Local Motors located in Phoenix, Ariz. was the first to use large-scale additive manufacturing, or large-scale extrusion deposition, to print a vehicle.
Although the structures and methods as set forth in the present disclosure are applied to solve technical problems in large-scale additive and/or subtractive manufacturing, the structures and methods can be applied to any smaller-scale additive manufacturing, such as medium-scale and/or small-scale additive manufacturing, without limitation.
For example, turning to
The systems and methods disclosed herein advantageously scan and locate parts in machine space independent of shape geometry. For example, the printed object 201 may have some rounded corners, experience some drooping or deformation through the additive manufacturing process, and/or lack fine 3D geometry due to the material deposited, all of which can make the object 201 difficult to locate in machine space via reference points. Accordingly, in some embodiments, the systems and methods disclosed herein can scan and locate the printed object 201 without manual intervention.
Turning to
The scanner 300 can include any combination of short-range scanners and/or long-range scanners. Short-range scanners can also include portable handheld scanners, such as shown in
Turning to
If there are portions of the object 201 that are unscanned, at 4025, the scanner 300 is moved, at 4030, to a unique location around the object 201. In some embodiments, the scanner 300 is on a predetermined track that surrounds the object to calibrate the starting position of each scan. The scan continues, at 4020, until all portions of the object 201 have been captured, at 4025. In a preferred embodiment, each scan includes at least one reference point of the object 201, such as the tooling spheres discussed above. The reference points of each scan can be unique or common among scans. However, in yet another embodiment, not all scans need to include the reference points.
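The loop above (scan at 4020, check coverage at 4025, reposition at 4030) can be sketched as follows; the Scanner class, the track positions, and the coverage check are hypothetical stand-ins for illustration, not part of the disclosed system.

```python
# Sketch of the scanning loop: capture a scan, check whether all
# portions of the object are covered, and reposition the scanner along
# its predetermined track until coverage is complete.

class Scanner:
    def __init__(self, track_positions):
        # Predetermined track around the object; each position
        # calibrates the starting point of a scan.
        self.track_positions = track_positions

    def capture(self, position):
        # In practice this would trigger a 3D scan; here the position
        # label stands in for scan data.
        return {"position": position, "points": f"scan@{position}"}

def scan_object(scanner, coverage_needed):
    scans = []
    covered = set()
    for position in scanner.track_positions:      # step 4030: move scanner
        scans.append(scanner.capture(position))   # step 4020: scan
        covered.add(position)
        if covered >= coverage_needed:            # step 4025: all portions captured?
            break
    return scans

scanner = Scanner(track_positions=["north", "east", "south", "west"])
scans = scan_object(scanner, coverage_needed={"north", "east", "south"})
print(len(scans))  # scans captured before coverage was complete
```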
Once all portions of the object 201 have been scanned, an alignment reference (or locating feature) is determined from each scan, at 4040. For example, in some embodiments, the alignment reference includes the reference point of each scan and/or a reference sphere. The alignment reference provides a common reference system across one or more images that can be used to track a location on the object for aligning the images. For example, adhesive reflective tabs and/or natural features of an object can be used to identify a specific location on the object across one or more images.
Additionally and/or alternatively, the locating feature can include any common reference point surrounding the object 201 that can be determined, at 4040. For example, a corner of a table where the object 201 is placed can be used. Advantageously, each scan includes fiducial markers and at least a portion of the scanned object in the same scan.
Once the scans are brought into a common reference system, the scans can be aligned and merged, at 4050, to create a complete 3D model of the object, such as shown in
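One way to bring two scans into a common reference system from shared reference points (e.g., two tooling spheres visible in both scans) can be sketched as follows. For brevity the sketch works in 2D and solves only a rotation plus translation; a full implementation would fit a 3D rigid transform, for example with a least-squares (Kabsch) fit. All coordinates are assumed values for illustration.

```python
import math

# Sketch of merging two scans into a common reference system using two
# reference points seen in both scans.

def rigid_transform_2d(src_pair, dst_pair):
    """Return a function applying the rotation + translation that maps
    the two points of src_pair onto the two points of dst_pair."""
    (sx1, sy1), (sx2, sy2) = src_pair
    (dx1, dy1), (dx2, dy2) = dst_pair
    # Rotation: difference between the directions of the two segments.
    angle = math.atan2(dy2 - dy1, dx2 - dx1) - math.atan2(sy2 - sy1, sx2 - sx1)
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    # Translation chosen so the first reference point maps exactly.
    tx = dx1 - (cos_a * sx1 - sin_a * sy1)
    ty = dy1 - (sin_a * sx1 + cos_a * sy1)

    def apply(p):
        x, y = p
        return (cos_a * x - sin_a * y + tx, sin_a * x + cos_a * y + ty)

    return apply

# Scan B saw the same two tooling spheres as scan A, rotated and shifted.
transform = rigid_transform_2d(src_pair=[(0, 0), (0, 1)],
                               dst_pair=[(10, 10), (11, 10)])
x, y = transform((0, 2))  # any scanned point, now in the common frame
print(round(x, 6), round(y, 6))
```

Applying the returned function to every point of the second scan expresses both scans in the first scan's coordinates, after which they can be merged.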
Turning to
An operator begins the scanning routine, at 4010. Once the operator provides instructions for the scan of the object 201 to begin, each scanner 300 scans the object from its own position, at 4070. In a preferred embodiment, each position provides a unique angle of the 3D object 201. Once all portions of the object 201 have been scanned by the independent scanners, a reference point is determined from each scan, at 4040. In some embodiments, because the scanner 300 can scan a large volume in a single pass, the reference point can be determined from a subset of the scans, as at least a subset of the scans will include a reference point. In a preferred embodiment, each scan includes at least one reference point of the object 201, such as the tooling spheres discussed above. As an additional example, adhesive reflective tabs and/or natural features of an object can be used to identify a specific location on the object across the plurality of scans.
Once the scans are brought into a common reference system, the scans can be aligned and merged, at 4050, to create a complete 3D model of the object. In yet another embodiment, if not all scans include a reference point, a best fit alignment can be used to align individual scans of the object 201. A reference point then can be selected from one of the scans following the best fit alignment to ensure proper alignment to a fiducial marker. In a preferred embodiment, the scans are aligned and merged to a computer aided design (CAD) model (not shown) of the object 201, at 4060. Advantageously, the presence of scanned fiducial markers in the same file as the scanned object 201 enables a translation of physical machine space into a computer coordinate space. In this way, the system advantageously maps virtual coordinates, such as from the CAD file, to physical machine space locations, such as for use with CAM software. In some embodiments, because the alignment reference point can be determined for each scan, at 4040, the alignment of the scans (step 4050) is based on a best fit.
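The translation between computer coordinate space and physical machine space via fiducial markers can be sketched as follows, under the simplifying assumption that the axes are already aligned, so that only an offset (averaged across fiducials as a simple best fit) must be solved; a full implementation would also fit rotation. All coordinates are illustrative.

```python
# Sketch of mapping virtual (CAD/scan) coordinates to physical machine
# space using fiducial markers whose positions are known in both spaces.

def fit_offset(cad_fiducials, machine_fiducials):
    """Best-fit translation (averaged over fiducials), assuming aligned axes."""
    n = len(cad_fiducials)
    return tuple(
        sum(m[i] - c[i] for c, m in zip(cad_fiducials, machine_fiducials)) / n
        for i in range(3)
    )

def to_machine(point, offset):
    """Express a CAD-space point in machine-space coordinates."""
    return tuple(p + o for p, o in zip(point, offset))

# Three fiducials as seen in the CAD file and on the machine table.
cad = [(0, 0, 0), (100, 0, 0), (0, 100, 0)]
machine = [(250, 410, 5), (350, 410, 5), (250, 510, 5)]
offset = fit_offset(cad, machine)
print(to_machine((50, 50, 0), offset))  # a CAD point in machine space
```

With the offset fitted, any toolpath coordinate from the CAM software can be expressed in machine space without the operator touching off.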
Turning to
The processor 510 can execute instructions for implementing the control system 500 and/or the scanner 300. In a non-limiting example, the instructions include one or more additive and/or subtractive manufacturing software programs. The programs can operate to control the system 100 with multiple printing options, settings, and techniques for implementing additive printing of large components.
The programs can include a computer-aided design (CAD) program to generate a 3D computer model of the object. Additionally and/or alternatively, the 3D computer model can be imported from another computer system (not shown). The 3D computer model can be in a solid, surface, or mesh file format according to an industry standard.
The programs can load the 3D computer model, create a print model and generate the machine code for controlling the system 100 to print, scan, locate, and/or post-process the object (e.g., via subtractive manufacturing). Exemplary programs can include LSAM Print 3D, available from Thermwood Corporation located in Dale, Ind. Additionally and/or alternatively, exemplary programs can include Unfolder Module Software, Bend Simulation Software, Laser Programming and/or Nesting Software available from Cincinnati Incorporated located in Harrison, Ohio.
As shown in
Additionally and/or alternatively, the control system 500 can include a communication module 530. The communication module 530 can include any conventional hardware and software that operates to exchange data and/or instructions between the control system 500 and another computer system (not shown) using any wired and/or wireless communication methods. For example, the control system 500 can receive computer-design data corresponding to the scans of the scanner 300 via the communication module 530. Exemplary communication methods include, for example, radio, Wireless Fidelity (Wi-Fi), cellular, satellite, broadcasting, or a combination thereof.
Additionally and/or alternatively, the control system 500 can include a display device 540. The display device 540 can include any device that operates to present programming instructions for operating the control system 500 and/or to present data related to the print head 120. Additionally and/or alternatively, the control system 500 can include one or more input/output devices 550 (for example, buttons, a keyboard, a keypad, or a trackball), as desired.
The processor 510, the memory 520, the communication module 530, the display device 540, and/or the input/output device 550 can be configured to communicate, for example, using hardware connectors and buses and/or in a wireless manner.
The disclosed embodiments are susceptible to various modifications and alternative forms, and specific examples thereof have been shown by way of example in the drawings and are herein described in detail. It should be understood, however, that the disclosed embodiments are not to be limited to the particular forms or methods disclosed, but to the contrary, the disclosed embodiments are to cover all modifications, equivalents, and alternatives.
This application claims priority to U.S. Provisional Application Ser. No. 62/632,560, filed on Feb. 20, 2018, the disclosure of which is hereby incorporated by reference in its entirety and for all purposes.
Number | Date | Country | |
---|---|---|---|
20190255777 A1 | Aug 2019 | US |
Number | Date | Country | |
---|---|---|---|
62632560 | Feb 2018 | US |