Proceedings of a Workshop—in Brief
On May 29–30, 2024, the National Materials and Manufacturing Board and the Board on Mathematical Sciences and Analytics of the National Academies of Sciences, Engineering, and Medicine held a workshop, Methods for Enhancing Additive Manufacturing Qualification and Certification for Defense Applications, sponsored by the Department of Defense. The aim of the workshop was to explore current approaches to enhancing predictive accuracy and innovations in process, control, and inspection, as well as to discuss recent advances in statistics and analytics. This Proceedings of a Workshop—in Brief summarizes the presentations and discussions that occurred at that workshop.
The workshop’s first panel, moderated by workshop planning committee member Jian Cao, Northwestern University, had four speakers from industry, academia, and government who spoke about various challenges and gaps related to additive manufacturing (AM).
Douglas Hofmann, National Aeronautics and Space Administration’s (NASA’s) Jet Propulsion Laboratory (JPL), gave a brief overview of AM and its challenges at JPL. The laboratory has conducted work in metal AM since 2009. It initially focused on research, developing new processes to create materials for spacecraft hardware that would perform in extreme environments. JPL started qualifying AM for flight applications around 2018, and now there are several AM materials on various NASA craft, including the Mars rover Perseverance.
Hofmann discussed the difficulties and the expense—about $2 million for a particular alloy—of getting AM materials qualified for use on spacecraft. He described how AM in the national laboratories depends on commercial industry for raw materials, equipment, and service contracts. And he suggested that the current contraction in the commercial AM market could threaten the defense AM field through disruptions to its supply chain and the need to redo qualification for various tools and materials.
David Furrer, Pratt & Whitney, explained why the qualification and certification of AM components are difficult. First, AM component microstructures and properties are sensitive to processing parameters, parameter tolerances, and component geometries. Also, AM components can exhibit large gradients in microstructure, defect potential, and local properties. Thus, it is difficult to ensure that test specimens represent the entirety of complex parts, and, when the actual components are tested, it is difficult to know which locations on the component should be tested.
Traditionally, the qualification and certification of a part involve establishing requirements that include supporting both component design and structural analysis; establishing component design and materials definitions; creating a manufacturing process control plan; and designing and executing a test plan. In designing a component, statistical design minima should be derived from test samples taken from components, accounting for the variation in properties among those samples. Given the nature of different process defects, it is difficult to know what minimum properties can be assumed for a particular component and material.
Furrer said he believes that the traditional approach is too static and that a different, model-based approach is required. A systems-based design that links design, process, structure, properties, and performance allows one to consider how these factors interact in the production of a part in order to qualify and certify the product. This computational modeling can guide the processes to qualify the production tools, materials, processes, and components.
Markus Bambach, ETH Zurich, said the difference between open-loop and closed-loop control of materials processing can be thought of in terms of baking versus cooking. In baking, one knows the precise recipe and conditions: one mixes the ingredients, puts the mixture into the oven at a certain temperature, and then leaves it there for a certain time. Bambach equated baking to an open-loop process in which the influence of disturbances can be predicted. In cooking, by contrast, one cannot predict the influence of various disturbances and must monitor the mixture in the pot as it cooks, adjusting the temperature and stirring when necessary and perhaps adding salt or other ingredients.
Powder bed fusion and directed energy deposition are “layer-wise” cooking processes, Bambach said, but they are usually run like baking processes. One could observe the part and interact with the process as it is built up, but instead operators typically fix all parameters, start the machine, and hope the part comes out correctly, he continued. For open-loop processes, good models that are fast and accurate enough to help define target properties and parameters are needed. The AM processes also need to be robust against disturbances—for example, by creating materials that are more robust to processing and by making the processes robust in execution. Quality control should be performed based on process data, which will be a major paradigm shift.
For closed-loop processes, a key challenge will be to develop an AM property controller and determine what to control to achieve the desired properties. Enabling feedback control is difficult and necessitates open machine architectures and hard real-time execution of control algorithms. The feedback must be robust, and selecting and placing the sensors correctly is crucial. Finally, qualifying closed-loop control hardware for aerospace applications, for example, will be challenging.
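To make the open-loop versus closed-loop distinction concrete, the following minimal Python sketch contrasts holding a fixed parameter with correcting it from an in situ measurement. The sensor reading, actuator call, setpoint, and gain are hypothetical placeholders, not features of any particular machine or of Bambach's work.

```python
# Minimal sketch contrasting open-loop and closed-loop layer processing.
# All names and values below (sensor, actuator, setpoint, gain) are
# hypothetical placeholders for illustration only.

def read_melt_pool_temperature() -> float:
    """Stand-in for an in situ sensor reading (e.g., a pyrometer), in kelvin."""
    return 1900.0  # placeholder measurement

def set_laser_power(power_w: float) -> None:
    """Stand-in for an actuator command to the machine controller."""
    print(f"laser power -> {power_w:.0f} W")

def open_loop_layer(nominal_power_w: float = 300.0) -> None:
    """'Baking': fix the parameter, run, and hope the outcome is right."""
    set_laser_power(nominal_power_w)

def closed_loop_layer(target_temp_k: float = 2000.0,
                      nominal_power_w: float = 300.0,
                      gain_w_per_k: float = 0.5,
                      n_steps: int = 3) -> None:
    """'Cooking': correct the parameter from measurements as the layer is built."""
    power = nominal_power_w
    for _ in range(n_steps):
        error = target_temp_k - read_melt_pool_temperature()
        power = max(0.0, min(power + gain_w_per_k * error, 500.0))  # clamp to limits
        set_laser_power(power)

open_loop_layer()
closed_loop_layer()
```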
Vadim Shapiro, Intact Solutions, stated that as the AM market has grown in recent years, advances in the software components of the market have lagged those of the hardware and materials components. As a result, the models and assumptions embedded in current software neither reflect the physical process nor scale. Thus, validation of AM is dominated by physical testing, and AM design, qualification, and validation are slow and costly.
Shapiro argued that improvements in software could decrease the costs and time required for AM design, qualification, and testing. In particular, he suggested using computational pre-qualification to minimize physical testing. He detailed the use of digital twins1 and the need to calibrate the physics-based digital twin models against the physical experiments to ensure that the digital twin accurately represents the component as it is being processed. Next, he offered two examples of using such digital twins, a thermal history simulation of a powder bed fusion process with melt pool prediction and a calculation of the material properties of a component from the process plan.
__________________
1 The National Academies of Sciences, Engineering, and Medicine’s report Foundational Research Gaps and Future Directions for Digital Twins defines a digital twin as “a set of virtual information constructs that mimics the structure, context, and behavior of a natural, engineered, or social system (or system-of-systems), is dynamically updated with data from its physical twin, has a predictive capability, and informs decisions that realize value. The bidirectional interaction between the virtual and the physical is central to the digital twin” (National Academies of Sciences, Engineering, and Medicine [NASEM], 2024, Foundational Research Gaps and Future Directions for Digital Twins, The National Academies Press, https://doi.org/10.17226/26894).
Among the challenges to using digital twins in this way is that part scale geometry affects the thermal history, material heterogeneity, and part performance. It will be difficult to ignore material heterogeneity in AM parts of any complexity, Shapiro said. Another challenge is that any simulation needs to start at the scale of the processes and extend to the scale of the part to take into account the effects of the part’s geometry on the outcome. Ultimately, software solutions will require the computational exploration of vast spaces of parameters for processes, materials, designs, and defects, which will be challenging.
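As one illustration of the calibration step Shapiro described, the sketch below fits a single model parameter so that a toy melt-pool-width surrogate matches a few physical measurements. The surrogate formula, the parameter (an effective absorptivity), and the data are assumptions for illustration only, not a description of any actual digital twin software.

```python
# Sketch: calibrating a digital-twin parameter against physical measurements.
# The melt-pool-width surrogate and the data below are illustrative assumptions.

def predicted_width_um(power_w: float, speed_mm_s: float, absorptivity: float) -> float:
    """Toy surrogate: melt-pool width scales with absorbed energy per unit length."""
    return 300.0 * (absorptivity * power_w / speed_mm_s) ** 0.5

# Hypothetical coupon measurements: (laser power [W], scan speed [mm/s], width [um]).
measurements = [(200, 800, 85.0), (250, 800, 96.0), (300, 1000, 94.0)]

def calibrate_absorptivity() -> float:
    """Pick the absorptivity that minimizes squared error against measurements."""
    candidates = [i / 100 for i in range(10, 91)]  # 0.10 .. 0.90
    def sse(a: float) -> float:
        return sum((predicted_width_um(p, v, a) - w) ** 2 for p, v, w in measurements)
    return min(candidates, key=sse)

if __name__ == "__main__":
    a_star = calibrate_absorptivity()
    print(f"calibrated effective absorptivity: {a_star:.2f}")
```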
Furrer said that software and speed are two critical areas for AM. “If you don’t have speed, no one will use it.” Shapiro noted that what sets AM apart from traditional manufacturing is that the process creates the material and the geometry of the part at the same time, while in traditional manufacturing one creates the material first and then the geometry. However, he said, in the design tools used for AM “we’re still trying to separate” the geometry from the material—in short, the fact that AM creates material and geometry is treated as a limitation rather than a degree of freedom that might be exploited.
Hofmann said that people ask how NASA can rely on complex AM parts when the materials testing is done on coupons. “The short answer is that we never fly as-built parts.” The microstructure of the parts is homogenized through hot isostatic pressing and heat treatment processing, and computed tomography (CT) scans are used to look for defects. Bambach noted he is learning how to control AM processes well enough to control the local microstructure and remove the need for such post-processing.
Adarsh Krishnamurthy, Iowa State University, offered background on foundation models and how they can be used in AM. Over the past 15 years, manufacturing has been transformed into what he termed “industry 4.0,” which is characterized by such digital technologies as big data, robotics, three-dimensional (3D) printing, cloud computing, and simulation. This has led to greater interoperability, the decentralization of information, increased flexibility, and real-time data flow. Now large language models (LLMs) and other forms of artificial intelligence (AI) are transforming industry even more.
A foundation model is a machine learning (ML) model that works with data of all types—text, images, speech, structured data, and 3D information. After it has been trained on a large amount of data, the foundation model can be used for multiple tasks, including answering questions in natural language, extracting information, recognizing and captioning images, and following instructions. The foundation models that have received the most attention are LLMs such as ChatGPT, vision models that can recognize and categorize images, and multimodal models that combine language and vision capabilities. Any of them can be trained on specific domains such as health care or finance. With language-guided generative design, Krishnamurthy said, it is possible to translate a description in words into an image and then into a design that can be used to create a real object.
More generally, LLMs can be integrated into each step of the manufacturing pipeline, which facilitates information exchange via natural language among these different steps. For example, it is possible to provide immediate responses to queries from assembly line workers about, say, how part A interfaces with part B. One challenge, however, is finding a way to guarantee that the LLMs are not giving incorrect responses, which has been a persistent issue for these models.
Because foundation models require large training data sets, Krishnamurthy said his team has developed a large multimodal data set (called Slice-100K) for AM. It contains data from varying modes such as computer-aided design models, G-code (a popular programming language for computer numerical control and 3D printing), renderings, and captions. One use of this data set was to train algorithms to provide captions for such things as renderings or 3D models for use as metadata.
When Krishnamurthy’s group trained LLMs on G-code, it was able to spot errors in the code and describe what shape would be produced by a particular set of instructions in G-code. He noted that proprietary LLMs generally performed better on these tasks than open-source LLMs.
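To illustrate what recovering shape information from G-code involves, the sketch below parses a few invented G1 move commands and reports the toolpath's bounding box; it is far simpler than the Slice-100K data and models Krishnamurthy described.

```python
# Sketch: recovering basic geometry from G-code (illustrative only).
import re

SAMPLE_GCODE = """
G1 X10.0 Y0.0 E0.5
G1 X10.0 Y10.0 E1.0
G1 X0.0 Y10.0 E1.5
G1 X0.0 Y0.0 E2.0
"""

def toolpath_points(gcode: str):
    """Extract the (x, y) targets of G1 move commands."""
    points = []
    for line in gcode.splitlines():
        if line.strip().startswith("G1"):
            x = re.search(r"X(-?\d+\.?\d*)", line)
            y = re.search(r"Y(-?\d+\.?\d*)", line)
            if x and y:
                points.append((float(x.group(1)), float(y.group(1))))
    return points

if __name__ == "__main__":
    pts = toolpath_points(SAMPLE_GCODE)
    xs, ys = zip(*pts)
    print(f"{len(pts)} moves; bounding box "
          f"{max(xs) - min(xs):.1f} x {max(ys) - min(ys):.1f} mm (a square outline)")
```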
In conclusion, he said that manufacturing-specific foundation models could transform the industry, with the potential to improve all stages of the manufacturing ecosystem. Training these foundation models will require significant manufacturing data, which will require building trust among manufacturers to enable secure data exchange as well as built-in cybersecurity. Finally, training such a manufacturing-specific foundation model would be expensive, so it will require some fundamental breakthroughs to reduce training costs.
In response to a question about how to know when to trust answers from LLMs, Krishnamurthy said that one approach is to have the model assign a confidence score to its answers and only accept answers above a predetermined cutoff. He also said that if the foundation models were given physical test data from an AM part, it should be possible to use them for certification.
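A minimal sketch of the confidence-cutoff idea follows; the confidence values and threshold are placeholder assumptions rather than output from any real model.

```python
# Sketch: accept an LLM answer only if its confidence score clears a cutoff.
# The answers and confidence values are placeholders for illustration.

def accept_answer(answer: str, confidence: float, cutoff: float = 0.8):
    """Return the answer if confidence >= cutoff; otherwise defer to a human."""
    return answer if confidence >= cutoff else None  # None -> route to expert review

print(accept_answer("Part A mates to part B via a 6-bolt flange.", 0.91))  # accepted
print(accept_answer("Layer 42 porosity is within spec.", 0.55))            # deferred
```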
Ryan Wicker, The University of Texas at El Paso, said that from an academic point of view, laser powder bed fusion of metals (LPBF-M) is an excellent system for basic and applied research. The available hardware has tremendous potential, and there are numerous intellectually challenging issues to address, such as non-contact accurate temperature measurement and the microsecond control of high-power lasers. Wicker’s laboratory has five commercial systems and studies equivalence in parts fabrication across platforms.
Much of the discussion about AM parts centers on structure, properties, and performance, Wicker said, with little attention paid to the details of the process. But fixing the process is something that people should focus on, because it is crucial to part reproducibility. All LPBF-M systems have similar subsystems: a laser source, a beam steering system, a powder and platform management system, a gas flow and filtering system, an environmental control chamber, and a machine control system. Furthermore, many systems get their parts from the same suppliers. So why is there such variability in the parts produced by the different systems? The variability occurs because each system has its own unique implementation, Wicker said.
Wicker pointed to powder and powder management (e.g., layer density) and gas flow and chamber environment (chamber chemistry and laser beam attenuation) as two sources of variation that can be controlled well. A more troublesome source is tool pathing. Both the software (which is specific to the manufacturer) and the hardware (for which there is generally a lack of transparency and control) contribute to part variability in ways that are difficult to control or account for. He argued that once this issue is addressed, one can solve the entire problem of part variability.
Wicker then discussed the individual paths—or welds—of a laser scan across a powder bed. Because these individual weld tracks are the building blocks of a part, their details matter. And if one looks at the microscopic details of how individual machines build up parts, one sees they are all different. It is impossible to get identical parts from different machines.
To solve this issue, he said, there should be process transparency across platforms so that parts can be reproduced: a universal build file with no hidden black-box information should be adopted, and common user-enabled subsystem qualification should be required.
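The sketch below shows one hypothetical form such a universal build file could take, with every scan vector and its parameters exposed explicitly; the field names are invented for illustration and do not represent an existing standard.

```python
# Sketch of a hypothetical, fully transparent build-file record (not a standard).
# Field names are invented for illustration; the point is that every scan
# vector and parameter is visible rather than hidden in a vendor format.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class ScanVector:
    x_start_mm: float
    y_start_mm: float
    x_end_mm: float
    y_end_mm: float
    laser_power_w: float
    scan_speed_mm_s: float

@dataclass
class Layer:
    index: int
    thickness_um: float
    vectors: list[ScanVector] = field(default_factory=list)

@dataclass
class BuildFile:
    machine_id: str
    material: str
    layers: list[Layer] = field(default_factory=list)

build = BuildFile(
    machine_id="example-lpbf-01",  # placeholder identifier
    material="Ti-6Al-4V",
    layers=[Layer(0, 30.0, [ScanVector(0.0, 0.0, 10.0, 0.0, 280.0, 1200.0)])],
)

print(json.dumps(asdict(build), indent=2))  # nothing hidden, nothing proprietary
```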
In response to a question, Wicker said that purchasing managers could require manufacturers to provide access to the necessary scanner information before they will purchase the scanner. Users should also make a business case as to why manufacturers should supply that information. If this is done, manufacturers may offer the necessary transparency. Ultimately, he said, the survival of this industry will depend on more openness.
The second panel, moderated by Cao, addressed the role of and opportunities for in situ sensing in AM.
Alaa Elwany, Texas A&M University, provided some context about in situ sensing, talking about its recent history and future directions. After showing a photo of a process monitoring system from his laboratory from a decade ago, Elwany said that today there are still few commercially available products for process monitoring in AM. One needs in situ monitoring, he continued, to
fulfill a goal of AM: having a cost- and time-efficient method to secure high-quality parts (i.e., defect-free and reproducible) that is enabled by a deep understanding of the process. This motivation should underlie efforts to identify and address gaps in the whats and hows. Unfortunately, he said, many people in the community end up doing things just because they can.
Elwany next discussed the advances in in situ monitoring that have occurred over the past decade. The reasons for in situ sensing have evolved—for instance, there is more emphasis on detecting defects. The capabilities of sensors have also advanced, but they are not yet industrially mature, according to an ASTM survey that evaluated the readiness of different types of sensors. For instance, neutron diffraction sensors were assigned a technology readiness level (TRL) of 3 at the time of the survey, meaning they are still considered in the “technological research” phase.2 The graphic that Elwany shared from the survey showed that accelerometers, by contrast, are ready to be used in product demonstration but not mature enough for part certification.
Looking to the future, Elwany said that various barriers to adoption, such as sensor validation and integration, need to be resolved. It is also important to decide what needs to be monitored, such as subsurface defects, and to develop in situ sensors for those tasks. In situ monitoring needs to be augmented with the “right” models, such as standardized and accessible models to validate and complement in situ monitoring data. Finally, the community needs to decide on common terminology, standard test artifacts, and common data repositories.
Shuchi Khurana, Addiguru, explained the economic case for in situ monitoring: low-cost sensors can stop failed prints, which saves time and materials and avoids the need to detect the failures with expensive CT scanning after the prints are finished. Unfortunately, not many manufacturers take advantage of the current in situ monitoring capabilities.
After listing the AM defects most important to detect (porosity, microstructure issues, deformation, rough surface finish), Khurana said that deploying a multimodal sensor suite is the key to detecting these various defects. The suite should include optical, infrared, near-infrared, and acoustic sensors as well as machine data. By combining machine vision and sensor data with materials science knowledge, physics-based simulations, process insights, and AI, it should be possible to identify issues in real time.
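A minimal sketch of the multimodal idea follows: features from several sensing modes are combined into a single per-layer anomaly score that can trigger an alert. The feature names, weights, and threshold are illustrative assumptions, not Addiguru's method.

```python
# Sketch: fuse per-layer features from several sensing modes into one anomaly
# score. Feature names, weights, and the threshold are illustrative assumptions.

def layer_anomaly_score(optical_dark_fraction: float,
                        ir_hotspot_count: int,
                        acoustic_rms_deviation: float) -> float:
    """Weighted combination of normalized indicators from three modalities."""
    return (0.5 * optical_dark_fraction
            + 0.3 * min(ir_hotspot_count / 10.0, 1.0)
            + 0.2 * min(acoustic_rms_deviation, 1.0))

def flag_layer(score: float, threshold: float = 0.4) -> bool:
    """True means alert the operator (or pause) before the next layer."""
    return score >= threshold

score = layer_anomaly_score(optical_dark_fraction=0.3,
                            ir_hotspot_count=6,
                            acoustic_rms_deviation=0.5)
print(f"layer score = {score:.2f}, flagged = {flag_layer(score)}")
```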
After describing a number of the in situ sensors available today, Khurana summarized by saying that affordable and practical sensors are available, multimodal sensors are needed for qualification and certification, and in situ monitoring could accelerate the adoption of AM.
Amanda Cruchley, Manufacturing Technology Centre (MTC), gave an overview of the Defence Materials Centre of Excellence (DMEx), which is a multipartner UK initiative to research, create, and prototype new defense materials that can survive in the harshest conditions.
The £42.5 million partnership is led by the Henry Royce Institute for advanced materials research and innovation, with 23 other partners from industry, academia, and research organizations. The High-Value Manufacturing Catapult supports the DMEx by ensuring that the implementation and scale-up of advanced materials are considered at the lower TRLs.
One focus area of the DMEx is digital threads, with the goal of implementing mature digital threads3 across the AM value chain to enable predictability and traceability and increase the quality, efficiency, and industry uptake of AM parts. Another focus is in situ monitoring and qualification, and at MTC, a partnering organization of the DMEx, there is ongoing internal and external work to understand how to correlate indications from several in situ monitoring and inspection techniques and use that
__________________
2 M. Jamshid, M. Kottman, K. Snodderly, J. Williams, and M. Seifi, 2023, “Strategic Guide: Additive Manufacturing In-Situ Monitoring Technology Readiness: Findings and Path Forward for Applications in Qualification and Certification,” ASTM Additive Manufacturing Center of Excellence, https://doi.org/10.1520/amcoe-guide-in-situ-tech-readiness.
3 To see how the term “digital thread” has been used in other National Academies’ publications, consider the following sources (among others):
in the generation of a materials database. The DMEx also uses in situ monitoring, inspection, and simulation tools to predict AM mechanical performance.
Cruchley closed by saying that digitalization in AM presents an opportunity to unleash the potential of AM, but the deluge of data presents a challenge to using it actively in control.
Adam Clare, University of British Columbia, said that defects appear in materials made with AM because people are trying to make perfect parts with imperfect processes. Thus, it is important to detect defects, preferably before a part is finished, and several sensing techniques are available for metal AM, including optical, ultrasound, and thermography.
After offering details on techniques for studying defects in AM metals, such as laser ultrasonics, Clare sketched out a vision for the future in which a high-fidelity digital twin of a metal AM part would be constructed using a physics-informed model combined with sensor-derived information about the part at different stages of the manufacturing process. The digital twin could be used, for instance, to detect and measure defects that arose during manufacturing. Then it would be possible to stop the build, fix the defect, and then continue, thus avoiding most defects altogether.
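In outline, the stop-and-repair workflow Clare sketched might look like the following; the digital-twin flaw estimate and the critical flaw size are placeholders for illustration.

```python
# Sketch of a layer-wise stop/repair decision driven by a digital-twin estimate.
# estimate_defect_size_um is a placeholder for a model-plus-sensor-fusion step.

def estimate_defect_size_um(layer: int) -> float:
    """Stand-in for the digital twin's largest-flaw estimate for this layer."""
    return 120.0 if layer == 3 else 20.0  # inject one 'defect' for illustration

def build(n_layers: int = 6, critical_flaw_um: float = 100.0) -> None:
    for layer in range(n_layers):
        flaw = estimate_defect_size_um(layer)
        if flaw >= critical_flaw_um:
            print(f"layer {layer}: flaw {flaw:.0f} um >= {critical_flaw_um:.0f} um "
                  "-> pause build, repair region, re-inspect, then continue")
        else:
            print(f"layer {layer}: ok ({flaw:.0f} um)")

build()
```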
Clare also suggested that the information from the digital twin could predict the service life of manufactured parts. For instance, although all turbine blades of the future might meet specification, not all will be equal, and understanding the parts in detail will help manufacturers predict when they need to be replaced.
In the discussion, Khurana said that one important challenge in using in situ monitoring to qualify a part is to link the monitoring with the precise location, because for some sensors it is not easy to correlate a signature with a particular location in the part. Another step will be to link the sensors with part-level simulation models that account for different geometries, because current monitoring is more effective with simple geometries. Cruchley said that while in situ monitoring could be used for qualification of parts, using it to qualify processes will require a better understanding of the process physics.
Elwany said that the software for in situ monitoring has not evolved as much as the hardware in recent years. However, Khurana added, there is software work being done to make the sensors easier to use.
In response to a question about how to use sensors to monitor or qualify gradient structures in AM materials where the density varies, Clare said it is difficult to do and he did not know of any effective techniques for accomplishing it. Elwany addressed how universities are ensuring that the next-generation workforce receives relevant training in AM. Texas A&M University, he said, is establishing bridges with industry to design its curricula to meet industry needs. This was not previously the norm, he added, but it now is at many universities.
Paul Witherell, National Institute of Standards and Technology (NIST), moderated a panel on how verification, validation, and uncertainty quantification might support AM and part certification.
Ed Herderick, NSL Analytical, spoke about what is involved in the industrial testing of AM products. As context, he said, AM capabilities grow exponentially in the same way that computing power and memory do. For instance, laser powder bed fusion productivity grew from 1 cc per day in 1993 to 1,000 cc per hour in 2023. What will solidify these gains and lead to broad adoption? He pointed to four needs: (1) holistic combinations of chemical, physical, and mechanical testing and standards; (2) advanced lasers and optics; (3) rapid testing and qualification on new platforms for delta qualification (i.e., re-qualification on already-qualified equipment that has been modified or is being used differently); and (4) the use of AI and big data for process control. Delta qualification is particularly important, he said; without it the field could be locked into existing, inferior technologies because of the high cost of de novo testing.
To illustrate the importance of using different types of tests together, Herderick provided the following three
case studies of a holistic mix of testing: (1) a particle-size distribution that revealed large particles which should have been removed by sieving, (2) fatigue properties that were too low with one vendor but not another, and (3) an oxygen reading that was high in powder feedstock testing.
Richard Huff, ASTM International, discussed how ASTM has advanced standards for AM. ASTM has approved 72 AM standards and has an additional 62 work items. After listing key standards supporting AM qualification and certification, Huff spoke about how the Additive Manufacturing Center of Excellence, a partnership among various government, industry, and academic organizations, is conducting strategic research and development (R&D) to advance standards across all aspects of AM and to create globally recognized certification programs. Undertaking such research is critical in developing standards, he said.
Huff closed by describing the Consortium for Materials Data and Standardization. This effort, he explained, provides a forum in which AM organizations can work together toward AM adoption by identifying standard requirements and best practices for materials data creation and management. The consortium was created to address several challenges: creating data sets is costly and can be prohibitive for some companies; directly transferring data and lessons learned between companies is difficult, leading to a duplication of efforts; and a lack of standardized approaches to data generation, pedigree, and management leads to significant waste.
Sankaran Mahadevan, Vanderbilt University, focused on the methods used in the verification, validation, and uncertainty quantification (VVUQ) of AM products for purposes of qualification and certification. He noted that there are few AM test specimens available for qualification and certification testing, so model-based predictions are used to supplement the test data. Information gathered from model VVUQ activities is fed into the model-assisted qualification and certification.
Mahadevan said that there are two main approaches. The first is uncertainty aggregation, a forward analysis, which combines all VVUQ results to quantify the overall uncertainty in predictions of process quality and part performance. The second is uncertainty reduction, an inverse analysis, which identifies the dominant drivers of uncertainty so that they can be reduced and the process improved to meet qualification and certification requirements.
Mahadevan elaborated on these two types of analysis. He described the types of uncertainty that one must deal with—and how they are dealt with—including process variability, data uncertainty, and uncertainty related to physics models of the processes. He also spoke about the use of physics models and ML models in making these predictions.
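Mahadevan's two analyses can be illustrated with a toy surrogate: forward propagation samples the input uncertainties to obtain the spread in predicted strength (aggregation), and re-running with one input held fixed shows which source dominates that spread (a crude stand-in for inverse sensitivity analysis). The surrogate model and the input distributions are assumptions for illustration only.

```python
# Sketch: forward uncertainty propagation and a crude sensitivity check
# on a toy strength surrogate (illustrative assumptions throughout).
import random
import statistics

random.seed(0)

def strength_mpa(porosity_pct: float, power_w: float) -> float:
    """Toy surrogate: strength drops with porosity, rises weakly with power."""
    return 1000.0 - 80.0 * porosity_pct + 0.1 * (power_w - 300.0)

def sample(n: int = 5000, fix_porosity: bool = False, fix_power: bool = False):
    out = []
    for _ in range(n):
        porosity = 0.5 if fix_porosity else random.gauss(0.5, 0.2)   # percent
        power = 300.0 if fix_power else random.gauss(300.0, 10.0)    # watts
        out.append(strength_mpa(porosity, power))
    return out

total = statistics.stdev(sample())                          # forward aggregation
no_porosity = statistics.stdev(sample(fix_porosity=True))   # inverse-style checks
no_power = statistics.stdev(sample(fix_power=True))

print(f"total std dev:        {total:.1f} MPa")
print(f"with porosity fixed:  {no_porosity:.1f} MPa (porosity dominates if this drops)")
print(f"with power fixed:     {no_power:.1f} MPa")
```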
Mark Douglass, Lincoln Electric, spoke about wire-arc directed energy deposition (DED), a form of AM that is distinct from the more common LPBF and is typically used for much larger parts. This welding technique has been used for many decades, but only recently, with the advent of advanced robotics, controls, and software, could it be applied to AM.
Douglass discussed several codes and standards for wire-arc DED that are closely related to existing welding standards. He also spoke about the difference between a weld and an AM material created with wire-arc DED; in essence, a weld is metal material joining two metal parts together, while an AM material created with wire-arc DED consists of a metal shape laid down on a base, layer by layer.
Next, Douglass described what is involved in the qualification of parts made with wire-arc DED. He closed by predicting that updated codes and standards are likely to accelerate the adoption of metal AM parts made by wire-arc DED in coming years.
In response to a question about what single instrument he would choose for testing if he did not have access to multimodal testing, Herderick pointed to optical microscopy because of how much one can learn from it. As next steps, Mahadevan said that while methods have been demonstrated on small-scale parts and applications, the most important thing is to scale them up and demonstrate them on industrial parts or components of interest
to users such as the Department of Defense. That would be a major step toward adopting these methods and developing further standards, he said.
Jong Gyu Paik, Korea Research Institute for Defense Technology Planning and Advancement (KRIT), offered an overview of South Korea’s defense-related efforts in AM. He began with a brief overview of KRIT, which is a part of the Agency for Defense Development and whose mission is to research and develop innovative defense technologies and advance the defense industry. As context, he discussed the global AM market, which has grown by about 20 percent each year and is forecast to continue that growth through 2027. More than half of the market is metal AM.
South Korea is a “fast follower” rather than an innovator in AM, Paik said, and Korean companies are now setting their sights on catching up in the quickly developing sector. The government has also identified this as an area of focus, and the value of the AM market in South Korea is expected to increase rapidly in coming years. One reason to expect this rapid growth is a significant increase in public interest, driven by the media and early adopters. Furthermore, several universities have created AM departments.
In its efforts to develop the overall AM industry, the South Korean government has defined four major strategies: (1) fostering new demand for AM technologies inside South Korea; (2) enhancing the country’s technological competitiveness; (3) strengthening the country’s AM workforce and infrastructure; and (4) creating a reputation for South Korea as a country with a high-quality and well-regulated AM industry. The main AM parts produced in South Korea today are metal parts, with uses in the automotive and aerospace industries.
Paik discussed AM R&D activities in South Korea, including specific R&D projects, and he listed technologies where there are opportunities for collaboration with South Korean entities. One was real-time microstructure control during AM processing using a pulsed laser; an objective would be to reduce porosity and improve microstructure control of AM parts. A second was in-line monitoring of defects using electromagnetic properties detection with an eddy current; this would allow defects to be addressed immediately.
The next panel, moderated by Katherine Faber, California Institute of Technology, explored barriers to scaling up AM and how statistics and analytics might enable cost savings.
Greg Larsen, Oak Ridge National Laboratory (ORNL), said he would focus on increasing the size of AM parts in his presentation. Many of the parts he makes are larger than a human, requiring gantries or large robots for their construction. To make these large-scale AM composite parts, he said, his group uses either thermoplastics or thermoset plastics. Many of the parts the group makes are composites, with fiberglass or carbon fibers extruded within the plastic material as it is laid down in its pattern; the fibers can be continuous or discontinuous, with the discontinuous fibers adding more stiffness and the continuous ones providing greater strength and fatigue resistance. Using continuous fibers, though, constrains the path the tool takes as it is laying down the material and requires creative solutions—for example, any place that the fibers cross will have a thickness buildup.
Goals for this type of AM are to create complex shapes without the need of additional tools, with limited post-processing, and to be able to use continuous fibers for strength. Developers are working with two main approaches: automated fiber placement and fused deposition modeling. Fused deposition modeling is currently evolving to include continuous fibers. Automated fiber placement is a continuous fiber method that currently uses tools and post-processing, but this approach is shifting to minimize tool use and post-processing.
In closing, Larsen said there is a need for standards focused on composite AM parts; to date, most AM standards have focused on metals. He suggested using in situ monitoring to know what to focus on in post-production non-destructive examinations.
Nikki Jain, Boeing, began with a brief overview of how Boeing uses AM. The company has data on every step of
making its AM parts, from the design and manufacture to post-processing and testing. The data are collected in a data library, where they are used as training data for ML and AI, with lessons fed back into the process for continual learning. The digital build file for a part can be sent to any Boeing supplier or facility around the world and used to create a part.
Jain then listed the steps that Boeing sees as leading to scaled production. First, it is necessary to reduce costs and increase speed and accuracy. At this point, she said, AM makes sense only for the highest-value parts; once costs go down, it will be possible to expand AM to other parts. The next steps are achieving material and mechanical repeatability and also machine industrialization—supposedly identical machines sold by a single provider should actually be identical. Other steps include automating post-processing, introducing industry standards, expanding the workforce, developing an AM ecosystem, and creating a mature digital infrastructure.
Jain closed with a list of needs, including making AM systems more stable and repeatable, integrating design into other parts of the process, making simulations faster and higher fidelity, improving process control with such things as advanced sensors, and integrating post-processing into the remainder of the AM process.
Magdi Azer, REMADE Institute, discussed how a weld repair was made and qualified on the tip cap of a high-pressure turbine blade 30 years ago. It was a very hands-on process, and the benefit of that approach was that one could see the evolution and consistency of the repair during development. It was possible to query the process at a detailed level using optical microscopy.
Azer then offered some observations on the current state of AM that were informed by his experiences. First, he said, if integrated computational materials engineering (ICME) work is required to qualify all LPBF processes and parts, a requirement that weighs especially heavily on small and medium manufacturers that lack this expertise, “we will not achieve cost-efficient and at-scale AM any time soon.” Original equipment manufacturers (OEMs) that can afford to invest large sums in the ICME approach will continue to do so and will be able to qualify the production of key components. However, for LPBF to proliferate, ways must be found to develop and qualify products and processes without ICME expertise. In short, he said, “we need to develop alternatives to ICME that accomplish the same thing.”
During his conclusion, he sketched out a possible alternative qualification scheme. It would include such elements as characterizing the powder lots produced, single-track experiments with different powder lots, using integrated process monitoring to characterize melt pool geometry and signatures versus process parameters, observing the process and part with optical emission spectroscopy, and the use of multi-input, multi-output control focused on achieving the desired microstructure.
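One element of the scheme Azer outlined, relating melt-pool geometry to process parameters from single-track experiments, could be captured with a simple fitted relation like the sketch below; the data points and the linear-in-energy-density model form are assumptions for illustration.

```python
# Sketch: fit melt-pool width vs. linear energy density from single-track data.
# Data points and the linear model form are illustrative assumptions.

# (laser power [W], scan speed [mm/s], measured melt-pool width [um])
tracks = [(150, 600, 78.0), (200, 800, 82.0), (250, 800, 97.0), (300, 1000, 95.0)]

xs = [p / v for p, v, _ in tracks]  # linear energy density, J/mm
ys = [w for _, _, w in tracks]

# Ordinary least squares for width = a * energy_density + b.
n = len(xs)
x_mean, y_mean = sum(xs) / n, sum(ys) / n
a = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
    sum((x - x_mean) ** 2 for x in xs)
b = y_mean - a * x_mean

print(f"width ~ {a:.0f} * (P/v) + {b:.0f} um")
print(f"predicted width at 280 W, 900 mm/s: {a * 280 / 900 + b:.0f} um")
```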
Kyle Saleeby, Georgia Tech, spoke about advancing AM technologies through structured and cohesive information architectures so that R&D at facilities around the country can be connected in ways that will advance AM capabilities more quickly than would otherwise be possible. Currently, he said, scaling AM technologies faces several challenges: costs and available labor expertise; quality and certification; and a lack of uniformity among processes, sensors, systems, and models. There is also a need to scale the ability of manufacturers to access and use secure data. Saleeby called for “strategic digital connectivity for AM research and characterization.”
Saleeby detailed what would be required for this sort of connectivity. In terms of AM information architectures, for instance, there needs to be an effective strategy for scaling analytics methods alongside AM processes. Cybersecurity will be important. A vital step toward this widespread connectivity will be establishing strategic partnerships where parallel developments can take place. Georgia Tech already has such partnerships with Sandia National Laboratories, where the communications flow only one way, and with NIST, where the communications go both ways.
In closing, Saleeby recommended that efforts focus on improving digital connectivity and transferability, that standard practices be developed for in situ and ex situ data collection, storage, and consumption, and that key strategic partnerships be instituted where information is shared.
The first question in the discussion concerned the workforce. Jain said that she wants Boeing’s engineers to be multiskilled, with capabilities throughout the development cycle. Saleeby added that technical colleges need to be part of workforce development.
On the topic of sharing data and information among institutions, Saleeby said that the defense industry, given its ability to handle classified information, could set up a network for sharing information among members of that community that would avoid some of the issues—such as intellectual property concerns—that might arise in a network for the broader AM community. For such networks, he added, it should be possible to use cybersecurity technologies and particular network architectures to decrease the risk of sharing information.
Asked whether there is an alternative to ICME in the materials qualification process, Azer said that parts are already being qualified without ICME. His concern is that too much emphasis is being placed on ICME in the AM field, which could slow the growth of production facilities beyond today’s OEMs.
Vincent Paquit, ORNL, spoke about how one deals with AM data in the context of certification and qualification. He described what can go wrong during robotic metal deposition, including inaccurate geometry, defects being present, robot collision, residual stress, distortion, uncontrolled microstructures, and environmental influence. A variety of in situ measurements and observations are available to monitor these issues, from melt pool monitoring, strain measurement, and shape measurement to electron beam and even neutron imaging. Many of these methods are expensive, however. The question is how can operations with relatively small budgets effectively monitor the process?
One answer is to use AI to perform in situ quality control of AM processes. In terms of detecting anomalies, Paquit said, in situ sensing that uses AI produces real-time results on par with traditional ex situ CT methods. However, this is not enough to predict the performance of the part.
To predict performance, one can create a digital thread using sensor data and a physics-based simulation that reflects what is going on at each step of the process. By testing finished parts and comparing the results of the tests with the digital model, one can hone a model that will effectively predict the performance of AM parts using only in situ data. The goal, he said, is to bypass the testing operation. With this level of knowledge about the process, he continued, it will be easier to develop a qualification and certification strategy.
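The idea of honing a model that predicts part performance from in situ data alone can be illustrated with a tiny example that estimates strength for a new build from the most similar previously tested builds; the features, measured values, and nearest-neighbor approach are invented for illustration.

```python
# Sketch: predict a property for a new build from its in situ signature by
# finding the most similar previously built-and-tested parts (illustrative data).
import math

# (mean melt-pool intensity [a.u.], spatter events per layer, tensile strength [MPa])
tested_builds = [
    (0.92, 3, 1010.0),
    (0.88, 7,  985.0),
    (0.95, 2, 1022.0),
    (0.85, 9,  970.0),
]

def predict_strength(intensity: float, spatter: float, k: int = 2) -> float:
    """Average the measured strength of the k most similar past builds."""
    def distance(b):
        return math.hypot(b[0] - intensity, (b[1] - spatter) / 10.0)  # crude scaling
    nearest = sorted(tested_builds, key=distance)[:k]
    return sum(b[2] for b in nearest) / k

print(f"predicted strength for untested build: {predict_strength(0.90, 5):.0f} MPa")
```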
To illustrate, he provided some examples. The first was using automated micrograph analysis for the prediction of mechanical properties. Another was achieving uniform, defect-free microstructures through data analytics and AI. Paquit listed opportunities in this area, including creating standards for digital twins, working on the interpretability and explainability of AI and ML outputs, developing models and simulations for defect and microstructure prediction, and using LLMs and augmented reality/virtual reality for assisting operators.
Douglass spoke about what is possible with large-format AM metal parts. He began by discussing organizational growth and emphasized that companies must scale up their capacities—in essence, the number of AM machines that they operate—alongside product size.
One major advantage of making large, one-off parts with wire-arc AM or gas metal arc DED, he said, is that it reduces lead time by half or as much as 80–90 percent. This is particularly important for replacement parts, where customers will pay more for the part to get it quickly. Douglass commented that making large parts with AM generally requires running 24 hours per day, because the parts may take weeks to finish. While the process is automated, it is still important to have workers present around the clock for maintenance activities.
Douglass then described a use case in which his company replaced an old cast part. After the part had been scanned in three dimensions and as plans for constructing the replacement were being developed, the maintenance team asked if changes could be made to the part to make
maintenance easier. It was possible, so the AM replacement part was an improvement over the existing one.
Another use case involved making a replacement part for an 8,000-pound extruder cylinder of high-strength, low-alloy steel. Where it would have taken 6 months to produce a forged part, the 3D-printed part was ready in 6 weeks.
A third example was of replacements for eight piping and fitting components used for a hydrogen furnace in a refinery. The company wanted to replace them during a planned 1-month maintenance shutdown, but new castings would have taken 3 months. Lincoln Electric produced, tested, and qualified the parts—which were about 3 feet in length and more than 500 pounds each—in just 30 days, using wire expedited from its local wire factory.
As these use cases show, it is often possible to make parts more quickly with AM than with castings, and, in general, with higher quality.
The final panel, moderated by Slade Gardener, Big Metal Additive, focused on alternative methods for qualification.
Matt Crill, Barnes Global Advisors, described a project called Delta Qualification. Its goal is to develop a simplified approach to addressing requalification activities within AM. The rapid pace of technological advances in the field necessitates this new approach, he said, and the project is looking to learn lessons from a variety of delta qualification approaches carried out by project teams working with LPBF technologies.
Ultimately, the ongoing project’s goal is to demonstrate the ability to update AM qualification rapidly and affordably in response to changes in key AM processes, post-processing, and feedstock variables. The project’s lessons will help generate draft language for standards organizations to consider for specifications and standards for delta qualification approaches. They may also be used in updates to the America Makes technology roadmaps for qualification activities.
Mark Shaw, Wichita State University, said that qualification should not be the primary focus for metal AM parts. Although there are some differences between AM metal parts and metal parts made using other methods, “metal is metal,” and people have been making qualified metal parts for 100 years. Many commercial airliners are flying today with flight-critical AM parts, and there is currently high-rate serial production of parts with as-printed surfaces that are used in high-temperature, high-fatigue environments. Instead, the major issues are cost and scale. Shaw asked how AM parts could be qualified at scale and at a reasonable cost.
Shaw offered three categories of qualification approaches, from most to least expensive: (1) point design, (2) prescriptive approaches, and (3) performance-based approaches. Point design works for low-volume parts and considers the details of the creation of the part, while performance-based approaches are less concerned with how a part was made and instead more with how it performs. The prescriptive approach is focused more on the process used to make parts.
After reviewing different types of qualification, he said that many qualification issues can be addressed by improvements in AM technology. These enhancements will increase the democratization of AM, allow the scaling of AM by a factor of 10 in 3 years, increase the ability to maintain AM process control, reduce AM costs, accelerate the creation of AM material data, and support a qualification standard.
Douglas Woodward, SpaceX, described how SpaceX approaches the qualification of AM parts. He discussed what he called “the algorithm” for dealing with additive parts. First, question the requirements, such as why a part needs to be inspected and whether new production methods might bypass the inspection method constraints. Second, consider deleting the part or process step; AM is well suited for deleting and merging parts, and the same can be done with inspection by moving it into the production of the part. Third, optimize—make all parts data accessible, and break down barriers between departments. Fourth, build the inspection method to scale, as production rates will likely be higher than expected. Finally, automate by making the machine see what humans see, having it self-correct when possible, developing predictive maintenance, and gating parts via automated flags.
Woodward then focused on SpaceX practices, such as using in-process monitoring for AM and the company’s progress toward in situ process inspection and control, with the goal being to have the system autonomously take corrective action when process issues occur. He spoke about the importance of collecting and storing in-depth data on processes and making the data available in one place. Process monitoring is especially important with high production rates, he said, because problems can be detected early before they cost too much time and money. Finally, it is important to link in-process data with as-built inspection data to understand why things go wrong.
Jeff Rossin, Relativity Space, described his company as a customer-centric rocket company that uses AM technologies such as large-scale wire-arc systems. Its first rocket to be flown, Terran 1, was about 85 percent additive parts by dry mass. At 7.5 feet in diameter and 110 feet tall, it was printed in nine sections. He described the printing process used to create the rocket: the wire-arc build of a cylinder is followed by machining, heat treatment, and final machining, with non-destructive examination at various points. He then spoke about structural assurance and qualification, describing the various tests used on the parts as well as the testing of the assembly.
Finally, Rossin discussed agility in AM iteration and its implications for qualification and certification. Qualification is not “front-loaded” in the process; it is iterative with print development at full scale. Ultimately, qualification is done every time a part is printed.
The discussion session focused on gaps in qualification. Crill articulated the need to develop standards and specifications and to have increased confidence in the systems currently in use, including collecting more data. Shaw said that there is a problem in field service and keeping machines running, particularly with LPBF. Woodward said that more open architecture is needed in AM machines.
The final panel consisted of comments from the workshop sponsor. Jennifer Wolk, Office of Naval Research, highlighted several issues that the workshop had identified: openness and how to work within open architectures, in situ data and the lowest level of measurement that provides representative information, taking post-processing into account, and sharing information about best practices. Mark Benedict, Air Force Research Laboratory, identified two approaches to generating and using data: an empirical approach and a model-based philosophy. There seems to be quite a divide between the two camps, he said.
DISCLAIMER This Proceedings of a Workshop—in Brief was prepared by Robert Pool as a factual summary of what occurred at the workshop. The statements made are those of the rapporteur or individual workshop participants and do not necessarily represent the views of all workshop participants and should not be seen as a consensus of the workshop participants; the planning committee; or the National Academies of Sciences, Engineering, and Medicine.
WORKSHOP PLANNING COMMITTEE MEMBERS Thomas R. Kurfess (NAE) (Chair), Georgia Institute of Technology; Raymundo Arroyave, Texas A&M University; Jian Cao (NAE), Northwestern University; Brent Carey, MACH-20; Julie A. Christodoulou, Office of Naval Research; Katherine T. Faber, California Institute of Technology; Slade Gardner, Big Metal Additive; Jason R. Hattrick-Simpers, University of Toronto; Ralph G. Nuzzo (NAS), University of Illinois at Urbana-Champaign; Lourdes Salamanca-Riba, Energy Innovation Institute, University of Maryland; Paul Witherell, National Institute of Standards and Technology; and Rudy Wojtecki, Applied Materials.
STAFF Erik Svedberg, Scholar; Brittany Segundo, Program Officer; Amisha Jinandra, Senior Research Analyst; Sudhir Shenoy, Associate Program Officer; Heather Lozowski, Acting Deputy Director, Program Finance; Michelle Schwalbe, Director, Board on Mathematical Sciences and Analytics and National Materials and Manufacturing Board.
REVIEWERS To ensure that it meets institutional standards for quality and objectivity, this Proceedings of a Workshop—in Brief was reviewed by Ralph Nuzzo, University of Illinois at Urbana-Champaign; Amy Peterson, University of Massachusetts; and Rebecca Taylor, National Center for Manufacturing Sciences. Katiria Ortiz, National Academies of Sciences, Engineering, and Medicine, served as the review coordinator.
SPONSOR This Proceedings of a Workshop—in Brief is based on work that was sponsored by the Army Research Office and was accomplished under Grant Number W911NF-23-1-0409. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the Army Research Office or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation herein.
SUGGESTED CITATION National Academies of Sciences, Engineering, and Medicine. 2025. Methods for Enhancing Additive Manufacturing Qualification and Certification for Defense Applications: Proceedings of a Workshop—in Brief. Washington, DC: The National Academies Press. https://doi.org/10.17226/28595.
For additional information regarding the workshop, visit https://www.nationalacademies.org/our-work/methods-for-enhancing-additive-manufacturing-qualification-and-certification-for-defense-applications-a-workshop.
Division on Engineering and Physical Sciences
Copyright 2025 by the National Academy of Sciences. All rights reserved.