handling non-linear relationships within the dynamical dependencies, capturing complex patterns across various levels of abstraction. The transformer can manage and interpret the non-linear dependencies that the linear perspective of the AM state embedder's Koopman operator might miss. In AM processes, the characteristics of each point, line, or layer are often interrelated in a non-linear and non-local manner. The transformer part of the proposed method effectively captures long-range, non-linear, and non-local dynamical dependencies through self-attention. This mechanism allows each AM state embedding to attend to all others in the concatenated input of AM state embedding vectors in the latent representation, as shown in Figure 12. This capability enables an understanding of how changes in one physical state in AM can affect future states at multiple spatial and temporal scales, even if they are not adjacent in space and time. This advantage contributed to the proposed AMTransformer's superior performance over the ConvLSTM model, which can only capture dependencies among states sequentially.
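To make this concrete, the sketch below shows single-head scaled dot-product self-attention applied to a sequence of AM state embeddings. The dimensions, tensor names, and random weights are illustrative assumptions, not the AMTransformer's actual implementation, which would use learned, typically multi-head, projections.

# Minimal sketch: scaled dot-product self-attention over a sequence of
# AM state embeddings. Names and dimensions are illustrative assumptions.
import torch
import torch.nn.functional as F

def self_attention(z, w_q, w_k, w_v):
    """z: (seq_len, d) concatenated AM state embedding vectors."""
    q, k, v = z @ w_q, z @ w_k, z @ w_v        # queries, keys, values
    scores = (q @ k.T) / (k.shape[-1] ** 0.5)  # each state scores every other state
    attn = F.softmax(scores, dim=-1)           # rows are attention distributions
    return attn @ v, attn                      # mixed states and the weights

d = 64                                         # assumed embedding width
z = torch.randn(10, d)                         # e.g., 10 embedded AM states
w_q, w_k, w_v = (torch.randn(d, d) for _ in range(3))
out, attn = self_attention(z, w_q, w_k, w_v)
# attn[i, j] measures how strongly state i draws on state j, even when the two
# are not adjacent in space or time, which is the non-local dependency above.

Because every pair of positions is scored directly, no dependency has to travel through intermediate steps, in contrast to the sequential propagation of a ConvLSTM.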
The new modeling approach for AM proposed in this paper is impactful because AM part geometries, hierarchies, materials, and functionalities can be extremely complex due to AM's unique design freedom. The ability to model such complexity is crucial for understanding how control of a state may influence future states across multiple scales in complex patterns. The integration of linear and non-linear methods in the AMTransformer enhances data-driven modeling in AM by providing a more comprehensive understanding of the dynamics involved in AM processes. In addition, the linear method can stabilize and expedite the learning process, while the non-linear method can increase the model's adaptability and accuracy in AM scenarios where complex interactions between physical entities occur.
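One way to see how the linear and non-linear parts can cooperate during training is a weighted objective that pairs a Koopman-consistency term with a full-model prediction term. The sketch below is an assumption about what such an objective might look like, not the paper's actual loss; K, alpha, and the mean-squared form are all illustrative.

import torch

def combined_loss(z_t, z_next, K, x_hat_next, x_next, alpha=0.5):
    # Linear term: embeddings should evolve approximately linearly under the
    # learned Koopman matrix K, a smooth signal that can stabilize and
    # expedite early training.
    linear_term = torch.mean((z_next - z_t @ K) ** 2)
    # Non-linear term: the full pipeline (embedder, transformer, decoder)
    # should reproduce the actual next AM state, e.g., the next MPM frame.
    prediction_term = torch.mean((x_hat_next - x_next) ** 2)
    return alpha * linear_term + (1.0 - alpha) * prediction_term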
Our approach involves formulating the dynamics of
AM into generalizable representations, which allows us
to model AM without being restricted to specific types of
AM processes or data. This generalizability of the proposed
method facilitates the expansion of the model’s scope and
enhances its applicability in various fields.
Figure 11. Comparison of predicted melt pool monitoring (MPM) images (top row) with original MPM images (bottom row)

7. Conclusion
This paper presents a novel method, the
AMTransformer, and proposes a formal representation
of AM dynamics to provide a foundation for ML models
to capture these dynamics. In addition, we introduce
a Koopman theory-based transformer that enables
improved prediction of future AM states. The proposed
method offers a fundamental modeling approach to
comprehend complex spatiotemporal dependencies
among physical entities and their properties in AM. Our
study adapts Koopman theory and the transformer’s
attention mechanism to enhance the generation of latent
embeddings that capture key information about AM
states, their spatiotemporal dependencies, and their
evolution in AM processes.
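For reference, the standard Koopman-embedding formulation that this kind of model builds on can be stated compactly; the symbols below are generic placeholders, since this section does not fix the paper's notation:

\[
\mathbf{z}_t = \phi(\mathbf{x}_t), \qquad
\mathbf{z}_{t+1} \approx K\,\mathbf{z}_t, \qquad
\hat{\mathbf{x}}_{t+1} = \psi\!\left(K\,\mathbf{z}_t\right),
\]

where \(\mathbf{x}_t\) is the AM state at step \(t\) (e.g., an MPM image), \(\phi\) is the learned state embedder, \(K\) is a finite-dimensional approximation of the linear Koopman operator, and \(\psi\) decodes back to the state space. The transformer's attention then operates on the resulting sequence of embeddings to supply the non-linear, non-local dependencies that \(K\) alone cannot capture.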
In the future, we will focus on gaining a deeper
understanding and refining the design of AM-specific
attention mechanisms within the model. While case study
results indicate that structural changes affect attention
dynamics, further exploration is needed to comprehend
how the attention mechanism captures relevant dynamical
dependencies both spatially and temporally. This future
work will involve analyzing the alignment of attention patterns with melt pool locations and tool paths to interpret the dynamical dependencies the model captures.

Figure 12. The AMTransformer's attention mechanism learning dynamical dependencies at multiple scales: The blue dotted lines show how the AMTransformer learns dependencies among melt pools