dynamical dependencies between state and rate properties within a state across all state transitions in AM. The AM state embedder then applies the Koopman operator repeatedly, capturing the dynamical dependencies in AM state transitions within the observable space through the evolution of the AM embedding vectors, as shown in Equation V:

ε_{i+1} = 𝒦 ε_i (V)
Using the AM state embedding as inputs, the Koopman operator projects the AM state transitions onto the embedding state transition in an infinite-dimensional observable space as a linear operator, as shown in Figure 4. The Koopman operator offers a method to examine non-linear dynamical systems by transforming them into a linear framework with infinite dimensions.29 The relationship can be defined in Equation VI:
𝒦 g(Z_i) = g(F(Z_i)) = g(Z_{i+1}) (VI)

where g denotes an observable function mapping an AM state into the observable space and F denotes the underlying non-linear state-transition map.
The Koopman operator is based on the Koopman theory and employs a dynamic mode decomposition (DMD) approach. DMD is a method for representing the Koopman operator using finite-dimensional approximations based on available data.30-35 In the AM state embedder, DMD identifies essential measurement functions that control the dynamics of an AM system, along with a corresponding finite approximation of the Koopman operator. Direct application of the Koopman operator in its infinite-dimensional form is impractical, so DMD provides a feasible alternative. DMD can identify coherent structures in high-dimensional data by analyzing multiple snapshots over time, aiding in the prediction of future states. From the two collected snapshot matrices, which are sets of sequential embedding vectors denoted as S_a = [ε_0, ε_1, ⋯, ε_t] and S_b = [ε_1, ε_2, ⋯, ε_{t+1}], with one state-transition step difference, DMD determines the dominant spectral decomposition, that is, the eigenvectors and eigenvalues, of the best-fit linear operator linking the two matrices. This linear operator is adopted as 𝒦, as shown in Equation VII:

𝒦 = S_b S_a^† (VII)

where S_a^† denotes the Moore–Penrose pseudoinverse of S_a.
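To make the DMD step concrete, the following sketch (a minimal illustration, not the authors' implementation; the shapes, random placeholder data, and the names S_a, S_b, and K are assumptions) computes the finite-dimensional Koopman approximation from two one-step-shifted snapshot matrices via the pseudoinverse, extracts its spectral decomposition, and rolls an embedding forward as in Equations V and VII:

import numpy as np

# Snapshot matrices of embedding vectors (one embedding per column),
# offset by one state-transition step: S_a = [eps_0, ..., eps_t] and
# S_b = [eps_1, ..., eps_{t+1}]. Sizes and data here are placeholders.
e, t = 16, 200                       # embedding dimension, number of transitions
S_a = np.random.randn(e, t)          # stand-in for encoded AM states
S_b = np.random.randn(e, t)          # stand-in for their one-step successors

# Best-fit linear operator linking the two matrices (Equation VII):
# the exact-DMD least-squares solution K = S_b S_a^+.
K = S_b @ np.linalg.pinv(S_a)

# Dominant spectral decomposition (eigenvalues and eigenvectors) of K.
eigvals, eigvecs = np.linalg.eig(K)

# Repeated application of K evolves an embedding through successive AM
# state transitions in the observable space (Equation V).
eps = S_a[:, 0]
trajectory = [eps]
for _ in range(5):
    eps = K @ eps                    # eps_{i+1} = K eps_i
    trajectory.append(eps)

Whether 𝒦 is obtained once from such a least-squares fit or refined jointly with the encoder during training is a design choice not detailed in this excerpt.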
The decoder part, consisting of a multi-layer neural network symmetric to the encoder part, reconstitutes the embedding vectors from the Koopman operator back into the original physical space, D: ℝ^e → ℝ^d, during training. The data representing the AM state obtained from the decoder can be expressed as Equation VIII:

Z_{i+1} = D(ε_{i+1}) (VIII)
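A minimal PyTorch sketch of the encoder–Koopman–decoder structure described above may help fix ideas; the layer widths, module names, and the use of a learnable linear layer for the Koopman operator are illustrative assumptions rather than the authors' architecture:

import torch
import torch.nn as nn

class AMStateEmbedderSketch(nn.Module):
    """Illustrative encoder-Koopman-decoder structure; not the authors' code."""

    def __init__(self, d: int = 8, e: int = 16):
        super().__init__()
        # Encoder E: R^d -> R^e maps an AM state (state and rate properties)
        # into the embedding (observable) space.
        self.encoder = nn.Sequential(nn.Linear(d, 64), nn.ReLU(), nn.Linear(64, e))
        # Finite-dimensional Koopman operator: a linear map on embeddings.
        self.koopman = nn.Linear(e, e, bias=False)
        # Decoder D: R^e -> R^d, symmetric to the encoder, reconstitutes the
        # physical-space AM state (Equation VIII).
        self.decoder = nn.Sequential(nn.Linear(e, 64), nn.ReLU(), nn.Linear(64, d))

    def forward(self, z_i: torch.Tensor):
        eps_i = self.encoder(z_i)        # embed the current AM state
        eps_next = self.koopman(eps_i)   # eps_{i+1} = K eps_i (Equation V)
        z_recon = self.decoder(eps_i)    # reconstruction of z_i
        z_next = self.decoder(eps_next)  # predicted next AM state (Equation VIII)
        return eps_i, eps_next, z_recon, z_next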
The AM state embedder is trained using the loss function shown in Equation IX. This loss function includes two crucial components. The first component guarantees a uniform mapping to and from the embedded representation, capturing the dependencies between the state and rate properties at an AM state while minimizing reconstruction loss. The second component, the Koopman dynamics loss, encourages embeddings to adhere to linear dynamics, minimizing errors in capturing the dynamical dependencies during each AM state transition.

L_AMSE = (1/NT) Σ_{i=1}^{N} Σ_{j=1}^{T} [λ_0 l_recon(Z_j^i) + λ_1 l_KD(Z_j^i)] (IX)

In this equation, L_AMSE is the total loss function of the AM state embedder, Z_j^i represents the j-th observed data point corresponding to an AM state in the i-th observation data sequence, N is the number of observation data sequences, T is the length of an observation sequence (i.e., the number of AM state embeddings constituting a data sequence), l_recon and l_KD represent the reconstruction loss and the Koopman dynamics loss, respectively, and λ_0 and λ_1 are coefficients.
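Assuming mean-squared-error forms for l_recon and l_KD (the exact distance measures are not given in this excerpt), a sketch of the two-term objective in Equation IX, reusing the illustrative module above, could look as follows:

import torch.nn.functional as F

def am_state_embedder_loss(model, z_i, z_next, lam0=1.0, lam1=1.0):
    """Illustrative L_AMSE with MSE stand-ins for l_recon and l_KD."""
    _, eps_pred, z_recon, _ = model(z_i)

    # l_recon: D(E(z_i)) should reproduce z_i, enforcing a uniform mapping
    # to and from the embedded representation.
    l_recon = F.mse_loss(z_recon, z_i)

    # l_KD (Koopman dynamics loss): the Koopman-advanced embedding should
    # match the embedding of the actually observed next AM state.
    eps_next_true = model.encoder(z_next)
    l_kd = F.mse_loss(eps_pred, eps_next_true)

    # Weighted sum; Equation IX averages this over all N sequences and the
    # T positions within each sequence.
    return lam0 * l_recon + lam1 * l_kd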
Figure 5 illustrates the architecture of the AM state embedder. During training, the embedder constantly
updates each part by identifying the evolutionary
characteristics of the AM states. The architecture of the
AM state embedder, grounded in the Koopman theory,
is designed to comprehend the intrinsic dynamics
of AM processes. This approach guarantees that the
resulting trained embeddings accurately encapsulate the
AM states, consisting of their state and rate properties
and transitions, thereby facilitating the capture of
their evolutionary trajectory and inherent dynamical
dependencies.
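As a usage note, a single joint update of all parts of the embedder, under the assumptions of the sketches above rather than the authors' training procedure, might look like this:

import torch

model = AMStateEmbedderSketch(d=8, e=16)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative update on a random placeholder batch of consecutive AM
# states (z_i, z_{i+1}); real data would come from observation sequences.
z_i, z_next = torch.randn(32, 8), torch.randn(32, 8)

loss = am_state_embedder_loss(model, z_i, z_next, lam0=1.0, lam1=1.0)
optimizer.zero_grad()
loss.backward()
optimizer.step()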
Figure 4. A projection of additive manufacturing (AM) states and their transitions into AM state embeddings and their transitions within an observable space

4.2.2. Transformer
The transformer part enhances the elucidation of spatial and temporal dynamical dependencies, primarily based

