taking into account computational efficiency. It is plausible to assess diverse transfer functions to identify the most appropriate one. An ANN may encompass one or more hidden layers. After finalizing the MLP structure (including the number of hidden layers and the number of neurons within each hidden layer), the determination of weights and biases ensues, a phase referred to as model training.46

The process of MLP model design entails a sequence of strategic stages (a configuration sketch follows the list):
(i) Evaluation of hidden layer count: The initial phase involves assessing the optimal number of hidden layers in the network architecture
(ii) Determination of neuron quantity in each layer: The second step entails deciding on the suitable number of neurons within each hidden layer
(iii) Specification of transfer functions: The identification of appropriate transfer functions for neuron activations is pursued
(iv) Selection of training algorithm: The selection of an effective training algorithm for network optimization concludes the design process.
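
The four design decisions above map directly onto the arguments of a standard neural-network implementation. The snippet below is a minimal sketch using scikit-learn's MLPRegressor; the single eight-neuron layer, tanh activation, and L-BFGS solver are illustrative assumptions rather than the configuration adopted in this study.

```python
# Illustrative sketch: the four MLP design stages expressed as
# MLPRegressor constructor arguments (all values are assumptions).
from sklearn.neural_network import MLPRegressor

mlp = MLPRegressor(
    hidden_layer_sizes=(8,),   # (i)+(ii): one hidden layer with 8 neurons
    activation="tanh",         # (iii): transfer function (tan-sig analogue)
    solver="lbfgs",            # (iv): training algorithm
    max_iter=2000,
    random_state=0,
)
# Calling mlp.fit(X_train, y_train) would then estimate the weights and
# biases, i.e., the "model training" phase described above.
```
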
To achieve an optimal MLP model configuration, a systematic approach is adopted. Initially, a single hidden layer is proposed, where the neuronal count aligns with the quantity of input features. Subsequently, various transfer functions, such as log-sig, tan-sig, and purelin, are systematically assessed. Once an appropriate activation function is determined, the augmentation of predictive precision is pursued. This entails the progressive elevation of the count of hidden layers and neurons within these layers, thereby capturing intricacies within the data and refining model performance through an iterative refinement process.47
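
A minimal sketch of this iterative refinement is shown below: it loops over candidate transfer functions and progressively larger hidden-layer arrangements, keeping the configuration with the best validation score. The synthetic data, the candidate grids, and the variable names X and y are assumptions made only for illustration.

```python
# Hypothetical illustration of the iterative MLP refinement described above.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Placeholder data: 200 samples, 6 input features (assumption for the demo).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=200)

X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

best = (-np.inf, None)
# Start from one hidden layer with as many neurons as input features,
# then progressively enlarge the architecture.
architectures = [(6,), (12,), (12, 6), (24, 12)]
for activation in ("logistic", "tanh", "identity"):   # log-sig, tan-sig, purelin analogues
    for layers in architectures:
        model = MLPRegressor(hidden_layer_sizes=layers, activation=activation,
                             solver="lbfgs", max_iter=2000, random_state=0)
        model.fit(X_tr, y_tr)
        score = model.score(X_val, y_val)              # R^2 on the validation split
        if score > best[0]:
            best = (score, (activation, layers))

print("best validation R^2:", best[0], "with", best[1])
```
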
2.3.4. MARSs

The inception of MARS, attributed to Friedman,48 has permeated various branches of engineering, particularly hydraulic engineering, demonstrating broad utilization. MARS is an adaptable tool, facilitating the establishment of relationships between independent and dependent variables within a targeted system. By leveraging the MARS method, latent patterns within complex datasets are discerned, unveiling hidden insights in intricate designs. The pattern recognition process involves the proposal of a range of coefficients and basis functions, validated through regression operations performed on the relevant dataset. A key strength of MARS lies in its aptitude for effectively mapping input parameters to desired outputs, constructing uncomplicated yet resilient models, and offering a judicious balance in computational expense.49
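
To make the idea of basis functions and knots concrete, the sketch below constructs the two mirrored hinge functions max(0, x − t) and max(0, t − x) at a single candidate knot t and estimates their coefficients by least-squares regression, mirroring the validation-through-regression step described above. The data and the knot location are illustrative assumptions.

```python
# Minimal illustration of MARS-style piecewise linear basis functions.
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 10.0, size=120))
y = np.where(x < 4.0, 1.5 * x, 6.0 + 0.2 * (x - 4.0)) + 0.3 * rng.normal(size=x.size)

knot = 4.0  # candidate knot; MARS searches over many such locations
basis = np.column_stack([
    np.ones_like(x),            # intercept term
    np.maximum(0.0, x - knot),  # hinge h1(x) = max(0, x - knot)
    np.maximum(0.0, knot - x),  # mirrored hinge h2(x) = max(0, knot - x)
])

# Least-squares regression yields the coefficients of the basis functions.
coef, *_ = np.linalg.lstsq(basis, y, rcond=None)
y_hat = basis @ coef
print("fitted coefficients:", np.round(coef, 3))
print("residual RMSE:", round(float(np.sqrt(np.mean((y - y_hat) ** 2))), 3))
```
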
To simulate and model a problem using the MARS algorithm, the first step is to define the objective of the modeling process, whether it involves predicting a continuous variable (regression) or classifying data. Once the problem is clearly understood, a relevant dataset is collected and preprocessed by handling missing values, removing outliers if necessary, and normalizing or scaling the features to ensure consistent input data. After preparing the dataset, it is typically divided into training and testing sets to allow for subsequent performance evaluation. The MARS model is then initialized by selecting key parameters, such as the maximum number of basis functions, the degree of interactions allowed between variables, and any penalties for model complexity to prevent overfitting. The training process begins by allowing the model to construct a set of piecewise linear basis functions, which adaptively split the data at optimal points (called knots). MARS adds these functions in a forward stepwise manner to minimize residual error and capture non-linear relationships and interactions between variables. After building a complex model in the forward phase, a backward pruning process is applied to eliminate redundant or less important basis functions, resulting in a simpler and more generalizable model. Once training is complete, the model's performance is evaluated using the testing dataset. Finally, the resulting MARS model yields an interpretable set of rules or functions that describe the underlying patterns in the data, rendering it useful for both predictions and understanding variable relationships.50
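
A compact sketch of this end-to-end workflow is given below. It assumes the open-source py-earth package is available as a MARS implementation; its parameters max_terms, max_degree, and penalty correspond to the maximum number of basis functions, the permitted interaction degree, and the complexity penalty mentioned above, but the values shown are placeholders and the package interface should be checked against its documentation.

```python
# Hedged sketch of a MARS workflow: preprocess, split, fit (forward pass plus
# backward pruning), and evaluate. Assumes the third-party py-earth package
# (pyearth) is installed; verify its API before use.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from pyearth import Earth

# Placeholder data standing in for the preprocessed study dataset (assumption).
rng = np.random.default_rng(2)
X = rng.normal(size=(300, 4))
y = np.sin(X[:, 0]) + 0.5 * np.maximum(0.0, X[:, 1] - 0.3) + 0.1 * rng.normal(size=300)

# Scale features and split into training and testing sets.
X_scaled = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X_scaled, y, test_size=0.25, random_state=0)

# Key controls: maximum basis functions, interaction degree, complexity penalty.
mars = Earth(max_terms=20, max_degree=2, penalty=3.0)
mars.fit(X_tr, y_tr)   # forward stepwise addition of hinges, then backward pruning

print(mars.summary())  # interpretable list of the retained basis functions
pred = mars.predict(X_te)
print("test RMSE:", float(np.sqrt(np.mean((y_te - pred) ** 2))))
```
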
2.4. Performance assessment criteria of MLMs

Several key metrics were used to assess the effectiveness of the employed algorithms in ML prediction and forecasting models. These metrics encompassed the R², the RMSE, the MAE, and the developed discrepancy ratio (DDR), as detailed in Equations XII-XV.51

R^{2}=\frac{\left[\sum_{i=1}^{N}\left(O_{i}-\bar{O}\right)\left(P_{i}-\bar{P}\right)\right]^{2}}{\sum_{i=1}^{N}\left(O_{i}-\bar{O}\right)^{2}\,\sum_{i=1}^{N}\left(P_{i}-\bar{P}\right)^{2}}   (XII)

RMSE=\sqrt{\frac{\sum_{i=1}^{N}\left(O_{i}-P_{i}\right)^{2}}{N}}   (XIII)

MAE=\frac{\sum_{i=1}^{N}\left|O_{i}-P_{i}\right|}{N}   (XIV)

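The sketch below evaluates Equations XII-XIV directly with NumPy, assuming O denotes the observed values and P the corresponding model predictions (the usual convention for these symbols). The DDR of Equation XV is omitted because its definition does not appear on this page.

```python
# Straightforward NumPy versions of Equations XII-XIV (R^2, RMSE, MAE).
# O = observed values, P = predicted values (assumed convention for the symbols).
import numpy as np

def r_squared(O, P):
    O, P = np.asarray(O, float), np.asarray(P, float)
    num = np.sum((O - O.mean()) * (P - P.mean())) ** 2
    den = np.sum((O - O.mean()) ** 2) * np.sum((P - P.mean()) ** 2)
    return num / den                          # Equation XII

def rmse(O, P):
    O, P = np.asarray(O, float), np.asarray(P, float)
    return np.sqrt(np.mean((O - P) ** 2))     # Equation XIII

def mae(O, P):
    O, P = np.asarray(O, float), np.asarray(P, float)
    return np.mean(np.abs(O - P))             # Equation XIV

# Example with dummy values:
O = np.array([1.0, 2.0, 3.0, 4.0])
P = np.array([1.1, 1.9, 3.2, 3.8])
print(r_squared(O, P), rmse(O, P), mae(O, P))
```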