



The SR results were evaluated with the PSNR and SSIM. To verify the superiority of the MPSR-Net in the SR reconstruction of melt pool images, the traditional Bicubic method, SRCNN, FSRCNN, and other DL models were used for comparison. The results are shown in Table 2.

Except for EDSR, the DL models performed better than the traditional interpolation method, demonstrating the high value of DL in melt pool image reconstruction. The MPSR-Net achieved the best results among all methods, indicating that directly applying traditional SR models to melt pool images does not yield excellent results; targeted improvements based on the characteristics of the melt pool are needed. Therefore, this study proposes targeted optimizations. First, the network is made lightweight to avoid parameter redundancy and overfitting, since melt pool images have a simple background and target. Meanwhile, an attention mechanism is adopted to emphasize high-frequency information, which enhances the reconstruction of the melt pool boundary. Further, multi-scale features of the melt pool image are extracted by the U-Net branch. Other attention mechanisms (CBAM, CA, and SENet) were used for comparison, confirming that advanced attention mechanisms can effectively improve model performance, with ECA-Net proving the most effective. Compared with using a single branch, the dual-path form extracts different key information of the melt pool, and feature fusion then further improves network performance. In the end, the average PSNR of the model reached 36.13 and the average SSIM reached 0.962; the SR reconstruction quality was found to be good and can be used for melt pool monitoring.

Table 1. Parameters used in the experiment
Parameter                   Experiment setup
Data set                    3847 images
Gradient update algorithm   Adaptive moment estimation (Adam)
Loss function               Mean squared error (MSE) loss
Learning rate               1e-4
Iterations                  100
Software environment        Python 3.9; PyTorch 2.0.0
Hardware environment        GPU: RTX 4070; CPU: Intel i7-13700K

Figure 8 shows the comparison of the different SR methods. It can be observed that the MPSR-Net produces clearer melt pool edges, and the reconstructed boundaries remain clear even in the case of spatter adhesion. The melt pool image reconstructed by MPSR-Net restored more morphological details and effectively overcame the interference of noise. However, image quality is difficult to assess through subjective visual evaluation alone; the following section therefore showcases the use of melt pool features as new evaluation metrics.

3.2. Melt pool feature extraction

To demonstrate the significance of SR for melt pool images, melt pool features were extracted through OTSU threshold segmentation and contour extraction methods.23 The HR images were used as ground truth. The MAPE and
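As a rough illustration of this feature-extraction step, the sketch below applies OTSU thresholding and contour extraction with OpenCV to a grayscale melt pool image. The file name, the largest-contour assumption, and the specific features returned (area, bounding-box width and length, perimeter) are illustrative choices and are not taken from the paper.

import cv2

def extract_melt_pool_features(image_path):
    """Segment the melt pool with OTSU thresholding and return simple contour features."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)

    # OTSU selects the global threshold separating the bright melt pool from the dark background.
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # External contours of the segmented regions.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None

    # Assume the largest contour is the melt pool (spatter yields smaller regions).
    melt_pool = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(melt_pool)
    return {
        "area": cv2.contourArea(melt_pool),
        "width": w,
        "length": h,
        "perimeter": cv2.arcLength(melt_pool, True),
    }

features = extract_melt_pool_features("melt_pool_sr.png")  # hypothetical file name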


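Table 2 below reports the PSNR and SSIM for each method. For reference, a minimal sketch of how these two metrics could be computed for one reconstructed frame against its HR ground truth, using scikit-image (an assumed implementation; the paper does not state which library was used):

import cv2
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

hr = cv2.imread("melt_pool_hr.png", cv2.IMREAD_GRAYSCALE)  # HR ground-truth frame (hypothetical file)
sr = cv2.imread("melt_pool_sr.png", cv2.IMREAD_GRAYSCALE)  # reconstructed frame, same size

psnr = peak_signal_noise_ratio(hr, sr, data_range=255)  # higher is better, in dB
ssim = structural_similarity(hr, sr, data_range=255)    # 1.0 means structurally identical
print(f"PSNR = {psnr:.2f} dB, SSIM = {ssim:.4f}")

The averages quoted in the text (36.13 and 0.962) correspond to such per-image values averaged over the test images.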
            Table 2. PSNR and SSIM of different SR methods
Networks                90 W                 110 W                130 W                170 W
                              PSNR       SSIM      PSNR       SSIM       PSNR      SSIM       PSNR       SSIM
            Bicubic            21.91     0.8267     23.36     0.8579     23.01     0.8466     23.23      0.8432
            EDSR 31            21.03     0.6170     21.09     0.6280     20.46     0.6238     19.43      0.6218
            SRCNN 28           27.58     0.8990     20.97     0.9251     28.89     0.9161     29.26      0.9095
            FSRCNN 29          29.83     0.9262     31.64     0.9420     30.67     0.9358     31.12      0.9302
            VDSR 30            29.74     0.9205     32.70     0.9453     31.48     0.9387     31.72      0.9342
            RCAN (CA)  32      31.52     0.9034     33.48     0.9267     32.97     0.9211     32.90      0.9132
RCAN (CBAM41)      34.52     0.9524     36.20     0.9611     35.68     0.9574     35.53      0.9511
RCAN (SENet42)     33.26     0.9363     35.31     0.9524     34.73     0.9484     34.75      0.9416
            U-Net branch       34.09     0.9543     35.78     0.9622     35.07     0.9579     34.98      0.9506
            RCEN branch        34.85     0.9597     36.48     0.9654     35.80     0.9620     35.44      0.9570
            MPSR-Net           35.05     0.9561     37.17     0.9681     36.35     0.9645     35.94      0.9595
            Abbreviations: CA: Channel attention; CBAM: Convolutional block attention module; EDSR: Enhanced deep super-resolution network; FSRCNN: Fast
            super-resolution convolutional neural network; MPSR-Net: Melt pool super-resolution network; PSNR: Peak signal-to-noise ratio; RCAN: Residual
            channel attention network; RCEN: Residual channel with the efficient channel attention network; SENet: Squeeze-and-excitation networks;
            SR: Super-resolution; SRCNN: Super-resolution convolutional neural network; SSIM: Structural similarity index measure; VDSR: Very deep
            super-resolution network.
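For completeness, the training configuration listed in Table 1 (Adam, MSE loss, learning rate 1e-4, 100 iterations) corresponds to a PyTorch setup along the following lines. The single-convolution-plus-pixel-shuffle stand-in model and the random tensors only demonstrate the loop mechanics; they are not the paper's MPSR-Net or data set.

import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in x2 upscaler in place of the actual MPSR-Net (illustrative only).
model = nn.Sequential(nn.Conv2d(1, 4, kernel_size=3, padding=1), nn.PixelShuffle(2)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # Adam, lr = 1e-4 (Table 1)
criterion = nn.MSELoss()                                   # MSE loss (Table 1)

# Stand-in data: one random batch of LR/HR pairs; the real data set has 3847 melt pool images.
lr_batch = torch.rand(8, 1, 64, 64, device=device)
hr_batch = torch.rand(8, 1, 128, 128, device=device)

for iteration in range(100):                               # 100 iterations (Table 1)
    optimizer.zero_grad()
    loss = criterion(model(lr_batch), hr_batch)            # SR output vs. HR target
    loss.backward()
    optimizer.step()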