for the model to perform well not only on specific print samples but also on new types of defects caused by different materials in future applications. In future work, we hope to expand the dataset with more experimental data or conduct more detailed classification and detection studies based on different materials and printing builds.
In the object detection phase, the YOLOv5 model achieved an AP of 81.5%, significantly higher than that of the Faster R-CNN model (46.25%). This suggests that YOLOv5 may be better suited to PBF-LB defect detection tasks, offering stronger adaptability to complex, multi-scale defects with a faster inference speed of under 1 s. Nevertheless, the Faster R-CNN model also demonstrated satisfactory detection results, effectively identifying potential defects and their locations within 2 s, which ensures timely detection of layer-wise changes and confirms its feasibility for real-time monitoring and for assisting manual inspection.
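To make the inference-speed comparison above concrete, the following is a minimal sketch of how per-layer latency for the two detectors might be measured with off-the-shelf PyTorch tooling. The weight files (yolov5_pbf.pt, fasterrcnn_pbf.pth), the layer image name, the two-class detection head, and the 0.5 score threshold are illustrative assumptions, not details taken from this study.

```python
# Minimal latency sketch. Assumptions (not from the study): a custom YOLOv5
# checkpoint "yolov5_pbf.pt", a fine-tuned Faster R-CNN state dict
# "fasterrcnn_pbf.pth" with a 2-class head (background + defect), and a
# powder-bed layer image "layer_0153.png".
import time

import torch
import torchvision
from PIL import Image
from torchvision.transforms import functional as F

layer_img = Image.open("layer_0153.png").convert("RGB")

# YOLOv5: torch.hub fetches the ultralytics/yolov5 repo and loads custom weights.
yolo = torch.hub.load("ultralytics/yolov5", "custom", path="yolov5_pbf.pt")
t0 = time.perf_counter()
yolo_results = yolo(layer_img)              # forward pass + built-in NMS
yolo_latency = time.perf_counter() - t0
yolo_boxes = yolo_results.xyxy[0]           # tensor of [x1, y1, x2, y2, conf, cls]

# Faster R-CNN: torchvision reference implementation (torchvision >= 0.13 API).
frcnn = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights=None, num_classes=2)
frcnn.load_state_dict(torch.load("fasterrcnn_pbf.pth", map_location="cpu"))
frcnn.eval()
tensor_img = F.to_tensor(layer_img)         # (C, H, W) float tensor in [0, 1]
t0 = time.perf_counter()
with torch.no_grad():
    frcnn_out = frcnn([tensor_img])[0]      # dict with "boxes", "labels", "scores"
frcnn_latency = time.perf_counter() - t0

print(f"YOLOv5: {len(yolo_boxes)} boxes in {yolo_latency:.2f} s")
print(f"Faster R-CNN: {(frcnn_out['scores'] > 0.5).sum().item()} boxes "
      f"(score > 0.5) in {frcnn_latency:.2f} s")
```

Both calls include the models' own post-processing (NMS for YOLOv5, box filtering inside the torchvision Faster R-CNN forward pass), so the measured times are roughly comparable end-to-end figures rather than bare forward-pass latencies.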
The generally lower precision of object detection models compared to image classification models can be attributed to several limitations, including complex defect types, variable object scales, low-contrast backgrounds, and suboptimal annotation quality. These factors often result in false positives or missed detections, reducing precision, recall, and AP.
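Because this argument hinges on how false positives and missed detections feed into precision, recall, and AP, a short, self-contained sketch of the usual bookkeeping may be helpful. The greedy matching at an IoU threshold of 0.5, the 101-point interpolation, and the toy boxes are assumptions for illustration; they do not reproduce the evaluation code used in this work.

```python
import numpy as np

def iou(box_a, box_b):
    """Intersection over union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def precision_recall_ap(preds, gts, iou_thr=0.5):
    """preds: list of (confidence, box); gts: list of boxes. Single class."""
    preds = sorted(preds, key=lambda p: p[0], reverse=True)  # rank by confidence
    matched = set()
    tp = np.zeros(len(preds))
    fp = np.zeros(len(preds))
    for i, (_, box) in enumerate(preds):
        ious = [iou(box, g) for g in gts]
        best = int(np.argmax(ious)) if ious else -1
        if best >= 0 and ious[best] >= iou_thr and best not in matched:
            tp[i] = 1.0
            matched.add(best)            # each ground truth can be matched once
        else:
            fp[i] = 1.0                  # poor overlap or duplicate -> false positive
    cum_tp, cum_fp = np.cumsum(tp), np.cumsum(fp)
    recall = cum_tp / max(len(gts), 1)                       # misses cap recall
    precision = cum_tp / np.maximum(cum_tp + cum_fp, 1e-9)   # FPs drag precision down
    # 101-point interpolated area under the precision-recall curve.
    ap = sum(precision[recall >= r].max() if np.any(recall >= r) else 0.0
             for r in np.linspace(0.0, 1.0, 101)) / 101
    return precision[-1], recall[-1], ap

# Toy numbers for illustration only (not measurements from this study).
preds = [(0.92, [10, 10, 60, 60]), (0.80, [200, 40, 260, 110]), (0.30, [5, 300, 40, 340])]
gts = [[12, 8, 58, 62], [205, 45, 255, 105]]
print(precision_recall_ap(preds, gts))
```

Each duplicate or poorly localized box counts as a false positive and lowers precision, while every unmatched ground-truth defect is a miss that caps recall; both effects reduce the interpolated AP, which matches the failure pattern described above.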
Nevertheless, this approach paves the way for real-time compensation of bounding box-based defective regions: once defective regions are detected, they can be repaired by recoating the powder bed and/or exposing the defective region to a laser with a standard or customized volumetric energy density. More precise mask annotations could further improve model performance; pixel-level labeling is particularly effective for detecting complex shapes that require accurate localization, as it reduces boundary errors and enhances detection accuracy.
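For reference, volumetric energy density in the PBF-LB literature is commonly defined as laser power divided by the product of scan speed, hatch spacing, and layer thickness. The small sketch below uses that common definition with purely illustrative parameter values, not the process parameters of this study.

```python
def volumetric_energy_density(power_w, scan_speed_mm_s, hatch_mm, layer_mm):
    """Common PBF-LB definition: E_v = P / (v * h * t), returned in J/mm^3."""
    return power_w / (scan_speed_mm_s * hatch_mm * layer_mm)

# Illustrative parameters only (not the process parameters used in this study):
# 200 W laser power, 800 mm/s scan speed, 0.10 mm hatch spacing, 0.03 mm layer.
print(f"{volumetric_energy_density(200, 800, 0.10, 0.03):.1f} J/mm^3")  # ~83.3
```

A region flagged for re-melting could then be exposed with a deliberately increased or reduced E_v relative to the nominal setting, which is one way to read the "customized volumetric energy density" mentioned above.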
4.2. Comparison to previous studies
Most previous studies have adopted image classification methods, and this study followed a similar approach to evaluate model performance (Table 8). The reported accuracies in the literature vary widely (70 – 100%),
Table 8. Previous studies on image classification models

Author, year | Research object | Model(s) used | Accuracy (%) | References
Yin et al., 2025 | Defect detection in PBF-LB | ResNet50 and EfficientNetV2B0 | ResNet50: 100; EfficientNetV2B0: 99.62 | This work
Han et al., 2019 | Microscopic metal images for AM defect detection | Inception-ResNet-v2 | 87.5 | 30
Khan et al., 2021 | FFF 3D printing defect detection | CNN | 84 | 7
Kadam et al., 2021 | FDM defect detection | AlexNet + SVM | 99.7 | 16
Westphal and Seitz, 2021 | PBF defects in the selective laser sintering process | VGG16 and Xception CNN | VGG16: 95.8; Xception CNN: not specified | 25
Ansari et al., 2022 | Porosity detection in PBF-LB | Custom CNN | CAD labels: 90; XCT labels: 97 | 27
Fu et al., 2022 | Overview of ML-based defect detection in PBF-LB | CNN, SVM, and other ML models | ~75 – 95 across studies | 29
Abhilash and Ahmed, 2023 | Surface quality improvement in metal AM | ResNet-50 CNN | 96 | 32
Akmal et al., 2023 | Defect detection in PBF-LB | CNN, ANN, MLR | CNN: 100; ANN and MLR: ~99 | 41
Khan et al., 2023 | Anomaly detection in PBF-LB using OT imaging | Random forest | 99.98 | 31
Lee et al., 2023 | Defect detection in PBF-LB | 3D-CNN | Recall: 70.47 | 33
Schmitt et al., 2023 | Powder bed monitoring in metal AM | Xception-style neural network | ~99.15 for large patches | 42
Kozhay et al., 2024 | Defect detection in FDM | Custom CNN with ResNet backbone | 97 | 28
Kuriachen et al., 2025 | Defect detection in FDM | Custom lightweight CNN | 97.77 | 43

Abbreviations: AM: Additive manufacturing; ANN: Artificial neural network; CAD: Computer-aided design; CNN: Convolutional neural network; FDM: Fused deposition modeling; FFF: Fused filament fabrication; ML: Machine learning; MLR: Multinomial logistic regression; OT: Optical tomography; PBF-LB: Laser-based powder bed fusion; SVM: Support vector machine; XCT: X-ray computed tomography.

