Evaluation of the Effect of Regularization on Neural Networks for Regression Prediction: A Case Study of MLP, CNN, and FNN Models
DOI:
https://doi.org/10.35314/m2rcsf96

Keywords:
Regularization, Neural Network, MLP, CNN, FNN

Abstract
Regularization is an important technique in deep learning for improving generalization and reducing overfitting. This study evaluates the effect of regularization on the performance of neural network models in a regression prediction task using earthquake data. We compare Multilayer Perceptron (MLP), Convolutional Neural Network (CNN), and Feedforward Neural Network (FNN) architectures with L2 and Dropout regularization. The experimental results show that the MLP without regularization achieved the best performance (RMSE: 0.500, MAE: 0.380, R²: 0.625), although it was prone to overfitting. The CNN performed poorly on tabular data, while the FNN showed marginal improvement with deeper layers. The novelty of this study lies in a comparative evaluation of regularization strategies across multiple architectures for earthquake regression prediction, highlighting practical implications for early warning systems.
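To illustrate the two regularization strategies compared in the study, the sketch below builds an MLP regression model with L2 weight decay and Dropout in Keras. This is a minimal, hypothetical example, not the authors' exact configuration: the layer sizes, dropout rate, L2 factor, and synthetic input data are assumptions for demonstration only.

# Illustrative sketch (assumed hyperparameters, not the paper's settings):
# an MLP for regression combining L2 weight decay and Dropout.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, regularizers

def build_mlp(n_features, l2_factor=1e-3, dropout_rate=0.2):
    model = keras.Sequential([
        layers.Input(shape=(n_features,)),
        layers.Dense(64, activation="relu",
                     kernel_regularizer=regularizers.l2(l2_factor)),
        layers.Dropout(dropout_rate),
        layers.Dense(32, activation="relu",
                     kernel_regularizer=regularizers.l2(l2_factor)),
        layers.Dropout(dropout_rate),
        layers.Dense(1)  # single continuous output, e.g. a magnitude value
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model

# Usage with synthetic data standing in for the earthquake features.
X = np.random.rand(1000, 8).astype("float32")
y = np.random.rand(1000).astype("float32")
model = build_mlp(n_features=8)
model.fit(X, y, validation_split=0.2, epochs=50, batch_size=32, verbose=0)

Removing the kernel_regularizer arguments and Dropout layers yields the unregularized MLP variant, so the same function can be adapted to reproduce the kind of with/without comparison the abstract describes.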
License
Copyright (c) 2025 INOVTEK Polbeng - Seri Informatika

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.