Evaluation of the Effect of Regularization on Neural Networks for Regression Prediction: A Case Study of MLP, CNN, and FNN Models

Authors

  • Susandri Susandri, Universitas Lancang Kuning (https://orcid.org/0000-0002-3875-9747)
  • Ahmad Zamsuri, Universitas Lancang Kuning
  • Nurliana Nasution, Universitas Lancang Kuning
  • Maya Ramadhani, Universitas Lancang Kuning

DOI:

https://doi.org/10.35314/m2rcsf96

Keywords:

Regularization, Neural Network, MLP, CNN, FNN

Abstract

Regularization is an important technique in deep learning for improving generalization and reducing overfitting. This study evaluated the effect of regularization on the performance of neural network models in a regression prediction task using earthquake data. We compared Multilayer Perceptron (MLP), Convolutional Neural Network (CNN), and Feedforward Neural Network (FNN) architectures, each with L2 and Dropout regularization. The experimental results show that the MLP without regularization achieved the best performance (RMSE: 0.500, MAE: 0.380, R²: 0.625), although it was prone to overfitting. The CNN performed poorly on the tabular data, while the FNN showed only marginal improvement with deeper layers. The novelty of this study lies in its comparative evaluation of regularization strategies across multiple architectures for earthquake regression prediction, highlighting practical implications for early warning systems.
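
For readers who want a concrete picture of the setup being compared, the following is a minimal sketch in Python (Keras and scikit-learn) of an MLP regressor with the two regularizers named in the abstract, L2 weight decay and Dropout, evaluated with the same metrics (RMSE, MAE, R²). The layer sizes, dropout rate, L2 factor, and the synthetic stand-in data are illustrative assumptions, not the authors' exact configuration or the earthquake dataset.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, regularizers
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

def build_mlp(n_features):
    # Two hidden layers with an L2 weight penalty and Dropout between them;
    # a single linear output unit for regression. Sizes are assumptions.
    model = keras.Sequential([
        layers.Input(shape=(n_features,)),
        layers.Dense(64, activation="relu",
                     kernel_regularizer=regularizers.l2(1e-3)),
        layers.Dropout(0.2),
        layers.Dense(32, activation="relu",
                     kernel_regularizer=regularizers.l2(1e-3)),
        layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Synthetic stand-in data (the paper uses earthquake records instead).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 8)).astype("float32")
y = (X @ rng.normal(size=8) + rng.normal(scale=0.5, size=1000)).astype("float32")

model = build_mlp(X.shape[1])
model.fit(X[:800], y[:800], epochs=20, batch_size=32, verbose=0)
pred = model.predict(X[800:], verbose=0).ravel()

# The three metrics reported in the abstract.
print("RMSE:", np.sqrt(mean_squared_error(y[800:], pred)))
print("MAE :", mean_absolute_error(y[800:], pred))
print("R²  :", r2_score(y[800:], pred))

Dropping the kernel_regularizer arguments and the Dropout layer yields the unregularized variant, which the abstract reports as the best-scoring but overfitting-prone configuration.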

Published

28-07-2025

Issue

Vol. 10 No. 3 (2025)

Section

Articles

How to Cite

Evaluation of the Effect of Regularization on Neural Networks for Regression Prediction: A Case Study of MLP, CNN, and FNN Models. (2025). INOVTEK Polbeng - Seri Informatika, 10(3), 1350-1358. https://doi.org/10.35314/m2rcsf96