Ali, Shahwan Younis and Maseeh, Hajar (2025) Dropout: An Effective Approach to Prevent Neural Networks from Overfitting. Asian Journal of Research in Computer Science, 18 (2). pp. 163-185. ISSN 2581-8260
Full text not available from this repository.

Abstract
Overfitting remains a significant challenge in training neural networks, often leading to poor generalization on unseen data. Dropout has emerged as a powerful regularization technique that mitigates overfitting by randomly deactivating neurons during training, thereby preventing co-adaptation of features and encouraging diverse representations. This paper explores the theoretical foundations and practical implementations of dropout across various neural network architectures, including Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs). Through empirical analysis on benchmark datasets such as CIFAR-10 and MNIST, dropout is shown to significantly improve model robustness and accuracy. The study also compares dropout with alternative regularization methods, such as weight constraints and batch normalization, highlighting its effectiveness in diverse scenarios. Despite its success, dropout's performance depends on hyperparameter tuning and dataset characteristics. The paper concludes by discussing limitations, such as computational overhead, and proposes directions for optimizing dropout for specific applications, including dynamic dropout rates and hybrid regularization techniques.
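As a concrete illustration of the mechanism the abstract describes, below is a minimal sketch of standard "inverted" dropout, in which each unit is dropped with probability p during training and survivors are rescaled by 1/(1-p) so the layer is an identity at inference. This is not code from the paper; the function name `dropout_forward` and its signature are illustrative.

```python
import numpy as np

def dropout_forward(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each activation with
    probability p and rescale survivors by 1/(1-p), so the expected
    activation is unchanged and inference needs no extra scaling."""
    if not training or p == 0.0:
        return x
    if rng is None:
        rng = np.random.default_rng()
    # Keep-mask: 1/(1-p) where the unit survives, 0 where it is dropped.
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask

# Example: apply dropout to a batch of hidden activations.
h = np.random.default_rng(1).standard_normal((4, 8))
h_train = dropout_forward(h, p=0.5, training=True)   # noisy, rescaled
h_eval = dropout_forward(h, p=0.5, training=False)   # identity at test time
```

Scaling at training time (rather than scaling activations down at test time) is the common implementation choice because it leaves the inference path untouched.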
| Item Type: | Article |
| --- | --- |
| Subjects: | East India Archive > Computer Science |
| Depositing User: | Unnamed user with email support@eastindiaarchive.com |
| Date Deposited: | 25 Feb 2025 04:07 |
| Last Modified: | 25 Feb 2025 04:07 |
| URI: | http://article.ths100.in/id/eprint/2130 |