Analysis of the Effect of Dropout Layers in Convolutional Neural Networks for Sticky Notes Image Classification
Abstract
Overfitting remains a significant challenge in deep learning applications, particularly in image classification tasks. This study investigates the role of dropout layers in addressing overfitting by evaluating three models: one without dropout, one with a 10% dropout layer, and one with a 20% dropout layer. The models were trained on an image dataset, and their performance was assessed using test accuracy, loss, classification reports, and confusion matrices. All three models achieved 100% test accuracy, yet their loss values and training curves differed markedly. The model without dropout exhibited signs of overfitting, while the model with a 10% dropout layer achieved the best balance between low loss and stable training behavior. In contrast, the 20% dropout model showed reduced learning capacity. These results demonstrate that a 10% dropout layer effectively enhances model generalization without compromising performance, providing a practical solution for mitigating overfitting in deep learning models.
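The dropout mechanism compared in this study can be sketched as follows. This is an illustrative NumPy implementation of inverted dropout, not the authors' code; the rates 0.1 and 0.2 correspond to the paper's two dropout configurations, and the function name and signature are assumptions for demonstration:

```python
import numpy as np

def dropout(x, rate, training=True, rng=None):
    """Inverted dropout: zero each activation with probability `rate`,
    then scale survivors by 1/(1 - rate) so the expected activation
    is unchanged between training and inference."""
    if not training or rate == 0.0:
        return x  # dropout is a no-op at inference time
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= rate  # keep with probability 1 - rate
    return x * mask / (1.0 - rate)

# Example: a batch of activations passed through a 10% dropout layer
rng = np.random.default_rng(0)
acts = np.ones((4, 8))
out = dropout(acts, rate=0.1, rng=rng)
```

Each surviving activation is scaled to 1/(1 - 0.1) ≈ 1.11, which is why dropout regularizes training without biasing the magnitude of the layer's output.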