3.5. Hyperparameters


Five-fold cross-validation was performed during the experiments. The dataset was divided into five subsets, four of which were used for training and one for validation, with each class represented evenly across the subsets. We employed the Adam optimizer [23], as represented by Equations (1)–(3):

$$m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t \tag{1}$$

$$v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2 \tag{2}$$

$$\theta_t = \theta_{t-1} - \frac{\eta\, \hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}, \qquad \hat{m}_t = \frac{m_t}{1 - \beta_1^t}, \quad \hat{v}_t = \frac{v_t}{1 - \beta_2^t} \tag{3}$$

where $g_t$ is the gradient at step $t$, $\eta$ is the learning rate, $\epsilon = 10^{-8}$, $\beta_1 = 0.9$, and $\beta_2 = 0.999$.
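As a concrete illustration, the following is a minimal sketch of the class-balanced five-fold split described above. The arrays `X` and `y` are hypothetical stand-ins for the real features and labels, and `StratifiedKFold` is one way to keep each class evenly represented across folds; the paper does not specify which splitting tool was actually used.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold

# Hypothetical feature and label arrays standing in for the real dataset.
X = np.random.rand(500, 64)
y = np.random.randint(0, 5, size=500)

# Five folds; stratification keeps class proportions even in each fold.
skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for fold, (train_idx, val_idx) in enumerate(skf.split(X, y)):
    X_train, y_train = X[train_idx], y[train_idx]
    X_val, y_val = X[val_idx], y[val_idx]
    # Four folds train, one fold validates, rotating each iteration.
```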

Training was performed for 100 epochs with an initial learning rate of 0.00001, which was reduced using exponential decay. The dropout rate was set to 0.2 to avoid overfitting. On an NVIDIA RTX A6000 GPU, a network configured in this manner required 25 h to train.
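A sketch of the corresponding optimizer setup in TensorFlow, matching the values reported above. The schedule's `decay_steps` and `decay_rate` are assumptions, since the text states only that the learning rate decayed exponentially.

```python
import tensorflow as tf

# Exponential decay from the reported initial learning rate of 1e-5.
# decay_steps and decay_rate are illustrative; the text does not report them.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-5,
    decay_steps=10_000,  # assumed value
    decay_rate=0.96,     # assumed value
)

# Adam with the hyperparameters given for Equations (1)-(3).
optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-8,
)

# Dropout rate of 0.2, applied within the network to reduce overfitting.
dropout_layer = tf.keras.layers.Dropout(rate=0.2)
```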

With the exception of pre-processing, all training and testing procedures were performed in a Python environment using TensorFlow. Training required approximately four days to complete the 100 epochs.
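Putting the pieces together, a 100-epoch training run in TensorFlow might look like the following sketch. The data and architecture here are placeholders, not the network described in the paper; only the hyperparameters reflect the reported values.

```python
import numpy as np
import tensorflow as tf

# Hypothetical data standing in for one real training fold.
X_train = np.random.rand(400, 64).astype("float32")
y_train = np.random.randint(0, 5, size=400)

# Placeholder architecture; the actual network is described elsewhere.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dropout(rate=0.2),  # dropout of 0.2, as reported
    tf.keras.layers.Dense(5, activation="softmax"),
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(
        learning_rate=1e-5, beta_1=0.9, beta_2=0.999, epsilon=1e-8
    ),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.fit(X_train, y_train, epochs=100)  # 100 epochs, as reported
```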
