Conjugate gradient methods are excellent neural network training methods, characterized by their simplicity, efficiency, and very low memory requirements. In this paper, we propose a new scaled conjugate gradient training algorithm for artificial neural networks that guarantees the descent property under the standard Wolfe condition. Encouraging numerical experiments verify that the proposed algorithm provides fast and stable convergence.
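To make the idea concrete, the following is a minimal NumPy sketch of scaled conjugate gradient training with a weak Wolfe line search, not a reproduction of the authors' exact method. The tiny regression network, the spectral scaling theta_k = s_k's_k / s_k'y_k, the Hestenes-Stiefel-like beta_k (chosen here so that the pure conjugacy condition d_{k+1}'y_k = 0 holds), and the bisection line search are all illustrative assumptions; the steepest-descent restart is a generic safeguard that keeps every search direction a descent direction.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- tiny one-hidden-layer regression network (illustrative only) ----------
X = rng.normal(size=(200, 4))
y = np.sin(X @ rng.normal(size=4))[:, None]          # synthetic targets
d_in, H = X.shape[1], 8
shapes = [(d_in, H), (H,), (H, 1), (1,)]             # W1, b1, W2, b2
sizes = [int(np.prod(s)) for s in shapes]

def unpack(w):
    parts, i = [], 0
    for s, n in zip(shapes, sizes):
        parts.append(w[i:i + n].reshape(s)); i += n
    return parts

def loss_and_grad(w):
    """Mean-squared-error loss and its gradient via manual backpropagation."""
    W1, b1, W2, b2 = unpack(w)
    A1 = np.tanh(X @ W1 + b1)
    Yhat = A1 @ W2 + b2
    loss = 0.5 * np.mean((Yhat - y) ** 2)
    R = (Yhat - y) / len(X)                          # d(loss)/d(Yhat)
    dW2 = A1.T @ R;  db2 = R.sum(0)
    dZ1 = (R @ W2.T) * (1 - A1 ** 2)
    dW1 = X.T @ dZ1; db1 = dZ1.sum(0)
    return loss, np.concatenate([a.ravel() for a in (dW1, db1, dW2, db2)])

def wolfe_line_search(w, d, f0, g0, c1=1e-4, c2=0.9, max_iter=30):
    """Bisection search for a step length satisfying the weak Wolfe conditions."""
    slope0 = g0 @ d
    lo, hi, alpha = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        f_new, g_new = loss_and_grad(w + alpha * d)
        if f_new > f0 + c1 * alpha * slope0:         # sufficient decrease violated
            hi = alpha
        elif g_new @ d < c2 * slope0:                # curvature condition violated
            lo = alpha
        else:
            return alpha, f_new, g_new               # both Wolfe conditions hold
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    f_new, g_new = loss_and_grad(w + alpha * d)      # best effort after max_iter
    return alpha, f_new, g_new

# --- scaled conjugate gradient training loop --------------------------------
w = 0.1 * rng.normal(size=sum(sizes))
f, g = loss_and_grad(w)
d = -g                                               # first direction: steepest descent
for k in range(200):
    alpha, f_new, g_new = wolfe_line_search(w, d, f, g)
    s, yk = alpha * d, g_new - g                     # step and gradient change
    w = w + s
    if np.linalg.norm(g_new) < 1e-6:
        f, g = f_new, g_new
        break
    # Assumed spectral scaling, and a beta enforcing the conjugacy d_{k+1}'y_k = 0.
    sy, dy = s @ yk, d @ yk
    theta = (s @ s) / sy if sy > 1e-12 else 1.0
    beta = theta * (g_new @ yk) / dy if abs(dy) > 1e-12 else 0.0
    d_new = -theta * g_new + beta * d
    # Safeguard: restart with steepest descent if the descent property is lost.
    d = d_new if g_new @ d_new < 0 else -g_new
    f, g = f_new, g_new
    if k % 20 == 0:
        print(f"iter {k:3d}  loss {f:.6f}")
print(f"final loss {f:.6f}")
```

With the Wolfe curvature condition enforced, d'y_k > 0 for a descent direction, so the beta above is well defined in exact arithmetic; the numerical guards and the restart handle the remaining edge cases in floating point.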
K. Abbo, K., & H. Mohamed, H. (2015). New Scaled Conjugate Gradient Algorithm for Training Artificial Neural Networks Based on Pure Conjugacy Condition. Kirkuk Journal of Science, 10(3), 230-241. doi: 10.32894/kujss.2015.104992