MODIFIED CONJUGATE GRADIENT METHOD FOR TRAINING NEURAL NETWORKS BASED ON LOGISTIC MAPPING
In this paper, we propose a modified conjugate gradient method for training neural networks which guarantees the descent and sufficient descent conditions. The global convergence of the proposed method is established. Finally, the test results show that, in general, the modified method is more efficient than other standard conjugate gradient methods.
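The paper's exact update rule is not given in this excerpt, but the ingredients named in the title and abstract can be illustrated. The sketch below combines a generic nonlinear conjugate gradient loop (here with the classical Fletcher-Reeves coefficient) with a chaotic sequence from the logistic map x_{k+1} = r*x_k*(1 - x_k); the chaotic scaling of beta, the fixed learning rate, and all function names are hypothetical illustrations, not the authors' actual method.

```python
import numpy as np

def logistic_map(x0=0.7, r=4.0, n=100):
    """Chaotic sequence from the logistic map x_{k+1} = r*x_k*(1 - x_k)."""
    seq = np.empty(n)
    x = x0
    for k in range(n):
        x = r * x * (1.0 - x)
        seq[k] = x
    return seq

def cg_minimize(grad, w0, max_iter=100, lr=1e-2):
    """Generic nonlinear CG loop whose Fletcher-Reeves beta is scaled by a
    logistic-map term (a hypothetical sketch, not the paper's update)."""
    chaos = logistic_map(n=max_iter)
    w = w0.copy()
    g = grad(w)
    d = -g                                   # initial steepest-descent direction
    for k in range(max_iter):
        w = w + lr * d                       # fixed step in place of a line search
        g_new = grad(w)
        beta = (g_new @ g_new) / (g @ g + 1e-12)  # Fletcher-Reeves coefficient
        beta *= chaos[k]                     # chaotic scaling factor in (0, 1)
        d = -g_new + beta * d                # new conjugate search direction
        g = g_new
    return w

# Usage: minimize the quadratic f(w) = ||w||^2, whose gradient is 2w
w_star = cg_minimize(lambda w: 2.0 * w, np.array([1.0, -2.0]))
```

In a neural-network setting, `grad` would be the gradient of the training error with respect to the weight vector, typically obtained by backpropagation.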
It is the policy of the Journal of Duhok University to own the copyright of its technical contributions. The Journal publishes and facilitates the appropriate reuse of published material by others. Photocopying is permitted for individual use, with credit given and the source cited.
Copyright © 2017. All Rights Reserved.