The class of conjugate gradient methods for large-scale unconstrained optimization will be analyzed within both exact and inexact line search frameworks. To rectify certain drawbacks of these methods, a new strategy for measuring the quality of a conjugate gradient search direction will be considered. In this way, the practical descent property and global convergence result of Al-Baali, for the Fletcher-Reeves method with inexact line search, will be extended to other known conjugate gradient methods. Numerical results will be presented to illustrate the behavior of certain members of the class (in particular, the Fletcher-Reeves and Polak-Ribière methods). It will be shown that the proposed strategy outperforms other well-known techniques in several cases.
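For readers unfamiliar with the methods named in the abstract, the following is a minimal, hypothetical sketch (not the speaker's algorithm or code) of nonlinear conjugate gradient iteration with the classical Fletcher-Reeves and Polak-Ribière beta formulas, using a simple backtracking (Armijo) inexact line search and a steepest-descent restart as a descent safeguard:

```python
import numpy as np

def backtracking(f, x, d, g, alpha=1.0, rho=0.5, c=1e-4, max_shrinks=60):
    # Shrink the step until the Armijo sufficient-decrease condition holds.
    for _ in range(max_shrinks):
        if f(x + alpha * d) <= f(x) + c * alpha * (g @ d):
            break
        alpha *= rho
    return alpha

def nonlinear_cg(f, grad, x0, beta_rule="FR", tol=1e-8, max_iter=2000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                    # safeguard: restart if d is not
            d = -g                        # a descent direction
        alpha = backtracking(f, x, d, g)
        x = x + alpha * d
        g_new = grad(x)
        if beta_rule == "FR":             # Fletcher-Reeves
            beta = (g_new @ g_new) / (g @ g)
        else:                             # Polak-Ribiere (non-negative variant)
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        g = g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = nonlinear_cg(f, grad, [-1.2, 1.0], beta_rule="PR")
```

The Fletcher-Reeves and Polak-Ribière variants differ only in how the scalar beta recombines the new gradient with the previous search direction; strategies for measuring direction quality, such as the one the talk proposes, target exactly this recombination step.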
This seminar will give you the opportunity to meet the speaker and the other researchers in attendance over drinks and snacks. We would greatly appreciate it if you could confirm your attendance.
Everyone is welcome!