TITLE

Parallel Rayleigh Quotient Optimization with FSAI-Based Preconditioning

AUTHOR(S)
Bergamaschi, Luca; Martínez, Angeles; Pini, Giorgio
PUB. DATE
January 2012
SOURCE
Journal of Applied Mathematics;2012, p1
SOURCE TYPE
Academic Journal
DOC. TYPE
Article
ABSTRACT
The present paper describes a parallel preconditioned algorithm for the solution of partial eigenvalue problems for large, sparse, symmetric matrices on parallel computers. Specifically, we consider the Deflation-Accelerated Conjugate Gradient (DACG) algorithm accelerated by factorized sparse approximate inverse (FSAI)-type preconditioners. We present an enhanced parallel implementation of the FSAI preconditioner and make use of the recently developed Block FSAI-IC preconditioner, which combines the FSAI and Block Jacobi-IC preconditioners. Results on large matrices arising from finite element discretization of geomechanical models reveal that DACG accelerated by this type of preconditioner is competitive with the publicly available parallel hypre package, especially in the computation of a few of the leftmost eigenpairs. The parallel DACG code accelerated by FSAI is written in MPI-Fortran 90 and exhibits good scalability up to one thousand processors.
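The abstract describes deflation-accelerated minimization of the Rayleigh quotient with a preconditioner. As a rough illustration only (not the authors' MPI-Fortran 90 implementation, and with a simple Jacobi preconditioner standing in for FSAI), the core idea — minimize q(x) = xᵀAx / xᵀx in the subspace orthogonal to already-converged eigenvectors — can be sketched in serial Python:

```python
import numpy as np

def deflated_pg_eigpairs(A, nev=2, tol=1e-8, maxit=500, M=None):
    """Sketch of deflated, preconditioned Rayleigh-quotient minimization
    for the leftmost eigenpairs of a symmetric positive definite A.
    M approximates A^{-1}; here a Jacobi (diagonal) stand-in for FSAI."""
    n = A.shape[0]
    if M is None:
        M = np.diag(1.0 / np.diag(A))     # Jacobi preconditioner
    eigvals, eigvecs = [], []
    rng = np.random.default_rng(0)
    for _ in range(nev):
        x = rng.standard_normal(n)
        for u in eigvecs:                 # deflation: start orthogonal to
            x -= u * (u @ x)              # converged eigenvectors
        x /= np.linalg.norm(x)
        for _ in range(maxit):
            q = x @ A @ x                 # Rayleigh quotient (||x|| = 1)
            g = 2.0 * (A @ x - q * x)     # gradient of q at x
            if np.linalg.norm(g) < tol:
                break
            d = M @ g                     # preconditioned search direction
            for u in eigvecs:             # keep direction in the deflated
                d -= u * (u @ d)          # subspace
            # exact line search via Rayleigh-Ritz on span{x, d}
            V, _ = np.linalg.qr(np.column_stack([x, d]))
            _, Y = np.linalg.eigh(V.T @ A @ V)
            x = V @ Y[:, 0]               # Ritz vector of smallest Ritz value
            x /= np.linalg.norm(x)
        eigvals.append(x @ A @ x)
        eigvecs.append(x)
    return np.array(eigvals), np.column_stack(eigvecs)
```

This is preconditioned gradient descent with an exact (Rayleigh-Ritz) line search rather than the full conjugate-gradient recurrence of DACG, but it exhibits the same structure: deflation against converged eigenpairs plus a preconditioner applied to the Rayleigh-quotient gradient.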
ACCESSION #
84862184

 

Related Articles

  • Convergence analysis of an optimization algorithm for computing the largest eigenvalue of a symmetric matrix. Borzykh, A. // Journal of Mathematical Sciences;Apr2008, Vol. 150 Issue 2, p1917 

    A new optimization algorithm for computing the largest eigenvalue of a real symmetric matrix is considered. The algorithm is based on a sequence of plane rotations increasing the sum of the matrix entries. It is proved that the algorithm converges linearly and it is shown that it may be regarded...

  • Inexact trust region PGC method for large sparse unconstrained optimization. Li, Junxiang; Yan, Limei; Li, Shudong; Huo, Jiazhen // Computational Optimization & Applications;Apr2012, Vol. 51 Issue 3, p981 

    We present an algorithm, partitioning group correction (PGC) algorithm based on trust region and conjugate gradient method, for large-scale sparse unconstrained optimization. In large sparse optimization, computing the whole Hessian matrix and solving the Newton-like equations at each iteration...

  • A Conjugate Gradient Algorithm with Function Value Information and N-Step Quadratic Convergence for Unconstrained Optimization. Li, Xiangrong; Zhao, Xupei; Duan, Xiabin; Wang, Xiaoliang // PLoS ONE;9/18/2015, Vol. 10 Issue 9, p1 

    It is generally acknowledged that the conjugate gradient (CG) method achieves global convergence—with at most a linear convergence rate—because CG formulas are generated by linear approximations of the objective functions. The quadratically convergent results are very limited. We...

  • Coupling the Gradient Method with a General Exterior Penalization Scheme for Convex Minimization. Peypouquet, Juan // Journal of Optimization Theory & Applications;Apr2012, Vol. 153 Issue 1, p123 

    In this paper, we propose and analyze an algorithm that couples the gradient method with a general exterior penalization scheme for constrained or hierarchical minimization of convex functions in Hilbert spaces. We prove that a proper but simple choice of the step sizes and penalization...

  • Nonmonotone Globalization Techniques for the Barzilai-Borwein Gradient Method. Grippo, L.; Sciandrone, M. // Computational Optimization & Applications;Nov2002, Vol. 23 Issue 2, p143 

    In this paper we propose new globalization strategies for the Barzilai and Borwein gradient method, based on suitable relaxations of the monotonicity requirements. In particular, we define a class of algorithms that combine nonmonotone watchdog techniques with nonmonotone linesearch rules and we...

  • A Comparison of Optimization Methods and Software for Large-scale L1-regularized Linear Classification. Guo-Xun Yuan; Kai-Wei Chang; Cho-Jui Hsieh; Chih-Jen Lin // Journal of Machine Learning Research;11/1/2010, Vol. 11 Issue 11, p3183 

    No abstract available.

  • Planar Conjugate Gradient Algorithm for Large-Scale Unconstrained Optimization, Part 1: Theory. Fasano, G. // Journal of Optimization Theory & Applications;Jun2005, Vol. 125 Issue 3, p523 

    In this paper, we present a new conjugate gradient (CG) based algorithm in the class of planar conjugate gradient methods. These methods aim at solving systems of linear equations whose coefficient matrix is indefinite and nonsingular. This is the case where the application of the standard CG...

  • A cyclic projected gradient method. Setzer, Simon; Steidl, Gabriele; Morgenthaler, Jan // Computational Optimization & Applications;Mar2013, Vol. 54 Issue 2, p417 

    In recent years, convex optimization methods were successfully applied for various image processing tasks and a large number of first-order methods were designed to minimize the corresponding functionals. Interestingly, it was shown recently in Grewenig et al. () that the simple idea of...

  • Linear convergence analysis of the use of gradient projection methods on total variation problems. Chen, Pengwen; Gui, Changfeng // Computational Optimization & Applications;Mar2013, Vol. 54 Issue 2, p283 

    Optimization problems using total variation frequently appear in image analysis models, in which the sharp edges of images are preserved. Direct gradient descent methods usually yield very slow convergence when used for such optimization problems. Recently, many duality-based gradient projection...
