Section 2: Review of Numerical Methods
The motivating example is atomic vibrational motion in condensed matter systems, chemical molecules, or proteins.
One-dimensional case:
  Newton's law: F = ma = m d^2x/dt^2
  Hooke's law: F = -kx
  Assumption of an oscillatory motion: x = A cos(wt + a0)
  Equation of motion: (k' - w^2) A' = 0, where k' = k/m, A' = m^(1/2) A

Multi-dimensional system:
  Newton's law: Sum_j F_ij = m_i a_i = m_i d^2x_i/dt^2
  Hooke's law: F_ij = -k_ij (x_i - x_j)
  Assumption of oscillatory motions: x_i = A_i cos(wt + a_i)
  The equations of motion form a matrix equation (an eigenvalue problem):
    Sum_j (k'_ij - w^2 delta_ij) A'_j = 0,
  where k'_ij = k_ij / (m_i^(1/2) m_j^(1/2)) and A'_i = m_i^(1/2) A_i.
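As a sketch of how the eigenvalue problem above is solved in practice, the following NumPy example builds the mass-weighted matrix k' and diagonalizes it; the spring constants and masses are made-up illustrative values, not taken from the notes.

```python
import numpy as np

# Hypothetical example: two coupled oscillators (values are made up).
k = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])   # spring-constant matrix k_ij
m = np.array([1.0, 4.0])       # masses m_i

# Mass-weighted matrix k'_ij = k_ij / sqrt(m_i * m_j)
kp = k / np.sqrt(np.outer(m, m))

# Eigenvalues of k' are the squared angular frequencies w^2;
# the eigenvectors are the mass-weighted amplitudes A'.
w2, Ap = np.linalg.eigh(kp)
w = np.sqrt(w2)                # normal-mode frequencies
```

Because k' is symmetric, `eigh` is the appropriate solver; each column of `Ap` is one normal mode, and dividing by m_i^(1/2) recovers the physical amplitudes A_i.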
2.1. Solution of Linear Equations

Methods are described for solving systems of linear equations expressed as a matrix equation, Ax = b, in which the vector x is to be determined. Gauss-Jordan elimination is a method that also computes the inverse of the matrix A. If the inverse is not needed, Gaussian elimination with back-substitution is preferred, as it is about three times faster. For repeated solutions of problems that involve the same matrix but different right-hand sides, the LU decomposition method is recommended.
Iterative improvement of the solution, the treatment of singular problems, and methods for sparse matrices are briefly discussed.
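The LU workflow described above can be sketched in a few lines: factor the matrix once, then reuse the factors for each right-hand side via forward and back substitution. This is an illustrative implementation with made-up data, not the notes' code.

```python
import numpy as np

def lu_decompose(A):
    """Return P, L, U with P @ A = L @ U (partial pivoting)."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    P = np.eye(n)
    for j in range(n - 1):
        p = j + np.argmax(np.abs(U[j:, j]))      # choose pivot row
        if p != j:                               # swap rows j and p
            U[[j, p], :] = U[[p, j], :]
            P[[j, p], :] = P[[p, j], :]
            L[[j, p], :j] = L[[p, j], :j]
        for i in range(j + 1, n):                # eliminate below pivot
            L[i, j] = U[i, j] / U[j, j]
            U[i, j:] -= L[i, j] * U[j, j:]
    return P, L, U

def lu_solve(P, L, U, b):
    """Solve A x = b given P A = L U."""
    n = len(b)
    pb = P @ b
    y = np.zeros(n)
    for i in range(n):                           # forward: L y = P b
        y[i] = pb[i] - L[i, :i] @ y[:i]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):               # back: U x = y
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

# Factor once, then reuse for several right-hand sides.
A = np.array([[4.0, 3.0], [6.0, 3.0]])
P, L, U = lu_decompose(A)
x1 = lu_solve(P, L, U, np.array([1.0, 0.0]))
x2 = lu_solve(P, L, U, np.array([0.0, 1.0]))
```

The factorization costs O(n^3) but each additional right-hand side only costs O(n^2), which is why LU is recommended for repeated solves.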
2.2. Minimization
This section deals with the task of finding the point that minimizes a one-dimensional or multidimensional function with as few function evaluations as possible. There are many applications in statistical analyses of data and in optimization problems. A number of methods are described, including the golden section search, Brent's method, the simplex method, Powell's method, and gradient methods.
For multidimensional problems, the importance of minimizing along conjugate
directions is demonstrated. Linear programming (optimization) using
the simplex method is explained.