Foundations of Computational Mathematics I Topics
Biomathematics, Financial Mathematics, and Applied and Computational Mathematics at Florida State University
Finite Precision Arithmetic
- Floating Point Number Systems
- Floating point representation of real numbers
- Floating point arithmetic
- Overflow/underflow
- Scaling
- Using terms of infinite sequences to approximate a value
- Using infinite series to approximate a value
- Cancellation (see the sketch after this section)
- Analysis of Numerical Computation in Finite Precision
- Conditioning of a problem and the condition number
- Stability of an algorithm
- Backward error
- Forward, weak, and backward stability
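
To make the cancellation topic above concrete, here is a minimal sketch (not part of the course materials; the function and the value of x are chosen purely for illustration) showing how subtracting nearly equal floating point numbers destroys accuracy, and how an algebraically equivalent reformulation avoids it.

```python
# Minimal sketch: evaluate f(x) = (1 - cos(x)) / x**2 for small x.
# The true value is close to 1/2, but 1 - cos(x) cancels almost all
# significant digits, while the half-angle form 2*sin(x/2)**2 does not.
import math

x = 1.2e-8
naive = (1.0 - math.cos(x)) / x**2              # catastrophic cancellation
stable = 0.5 * (math.sin(x / 2) / (x / 2))**2   # equivalent, subtraction-free form

print(f"naive  : {naive:.16f}")   # noticeably far from 0.5
print(f"stable : {stable:.16f}")  # 0.5 to machine precision
```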
Finite Dimensional Vector Spaces
- Vectors, Matrices, and Vector Spaces
- Vectors, their operations, and a vector space
- Linear combination, independence and dependence
- Bases of subspaces of R^n and C^n
- Linear functions between spaces and matrices
- Subspaces: domains, ranges and spans
- Distance, Angle, and Matrices
- Vector and matrix norms and the relationships to each other
- Inner products, norms and angles
- Polarization identity, parallelogram law, and the law of cosines
- Matrix rank, nonsingular matrices
- Orthonormal bases of subspaces
- Orthogonal/unitary matrices, isometries (see the sketch below)
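
A minimal sketch (assuming NumPy; the vectors and the random matrix are arbitrary examples) tying together inner products, norms, angles, and the isometry property of orthogonal matrices listed above.

```python
import numpy as np

x = np.array([3.0, 4.0, 0.0])
y = np.array([4.0, 3.0, 1.0])

# Angle from the inner product: cos(theta) = <x, y> / (||x||_2 ||y||_2)
cos_theta = (x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))
print("angle (radians):", np.arccos(cos_theta))

# An orthogonal matrix (here taken from a QR factorization of a random matrix)
# is an isometry: it preserves the 2-norm.
Q, _ = np.linalg.qr(np.random.default_rng(0).standard_normal((3, 3)))
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # True
```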
Solving Linear Systems of Equations
- Factorization Methods
- Linear systems of equations
- Operations on equations and matrix operations
- Gauss transforms and their algebraic and computational properties
- LU factorization via Gauss transforms (see the sketch after this section)
- Pivoting, existence, stability, elementary permutations
- Data structures, computations, and LU factorization
- Numerical Analysis of Solving Linear Systems via LU Factorization
- Conditioning of system solving
- Backward error of factorization and complete solution algorithm
- Growth factor and backward stability
- Linear Stationary Methods
- Fixed point iterations, e.g., Richardson's, Jacobi, Gauss-Seidel, SOR
- Forward, backward, line, block, and symmetric iteration forms
- Convergence analysis: error and residual behavior
- Sufficient conditions for convergence of the various methods
- Nonstationary Methods and Optimization
- Optimization and system solving
- Level curves, gradients and the steepest descent method
- Conjugacy, conjugate directions, and incremental optimization
- The conjugate gradient method
- Preconditioning
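
As referenced above, a minimal sketch (assuming NumPy; in practice a library routine such as scipy.linalg.lu would be used) of LU factorization with partial pivoting, one topic from this section.

```python
import numpy as np

def lu_partial_pivoting(A):
    """Return P, L, U with P @ A = L @ U, using row interchanges for stability."""
    U = A.astype(float)
    n = U.shape[0]
    P = np.eye(n)
    L = np.eye(n)
    for k in range(n - 1):
        # Pivot: move the largest-magnitude entry of column k onto the diagonal.
        p = k + np.argmax(np.abs(U[k:, k]))
        if p != k:
            U[[k, p], :] = U[[p, k], :]
            P[[k, p], :] = P[[p, k], :]
            L[[k, p], :k] = L[[p, k], :k]
        # Gauss transform: eliminate the entries below the pivot.
        L[k+1:, k] = U[k+1:, k] / U[k, k]
        U[k+1:, k:] -= np.outer(L[k+1:, k], U[k, k:])
    return P, L, np.triu(U)

A = np.array([[2.0, 1.0, 1.0], [4.0, 3.0, 3.0], [8.0, 7.0, 9.0]])
P, L, U = lu_partial_pivoting(A)
print(np.allclose(P @ A, L @ U))  # True
```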
Solving Nonlinear Equations
- Scalar Nonlinear Equation Methods
- Bisection method
- Secant method
- Regula falsi method
- Newton's method (see the sketch after this section)
- Fixed Point Analysis for Scalar Nonlinear Equations
- Contraction mappings
- Order of convergence
- Multiplicity of a root and the convergence rate
- Sufficient conditions for convergence
- Computational cost, convergence rate and total work
- Systems of Nonlinear Equation Methods
- Generalized linear methods
- Newton's method and Newton-like methods
- The secant condition and Quasi-Newton methods
- Convergence analysis
- Computational cost, convergence rate and total work
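
As referenced above, a minimal sketch of Newton's method for a scalar equation (the test equation x^3 = 2 and the starting guess are only illustrative); printing the error each step suggests the quadratic order of convergence discussed in this section.

```python
def newton(f, fprime, x0, exact, max_iter=6):
    """Newton iteration x <- x - f(x)/f'(x); prints the error at each step."""
    x = x0
    for k in range(max_iter):
        x -= f(x) / fprime(x)
        print(f"iter {k + 1}: error = {abs(x - exact):.2e}")
    return x

# Solve x**3 = 2 from x0 = 1.5; for this simple root the error roughly
# squares each iteration until it reaches roundoff level.
newton(lambda x: x**3 - 2.0, lambda x: 3.0 * x**2, x0=1.5, exact=2.0 ** (1.0 / 3.0))
```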
Optimization
- Linear Least Squares Problems
- The full-rank linear least squares problem
- Norm invariance and orthogonal transformation-based methods
- Householder reflectors and solving linear least squares problems (see the sketch after this section)
- Geometry of least squares: subspaces, orthogonal complements and projections
- The generalized inverse
- Unconstrained Nonlinear Optimization
- First- and second-order necessary and sufficient conditions for an optimal point
- Global vs. local convergence of a method
- Line Search Methods
- Steepest descent, Newton, Inexact Newton
- Quasi-Newton
- Wolfe conditions: sufficient decrease condition and curvature condition
- Convergence analysis
- Nonlinear Least Squares Problems
- Nonlinear Conjugate Gradient Methods
- Trust Region Idea
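
As referenced above, a minimal sketch (assuming NumPy, whose QR factorization is Householder-based via LAPACK) of solving a full-rank linear least squares problem min_x ||Ax - b||_2 through an orthogonal factorization; the random data is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 3))        # tall, full-rank coefficient matrix
b = rng.standard_normal(20)

Q, R = np.linalg.qr(A, mode="reduced")  # A = Q R with orthonormal columns in Q
x = np.linalg.solve(R, Q.T @ b)         # solve the triangular system R x = Q^T b

# Agrees with the library least squares solver (and avoids the normal equations).
print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```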