CIS 515, Fall 2014
Brief description:
The goal of this course is to provide
firm foundations in linear algebra and optimization techniques
that will enable students to analyze and solve problems
arising in various areas of computer science, especially
computer vision, robotics, machine learning, computer graphics,
embedded systems, and market engineering and systems.
The students will acquire a firm theoretical knowledge of these
concepts and tools. They will also learn how to use these
tools in practice by tackling various judiciously chosen projects
(from computer vision, etc.).
This course will serve as a basis for more advanced courses
in computer vision, convex optimization, machine learning,
robotics, computer graphics, embedded systems,
and market engineering and systems.
Topics covered include:
Fundamentals of linear algebra:
Basic concepts; solving linear systems; eigenvalues and
eigenvectors; introduction to the finite elements method;
singular value decomposition, pseudo-inverses, PCA.
Basics of quadratic optimization; the
Rayleigh-Ritz ratio.
Methods for computing eigenvalues (power iteration, QR method, etc.). Methods
using Krylov subspaces (Arnoldi, Lanczos). Hadamard matrices and applications.
Basics of optimization: review of analysis (derivatives, gradient, Hessian,
Lagrange multipliers).
Since there is too much material to be covered in one semester,
this academic year (2014-2015) I will offer a second
part of this course in CIS610 (Spring 2015).
Roughly 6-8 weeks of CIS610 will be devoted to finishing up the
core material. For the rest of the Spring semester, I will cover
more advanced topics (for details, see below).
Syllabus:
(*) means: if time permits.
- Linear Algebra; Fall 2014
- Basics
- Motivations: Linear independence, rank
- Vector Spaces
- Linear independence, Bases
- Linear maps
- Image (range) and Kernel (nullspace), rank
- Linear maps and Matrices
- Affine maps
- Direct products, sums and direct sums
- The dual space E^* and linear forms
- Hyperplanes and linear forms
- Transpose of a linear map and of a matrix
- The four fundamental subspaces
- Determinants
- Permutations, signature of a permutation
- Alternating multilinear maps
- Definition of a determinant
- Inverse matrices and determinants
- Systems of linear equations and determinants
- Determinant of a linear map
- The Cayley-Hamilton theorem
- Solving Linear Systems
- Gaussian Elimination
- LU factorization
- Gaussian elimination of tridiagonal systems
- Cholesky Decomposition
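As a small illustration of the LU factorization topic above, here is a minimal sketch in Python/NumPy of Doolittle's algorithm, assuming a matrix whose leading principal minors are all nonzero so that no pivoting is needed (the function name `lu_no_pivot` is illustrative, not from the course):

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorization A = L @ U (no pivoting).

    Assumes every leading principal minor of A is nonzero,
    so no row exchanges are needed.
    """
    A = np.array(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    U = A.copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]      # multiplier stored in L
            U[i, k:] -= L[i, k] * U[k, k:]   # eliminate below the pivot
    return L, U

A = np.array([[4.0, 3.0], [6.0, 3.0]])
L, U = lu_no_pivot(A)
# L is unit lower triangular, U upper triangular, and L @ U reproduces A
```

In practice one uses LU with partial pivoting (as covered under Gaussian elimination) for numerical stability.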
- Vector Norms and Matrix Norms
- Normed vector spaces
- Matrix norms
- Condition numbers of matrices
- Iterative Methods for Solving Linear Systems
- Convergence of Sequences of Vectors and Matrices
- Convergence of Iterative Methods
- Description of the Methods of Jacobi, Gauss-Seidel, and Relaxation
- Convergence of the Methods of Jacobi, Gauss-Seidel, and Relaxation
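To make the iterative-methods topics concrete, a minimal sketch of the Jacobi method in Python/NumPy, assuming a strictly diagonally dominant matrix (a sufficient condition for convergence); the function name and test matrix are illustrative:

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-10, max_iter=500):
    """Jacobi iteration for A x = b.

    Splits A = D + R with D the diagonal part, and iterates
    x_{k+1} = D^{-1} (b - R x_k); this converges when A is
    strictly diagonally dominant.
    """
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    d = np.diag(A)           # diagonal entries of A
    R = A - np.diag(d)       # off-diagonal part
    x = np.zeros_like(b) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = (b - R @ x) / d
        if np.linalg.norm(x_new - x, ord=np.inf) < tol:
            return x_new
        x = x_new
    return x

A = np.array([[10.0, 1.0], [2.0, 10.0]])  # strictly diagonally dominant
b = np.array([11.0, 12.0])
x = jacobi(A, b)  # converges to the exact solution [1, 1]
```

Gauss-Seidel and relaxation (SOR) differ only in the splitting of A used for the iteration.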
- Euclidean Spaces
- Inner products
- Orthogonality, duality
- Adjoint of a linear map
- Linear isometries and Orthogonal Matrices
- The orthogonal group
- QR-Decomposition for invertible matrices
- Orthogonal reflections
- QR-decomposition using Householder matrices
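The last topic above can be sketched in a few lines of Python/NumPy: each step applies a Householder reflection H = I - 2vv^T/(v^T v) that zeros the subdiagonal of one column. This is only an illustrative sketch (the function name is mine), not production code:

```python
import numpy as np

def householder_qr(A):
    """QR-decomposition of a real matrix via Householder reflections."""
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = np.eye(m)
    R = A.copy()
    for k in range(min(m - 1, n)):
        x = R[k:, k]
        v = x.copy()
        # shift by +/- ||x|| e_1, choosing the sign that avoids cancellation
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        if np.linalg.norm(v) == 0:
            continue
        v = v / np.linalg.norm(v)
        # apply H = I - 2 v v^T on the left of R and on the right of Q
        R[k:, :] -= 2.0 * np.outer(v, v @ R[k:, :])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R

A = np.array([[12.0, -51.0, 4.0],
              [6.0, 167.0, -68.0],
              [-4.0, 24.0, -41.0]])
Q, R = householder_qr(A)
# Q is orthogonal, R is upper triangular, and Q @ R reproduces A
```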
- Hermitian Spaces
- Hermitian spaces, pre-Hilbert spaces
- Orthogonality, duality, adjoint of a linear map
- Linear isometries and unitary matrices
- The unitary group
- Eigenvalues and Eigenvectors
- Basic Definitions
- Algebraic and Geometric multiplicity
- Reduction to upper triangular form; the Schur decomposition
- Spectral Theorems
- Symmetric matrices, Normal matrices
- Self-adjoint and other special linear maps
- Normal and other special matrices
- Introduction to the Finite Elements Method (Ritz, Galerkin, Rayleigh-Ritz)
- A one-dimensional problem: bending of a beam
- A two-dimensional problem: an elastic membrane
- Time-dependent boundary problems: the wave equation
- Singular Value Decomposition (SVD) and Polar Form
- Symmetric Positive Definite Matrices
- Symmetric Positive Semidefinite Matrices
- SVD for square matrices
- SVD for rectangular matrices
- Applications of SVD and Pseudo-Inverses
- Least squares problems and the Pseudo-Inverse
- Data compression and SVD
- The Rayleigh-Ritz ratio and the Courant-Fischer theorem
- Principal component analysis (PCA)
- Best affine approximation
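As a small illustration of the least squares topic above, a sketch in Python/NumPy of the minimum-norm least squares solution x = A^+ b computed through the SVD (the pseudo-inverse A^+ = V Sigma^+ U^T inverts the nonzero singular values and zeros out the rest); the function name and data are illustrative:

```python
import numpy as np

def pinv_solve(A, b, rcond=1e-12):
    """Minimum-norm least squares solution x = A^+ b via the SVD."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    # invert only singular values above the cutoff, zero the rest
    s_inv = np.where(s > rcond * s.max(), 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ b))

# Overdetermined system: fit a line y = c0 + c1 t to three points
t = np.array([0.0, 1.0, 2.0])
A = np.column_stack([np.ones_like(t), t])
b = np.array([1.0, 3.0, 5.0])  # points lying exactly on y = 1 + 2t
x = pinv_solve(A, b)           # recovers intercept 1 and slope 2
```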
- Annihilating polynomials; primary decomposition, Jordan form
- Annihilating polynomials and the minimal polynomial
- Minimal polynomials of diagonalizable linear maps
- The primary decomposition theorem
- Nilpotent linear maps and Jordan form
- Linear Algebra; Spring 2015 (CIS610)
- Basic Quadratic Optimization
- Minimizing x^T A x + 2 x^T b, where A is a symmetric positive definite matrix
- (*) Minimizing x^T A x + 2 x^T b, where A is a symmetric positive semidefinite matrix, using a pseudo-inverse
- (*) Minimizing x^T A x on the unit sphere subject to linear constraints
- (*) The Schur complement
- (*) Applications to computer vision (finding contours)
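In the positive definite case listed above, setting the gradient 2Ax + 2b of f(x) = x^T A x + 2 x^T b to zero gives the unique minimizer x* = -A^{-1} b. A small numerical check in Python/NumPy (the matrix and vector are illustrative):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])  # symmetric positive definite
b = np.array([1.0, -1.0])

def f(x):
    """The quadratic f(x) = x^T A x + 2 x^T b."""
    return x @ A @ x + 2.0 * x @ b

# Unique minimizer: solve A x = -b
x_star = np.linalg.solve(A, -b)
# f(x_star) <= f(x) for every other x, since A is positive definite
```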
- Methods for Computing Eigenvalues
- Reduction to Hessenberg or tridiagonal form
- Power iteration
- Inverse power iteration
- Basic QR algorithm
- QR algorithm with shifts
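The simplest of the methods above, power iteration, fits in a few lines of Python/NumPy. This sketch assumes A has a unique eigenvalue of largest modulus (the function name and test matrix are illustrative):

```python
import numpy as np

def power_iteration(A, num_iter=1000, seed=0):
    """Estimate the dominant eigenvalue of A and its eigenvector.

    Repeatedly applies A and normalizes; converges when A has a
    unique eigenvalue of largest modulus.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    for _ in range(num_iter):
        x = A @ x
        x = x / np.linalg.norm(x)
    lam = x @ A @ x  # Rayleigh quotient estimate of the eigenvalue
    return lam, x

A = np.array([[2.0, 1.0], [1.0, 2.0]])  # eigenvalues 3 and 1
lam, v = power_iteration(A)             # lam converges to 3
```

Inverse power iteration applies the same loop to (A - sigma I)^{-1} to find the eigenvalue closest to a shift sigma.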
- Methods Using Krylov Subspaces
- The companion matrix
- Arnoldi iteration
- How Arnoldi iteration finds eigenvalues (Ritz values)
- Using Arnoldi iteration to solve linear systems: GMRES
- The symmetric case: Lanczos iteration
- (*) Conjugate gradient
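As an illustration of the Krylov-subspace theme, a minimal sketch of the conjugate gradient method in Python/NumPy, assuming A is symmetric positive definite; the iterates minimize the error over the growing Krylov subspaces span{b, Ab, A^2 b, ...} (function name and data are illustrative):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Conjugate gradient for A x = b, A symmetric positive definite."""
    n = len(b)
    max_iter = n if max_iter is None else max_iter
    x = np.zeros(n)
    r = b - A @ x   # residual
    p = r.copy()    # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)       # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # new A-conjugate direction
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)  # solves A x = b
```

In exact arithmetic, conjugate gradient terminates in at most n steps.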
- Hadamard Matrices
- When do Hadamard matrices exist?
- Applications of Hadamard matrices (error-correcting codes, ...)
- Positive and Nonnegative Matrices
- Positive matrices and the Perron-Frobenius Theorem
- Nonnegative matrices and the Perron-Frobenius Theorem
- Stochastic matrices
- Basics of Optimization; Spring 2015 (CIS610)
- Review of Analysis
- First and second derivative of a function
- Gradient, Hessian
- Extrema of real functions
- Lagrangians and Lagrange multipliers
- Extrema of real functions: using second derivatives
- Extrema of real functions: using convexity
- Newton's Method
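The last topic above can be sketched briefly: Newton's method for minimization solves H(x) d = -grad(x) at each step and updates x <- x + d, converging quadratically near a minimum with positive definite Hessian. A minimal Python/NumPy sketch on a hypothetical quadratic example (all names are illustrative):

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton's method for unconstrained minimization."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Newton step: solve H d = -g, then move to x + d
        x = x + np.linalg.solve(hess(x), -g)
    return x

# Hypothetical example: minimize f(x, y) = (x - 1)^2 + 2 (y + 2)^2
grad = lambda x: np.array([2 * (x[0] - 1), 4 * (x[1] + 2)])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 4.0]])
x_min = newton(grad, hess, [0.0, 0.0])  # minimizer is (1, -2)
```

On a quadratic, Newton's method finds the minimizer in a single step.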
- Elementary spectral graph theory; applications to graph clustering using normalized cuts (CIS610)
- Probabilistic algorithms for constructing approximate matrix decompositions (CIS610)
- Optimization methods on manifolds (CIS610)