Definition: If A is an n x n matrix, a scalar L is an eigenvalue of A if there is a NON-ZERO vector X so that AX = LX. (Usually L is the Greek letter lambda.) This means 0 is an eigenvalue <=> the homogeneous equation AX = 0 has a non-zero solution <=> AX = 0 has oo-many solutions. That is, rref(A) has a column with no pivot.

If we know L is an eigenvalue (how to find them is coming soon), then B = A - LI has a non-zero X so that BX = 0, because BX = (A - LI)X = AX - LIX = LX - LX = 0. So again, finding eigenvectors is the same as finding non-zero solutions to homogeneous equations with oo-many solutions.

Definition: The vector W is a linear combination of the vectors X1, X2, ..., Xn if there are scalars (think reals) C1, C2, ..., Cn so that W = C1*X1 + C2*X2 + ... + Cn*Xn. This is the same as solving AY = W, where A is the matrix with A(:,j) = Xj (X1 is the first column of A, X2 the second column, ..., Xn the last column of A) and the unknown is Y = [C1, C2, ..., Cn]^T. If the system has a solution then W is a linear combination; otherwise it is not.

Definition: The set {X1, X2, ..., Xn} is linearly independent (sometimes just independent) if whenever there are scalars C1, C2, ..., Cn so that C1*X1 + C2*X2 + ... + Cn*Xn = 0, then C1 = C2 = ... = Cn = 0. This is the same as solving AY = 0, with A and Y built as above. If the system has a unique solution (which must be C1 = ... = Cn = 0) then the set is independent; otherwise it is not.

Definition: A set which is NOT independent is said to be dependent.

We used this to obtain two independent solutions of AX = 0 when rref(A) had two columns without pivots.

We did deep background showing that for every lambda, d/dx has lambda as an eigenvalue with eigenfunction exp(lambda x). The equation AX = lambda X became the ODE y' = lambda y in this case. We also saw that the sin(x) and cos(x) solutions of (D^2 + I)y = 0 had a lot in common with finding two independent solutions of AX = 0. Some small Python/sympy sketches of these tests follow.
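First the eigenvector test: form B = A - LI and read eigenvectors off the non-zero solutions of BX = 0. The matrix A and the eigenvalue L = 3 below are a made-up example, not one from class.

    from sympy import Matrix, eye

    A = Matrix([[2, 1],
                [1, 2]])            # made-up 2x2 example
    L = 3                           # pretend we already know L = 3 is an eigenvalue

    B = A - L * eye(2)              # B = A - L*I
    print(B.rref())                 # a column with no pivot => oo-many solutions
    for X in B.nullspace():         # the non-zero solutions of BX = 0
        print(X.T, A * X == L * X)  # each one satisfies AX = LX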
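Next the linear-combination test: stack the Xj as the columns of A and try to solve AY = W. Again the vectors are made-up; here W = 2*X1 + 3*X2, so the system should be consistent.

    from sympy import Matrix, linsolve, symbols

    X1 = Matrix([1, 0, 1])
    X2 = Matrix([0, 1, 1])
    W  = Matrix([2, 3, 5])              # W = 2*X1 + 3*X2

    A = Matrix.hstack(X1, X2)           # A(:,j) = Xj
    C1, C2 = symbols('C1 C2')
    print(linsolve((A, W), [C1, C2]))   # non-empty => W is a linear combination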
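The independence test is the same computation with W = 0; equivalently, the set is independent exactly when every column of rref(A) has a pivot. In this made-up example X3 = X1 + X2, so the test should fail.

    from sympy import Matrix

    X1 = Matrix([1, 0, 1])
    X2 = Matrix([0, 1, 1])
    X3 = Matrix([1, 1, 2])          # X3 = X1 + X2, so the set is dependent

    A = Matrix.hstack(X1, X2, X3)
    _, pivots = A.rref()
    print(len(pivots) == A.cols)    # pivot in every column <=> independent; False here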
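For the remark about two independent solutions of AX = 0: when rref(A) has two columns without pivots, sympy's nullspace() hands back one independent solution per pivot-free column. The A below is a made-up rank-1 example.

    from sympy import Matrix

    A = Matrix([[1, 2, 3],
                [2, 4, 6]])         # rank 1: rref(A) has two columns without pivots

    basis = A.nullspace()           # one independent solution per pivot-free column
    print(len(basis))               # 2
    for v in basis:
        print(v.T, (A * v).T)       # each v is a non-zero solution of AX = 0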
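The deep-background fact about d/dx can be checked symbolically too: for any lambda, y = exp(lambda x) satisfies y' = lambda y.

    from sympy import symbols, exp, diff, simplify

    x, lam = symbols('x lambda')
    y = exp(lam * x)                        # the claimed eigenfunction
    print(simplify(diff(y, x) - lam * y))   # 0, i.e. (d/dx) y = lambda * y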
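Finally, the sin(x) and cos(x) remark: both satisfy (D^2 + I)y = 0, giving two independent solutions, just like the AX = 0 case.

    from sympy import symbols, sin, cos, diff, simplify

    x = symbols('x')
    for y in (sin(x), cos(x)):
        print(simplify(diff(y, x, 2) + y))  # 0 for both: (D^2 + I)y = 0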