Column Picture => We find the scalars (coefficients) that combine the column vectors of A into the destination vector B
Types
Unique Solution
Intersection at only 1 point
ρ([A|B]) = ρ(A) = No. of unknowns => System is Consistent, Unique Solution
Infinite Solution
Overlapping Lines/Planes
ρ([A|B]) = ρ(A) < No. of unknowns => System is Consistent, Infinite Solutions
No Solution
Parallel or Skew Lines/Planes
ρ([A|B]) ≠ ρ(A) => System is Inconsistent, No Solution (see the rank-check sketch below)
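A minimal NumPy sketch of the rank test above; the matrices are made-up examples and np.linalg.matrix_rank stands in for ρ:

```python
import numpy as np

def classify_system(A, b):
    """Classify Ax = b by comparing rank(A), rank([A|b]) and the no. of unknowns."""
    aug = np.hstack([A, b.reshape(-1, 1)])      # augmented matrix [A | b]
    rank_A = np.linalg.matrix_rank(A)
    rank_aug = np.linalg.matrix_rank(aug)
    unknowns = A.shape[1]
    if rank_aug != rank_A:
        return "Inconsistent: no solution"
    if rank_A == unknowns:
        return "Consistent: unique solution"
    return "Consistent: infinite solutions"

print(classify_system(np.array([[1.0, 1.0], [1.0, -1.0]]), np.array([2.0, 0.0])))  # unique
print(classify_system(np.array([[1.0, 1.0], [2.0, 2.0]]), np.array([2.0, 4.0])))   # infinite
print(classify_system(np.array([[1.0, 1.0], [2.0, 2.0]]), np.array([2.0, 5.0])))   # none
```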
Gaussian Elimination Method
Theory
Eliminating variables one at a time by performing elementary row operations
Matrix Format => Ax = B
Row Echelon Form
Formats
[1 x x]   [1 x x]   [1 x x x]   [1 x x x]
[0 x x]   [0 0 x]   [0 0 x x]   [0 0 0 x]
[0 0 x]   [0 0 0]   [0 0 0 x]   [0 0 0 0]
Rank of the Matrix (ρ) => No. of non-zero rows in the echelon form
Augmented Form => [A | B]
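A sketch of forward elimination on the augmented matrix [A | B] for a made-up 3x3 system; it reduces to row echelon form and reads off the rank as the number of non-zero rows:

```python
import numpy as np

def row_echelon(M, tol=1e-12):
    """Forward elimination with partial pivoting; returns an REF copy of M."""
    M = M.astype(float).copy()
    rows, cols = M.shape
    r = 0
    for c in range(cols):
        if r >= rows:
            break
        pivot = r + np.argmax(np.abs(M[r:, c]))   # largest available pivot in column c
        if abs(M[pivot, c]) < tol:
            continue                              # no pivot in this column
        M[[r, pivot]] = M[[pivot, r]]             # row interchange
        M[r + 1:] -= np.outer(M[r + 1:, c] / M[r, c], M[r])  # eliminate below the pivot
        r += 1
    return M

A = np.array([[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]])
B = np.array([8.0, -11.0, -3.0])
ref = row_echelon(np.hstack([A, B.reshape(-1, 1)]))  # REF of the augmented form [A | B]
rank = np.sum(np.any(np.abs(ref) > 1e-12, axis=1))   # rank = no. of non-zero rows
print(ref, "rank =", rank)
```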
Elementary Row Operation
Interchanging the Rows
Multiplying a row by a non-zero number
Adding/Subtracting a non-zero multiple of one row to/from another (see the row-operation sketch below)
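The three elementary row operations written directly with NumPy indexing on a made-up matrix:

```python
import numpy as np

M = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])

M[[0, 1]] = M[[1, 0]]     # R1 <-> R2 : interchange two rows
M[0] = 2.0 * M[0]         # R1 -> 2*R1 : multiply a row by a non-zero number
M[2] = M[2] - 3.0 * M[0]  # R3 -> R3 - 3*R1 : subtract a multiple of another row
print(M)
```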
Gauss Jordan Method
Row Echelon Form
Converting a matrix into an upper triangular (echelon) form
Reduced Row Echelon Form
Converting the whole matrix (or its sub-square matrix) into an identity matrix by row operations (see the RREF sketch below)
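For a quick check, SymPy's Matrix.rref() computes the reduced row echelon form directly (made-up matrix, SymPy assumed available):

```python
from sympy import Matrix

A = Matrix([[1, 2, -1],
            [2, 3, 1],
            [3, 5, 0]])          # third row = first + second, so rank 2
rref_A, pivot_cols = A.rref()    # Gauss-Jordan: identity block over the pivot columns
print(rref_A)
print(pivot_cols)
```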
Inverse Matrices
Additive/Multiplicative Identity/Inverse
Identity Matrix is the Multiplicative Identity for a Matrix
For Inverse Matrix
When rows < cols and ρ is maximum => the matrix has only right inverses (infinitely many of them)
When rows > cols and ρ is maximum => the matrix has only left inverses (infinitely many of them)
When rows = cols and ρ is maximum => the matrix has a unique two-sided inverse (see the sketch below)
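A sketch of one particular right and left inverse for made-up full-rank matrices; the formulas A.T @ inv(A @ A.T) and inv(B.T @ B) @ B.T pick one inverse out of the infinitely many:

```python
import numpy as np

# rows < cols, full row rank -> right inverse: A @ R = I
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0]])
R = A.T @ np.linalg.inv(A @ A.T)
print(np.allclose(A @ R, np.eye(2)))      # True

# rows > cols, full column rank -> left inverse: L @ B = I
B = A.T
L = np.linalg.inv(B.T @ B) @ B.T
print(np.allclose(L @ B, np.eye(2)))      # True
```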
To Find Inverse
Augment [A | I]
A^-1 = (1/|A|) * adj(A)
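Both routes on a small made-up matrix: Gauss-Jordan on the augmented block [A | I] (the right half becomes A^-1), and the adjugate formula written out for the 2x2 case (NumPy and SymPy assumed available):

```python
import numpy as np
from sympy import Matrix, eye

A = Matrix([[4, 7],
            [2, 6]])

# Route 1: row-reduce [A | I]; the right half of the result is A^-1
aug = A.row_join(eye(2))
rref_aug, _ = aug.rref()
print(rref_aug[:, 2:])          # inverse of A as exact rationals

# Route 2: A^-1 = (1/|A|) * adj(A), written out for the 2x2 case
a, b, c, d = 4.0, 7.0, 2.0, 6.0
adj = np.array([[d, -b], [-c, a]])
print(adj / (a * d - b * c))    # same inverse, as floats
```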
Elementary Matrices
Only 1 elementary row operation applied on identity matrix
When an elementary matrix is multiplied with a matrix, the result is the same as applying that elementary row operation to the matrix; the inverse of an elementary matrix is found by applying the inverse operation (see the sketch below)
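A sketch with one made-up 3x3 matrix: the elementary matrix E built from I by R2 -> R2 - 2*R1 reproduces that row operation when left-multiplied, and its inverse comes from the inverse operation R2 -> R2 + 2*R1:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [3.0, 1.0, 4.0],
              [0.0, 5.0, 6.0]])

E = np.eye(3)
E[1] -= 2.0 * E[0]          # one elementary operation on I: R2 -> R2 - 2*R1

manual = A.copy()
manual[1] -= 2.0 * manual[0]
print(np.allclose(E @ A, manual))          # True: E @ A == the row op applied to A

E_inv = np.eye(3)
E_inv[1] += 2.0 * E_inv[0]  # the inverse operation: R2 -> R2 + 2*R1
print(np.allclose(E @ E_inv, np.eye(3)))   # True: inverse op gives the inverse matrix
```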
Permutation Matrix
Obtained by interchanging the rows (or columns) of an identity matrix any no. of times (see the sketch below)
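A permutation matrix built by reordering the rows of I (made-up example); left-multiplying permutes the rows of A in the same way:

```python
import numpy as np

P = np.eye(3)[[2, 0, 1]]    # rows of I reordered: row 3, row 1, row 2
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
print(P @ A)                             # rows of A appear in the order 3, 1, 2
print(np.allclose(P.T @ P, np.eye(3)))   # permutation matrices satisfy P^-1 = P^T
```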
If W is a subspace of a vector space V, then dim(W) ≤ dim(V)
Ordered Bases
Dimensions
Cardinality of a basis is the Dimension of that VS
Minimum no. of linearly independent vectors required to span the whole space
Dimension of the space of n×n Symmetric/Upper Triangular/Lower Triangular matrices = n(n+1)/2 (see the count below)
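A quick count matching the formula for a made-up n: a basis for the n×n symmetric matrices needs one matrix per diagonal entry plus one per pair above the diagonal:

```python
n = 4
# one basis matrix E_ii per diagonal entry, one (E_ij + E_ji) per pair i < j
basis_size = n + n * (n - 1) // 2
print(basis_size, n * (n + 1) // 2)   # both print 10
```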
Finite Dimensional Vector Space
Subspace Properties
Row, Column, Null Space
Interpolation Polynomial
Interpolation => The determination or estimation of the value of f(x) from certain known values of the function (see the sketch below)
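A sketch of the interpolation polynomial through made-up sample points, using np.polyfit with degree len(x) - 1 so the polynomial passes through every point:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 2.0, 5.0])          # known values of f at the x's
coeffs = np.polyfit(x, y, deg=len(x) - 1)   # degree-3 polynomial through all 4 points
p = np.poly1d(coeffs)
print(np.allclose(p(x), y))                 # True: it reproduces the known values
print(p(1.5))                               # estimated value of f(1.5)
```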
Theorem
Let V and W be VS and T: V->W be linear, then N(T) & R(T) are subspaces of V and W respectively
Theorem 1
Let V and W be VS and T: V->W be linear, If N(T) and R(T) are finite dimensional, then dim(N(T)) = Nullity(T) and dim(R(T)) = Rank(T)
Dimension Theorem
Let V and W be VS and T: V->W be linear, If V is finite dimensional, then dim(N(T)) + dim(R(T)) = dim(V) or Rank(T) + Nullity(T) = dim(V)
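A numerical check of the dimension theorem for T(x) = Ax with a made-up matrix (scipy.linalg.null_space assumed available): Rank(T) + Nullity(T) = dim(V) = number of columns.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0],
              [1.0, 0.0, 1.0, 0.0]])   # T: R^4 -> R^3, T(x) = A x

rank = np.linalg.matrix_rank(A)        # dim R(T)
nullity = null_space(A).shape[1]       # dim N(T) = no. of basis vectors of the null space
print(rank, nullity, A.shape[1])       # rank + nullity == dim(V) == 4
```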
Theorem 3
Let V and W be VS and T: V->W be linear, then T is one-one iff N(T) = {0}
If V and W are finite dimensional with dim(V) = dim(W) and N(T) = {0}, then T is one-one & onto
Isomorphism
A linear transformation T: V->W that is invertible is an isomorphism => V and W are isomorphic to each other
Theorem 1
Let V and W be VS, {v1, v2, ..., vn} be a basis for V & {w1, w2, ..., wn} be any vectors in W, then there exists a unique LT T: V->W such that T(vi) = wi for i = 1, 2, ..., n (see the sketch below)
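A coordinate sketch of this uniqueness theorem with a made-up basis and images: with the basis vectors vi as columns of V and the chosen wi as columns of W, the unique matrix of T is W @ inv(V), and it sends each vi to wi:

```python
import numpy as np

V = np.array([[1.0, 1.0],         # columns are the basis vectors v1, v2 of R^2
              [0.0, 1.0]])
W = np.array([[2.0, 0.0],         # columns are the chosen images w1, w2 in R^2
              [1.0, 3.0]])

T = W @ np.linalg.inv(V)          # the unique linear map with T(vi) = wi
print(np.allclose(T @ V, W))      # True: each basis vector is sent to its image
```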