Applied Linear Algebra - Others

System of Linear Equations


Solution of Equation


  • Theory
    • Row Picture => Each equation is a line/plane; we find their intersection point(s)
    • Column Picture => We find the scalars that combine the column vectors to reach the right-hand-side vector
  • Types
    • Unique Solution
      • Intersection at only 1 point
      • ρ(A|B) = ρ(A) = No. of unknowns => System is Consistent, Unique Solution
    • Infinite Solution
      • Overlapping Lines/Planes
      • ρ(A|B) = ρ(A) < No. of unknowns => System is Consistent, Infinite Solution
    • No Solution
      • Parallel or skew lines/planes (never meet)
      • ρ(A|B) ≠ ρ(A) => System is Inconsistent, No Solution
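The rank criteria above can be checked numerically. A minimal NumPy sketch (the function name `classify` is illustrative, not standard):

```python
import numpy as np

def classify(A, B):
    """Classify the system Ax = B by comparing ρ(A|B), ρ(A), and the unknown count."""
    A = np.asarray(A, dtype=float)
    aug = np.column_stack([A, B])          # augmented matrix [A | B]
    rA = np.linalg.matrix_rank(A)
    rAug = np.linalg.matrix_rank(aug)
    n = A.shape[1]                         # number of unknowns
    if rAug != rA:
        return "none"                      # inconsistent system
    return "unique" if rA == n else "infinite"

# x + y = 2, x - y = 0  -> lines cross at one point
print(classify([[1, 1], [1, -1]], [2, 0]))   # unique
# x + y = 2, 2x + 2y = 4 -> the same line twice
print(classify([[1, 1], [2, 2]], [2, 4]))    # infinite
# x + y = 2, x + y = 3  -> parallel lines
print(classify([[1, 1], [1, 1]], [2, 3]))    # none
```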

Gaussian Elimination Method


  • Theory
    • Eliminating a variable by performing some operations
    • Matrix Format => Ax = B
    • Row Echelon Form
      • Formats
          1 x x | 1 x x | 1 x x x | 1 x x x |
          0 x x | 0 0 x | 0 0 x x | 0 0 0 x |
          0 0 x | 0 0 0 | 0 0 0 x | 0 0 0 0 |
        
      • Rank of the Matrix (ρ) => No. of non-zero rows in the echelon form
    • Augmented Form => [A | B]
  • Elementary Row Operation
    • Interchanging the Rows
    • Multiplying by non-zero number
    • Adding/Subtracting a non-zero multiple of another row
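The elimination steps above can be sketched with only the three elementary row operations. A minimal implementation (with partial pivoting, an assumption beyond the notes) that returns the echelon form and the rank:

```python
import numpy as np

def row_echelon(A, tol=1e-12):
    """Reduce A to row echelon form; return (echelon form, rank)."""
    A = np.asarray(A, dtype=float).copy()
    rows, cols = A.shape
    r = 0
    for c in range(cols):
        if r == rows:
            break
        pivot = r + np.argmax(np.abs(A[r:, c]))   # partial pivoting
        if abs(A[pivot, c]) < tol:
            continue                              # no pivot in this column
        A[[r, pivot]] = A[[pivot, r]]             # interchange rows
        for i in range(r + 1, rows):
            A[i] -= (A[i, c] / A[r, c]) * A[r]    # eliminate below the pivot
        r += 1
    return A, r                                   # rank = no. of non-zero rows

# Third row = first + second, so only 2 non-zero rows survive.
U, rank = row_echelon([[1, 2, 1],
                       [2, 4, 3],
                       [3, 6, 4]])
print(rank)   # 2
```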

Gauss Jordan Method


  • Row Echelon Form
    • Converting a matrix into upper triangular form by row operations
  • Reduced Row Echelon Form
    • Converting a whole matrix (or its sub-square matrix) into the identity matrix by row operations

Inverse Matrices


  • Additive/Multiplicative Identity/Inverse
  • Identity Matrix is the Multiplicative Identity for a Matrix
  • For Inverse Matrix
    • When rows < cols and ρ is maximal (= rows) => Matrix has infinitely many right inverses only
    • When rows > cols and ρ is maximal (= cols) => Matrix has infinitely many left inverses only
    • When rows = cols and ρ is maximal => Matrix has a unique two-sided inverse
  • To Find Inverse
    • Augment [A | I]
    • 1/|A| * adj(A)
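The [A | I] method can be sketched directly: run Gauss-Jordan on the augmented matrix until the left block becomes I, and the right block is then A⁻¹ (this sketch assumes A is invertible):

```python
import numpy as np

def inverse_gauss_jordan(A):
    """Invert A by reducing the augmented matrix [A | I] to [I | A^-1]."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    aug = np.column_stack([A, np.eye(n)])      # augment [A | I]
    for c in range(n):
        pivot = c + np.argmax(np.abs(aug[c:, c]))
        aug[[c, pivot]] = aug[[pivot, c]]      # interchange rows
        aug[c] /= aug[c, c]                    # scale pivot row to 1
        for i in range(n):
            if i != c:
                aug[i] -= aug[i, c] * aug[c]   # clear the rest of column c
    return aug[:, n:]

A = np.array([[2.0, 1.0], [5.0, 3.0]])         # |A| = 1
inv = inverse_gauss_jordan(A)
print(inv)                                     # [[ 3. -1.] [-5.  2.]]
print(np.allclose(inv, np.linalg.inv(A)))      # True
```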

Elementary Matrices


  • Only 1 elementary row operation applied on identity matrix
  • Multiplying a matrix on the left by an elementary matrix gives the same result as applying that row operation to the matrix directly
  • The inverse of an elementary matrix is the elementary matrix of the inverse operation
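A quick NumPy check of both facts above, using the operation R2 → R2 − 2·R1 as the example:

```python
import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])

# Elementary matrix: apply R2 -> R2 - 2*R1 to the identity.
E = np.eye(2)
E[1] -= 2 * E[0]

# Left-multiplying by E performs the same row operation on A.
print(E @ A)          # [[1. 2.]
                      #  [1. 0.]]

# The inverse operation R2 -> R2 + 2*R1 gives the inverse matrix.
E_inv = np.eye(2)
E_inv[1] += 2 * E_inv[0]
print(np.allclose(E_inv @ E, np.eye(2)))   # True
```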

Permutation Matrix


  • Obtained by interchanging rows (or columns) of an identity matrix any no. of times
  • Inverse of this matrix is its Transpose
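Both properties in one short NumPy check (the specific row ordering is just an example):

```python
import numpy as np

# A permutation matrix: the identity with its rows reordered.
P = np.eye(3)[[2, 0, 1]]        # rows of I in the order R3, R1, R2

# Its inverse is its transpose: the rows are orthonormal.
print(P @ P.T)                  # the 3x3 identity
print(np.allclose(P @ P.T, np.eye(3)))   # True
```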

LU Decomposition


  • A = LU, with L lower triangular and U upper triangular
  • AX = B => L(UX) = B
  • Put Y = UX, Y = [y1, y2, y3]
  • Solve LY = B for Y (forward substitution)
  • X = [x, y, z]
  • Solve UX = Y for X (back substitution)
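The two-step solve can be sketched end to end. The factorization below is plain Doolittle elimination without pivoting, so it assumes no pivot vanishes; the matrix and right-hand side are illustrative:

```python
import numpy as np

def lu(A):
    """Doolittle LU factorization (no pivoting): returns L, U with A = LU."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L, U = np.eye(n), A.copy()
    for c in range(n - 1):
        for i in range(c + 1, n):
            L[i, c] = U[i, c] / U[c, c]     # store the elimination multiplier
            U[i] -= L[i, c] * U[c]          # eliminate below the pivot
    return L, U

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])
B = np.array([4., 10., 24.])

L, U = lu(A)
Y = np.linalg.solve(L, B)   # step 1: LY = B (forward substitution)
X = np.linalg.solve(U, Y)   # step 2: UX = Y (back substitution)
print(X)                       # [1. 1. 1.]
print(np.allclose(A @ X, B))   # True
```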

Spaces


The Euclidean Space


  • Number System
    • Natural Number (N)
    • Whole Number (W)
    • Integer (Z)
    • Rational Number (Q)
    • Irrational Number (Qc)
    • Real Number (R)
    • Complex Number (C)
  • Binary Operations
    • ⁕ : A x A → A

Vector Space


  • Groups
    • Closure Axiom => a⁕b ∈G, ∀ a,b∈G
    • Associative Axiom => a⁕(b⁕c)=(a⁕b)⁕c ∈G, ∀ a,b,c∈G
    • Identity Axiom => a⁕e=e⁕a=a, e∈G, ∀ a∈G
    • Inverse Axiom => a⁕a'=a'⁕a=e, ∀ a∈G ∃ a'∈G
  • Abelian Group
    • Should be a Group
    • Commutative Axiom => a⁕b=b⁕a ∈G, ∀ a,b∈G
  • Rings => Abelian group w.r.t addition, with an associative multiplication that distributes over addition
  • Fields
    • A set forms Abelian Group w.r.t Addition & Multiplication
    • Eg = (Q, +, ·), (R, +, ·), (C, +, ·) => 0 is excluded when checking multiplicative inverses
  • Vector Space V(F)
    • Let (V, +) be an Abelian Group and V is said to be a vector space over Field (F), if
      • α.(u+v) = αu + αv ∈V ∀ u,v∈V & α∈F
      • (α+β).u = αu + βu ∈V ∀ u∈V & α,β∈F
      • α.(βu) = (αβ)u ∈V ∀ u∈V & α,β∈F
      • 1.u = u ∈V ∀ u∈V & 1∈F
      • α.0 = 0 ∈V ∀ α∈F
    • Element of V is vector, Element of F is scalar
    • Eg = Q(Q), R(Q), R(R), C(Q), C(R), C(C)
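The group axioms above are finite checks for a finite set, so they can be verified by brute force. A sketch for (Z₅, + mod 5), which turns out to be an abelian group:

```python
# Check the group axioms for (Z_5, + mod 5) by exhaustive search.
G = range(5)
op = lambda a, b: (a + b) % 5

closure = all(op(a, b) in G for a in G for b in G)
assoc = all(op(a, op(b, c)) == op(op(a, b), c)
            for a in G for b in G for c in G)
identity = all(op(a, 0) == op(0, a) == a for a in G)          # e = 0
inverse = all(any(op(a, b) == 0 for b in G) for a in G)       # every a has a'
commut = all(op(a, b) == op(b, a) for a in G for b in G)

print(closure, assoc, identity, inverse, commut)   # all True: abelian group
```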

Subspace


  • If V(F) is a vector space, W ⊆ V, and W(F) is itself a vector space under the same vector addition and scalar multiplication, then W is a subspace
  • Conditions for subset(W) to be a subspace
    • 0 ∈ W
    • x + y ∈ W ∀ x,y ∈ W
    • αx ∈ W ∀ x ∈ W, α ∈ F
  • Any straight line/plane which passes through the origin forms a subspace
  • Intersection of Subspace of V is also a subspace of V
    • Union of two subspaces is a subspace iff one of them is a subset of the other
  • Direct Sum of Subspace => V = W1 ⊕ W2
    • V is called direct sum of W1 & W2 if W1 and W2 are subspaces of V and W1 ∩ W2 = {0} and W1 + W2 = V
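The direct-sum conditions can be checked with ranks. A sketch for R³ as the direct sum of the plane z = 0 and the z-axis (the bases chosen are just an example):

```python
import numpy as np

W1 = np.array([[1., 0., 0.],
               [0., 1., 0.]])    # basis of the plane z = 0
W2 = np.array([[0., 0., 1.]])    # basis of the z-axis

combined = np.vstack([W1, W2])

# rank 3 => W1 + W2 = R^3
print(np.linalg.matrix_rank(combined))   # 3

# dim(W1) + dim(W2) = dim(W1 + W2) => W1 ∩ W2 = {0}, so the sum is direct
print(np.linalg.matrix_rank(W1) + np.linalg.matrix_rank(W2))   # 3
```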

Linear Combination and Span


  • Linear Combination (L)
    • V = α1u1 + α2u2 + ... + αnun
  • Linear Span
    • The set of all possible linear combinations of the vectors in S
    • Span of the empty set is {0}
    • Span of any subset S of a vector space V is a subspace of V; any subspace of V that contains S must also contain span(S)
    • A subset S of a vector space V generates (spans) V if L(S) = span(S) = V
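Membership in a span has a simple rank test: v ∈ span(S) iff appending v to S does not raise the rank. A sketch (the helper name `in_span` is illustrative):

```python
import numpy as np

def in_span(S, v):
    """True iff v is a linear combination of the rows of S."""
    S = np.asarray(S, dtype=float)
    return np.linalg.matrix_rank(np.vstack([S, v])) == np.linalg.matrix_rank(S)

S = [[1, 0, 1],
     [0, 1, 1]]
print(in_span(S, [2, 3, 5]))   # True:  2*(1,0,1) + 3*(0,1,1)
print(in_span(S, [1, 1, 0]))   # False: (a, b, a+b) can never end in 0 with a=b=1
```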

Linearly Dependent & Independent


  • Linearly Dependent (Collinear)
    • A non-trivial combination α1u1 + α2u2 + ... + αnun = 0 exists (not all αi zero)
  • Linearly Independent
    • Only the trivial combination (all αi = 0) gives 0
    • Testing for Linear Independence
    • Determinant gives the area/volume of the parallelogram/parallelepiped
    • Wronskian => the determinant below should not be identically 0
        | f(x)   g(x)   h(x)   |
        | f'(x)  g'(x)  h'(x)  |
        | f''(x) g''(x) h''(x) |
      
    • Theorem
    • Remarks
    • Remarks
  • Bases
    • A basis is not unique for a given vector space
    • The spaces Rn(R) are also called Euclidean spaces
    • Definition
    • Remarks
    • Theorem
    • Theorem
    • To construct a Basis
    • If W is a subspace of a vector space V, then dim(W) ≤ dim(V)
    • Theorem
    • Theorem
    • Ordered Bases
  • Dimensions
    • Cardinality of a basis is the dimension of that VS
      • Minimum no. of vectors required to span the whole space; equivalently, the maximum no. of vectors that can remain linearly independent
    • Dimension of the space of n×n Symmetric/Upper Triangular/Lower Triangular matrices = n(n+1)/2
  • Finite Dimensional Vector Space
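The determinant test for independence above is easy to demonstrate for n vectors in Rⁿ (the vectors chosen are just examples):

```python
import numpy as np

ind = np.array([[1., 2.], [3., 4.]])      # det = -2: independent
dep = np.array([[1., 2.], [2., 4.]])      # second row = 2 * first: dependent

print(np.isclose(np.linalg.det(ind), 0))  # False -> independent
print(np.isclose(np.linalg.det(dep), 0))  # True  -> dependent

# For non-square collections, compare the rank with the number of vectors.
print(np.linalg.matrix_rank(ind) == len(ind))  # True
```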

Subspace Properties


Row, Column, Null Space


  • Definition
  • Theorem
  • Find Basis
  • Theorem
  • Theorem
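One numerical way to extract bases of all three spaces at once is the SVD (an approach beyond these notes, offered as a sketch): the first r left/right singular vectors span the column/row space, and the remaining right singular vectors span the null space.

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.],       # = 2 * row 1, so rank drops to 2
              [1., 0., 1.]])

U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))

col_basis = U[:, :rank]          # orthonormal basis of the column space
row_basis = Vt[:rank]            # orthonormal basis of the row space
null_basis = Vt[rank:]           # orthonormal basis of the null space

print(rank, null_basis.shape[0])                 # 2 1
print(rank + null_basis.shape[0] == A.shape[1])  # True: rank + nullity = cols
print(np.allclose(A @ null_basis.T, 0))          # null vectors satisfy Ax = 0
```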

Interpolation Polynomial


  • Interpolation => in mathematics, the determination or estimation of the value of a function f(x) at a point, from certain known values of the function
  • Theorem
    • Let V and W be VS and T: V->W be linear, then N(T) & R(T) are subspaces of V and W respectively
  • Theorem 1
    • Let V and W be VS and T: V->W be linear, If N(T) and R(T) are finite dimensional, then dim(N(T)) = Nullity(T) and dim(R(T)) = Rank(T)
  • Dimension Theorem
    • Let V and W be VS and T: V->W be linear, If V is finite dimensional, then dim(N(T)) + dim(R(T)) = dim(V) or Rank(T) + Nullity(T) = dim(V)
  • Theorem 3
    • Let V and W be VS and T: V->W be linear, then T is one-one iff N(T) = {0}
      • If dim(V) = dim(W) and N(T) = {0}, then T is one-one & onto

Isomorphism


  • A Linear Transformation T: V->W that is invertible is an isomorphism => V and W are isomorphic to each other
  • Theorem 1
    • Let V and W be VS, {v1, v2, ..., vn} be a basis for V & {w1, w2, ..., wn} be any vectors in W, then there exists a unique LT T:V->W such that T(vi) = wi for i = 1, 2, ..., n
  • Theorem 2
    • Two VS V and W are isomorphic iff dim V = dim W
  • T: V->W be a LT, then
    • T is one-one iff ker(T) = {0}
    • If V=W, then T is one-one iff T is onto
  • Corollary
    • Any n-dimensional VS V is isomorphic to Rn
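A concrete instance of the corollary, offered as a sketch: P₂ (polynomials of degree ≤ 2) is 3-dimensional, and the map p ↦ (p(0), p(1), p(2)) is an isomorphism onto R³ because its matrix, a Vandermonde matrix at distinct points, is invertible.

```python
import numpy as np

xs = np.array([0., 1., 2.])
V = np.vander(xs, increasing=True)   # matrix of the map in the basis {1, x, x^2}

print(np.linalg.matrix_rank(V))      # 3: full rank, so one-one and onto

# Invertibility means p is recoverable from its values: here (1, 2, 5)
coeffs = np.linalg.solve(V, [1., 2., 5.])
print(coeffs)                        # ~[1. 0. 1.] -> p(x) = 1 + x^2
```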

Linear Transformations and Applications


Linear Transformations


  • Basic properties
  • Invertible linear transformation
  • Matrices of linear transformations
  • Vector space of linear transformations
  • Change of bases
  • Similarity
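The change-of-basis and similarity bullets can be sketched together: the matrices of one linear map in two bases are similar, B = P⁻¹AP, where the columns of P are the new basis vectors written in the old basis (the matrices below are illustrative):

```python
import numpy as np

A = np.array([[2., 1.],
              [0., 3.]])                 # the map in the standard basis
P = np.array([[1., 1.],
              [0., 1.]])                 # columns: new basis vectors

B = np.linalg.inv(P) @ A @ P             # the same map in the new basis

# Similar matrices share basis-independent data such as det and trace.
print(np.isclose(np.linalg.det(A), np.linalg.det(B)))   # True
print(np.isclose(np.trace(A), np.trace(B)))             # True
```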