doc-next-gen

Mathlib.LinearAlgebra.Matrix.Determinant.Basic

Module docstring

{"# Determinant of a matrix

This file defines the determinant of a matrix, Matrix.det, and its essential properties.

Main definitions

  • Matrix.det: the determinant of a square matrix, as a sum over permutations
  • Matrix.detRowAlternating: the determinant, as an AlternatingMap in the rows of the matrix

Main results

  • det_mul: the determinant of A * B is the product of determinants
  • det_zero_of_row_eq: the determinant is zero if there is a repeated row
  • det_block_diagonal: the determinant of a block diagonal matrix is a product of the blocks' determinants

Implementation notes

It is possible to configure simp to compute determinants. See the file MathlibTest/matrix.lean for some examples.

","### det_zero section

Prove that a matrix with a repeated column has determinant equal to zero. ","### det_eq section

Lemmas showing the determinant is invariant under a variety of operations. "}
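
As mentioned in the implementation notes, `simp`/`norm_num` can evaluate concrete determinants once suitable small-matrix lemmas are supplied. A minimal sketch (assuming the lemma `Matrix.det_fin_two_of` and the `!![...]` matrix notation, both provided elsewhere in Mathlib):

example : Matrix.det !![(1 : ℤ), 2; 3, 4] = -2 := by
  norm_num [Matrix.det_fin_two_of]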

Matrix.detRowAlternating definition
: (n → R) [⋀^n]→ₗ[R] R
Full source
/-- `det` is an `AlternatingMap` in the rows of the matrix. -/
def detRowAlternating : (n → R) [⋀^n]→ₗ[R] R :=
  MultilinearMap.alternatization ((MultilinearMap.mkPiAlgebra R n R).compLinearMap LinearMap.proj)
Determinant as an alternating multilinear map on matrix rows
Informal description
The determinant of a matrix, viewed as an alternating multilinear map in the rows of the matrix. Specifically, it is the alternatization of the multilinear map that sends the rows $(v_1, \dots, v_n)$ to the product $\prod_i v_i(i)$ of their diagonal entries in the base ring $R$.
Matrix.det abbrev
(M : Matrix n n R) : R
Full source
/-- The determinant of a matrix given by the Leibniz formula. -/
abbrev det (M : Matrix n n R) : R :=
  detRowAlternating M
Determinant via Leibniz Formula
Informal description
The determinant of a square matrix $M$ of size $n \times n$ with entries in a ring $R$ is given by the Leibniz formula: \[ \det(M) = \sum_{\sigma \in S_n} \text{sgn}(\sigma) \prod_{i=1}^n M_{\sigma(i),i} \] where $S_n$ is the symmetric group on $n$ elements and $\text{sgn}(\sigma)$ is the sign of the permutation $\sigma$.
Matrix.det_apply theorem
(M : Matrix n n R) : M.det = ∑ σ : Perm n, Equiv.Perm.sign σ • ∏ i, M (σ i) i
Full source
theorem det_apply (M : Matrix n n R) : M.det = ∑ σ : Perm n, Equiv.Perm.sign σ • ∏ i, M (σ i) i :=
  MultilinearMap.alternatization_apply _ M
Leibniz Formula for Determinant
Informal description
For any square matrix $M$ of size $n \times n$ with entries in a ring $R$, the determinant of $M$ is given by the Leibniz formula: \[ \det(M) = \sum_{\sigma \in S_n} \text{sgn}(\sigma) \prod_{i=1}^n M_{\sigma(i),i} \] where $S_n$ is the symmetric group on $n$ elements and $\text{sgn}(\sigma)$ is the sign of the permutation $\sigma$.
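
For example, when $n = 2$ the symmetric group consists of the identity (sign $+1$) and the transposition of the two indices (sign $-1$), so the formula unfolds to \[ \det(M) = M_{1,1} M_{2,2} - M_{2,1} M_{1,2}, \] the usual $2 \times 2$ determinant.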
Matrix.det_apply' theorem
(M : Matrix n n R) : M.det = ∑ σ : Perm n, ε σ * ∏ i, M (σ i) i
Full source
theorem det_apply' (M : Matrix n n R) : M.det = ∑ σ : Perm n, ε σ * ∏ i, M (σ i) i := by
  simp [det_apply, Units.smul_def]
Leibniz Formula for Determinant (alternate form)
Informal description
For any square matrix $M$ of size $n \times n$ with entries in a ring $R$, the determinant of $M$ is given by the Leibniz formula: \[ \det(M) = \sum_{\sigma \in S_n} \varepsilon(\sigma) \prod_{i=1}^n M_{\sigma(i),i} \] where $S_n$ is the symmetric group on $n$ elements and $\varepsilon(\sigma)$ is the sign of the permutation $\sigma$ (equal to $1$ for even permutations and $-1$ for odd permutations).
Matrix.det_eq_detp_sub_detp theorem
(M : Matrix n n R) : M.det = M.detp 1 - M.detp (-1)
Full source
theorem det_eq_detp_sub_detp (M : Matrix n n R) : M.det = M.detp 1 - M.detp (-1) := by
  rw [det_apply, ← Equiv.sum_comp (Equiv.inv (Perm n)), ← ofSign_disjUnion, sum_disjUnion]
  simp_rw [inv_apply, sign_inv, sub_eq_add_neg, detp, ← sum_neg_distrib]
  refine congr_arg₂ (· + ·) (sum_congr rfl fun σ hσ ↦ ?_) (sum_congr rfl fun σ hσ ↦ ?_) <;>
    rw [mem_ofSign.mp hσ, ← Equiv.prod_comp σ] <;> simp
Determinant as Difference of Even and Odd Permutation Sums
Informal description
For any square matrix $M$ of size $n \times n$ with entries in a ring $R$, the determinant of $M$ equals the difference between the sum of products of matrix entries corresponding to even permutations and the sum of products corresponding to odd permutations. More precisely: \[ \det(M) = \sum_{\sigma \in S_n, \text{sgn}(\sigma)=1} \prod_{i=1}^n M_{\sigma(i),i} - \sum_{\sigma \in S_n, \text{sgn}(\sigma)=-1} \prod_{i=1}^n M_{\sigma(i),i} \] where $S_n$ is the symmetric group on $n$ elements and $\text{sgn}(\sigma)$ is the sign of the permutation $\sigma$.
Matrix.det_diagonal theorem
{d : n → R} : det (diagonal d) = ∏ i, d i
Full source
@[simp]
theorem det_diagonal {d : n → R} : det (diagonal d) = ∏ i, d i := by
  rw [det_apply']
  refine (Finset.sum_eq_single 1 ?_ ?_).trans ?_
  · rintro σ - h2
    obtain ⟨x, h3⟩ := not_forall.1 (mt Equiv.ext h2)
    convert mul_zero (ε σ)
    apply Finset.prod_eq_zero (mem_univ x)
    exact if_neg h3
  · simp
  · simp
Determinant of a Diagonal Matrix is the Product of Diagonal Entries
Informal description
For any diagonal matrix $M$ with entries $d_i$ on the diagonal, the determinant of $M$ is equal to the product of the diagonal entries: \[ \det(M) = \prod_{i} d_i. \]
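
A small usage sketch over $\mathbb{Z}$, relying on `Fin.prod_univ_two` (from elsewhere in Mathlib) to expand the product over `Fin 2`:

example (a b : ℤ) : Matrix.det (Matrix.diagonal ![a, b]) = a * b := by
  simp [Fin.prod_univ_two]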
Matrix.det_zero theorem
(_ : Nonempty n) : det (0 : Matrix n n R) = 0
Full source
theorem det_zero (_ : Nonempty n) : det (0 : Matrix n n R) = 0 :=
  (detRowAlternating : (n → R) [⋀^n]→ₗ[R] R).map_zero
Determinant of Zero Matrix is Zero
Informal description
For any nonempty type $n$ and any ring $R$, the determinant of the zero matrix of size $n \times n$ over $R$ is equal to $0$.
Matrix.det_one theorem
: det (1 : Matrix n n R) = 1
Full source
@[simp]
theorem det_one : det (1 : Matrix n n R) = 1 := by rw [← diagonal_one]; simp [-diagonal_one]
Determinant of Identity Matrix is One
Informal description
The determinant of the identity matrix of size $n \times n$ over a ring $R$ is equal to $1$.
Matrix.det_isEmpty theorem
[IsEmpty n] {A : Matrix n n R} : det A = 1
Full source
theorem det_isEmpty [IsEmpty n] {A : Matrix n n R} : det A = 1 := by simp [det_apply]
Determinant of Empty Matrix is One
Informal description
For any square matrix $A$ of size $n \times n$ over a ring $R$, if the index type $n$ is empty, then the determinant of $A$ is equal to $1$.
Matrix.coe_det_isEmpty theorem
[IsEmpty n] : (det : Matrix n n R → R) = Function.const _ 1
Full source
@[simp]
theorem coe_det_isEmpty [IsEmpty n] : (det : Matrix n n R → R) = Function.const _ 1 := by
  ext
  exact det_isEmpty
Determinant Function is Constant One for Empty Matrices
Informal description
If the index type $n$ is empty, then the determinant function $\det$, viewed as a map from $n \times n$ matrices over a ring $R$ to $R$, is equal to the constant function that maps every matrix to $1$.
Matrix.det_eq_one_of_card_eq_zero theorem
{A : Matrix n n R} (h : Fintype.card n = 0) : det A = 1
Full source
theorem det_eq_one_of_card_eq_zero {A : Matrix n n R} (h : Fintype.card n = 0) : det A = 1 :=
  haveI : IsEmpty n := Fintype.card_eq_zero_iff.mp h
  det_isEmpty
Determinant of Zero-Dimensional Matrix is One
Informal description
For any square matrix $A$ of size $n \times n$ over a ring $R$, if the index set $n$ has cardinality zero (i.e., $n$ is empty), then the determinant of $A$ is equal to $1$.
Matrix.det_unique theorem
{n : Type*} [Unique n] [DecidableEq n] [Fintype n] (A : Matrix n n R) : det A = A default default
Full source
/-- If `n` has only one element, the determinant of an `n` by `n` matrix is just that element.
Although `Unique` implies `DecidableEq` and `Fintype`, the instances might
not be syntactically equal. Thus, we need to fill in the args explicitly. -/
@[simp]
theorem det_unique {n : Type*} [Unique n] [DecidableEq n] [Fintype n] (A : Matrix n n R) :
    det A = A default default := by simp [det_apply, univ_unique]
Determinant of Singleton Matrix Equals Its Single Entry
Informal description
For any square matrix $A$ of size $n \times n$ over a ring $R$, where the index type $n$ has exactly one element (i.e., $n$ is a singleton type), the determinant of $A$ is equal to its single entry $A_{\text{default},\,\text{default}}$. Here $\text{default}$ denotes the unique element of type $n$.
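
A direct usage sketch: for matrices indexed by `Fin 1`, the hypotheses are satisfied and the determinant collapses to the single entry.

example (A : Matrix (Fin 1) (Fin 1) ℤ) : A.det = A default default :=
  Matrix.det_unique A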
Matrix.det_eq_elem_of_subsingleton theorem
[Subsingleton n] (A : Matrix n n R) (k : n) : det A = A k k
Full source
theorem det_eq_elem_of_subsingleton [Subsingleton n] (A : Matrix n n R) (k : n) :
    det A = A k k := by
  have := uniqueOfSubsingleton k
  convert det_unique A
Determinant of Subsingleton Matrix Equals Diagonal Entry
Informal description
For any square matrix $A$ of size $n \times n$ over a ring $R$, where the index type $n$ has at most one element (i.e., $n$ is a subsingleton type), the determinant of $A$ is equal to its diagonal entry $A_{k,k}$ for any $k \in n$.
Matrix.det_eq_elem_of_card_eq_one theorem
{A : Matrix n n R} (h : Fintype.card n = 1) (k : n) : det A = A k k
Full source
theorem det_eq_elem_of_card_eq_one {A : Matrix n n R} (h : Fintype.card n = 1) (k : n) :
    det A = A k k :=
  haveI : Subsingleton n := Fintype.card_le_one_iff_subsingleton.mp h.le
  det_eq_elem_of_subsingleton _ _
Determinant of Singleton Matrix Equals Its Single Entry
Informal description
For any square matrix $A$ of size $n \times n$ over a ring $R$, if the index set $n$ has cardinality one (i.e., $|n| = 1$), then the determinant of $A$ equals its single entry $A_{k,k}$ for any $k \in n$.
Matrix.det_mul_aux theorem
{M N : Matrix n n R} {p : n → n} (H : ¬Bijective p) : (∑ σ : Perm n, ε σ * ∏ x, M (σ x) (p x) * N (p x) x) = 0
Full source
theorem det_mul_aux {M N : Matrix n n R} {p : n → n} (H : ¬Bijective p) :
    (∑ σ : Perm n, ε σ * ∏ x, M (σ x) (p x) * N (p x) x) = 0 := by
  obtain ⟨i, j, hpij, hij⟩ : ∃ i j, p i = p j ∧ i ≠ j := by
    rw [← Finite.injective_iff_bijective, Injective] at H
    push_neg at H
    exact H
  exact
    sum_involution (fun σ _ => σ * Equiv.swap i j)
      (fun σ _ => by
        have : (∏ x, M (σ x) (p x)) = ∏ x, M ((σ * Equiv.swap i j) x) (p x) :=
          Fintype.prod_equiv (swap i j) _ _ (by simp [apply_swap_eq_self hpij])
        simp [this, sign_swap hij, -sign_swap', prod_mul_distrib])
      (fun σ _ _ => (not_congr mul_swap_eq_iff).mpr hij) (fun _ _ => mem_univ _) fun σ _ =>
      mul_swap_involutive i j σ
Vanishing of Determinant-like Sum for Non-Bijective Functions
Informal description
For any two square matrices $M$ and $N$ of size $n \times n$ over a ring $R$, and any function $p : n \to n$ that is not bijective, the sum over all permutations $\sigma$ of $n$ of the product $\text{sign}(\sigma) \cdot \prod_{x} M_{\sigma(x), p(x)} \cdot N_{p(x), x}$ equals zero. Here, $\text{sign}(\sigma)$ denotes the sign of the permutation $\sigma$, and the product is taken over all elements $x$ in the index set $n$.
Matrix.det_mul theorem
(M N : Matrix n n R) : det (M * N) = det M * det N
Full source
@[simp]
theorem det_mul (M N : Matrix n n R) : det (M * N) = det M * det N :=
  calc
    det (M * N) = ∑ p : n → n, ∑ σ : Perm n, ε σ * ∏ i, M (σ i) (p i) * N (p i) i := by
      simp only [det_apply', mul_apply, prod_univ_sum, mul_sum, Fintype.piFinset_univ]
      rw [Finset.sum_comm]
    _ = ∑ p : n → n with Bijective p, ∑ σ : Perm n, ε σ * ∏ i, M (σ i) (p i) * N (p i) i := by
      refine (sum_subset (filter_subset _ _) fun f _ hbij ↦ det_mul_aux ?_).symm
      simpa only [true_and, mem_filter, mem_univ] using hbij
    _ = ∑ τ : Perm n, ∑ σ : Perm n, ε σ * ∏ i, M (σ i) (τ i) * N (τ i) i :=
      sum_bij (fun p h ↦ Equiv.ofBijective p (mem_filter.1 h).2) (fun _ _ ↦ mem_univ _)
        (fun _ _ _ _ h ↦ by injection h)
        (fun b _ ↦ ⟨b, mem_filter.2 ⟨mem_univ _, b.bijective⟩, coe_fn_injective rfl⟩) fun _ _ ↦ rfl
    _ = ∑ σ : Perm n, ∑ τ : Perm n, (∏ i, N (σ i) i) * ε τ * ∏ j, M (τ j) (σ j) := by
      simp only [mul_comm, mul_left_comm, prod_mul_distrib, mul_assoc]
    _ = ∑ σ : Perm n, ∑ τ : Perm n, (∏ i, N (σ i) i) * (ε σ * ε τ) * ∏ i, M (τ i) i :=
      (sum_congr rfl fun σ _ =>
        Fintype.sum_equiv (Equiv.mulRight σ⁻¹) _ _ fun τ => by
          have : (∏ j, M (τ j) (σ j)) = ∏ j, M ((τ * σ⁻¹) j) j := by
            rw [← (σ⁻¹ : _ ≃ _).prod_comp]
            simp only [Equiv.Perm.coe_mul, apply_inv_self, Function.comp_apply]
          have h : ε σ * ε (τ * σ⁻¹) = ε τ :=
            calc
              ε σ * ε (τ * σ⁻¹) = ε (τ * σ⁻¹ * σ) := by
                rw [mul_comm, sign_mul (τ * σ⁻¹)]
                simp only [Int.cast_mul, Units.val_mul]
              _ = ε τ := by simp only [inv_mul_cancel_right]

          simp_rw [Equiv.coe_mulRight, h]
          simp only [this])
    _ = det M * det N := by
      simp only [det_apply', Finset.mul_sum, mul_comm, mul_left_comm, mul_assoc]
Multiplicativity of Determinant: $\det(MN) = \det(M)\det(N)$
Informal description
For any two square matrices $M$ and $N$ of size $n \times n$ over a ring $R$, the determinant of their product equals the product of their determinants, i.e., \[ \det(M \cdot N) = \det(M) \cdot \det(N). \]
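
A numeric sanity check of multiplicativity, again assuming `Matrix.det_fin_two_of` from elsewhere in Mathlib to evaluate the $2 \times 2$ determinants:

example : Matrix.det (!![(1 : ℤ), 2; 3, 4] * !![2, 0; 1, 2]) = -8 := by
  rw [Matrix.det_mul]
  norm_num [Matrix.det_fin_two_of]

Here the factors have determinants $-2$ and $4$, so the product has determinant $-8$.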
Matrix.detMonoidHom definition
: Matrix n n R →* R
Full source
/-- The determinant of a matrix, as a monoid homomorphism. -/
def detMonoidHom : Matrix n n R →* R where
  toFun := det
  map_one' := det_one
  map_mul' := det_mul
Determinant as a monoid homomorphism
Informal description
The determinant function, viewed as a monoid homomorphism from the multiplicative monoid of $n \times n$ matrices over a ring $R$ to the multiplicative monoid of $R$. It maps the identity matrix to $1$ and satisfies the multiplicative property $\det(M \cdot N) = \det(M) \cdot \det(N)$ for any two matrices $M$ and $N$.
Matrix.coe_detMonoidHom theorem
: (detMonoidHom : Matrix n n R → R) = det
Full source
@[simp]
theorem coe_detMonoidHom : (detMonoidHom : Matrix n n R → R) = det :=
  rfl
Determinant Monoid Homomorphism as Determinant Function
Informal description
The determinant monoid homomorphism, when viewed as a function from $n \times n$ matrices over a ring $R$ to $R$, is equal to the determinant function $\det$.
Matrix.det_mul_comm theorem
(M N : Matrix m m R) : det (M * N) = det (N * M)
Full source
/-- On square matrices, `mul_comm` applies under `det`. -/
theorem det_mul_comm (M N : Matrix m m R) : det (M * N) = det (N * M) := by
  rw [det_mul, det_mul, mul_comm]
Commutativity of Determinant under Matrix Multiplication: $\det(MN) = \det(NM)$
Informal description
For any two square matrices $M$ and $N$ of size $m \times m$ over a ring $R$, the determinant of their product does not depend on the order of the factors, i.e., \[ \det(M \cdot N) = \det(N \cdot M). \]
Matrix.det_mul_left_comm theorem
(M N P : Matrix m m R) : det (M * (N * P)) = det (N * (M * P))
Full source
/-- On square matrices, `mul_left_comm` applies under `det`. -/
theorem det_mul_left_comm (M N P : Matrix m m R) : det (M * (N * P)) = det (N * (M * P)) := by
  rw [← Matrix.mul_assoc, ← Matrix.mul_assoc, det_mul, det_mul_comm M N, ← det_mul]
Left-Commutativity of Determinant under Matrix Multiplication: $\det(M(NP)) = \det(N(MP))$
Informal description
For any three square matrices $M$, $N$, and $P$ of size $m \times m$ over a ring $R$, the determinant satisfies the left-commutativity property: \[ \det(M \cdot (N \cdot P)) = \det(N \cdot (M \cdot P)). \]
Matrix.det_mul_right_comm theorem
(M N P : Matrix m m R) : det (M * N * P) = det (M * P * N)
Full source
/-- On square matrices, `mul_right_comm` applies under `det`. -/
theorem det_mul_right_comm (M N P : Matrix m m R) : det (M * N * P) = det (M * P * N) := by
  rw [Matrix.mul_assoc, Matrix.mul_assoc, det_mul, det_mul_comm N P, ← det_mul]
Right-Commutativity of Determinant under Matrix Multiplication: $\det(MNP) = \det(MPN)$
Informal description
For any three square matrices $M$, $N$, and $P$ of size $m \times m$ over a ring $R$, the determinant satisfies the right-commutativity property: \[ \det(M \cdot N \cdot P) = \det(M \cdot P \cdot N). \]
Matrix.det_units_conj theorem
(M : (Matrix m m R)ˣ) (N : Matrix m m R) : det (M.val * N * M⁻¹.val) = det N
Full source
theorem det_units_conj (M : (Matrix m m R)ˣ) (N : Matrix m m R) :
    det (M.val * N * M⁻¹.val) = det N := by
  rw [det_mul_right_comm, Units.mul_inv, one_mul]
Determinant Invariance under Conjugation by Invertible Matrix: $\det(M N M^{-1}) = \det(N)$
Informal description
For any invertible matrix $M$ and any square matrix $N$ of size $m \times m$ over a ring $R$, the determinant of the conjugate $M N M^{-1}$ is equal to the determinant of $N$, i.e., \[ \det(M N M^{-1}) = \det(N). \]
Matrix.det_units_conj' theorem
(M : (Matrix m m R)ˣ) (N : Matrix m m R) : det (M⁻¹.val * N * ↑M.val) = det N
Full source
theorem det_units_conj' (M : (Matrix m m R)ˣ) (N : Matrix m m R) :
    det (M⁻¹.val * N * ↑M.val) = det N :=
  det_units_conj M⁻¹ N
Determinant Invariance under Conjugation by Inverse Matrix: $\det(M^{-1} N M) = \det(N)$
Informal description
For any invertible matrix $M$ and any square matrix $N$ of size $m \times m$ over a ring $R$, the determinant of the conjugate $M^{-1} N M$ is equal to the determinant of $N$, i.e., \[ \det(M^{-1} N M) = \det(N). \]
Matrix.det_transpose theorem
(M : Matrix n n R) : Mᵀ.det = M.det
Full source
/-- Transposing a matrix preserves the determinant. -/
@[simp]
theorem det_transpose (M : Matrix n n R) : Mᵀ.det = M.det := by
  rw [det_apply', det_apply']
  refine Fintype.sum_bijective _ inv_involutive.bijective _ _ ?_
  intro σ
  rw [sign_inv]
  congr 1
  apply Fintype.prod_equiv σ
  simp
Determinant of Transpose Matrix Equals Determinant of Original Matrix
Informal description
For any square matrix $M$ of size $n \times n$ with entries in a ring $R$, the determinant of the transpose matrix $M^\top$ is equal to the determinant of $M$, i.e., \[ \det(M^\top) = \det(M). \]
Matrix.det_permute theorem
(σ : Perm n) (M : Matrix n n R) : (M.submatrix σ id).det = Perm.sign σ * M.det
Full source
/-- Permuting the columns changes the sign of the determinant. -/
theorem det_permute (σ : Perm n) (M : Matrix n n R) :
    (M.submatrix σ id).det = Perm.sign σ * M.det :=
  ((detRowAlternating : (n → R) [⋀^n]→ₗ[R] R).map_perm M σ).trans (by simp [Units.smul_def])
Effect of Row Permutation on Determinant: $\det(M_{\sigma(i),j}) = \text{sign}(\sigma) \cdot \det(M)$
Informal description
Let $M$ be an $n \times n$ matrix over a ring $R$, and let $\sigma$ be a permutation of the set $\{1, \dots, n\}$. Then the determinant of the matrix obtained by permuting the rows of $M$ according to $\sigma$ satisfies: \[ \det(M_{\sigma(i),j}) = \text{sign}(\sigma) \cdot \det(M) \] where $\text{sign}(\sigma)$ is the sign of the permutation $\sigma$ (equal to $1$ if $\sigma$ is even and $-1$ if $\sigma$ is odd).
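
For instance, with $n = 2$ and $\sigma$ the transposition of the two indices, the rows of $M = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$ are interchanged and \[ \det\begin{pmatrix} c & d \\ a & b \end{pmatrix} = cb - da = -(ad - bc) = -\det(M), \] in agreement with $\text{sign}(\sigma) = -1$.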
Matrix.det_permute' theorem
(σ : Perm n) (M : Matrix n n R) : (M.submatrix id σ).det = Perm.sign σ * M.det
Full source
/-- Permuting the rows changes the sign of the determinant. -/
theorem det_permute' (σ : Perm n) (M : Matrix n n R) :
    (M.submatrix id σ).det = Perm.sign σ * M.det := by
  rw [← det_transpose, transpose_submatrix, det_permute, det_transpose]
Effect of Column Permutation on Determinant: $\det(M_{i,\sigma(j)}) = \text{sign}(\sigma) \cdot \det(M)$
Informal description
Let $M$ be an $n \times n$ matrix over a ring $R$, and let $\sigma$ be a permutation of the set $\{1, \dots, n\}$. Then the determinant of the matrix obtained by permuting the columns of $M$ according to $\sigma$ satisfies: \[ \det(M_{i,\sigma(j)}) = \text{sign}(\sigma) \cdot \det(M) \] where $\text{sign}(\sigma)$ is the sign of the permutation $\sigma$ (equal to $1$ if $\sigma$ is even and $-1$ if $\sigma$ is odd).
Matrix.det_submatrix_equiv_self theorem
(e : n ≃ m) (A : Matrix m m R) : det (A.submatrix e e) = det A
Full source
/-- Permuting rows and columns with the same equivalence does not change the determinant. -/
@[simp]
theorem det_submatrix_equiv_self (e : n ≃ m) (A : Matrix m m R) :
    det (A.submatrix e e) = det A := by
  rw [det_apply', det_apply']
  apply Fintype.sum_equiv (Equiv.permCongr e)
  intro σ
  rw [Equiv.Perm.sign_permCongr e σ]
  congr 1
  apply Fintype.prod_equiv e
  intro i
  rw [Equiv.permCongr_apply, Equiv.symm_apply_apply, submatrix_apply]
Determinant Invariance under Simultaneous Row and Column Permutation via Equivalence
Informal description
Let $A$ be an $m \times m$ matrix over a ring $R$, and let $e : n \simeq m$ be an equivalence between the index sets $n$ and $m$. Then the determinant of the submatrix obtained by selecting rows and columns according to $e$ is equal to the determinant of $A$, i.e., \[ \det(A_{e(i),e(j)}) = \det(A). \]
Matrix.abs_det_submatrix_equiv_equiv theorem
{R : Type*} [CommRing R] [LinearOrder R] [IsStrictOrderedRing R] (e₁ e₂ : n ≃ m) (A : Matrix m m R) : |(A.submatrix e₁ e₂).det| = |A.det|
Full source
/-- Permuting rows and columns with two equivalences does not change the absolute value of the
determinant. -/
@[simp]
theorem abs_det_submatrix_equiv_equiv {R : Type*}
    [CommRing R] [LinearOrder R] [IsStrictOrderedRing R]
    (e₁ e₂ : n ≃ m) (A : Matrix m m R) :
    |(A.submatrix e₁ e₂).det| = |A.det| := by
  have hee : e₂ = e₁.trans (e₁.symm.trans e₂) := by ext; simp
  rw [hee]
  show |((A.submatrix id (e₁.symm.trans e₂)).submatrix e₁ e₁).det| = |A.det|
  rw [Matrix.det_submatrix_equiv_self, Matrix.det_permute', abs_mul, abs_unit_intCast, one_mul]
Absolute Determinant Invariance under Independent Row and Column Permutations: $|\det(A_{e_1(i), e_2(j)})| = |\det(A)|$
Informal description
Let $R$ be a commutative ring equipped with a linear order making it a strictly ordered ring, and let $A$ be an $m \times m$ matrix over $R$. For any two equivalences $e_1, e_2 : n \simeq m$ between index sets $n$ and $m$, the absolute value of the determinant of the submatrix $A_{e_1(i), e_2(j)}$ is equal to the absolute value of the determinant of $A$, i.e., \[ |\det(A_{e_1(i), e_2(j)})| = |\det(A)|. \]
Matrix.det_reindex_self theorem
(e : m ≃ n) (A : Matrix m m R) : det (reindex e e A) = det A
Full source
/-- Reindexing both indices along the same equivalence preserves the determinant.

For the `simp` version of this lemma, see `det_submatrix_equiv_self`; this one is unsuitable because
`Matrix.reindex_apply` unfolds `reindex` first.
-/
theorem det_reindex_self (e : m ≃ n) (A : Matrix m m R) : det (reindex e e A) = det A :=
  det_submatrix_equiv_self e.symm A
Determinant Invariance under Simultaneous Row and Column Reindexing
Informal description
Let $A$ be an $m \times m$ matrix over a ring $R$, and let $e : m \simeq n$ be an equivalence between the index sets $m$ and $n$. Then the determinant of the matrix obtained by reindexing both rows and columns of $A$ according to $e$ is equal to the determinant of $A$, i.e., \[ \det(A_{e(i),e(j)}) = \det(A). \]
Matrix.abs_det_reindex theorem
{R : Type*} [CommRing R] [LinearOrder R] [IsStrictOrderedRing R] (e₁ e₂ : m ≃ n) (A : Matrix m m R) : |det (reindex e₁ e₂ A)| = |det A|
Full source
/-- Reindexing both indices along equivalences preserves the absolute of the determinant.

For the `simp` version of this lemma, see `abs_det_submatrix_equiv_equiv`;
this one is unsuitable because `Matrix.reindex_apply` unfolds `reindex` first.
-/
theorem abs_det_reindex {R : Type*} [CommRing R] [LinearOrder R] [IsStrictOrderedRing R]
    (e₁ e₂ : m ≃ n) (A : Matrix m m R) :
    |det (reindex e₁ e₂ A)| = |det A| :=
  abs_det_submatrix_equiv_equiv e₁.symm e₂.symm A
Absolute Determinant Invariance under Reindexing: $|\det(A_{e_1, e_2})| = |\det(A)|$
Informal description
Let $R$ be a commutative ring with a strict linear order and the structure of a strictly ordered ring. For any $m \times m$ matrix $A$ over $R$ and any two index equivalences $e_1, e_2 : m \simeq n$, the absolute value of the determinant of the reindexed matrix $A_{e_1(i), e_2(j)}$ equals the absolute value of the determinant of $A$, i.e., \[ |\det(A_{e_1(i), e_2(j)})| = |\det(A)|. \]
Matrix.det_smul theorem
(A : Matrix n n R) (c : R) : det (c • A) = c ^ Fintype.card n * det A
Full source
theorem det_smul (A : Matrix n n R) (c : R) : det (c • A) = c ^ Fintype.card n * det A :=
  calc
    det (c • A) = det ((diagonal fun _ => c) * A) := by rw [smul_eq_diagonal_mul]
    _ = det (diagonal fun _ => c) * det A := det_mul _ _
    _ = c ^ Fintype.card n * det A := by simp
Determinant Scaling: $\det(cA) = c^n \det(A)$
Informal description
For any square matrix $A$ of size $n \times n$ with entries in a ring $R$ and any scalar $c \in R$, the determinant of the scaled matrix $cA$ equals $c^n$ times the determinant of $A$, i.e., \[ \det(cA) = c^n \det(A). \]
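
A usage sketch for a fixed size, where `Fintype.card (Fin 2)` rewrites to `2`:

example (c : ℤ) (A : Matrix (Fin 2) (Fin 2) ℤ) : (c • A).det = c ^ 2 * A.det := by
  rw [Matrix.det_smul, Fintype.card_fin]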
Matrix.det_smul_of_tower theorem
{α} [Monoid α] [MulAction α R] [IsScalarTower α R R] [SMulCommClass α R R] (c : α) (A : Matrix n n R) : det (c • A) = c ^ Fintype.card n • det A
Full source
@[simp]
theorem det_smul_of_tower {α} [Monoid α] [MulAction α R] [IsScalarTower α R R]
    [SMulCommClass α R R] (c : α) (A : Matrix n n R) :
    det (c • A) = c ^ Fintype.card n • det A := by
  rw [← smul_one_smul R c A, det_smul, smul_pow, one_pow, smul_mul_assoc, one_mul]
Determinant Scaling with Monoid Action: $\det(cA) = c^n \det(A)$
Informal description
Let $R$ be a ring and $\alpha$ a monoid acting on $R$ such that the action is compatible with multiplication in $R$ (i.e., `IsScalarTower α R R`) and commutes with it (i.e., `SMulCommClass α R R`). For any scalar $c \in \alpha$ and any square matrix $A$ of size $n \times n$ with entries in $R$, the determinant of the scaled matrix $cA$ equals $c^n$ acting on the determinant of $A$, i.e., \[ \det(cA) = c^n \cdot \det(A). \]
Matrix.det_neg theorem
(A : Matrix n n R) : det (-A) = (-1) ^ Fintype.card n * det A
Full source
theorem det_neg (A : Matrix n n R) : det (-A) = (-1) ^ Fintype.card n * det A := by
  rw [← det_smul, neg_one_smul]
Determinant of Negated Matrix: $\det(-A) = (-1)^n \det(A)$
Informal description
For any square matrix $A$ of size $n \times n$ with entries in a ring $R$, the determinant of the negation of $A$ equals $(-1)^n$ times the determinant of $A$, i.e., \[ \det(-A) = (-1)^n \det(A). \]
Matrix.det_neg_eq_smul theorem
(A : Matrix n n R) : det (-A) = (-1 : Units ℤ) ^ Fintype.card n • det A
Full source
/-- A variant of `Matrix.det_neg` with scalar multiplication by `Units ℤ` instead of multiplication
by `R`. -/
theorem det_neg_eq_smul (A : Matrix n n R) :
    det (-A) = (-1 : Units ℤ) ^ Fintype.card n • det A := by
  rw [← det_smul_of_tower, Units.neg_smul, one_smul]
Determinant of Negated Matrix via Integer Units: $\det(-A) = (-1)^n \det(A)$
Informal description
For any square matrix $A$ of size $n \times n$ with entries in a ring $R$, the determinant of the negated matrix $-A$ equals $(-1)^n$ times the determinant of $A$, where the scalar multiplication is performed using the unit $-1$ from the integers, i.e., \[ \det(-A) = (-1)^n \cdot \det(A). \]
Matrix.det_mul_row theorem
(v : n → R) (A : Matrix n n R) : det (of fun i j => v j * A i j) = (∏ i, v i) * det A
Full source
/-- Multiplying each row by a fixed `v i` multiplies the determinant by
the product of the `v`s. -/
theorem det_mul_row (v : n → R) (A : Matrix n n R) :
    det (of fun i j => v j * A i j) = (∏ i, v i) * det A :=
  calc
    det (of fun i j => v j * A i j) = det (A * diagonal v) :=
      congr_arg det <| by
        ext
        simp [mul_comm]
    _ = (∏ i, v i) * det A := by rw [det_mul, det_diagonal, mul_comm]
Determinant Scaling by Row Multiplier: $\det(v_j \cdot A_{i,j}) = (\prod_i v_i) \det(A)$
Informal description
For any vector $v \colon n \to R$ and any square matrix $A$ of size $n \times n$ over a ring $R$, the determinant of the matrix obtained by multiplying each entry $A_{i,j}$ by $v_j$ is equal to the product of the entries of $v$ multiplied by the determinant of $A$, i.e., \[ \det \big( (v_j \cdot A_{i,j})_{i,j} \big) = \left( \prod_{i=1}^n v_i \right) \cdot \det(A). \]
Matrix.det_mul_column theorem
(v : n → R) (A : Matrix n n R) : det (of fun i j => v i * A i j) = (∏ i, v i) * det A
Full source
/-- Multiplying each column by a fixed `v j` multiplies the determinant by
the product of the `v`s. -/
theorem det_mul_column (v : n → R) (A : Matrix n n R) :
    det (of fun i j => v i * A i j) = (∏ i, v i) * det A :=
  MultilinearMap.map_smul_univ _ v A
Determinant Scaling by Column Multiplier: $\det(v_i \cdot A_{i,j}) = (\prod_i v_i) \det(A)$
Informal description
For any vector $v \colon n \to R$ and any square matrix $A$ of size $n \times n$ over a ring $R$, the determinant of the matrix obtained by multiplying each entry $A_{i,j}$ by $v_i$ is equal to the product of the entries of $v$ multiplied by the determinant of $A$, i.e., \[ \det \big( (v_i \cdot A_{i,j})_{i,j} \big) = \left( \prod_{i=1}^n v_i \right) \cdot \det(A). \]
Matrix.det_pow theorem
(M : Matrix m m R) (n : ℕ) : det (M ^ n) = det M ^ n
Full source
@[simp]
theorem det_pow (M : Matrix m m R) (n : ℕ) : det (M ^ n) = det M ^ n :=
  (detMonoidHom : Matrix m m R →* R).map_pow M n
Determinant of Matrix Power: $\det(M^n) = (\det M)^n$
Informal description
For any square matrix $M$ of size $m \times m$ over a ring $R$ and any natural number $n$, the determinant of $M^n$ is equal to the $n$-th power of the determinant of $M$, i.e., \[ \det(M^n) = (\det M)^n. \]
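
A direct usage sketch:

example (M : Matrix (Fin 3) (Fin 3) ℤ) : Matrix.det (M ^ 2) = Matrix.det M ^ 2 :=
  Matrix.det_pow M 2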
RingHom.map_det theorem
(f : R →+* S) (M : Matrix n n R) : f M.det = Matrix.det (f.mapMatrix M)
Full source
theorem _root_.RingHom.map_det (f : R →+* S) (M : Matrix n n R) :
    f M.det = Matrix.det (f.mapMatrix M) := by
  simp [Matrix.det_apply', map_sum f, map_prod f]
Determinant Preservation under Ring Homomorphism: $f(\det(M)) = \det(f(M))$
Informal description
For any ring homomorphism $f \colon R \to S$ and any square matrix $M$ of size $n \times n$ with entries in $R$, the image of the determinant of $M$ under $f$ is equal to the determinant of the matrix obtained by applying $f$ to each entry of $M$, i.e., \[ f(\det(M)) = \det(f(M_{i,j})_{i,j}). \]
RingEquiv.map_det theorem
(f : R ≃+* S) (M : Matrix n n R) : f M.det = Matrix.det (f.mapMatrix M)
Full source
theorem _root_.RingEquiv.map_det (f : R ≃+* S) (M : Matrix n n R) :
    f M.det = Matrix.det (f.mapMatrix M) :=
  f.toRingHom.map_det _
Determinant Preservation under Ring Isomorphism: $f(\det(M)) = \det(f(M))$
Informal description
For any ring isomorphism $f \colon R \to S$ and any square matrix $M$ of size $n \times n$ with entries in $R$, the image of the determinant of $M$ under $f$ is equal to the determinant of the matrix obtained by applying $f$ to each entry of $M$, i.e., \[ f(\det(M)) = \det(f(M_{i,j})_{i,j}). \]
AlgHom.map_det theorem
[Algebra R S] {T : Type z} [CommRing T] [Algebra R T] (f : S →ₐ[R] T) (M : Matrix n n S) : f M.det = Matrix.det (f.mapMatrix M)
Full source
theorem _root_.AlgHom.map_det [Algebra R S] {T : Type z} [CommRing T] [Algebra R T] (f : S →ₐ[R] T)
    (M : Matrix n n S) : f M.det = Matrix.det (f.mapMatrix M) :=
  f.toRingHom.map_det _
Determinant Preservation under Algebra Homomorphism: $f(\det(M)) = \det(f(M))$
Informal description
Let $R$, $S$, and $T$ be commutative rings with $R$-algebra structures on $S$ and $T$. For any $R$-algebra homomorphism $f \colon S \to T$ and any square matrix $M$ of size $n \times n$ with entries in $S$, the image of the determinant of $M$ under $f$ is equal to the determinant of the matrix obtained by applying $f$ to each entry of $M$, i.e., \[ f(\det(M)) = \det(f(M_{i,j})_{i,j}). \]
AlgEquiv.map_det theorem
[Algebra R S] {T : Type z} [CommRing T] [Algebra R T] (f : S ≃ₐ[R] T) (M : Matrix n n S) : f M.det = Matrix.det (f.mapMatrix M)
Full source
theorem _root_.AlgEquiv.map_det [Algebra R S] {T : Type z} [CommRing T] [Algebra R T]
    (f : S ≃ₐ[R] T) (M : Matrix n n S) : f M.det = Matrix.det (f.mapMatrix M) :=
  f.toAlgHom.map_det _
Determinant Preservation under Algebra Isomorphism: $f(\det(M)) = \det(f(M))$
Informal description
Let $R$, $S$, and $T$ be commutative rings with $R$-algebra structures on $S$ and $T$. For any $R$-algebra isomorphism $f \colon S \to T$ and any square matrix $M$ of size $n \times n$ with entries in $S$, the image of the determinant of $M$ under $f$ is equal to the determinant of the matrix obtained by applying $f$ to each entry of $M$, i.e., \[ f(\det(M)) = \det(f(M_{i,j})_{i,j}). \]
Int.cast_det theorem
(M : Matrix n n ℤ) : (M.det : R) = (M.map fun x ↦ (x : R)).det
Full source
@[norm_cast]
theorem _root_.Int.cast_det (M : Matrix n n ℤ) :
    (M.det : R) = (M.map fun x ↦ (x : R)).det :=
  Int.castRingHom R |>.map_det M
Determinant Preservation under Integer Entry Casting: $(\det M : R) = \det\big((M_{i,j} : R)\big)$
Informal description
For any square matrix $M$ of size $n \times n$ with integer entries, the determinant of $M$, cast from $\mathbb{Z}$ into a ring $R$, equals the determinant of the matrix obtained by casting each entry of $M$ into $R$. That is, \[ (\det M : R) = \det\big((M_{i,j} : R)_{i,j}\big). \]
Rat.cast_det theorem
{F : Type*} [Field F] [CharZero F] (M : Matrix n n ℚ) : (M.det : F) = (M.map fun x ↦ (x : F)).det
Full source
@[norm_cast]
theorem _root_.Rat.cast_det {F : Type*} [Field F] [CharZero F] (M : Matrix n n ℚ) :
    (M.det : F) = (M.map fun x ↦ (x : F)).det :=
  Rat.castHom F |>.map_det M
Determinant Preservation under Rational Embedding: $(\det M : F) = \det\big((M_{i,j} : F)\big)$
Informal description
Let $F$ be a field of characteristic zero and $M$ be an $n \times n$ matrix with entries in $\mathbb{Q}$. Then the image of the determinant of $M$ under the canonical embedding $\mathbb{Q} \hookrightarrow F$ equals the determinant of the matrix obtained by applying this embedding to each entry of $M$, i.e., \[ (\det M : F) = \det\big((M_{i,j} : F)_{i,j}\big). \]
Matrix.det_conjTranspose theorem
[StarRing R] (M : Matrix m m R) : det Mᴴ = star (det M)
Full source
@[simp]
theorem det_conjTranspose [StarRing R] (M : Matrix m m R) : det Mᴴ = star (det M) :=
  ((starRingEnd R).map_det _).symm.trans <| congr_arg star M.det_transpose
Determinant of Conjugate Transpose Equals Conjugate of Determinant
Informal description
For any square matrix $M$ of size $m \times m$ over a ring $R$ equipped with a star operation (forming a star ring), the determinant of the conjugate transpose $M^\mathsf{H}$ is equal to the star of the determinant of $M$, i.e., \[ \det(M^\mathsf{H}) = \overline{\det(M)}. \]
Matrix.det_eq_zero_of_row_eq_zero theorem
{A : Matrix n n R} (i : n) (h : ∀ j, A i j = 0) : det A = 0
Full source
theorem det_eq_zero_of_row_eq_zero {A : Matrix n n R} (i : n) (h : ∀ j, A i j = 0) : det A = 0 :=
  (detRowAlternating : (n → R) [⋀^n]→ₗ[R] R).map_coord_zero i (funext h)
Determinant Vanishes on Zero Row: $\det(A) = 0$ if a row is entirely zero
Informal description
For any square matrix $A$ of size $n \times n$ over a ring $R$, if there exists a row index $i$ such that every entry in the $i$-th row is zero (i.e., $A_{i,j} = 0$ for all $j$), then the determinant of $A$ is zero, i.e., $\det(A) = 0$.
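
A usage sketch, for a hypothetical matrix whose row of index `1` is assumed to vanish:

example (A : Matrix (Fin 3) (Fin 3) ℤ) (h : ∀ j, A 1 j = 0) : A.det = 0 :=
  Matrix.det_eq_zero_of_row_eq_zero 1 h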
Matrix.det_eq_zero_of_column_eq_zero theorem
{A : Matrix n n R} (j : n) (h : ∀ i, A i j = 0) : det A = 0
Full source
theorem det_eq_zero_of_column_eq_zero {A : Matrix n n R} (j : n) (h : ∀ i, A i j = 0) :
    det A = 0 := by
  rw [← det_transpose]
  exact det_eq_zero_of_row_eq_zero j h
Determinant Vanishes on Zero Column: $\det(A) = 0$ if a column is entirely zero
Informal description
For any square matrix $A$ of size $n \times n$ over a ring $R$, if there exists a column index $j$ such that every entry in the $j$-th column is zero (i.e., $A_{i,j} = 0$ for all $i$), then the determinant of $A$ is zero, i.e., $\det(A) = 0$.
Matrix.det_zero_of_row_eq theorem
(i_ne_j : i ≠ j) (hij : M i = M j) : M.det = 0
Full source
/-- If a matrix has a repeated row, the determinant will be zero. -/
theorem det_zero_of_row_eq (i_ne_j : i ≠ j) (hij : M i = M j) : M.det = 0 :=
  (detRowAlternating : (n → R) [⋀^n]→ₗ[R] R).map_eq_zero_of_eq M hij i_ne_j
Determinant Vanishes on Repeated Rows: $\det(M) = 0$ if two rows are equal
Informal description
For any square matrix $M$ over a ring $R$, if two distinct rows $i$ and $j$ (with $i \neq j$) are equal (i.e., $M_i = M_j$), then the determinant of $M$ is zero, i.e., $\det(M) = 0$.
Matrix.det_zero_of_column_eq theorem
(i_ne_j : i ≠ j) (hij : ∀ k, M k i = M k j) : M.det = 0
Full source
/-- If a matrix has a repeated column, the determinant will be zero. -/
theorem det_zero_of_column_eq (i_ne_j : i ≠ j) (hij : ∀ k, M k i = M k j) : M.det = 0 := by
  rw [← det_transpose, det_zero_of_row_eq i_ne_j]
  exact funext hij
Determinant Vanishes on Repeated Columns: $\det(M) = 0$ if two columns are equal
Informal description
For any square matrix $M$ over a ring $R$, if two distinct columns $i$ and $j$ (with $i \neq j$) are equal (i.e., $M_{k,i} = M_{k,j}$ for all rows $k$), then the determinant of $M$ is zero, i.e., $\det(M) = 0$.
Matrix.det_updateRow_eq_zero theorem
(h : i ≠ j) : (M.updateRow j (M i)).det = 0
Full source
/-- If we repeat a row of a matrix, we get a matrix of determinant zero. -/
theorem det_updateRow_eq_zero (h : i ≠ j) :
    (M.updateRow j (M i)).det = 0 := det_zero_of_row_eq h (by simp [h])
Determinant Vanishes When Updating a Row to Equal Another: $\det(M_{j \mapsto M_i}) = 0$ for $i \neq j$
Informal description
For any square matrix $M$ over a ring $R$, if we update the $j$-th row of $M$ to be equal to its $i$-th row (with $i \neq j$), then the determinant of the resulting matrix is zero, i.e., $\det(M \text{ with row } j \text{ replaced by row } i) = 0$.
Matrix.det_updateCol_eq_zero theorem
(h : i ≠ j) : (M.updateCol j (fun k ↦ M k i)).det = 0
Full source
/-- If we repeat a column of a matrix, we get a matrix of determinant zero. -/
theorem det_updateCol_eq_zero (h : i ≠ j) :
    (M.updateCol j (fun k ↦ M k i)).det = 0 := det_zero_of_column_eq h (by simp [h])
Determinant Vanishes When Updating a Column to Equal Another: $\det(M_{j \mapsto M_{\cdot i}}) = 0$ for $i \neq j$
Informal description
For any square matrix $M$ over a ring $R$, if we update the $j$-th column of $M$ to be equal to its $i$-th column (with $i \neq j$), then the determinant of the resulting matrix is zero, i.e., $\det(M \text{ with column } j \text{ replaced by column } i) = 0$.
Matrix.det_updateRow_add theorem
(M : Matrix n n R) (j : n) (u v : n → R) : det (updateRow M j <| u + v) = det (updateRow M j u) + det (updateRow M j v)
Full source
theorem det_updateRow_add (M : Matrix n n R) (j : n) (u v : n → R) :
    det (updateRow M j <| u + v) = det (updateRow M j u) + det (updateRow M j v) :=
  (detRowAlternating : (n → R) [⋀^n]→ₗ[R] R).map_update_add M j u v
Additivity of Determinant under Row Update: $\det(M_{j \mapsto u+v}) = \det(M_{j \mapsto u}) + \det(M_{j \mapsto v})$
Informal description
Let $M$ be an $n \times n$ matrix over a ring $R$, and let $j$ be an index. For any two row vectors $u, v : n \to R$, the determinant of the matrix obtained by updating the $j$-th row of $M$ to $u + v$ equals the sum of the determinants of the matrices obtained by updating the $j$-th row to $u$ and to $v$ separately. That is, \[ \det(M \text{ with row } j \text{ replaced by } u + v) = \det(M \text{ with row } j \text{ replaced by } u) + \det(M \text{ with row } j \text{ replaced by } v). \]
Matrix.det_updateCol_add theorem
(M : Matrix n n R) (j : n) (u v : n → R) : det (updateCol M j <| u + v) = det (updateCol M j u) + det (updateCol M j v)
Full source
theorem det_updateCol_add (M : Matrix n n R) (j : n) (u v : n → R) :
    det (updateCol M j <| u + v) = det (updateCol M j u) + det (updateCol M j v) := by
  rw [← det_transpose, ← updateRow_transpose, det_updateRow_add]
  simp [updateRow_transpose, det_transpose]
Additivity of Determinant under Column Update: $\det(M_{j \mapsto u+v}) = \det(M_{j \mapsto u}) + \det(M_{j \mapsto v})$
Informal description
Let $M$ be an $n \times n$ matrix over a ring $R$, and let $j$ be a column index. For any two column vectors $u, v : n \to R$, the determinant of the matrix obtained by updating the $j$-th column of $M$ to $u + v$ equals the sum of the determinants of the matrices obtained by updating the $j$-th column to $u$ and to $v$ separately. That is, \[ \det(M \text{ with column } j \text{ replaced by } u + v) = \det(M \text{ with column } j \text{ replaced by } u) + \det(M \text{ with column } j \text{ replaced by } v). \]
Matrix.det_updateRow_smul theorem
(M : Matrix n n R) (j : n) (s : R) (u : n → R) : det (updateRow M j <| s • u) = s * det (updateRow M j u)
Full source
theorem det_updateRow_smul (M : Matrix n n R) (j : n) (s : R) (u : n → R) :
    det (updateRow M j <| s • u) = s * det (updateRow M j u) :=
  (detRowAlternating : (n → R) [⋀^n]→ₗ[R] R).map_update_smul M j s u
Determinant Scaling under Row Update: $\det(M_{j \mapsto s u}) = s \det(M_{j \mapsto u})$
Informal description
Let $M$ be an $n \times n$ matrix over a ring $R$, let $j$ be a row index, $s$ a scalar in $R$, and $u$ a row vector in $R^n$. Then the determinant of the matrix obtained by updating the $j$-th row of $M$ to $s \cdot u$ satisfies: \[ \det(M \text{ with row } j \text{ replaced by } s \cdot u) = s \cdot \det(M \text{ with row } j \text{ replaced by } u). \]
Matrix.det_updateCol_smul theorem
(M : Matrix n n R) (j : n) (s : R) (u : n → R) : det (updateCol M j <| s • u) = s * det (updateCol M j u)
Full source
theorem det_updateCol_smul (M : Matrix n n R) (j : n) (s : R) (u : n → R) :
    det (updateCol M j <| s • u) = s * det (updateCol M j u) := by
  rw [← det_transpose, ← updateRow_transpose, det_updateRow_smul]
  simp [updateRow_transpose, det_transpose]
Determinant Scaling under Column Update: $\det(M_{j \mapsto s u}) = s \det(M_{j \mapsto u})$
Informal description
Let $M$ be an $n \times n$ matrix over a ring $R$, let $j$ be a column index, $s$ a scalar in $R$, and $u$ a column vector in $R^n$. Then the determinant of the matrix obtained by updating the $j$-th column of $M$ to $s \cdot u$ satisfies: \[ \det(M \text{ with column } j \text{ replaced by } s \cdot u) = s \cdot \det(M \text{ with column } j \text{ replaced by } u). \]
Matrix.det_updateRow_smul_left theorem
(M : Matrix n n R) (j : n) (s : R) (u : n → R) : det (updateRow (s • M) j u) = s ^ (Fintype.card n - 1) * det (updateRow M j u)
Full source
theorem det_updateRow_smul_left (M : Matrix n n R) (j : n) (s : R) (u : n → R) :
    det (updateRow (s • M) j u) = s ^ (Fintype.card n - 1) * det (updateRow M j u) :=
  MultilinearMap.map_update_smul_left _ M j s u
Determinant of Scaled Matrix with Row Update: $\det(sM \text{ with row } j \text{ replaced by } u) = s^{n-1} \det(M \text{ with row } j \text{ replaced by } u)$
Informal description
Let $M$ be an $n \times n$ matrix over a ring $R$, let $j$ be a row index, $s$ a scalar in $R$, and $u$ a row vector in $R^n$. Then the determinant of the matrix obtained by scaling $M$ by $s$ and then updating its $j$-th row to $u$ satisfies: \[ \det(\text{updateRow}(s \cdot M, j, u)) = s^{n-1} \cdot \det(\text{updateRow}(M, j, u)) \] where $n$ is the size of the matrix (given by `Fintype.card n`).
Matrix.det_updateCol_smul_left theorem
(M : Matrix n n R) (j : n) (s : R) (u : n → R) : det (updateCol (s • M) j u) = s ^ (Fintype.card n - 1) * det (updateCol M j u)
Full source
theorem det_updateCol_smul_left (M : Matrix n n R) (j : n) (s : R) (u : n → R) :
    det (updateCol (s • M) j u) = s ^ (Fintype.card n - 1) * det (updateCol M j u) := by
  rw [← det_transpose, ← updateRow_transpose, transpose_smul, det_updateRow_smul_left]
  simp [updateRow_transpose, det_transpose]
Determinant of Scaled Matrix with Column Update: $\det(sM \text{ with column } j \text{ replaced by } u) = s^{n-1} \det(M \text{ with column } j \text{ replaced by } u)$
Informal description
Let $M$ be an $n \times n$ matrix over a ring $R$, let $j$ be a column index, $s$ a scalar in $R$, and $u$ a column vector in $R^n$. Then the determinant of the matrix obtained by scaling $M$ by $s$ and then updating its $j$-th column to $u$ satisfies: \[ \det(\text{updateCol}(s \cdot M, j, u)) = s^{n-1} \cdot \det(\text{updateCol}(M, j, u)) \] where $n$ is the size of the matrix (given by the cardinality of the index type $n$).
Matrix.det_updateRow_sum_aux theorem
(M : Matrix n n R) {j : n} (s : Finset n) (hj : j ∉ s) (c : n → R) (a : R) : (M.updateRow j (a • M j + ∑ k ∈ s, (c k) • M k)).det = a • M.det
Full source
theorem det_updateRow_sum_aux (M : Matrix n n R) {j : n} (s : Finset n) (hj : j ∉ s) (c : n → R)
    (a : R) :
    (M.updateRow j (a • M j + ∑ k ∈ s, (c k) • M k)).det = a • M.det := by
  induction s using Finset.induction_on with
  | empty => rw [Finset.sum_empty, add_zero, smul_eq_mul, det_updateRow_smul, updateRow_eq_self]
  | insert k _ hk h_ind =>
      have h : k ≠ j := fun h ↦ (h ▸ hj) (Finset.mem_insert_self _ _)
      rw [Finset.sum_insert hk, add_comm ((c k) • M k), ← add_assoc, det_updateRow_add,
        det_updateRow_smul, det_updateRow_eq_zero h, mul_zero, add_zero, h_ind]
      exact fun h ↦ hj (Finset.mem_insert_of_mem h)
Determinant Preservation under Row Linear Combination (Auxiliary Case)
Informal description
Let $M$ be an $n \times n$ matrix over a ring $R$, let $j$ be a row index, $s$ a finite set of row indices not containing $j$, $c : n \to R$ a coefficient function, and $a \in R$ a scalar. Then the determinant of the matrix obtained by updating the $j$-th row of $M$ to $a \cdot M_j + \sum_{k \in s} c_k \cdot M_k$ satisfies: \[ \det(M \text{ with row } j \text{ replaced by } a \cdot M_j + \sum_{k \in s} c_k \cdot M_k) = a \cdot \det(M). \]
Matrix.det_updateRow_sum theorem
(A : Matrix n n R) (j : n) (c : n → R) : (A.updateRow j (∑ k, (c k) • A k)).det = (c j) • A.det
Full source
/-- If we replace a row of a matrix by a linear combination of its rows, then the determinant is
multiplied by the coefficient of that row. -/
theorem det_updateRow_sum (A : Matrix n n R) (j : n) (c : n → R) :
    (A.updateRow j (∑ k, (c k) • A k)).det = (c j) • A.det := by
  convert det_updateRow_sum_aux A (Finset.univ.erase j) (Finset.univ.not_mem_erase j) c (c j)
  rw [← Finset.univ.add_sum_erase _ (Finset.mem_univ j)]
Determinant Scaling under Row Linear Combination
Informal description
Let $A$ be an $n \times n$ matrix over a ring $R$, and let $j$ be a row index. For any function $c : n \to R$, the determinant of the matrix obtained by replacing the $j$-th row of $A$ with the linear combination $\sum_k c_k \cdot A_k$ is equal to $c_j \cdot \det(A)$. In other words, \[ \det\left(A \text{ with row } j \text{ replaced by } \sum_k c_k \cdot A_k\right) = c_j \cdot \det(A). \]
Matrix.det_updateCol_sum theorem
(A : Matrix n n R) (j : n) (c : n → R) : (A.updateCol j (fun k ↦ ∑ i, (c i) • A k i)).det = (c j) • A.det
Full source
/-- If we replace a column of a matrix by a linear combination of its columns, then the determinant
is multiplied by the coefficient of that column. -/
theorem det_updateCol_sum (A : Matrix n n R) (j : n) (c : n → R) :
    (A.updateCol j (fun k ↦ ∑ i, (c i) • A k i)).det = (c j) • A.det := by
  rw [← det_transpose, ← updateRow_transpose, ← det_transpose A]
  convert det_updateRow_sum A.transpose j c
  simp only [smul_eq_mul, Finset.sum_apply, Pi.smul_apply, transpose_apply]
Determinant Scaling under Column Linear Combination
Informal description
Let $A$ be an $n \times n$ matrix over a ring $R$, and let $j$ be a column index. For any function $c : n \to R$, the determinant of the matrix obtained by replacing the $j$-th column of $A$ with the linear combination $\sum_i c_i \cdot A_{k,i}$ (for each row index $k$) is equal to $c_j \cdot \det(A)$. In other words, \[ \det\left(A \text{ with column } j \text{ replaced by } \left(\sum_i c_i \cdot A_{k,i}\right)_{k=1}^n\right) = c_j \cdot \det(A). \]
Matrix.det_eq_of_eq_mul_det_one theorem
{A B : Matrix n n R} (C : Matrix n n R) (hC : det C = 1) (hA : A = B * C) : det A = det B
Full source
theorem det_eq_of_eq_mul_det_one {A B : Matrix n n R} (C : Matrix n n R) (hC : det C = 1)
    (hA : A = B * C) : det A = det B :=
  calc
    det A = det (B * C) := congr_arg _ hA
    _ = det B * det C := det_mul _ _
    _ = det B := by rw [hC, mul_one]
Determinant Invariance under Right Multiplication by Unit Determinant Matrix: $\det(A) = \det(B)$ when $A = B \cdot C$ and $\det(C) = 1$
Informal description
Let $A$ and $B$ be $n \times n$ matrices over a ring $R$, and let $C$ be an $n \times n$ matrix with $\det(C) = 1$. If $A = B \cdot C$, then $\det(A) = \det(B)$.
Matrix.det_eq_of_eq_det_one_mul theorem
{A B : Matrix n n R} (C : Matrix n n R) (hC : det C = 1) (hA : A = C * B) : det A = det B
Full source
theorem det_eq_of_eq_det_one_mul {A B : Matrix n n R} (C : Matrix n n R) (hC : det C = 1)
    (hA : A = C * B) : det A = det B :=
  calc
    det A = det (C * B) := congr_arg _ hA
    _ = det C * det B := det_mul _ _
    _ = det B := by rw [hC, one_mul]
Determinant Invariance under Left Multiplication by Matrix with Unit Determinant: $\det(CB) = \det(B)$ when $\det(C) = 1$
Informal description
For any $n \times n$ matrices $A, B, C$ over a ring $R$, if $\det(C) = 1$ and $A = C \cdot B$, then $\det(A) = \det(B)$.
Matrix.det_updateRow_add_self theorem
(A : Matrix n n R) {i j : n} (hij : i ≠ j) : det (updateRow A i (A i + A j)) = det A
Full source
theorem det_updateRow_add_self (A : Matrix n n R) {i j : n} (hij : i ≠ j) :
    det (updateRow A i (A i + A j)) = det A := by
  simp [det_updateRow_add,
    det_zero_of_row_eq hij (updateRow_self.trans (updateRow_ne hij.symm).symm)]
Determinant Invariance under Row Addition: $\det(A_{i \mapsto A_i + A_j}) = \det(A)$ for $i \neq j$
Informal description
Let $A$ be an $n \times n$ matrix over a ring $R$, and let $i$ and $j$ be distinct indices ($i \neq j$). Then the determinant of the matrix obtained by adding the $j$-th row to the $i$-th row of $A$ is equal to the determinant of $A$, i.e., \[ \det(A \text{ with row } i \text{ replaced by } A_i + A_j) = \det(A). \]
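
A usage sketch on a $2 \times 2$ matrix, adding the second row to the first; the index inequality `(0 : Fin 2) ≠ 1` is discharged by `decide`:

example (A : Matrix (Fin 2) (Fin 2) ℤ) :
    Matrix.det (A.updateRow 0 (A 0 + A 1)) = A.det :=
  Matrix.det_updateRow_add_self A (by decide)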
Matrix.det_updateCol_add_self theorem
(A : Matrix n n R) {i j : n} (hij : i ≠ j) : det (updateCol A i fun k => A k i + A k j) = det A
Full source
theorem det_updateCol_add_self (A : Matrix n n R) {i j : n} (hij : i ≠ j) :
    det (updateCol A i fun k => A k i + A k j) = det A := by
  rw [← det_transpose, ← updateRow_transpose, ← det_transpose A]
  exact det_updateRow_add_self Aᵀ hij
Determinant Invariance under Column Addition: $\det(A_{\bullet i \mapsto A_{\bullet i} + A_{\bullet j}}) = \det(A)$ for $i \neq j$
Informal description
Let $A$ be an $n \times n$ matrix over a ring $R$, and let $i$ and $j$ be distinct column indices ($i \neq j$). Then the determinant of the matrix obtained by adding the $j$-th column to the $i$-th column of $A$ is equal to the determinant of $A$, i.e., \[ \det(A \text{ with column } i \text{ replaced by } A_{\bullet i} + A_{\bullet j}) = \det(A). \]
Matrix.det_updateRow_add_smul_self theorem
(A : Matrix n n R) {i j : n} (hij : i ≠ j) (c : R) : det (updateRow A i (A i + c • A j)) = det A
Full source
theorem det_updateRow_add_smul_self (A : Matrix n n R) {i j : n} (hij : i ≠ j) (c : R) :
    det (updateRow A i (A i + c • A j)) = det A := by
  simp [det_updateRow_add, det_updateRow_smul,
    det_zero_of_row_eq hij (updateRow_self.trans (updateRow_ne hij.symm).symm)]
Determinant Invariance under Row Addition: $\det(A_{i \mapsto A_i + c A_j}) = \det(A)$
Informal description
Let $A$ be an $n \times n$ matrix over a ring $R$, and let $i$ and $j$ be distinct row indices (i.e., $i \neq j$). For any scalar $c \in R$, the determinant of the matrix obtained by adding $c$ times the $j$-th row to the $i$-th row of $A$ is equal to the determinant of $A$, i.e., \[ \det(A \text{ with row } i \text{ replaced by } A_i + c A_j) = \det(A). \]
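
Concretely, for a $2 \times 2$ matrix this is the familiar elementary row operation of adding $c$ times the second row to the first: \[ \det\begin{pmatrix} a + c e & b + c f \\ e & f \end{pmatrix} = (a + c e) f - (b + c f) e = a f - b e = \det\begin{pmatrix} a & b \\ e & f \end{pmatrix}. \]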
Matrix.det_updateCol_add_smul_self theorem
(A : Matrix n n R) {i j : n} (hij : i ≠ j) (c : R) : det (updateCol A i fun k => A k i + c • A k j) = det A
Full source
theorem det_updateCol_add_smul_self (A : Matrix n n R) {i j : n} (hij : i ≠ j) (c : R) :
    det (updateCol A i fun k => A k i + c • A k j) = det A := by
  rw [← det_transpose, ← updateRow_transpose, ← det_transpose A]
  exact det_updateRow_add_smul_self Aᵀ hij c
Determinant Invariance under Column Addition: $\det(A_{\cdot i \mapsto A_{\cdot i} + c A_{\cdot j}}) = \det(A)$
Informal description
Let $A$ be an $n \times n$ matrix over a ring $R$, and let $i$ and $j$ be distinct column indices (i.e., $i \neq j$). For any scalar $c \in R$, the determinant of the matrix obtained by adding $c$ times the $j$-th column to the $i$-th column of $A$ is equal to the determinant of $A$, i.e., \[ \det(A \text{ with column } i \text{ replaced by } A_{\cdot i} + c A_{\cdot j}) = \det(A). \]
Matrix.linearIndependent_rows_of_det_ne_zero theorem
[IsDomain R] {A : Matrix m m R} (hA : A.det ≠ 0) : LinearIndependent R A.row
Full source
theorem linearIndependent_rows_of_det_ne_zero [IsDomain R] {A : Matrix m m R} (hA : A.det ≠ 0) :
    LinearIndependent R A.row := by
  rw [row_def]
  contrapose! hA
  obtain ⟨c, hc0, i, hci⟩ := Fintype.not_linearIndependent_iff.1 hA
  have h0 := A.det_updateRow_sum i c
  rwa [det_eq_zero_of_row_eq_zero (i := i) (fun j ↦ by simp [hc0]), smul_eq_mul, eq_comm,
    mul_eq_zero_iff_left hci] at h0
Nonzero Determinant Implies Linear Independence of Rows
Informal description
Let $R$ be an integral domain and $A$ be an $m \times m$ matrix with entries in $R$. If the determinant of $A$ is nonzero, then the rows of $A$ are linearly independent vectors over $R$.
Matrix.linearIndependent_cols_of_det_ne_zero theorem
[IsDomain R] {A : Matrix m m R} (hA : A.det ≠ 0) : LinearIndependent R A.col
Full source
theorem linearIndependent_cols_of_det_ne_zero [IsDomain R] {A : Matrix m m R} (hA : A.det ≠ 0) :
    LinearIndependent R A.col :=
  Matrix.linearIndependent_rows_of_det_ne_zero (by simpa)
Nonzero Determinant Implies Linear Independence of Columns
Informal description
Let $R$ be an integral domain and $A$ be an $m \times m$ matrix with entries in $R$. If the determinant of $A$ is nonzero, then the columns of $A$ are linearly independent vectors over $R$.
Matrix.det_eq_of_forall_row_eq_smul_add_const_aux theorem
{A B : Matrix n n R} {s : Finset n} : ∀ (c : n → R) (_ : ∀ i, i ∉ s → c i = 0) (k : n) (_ : k ∉ s) (_ : ∀ i j, A i j = B i j + c i * B k j), det A = det B
Full source
theorem det_eq_of_forall_row_eq_smul_add_const_aux {A B : Matrix n n R} {s : Finset n} :
    ∀ (c : n → R) (_ : ∀ i, i ∉ s → c i = 0) (k : n) (_ : k ∉ s)
      (_ : ∀ i j, A i j = B i j + c i * B k j), det A = det B := by
  induction s using Finset.induction_on generalizing B with
  | empty =>
    rintro c hs k - A_eq
    have : ∀ i, c i = 0 := by
      intro i
      specialize hs i
      contrapose! hs
      simp [hs]
    congr
    ext i j
    rw [A_eq, this, zero_mul, add_zero]
  | insert i s _hi ih =>
    intro c hs k hk A_eq
    have hAi : A i = B i + c i • B k := funext (A_eq i)
    rw [@ih (updateRow B i (A i)) (Function.update c i 0), hAi, det_updateRow_add_smul_self]
    · exact mt (fun h => show k ∈ insert i s from h ▸ Finset.mem_insert_self _ _) hk
    · intro i' hi'
      rw [Function.update_apply]
      split_ifs with hi'i
      · rfl
      · exact hs i' fun h => hi' ((Finset.mem_insert.mp h).resolve_left hi'i)
    · exact k
    · exact fun h => hk (Finset.mem_insert_of_mem h)
    · intro i' j'
      rw [updateRow_apply, Function.update_apply]
      split_ifs with hi'i
      · simp [hi'i]
      rw [A_eq, updateRow_ne fun h : k = i => hk <| h ▸ Finset.mem_insert_self k s]
Determinant Invariance under Row Operations with Constraint on Coefficients
Informal description
Let $A$ and $B$ be $n \times n$ matrices over a ring $R$, and let $s$ be a finite set of row indices. Suppose there exists a vector $c \colon n \to R$ such that: 1. $c_i = 0$ for all $i \notin s$, 2. There exists an index $k \notin s$, 3. For all $i, j$, the entries satisfy $A_{i,j} = B_{i,j} + c_i \cdot B_{k,j}$. Then the determinants of $A$ and $B$ are equal, i.e., $\det(A) = \det(B)$.
Matrix.det_eq_of_forall_row_eq_smul_add_const theorem
{A B : Matrix n n R} (c : n → R) (k : n) (hk : c k = 0) (A_eq : ∀ i j, A i j = B i j + c i * B k j) : det A = det B
Full source
/-- If you add multiples of row `B k` to other rows, the determinant doesn't change. -/
theorem det_eq_of_forall_row_eq_smul_add_const {A B : Matrix n n R} (c : n → R) (k : n)
    (hk : c k = 0) (A_eq : ∀ i j, A i j = B i j + c i * B k j) : det A = det B :=
  det_eq_of_forall_row_eq_smul_add_const_aux c
    (fun i =>
      not_imp_comm.mp fun hi =>
        Finset.mem_erase.mpr
          ⟨mt (fun h : i = k => show c i = 0 from h.symm ▸ hk) hi, Finset.mem_univ i⟩)
    k (Finset.not_mem_erase k Finset.univ) A_eq
Determinant Invariance under Row Addition of Scaled Multiple
Informal description
Let $A$ and $B$ be $n \times n$ matrices over a ring $R$, and let $c \colon n \to R$ be a function such that $c(k) = 0$ for some fixed index $k$. If for all $i, j$ the entries of $A$ satisfy $A_{i,j} = B_{i,j} + c_i \cdot B_{k,j}$, then the determinants of $A$ and $B$ are equal, i.e., $\det(A) = \det(B)$.
Matrix.det_eq_of_forall_row_eq_smul_add_pred_aux theorem
{n : ℕ} (k : Fin (n + 1)) : ∀ (c : Fin n → R) (_hc : ∀ i : Fin n, k < i.succ → c i = 0) {M N : Matrix (Fin n.succ) (Fin n.succ) R} (_h0 : ∀ j, M 0 j = N 0 j) (_hsucc : ∀ (i : Fin n) (j), M i.succ j = N i.succ j + c i * M (Fin.castSucc i) j), det M = det N
Full source
theorem det_eq_of_forall_row_eq_smul_add_pred_aux {n : ℕ} (k : Fin (n + 1)) :
    ∀ (c : Fin n → R) (_hc : ∀ i : Fin n, k < i.succ → c i = 0)
      {M N : Matrix (Fin n.succ) (Fin n.succ) R} (_h0 : ∀ j, M 0 j = N 0 j)
      (_hsucc : ∀ (i : Fin n) (j), M i.succ j = N i.succ j + c i * M (Fin.castSucc i) j),
      det M = det N := by
  refine Fin.induction ?_ (fun k ih => ?_) k <;> intro c hc M N h0 hsucc
  · congr
    ext i j
    refine Fin.cases (h0 j) (fun i => ?_) i
    rw [hsucc, hc i (Fin.succ_pos _), zero_mul, add_zero]
  set M' := updateRow M k.succ (N k.succ) with hM'
  have hM : M = updateRow M' k.succ (M' k.succ + c k • M (Fin.castSucc k)) := by
    ext i j
    by_cases hi : i = k.succ
    · simp [hi, hM', hsucc, updateRow_self]
    rw [updateRow_ne hi, hM', updateRow_ne hi]
  have k_ne_succ : (Fin.castSucc k) ≠ k.succ := (Fin.castSucc_lt_succ k).ne
  have M_k : M (Fin.castSucc k) = M' (Fin.castSucc k) := (updateRow_ne k_ne_succ).symm
  rw [hM, M_k, det_updateRow_add_smul_self M' k_ne_succ.symm, ih (Function.update c k 0)]
  · intro i hi
    rw [Fin.lt_iff_val_lt_val, Fin.coe_castSucc, Fin.val_succ, Nat.lt_succ_iff] at hi
    rw [Function.update_apply]
    split_ifs with hik
    · rfl
    exact hc _ (Fin.succ_lt_succ_iff.mpr (lt_of_le_of_ne hi (Ne.symm hik)))
  · rwa [hM', updateRow_ne (Fin.succ_ne_zero _).symm]
  intro i j
  rw [Function.update_apply]
  split_ifs with hik
  · rw [zero_mul, add_zero, hM', hik, updateRow_self]
  rw [hM', updateRow_ne ((Fin.succ_injective _).ne hik), hsucc]
  by_cases hik2 : k < i
  · simp [hc i (Fin.succ_lt_succ_iff.mpr hik2)]
  rw [updateRow_ne]
  apply ne_of_lt
  rwa [Fin.lt_iff_val_lt_val, Fin.coe_castSucc, Fin.val_succ, Nat.lt_succ_iff, ← not_lt]
Determinant Invariance under Row Operations with Predicate Condition
Informal description
Let $n$ be a natural number and $k$ be an element of $\text{Fin}(n+1)$. For any function $c \colon \text{Fin}(n) \to R$ such that $c(i) = 0$ whenever $k < i+1$, and for any two $(n+1) \times (n+1)$ matrices $M$ and $N$ over a ring $R$ satisfying: 1. $M_{0j} = N_{0j}$ for all $j$, and 2. $M_{(i+1)j} = N_{(i+1)j} + c(i) \cdot M_{(\text{castSucc } i)j}$ for all $i \in \text{Fin}(n)$ and all $j$, we have $\det(M) = \det(N)$.
Matrix.det_eq_of_forall_row_eq_smul_add_pred theorem
{n : ℕ} {A B : Matrix (Fin (n + 1)) (Fin (n + 1)) R} (c : Fin n → R) (A_zero : ∀ j, A 0 j = B 0 j) (A_succ : ∀ (i : Fin n) (j), A i.succ j = B i.succ j + c i * A (Fin.castSucc i) j) : det A = det B
Full source
/-- If you add multiples of previous rows to the next row, the determinant doesn't change. -/
theorem det_eq_of_forall_row_eq_smul_add_pred {n : ℕ} {A B : Matrix (Fin (n + 1)) (Fin (n + 1)) R}
    (c : Fin n → R) (A_zero : ∀ j, A 0 j = B 0 j)
    (A_succ : ∀ (i : Fin n) (j), A i.succ j = B i.succ j + c i * A (Fin.castSucc i) j) :
    det A = det B :=
  det_eq_of_forall_row_eq_smul_add_pred_aux (Fin.last _) c
    (fun _ hi => absurd hi (not_lt_of_ge (Fin.le_last _))) A_zero A_succ
Determinant Invariance under Row Addition of Scaled Previous Rows
Informal description
Let $n$ be a natural number and $A, B$ be $(n+1) \times (n+1)$ matrices over a ring $R$. Given coefficients $c \colon \text{Fin}(n) \to R$, if: 1. The first rows of $A$ and $B$ are identical, i.e., $A_{0j} = B_{0j}$ for all $j$, and 2. For each subsequent row $i+1$ (where $i \in \text{Fin}(n)$), the entries satisfy $A_{(i+1)j} = B_{(i+1)j} + c_i \cdot A_{ij}$ for all $j$, then the determinants of $A$ and $B$ are equal, i.e., $\det(A) = \det(B)$.
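For illustration (an informal remark, not from the source), take $n = 2$ and write $A_0, A_1, A_2$ and $B_0, B_1, B_2$ for the rows of $A$ and $B$. The hypotheses say $A_0 = B_0$, $A_1 = B_1 + c_0 A_0$ and $A_2 = B_2 + c_1 A_1$, so each row of $A$ is the corresponding row of $B$ plus a multiple of the already-modified previous row; this is the kind of elementary row operation used in Gaussian elimination, and the lemma records that it leaves the determinant unchanged.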
Matrix.det_eq_of_forall_col_eq_smul_add_pred theorem
{n : ℕ} {A B : Matrix (Fin (n + 1)) (Fin (n + 1)) R} (c : Fin n → R) (A_zero : ∀ i, A i 0 = B i 0) (A_succ : ∀ (i) (j : Fin n), A i j.succ = B i j.succ + c j * A i (Fin.castSucc j)) : det A = det B
Full source
/-- If you add multiples of previous columns to the next columns, the determinant doesn't change. -/
theorem det_eq_of_forall_col_eq_smul_add_pred {n : ℕ} {A B : Matrix (Fin (n + 1)) (Fin (n + 1)) R}
    (c : Fin n → R) (A_zero : ∀ i, A i 0 = B i 0)
    (A_succ : ∀ (i) (j : Fin n), A i j.succ = B i j.succ + c j * A i (Fin.castSucc j)) :
    det A = det B := by
  rw [← det_transpose A, ← det_transpose B]
  exact det_eq_of_forall_row_eq_smul_add_pred c A_zero fun i j => A_succ j i
Determinant Invariance under Column Addition of Scaled Previous Columns
Informal description
Let $n$ be a natural number and $A, B$ be $(n+1) \times (n+1)$ matrices over a ring $R$. Given coefficients $c \colon \text{Fin}(n) \to R$, if: 1. The first columns of $A$ and $B$ are identical, i.e., $A_{i0} = B_{i0}$ for all $i$, and 2. For each subsequent column $j+1$ (where $j \in \text{Fin}(n)$), the entries satisfy $A_{i(j+1)} = B_{i(j+1)} + c_j \cdot A_{i(\text{castSucc } j)}$ for all $i$, then the determinants of $A$ and $B$ are equal, i.e., $\det(A) = \det(B)$.
Matrix.det_blockDiagonal theorem
{o : Type*} [Fintype o] [DecidableEq o] (M : o → Matrix n n R) : (blockDiagonal M).det = ∏ k, (M k).det
Full source
@[simp]
theorem det_blockDiagonal {o : Type*} [Fintype o] [DecidableEq o] (M : o → Matrix n n R) :
    (blockDiagonal M).det = ∏ k, (M k).det := by
  -- Rewrite the determinants as a sum over permutations.
  simp_rw [det_apply']
  -- The right hand side is a product of sums, rewrite it as a sum of products.
  rw [Finset.prod_sum]
  simp_rw [Finset.prod_attach_univ, Finset.univ_pi_univ]
  -- We claim that the only permutations contributing to the sum are those that
  -- preserve their second component.
  let preserving_snd : Finset (Equiv.Perm (n × o)) := {σ | ∀ x, (σ x).snd = x.snd}
  have mem_preserving_snd :
    ∀ {σ : Equiv.Perm (n × o)}, σ ∈ preserving_snd ↔ ∀ x, (σ x).snd = x.snd := fun {σ} =>
    Finset.mem_filter.trans ⟨fun h => h.2, fun h => ⟨Finset.mem_univ _, h⟩⟩
  rw [← Finset.sum_subset (Finset.subset_univ preserving_snd) _]
  -- And that these are in bijection with `o → Equiv.Perm m`.
  · refine (Finset.sum_bij (fun σ _ => prodCongrLeft fun k ↦ σ k (mem_univ k)) ?_ ?_ ?_ ?_).symm
    · intro σ _
      rw [mem_preserving_snd]
      rintro ⟨-, x⟩
      simp only [prodCongrLeft_apply]
    · intro σ _ σ' _ eq
      ext x hx k
      simp only at eq
      have :
        ∀ k x,
          prodCongrLeft (fun k => σ k (Finset.mem_univ _)) (k, x) =
            prodCongrLeft (fun k => σ' k (Finset.mem_univ _)) (k, x) :=
        fun k x => by rw [eq]
      simp only [prodCongrLeft_apply, Prod.mk_inj] at this
      exact (this k x).1
    · intro σ hσ
      rw [mem_preserving_snd] at hσ
      have hσ' : ∀ x, (σ⁻¹ x).snd = x.snd := by
        intro x
        conv_rhs => rw [← Perm.apply_inv_self σ x, hσ]
      have mk_apply_eq : ∀ k x, ((σ (x, k)).fst, k) = σ (x, k) := by
        intro k x
        ext
        · simp only
        · simp only [hσ]
      have mk_inv_apply_eq : ∀ k x, ((σ⁻¹ (x, k)).fst, k) = σ⁻¹ (x, k) := by
        intro k x
        conv_lhs => rw [← Perm.apply_inv_self σ (x, k)]
        ext
        · simp only [apply_inv_self]
        · simp only [hσ']
      refine ⟨fun k _ => ⟨fun x => (σ (x, k)).fst, fun x => (σ⁻¹ (x, k)).fst, ?_, ?_⟩, ?_, ?_⟩
      · intro x
        simp only [mk_apply_eq, inv_apply_self]
      · intro x
        simp only [mk_inv_apply_eq, apply_inv_self]
      · apply Finset.mem_univ
      · ext ⟨k, x⟩
        · simp only [coe_fn_mk, prodCongrLeft_apply]
        · simp only [prodCongrLeft_apply, hσ]
    · intro σ _
      rw [Finset.prod_mul_distrib, ← Finset.univ_product_univ, Finset.prod_product_right]
      simp only [sign_prodCongrLeft, Units.coe_prod, Int.cast_prod, blockDiagonal_apply_eq,
        prodCongrLeft_apply]
  · intro σ _ hσ
    rw [mem_preserving_snd] at hσ
    obtain ⟨⟨k, x⟩, hkx⟩ := not_forall.mp hσ
    rw [Finset.prod_eq_zero (Finset.mem_univ (k, x)), mul_zero]
    rw [blockDiagonal_apply_ne]
    exact hkx
Determinant of Block Diagonal Matrix Equals Product of Block Determinants
Informal description
Let $o$ be a finite type with decidable equality, and let $M \colon o \to \text{Matrix}(n, n, R)$ be a family of $n \times n$ matrices over a ring $R$. The determinant of the block diagonal matrix formed by $M$ is equal to the product of the determinants of the individual matrices $M(k)$ for each $k \in o$, i.e., \[ \det(\text{blockDiagonal}(M)) = \prod_{k \in o} \det(M(k)). \]
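A simple special case (an informal remark, not from the source): if $n$ has a single element, each block $M(k)$ is a $1 \times 1$ matrix $(d_k)$, the block diagonal matrix is the diagonal matrix with diagonal entries $d_k$, and the statement reduces to the familiar fact that the determinant of a diagonal matrix is the product of its diagonal entries, $\prod_{k} d_k$.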
Matrix.det_fromBlocks_zero₂₁ theorem
(A : Matrix m m R) (B : Matrix m n R) (D : Matrix n n R) : (Matrix.fromBlocks A B 0 D).det = A.det * D.det
Full source
/-- The determinant of a 2×2 block matrix with the lower-left block equal to zero is the product of
the determinants of the diagonal blocks. For the generalization to any number of blocks, see
`Matrix.det_of_upperTriangular`. -/
@[simp]
theorem det_fromBlocks_zero₂₁ (A : Matrix m m R) (B : Matrix m n R) (D : Matrix n n R) :
    (Matrix.fromBlocks A B 0 D).det = A.det * D.det := by
  classical
    simp_rw [det_apply']
    convert Eq.symm <|
      sum_subset (M := R) (subset_univ ((sumCongrHom m n).range : Set (Perm (m ⊕ n))).toFinset) ?_
    · simp_rw [sum_mul_sum, ← sum_product', univ_product_univ]
      refine sum_nbij (fun σ ↦ σ.fst.sumCongr σ.snd) ?_ ?_ ?_ ?_
      · intro σ₁₂ _
        simp
      · intro σ₁ _ σ₂ _
        dsimp only
        intro h
        have h2 : ∀ x, Perm.sumCongr σ₁.fst σ₁.snd x = Perm.sumCongr σ₂.fst σ₂.snd x :=
          DFunLike.congr_fun h
        simp only [Sum.map_inr, Sum.map_inl, Perm.sumCongr_apply, Sum.forall, Sum.inl.injEq,
          Sum.inr.injEq] at h2
        ext x
        · exact h2.left x
        · exact h2.right x
      · intro σ hσ
        rw [mem_coe, Set.mem_toFinset] at hσ
        obtain ⟨σ₁₂, hσ₁₂⟩ := hσ
        use σ₁₂
        rw [← hσ₁₂]
        simp
      · simp only [forall_prop_of_true, Prod.forall, mem_univ]
        intro σ₁ σ₂
        rw [Fintype.prod_sum_type]
        simp_rw [Equiv.sumCongr_apply, Sum.map_inr, Sum.map_inl, fromBlocks_apply₁₁,
          fromBlocks_apply₂₂]
        rw [mul_mul_mul_comm]
        congr
        rw [sign_sumCongr, Units.val_mul, Int.cast_mul]
    · rintro σ - hσn
      have h1 : ¬∀ x, ∃ y, Sum.inl y = σ (Sum.inl x) := by
        rw [Set.mem_toFinset] at hσn
        simpa only [Set.MapsTo, Set.mem_range, forall_exists_index, forall_apply_eq_imp_iff] using
          mt mem_sumCongrHom_range_of_perm_mapsTo_inl hσn
      obtain ⟨a, ha⟩ := not_forall.mp h1
      rcases hx : σ (Sum.inl a) with a2 | b
      · have hn := (not_exists.mp ha) a2
        exact absurd hx.symm hn
      · rw [Finset.prod_eq_zero (Finset.mem_univ (Sum.inl a)), mul_zero]
        rw [hx, fromBlocks_apply₂₁, zero_apply]
Determinant of Block Matrix with Lower-Left Zero Block: $\det \begin{pmatrix} A & B \\ 0 & D \end{pmatrix} = \det A \cdot \det D$
Informal description
Let $A$ be an $m \times m$ matrix, $B$ an $m \times n$ matrix, and $D$ an $n \times n$ matrix, all with entries in a ring $R$. The determinant of the block matrix \[ \begin{pmatrix} A & B \\ 0 & D \end{pmatrix} \] is equal to the product of the determinants of $A$ and $D$, i.e., \[ \det \begin{pmatrix} A & B \\ 0 & D \end{pmatrix} = \det A \cdot \det D. \]
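For instance (an informal special case, not from the source), with $1 \times 1$ blocks this reduces to \[ \det \begin{pmatrix} a & b \\ 0 & d \end{pmatrix} = a \cdot d, \] in agreement with the $2 \times 2$ formula $ad - b \cdot 0$ given by `Matrix.det_fin_two_of` below.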
Matrix.det_fromBlocks_zero₁₂ theorem
(A : Matrix m m R) (C : Matrix n m R) (D : Matrix n n R) : (Matrix.fromBlocks A 0 C D).det = A.det * D.det
Full source
/-- The determinant of a 2×2 block matrix with the upper-right block equal to zero is the product of
the determinants of the diagonal blocks. For the generalization to any number of blocks, see
`Matrix.det_of_lowerTriangular`. -/
@[simp]
theorem det_fromBlocks_zero₁₂ (A : Matrix m m R) (C : Matrix n m R) (D : Matrix n n R) :
    (Matrix.fromBlocks A 0 C D).det = A.det * D.det := by
  rw [← det_transpose, fromBlocks_transpose, transpose_zero, det_fromBlocks_zero₂₁, det_transpose,
    det_transpose]
Determinant of Block Matrix with Upper-Right Zero Block: $\det \begin{pmatrix} A & 0 \\ C & D \end{pmatrix} = \det A \cdot \det D$
Informal description
Let $A$ be an $m \times m$ matrix, $C$ an $n \times m$ matrix, and $D$ an $n \times n$ matrix, all with entries in a ring $R$. The determinant of the block matrix \[ \begin{pmatrix} A & 0 \\ C & D \end{pmatrix} \] is equal to the product of the determinants of $A$ and $D$, i.e., \[ \det \begin{pmatrix} A & 0 \\ C & D \end{pmatrix} = \det A \cdot \det D. \]
Matrix.det_succ_column_zero theorem
{n : ℕ} (A : Matrix (Fin n.succ) (Fin n.succ) R) : det A = ∑ i : Fin n.succ, (-1) ^ (i : ℕ) * A i 0 * det (A.submatrix i.succAbove Fin.succ)
Full source
/-- Laplacian expansion of the determinant of an `n+1 × n+1` matrix along column 0. -/
theorem det_succ_column_zero {n : ℕ} (A : Matrix (Fin n.succ) (Fin n.succ) R) :
    det A = ∑ i : Fin n.succ, (-1) ^ (i : ℕ) * A i 0 * det (A.submatrix i.succAbove Fin.succ) := by
  rw [Matrix.det_apply, Finset.univ_perm_fin_succ, ← Finset.univ_product_univ]
  simp only [Finset.sum_map, Equiv.toEmbedding_apply, Finset.sum_product, Matrix.submatrix]
  refine Finset.sum_congr rfl fun i _ => Fin.cases ?_ (fun i => ?_) i
  · simp only [Fin.prod_univ_succ, Matrix.det_apply, Finset.mul_sum,
      Equiv.Perm.decomposeFin_symm_apply_zero, Fin.val_zero, one_mul,
      Equiv.Perm.decomposeFin.symm_sign, Equiv.swap_self, if_true, id, eq_self_iff_true,
      Equiv.Perm.decomposeFin_symm_apply_succ, Fin.succAbove_zero, Equiv.coe_refl, pow_zero,
      mul_smul_comm, of_apply]
  -- `univ_perm_fin_succ` gives a different embedding of `Perm (Fin n)` into
  -- `Perm (Fin n.succ)` than the determinant of the submatrix we want,
  -- permute `A` so that we get the correct one.
  have : (-1 : R) ^ (i : ℕ) = (Perm.sign i.cycleRange) := by simp [Fin.sign_cycleRange]
  rw [Fin.val_succ, pow_succ', this, mul_assoc, mul_assoc, mul_left_comm (ε _),
    ← det_permute, Matrix.det_apply, Finset.mul_sum, Finset.mul_sum]
  -- now we just need to move the corresponding parts to the same place
  refine Finset.sum_congr rfl fun σ _ => ?_
  rw [Equiv.Perm.decomposeFin.symm_sign, if_neg (Fin.succ_ne_zero i)]
  calc
    ((-1 * Perm.sign σ : ℤ) • ∏ i', A (Perm.decomposeFin.symm (Fin.succ i, σ) i') i') =
        (-1 * Perm.sign σ : ℤ) • (A (Fin.succ i) 0 *
          ∏ i', A ((Fin.succ i).succAbove (Fin.cycleRange i (σ i'))) i'.succ) := by
      simp only [Fin.prod_univ_succ, Fin.succAbove_cycleRange,
        Equiv.Perm.decomposeFin_symm_apply_zero, Equiv.Perm.decomposeFin_symm_apply_succ]
    _ = -1 * (A (Fin.succ i) 0 * (Perm.sign σ : ℤ) •
        ∏ i', A ((Fin.succ i).succAbove (Fin.cycleRange i (σ i'))) i'.succ) := by
      simp [mul_assoc, mul_comm, _root_.neg_mul, one_mul, zsmul_eq_mul, neg_inj, neg_smul,
        Fin.succAbove_cycleRange, mul_left_comm]
Laplace Expansion of Determinant Along First Column: $\det(A) = \sum_{i=0}^n (-1)^i A_{i,0} \det(A_{i^c,0^c})$
Informal description
For any $(n+1) \times (n+1)$ matrix $A$ with entries in a ring $R$, the determinant of $A$ can be computed by expanding along the first column as: \[ \det(A) = \sum_{i=0}^n (-1)^i \cdot A_{i,0} \cdot \det(A_{i^c,0^c}) \] where $A_{i^c,0^c}$ denotes the submatrix obtained by removing the $i$-th row and the first column (column $0$) from $A$.
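As a consistency check (an informal remark, not from the source), for $n = 1$ the expansion along the first column reads \[ \det(A) = (-1)^0 A_{0,0} \det(A_{0^c,0^c}) + (-1)^1 A_{1,0} \det(A_{1^c,0^c}) = A_{0,0} A_{1,1} - A_{1,0} A_{0,1}, \] since the two $1 \times 1$ submatrices have single entries $A_{1,1}$ and $A_{0,1}$ respectively; this matches the $2 \times 2$ formula `Matrix.det_fin_two` below.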
Matrix.det_succ_row_zero theorem
{n : ℕ} (A : Matrix (Fin n.succ) (Fin n.succ) R) : det A = ∑ j : Fin n.succ, (-1) ^ (j : ℕ) * A 0 j * det (A.submatrix Fin.succ j.succAbove)
Full source
/-- Laplacian expansion of the determinant of an `n+1 × n+1` matrix along row 0. -/
theorem det_succ_row_zero {n : ℕ} (A : Matrix (Fin n.succ) (Fin n.succ) R) :
    det A = ∑ j : Fin n.succ, (-1) ^ (j : ℕ) * A 0 j * det (A.submatrix Fin.succ j.succAbove) := by
  rw [← det_transpose A, det_succ_column_zero]
  refine Finset.sum_congr rfl fun i _ => ?_
  rw [← det_transpose]
  simp only [transpose_apply, transpose_submatrix, transpose_transpose]
Laplace Expansion of Determinant Along First Row: $\det(A) = \sum_{j=0}^n (-1)^j A_{0,j} \det(A_{0^c,j^c})$
Informal description
For any $(n+1) \times (n+1)$ matrix $A$ with entries in a ring $R$, the determinant of $A$ can be computed by expanding along the first row as: \[ \det(A) = \sum_{j=0}^n (-1)^j \cdot A_{0,j} \cdot \det(A_{0^c,j^c}) \] where $A_{0^c,j^c}$ denotes the submatrix obtained by removing the first row (row $0$) and the $j$-th column from $A$.
Matrix.det_succ_row theorem
{n : ℕ} (A : Matrix (Fin n.succ) (Fin n.succ) R) (i : Fin n.succ) : det A = ∑ j : Fin n.succ, (-1) ^ (i + j : ℕ) * A i j * det (A.submatrix i.succAbove j.succAbove)
Full source
/-- Laplacian expansion of the determinant of an `n+1 × n+1` matrix along row `i`. -/
theorem det_succ_row {n : ℕ} (A : Matrix (Fin n.succ) (Fin n.succ) R) (i : Fin n.succ) :
    det A =
      ∑ j : Fin n.succ, (-1) ^ (i + j : ℕ) * A i j * det (A.submatrix i.succAbove j.succAbove) := by
  simp_rw [pow_add, mul_assoc, ← mul_sum]
  have : det A = (-1 : R) ^ (i : ℕ) * (Perm.sign i.cycleRange⁻¹) * det A := by
    calc
      det A = ↑((-1 : ℤˣ) ^ (i : ℕ) * (-1 : ℤˣ) ^ (i : ℕ) : ℤˣ) * det A := by simp
      _ = (-1 : R) ^ (i : ℕ) * (Perm.sign i.cycleRange⁻¹) * det A := by simp [-Int.units_mul_self]
  rw [this, mul_assoc]
  congr
  rw [← det_permute, det_succ_row_zero]
  refine Finset.sum_congr rfl fun j _ => ?_
  rw [mul_assoc, Matrix.submatrix_apply, submatrix_submatrix, id_comp, Function.comp_def, id]
  congr
  · rw [Equiv.Perm.inv_def, Fin.cycleRange_symm_zero]
  · ext i' j'
    rw [Equiv.Perm.inv_def, Matrix.submatrix_apply, Matrix.submatrix_apply,
      Fin.cycleRange_symm_succ]
Laplace Expansion of Determinant Along Any Row: $\det(A) = \sum_j (-1)^{i+j} A_{ij} \det(A_{i^c,j^c})$
Informal description
For any $(n+1) \times (n+1)$ matrix $A$ with entries in a commutative ring $R$ and any row index $i \in \{0, \dots, n\}$, the determinant of $A$ can be computed by Laplace expansion along the $i$-th row as: \[ \det(A) = \sum_{j=0}^n (-1)^{i+j} \cdot A_{ij} \cdot \det(A_{i^c,j^c}) \] where $A_{i^c,j^c}$ denotes the submatrix obtained by removing the $i$-th row and $j$-th column from $A$.
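For example (an informal sketch, not from the source), expanding a $3 \times 3$ matrix along the middle row $i = 1$ gives \[ \det(A) = -A_{1,0} \det\begin{pmatrix} A_{0,1} & A_{0,2} \\ A_{2,1} & A_{2,2} \end{pmatrix} + A_{1,1} \det\begin{pmatrix} A_{0,0} & A_{0,2} \\ A_{2,0} & A_{2,2} \end{pmatrix} - A_{1,2} \det\begin{pmatrix} A_{0,0} & A_{0,1} \\ A_{2,0} & A_{2,1} \end{pmatrix}, \] with signs following the checkerboard pattern $(-1)^{i+j}$; expanding the three $2 \times 2$ determinants recovers the six-term formula of `Matrix.det_fin_three` below.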
Matrix.det_succ_column theorem
{n : ℕ} (A : Matrix (Fin n.succ) (Fin n.succ) R) (j : Fin n.succ) : det A = ∑ i : Fin n.succ, (-1) ^ (i + j : ℕ) * A i j * det (A.submatrix i.succAbove j.succAbove)
Full source
/-- Laplacian expansion of the determinant of an `n+1 × n+1` matrix along column `j`. -/
theorem det_succ_column {n : ℕ} (A : Matrix (Fin n.succ) (Fin n.succ) R) (j : Fin n.succ) :
    det A =
      ∑ i : Fin n.succ, (-1) ^ (i + j : ℕ) * A i j * det (A.submatrix i.succAbove j.succAbove) := by
  rw [← det_transpose, det_succ_row _ j]
  refine Finset.sum_congr rfl fun i _ => ?_
  rw [add_comm, ← det_transpose, transpose_apply, transpose_submatrix, transpose_transpose]
Laplace Expansion of Determinant Along Any Column: $\det(A) = \sum_i (-1)^{i+j} A_{ij} \det(A_{i^c,j^c})$
Informal description
For any $(n+1) \times (n+1)$ matrix $A$ with entries in a commutative ring $R$ and any column index $j \in \{0, \dots, n\}$, the determinant of $A$ can be computed by Laplace expansion along the $j$-th column as: \[ \det(A) = \sum_{i=0}^n (-1)^{i+j} \cdot A_{ij} \cdot \det(A_{i^c,j^c}) \] where $A_{i^c,j^c}$ denotes the submatrix obtained by removing the $i$-th row and $j$-th column from $A$.
Matrix.det_fin_zero theorem
{A : Matrix (Fin 0) (Fin 0) R} : det A = 1
Full source
/-- Determinant of 0x0 matrix -/
@[simp]
theorem det_fin_zero {A : Matrix (Fin 0) (Fin 0) R} : det A = 1 :=
  det_isEmpty
Determinant of the 0x0 Matrix is One
Informal description
For any matrix $A$ of size $0 \times 0$ over a ring $R$, the determinant of $A$ is equal to $1$.
Matrix.det_fin_one theorem
(A : Matrix (Fin 1) (Fin 1) R) : det A = A 0 0
Full source
/-- Determinant of 1x1 matrix -/
theorem det_fin_one (A : Matrix (Fin 1) (Fin 1) R) : det A = A 0 0 :=
  det_unique A
Determinant of $1 \times 1$ Matrix Equals Its Single Entry
Informal description
For any $1 \times 1$ matrix $A$ with entries in a ring $R$, the determinant of $A$ is equal to its single entry $A_{0,0}$.
Matrix.det_fin_one_of theorem
(a : R) : det !![a] = a
Full source
theorem det_fin_one_of (a : R) : det !![a] = a :=
  det_fin_one _
Determinant of $1 \times 1$ Matrix $\begin{pmatrix}a\end{pmatrix}$ is $a$
Informal description
For any element $a$ in a ring $R$, the determinant of the $1 \times 1$ matrix $\begin{pmatrix}a\end{pmatrix}$ is equal to $a$.
Matrix.det_fin_two theorem
(A : Matrix (Fin 2) (Fin 2) R) : det A = A 0 0 * A 1 1 - A 0 1 * A 1 0
Full source
/-- Determinant of 2x2 matrix -/
theorem det_fin_two (A : Matrix (Fin 2) (Fin 2) R) : det A = A 0 0 * A 1 1 - A 0 1 * A 1 0 := by
  simp only [det_succ_row_zero, det_unique, Fin.default_eq_zero, submatrix_apply,
    Fin.succ_zero_eq_one, Fin.sum_univ_succ, Fin.val_zero, Fin.zero_succAbove, univ_unique,
    Fin.val_succ, Fin.val_eq_zero, Fin.succ_succAbove_zero, sum_singleton]
  ring
Determinant Formula for $2 \times 2$ Matrices: $\det(A) = A_{00}A_{11} - A_{01}A_{10}$
Informal description
For any $2 \times 2$ matrix $A$ with entries in a ring $R$, the determinant of $A$ is given by: \[ \det(A) = A_{00} \cdot A_{11} - A_{01} \cdot A_{10} \] where $A_{ij}$ denotes the entry in the $i$-th row and $j$-th column of $A$.
Matrix.det_fin_two_of theorem
(a b c d : R) : Matrix.det !![a, b; c, d] = a * d - b * c
Full source
@[simp]
theorem det_fin_two_of (a b c d : R) : Matrix.det !![a, b; c, d] = a * d - b * c :=
  det_fin_two _
Determinant formula for $2 \times 2$ matrices: $\det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc$
Informal description
For any elements $a, b, c, d$ in a ring $R$, the determinant of the $2 \times 2$ matrix $\begin{pmatrix} a & b \\ c & d \end{pmatrix}$ is given by: \[ \det \begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc. \]
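A minimal usage sketch (not part of this file; it assumes the `!![...]` matrix notation and the `norm_num` tactic are available, and the exact simp set may need adjustment):

-- Hypothetical numeric check: `det_fin_two_of` rewrites the goal to `1 * 4 - 2 * 3 = -2`,
-- which `norm_num` then closes by arithmetic.
example : Matrix.det !![(1 : ℤ), 2; 3, 4] = -2 := by
  norm_num [Matrix.det_fin_two_of]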
Matrix.det_fin_three theorem
(A : Matrix (Fin 3) (Fin 3) R) : det A = A 0 0 * A 1 1 * A 2 2 - A 0 0 * A 1 2 * A 2 1 - A 0 1 * A 1 0 * A 2 2 + A 0 1 * A 1 2 * A 2 0 + A 0 2 * A 1 0 * A 2 1 - A 0 2 * A 1 1 * A 2 0
Full source
/-- Determinant of 3x3 matrix -/
theorem det_fin_three (A : Matrix (Fin 3) (Fin 3) R) :
    det A =
      A 0 0 * A 1 1 * A 2 2 - A 0 0 * A 1 2 * A 2 1
      - A 0 1 * A 1 0 * A 2 2 + A 0 1 * A 1 2 * A 2 0
      + A 0 2 * A 1 0 * A 2 1 - A 0 2 * A 1 1 * A 2 0 := by
  simp only [det_succ_row_zero, submatrix_apply, Fin.succ_zero_eq_one, submatrix_submatrix,
    det_unique, Fin.default_eq_zero, Function.comp_apply, Fin.succ_one_eq_two, Fin.sum_univ_succ,
    Fin.val_zero, Fin.zero_succAbove, univ_unique, Fin.val_succ, Fin.val_eq_zero,
    Fin.succ_succAbove_zero, sum_singleton, Fin.succ_succAbove_one]
  ring
Determinant Formula for $3 \times 3$ Matrices: $\det(A) = A_{00}A_{11}A_{22} - A_{00}A_{12}A_{21} - A_{01}A_{10}A_{22} + A_{01}A_{12}A_{20} + A_{02}A_{10}A_{21} - A_{02}A_{11}A_{20}$
Informal description
For any $3 \times 3$ matrix $A$ with entries in a ring $R$, the determinant of $A$ is given by: \[ \det(A) = A_{00}A_{11}A_{22} - A_{00}A_{12}A_{21} - A_{01}A_{10}A_{22} + A_{01}A_{12}A_{20} + A_{02}A_{10}A_{21} - A_{02}A_{11}A_{20} \] where $A_{ij}$ denotes the entry in the $i$-th row and $j$-th column of $A$.
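As a worked numerical example (an illustration, not from the source), for \[ A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 10 \end{pmatrix} \] the formula gives $\det(A) = 1 \cdot 5 \cdot 10 - 1 \cdot 6 \cdot 8 - 2 \cdot 4 \cdot 10 + 2 \cdot 6 \cdot 7 + 3 \cdot 4 \cdot 8 - 3 \cdot 5 \cdot 7 = 50 - 48 - 80 + 84 + 96 - 105 = -3$.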