Now, see, a linear operator is really just a matrix. Suppose $A: V \to W$, that $|v_1\rangle, |v_2\rangle, ..., |v_m\rangle$ is a basis of $V$, and that $|w_1\rangle, |w_2\rangle, ..., |w_n\rangle$ is a basis of $W$. Then,
$$A|v_j\rangle = \sum_i A_{ij}|w_i\rangle$$
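A quick numpy sketch of this idea (the swap operator here is just an illustrative example, not from the notes): column $j$ of the matrix holds the coefficients $A_{ij}$ of $A|v_j\rangle$ in the $\{|w_i\rangle\}$ basis.

```python
import numpy as np

# Take V = W = C^2 with the standard basis |v_1>, |v_2>.
e = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

def A_action(v):
    """A hypothetical operator: swaps the two basis vectors."""
    return np.array([v[1], v[0]])

# Since A|v_j> = sum_i A_ij |w_i>, the coefficient vector of A|v_j>
# is exactly column j of the matrix of A.
A = np.column_stack([A_action(ej) for ej in e])
# A is the matrix [[0, 1], [1, 0]]
```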
Inner Products
Ok so imagine an operation $(\_ ,\_): V\times V \to \mathbb{C}$, written $\langle v|w\rangle$, such that the following holds:
- Linearity in the second argument: $\langle v|\sum_i \lambda_i w_i\rangle = \sum_i \lambda_i \langle v|w_i\rangle$.
- Conjugate symmetry: $\langle v|w\rangle = \langle w|v\rangle^*$.
- Positive definiteness: $\langle v|v\rangle \geq 0$, with equality iff $|v\rangle = 0$.
We can say that $|v\rangle$ is normalized iff $||v|| = 1$, where $||v|| \equiv \sqrt{\langle v|v\rangle}$.
A set of vectors $\{|a_i\rangle\}$ is orthonormal if $\langle a_i|a_j \rangle = \delta_{ij}$, i.e., $\langle a_i|a_j \rangle = 0$ for all $i \neq j$ and $\langle a_i|a_i \rangle = 1$ for all $i$.
A Hilbert Space $\mathcal{H}$ is complete, which means that every Cauchy sequence of vectors converges to a limit inside the space itself. Under this hypothesis there exist Hilbert bases, also known as complete orthonormal systems of vectors in $\mathcal{H}$.
For any orthonormal basis $\{|i\rangle\}$ of $\mathcal{H}$, we have the completeness relation $\sum_i |i\rangle\langle i| = I$.
A normal (in particular, Hermitian) matrix $M$ admits a spectral decomposition
$$M = \sum_i \lambda_i |i\rangle\langle i|$$
where $\lambda_i$ are eigenvalues of $M$ and $\{\vert i\rangle\}$ is an orthonormal eigenbasis of the vector space $V$: each $\vert i \rangle$ is an eigenvector of $M$ with eigenvalue $\lambda_i$.
If $M$ is Hermitian, all eigenvalues $(\lambda_i\text{ s})$ are real; if $M$ is moreover positive semidefinite, they are non-negative.
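A minimal numpy check of the spectral decomposition, using a small Hermitian matrix chosen here just for illustration:

```python
import numpy as np

# A Hermitian matrix (equal to its own conjugate transpose)
M = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])

lam, vecs = np.linalg.eigh(M)   # eigenvalues lam, eigenvectors as columns

# Rebuild M as sum_i lambda_i |i><i|
M_rebuilt = sum(l * np.outer(v, v.conj()) for l, v in zip(lam, vecs.T))
```

`eigh` returns real eigenvalues for a Hermitian input, matching the claim above.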
Entanglement excludes the possibility of associating state vectors with individual subsystems. Therefore, we introduce density matrices and the corresponding idea of reduction, performed with the partial trace.
$$\text{tr}(A \otimes B) = \text{ tr}(A) \cdot \text{tr}(B)$$
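This identity is easy to sanity-check numerically; `np.kron` computes the tensor (Kronecker) product:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((4, 4))

lhs = np.trace(np.kron(A, B))      # tr(A ⊗ B)
rhs = np.trace(A) * np.trace(B)    # tr(A) · tr(B)
```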
Let $L_V$ denote the vector space of linear operators over the Hilbert space $V$. Then we can show that $L_V$ is also a Hilbert space with $\text{tr}(A^\dagger B)$ as the inner product on $L_V \times L_V$ (the Hilbert–Schmidt inner product).
Also, we have $\dim(L_V) = \dim(V)^2$.
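A sketch of both claims for $\dim(V) = 2$: the $\dim(V)^2 = 4$ operators $E_{kl} = |k\rangle\langle l|$ form an orthonormal basis of the operator space under $\text{tr}(A^\dagger B)$ (the choice of this basis is the standard one, not something specific to the notes).

```python
import numpy as np

d = 2
# E_kl = |k><l| for all k, l — there are d^2 of them
E = [np.outer(np.eye(d)[k], np.eye(d)[l]) for k in range(d) for l in range(d)]

def hs_inner(A, B):
    """Hilbert-Schmidt inner product tr(A† B)."""
    return np.trace(A.conj().T @ B)

# Gram matrix of the E_kl: should be the d^2 x d^2 identity,
# i.e. the E_kl are orthonormal, so dim(L_V) = d^2.
G = np.array([[hs_inner(P, Q) for Q in E] for P in E])
```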
Commutator and Anti-commutator
$$[A, B] = AB - BA\\
\{A, B\} = AB + BA$$
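For a concrete example, the Pauli matrices $X$ and $Z$ anti-commute ($\{X, Z\} = 0$) while their commutator is nonzero:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def comm(A, B):
    return A @ B - B @ A        # [A, B]

def anticomm(A, B):
    return A @ B + B @ A        # {A, B}

XZ_comm = comm(X, Z)            # nonzero (equals -2iY)
XZ_anticomm = anticomm(X, Z)    # the zero matrix
```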
Theorem of Simultaneous Diagonalization
Suppose $A$ and $B$ are both Hermitian matrices. Then $[A, B] = 0$ iff there exists an orthonormal basis with respect to which both $A$ and $B$ are diagonal.
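A numerical sketch of one direction: build a commuting Hermitian pair by conjugating two diagonal matrices with the same orthogonal matrix (an illustrative construction, not from the notes), and check that the eigenbasis of $A$ also diagonalizes $B$.

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # random orthogonal matrix

# A and B are simultaneously diagonal in the basis Q, hence they commute
A = Q @ np.diag([1.0, 2.0, 3.0]) @ Q.T
B = Q @ np.diag([5.0, 7.0, 11.0]) @ Q.T

commutator = A @ B - B @ A                         # ≈ 0

# Since A has distinct eigenvalues, its eigenbasis is essentially unique,
# and expressing B in that basis must give a diagonal matrix.
_, V = np.linalg.eigh(A)
B_in_A_basis = V.T @ B @ V
off_diag = B_in_A_basis - np.diag(np.diag(B_in_A_basis))
```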
Polar Decomposition
For any linear operator $A$, there exist a unitary $U$ and positive operators $J, K$ such that
$$A = UJ = KU, \text{ where } J = \sqrt{A^\dagger A} \text{ and } K = \sqrt{AA^\dagger}$$
Moreover, if $A^{-1}$ exists, then $U$ is unique.
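One standard way to compute the polar decomposition is via the SVD: if $A = W\Sigma V^\dagger$, then $U = WV^\dagger$, $J = V\Sigma V^\dagger$, and $K = W\Sigma W^\dagger$. A numpy sketch on a random (hence almost surely invertible) matrix:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))

W, S, Vh = np.linalg.svd(A)
U = W @ Vh                          # unitary
J = Vh.conj().T @ np.diag(S) @ Vh   # positive, equals sqrt(A† A)
K = W @ np.diag(S) @ W.conj().T     # positive, equals sqrt(A A†)
```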
Singular Value Decomposition
If $A$ is a square matrix, then there exist unitaries $U, V$ and a diagonal matrix $D$ with non-negative entries (the singular values of $A$) such that
$$A = UDV$$
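In numpy, `np.linalg.svd` returns the right factor already conjugate-transposed, so the three returned factors multiply back directly:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))

U, s, V = np.linalg.svd(A)   # V is returned in the V^H convention
D = np.diag(s)               # singular values, all non-negative
```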
Corollary: If $A$ is Hermitian with non-negative eigenvalues (i.e., positive semidefinite), then $A = U^\dagger DU$ for some unitary $U$ and diagonal $D$ with non-negative entries.