Linear Algebra
Linear Operators and Matrices
A linear operator satisfies
$$A\Big(\sum_i a_i |v_i\rangle\Big) = \sum_i a_i A|v_i\rangle$$
Every linear operator can be represented as a matrix. Suppose $A: V \to W$, let $|v_1\rangle, |v_2\rangle, \ldots, |v_m\rangle$ be a basis of $V$ and $|w_1\rangle, |w_2\rangle, \ldots, |w_n\rangle$ a basis of $W$. Then,
$$A|v_j\rangle = \sum_i A_{ij}|w_i\rangle$$
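As a small numerical sketch (assuming NumPy, with an arbitrarily chosen swap operator as the hypothetical example), the matrix of an operator can be built column by column from its action on the input basis:

```python
import numpy as np

# Hypothetical example: build the matrix of an operator column by column.
# Column j holds the coefficients A_ij of A|v_j> in the orthonormal |w_i> basis.
def matrix_of(apply_A, basis_in, basis_out):
    cols = []
    for v in basis_in:
        Av = apply_A(v)
        cols.append([np.vdot(w, Av) for w in basis_out])  # A_ij = <w_i|A|v_j>
    return np.array(cols).T

# Operator that swaps the two standard basis vectors of C^2
e0, e1 = np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)
A = matrix_of(lambda v: v[::-1], [e0, e1], [e0, e1])
# A comes out as the Pauli-X matrix [[0, 1], [1, 0]]
```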
Inner Products
An inner product is a map $(\cdot\,,\cdot): V \times V \to \mathbb{C}$ satisfying the following properties:
$$\Big(|v\rangle, \sum_i \lambda_i |w_i\rangle\Big) = \sum_i \lambda_i (|v\rangle, |w_i\rangle)$$
$$(|v\rangle, |w\rangle) = (|w\rangle, |v\rangle)^*$$
$$(|v\rangle, |v\rangle) \geq 0, \text{ with equality iff } |v\rangle = 0$$
In finite dimensions, an inner product space (a vector space equipped with an inner product) is the same thing as a Hilbert space.
Let $\{|i\rangle\}$ be an orthonormal basis. Writing $|v\rangle = \sum_i v_i|i\rangle$ and $|w\rangle = \sum_j w_j|j\rangle$, we have
$$\langle v|w\rangle = \Big(\sum_i v_i|i\rangle, \sum_j w_j|j\rangle\Big) = \sum_i\sum_j v_i^* w_j \delta_{ij} = \sum_i v_i^* w_i = |v\rangle^\dagger |w\rangle$$
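A quick NumPy check (illustrative values only): the bra-ket inner product conjugates the first vector's components, which is exactly what `np.vdot` does:

```python
import numpy as np

# Illustrative values: <v|w> conjugates the first vector's components,
# matching np.vdot (which also conjugates its first argument).
v = np.array([1 + 1j, 2], dtype=complex)
w = np.array([3, 1 - 1j], dtype=complex)
assert np.isclose(np.vdot(v, w), np.sum(np.conj(v) * w))  # sum_i v_i^* w_i
```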
Norm of a vector
$$\|v\| = \sqrt{\langle v|v\rangle}$$
We say that $|v\rangle$ is normalized iff $\|v\| = 1$.
A set of vectors $\{|a_i\rangle\}$ is orthonormal if $\langle a_i|a_j\rangle = \delta_{ij}$, i.e., $\langle a_i|a_j\rangle = 0$ for all $i \neq j$ and $\langle a_i|a_i\rangle = 1$ for all $i$.
Gram-Schmidt: constructs an orthonormal basis $\{|v_i\rangle\}$ from a linearly independent set $\{|w_i\rangle\}$
$$|v_{k+1}\rangle = \frac{|w_{k+1}\rangle - \sum_{i=1}^k \langle v_i|w_{k+1}\rangle|v_i\rangle}{\big\|\,|w_{k+1}\rangle - \sum_{i=1}^k \langle v_i|w_{k+1}\rangle|v_i\rangle\,\big\|}, \quad |v_1\rangle = |w_1\rangle / \||w_1\rangle\|$$
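The update rule above can be sketched in NumPy (a minimal implementation, not optimized for numerical stability on ill-conditioned inputs):

```python
import numpy as np

def gram_schmidt(ws):
    """Orthonormalize the linearly independent rows of ws, per the formula above."""
    vs = []
    for w in ws:
        u = w.astype(complex)
        for v in vs:
            u = u - np.vdot(v, u) * v  # subtract the projection <v_i|w>|v_i>
        vs.append(u / np.linalg.norm(u))
    return np.array(vs)

B = gram_schmidt(np.array([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]]))
# Rows of B are orthonormal: B @ B.conj().T is the identity
```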
Outer Product
$$|w\rangle\langle v|\,(|v'\rangle) = |w\rangle|v\rangle^\dagger|v'\rangle = |w\rangle\langle v|v'\rangle = \langle v|v'\rangle|w\rangle$$
From this notion we obtain the completeness relation, $\sum_i |i\rangle\langle i| = I$.
$$A = I_W A I_V = \sum_{ij} |w_j\rangle\langle w_j|A|v_i\rangle\langle v_i| = \sum_{ij} \langle w_j|A|v_i\rangle\,|w_j\rangle\langle v_i|$$
Cauchy-Schwarz: $\langle v|v\rangle\langle w|w\rangle \geq \langle v|w\rangle\langle w|v\rangle = |\langle v|w\rangle|^2$
Hilbert Space
A Hilbert space $\mathcal{H}$ is complete, which means that every Cauchy sequence of vectors converges in the space itself. Under this hypothesis there exist Hilbert bases, also known as complete orthonormal systems of vectors in $\mathcal{H}$.
For any orthonormal basis of $\mathcal{H}$, we have the following.
$$\text{Orthonormality} \equiv \langle\psi_i|\psi_j\rangle = \delta_{ij}$$
$$\text{Completeness} \equiv \sum_i |\psi_i\rangle\langle\psi_i| = I$$
Eigenvectors and Eigenvalues
Under a given linear transformation $A$, there exist vectors $|v\rangle$ that do not get shifted off their span: $A|v\rangle = \lambda|v\rangle$.
All such vectors are referred to as eigenvectors, and $(A - \lambda I)|v\rangle = 0 \implies \det(A - \lambda I) = 0$, which gives all possible eigenvalues.
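A small NumPy illustration (the example matrix is chosen arbitrarily): every eigenvalue returned by `np.linalg.eigvals` makes $\det(A - \lambda I)$ vanish:

```python
import numpy as np

# Example matrix chosen arbitrarily; its eigenvalues are 1 and 3.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
lams = np.linalg.eigvals(A)
for lam in lams:
    # each eigenvalue satisfies det(A - lambda*I) = 0
    assert abs(np.linalg.det(A - lam * np.eye(2))) < 1e-9
```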
An eigenspace is the space of all vectors with a given eigenvalue $\lambda$. When an eigenspace is more than one-dimensional, we call it degenerate.
Adjoints and Hermitian
Suppose $A: V \to V$. Then there exists $A^\dagger: V \to V$ such that for all $|v\rangle, |w\rangle \in V$ we have,
$$(|v\rangle, A|w\rangle) = (A^\dagger|v\rangle, |w\rangle)$$
This operator is called the adjoint or Hermitian conjugate of the operator $A$.
$$(|v\rangle, A|w\rangle) = \langle v|A|w\rangle = \mathbf{v}^\dagger A\mathbf{w} = (A^\dagger\mathbf{v})^\dagger\mathbf{w} = (A^\dagger|v\rangle, |w\rangle)$$
We have $(AB)^\dagger = B^\dagger A^\dagger$ and $|v\rangle^\dagger = \langle v|$.
Some definitions
Normal matrices: $AA^\dagger = A^\dagger A$
Hermitian matrices: $A^\dagger = A$
Unitary matrices: $AA^\dagger = I$
A normal matrix is Hermitian if and only if it has real eigenvalues.
If $\langle x|A|x\rangle \geq 0$ for all $|x\rangle$, then $A$ is positive semi-definite and has non-negative eigenvalues.
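These definitions can be checked numerically; for instance, the Pauli-Y matrix is Hermitian and unitary, hence also normal (a quick sketch assuming NumPy):

```python
import numpy as np

# The Pauli-Y matrix is Hermitian and unitary, hence also normal.
Y = np.array([[0, -1j], [1j, 0]])
assert np.allclose(Y, Y.conj().T)                    # Hermitian: A^dag = A
assert np.allclose(Y @ Y.conj().T, np.eye(2))        # Unitary:  A A^dag = I
assert np.allclose(Y @ Y.conj().T, Y.conj().T @ Y)   # Normal:   A A^dag = A^dag A
```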
Some properties
If a Hermitian matrix has non-negative eigenvalues then it is positive semi-definite.
If $M = AA^\dagger$ then $M$ is both Hermitian and positive semi-definite.
All positive semi-definite operators are Hermitian, by definition.
Spectral Decomposition
Spectral theorem: A linear operator is diagonalizable by a unitary change of basis if and only if it is normal.
Some notes and derivation regarding the above:
For diagonalizable $A$: $A\vec{q}_i = \lambda_i\vec{q}_i$, where the $\vec{q}_i$ are linearly independent eigenvectors of $A$.
$$AQ = Q\Lambda, \text{ where } Q = \begin{bmatrix} q_1 & q_2 & \ldots & q_n \end{bmatrix} \implies A = Q\Lambda Q^{-1}$$
With $P$ the projector onto the eigenspace of an eigenvalue $\lambda$ of a normal $A$, and $Q = I - P$:
$$A = IAI = (P+Q)A(P+Q) = PAP + QAP + PAQ + QAQ$$
$$\implies A = \lambda P^2 + 0 + 0 + QAQ \implies A = \lambda P^2 + QAQ$$
Matrices and Vectors
In the following statements, $\{|i\rangle\}$ is an orthonormal basis set.
$$I = \sum_i |i\rangle\langle i|$$
$$|\psi\rangle = \sum_i \sigma_i|i\rangle, \text{ where } \sigma_i = \langle i|\psi\rangle$$
Now, to represent an operator (linear transformation) as a matrix in an orthonormal basis:
$$A_{ij} = \langle i|A|j\rangle$$
$$A = \sum_{i,j} \langle i|A|j\rangle\,|i\rangle\langle j|$$
$$\text{tr}(A) = \sum_i \langle i|A|i\rangle$$
Now, diagonalization of any normal matrix:
$$M = \sum_i \lambda_i|i\rangle\langle i| = \sum_i \lambda_i P_i, \text{ where } P_i^\dagger = P_i$$
$$f(M) = \sum_i f(\lambda_i)|i\rangle\langle i|$$
where the $\lambda_i$ are eigenvalues of $M$ and, under a given orthonormal basis set $\{|i\rangle\}$ for the vector space $V$, each $|i\rangle$ is an eigenvector of $M$ with eigenvalue $\lambda_i$.
If $M$ is Hermitian, all eigenvalues ($\lambda_i$'s) are real.
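The decomposition $M = \sum_i \lambda_i|i\rangle\langle i|$ and the recipe $f(M) = \sum_i f(\lambda_i)|i\rangle\langle i|$ can be sketched with NumPy's `eigh` (example matrix and $f = \exp$ chosen for illustration):

```python
import numpy as np

# Spectral decomposition of a Hermitian matrix, and f(M) with f = exp.
M = np.array([[2.0, 1.0], [1.0, 2.0]])
lams, U = np.linalg.eigh(M)  # columns of U are orthonormal eigenvectors

# M = sum_i lambda_i |i><i|
recon = sum(l * np.outer(U[:, i], U[:, i].conj()) for i, l in enumerate(lams))
assert np.allclose(recon, M)

# f(M) = sum_i f(lambda_i) |i><i|
expM = U @ np.diag(np.exp(lams)) @ U.conj().T
```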
Tensor Products
$$z|vw\rangle = (z|v\rangle)\otimes|w\rangle = |v\rangle\otimes(z|w\rangle)$$
$$(|v_1\rangle + |v_2\rangle)\otimes|w\rangle = |v_1 w\rangle + |v_2 w\rangle$$
$$|v\rangle\otimes(|w_1\rangle + |w_2\rangle) = |vw_1\rangle + |vw_2\rangle$$
$$|\psi\rangle^{\otimes k} = \underbrace{|\psi\rangle\otimes\ldots\otimes|\psi\rangle}_{k\ \text{times}}$$
$$(A\otimes B)^\dagger = A^\dagger\otimes B^\dagger$$
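In coordinates the tensor product is the Kronecker product, so these identities can be spot-checked with `np.kron` (arbitrary sample matrices):

```python
import numpy as np

# In coordinates, |v> tensor |w> and A tensor B are Kronecker products.
A = np.array([[1, 1j], [0, 2]])
B = np.array([[0, 1], [1, 0]])

# (A tensor B)^dag = A^dag tensor B^dag
assert np.allclose(np.kron(A, B).conj().T, np.kron(A.conj().T, B.conj().T))

# Bilinearity: (z|v>) tensor |w> = z (|v> tensor |w>)
v, w, z = np.array([1, 2]), np.array([3, 4]), 2 + 1j
assert np.allclose(np.kron(z * v, w), z * np.kron(v, w))
```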
Tensor Product of Operators
$A\otimes B$ is the linear operator that acts on the vector space $V\otimes W$, given that $A$ acts on $V$ and $B$ acts on $W$:
$$(A\otimes B)\Big(\sum_i a_i|v_i\rangle\otimes|w_i\rangle\Big) = \sum_i a_i A|v_i\rangle\otimes B|w_i\rangle$$
Inner Product
$$\Big(\sum_i a_i|v_i\rangle\otimes|w_i\rangle,\ \sum_j b_j|v'_j\rangle\otimes|w'_j\rangle\Big) = \sum_{ij} a_i^* b_j\,\langle v_i|v'_j\rangle\langle w_i|w'_j\rangle$$
Properties of the trace are given below.
$$\text{tr}(A) = \sum_i A_{ii}$$
$$\text{tr}(A) = \sum_i \langle i|A|i\rangle \text{ for an orthonormal basis}$$
$$\text{tr}(AB) = \text{tr}(BA)$$
$$\text{tr}(zA + B) = z\,\text{tr}(A) + \text{tr}(B)$$
The above properties yield the following implications.
$$\text{tr}(UAU^\dagger) = \text{tr}(A)$$
$$\text{tr}(A|\psi\rangle\langle\psi|) = \sum_i \langle i|A|\psi\rangle\langle\psi|i\rangle = \langle\psi|A|\psi\rangle$$
$\text{tr}(A) = \sum_i \lambda_i$ and $\det(A) = \prod_i \lambda_i$, with algebraic multiplicities.
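The listed identities can be verified numerically on random matrices (a sketch assuming NumPy):

```python
import numpy as np

# Spot-check the trace identities on random complex matrices.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
B = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

assert np.isclose(np.trace(A @ B), np.trace(B @ A))                      # cyclicity
assert np.isclose(np.trace(2j * A + B), 2j * np.trace(A) + np.trace(B))  # linearity
assert np.isclose(np.trace(A), np.sum(np.linalg.eigvals(A)))             # sum of eigenvalues
assert np.isclose(np.linalg.det(A), np.prod(np.linalg.eigvals(A)))       # product of eigenvalues
```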
Partial Trace
Entanglement excludes the possibility of associating state vectors with individual subsystems. Therefore, we introduce density matrices and the corresponding idea of reduction performed with the partial trace.
$$\text{tr}(A\otimes B) = \text{tr}(A)\cdot\text{tr}(B)$$
$$\rho_{AB} : \mathcal{H}_A\otimes\mathcal{H}_B \xrightarrow{\ \text{tr}_B\ } \rho_A : \mathcal{H}_A$$
$$\text{tr}_B(A\otimes B) = A\,\text{tr}(B)$$
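A minimal partial-trace implementation (a sketch, assuming the usual Kronecker-product ordering of indices) that verifies $\text{tr}_B(A\otimes B) = A\,\text{tr}(B)$:

```python
import numpy as np

def partial_trace_B(rho_AB, dA, dB):
    """Trace out subsystem B from a (dA*dB) x (dA*dB) matrix on H_A tensor H_B."""
    r = rho_AB.reshape(dA, dB, dA, dB)
    return np.trace(r, axis1=1, axis2=3)  # contract the two B indices

# Check tr_B(A tensor B) = A * tr(B) on small sample matrices.
A = np.array([[0.5, 0.1], [0.1, 0.5]])
B = np.array([[0.7, 0.0], [0.0, 0.3]])
assert np.allclose(partial_trace_B(np.kron(A, B), 2, 2), A * np.trace(B))
```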
Hilbert-Schmidt Inner Product
$L_V$ denotes the vector space of operators over the Hilbert space $V$. One can show that $L_V$ is also a Hilbert space with $\text{tr}(A^\dagger B)$ as the inner product on $L_V\times L_V$.
Also, we have $\dim(L_V) = \dim(V)^2$.
Commutator and Anti-commutator
$$[A, B] = AB - BA$$
$$\{A, B\} = AB + BA$$
Theorem of Simultaneous Diagonalization
Suppose $A$ and $B$ are both Hermitian matrices. Then $[A, B] = 0$ iff there exists an orthonormal basis such that both $A$ and $B$ are diagonal with respect to that basis.
Polar Decomposition
If $A$ is any linear operator, then there exist a unitary $U$ and positive operators $J, K$ such that
$$A = UJ = KU, \text{ where } J = \sqrt{A^\dagger A} \text{ and } K = \sqrt{AA^\dagger}$$
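For an invertible $A$, the polar decomposition can be computed by taking the positive square root of $A^\dagger A$ via its eigendecomposition (a sketch with an arbitrary example matrix; `scipy.linalg.polar` does this more robustly):

```python
import numpy as np

# Polar decomposition A = U J with J = sqrt(A^dag A), for an invertible sample A.
A = np.array([[1.0, 2.0], [0.0, 1.0]])

lams, W = np.linalg.eigh(A.conj().T @ A)
J = W @ np.diag(np.sqrt(lams)) @ W.conj().T  # positive square root of A^dag A
U = A @ np.linalg.inv(J)                     # unitary because A is invertible

assert np.allclose(U @ J, A)
assert np.allclose(U.conj().T @ U, np.eye(2))
```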
Moreover, if $A^{-1}$ exists, then $U$ is unique.
Singular Value Decomposition
If $A$ is a square matrix, then there exist unitaries $U, V$ and a diagonal matrix $D$ with non-negative entries such that
$$A = UDV$$
Corollary: If $A$ is Hermitian with non-negative eigenvalues (i.e., positive semi-definite), then $A = U^\dagger DU$ is possible, where $D$ has non-negative values.
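NumPy's `np.linalg.svd` returns exactly this factorization, with the diagonal of $D$ as a non-negative vector `s`:

```python
import numpy as np

# np.linalg.svd returns A = U @ diag(s) @ Vh with s non-negative.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
U, s, Vh = np.linalg.svd(A)

assert np.all(s >= 0)
assert np.allclose(U @ np.diag(s) @ Vh, A)
assert np.allclose(Vh @ Vh.conj().T, np.eye(2))  # V is unitary
```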