We will now discuss the postulates of quantum mechanics.
I want to discuss the conditions under which the probability interpretation will be consistent.
Let \(A\) be a dynamical variable and let the corresponding operator be denoted by \(\hat{A}\).
Let the eigenvalues and eigenvectors be denoted by \(\alpha_k\) and \(\ket{\alpha_k}\), respectively:
\begin{equation}
\hat{A} \, \ket{\alpha_k} = \alpha_k \ket{\alpha_k}.
\end{equation}
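As an aside, the eigenvalue equation can be checked concretely in a finite dimensional example. The sketch below is only an illustration: it takes an arbitrarily chosen \(3\times 3\) Hermitian matrix as a stand-in for \(\hat{A}\) and verifies \(\hat{A}\,\ket{\alpha_k} = \alpha_k \ket{\alpha_k}\) numerically with NumPy.

```python
import numpy as np

# A toy Hermitian matrix standing in for the operator A-hat (arbitrary illustrative choice).
A = np.array([[2.0,   1.0j, 0.0],
              [-1.0j, 3.0,  1.0],
              [0.0,   1.0,  1.0]])

# For a Hermitian matrix, eigh returns real eigenvalues alpha_k and
# orthonormal eigenvectors |alpha_k> (the columns of `vecs`).
vals, vecs = np.linalg.eigh(A)

# Check the eigenvalue equation  A |alpha_k> = alpha_k |alpha_k>  for each pair.
for k, alpha_k in enumerate(vals):
    ket = vecs[:, k]
    assert np.allclose(A @ ket, alpha_k * ket)
print("eigenvalues:", vals)
```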
Let us ask:
If a system is known to be in the state with value \(\alpha_1\) of \(A\), what is the probability that a measurement of \(A\) will give some other value \(\alpha_2 \ne \alpha_1\)?
The answer is zero.
What guarantees that the theory will always give this expected answer? The eigenvectors corresponding to different eigenvalues should be orthogonal. Now let us ask:
How does one compute the probabilities of the different possible outcomes of a measurement of \(A\)?
The Answer is:
- It is given that a system is in a state \(\ket{\psi}\).
- In order to find the probabilities of the different possible outcomes \(\alpha_n\) on a measurement of \(A\), one needs to expand the vector \(\ket{\psi}\) and write it as a superposition of the eigenvectors of \(\hat{A}\); a numerical sketch of this prescription follows this list.
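A minimal numerical sketch of this prescription, using the same toy Hermitian matrix as above and an arbitrary normalized state (both purely illustrative choices):

```python
import numpy as np

# Same toy Hermitian matrix as in the previous sketch (illustrative choice).
A = np.array([[2.0,   1.0j, 0.0],
              [-1.0j, 3.0,  1.0],
              [0.0,   1.0,  1.0]])
vals, vecs = np.linalg.eigh(A)

# An arbitrary normalized state |psi>.
psi = np.array([1.0, 1.0j, -1.0])
psi = psi / np.linalg.norm(psi)

# Expansion coefficients c_n = <alpha_n|psi> and probabilities |c_n|^2.
c = vecs.conj().T @ psi
probs = np.abs(c) ** 2
print("probabilities:", probs, "  sum =", probs.sum())   # the sum is 1
```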
Let us ask:
What ensures that every vector can be written as a linear combination of the eigenvectors of \(\hat{A}\)?
The Answer is:
- The eigenvectors should form a basis.
- Only then is there a guarantee that every vector can be expanded as a linear combination of the eigenvectors (a numerical check of this completeness property is sketched after this list).
- The postulates give the prescription for computing the probabilities of the different outcomes of a measurement of \(A\).
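The completeness requirement can be checked numerically for the toy matrix used above: the sum of the projectors onto the eigenvectors reproduces the identity, so an arbitrary vector is recovered from its expansion. The vector chosen below is, again, purely illustrative.

```python
import numpy as np

A = np.array([[2.0,   1.0j, 0.0],
              [-1.0j, 3.0,  1.0],
              [0.0,   1.0,  1.0]])
vals, vecs = np.linalg.eigh(A)
dim = len(vals)

# Completeness: the sum of the projectors |alpha_n><alpha_n| equals the identity ...
identity = sum(np.outer(vecs[:, n], vecs[:, n].conj()) for n in range(dim))
assert np.allclose(identity, np.eye(dim))

# ... so an arbitrary vector can be expanded as  |psi> = sum_n <alpha_n|psi> |alpha_n>.
psi = np.array([0.3, -1.0, 2.0j])
expansion = sum((vecs[:, n].conj() @ psi) * vecs[:, n] for n in range(dim))
assert np.allclose(expansion, psi)
```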
Let us ask:
- That the sum of all probabilities is one means \(\sum_n |c_n|^2 = \innerproduct{\psi}{\psi} = 1\), where \(c_n = \innerproduct{\alpha_n}{\psi}\) are the expansion coefficients.
How do we see this property?
The Answer is:
- This will be ensured if the Parseval relation holds and the eigenvectors are normalized.
- The Parseval relation holds if the basis is a complete orthonormal set.
- If a system is in a state \(\ket{\psi}\) which satisfies \(|\innerproduct{\psi}{\alpha_m}|^2=1\), it means that the outcome of a measurement of \(A\) will definitely be the value \(\alpha_m\); both points are illustrated in the sketch below.
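Both points can be illustrated with the toy matrix used above: the Parseval relation for an arbitrary (not necessarily normalized) vector, and the definite outcome when \(\ket{\psi}\) coincides with an eigenvector up to an overall phase. All specific numbers below are illustrative choices.

```python
import numpy as np

A = np.array([[2.0,   1.0j, 0.0],
              [-1.0j, 3.0,  1.0],
              [0.0,   1.0,  1.0]])
vals, vecs = np.linalg.eigh(A)

# Parseval: for an arbitrary vector, sum_n |c_n|^2 = <psi|psi>,
# because the eigenvectors form a complete orthonormal set.
psi = np.array([0.5, 2.0j, -1.0 + 1.0j])
c = vecs.conj().T @ psi
assert np.isclose(np.sum(np.abs(c) ** 2), np.vdot(psi, psi).real)

# Definite outcome: if |psi> is the m-th eigenvector (up to a phase),
# the probability of alpha_m is 1 and all other probabilities vanish.
m = 1
psi = np.exp(0.7j) * vecs[:, m]
probs = np.abs(vecs.conj().T @ psi) ** 2
print(probs)
assert np.isclose(probs[m], 1.0)
```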
Let us ask:
For consistency with the first part of the postulate, we must be able to prove that
the given condition, \(|\innerproduct{\psi}{\alpha_m}|^2=1\), implies that \(\ket{\psi}\) is an eigenvector of \(\hat{A}\) with eigenvalue \(\alpha_m\).
Proof:
- For computing probabilities, the state vector \(\ket{\psi}\) and all the eigenvectors should be normalized.
- This means that \[\innerproduct{\psi}{\psi} = 1, \qquad \innerproduct{\alpha_n}{\alpha_n} = 1 \text{ for all } n.\]
- If the outcome of the measurement is to be definite and equal to \(\alpha_m\), for some \(m\), we must have \[|\innerproduct{\psi}{\alpha_m}| = 1.\]
- Therefore we get \[|\innerproduct{\psi}{\alpha_m}|^2 = \innerproduct{\psi}{\psi}\, \innerproduct{\alpha_m}{\alpha_m}.\]
- This then implies that \(\ket{\psi}\) should be equal to some constant times \(\ket{\alpha_m}\): \[\ket{\psi} = K \ket{\alpha_m}.\]
Why? How do we see the last statement?
You have to recall the full statement of the Cauchy-Schwarz inequality.
The Cauchy-Schwarz inequality \[|\innerproduct{f}{g}|^2 \le \innerproduct{f}{f}\, \innerproduct{g}{g}\] becomes an equality if and only if \[\ket{f} = \text{const}\, \ket{g}.\]
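A quick numerical check of the inequality and of its equality condition, with vectors chosen at random purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two generic complex vectors: the inequality |<f|g>|^2 <= <f|f><g|g> holds.
f = rng.normal(size=3) + 1j * rng.normal(size=3)
g = rng.normal(size=3) + 1j * rng.normal(size=3)
lhs = abs(np.vdot(f, g)) ** 2          # np.vdot conjugates its first argument: <f|g>
rhs = np.vdot(f, f).real * np.vdot(g, g).real
assert lhs <= rhs

# Equality holds when |f> is a constant multiple of |g>.
f = (2.0 - 1.5j) * g
assert np.isclose(abs(np.vdot(f, g)) ** 2, np.vdot(f, f).real * np.vdot(g, g).real)
```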
We have several requirements coming from the consistency of the probability interpretation, the content of the third postulate.
All the requirements on the eigenvectors of the operator \(\hat{A}\) are met if \(\hat{A}\) is a self-adjoint operator.
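As a contrast, the sketch below (with matrices chosen only as examples) shows how the eigenvectors of a non-self-adjoint matrix can fail to be orthogonal, while a self-adjoint matrix always yields real eigenvalues and a complete orthonormal set of eigenvectors.

```python
import numpy as np

# A non-self-adjoint matrix: its eigenvectors need not be orthogonal (illustrative example).
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])            # B is not equal to its adjoint
vals, vecs = np.linalg.eig(B)
print("overlap of eigenvectors:", np.vdot(vecs[:, 0], vecs[:, 1]))   # nonzero

# A self-adjoint matrix: eigh returns real eigenvalues and an orthonormal,
# hence complete, set of eigenvectors.
H = np.array([[1.0,        2.0 - 1.0j],
              [2.0 + 1.0j, 0.0]])
vals, vecs = np.linalg.eigh(H)
print("real eigenvalues:", vals)
assert np.allclose(vecs.conj().T @ vecs, np.eye(2))    # orthonormality
```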