First, let us review the concepts we need from quantum mechanics. Our first concept is that of a state vector, $|\psi\rangle$.
Secondly, the quantities we can measure are represented by operators, such as $\hat{O}$. The fact that these operators need not commute with one another is encoded in what is referred to as the canonical commutation relation:
$$[\hat{p}_i, \hat{x}_j] = \hat{p}_i\hat{x}_j - \hat{x}_j\hat{p}_i = -i\hbar\,\delta_{ij}.$$
Here $\hat{p}_i$ represents the momentum operator in the $i$-direction, and $\hat{x}_j$ represents the position operator in the $j$-direction.
It is also important to note that these measurable quantities are not represented by ordinary numbers, because they do not commute: for ordinary numbers $ab - ba = 0$, whereas for these operators the difference does not, in general, vanish.
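Non-commutativity is easy to see numerically. The sketch below uses the Pauli matrices as a stand-in pair of operators (the position and momentum operators themselves act on an infinite-dimensional space, so they cannot be written as finite matrices); for these, the commutator is famously nonzero:

```python
import numpy as np

# Pauli matrices: simple Hermitian operators on a two-level system.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# The commutator [A, B] = AB - BA, which would vanish for ordinary numbers.
commutator = sx @ sy - sy @ sx

# For the Pauli matrices, [sx, sy] = 2i * sz, which is clearly nonzero.
print(np.allclose(commutator, 2j * sz))  # True
```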
We can act upon a state vector $|\psi\rangle$ with an operator and obtain the value we would measure in an experiment:
$$\hat{O}|o_{\mathrm{spec}}\rangle = o_{\mathrm{spec}}|o_{\mathrm{spec}}\rangle.$$
To demonstrate this, imagine an electron prepared with momentum $3.21\ \mathrm{kg\,m/s}$, which we can wrap in a ket like so: $|3.21\ \mathrm{kg\,m/s}\rangle$. Acting on it with the momentum operator $\hat{p}$ gives
$$\hat{p}\,|3.21\ \mathrm{kg\,m/s}\rangle = 3.21\ \mathrm{kg\,m/s}\,|3.21\ \mathrm{kg\,m/s}\rangle.$$
That being said, this process is only so easy when our system is in an eigenstate of our operator.
Each operator has an associated set of eigenstates, which are the building blocks of state vectors that describe some quantum system.
In more technical language: the eigenstates of some operator form the basis vectors of the vector space that contains our state vectors.
We say that a state vector is an eigenstate of some operator if measuring the corresponding quantity always yields the same result. Mathematically, if we apply the operator to one of its eigenstates, we get the original vector back, multiplied by some scalar value (the eigenvalue):
$$\text{operator} \times \text{eigenstate} = \text{eigenvalue} \times \text{eigenstate}.$$
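This eigenvalue relation can be checked directly with a small numerical example. The matrix below is a hypothetical Hermitian observable chosen only for illustration; `np.linalg.eigh` returns its eigenvalues and orthonormal eigenvectors:

```python
import numpy as np

# A small Hermitian "observable" (a hypothetical example, not a physical one).
O = np.array([[2.0, 1.0], [1.0, 2.0]])

# eigh is designed for Hermitian matrices: it returns real eigenvalues
# (in ascending order) and orthonormal eigenvectors as columns.
eigenvalues, eigenvectors = np.linalg.eigh(O)

# Verify: operator x eigenstate == eigenvalue x eigenstate, for each pair.
for val, vec in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(O @ vec, val * vec)

print(eigenvalues)  # [1. 3.]
```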
We can contrast this with the fact that a general state vector is a linear combination of eigenstates, which means we can expand our state vector $|\psi\rangle$ in terms of the eigenstates $\{|o_i\rangle\}$, like so:
$$|\psi\rangle = \sum_i a_i|o_i\rangle = a_1|o_1\rangle + \dots + a_n|o_n\rangle.$$
Note that the coefficients $\{a_i\}$ are complex probability amplitudes, each associated with measuring the corresponding eigenstate $|o_i\rangle$.
The probability of measuring a given eigenstate is the squared magnitude of its probability amplitude: $|a_i|^2$.
This means that if we prepare a system and take multiple measurements, we will obtain a spread of values, each eigenvalue occurring with the probability $|a_i|^2$ of its associated eigenstate.
To work with a system which is not in an eigenstate of our operator, we need a new tool: a conjugated state vector, referred to as a bra, which is the Hermitian conjugate of a ket:
$$\langle\psi| = |\psi\rangle^\dagger = \left(|\psi\rangle^{*}\right)^{T}.$$
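In matrix terms, a ket is a column vector and its bra is the conjugated transpose (complex-conjugate each entry, then transpose). A minimal sketch with a hypothetical two-component ket:

```python
import numpy as np

# A hypothetical ket with complex components, written as a column vector.
ket = np.array([[1.0 + 2.0j], [3.0 - 1.0j]])

# The bra is the Hermitian conjugate: complex-conjugate, then transpose.
bra = ket.conj().T

# <psi|psi> is the squared norm: |1+2i|^2 + |3-i|^2 = 5 + 10 = 15,
# a real non-negative number, as an inner product of a vector with itself must be.
norm_sq = (bra @ ket).item()
print(norm_sq)  # (15+0j)
```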
We can use an eigenstate bra $\langle o_j|$ to project out the desired probability amplitude $a_j$ directly from a ket:
$$\langle o_j|\psi\rangle = \langle o_j|\sum_i a_i|o_i\rangle = \sum_i a_i\,\langle o_j|o_i\rangle = \sum_i a_i\,\delta_{ij} = a_j.$$
The key step is that the eigenstates are orthonormal, so their inner product is the Kronecker delta, $\langle o_j|o_i\rangle = \delta_{ij}$, and summing against a Kronecker delta picks out a single term: $\sum_i f_i\,\delta_{ij} = f_j$.
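The projection trick is easy to verify numerically. Below, the eigenstates are taken to be the standard basis (a convenient hypothetical choice) and the amplitudes are made up; `np.vdot` conjugates its first argument, so it computes exactly $\langle o_j|\psi\rangle$:

```python
import numpy as np

# Hypothetical orthonormal eigenstates |o_0>, |o_1> (the standard basis here).
o0 = np.array([1.0, 0.0], dtype=complex)
o1 = np.array([0.0, 1.0], dtype=complex)

# A state |psi> = a_0|o_0> + a_1|o_1> with chosen amplitudes.
a0, a1 = 0.6, 0.8j
psi = a0 * o0 + a1 * o1

# <o_1|psi> projects out a_1: vdot conjugates the bra, then dots it with the ket.
print(np.vdot(o1, psi))  # 0.8j
```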
What this means is that the probability amplitude for the value $3.21\ \mathrm{kg\,m/s}$ can be found by taking the inner product of the general ket $|\psi\rangle$ with the eigenstate bra:
$$\langle 3.21\ \mathrm{kg\,m/s}|\psi\rangle = \sum_i a_i\,\langle 3.21\ \mathrm{kg\,m/s}|o_i\rangle = a_{3.21}.$$
This means that the probability of obtaining such a value is $|a_{3.21}|^2$.
Another use of bra vectors is the calculation of expectation values, the statistical average of the values obtained over many runs of the experiment:
$$\langle\hat{O}\rangle = \langle\psi|\hat{O}|\psi\rangle.$$
Note that $\langle\psi|$ is not an eigenstate bra, but the Hermitian conjugate of our original ket vector $|\psi\rangle$.
The main idea here is that this product gives a sum over all possible measurement outcomes, each weighted by the squared magnitude of its probability amplitude:
$$\langle\psi|\hat{O}|\psi\rangle = \sum_i |a_i|^2\, o_i.$$
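The two ways of computing the expectation value agree, as a quick check shows. The observable and amplitudes below are the same kind of hypothetical toy values used earlier:

```python
import numpy as np

# A hypothetical Hermitian observable; eigh gives its eigenvalues/eigenvectors.
O = np.array([[2.0, 1.0], [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eigh(O)

# A normalized state expanded in that eigenbasis with amplitudes a_i.
amplitudes = np.array([0.6, 0.8])
psi = eigenvectors @ amplitudes

# Two equivalent computations of the expectation value:
direct = psi.conj() @ O @ psi                              # <psi|O|psi>
weighted = np.sum(np.abs(amplitudes) ** 2 * eigenvalues)   # sum_i |a_i|^2 o_i
print(np.isclose(direct, weighted))  # True
```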
Note: we can use the expansion of a general state vector in terms of its eigenstates, combined with the relationship between bras and kets, to directly write down the expansion of the associated bra vector:
$$\langle\psi| = \left(\sum_i a_i|o_i\rangle\right)^{\dagger} = \sum_i a_i^{*}\,\langle o_i|.$$
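As a final sanity check, the bra obtained by conjugating the ket component-wise matches the bra built from the conjugated amplitudes. Again, the basis and amplitudes are hypothetical illustration values:

```python
import numpy as np

# Hypothetical orthonormal eigenbasis (standard basis) and amplitudes a_i.
a = np.array([0.6, 0.8j])
basis = np.eye(2, dtype=complex)   # columns are |o_0>, |o_1>
psi = basis @ a                    # |psi> = sum_i a_i |o_i>

# Bra two ways: Hermitian-conjugate the ket, or conjugate the amplitudes
# and pair them with the eigenstate bras (the rows of basis.conj().T).
bra_from_ket = psi.conj()
bra_from_expansion = a.conj() @ basis.conj().T   # sum_i a_i* <o_i|
print(np.allclose(bra_from_ket, bra_from_expansion))  # True
```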