Cauchy-Schwarz Inequality and Heisenberg's Uncertainty Principle

August 25, 2017 | Author: Mark Eichenlaub

Description: These are some brief notes I wrote for myself because I kept forgetting where the Cauchy-Schwarz Inequality...


The Cauchy-Schwarz Inequality

July 24, 2009



The Cauchy-Schwarz Inequality states that any time you take an inner product (i.e. dot product) of two vectors, you can't get something longer than if you just multiplied the two vectors' lengths. This is the crucial theorem underlying Heisenberg's Uncertainty Principle. For the simple vectors we draw in physics 101, the inequality is obvious, because the dot product of two vectors $\vec{a}$ and $\vec{b}$ is
$$\vec{a} \cdot \vec{b} = |\vec{a}|\,|\vec{b}| \cos\theta$$


with θ the angle between the vectors. Because cos θ ≤ 1, the dot product is less than or equal to the product of the lengths. Also, the dot product equals the product of the lengths only when the vectors point in the same direction. Here, we want to generalize this result to abstract vector spaces, working from the axioms. Since we’re talking about abstract vectors from now on, rather than little arrows in a plane, we’ll switch to the bra-ket notation.


Inner Products and Norms

Vector spaces can exist without inner products, but they're less interesting that way because we can't define orthogonality or take projections. Inner products are also closely tied to norms. If a vector space has an inner product defined, we can define the norm of a vector $v$ by
$$|v| = \sqrt{\langle v \mid v \rangle} \qquad (2)$$

Alternatively, if we have a norm but no inner product (as might be the case in a physical situation, where the norm is what you get by laying down a ruler), we can define the inner product of two vectors by
$$4\langle v \mid w \rangle \equiv |v + w|^2 - |v - w|^2$$


Although it isn't immediately obvious, this definition is linear in both vectors. For a complex vector space, the appropriate definition is
$$4\langle a \mid b \rangle \equiv |a + b|^2 - |a - b|^2 - i\left(|a + ib|^2 - |a - ib|^2\right).$$
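The complex polarization identity above is easy to spot-check numerically. The sketch below is not from the original notes; it uses NumPy's `vdot`, which conjugates its first argument and so matches the bra-ket convention (linear in the second slot):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=4) + 1j * rng.normal(size=4)
b = rng.normal(size=4) + 1j * rng.normal(size=4)

def norm_sq(v):
    # |v|^2 = <v|v>, always real
    return np.vdot(v, v).real

# 4<a|b> = |a+b|^2 - |a-b|^2 - i(|a+ib|^2 - |a-ib|^2)
lhs = 4 * np.vdot(a, b)
rhs = norm_sq(a + b) - norm_sq(a - b) - 1j * (norm_sq(a + 1j * b) - norm_sq(a - 1j * b))
assert np.isclose(lhs, rhs)
```

Running the same check with many random vectors (or expanding the right-hand side symbolically) confirms the identity holds for any finite-dimensional complex inner product space.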



Decomposition of a Vector

When we have an inner product space, there are some vectors whose inner products are especially simple: parallel and orthogonal vectors. Intuitively, parallel vectors point in the same direction; formally, they are two vectors for which the length of the sum is the sum of the lengths. Orthogonal (or perpendicular) vectors are defined as having an inner product of zero. For orthogonal vectors, the Pythagorean Theorem tells us that the square of the length of the sum is the sum of the squares of the lengths.

Looking at any two vectors $|y\rangle$ and $|x\rangle$, it would be nice if we could understand them simply in terms of parallel and orthogonal vectors, even though $|y\rangle$ and $|x\rangle$ are in general neither of these. What we'll try to do, then, is take $|y\rangle$ apart into two pieces, one parallel to $|x\rangle$ and the other orthogonal to it. Specifically, we want to find some scalar $c$ such that
$$|y\rangle = c|x\rangle + |w\rangle \qquad (5)$$
$$\langle x \mid w \rangle = 0$$



Dotting both sides of (5) with $\langle x|$ gives
$$\langle x \mid y \rangle = c\langle x \mid x \rangle + \langle x \mid w \rangle = c\langle x \mid x \rangle + 0$$
$$c = \frac{\langle x \mid y \rangle}{\langle x \mid x \rangle}$$


Putting this result for $c$ back into (5) yields
$$|y\rangle = \frac{\langle x \mid y \rangle}{\langle x \mid x \rangle}|x\rangle + |w\rangle. \qquad (8)$$


Geometrically, we're creating a right triangle with $|y\rangle$ as the hypotenuse.
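The decomposition in (8) can be carried out concretely. Here is a minimal sketch (not part of the original notes) using real NumPy vectors, verifying that the remainder is orthogonal and that the Pythagorean relation holds:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=3)
y = rng.normal(size=3)

c = np.dot(x, y) / np.dot(x, x)  # c = <x|y> / <x|x>
w = y - c * x                    # the orthogonal remainder |w>

assert np.isclose(np.dot(x, w), 0.0)   # <x|w> = 0
assert np.allclose(c * x + w, y)       # |y> = c|x> + |w>, equation (5)
# right-triangle check: |y|^2 = c^2 |x|^2 + |w|^2
assert np.isclose(np.dot(y, y), c**2 * np.dot(x, x) + np.dot(w, w))
```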


Proof of the Cauchy-Schwarz Inequality

We suspect that
$$|\langle x \mid y \rangle| \le |x|\,|y|.$$
That's the Cauchy-Schwarz Inequality. We want to prove it from the axioms of the inner product, which are
$$\langle x \mid x \rangle \ge 0, \quad \text{and} \quad \langle x \mid x \rangle = 0 \Leftrightarrow |x\rangle = 0$$
$$\langle x \mid y \rangle = \langle y \mid x \rangle^*$$
$$\langle x \mid \alpha y + \beta z \rangle = \alpha\langle x \mid y \rangle + \beta\langle x \mid z \rangle$$


Written out entirely in inner products, the Cauchy-Schwarz Inequality is
$$|\langle x \mid y \rangle| \le \sqrt{\langle x \mid x \rangle\langle y \mid y \rangle}. \qquad (11)$$

Because the right hand side is positive, the statement is equivalent to what we get by squaring both sides:
$$|\langle x \mid y \rangle|^2 \le \langle x \mid x \rangle\langle y \mid y \rangle. \qquad (12)$$


We broke $|y\rangle$ down into parallel and perpendicular parts, so let's substitute (8) into (12) and see what we've got:
$$|\langle x \mid y \rangle|^2 \le \langle x \mid x \rangle\left(\frac{|\langle x \mid y \rangle|^2}{\langle x \mid x \rangle} + \langle w \mid w \rangle\right) \qquad (13)$$
Simplifying a little gives that the Cauchy-Schwarz Inequality is equivalent to
$$|\langle x \mid y \rangle|^2 \le |\langle x \mid y \rangle|^2 + \langle x \mid x \rangle\langle w \mid w \rangle$$


which is obviously true, since both $\langle x \mid x \rangle$ and $\langle w \mid w \rangle$ are nonnegative. Further, equality obtains only when $|x\rangle = 0$ or $|w\rangle = 0$, meaning $|x\rangle$ and $|y\rangle$ are linearly dependent.
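The inequality and its equality condition can be spot-checked on random complex vectors. This sketch (an illustration, not from the original notes) tests (12) many times and then confirms equality for dependent vectors:

```python
import numpy as np

rng = np.random.default_rng(2)
# the inequality |<x|y>|^2 <= <x|x><y|y> should hold for any pair
for _ in range(1000):
    x = rng.normal(size=5) + 1j * rng.normal(size=5)
    y = rng.normal(size=5) + 1j * rng.normal(size=5)
    assert abs(np.vdot(x, y))**2 <= np.vdot(x, x).real * np.vdot(y, y).real + 1e-9

# equality when the vectors are linearly dependent
x = rng.normal(size=5)
y = 3.7 * x
assert np.isclose(abs(np.vdot(x, y))**2, np.vdot(x, x).real * np.vdot(y, y).real)
```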


Alternate Proof

Here's another proof that is a bit more clever. We'll work in a real vector space for this one. The Cauchy-Schwarz Inequality is trivial if $|x\rangle$ and $|y\rangle$ are dependent, so assume they are independent, meaning
$$\forall \lambda \in \mathbb{R}, \quad \lambda|y\rangle - |x\rangle \ne 0.$$


Then the squared norm of this difference must be strictly positive:
$$\forall \lambda \in \mathbb{R}, \quad \lambda^2\langle y \mid y \rangle - 2\lambda\langle x \mid y \rangle + \langle x \mid x \rangle > 0$$


The left hand side is quadratic in $\lambda$, but the quadratic has no real roots because it's always greater than zero. Therefore the discriminant is negative. This means
$$4\langle x \mid y \rangle^2 - 4\langle y \mid y \rangle\langle x \mid x \rangle < 0$$
$$\langle x \mid y \rangle^2 < \langle x \mid x \rangle\langle y \mid y \rangle$$
The square root of that is the Cauchy-Schwarz Inequality.
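The quadratic-in-$\lambda$ argument translates directly into code. Here is a small sketch (an illustration, not from the original notes): the quadratic is nonnegative even at its vertex $\lambda^* = \langle x \mid y \rangle / \langle y \mid y \rangle$, and its discriminant is nonpositive:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(size=4)
y = rng.normal(size=4)

# q(lam) = lam^2 <y|y> - 2 lam <x|y> + <x|x> = |lam*y - x|^2 >= 0
def q(lam):
    return lam**2 * np.dot(y, y) - 2 * lam * np.dot(x, y) + np.dot(x, x)

lam_star = np.dot(x, y) / np.dot(y, y)  # vertex (minimum) of the parabola
assert q(lam_star) >= 0                 # the minimum value is still nonnegative

# discriminant <= 0  <=>  <x|y>^2 <= <x|x><y|y>
disc = 4 * np.dot(x, y)**2 - 4 * np.dot(y, y) * np.dot(x, x)
assert disc <= 0
```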




The Triangle Inequality

Once we have the Cauchy-Schwarz Inequality, we can get the Triangle Inequality, which says
$$|x + y| \le |x| + |y|$$


Both sides are positive, so we can square them. Writing the result in inner products gives
$$\langle x + y \mid x + y \rangle \le \langle x \mid x \rangle + 2\sqrt{\langle x \mid x \rangle\langle y \mid y \rangle} + \langle y \mid y \rangle$$
which, after expanding the left hand side, reduces to
$$2\langle x \mid y \rangle \le 2\sqrt{\langle x \mid x \rangle\langle y \mid y \rangle}$$


showing that the Triangle Inequality is equivalent to the Schwarz Inequality. Another Triangle Inequality is
$$\big||x| - |y|\big| \le |x - y|$$


This follows from
$$|x|\,|y| \ge |\langle x \mid y \rangle| \ge \langle x \mid y \rangle$$
$$-2|x|\,|y| \le -2\langle x \mid y \rangle$$
$$|x|^2 - 2|x|\,|y| + |y|^2 \le |x|^2 - 2\langle x \mid y \rangle + |y|^2$$
$$(|x| - |y|)^2 \le \langle x - y \mid x - y \rangle$$
$$\big||x| - |y|\big| \le |x - y|$$
The same argument with $y \to -y$ gives $\big||x| - |y|\big| \le |x + y|$.


Combining the two Triangle Inequalities gives
$$\big||x| - |y|\big| \le |x - y| \le |x| + |y|.$$


This means that when you add or subtract two vectors, the length of the resultant is always between what you get by adding and subtracting the lengths of the two vectors directly.
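Both triangle inequalities can be sanity-checked on random vectors. A minimal sketch (an illustrative choice of NumPy, not from the original notes):

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.normal(size=6)
y = rng.normal(size=6)
nx, ny = np.linalg.norm(x), np.linalg.norm(y)

# |x + y| <= |x| + |y|  and  ||x| - |y|| <= |x - y|
assert np.linalg.norm(x + y) <= nx + ny
assert abs(nx - ny) <= np.linalg.norm(x - y)
# combined: the length of the difference sits between the two bounds
assert abs(nx - ny) <= np.linalg.norm(x - y) <= nx + ny
```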


The Uncertainty Principle

Suppose two Hermitian linear operators $A$ and $B$ have the commutator
$$[A, B] = C$$


with $C$ some other linear operator. We're interested in the simultaneous eigenvectors of $A$ and $B$. Or, if they don't have them, we're interested in knowing how close we can get.

When $|\psi\rangle$ is an eigenvector of $A$, it has zero variance, meaning $\langle\psi|A^2|\psi\rangle = \langle\psi|A|\psi\rangle^2$. So the variance is a measure of how close we are to an eigenvector. To make things easier on ourselves, we could consider the operator $A' \equiv A - \langle\psi|A|\psi\rangle$. Adding a constant to an operator doesn't change its variance, but this particular constant does kill the mean. So without loss of generality, let's kill off the means of $A$ and $B$. Here it is:
$$\begin{aligned}
\Delta A\,\Delta B &\equiv \sqrt{\langle\Psi|A^2|\Psi\rangle\,\langle\Psi|B^2|\Psi\rangle} \\
&= \sqrt{\langle A\Psi \mid A\Psi\rangle\,\langle B\Psi \mid B\Psi\rangle} \\
&\ge |\langle A\Psi \mid B\Psi\rangle| \\
&= |\langle\Psi|AB|\Psi\rangle| \\
&\ge \left|\frac{1}{2i}\left(\langle\Psi|AB|\Psi\rangle - \langle\Psi|AB|\Psi\rangle^*\right)\right| \\
&= \left|\frac{1}{2i}\langle\Psi|C|\Psi\rangle\right| \\
&= \frac{1}{2}\,|\langle\Psi|C|\Psi\rangle| \qquad (24)
\end{aligned}$$


We invoked the Cauchy-Schwarz Inequality to go from the second to the third line. To go from the fourth to the fifth line, we said that the magnitude of a complex number is greater than or equal to the magnitude of its imaginary part. For the special case where $A$ is the position operator and $B$ is the momentum operator, the canonical commutation relation tells us $C = i\hbar$. This is a constant times the identity, and plugging it into (24) gives
$$\Delta x\,\Delta p \ge \frac{\hbar}{2}$$


Also note that when we asserted that the magnitude of a number is greater than or equal to that of its imaginary part, we could have made an equivalent statement about its real part. In this case we'd get an uncertainty relation involving the anti-commutator.
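The whole chain in (24) can be tested in a finite-dimensional toy model. The sketch below (my illustrative choice, not from the original notes) takes $A = \sigma_x$ and $B = \sigma_y$, the Pauli matrices, whose commutator is $C = 2i\sigma_z$, and checks the bound $\Delta A\,\Delta B \ge \tfrac{1}{2}|\langle\Psi|C|\Psi\rangle|$ in the spin-up state:

```python
import numpy as np

# Pauli matrices: a simple pair of Hermitian operators with a known commutator
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

A, B = sx, sy
C = A @ B - B @ A                       # C = [A, B] = 2i * sigma_z

psi = np.array([1, 0], dtype=complex)   # spin-up state

def expval(op):
    return np.vdot(psi, op @ psi)

# standard deviations (means subtracted, as in the text)
dA = np.sqrt(expval(A @ A).real - expval(A).real**2)
dB = np.sqrt(expval(B @ B).real - expval(B).real**2)

bound = 0.5 * abs(expval(C))            # (1/2)|<psi|C|psi>|
assert dA * dB >= bound - 1e-12         # here dA*dB = 1 and bound = 1: saturated
```

In this state the inequality is actually an equality, which is the analogue of a minimum-uncertainty state for this pair of operators.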






