PROFESSOR: The story begins with a statement about vector operators that we will verify to some degree in the homework that is due on Wednesday. The one-line summary is that the operator S is a vector operator under the total angular momentum J. And as such, the matrix elements of Sz will behave like the matrix elements of Jz.
So how is that true? What does it mean to say that S is a vector operator under J? It is to say that, as far as J is concerned, S behaves like a vector. And that is a concrete statement that you can check.
The statement is that the commutator of Ji with Sj-- here, i and j run from 1 to 3-- is equal to i h bar epsilon ijk Sk. That is the statement that S is a vector operator.
You may have seen this in a previous course, in 8.05, where you might have proven that X and P are vector operators for L. And here, the proof is not all that difficult. In fact, it's almost obvious this is true, isn't it?
Ji is Li plus Si. Li doesn't talk to Sj. But Si and Sj satisfy the algebra of angular momentum, which is precisely this relation. So this is almost no calculation.
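Written out, the statement and its one-line check read
\[
[\,J_i, S_j\,] \;=\; [\,L_i + S_i,\; S_j\,] \;=\; [\,L_i, S_j\,] + [\,S_i, S_j\,] \;=\; 0 + i\hbar\,\epsilon_{ijk}\,S_k ,
\]
since Li acts on the orbital degrees of freedom and therefore commutes with Sj, while the spin operators satisfy the angular momentum algebra.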
You will check-- you'll remind yourself-- that X and P are vector operators under L. So for example, the commutator of Li with Pj is i h bar epsilon ijk Pk. And this, you do by calculating the commutator. But after you calculate the commutator a few times, it's better to just remember, oh, it's a vector operator. That's a good way of thinking about this.
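For reference, the analogous relations for the orbital angular momentum are
\[
[\,L_i, X_j\,] = i\hbar\,\epsilon_{ijk}\,X_k, \qquad [\,L_i, P_j\,] = i\hbar\,\epsilon_{ijk}\,P_k ,
\]
which is the statement that X and P are vector operators under L.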
OK. If you have vector operators, they have very peculiar properties sometimes. One that may sound a little unmotivated, but it's very useful, is the following. Suppose you form the double commutator of J squared with the vector operator S.
Here, you will find an identity. And to make it fun for you, I will not tell you the number that appears here. It's some number. But with some number here, this double commutator is identical to the following: S dot J times J, minus 1/2 of J squared S plus S J squared.
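In symbols, writing the unspecified number as a constant c-- it is left for you to determine-- the identity being described is
\[
\bigl[\,\vec J^{\,2},\,[\,\vec J^{\,2},\,\vec S\,]\,\bigr]
\;=\; c\,\Bigl(\, (\vec S\cdot\vec J\,)\,\vec J \;-\; \tfrac{1}{2}\bigl(\vec J^{\,2}\,\vec S + \vec S\,\vec J^{\,2}\bigr)\Bigr).
\]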
Well, it's important to look at this equation, even just to make sure everything is in order. This is a vector equation, so it's really three equations. J squared, even though J carries an arrow, is a scalar-- it has no free index. It's J1 squared plus J2 squared plus J3 squared.
S, on the other hand, has an arrow, and it's a vector. So you could look at this equation for the third component, for the first component, for the second. So the left-hand side is a vector-- three things.
The right-hand side is also a vector. The first term is S dot J times J, so the vector index is carried by the J. In the other terms the vector index is carried by S-- once to the left of J squared, once to the right of J squared.
Also, maybe you should notice that S dot J is the same thing as J dot S-- not because these operators commute, but because Sx and Jx commute, Sy and Jy commute, and Sz and Jz commute. Different components would not commute. But here, these ones do commute.
So this is a formula you will show by computation. I don't think there's a simple way to derive this formula. But you can verify that it's true by computation.
This formula implies a result that is quite pretty. It's sometimes called a projection lemma. So all we're doing is trying to compute a matrix element, and we're forced to consider a lot of structure. We're just trying to show this simple matrix element, with an Sz here, is proportional to mj. This is our goal. And we're going to do that.
So suppose you take that interesting equation and find its expectation value on a state that is an eigenstate of j. So suppose you take a state j mj and sandwich the whole equation-- left-hand side and right-hand side-- between that state. Now, that state may be an eigenstate of other operators as well. It doesn't matter.
Now, look at your left-hand side. It's a commutator. You have a j squared on the left, a commutator on the right minus commutator on the left, j squared on the right. In both cases, there will be either a j squared near the bra or a j squared near the ket. Those two terms come with opposite signs.
Since the bra and the ket are eigenstates of J squared with the same eigenvalue, and we're taking an expectation value, those two terms cancel. So the left-hand side contributes nothing-- the left-hand side, on this state, is 0.
And let's look at the right-hand side. It's equal to j mj, S dot J times J, j mj-- so that was the first term-- and minus the other piece. Now, we have to compute that piece, with the J squared to the left and to the right, on this j mj state.
Again, a J squared is either to the left or to the right. Therefore, each gives a number, which is h squared j times j plus 1-- the same number on the bra as on the ket, as I showed you. You have two terms, so the factor of 1/2 goes away. And you're left with that number times the expectation value of S, which is kind of what we wanted here.
So this second piece is minus h squared j times j plus 1-- which is the expectation value of J squared-- times the expectation value of S on j mj. OK. You have the first term minus that term equal to 0.
So what have we learned? We have learned that this term that we can call expectation value of S vector on a j eigenstate is equal to the expectation value of this quantity, S dot J J on the eigenstate divided by this number, which turns out to be the expectation value of J squared on that eigenstate.
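Putting the two sides together on the eigenstate-- the unspecified constant c drops out-- the projection lemma reads
\[
0 \;=\; \langle j\,m_j|\,[\,\vec J^{\,2},[\,\vec J^{\,2},\vec S\,]\,]\,|j\,m_j\rangle
\;=\; c\Bigl(\langle(\vec S\cdot\vec J\,)\,\vec J\,\rangle \;-\; \hbar^2 j(j+1)\,\langle\vec S\,\rangle\Bigr)
\quad\Longrightarrow\quad
\langle\vec S\,\rangle \;=\; \frac{\langle(\vec S\cdot\vec J\,)\,\vec J\,\rangle}{\hbar^2\,j(j+1)} \;=\; \frac{\langle(\vec S\cdot\vec J\,)\,\vec J\,\rangle}{\langle\vec J^{\,2}\rangle}.
\]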
This formula looks like a projection formula, in which you say the expectation value of S is the expectation value of the projection of the vector S onto the vector J. Remember, what is the projection of a vector a onto a unit vector n? Well, the projection is a dot n times n. That's the component of a along the vector n.
But if the vector is not a unit vector-- call it b-- the projection of a along b is a dot b times b over b squared. Because, in fact, the projection along a vector or along a unit vector is the same thing. It's just a projection. And here, J is not a unit vector, which is why the J squared appears in the denominator.
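As an ordinary vector identity, the projection being described is
\[
\vec a_{\parallel} \;=\; (\vec a\cdot\hat n\,)\,\hat n \;=\; \frac{(\vec a\cdot\vec b\,)\,\vec b}{\vec b^{\,2}}, \qquad \hat n = \frac{\vec b}{|\vec b\,|},
\]
which is the structure of the formula above, with S in place of a and J in place of b.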
So this is the projection lemma. It's a very nice result-- pretty striking, in fact. This result is also mentioned in Griffiths, which doesn't give a derivation; it's just quoted.
But it's a beautiful and important result. It's conceptually interesting. It's valid for any vector operator under J. And this will answer our question. Because now, we can use this formula to compute the matrix element.
So what do we have for our case? We have that nljmj Sz nljmj is what? Well, the Jz acting on this state gives h bar mj, and we divide by h squared j times j plus 1-- that's the denominator. And you still have here what may look like a small challenge, or a big challenge-- happily, it's a small challenge-- the expectation value of S dot J on nljmj.
Here, this S dot J is called a scalar operator. It is invariant under rotations. And expectation values of scalar operators are independent of mj. We already got the mj dependence from the Jz; we want to claim that the whole matrix element is proportional to mj.
And we have that result, unless there is extra mj dependence in this last factor. But there is no mj dependence there because, as I said, it is a scalar operator.
So let's calculate this part to finish the whole computation. How do you do that? Well, you have to remember that J is equal to L plus S.
So here, I'll take L equal to J minus S. Then L squared is equal to J squared plus S squared minus 2 S dot J-- here, it's important that S dot J and J dot S are the same. And therefore, this S dot J is 1/2 of J squared plus S squared minus L squared.
So with S dot J being this, you see immediately what this number is. It is h bar mj over h squared j times j plus 1, times 1/2 of h squared times the following: j times j plus 1, from the J squared; plus 3/4, from the S squared, since the spin is 1/2; minus l times l plus 1, from the L squared.
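Collecting the factors, the Sz matrix element works out to
\[
\langle n\,l\,j\,m_j|\,S_z\,|n\,l\,j\,m_j\rangle
\;=\; \frac{\hbar m_j}{\hbar^2\, j(j+1)}\cdot\frac{\hbar^2}{2}\Bigl(j(j+1) + \tfrac34 - l(l+1)\Bigr)
\;=\; \hbar m_j\,\frac{j(j+1) - l(l+1) + \tfrac34}{2\,j(j+1)}\,.
\]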
OK, almost there. Wow, this takes time. But we have a result. So what is the result? The matrix element on nljmj of Lz-- oops, let me put the whole thing together-- of Lz plus 2 Sz-- back to the whole matrix element.
Remember, we had one piece-- h bar mj, from the Jz-- and now this new part from the Sz. So adding the h bar mj to this new part, we have h bar mj times 1 plus the quantity j times j plus 1 minus l times l plus 1 plus 3/4, all over 2 j times j plus 1.
So we have our matrix element. And that matrix element in the top blackboard there gives us the splitting. It's probably a good time to introduce notation. And there's a notation here where this is called g sub J of l. And it's called the Landé g-factor.
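So the notation being introduced is
\[
\langle n\,l\,j\,m_j|\,L_z + 2S_z\,|n\,l\,j\,m_j\rangle \;=\; \hbar\, m_j\; g_J(l),
\qquad
g_J(l) \;=\; 1 + \frac{j(j+1) - l(l+1) + \tfrac34}{2\,j(j+1)}\,.
\]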
It's a g-factor in the sense that it affects the energy levels as if you were modifying the magnetic moment of the particle. So this number tells you how the levels split. They split proportionally to mj-- all the various levels.
And for the full multiplet-- the multiplet is an eigenstate of j and an eigenstate of l-- this is a single number throughout all the states in the multiplet. All that varies is the h bar mj.
The end result is that the weak-field Zeeman splitting of the state nljmj is e h bar over 2mc, times B, times gJ of l, times mj. And the prefactor e h bar over 2mc is about 5.79 times 10 to the minus 9 eV per gauss. It's small.
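As a quick numerical illustration-- not a value quoted in the lecture-- take a hydrogen 2p state with l = 1 and j = 3/2, for which the Landé factor is g_{3/2}(1) = 4/3. In a field of 10^3 gauss, adjacent m_j levels are then split by roughly
\[
\Delta E \;=\; \frac{e\hbar}{2mc}\,B\,g_J(l)
\;\approx\; \bigl(5.79\times10^{-9}\ \text{eV/gauss}\bigr)\bigl(10^{3}\ \text{gauss}\bigr)\Bigl(\tfrac43\Bigr)
\;\approx\; 7.7\times10^{-6}\ \text{eV},
\]
which is indeed small compared to the fine-structure splittings, consistent with the weak-field assumption.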
So wow. It took us some effort. But here we are. We have the weak-field Zeeman splitting completely computed.