
# hw9-10.nb

## Homework: Due 9-10-04

Sakurai Chap. 1: 1, 4, 8, 10
## 1.1

There are probably lots of clever answers that are successive applications of variations of

$$[AB, C] = A[B, C] + [A, C]B = A\{B, C\} - \{A, C\}B.$$

On the left-hand side of the identity to be proved,

$$[AB, CD] = ABCD - CDAB,$$

while on the right,

$$-AC\{D, B\} + A\{C, B\}D - C\{D, A\}B + \{C, A\}DB$$
$$= -ACDB - ACBD + ACBD + ABCD - CDAB - CADB + CADB + ACDB$$
$$= -ACDB + ABCD - CDAB + ACDB$$
$$= ABCD - CDAB,$$

which checks.
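The identity is also easy to spot-check numerically. Here's a minimal sketch in Mathematica (the tool used for 1.10 below); `comm` and `anti` are ad hoc helper names, not anything from the assignment, and the matrices are random 4×4's:

```mathematica
(* Check [AB,CD] = -AC{D,B} + A{C,B}D - C{D,A}B + {C,A}DB on random matrices *)
comm[x_, y_] := x.y - y.x;   (* commutator [x, y] *)
anti[x_, y_] := x.y + y.x;   (* anticommutator {x, y} *)
{m1, m2, m3, m4} = Table[RandomReal[{-1, 1}, {4, 4}], {4}];
lhs = comm[m1.m2, m3.m4];
rhs = -m1.m3.anti[m4, m2] + m1.anti[m3, m2].m4 -
      m3.anti[m4, m1].m2 + anti[m3, m1].m4.m2;
Max[Abs[lhs - rhs]]   (* zero to machine precision *)
```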
## 1.4

a)

$$\operatorname{tr}(XY) = \sum_i \langle i|XY|i\rangle = \sum_{i,j} \langle i|X|j\rangle \langle j|Y|i\rangle = \sum_{i,j} \langle j|Y|i\rangle \langle i|X|j\rangle = \sum_j \langle j|YX|j\rangle = \operatorname{tr}(YX)$$

Note that in the third step the matrix elements are just c-numbers and, so, commute.

b) Consider the action of $(XY)^\dagger$ between two arbitrary states:

$$\langle a|(XY)^\dagger|b\rangle = \langle b|XY|a\rangle^* = \Big(\sum_i \langle b|X|i\rangle \langle i|Y|a\rangle\Big)^{\!*} = \sum_i \langle i|X^\dagger|b\rangle \langle a|Y^\dagger|i\rangle = \sum_i \langle a|Y^\dagger|i\rangle \langle i|X^\dagger|b\rangle = \langle a|Y^\dagger X^\dagger|b\rangle,$$

so $(XY)^\dagger = Y^\dagger X^\dagger$.

c) Let $\langle x'|$ and $|x\rangle$ be arbitrary states, and let $|i\rangle$ be eigenstates of $A$ with eigenvalues $a_i$. Then

$$\langle x'|e^{if(A)}|x\rangle = \sum_{i,j} \langle x'|i\rangle \langle i|e^{if(A)}|j\rangle \langle j|x\rangle = \sum_{i,j} \langle x'|i\rangle \langle i|e^{if(a_j)}|j\rangle \langle j|x\rangle = \sum_{i,j} \langle x'|i\rangle\, e^{if(a_j)}\, \delta_{ij}\, \langle j|x\rangle = \sum_j \langle x'|j\rangle\, e^{if(a_j)}\, \langle j|x\rangle,$$

so

$$e^{if(A)} = \sum_j e^{if(a_j)}\, |j\rangle \langle j|.$$

d)

$$\sum_a \psi_a^*(x)\, \psi_a(x') = \sum_a \langle a|x\rangle \langle x'|a\rangle = \sum_a \langle x'|a\rangle \langle a|x\rangle = \langle x'|x\rangle = \delta(x - x')$$
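Parts a) and b) can also be spot-checked numerically; a quick sketch (not part of the original solution) using random complex 3×3 matrices:

```mathematica
(* check tr(XY) = tr(YX) and (XY)^dagger = Y^dagger X^dagger *)
{mx, my} = Table[RandomComplex[{-1 - I, 1 + I}, {3, 3}], {2}];
Chop[Tr[mx.my] - Tr[my.mx]]   (* 0 *)
Chop[ConjugateTranspose[mx.my] -
  ConjugateTranspose[my].ConjugateTranspose[mx]]   (* zero matrix *)
```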

## 1.8

I'll give one example in gory detail in bra-ket notation, repeat that example in matrix notation, and then confirm the remaining results in matrix notation.

Given $S_x = \frac{\hbar}{2}\big(|+\rangle\langle -| + |-\rangle\langle +|\big)$, $S_y = \frac{i\hbar}{2}\big(-|+\rangle\langle -| + |-\rangle\langle +|\big)$, and $S_z = \frac{\hbar}{2}\big(|+\rangle\langle +| - |-\rangle\langle -|\big)$, show that $[S_i, S_j] = i\hbar\, \epsilon_{ijk} S_k$. For the case $[S_x, S_y] = i\hbar S_z$, start with

$$S_x S_y = \frac{i\hbar^2}{4}\, \big(|+\rangle\langle -| + |-\rangle\langle +|\big)\big(-|+\rangle\langle -| + |-\rangle\langle +|\big)$$

$$= \frac{i\hbar^2}{4}\, \Big(\underbrace{-|+\rangle\langle -|+\rangle\langle -|}_{=\,0} + \underbrace{|+\rangle\langle -|-\rangle\langle +|}_{=\,+|+\rangle\langle +|} \underbrace{-\,|-\rangle\langle +|+\rangle\langle -|}_{=\,-|-\rangle\langle -|} + \underbrace{|-\rangle\langle +|-\rangle\langle +|}_{=\,0}\Big)$$

$$= \frac{i\hbar^2}{4}\, \big(|+\rangle\langle +| - |-\rangle\langle -|\big) = \frac{i\hbar}{2}\, S_z.$$

Similarly, reversing the order,

$$S_y S_x = \frac{i\hbar^2}{4}\, \big(-|+\rangle\langle -| + |-\rangle\langle +|\big)\big(|+\rangle\langle -| + |-\rangle\langle +|\big)$$

$$= \frac{i\hbar^2}{4}\, \Big(\underbrace{-|+\rangle\langle -|+\rangle\langle -|}_{=\,0} \underbrace{-\,|+\rangle\langle -|-\rangle\langle +|}_{=\,-|+\rangle\langle +|} + \underbrace{|-\rangle\langle +|+\rangle\langle -|}_{=\,+|-\rangle\langle -|} + \underbrace{|-\rangle\langle +|-\rangle\langle +|}_{=\,0}\Big)$$

$$= \frac{i\hbar^2}{4}\, \big(-|+\rangle\langle +| + |-\rangle\langle -|\big) = -\frac{i\hbar}{2}\, S_z.$$

Combining the two results, $[S_x, S_y] = S_x S_y - S_y S_x = i\hbar S_z$. This same result is much more compact if done in a matrix notation, where

$$S_x = \frac{\hbar}{2}\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \qquad S_y = \frac{i\hbar}{2}\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad S_z = \frac{\hbar}{2}\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}.$$

Then

$$[S_x, S_y] = S_x S_y - S_y S_x = \frac{i\hbar^2}{4}\left[\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} - \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\right] = \frac{i\hbar^2}{4}\left[\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} - \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix}\right] = \frac{i\hbar^2}{2}\begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} = i\hbar S_z.$$

Similarly, it is straightforward to show $[S_x, S_z] = -i\hbar S_y$ and $[S_y, S_z] = i\hbar S_x$. Combining the three results, $[S_i, S_j] = i\hbar\, \epsilon_{ijk} S_k$.

To confirm the first of the "off-diagonal" anticommutator relations,

$$\{S_x, S_y\} = S_x S_y + S_y S_x = \frac{i\hbar^2}{4}\left[\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} + \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\right] = 0,$$
and

$$\{S_x, S_x\} = S_x S_x + S_x S_x = 2\, S_x S_x = \frac{\hbar^2}{2}\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} = \frac{\hbar^2}{2}\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = \frac{\hbar^2}{2}\, I.$$

Combining these results with the others for $S_y$ and $S_z$, the more general result follows:

$$\{S_i, S_j\} = \frac{\hbar^2}{2}\, \delta_{ij}.$$
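Both relations can be confirmed in one shot with Mathematica's built-in Pauli matrices; a quick sketch with $\hbar$ set to 1, so $S_i = \sigma_i/2$:

```mathematica
(* [S_i,S_j] = I eps_ijk S_k and {S_i,S_j} = delta_ij/2, for all i, j *)
s[i_] := PauliMatrix[i]/2;
eps = Normal[LeviCivitaTensor[3]];
Table[s[i].s[j] - s[j].s[i] ==
   I Sum[eps[[i, j, k]] s[k], {k, 3}], {i, 3}, {j, 3}]   (* all True *)
Table[s[i].s[j] + s[j].s[i] ==
   KroneckerDelta[i, j] IdentityMatrix[2]/2, {i, 3}, {j, 3}]   (* all True *)
```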

## 1.10

I did this by hand, but there are tools for this sort of thing... Here is $H$:

```mathematica
h = {{a, a}, {a, -a}};
MatrixForm[h]
```

$$\begin{pmatrix} a & a \\ a & -a \end{pmatrix}$$

This solves for the eigenvalues and eigenvectors:

```mathematica
{evals, evectors} = Eigensystem[h];
evals
MatrixForm /@ evectors
```

$$\big\{-\sqrt{2}\, a,\ \sqrt{2}\, a\big\}$$

$$\left\{\begin{pmatrix} 1 - \sqrt{2} \\ 1 \end{pmatrix},\ \begin{pmatrix} 1 + \sqrt{2} \\ 1 \end{pmatrix}\right\}$$


Test the result:

```mathematica
MatrixForm[Simplify[h.#]] & /@ evectors
```

$$\left\{\begin{pmatrix} (2 - \sqrt{2})\, a \\ -\sqrt{2}\, a \end{pmatrix},\ \begin{pmatrix} (2 + \sqrt{2})\, a \\ \sqrt{2}\, a \end{pmatrix}\right\}$$

i.e., $h$ acting on each eigenvector gives back $-\sqrt{2}\, a$ and $+\sqrt{2}\, a$ times that eigenvector, respectively.
The eigenstates aren't normalized, so:

```mathematica
norms = Simplify[Sqrt[#.#] & /@ evectors]
```

$$\big\{\sqrt{4 - 2\sqrt{2}},\ \sqrt{4 + 2\sqrt{2}}\big\}$$

```mathematica
normvectors = Simplify[evectors/norms];
MatrixForm /@ normvectors
```

$$\left\{\begin{pmatrix} \dfrac{1 - \sqrt{2}}{\sqrt{4 - 2\sqrt{2}}} \\[2ex] \dfrac{1}{\sqrt{4 - 2\sqrt{2}}} \end{pmatrix},\ \begin{pmatrix} \dfrac{1 + \sqrt{2}}{\sqrt{4 + 2\sqrt{2}}} \\[2ex] \dfrac{1}{\sqrt{4 + 2\sqrt{2}}} \end{pmatrix}\right\}$$
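As one more sanity check (not in the original notebook), the normalized eigenvectors should form an orthonormal set, i.e. their Gram matrix should be the identity:

```mathematica
(* rows of normvectors are the normalized eigenvectors of the real
   symmetric matrix h, so this should be the 2x2 identity *)
Simplify[normvectors.Transpose[normvectors]] == IdentityMatrix[2]
(* True *)
```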





## Bransden and Joachain 5.7, 5.11

## 5.7

a) Show i) $[x^n, p] = i\hbar\, n\, x^{n-1}$ and ii) $[x, p^n] = i\hbar\, n\, p^{n-1}$.

i) Use induction. Assume that $[x^n, p] = i\hbar\, n\, x^{n-1}$ is true for $n$; then show that it is also true for $n + 1$. Specifically,

$$[x^{n+1}, p] = [x\, x^n, p] = x\, [x^n, p] + [x, p]\, x^n = x\, \big(i\hbar\, n\, x^{n-1}\big) + i\hbar\, x^n = i\hbar\, (n + 1)\, x^n,$$

where in the third step we used the assumed result and $[x, p] = i\hbar$. This last relation also shows that the hypothesis is true for $n = 1$; it follows that it is true for all $n \geq 1$.

ii) Similarly,

$$[x, p^{n+1}] = [x, p\, p^n] = p\, [x, p^n] + [x, p]\, p^n = p\, \big(i\hbar\, n\, p^{n-1}\big) + i\hbar\, p^n = i\hbar\, (n + 1)\, p^n,$$

which shows the inductive step, and $[x, p] = i\hbar$ establishes the result for $n = 1$.
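The induction result is also easy to check in position space, where $p$ acts as $-i\hbar\,\partial_x$ on a test function; a sketch with placeholder symbols `hb` and `psi`, and the arbitrary choice $n = 3$:

```mathematica
(* [x^n, p] psi(x) = I hb n x^(n-1) psi(x), checked for n = 3 *)
pop[w_] := -I hb D[w, x];   (* p = -I hb d/dx in position space *)
n = 3;
Simplify[x^n pop[psi[x]] - pop[x^n psi[x]] ==
  I hb n x^(n - 1) psi[x]]   (* True *)
```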
b i) Using a power series expansion for $f$,

$$[f(x), p] = \Big[\sum_n f_n x^n,\ p\Big] = \sum_n f_n\, [x^n, p] = \sum_n f_n\, i\hbar\, n\, x^{n-1} = i\hbar\, \frac{\partial}{\partial x} \sum_n f_n x^n = i\hbar\, \frac{\partial f(x)}{\partial x}.$$
b ii) A similar power series substitution yields

$$[x, g(p)] = \Big[x,\ \sum_n g_n p^n\Big] = \sum_n g_n\, [x, p^n] = \sum_n g_n\, i\hbar\, n\, p^{n-1} = i\hbar\, \frac{\partial g(p)}{\partial p}.$$
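The mirror-image check works in momentum space, where $x$ acts as $+i\hbar\,\partial_p$; again `hb`, `g`, and `phi` are placeholder symbols:

```mathematica
(* [x, g(p)] phi(p) = I hb g'(p) phi(p) *)
xop[w_] := I hb D[w, p];   (* x = +I hb d/dp in momentum space *)
Simplify[xop[g[p] phi[p]] - g[p] xop[phi[p]] ==
  I hb g'[p] phi[p]]   (* True *)
```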

## 5.11

Consider

$$H = \omega \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{pmatrix}, \qquad A = \lambda \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 2 \end{pmatrix}, \qquad B = \mu \begin{pmatrix} 2 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}.$$

a) The eigenvalues and normalized eigenvectors of $A$ are

$$\left\{\lambda,\ \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}\right\}, \qquad \left\{-\lambda,\ \frac{1}{\sqrt{2}} \begin{pmatrix} 1 \\ -1 \\ 0 \end{pmatrix}\right\}, \qquad \left\{2\lambda,\ \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}\right\},$$

and for $B$,

$$\left\{\mu,\ \frac{1}{\sqrt{2}} \begin{pmatrix} 0 \\ 1 \\ 1 \end{pmatrix}\right\}, \qquad \left\{-\mu,\ \frac{1}{\sqrt{2}} \begin{pmatrix} 0 \\ 1 \\ -1 \end{pmatrix}\right\}, \qquad \left\{2\mu,\ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}\right\}.$$

Note that for both $A$ and $B$, I've adopted a common phase convention: the first non-zero component of each eigenvector is chosen to be real and positive.
The eigenvalues $E_i$ and eigenvectors $|u_i\rangle$ of $H$ are

$$\left\{\omega,\ \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}\right\}, \qquad \left\{2\omega,\ \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}\right\}, \qquad \left\{2\omega,\ \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}\right\}.$$

Note that the eigenvalue $E = 2\omega$ is doubly degenerate. An arbitrary state $|u\rangle$ can be written as a linear combination of energy eigenstates,

$$|u\rangle = c_1 |u_1\rangle + c_2 |u_2\rangle + c_3 |u_3\rangle.$$

b i) If $|u\rangle$ is normalized to unity,

$$1 = \langle u|u\rangle = \big(c_1^* \langle u_1| + c_2^* \langle u_2| + c_3^* \langle u_3|\big)\big(c_1 |u_1\rangle + c_2 |u_2\rangle + c_3 |u_3\rangle\big) = |c_1|^2 + |c_2|^2 + |c_3|^2 = \sum_i |c_i|^2.$$

ii) The expectation value is $\langle H \rangle_u = \langle u|H|u\rangle = \omega\, \big(|c_1|^2 + 2\,|c_2|^2 + 2\,|c_3|^2\big)$. For $\langle A \rangle$ and $\langle B \rangle$ a bit more algebra is required:

$$\langle A \rangle = \lambda\, \begin{pmatrix} c_1^* & c_2^* & c_3^* \end{pmatrix} \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 2 \end{pmatrix} \begin{pmatrix} c_1 \\ c_2 \\ c_3 \end{pmatrix} = \lambda\, \big(c_1 c_2^* + c_2 c_1^* + 2\,|c_3|^2\big),$$

$$\langle B \rangle = \mu\, \begin{pmatrix} c_1^* & c_2^* & c_3^* \end{pmatrix} \begin{pmatrix} 2 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix} \begin{pmatrix} c_1 \\ c_2 \\ c_3 \end{pmatrix} = \mu\, \big(c_3 c_2^* + c_2 c_3^* + 2\,|c_1|^2\big).$$

iii) Possible measurements of the energy are the eigenvalues of $H$: $\omega$ and $2\omega$. After a measurement of the energy, the system will be in an eigenstate corresponding to the measured eigenvalue, i.e., the system will be projected onto the subspace corresponding to that eigenvalue. For $\omega$ there is a single eigenstate, so after the measurement the state will be

$$|u'\rangle = N\, P_\omega |u\rangle = N\, P_1 |u\rangle = N\, |u_1\rangle \langle u_1|u\rangle = N\, |u_1\rangle \langle u_1| \big(c_1 |u_1\rangle + c_2 |u_2\rangle + c_3 |u_3\rangle\big) = N\, c_1 |u_1\rangle,$$

where $N$ is a normalization constant, and after normalization $|u'\rangle = |u_1\rangle$.

Similarly, the projection onto the $E = 2\omega$ subspace is $P_{2\omega} = P_2 + P_3 = |u_2\rangle \langle u_2| + |u_3\rangle \langle u_3|$, and so after a measurement yielding $2\omega$,

$$|u'\rangle = N\, P_{2\omega} |u\rangle = \frac{1}{\sqrt{|c_2|^2 + |c_3|^2}}\, \big(c_2 |u_2\rangle + c_3 |u_3\rangle\big).$$
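The hand results for 5.11 can be cross-checked with the same `Eigensystem` approach used in 1.10; a sketch with `w`, `l`, `m` standing in for $\omega$, $\lambda$, $\mu$ (Mathematica may order and scale the eigenvectors differently from the phase convention above):

```mathematica
matH = w {{1, 0, 0}, {0, 2, 0}, {0, 0, 2}};
matA = l {{0, 1, 0}, {1, 0, 0}, {0, 0, 2}};
matB = m {{2, 0, 0}, {0, 0, 1}, {0, 1, 0}};
Eigensystem[matA]   (* eigenvalues l, -l, 2l; vectors ~ (1,1,0), (1,-1,0), (0,0,1) *)
Eigensystem[matB]   (* eigenvalues m, -m, 2m; vectors ~ (0,1,1), (0,1,-1), (1,0,0) *)
(* expectation value <A> for u = (c1, c2, c3): *)
u = {c1, c2, c3};
Simplify[Conjugate[u].matA.u]   (* reproduces l (c1 c2* + c2 c1* + 2 |c3|^2) *)
```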