MIT 2.717: Image quality metrics
Linear inversion model

[Channel diagram: object field f → propagation through H ("physical hardware attributes", the measurement channel) → detection g]

Inversion problem: determine f, given the measurement g = H f

noise-to-signal ratio: NSR = (noise variance) / (average signal power) = σ²/1 = σ² (average signal power normalized to 1)
[Same channel diagram: f → propagation H → detection g]

C = (1/2) Σ_{k=1}^{n} ln(1 + µ_k²/σ²),  µ_k: eigenvalues of H
The significance of eigenvalues

C = (1/2) Σ_{k=1}^{n} ln(1 + µ_k²/σ²): the rank of the measurement (aka how many dimensions the measurement is worth)

[Plot: number of retained dimensions (n, n−1, …, 1, 0) vs. the noise variance σ², swept across the eigenvalues µ_n² ≤ µ_{n−1}² ≤ … ≤ µ_2² ≤ µ_1² of H]
Precision of measurement

C = (1/2) Σ_{k=1}^{n} ln(1 + µ_k²/σ²), with the noise floor between two eigenvalues: µ_t² < σ² < µ_{t−1}²

C = … + ln(1 + µ_{t−2}²/σ²) + ln(1 + µ_{t−1}²/σ²) + ln(1 + µ_t²/σ²) + …

– the term in µ_{t−2} ≈ precision of the (t−2)-th measurement
– the last term is ≤ 1, since µ_t² < σ²

E.g. 0.5470839348: at finite precision, only the leading digits of such a measured value are trustworthy.
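As a sanity check, the per-eigenvalue terms can be tabulated numerically. This is a minimal sketch; the eigenvalue spectrum and noise floor below are assumed for illustration:

```python
import math

# Hypothetical eigenvalue spectrum of H (mu_k^2, sorted descending), with the
# noise variance between the last two eigenvalues: mu_4^2 < sigma^2 < mu_3^2.
mu_sq = [1e4, 1e2, 1.0, 1e-2]
sigma_sq = 0.1

# Per-eigenvalue capacity terms (1/2) ln(1 + mu_k^2 / sigma^2), in nats.
terms = [0.5 * math.log(1.0 + m / sigma_sq) for m in mu_sq]
C = sum(terms)

# Eigenvalues above the noise floor carry many nats (high-precision measurements);
# the one eigenvalue below the floor contributes less than (1/2) ln 2.
assert terms[0] > terms[1] > terms[2] > terms[3]
assert terms[3] < 0.5 * math.log(2.0)
```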
Formal definition of cross-entropy (1)
Formal definition of cross-entropy (2)

• Fair coin: p(H) = 1/2; p(T) = 1/2

  Entropy = −(1/2) log₂(1/2) − (1/2) log₂(1/2) = 1 bit
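The coin computation generalizes directly to any discrete distribution; a small sketch (the biased-coin probabilities are made up for comparison):

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = entropy_bits([0.5, 0.5])     # fair coin: exactly 1 bit
biased = entropy_bits([0.9, 0.1])   # less uncertainty -> fewer bits
assert fair == 1.0
assert biased < fair
```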
Formal definition of cross-entropy (3)

Joint Entropy:
log₂[how many possible states does the combined variable, obtained from the Cartesian product of the two variables, have?]
Formal definition of cross-entropy (4)

Conditional Entropy:
log₂[how many possible states does the combined variable have, given the actual state of one of the two variables?]
Formal definition of cross-entropy (5)

[Channel diagram again: object field f → propagation H → detection g]
Formal definition of cross-entropy (6)
uncertainty added due to noise
Cond. Entropy(F | G )
representation by
Seth Lloyd, 2.100
Formal definition of cross-entropy (7)

[Venn diagram: circles Entropy(F) and Entropy(G); their union is Joint Entropy(F, G)]
Formal definition of cross-entropy (8)

[Diagram: information source F (object) → Physical Channel (transform) → information receiver G (measurement), with a corruption source (noise) entering the channel]
Entropy & Differential Entropy

• Discrete objects (can take values among a discrete set of states)
  – definition of entropy: Entropy = −Σ_k p(x_k) log₂ p(x_k)
  – unit: 1 bit (= entropy value of a YES/NO question with 50% uncertainty)
• Continuous objects (can take values from among a continuum)
  – definition of differential entropy: h = −∫ p(x) log₂ p(x) dx
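For a concrete check of the continuous case: a zero-mean Gaussian with variance σ² has differential entropy ½ log₂(2πeσ²) bits, a standard closed form which direct numerical integration of −∫p log₂ p reproduces (the σ value below is arbitrary):

```python
import math

def gaussian_pdf(x, sigma):
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def differential_entropy_bits(sigma, lo=-50.0, hi=50.0, n=200_000):
    """Midpoint-rule integration of -p(x) log2 p(x) for a zero-mean Gaussian."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        p = gaussian_pdf(x, sigma)
        if p > 0:
            total -= p * math.log2(p) * dx
    return total

sigma = 2.0
closed_form = 0.5 * math.log2(2 * math.pi * math.e * sigma * sigma)
numeric = differential_entropy_bits(sigma)
assert abs(numeric - closed_form) < 1e-4
```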
Image Mutual Information (IMI)

[Channel diagram: object field f → propagation H → detection g]

Assumptions: (a) F has Gaussian statistics
             (b) white additive Gaussian noise (waGn),
                 i.e. g = H f + w, where w is a Gaussian random vector with a diagonal correlation matrix

Then C(F, G) = (1/2) Σ_{k=1}^{n} ln(1 + µ_k²/σ²),  µ_k: eigenvalues of H
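Under those assumptions the IMI reduces to a sum over the eigenvalues of H; a minimal sketch, with a made-up 2×2 symmetric channel:

```python
import numpy as np

def imi_nats(H, sigma_sq):
    """C(F,G) = (1/2) sum_k ln(1 + mu_k^2 / sigma^2) for a symmetric channel H,
    assuming Gaussian F and white additive Gaussian noise of variance sigma^2."""
    mu = np.linalg.eigvalsh(H)                 # eigenvalues of symmetric H
    return 0.5 * np.sum(np.log1p(mu**2 / sigma_sq))

H = np.array([[1.0, 0.5],
              [0.5, 1.0]])                     # example channel (assumed)
C = imi_nats(H, sigma_sq=0.01)                 # eigenvalues of H are 1.5 and 0.5
```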
Mutual information & degrees of freedom

[Plot: rank of the measurement (n, n−1, …, 1, 0) and mutual information vs. noise variance σ², swept across the eigenvalues µ_n² ≤ … ≤ µ_2² ≤ µ_1² of H]

C = (1/2) Σ_{k=1}^{n} ln(1 + µ_k²/σ²)

As noise increases:
• one rank of H is lost whenever σ² overcomes a new eigenvalue
• the remaining ranks lose precision
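Both bullet points can be watched numerically; the eigenvalue spectrum below is assumed for illustration:

```python
import math

mu_sq = [9.0, 4.0, 1.0, 0.25]   # hypothetical eigenvalues mu_k^2 of H

def capacity(sigma_sq):
    return 0.5 * sum(math.log(1.0 + m / sigma_sq) for m in mu_sq)

def effective_rank(sigma_sq):
    # a dimension is lost once sigma^2 overcomes its eigenvalue
    return sum(1 for m in mu_sq if m > sigma_sq)

# Sweeping the noise up: ranks drop one by one, and capacity falls throughout.
assert [effective_rank(s2) for s2 in (0.1, 0.5, 2.0, 5.0, 16.0)] == [4, 3, 2, 1, 0]
assert capacity(0.1) > capacity(0.5) > capacity(2.0) > capacity(16.0)
```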
Example: two-point resolution

Finite-NA imaging system, unit magnification:

g_A = f_A + s f_B
g_B = s f_A + f_B

s = sinc²(x̃), where x̃ is the normalized separation between the two point sources A and B
IMI for two-point resolution problem

H = [ 1  s ]      µ₁ = 1 + s
    [ s  1 ]      µ₂ = 1 − s      det(H) = 1 − s²

H⁻¹ = 1/(1 − s²) [  1  −s ]
                 [ −s   1 ]

C(F, G) = (1/2) ln(1 + (1 − s)²/σ²) + (1/2) ln(1 + (1 + s)²/σ²)
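A quick numerical check of the eigenvalues, the determinant, and the resulting IMI (the values of s and σ² are chosen arbitrarily):

```python
import math
import numpy as np

def two_point_imi(s, sigma_sq):
    """C(F,G) for H = [[1, s], [s, 1]] under white additive Gaussian noise."""
    return (0.5 * math.log(1.0 + (1.0 - s)**2 / sigma_sq)
            + 0.5 * math.log(1.0 + (1.0 + s)**2 / sigma_sq))

s = 0.3
H = np.array([[1.0, s], [s, 1.0]])
assert np.allclose(np.sort(np.linalg.eigvalsh(H)), [1.0 - s, 1.0 + s])  # mu = 1 +/- s
assert math.isclose(np.linalg.det(H), 1.0 - s**2)                       # det = 1 - s^2
C = two_point_imi(s, sigma_sq=0.01)
```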
IMI vs source separation

[Plot: C(F, G) vs. source separation, for several values of (SNR) = 1/σ²]
IMI for rectangular matrices (1)

underdetermined: more unknowns than measurements
overdetermined: more measurements than unknowns
IMI for rectangular matrices (2)

HᵀH = square matrix

recall the pseudo-inverse: f̂ = (HᵀH)⁻¹ Hᵀ g

eigenvalues(HᵀH) ≡ squared singular values of H
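A sketch verifying both statements with a random overdetermined H (the dimensions and data are arbitrary, and the measurement is noiseless so the estimate is exact):

```python
import numpy as np

rng = np.random.default_rng(0)
H = rng.standard_normal((6, 3))      # overdetermined: 6 measurements, 3 unknowns
f = rng.standard_normal(3)
g = H @ f                             # noiseless measurement for the check

# Pseudo-inverse via the normal equations: f_hat = (H^T H)^{-1} H^T g
f_hat = np.linalg.solve(H.T @ H, H.T @ g)
assert np.allclose(f_hat, f)

# Eigenvalues of H^T H are the squared singular values of H.
eig = np.sort(np.linalg.eigvalsh(H.T @ H))
sv_sq = np.sort(np.linalg.svd(H, compute_uv=False)**2)
assert np.allclose(eig, sv_sq)
```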
IMI for rectangular matrices (3)

[Channel diagram: object field f → propagation H → detection g, under/over-determined]

C = (1/2) Σ_{k=1}^{n} ln(1 + τ_k/σ²),  τ_k: eigenvalues of HᵀH (the squared singular values of H)
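Since only the singular values enter, one routine covers both under- and overdetermined channels; a minimal sketch (the matrices below are made up):

```python
import numpy as np

def imi_rect(H, sigma_sq):
    """C = (1/2) sum_k ln(1 + tau_k / sigma^2), with tau_k the eigenvalues of
    H^T H, i.e. the squared singular values of H; works for rectangular H."""
    tau = np.linalg.svd(H, compute_uv=False)**2
    return 0.5 * np.sum(np.log1p(tau / sigma_sq))

H_over = np.ones((4, 2)) + np.eye(4)[:, :2]   # 4 measurements, 2 unknowns
H_under = H_over.T                             # 2 measurements, 4 unknowns
# H and H^T share their nonzero singular values, so here the IMI is the same.
assert np.isclose(imi_rect(H_over, 0.1), imi_rect(H_under, 0.1))
```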
Confocal microscope

[Diagram: object, virtual slice, beam splitter, pinhole, intensity detector]

Small pinhole: good depth resolution, poor light efficiency
Large pinhole: poor depth resolution, good light efficiency
Depth “resolution” vs. noise

Object structure: mutually incoherent point sources along the optical axis (sampling distance shown); NA = 0.2

Imaging method: CFM, object scanning along the axis; intensity measurements in correspondence with the object samples
Depth “resolution” vs. noise & pinhole size
Other image quality metrics

• Mean Square Error (MSE) between object and image

  E = Σ_k (f_k − f̂_k)²,   f_k: object samples,  f̂: result of inversion

  – e.g. the pseudoinverse minimizes the MSE in an overdetermined problem
  – obvious problem: most of the time, we don’t know what f is!
  – more when we deal with Wiener filters and regularization
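A sketch of the metric and of the pseudoinverse property, stated in the measurement domain where it actually applies (since f itself is unknown in practice); all data below are synthetic:

```python
import numpy as np

def mse(f, f_hat):
    """E = sum_k (f_k - f_hat_k)^2 between object samples and the inversion result."""
    return float(np.sum((np.asarray(f) - np.asarray(f_hat))**2))

rng = np.random.default_rng(1)
H = rng.standard_normal((8, 3))               # overdetermined channel (assumed)
f = rng.standard_normal(3)                    # true object: unknown in practice!
g = H @ f + 0.05 * rng.standard_normal(8)     # noisy measurement

f_pinv = np.linalg.pinv(H) @ g                # pseudoinverse (least-squares) estimate
f_other = f_pinv + 0.1                        # any competing estimate

# The pseudoinverse minimizes the residual |g - H f_hat|^2 over all f_hat,
# which is the MSE we can actually evaluate when f is unavailable.
assert mse(g, H @ f_pinv) <= mse(g, H @ f_other)
```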
Receiver Operating Characteristic