
Image Quality Metrics

• Image quality metrics
• Mutual information (cross-entropy) metric
  – Intuitive definition
  – Rigorous definition using entropy
  – Example: two-point resolution problem
  – Example: confocal microscopy
• Square error metric
• Receiver Operator Characteristic (ROC)
• Heterodyne detection

Linear inversion model
[Diagram: object field f → propagation → detection → measurement g;
 the “physical attributes” of propagation and the “hardware attributes”
 of detection together make up the channel (measurement) operator H]
Inversion problem: determine f, given the measurement g = Hf

noise-to-signal ratio (NSR) = (noise variance)/(average signal power) = σ²/1 = σ²

normalizing the signal power to 1


Mutual information (cross-entropy)
[Diagram: object field f → propagation/detection channel H → measurement g]

C = (1/2) Σ_{k=1}^{n} ln(1 + µ_k²/σ²),    µ_k: eigenvalues of H

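A minimal numerical sketch of this formula (not from the slides; assumes numpy, and the matrix H and noise variance σ² below are made-up illustrative values):

```python
import numpy as np

def image_mutual_information(H, sigma2):
    """C = 1/2 * sum_k ln(1 + mu_k^2 / sigma^2), mu_k: eigenvalues of H."""
    mu = np.linalg.eigvals(H)                      # eigenvalues of H
    return 0.5 * np.sum(np.log(1.0 + np.abs(mu)**2 / sigma2))

# Illustrative 2x2 system with NSR sigma^2 = 0.01
H = np.array([[1.0, 0.2],
              [0.2, 1.0]])
print(image_mutual_information(H, 0.01))           # C in nats
```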
The significance of eigenvalues
(aka how many dimensions the measurement is worth)

rank of measurement:    C = (1/2) Σ_{k=1}^{n} ln(1 + µ_k²/σ²)

[Plot: measurement rank (0, 1, …, n−1, n) vs. the eigenvalues of H,
 µ_n² ≤ … ≤ µ_2² ≤ µ_1², with the noise level σ² marked on the
 eigenvalue axis]
Precision of measurement
C = (1/2) Σ_{k=1}^{n} ln(1 + µ_k²/σ²),    noise floor: µ_t² < σ² < µ_{t−1}²

… + ln(1 + µ_{t−2}²/σ²) + ln(1 + µ_{t−1}²/σ²) + ln(1 + µ_t²/σ²) + …
      ≈ precision of the         ≤ 1                ≈ 0
        (t−2)th measurement

E.g. 0.5470839348: if σ ≈ 10⁻⁵, the digits beyond the fifth are worthless.

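A small sketch of this noise-floor effect (assuming numpy; the eigenvalue spectrum and σ below are made-up):

```python
import numpy as np

mu2 = np.array([1.0, 1e-2, 1e-6, 1e-12])   # hypothetical mu_k^2, descending
sigma2 = (1e-5)**2                          # noise floor, sigma ~ 1e-5

terms = 0.5 * np.log(1.0 + mu2 / sigma2)   # per-eigenvalue contribution (nats)
for m2, t in zip(mu2, terms):
    # nats / ln(10) ~ reliable decimal digits carried by this eigenvalue
    print(f"mu^2 = {m2:.0e}:  {t:6.2f} nats  (~{t/np.log(10):.1f} digits)")
# mu^2 = 1 with sigma ~ 1e-5 yields ~5 digits; eigenvalues below the
# noise floor sigma^2 contribute essentially nothing.
```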
Formal definition of cross-entropy (1)

Entropy in thermodynamics (discrete systems):


• log₂[how many are the possible states of the system?]

E.g. two-state system: fair coin, outcome = heads (H) or tails (T)

Entropy = log₂2 = 1

Unfair coin: it seems more reasonable to “weigh” the two states
according to their frequencies of occurrence (i.e., probabilities)

Entropy = − Σ_states p(state) log₂ p(state)

Formal definition of cross-entropy (2)
• Fair coin: p(H) = 1/2, p(T) = 1/2

  Entropy = −(1/2) log₂(1/2) − (1/2) log₂(1/2) = 1 bit

• Unfair coin: p(H) = 1/4, p(T) = 3/4

  Entropy = −(1/4) log₂(1/4) − (3/4) log₂(3/4) = 0.81 bits

Maximum entropy ⇔ Maximum uncertainty

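These two coin entropies in code (a minimal sketch, assuming numpy):

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy of a discrete distribution, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                         # convention: 0 * log 0 = 0
    return -np.sum(p * np.log2(p))

print(entropy_bits([0.5, 0.5]))          # fair coin   -> 1.0 bit
print(entropy_bits([0.25, 0.75]))        # unfair coin -> ~0.81 bits
```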
Formal definition of cross-entropy (3)
Joint Entropy
log₂[how many are the possible states of a combined variable
obtained from the Cartesian product of two variables?]

Joint Entropy(X, Y) = − Σ_{x∈X} Σ_{y∈Y} p(x, y) log₂ p(x, y)

E.g. Joint Entropy(F, G) = ?

[Diagram: object field f → propagation/detection channel H → measurement g]
Formal definition of cross-entropy (4)
Conditional Entropy
log₂[how many are the possible states of a combined variable
given the actual state of one of the two variables?]

Cond. Entropy(Y | X) = − Σ_{x∈X} Σ_{y∈Y} p(x, y) log₂ p(y | x)

E.g. Cond. Entropy(G | F) = ?

[Diagram: object field f → propagation/detection channel H → measurement g]
Formal definition of cross-entropy (5)
[Diagram: object field f → propagation/detection channel H → measurement g]

Noise adds uncertainty to the measurement wrt the object
⇔ eliminates information from the measurement wrt the object

Formal definition of cross-entropy (6)
[Venn-style diagram (representation by Seth Lloyd, 2.100):
 Entropy(F) = information contained in the object;
 Entropy(G) = information contained in the measurement;
 their overlap is C(F, G), the cross-entropy (aka mutual information);
 Cond. Entropy(F | G) = uncertainty added due to noise;
 Cond. Entropy(G | F) = information eliminated due to noise]

Formal definition of cross-entropy (7)
[Diagram: Joint Entropy(F, G) spans Cond. Entropy(F | G), C(F, G), and
 Cond. Entropy(G | F); Entropy(F) and Entropy(G) are the two
 overlapping sub-regions]

Formal definition of cross-entropy (8)

[Diagram: information source F (object) → Physical Channel (transform)
 → information receiver G (measurement), with a corruption source
 (Noise) acting on the channel]

C(F, G) = Entropy(F) − Cond. Entropy(F | G)
        = Entropy(G) − Cond. Entropy(G | F)
        = Entropy(F) + Entropy(G) − Joint Entropy(F, G)

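A small sketch checking these identities on a made-up joint distribution p(f, g) (assuming numpy; the channel below is hypothetical):

```python
import numpy as np

def H_bits(p):
    """Shannon entropy of a (possibly joint) distribution, in bits."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical joint distribution of a noisy binary channel
p_fg = np.array([[0.4, 0.1],     # rows: states of F
                 [0.1, 0.4]])    # cols: states of G

H_F  = H_bits(p_fg.sum(axis=1))  # Entropy(F)
H_G  = H_bits(p_fg.sum(axis=0))  # Entropy(G)
H_FG = H_bits(p_fg)              # Joint Entropy(F, G)

print(H_F + H_G - H_FG)          # C(F, G) ~ 0.278 bits
```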
Entropy & Differential Entropy
• Discrete objects (can take values among a discrete set of states)
– definition of entropy
Entropy = − Σ_k p(x_k) log₂ p(x_k)
– unit: 1 bit (=entropy value of a YES/NO question with 50%
uncertainty)
• Continuous objects (can take values from among a continuum)
– definition of differential entropy

Diff. Entropy = − ∫_{Ω_X} p(x) ln p(x) dx
– unit: 1 nat (=diff. entropy value of a significant digit in the
representation of a random number, divided by ln10)

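A sketch checking this definition numerically for a Gaussian, whose differential entropy has the known closed form ½ ln(2πe s²) nats (assuming numpy; s is a made-up value):

```python
import numpy as np

s = 2.0                                       # standard deviation
x = np.linspace(-10 * s, 10 * s, 200001)
p = np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

dx = x[1] - x[0]
numeric = -np.sum(p * np.log(p)) * dx         # -integral of p ln p dx
closed  = 0.5 * np.log(2 * np.pi * np.e * s**2)
print(numeric, closed)                        # both ~2.112 nats
```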
Image Mutual Information (IMI)
[Diagram: object field f → propagation/detection channel H → measurement g]
Assumptions: (a) F has Gaussian statistics
             (b) white additive Gaussian noise (waGn), i.e. g = Hf + w,
                 where W is a Gaussian random vector with diagonal
                 correlation matrix

Then C(F, G) = (1/2) Σ_{k=1}^{n} ln(1 + µ_k²/σ²),    µ_k: eigenvalues of H
Mutual information & degrees of freedom

[Plot: measurement rank (0, 1, …, n−1, n) vs. the eigenvalues of H,
 µ_n² ≤ … ≤ µ_2² ≤ µ_1², with the noise level σ² marked; below it,
 mutual information C vs. σ²]

C = (1/2) Σ_{k=1}^{n} ln(1 + µ_k²/σ²)

As noise increases:
• one rank of H is lost whenever σ² overcomes a new eigenvalue
• the remaining ranks lose precision
Example: two-point resolution
Finite-NA imaging system, unit magnification

[Diagram: two point-sources f_A, f_B (object), separated by x, imaged
 onto two point-detectors measuring intensities g_A, g_B at image
 points Ã, B̃]

Classical view: [plot of the noiseless emitted intensities and the
 resulting intensity distributions g_A, g_B at the detector plane]
Cross-leaking power

g_A = f_A + s·f_B
g_B = s·f_A + f_B

s = sinc²(x̃)

[Plot: overlapping point-spread functions centered at image points Ã and B̃]
IMI for two-point resolution problem

H = [ 1  s ]      eigenvalues: µ₁ = 1 + s,  µ₂ = 1 − s
    [ s  1 ]      det(H) = 1 − s²

H⁻¹ = 1/(1 − s²) [  1  −s ]
                 [ −s   1 ]

C(F, G) = (1/2) ln(1 + (1 − s)²/σ²) + (1/2) ln(1 + (1 + s)²/σ²)

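A sketch of how this IMI behaves as the cross-leak s grows (assuming numpy; the σ² value is made-up):

```python
import numpy as np

def C_two_point(s, sigma2):
    """IMI of the cross-leak system H = [[1, s], [s, 1]]."""
    return 0.5 * (np.log(1 + (1 - s)**2 / sigma2) +
                  np.log(1 + (1 + s)**2 / sigma2))

sigma2 = 1e-2                        # SNR = 1/sigma^2 = 100
for s in [0.0, 0.5, 0.9, 0.99]:      # s -> 1: the two sources merge
    print(f"s = {s:4.2f}:  C = {C_two_point(s, sigma2):5.2f} nats")
# C drops as s -> 1 because the eigenvalue mu_2 = 1 - s sinks below
# the noise floor, losing one of the two measurement dimensions.
```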
IMI vs source separation

[Plot: C(F, G) vs. source separation (from s → 1 to s → 0), for a
 given SNR = 1/σ²; C falls as s → 1 since µ₂ = 1 − s vanishes]
IMI for rectangular matrices (1)
[Diagram: H drawn as a wide matrix (underdetermined: more unknowns
 than measurements) or as a tall matrix (overdetermined: more
 measurements than unknowns)]

Eigenvalues cannot be computed, but instead we compute the singular
values of the rectangular matrix.

IMI for rectangular matrices (2)
HᵀH = square matrix

Recall the pseudo-inverse:  f̂ = (HᵀH)⁻¹ Hᵀ g
(the inversion operation associated with the rank of H)

eigenvalues(HᵀH) ≡ [singular values(H)]²
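A minimal sketch of the pseudo-inverse and of this eigenvalue/singular-value relation (assuming numpy; H and the object values are made-up):

```python
import numpy as np

# Overdetermined system: 4 measurements, 2 unknowns
H = np.array([[1.0, 0.3],
              [0.2, 1.0],
              [0.5, 0.5],
              [1.0, 0.0]])
g = H @ np.array([2.0, -1.0])              # noiseless measurement

f_hat = np.linalg.pinv(H) @ g              # pseudo-inverse (H^T H)^-1 H^T g
tau   = np.linalg.eigvalsh(H.T @ H)        # eigenvalues of H^T H
sv    = np.linalg.svd(H, compute_uv=False) # singular values of H
print(f_hat)                               # recovers [2, -1]
print(np.sort(tau), np.sort(sv**2))        # eigenvalues(H^T H) = sv^2
```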
IMI for rectangular matrices (3)
[Diagram: object field f → propagation/detection channel H → measurement g,
 with H under- or over-determined]

C = (1/2) Σ_{k=1}^{n} ln(1 + τ_k²/σ²),    τ_k: singular values of H

Confocal microscope

[Diagram: confocal microscope: a beam splitter illuminates the object;
 a pinhole in front of the intensity detector selects a virtual slice]

Small pinhole: better depth resolution, lower light efficiency
Large pinhole: worse depth resolution, higher light efficiency

Depth “resolution” vs. noise
Object structure: point sources, mutually incoherent, spaced along the
optical axis at a given sampling distance; NA = 0.2

Imaging method: CFM, scanning the object along the axial direction and
recording intensity measurements in correspondence with the sources

[Diagram: object scanning geometry]
Depth “resolution” vs. noise & pinhole size

[Plot: depth resolution (units: Rayleigh distance) vs. noise and
 pinhole size]
IMI summary
• It quantifies the number of possible states of the object that the
imaging system can successfully discern; this includes
– the rank of the system, i.e. the number of object dimensions that
the system can map
– the precision available at each rank, i.e. how many significant
digits can be reliably measured at each available dimension
• An alternative interpretation of IMI is the game of “20 questions:” how
many questions about the object can be answered reliably based on the
image information?
• IMI is intricately linked to image exploitation for applications, e.g.
medical diagnosis, target detection & identification, etc.
• Unfortunately, it can be computed in closed form only for additive
Gaussian statistics of both object and image; other more realistic
models are usually intractable

Other image quality metrics
• Mean Square Error (MSE) between object and image

  E = Σ_k (f_k − f̂_k)²,    f̂ = result of inversion,  f_k: object samples

  – e.g. the pseudo-inverse minimizes the MSE in an overdetermined problem
  – obvious problem: most of the time, we don’t know what f is!
  – more when we deal with Wiener filters and regularization
  – a small numerical sketch follows this slide

• Receiver Operator Characteristic


– measures the performance of a cognitive system (human or
computer program) in a detection or estimation task based on the
image data

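The MSE sketch referenced above (assuming numpy; H, f, and the noise level are made-up illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)
H = np.array([[1.0, 0.3],
              [0.2, 1.0],
              [0.5, 0.5]])
f = np.array([2.0, -1.0])                # true object (unknown in practice!)
g = H @ f + 0.05 * rng.normal(size=3)    # noisy overdetermined measurement

f_hat = np.linalg.pinv(H) @ g            # pseudo-inverse estimate
E = np.sum((f - f_hat)**2)               # square-error metric
print(f_hat, E)
```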
Receiver Operator Characteristic

Target detection task

Example: medical diagnosis
• H0 (null hypothesis) = no tumor
• H1 = tumor

TP = true positive (i.e. correct identification of tumor)
FP = false positive (aka false alarm)

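A sketch of how an ROC curve is traced, sweeping a decision threshold over hypothetical detection scores (assuming numpy; the score distributions below are made-up):

```python
import numpy as np

rng = np.random.default_rng(0)
scores_h0 = rng.normal(0.0, 1.0, 5000)   # scores under H0 (no tumor)
scores_h1 = rng.normal(1.5, 1.0, 5000)   # scores under H1 (tumor)

# Each threshold gives one (FP, TP) point on the ROC curve
for thr in np.linspace(-2.0, 4.0, 7):
    fp = np.mean(scores_h0 > thr)        # false-positive (false-alarm) rate
    tp = np.mean(scores_h1 > thr)        # true-positive rate
    print(f"thr = {thr:5.2f}:  FP = {fp:.3f}  TP = {tp:.3f}")
```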
