
BILKO study

Skewness
Skewness is a measure of symmetry, or more precisely, the lack of symmetry. A distribution, or data set, is symmetric if it looks the same to the left and right of the center point. The tapering sides of a distribution are called tails.
Negative skew: the left tail is longer; the mass of the distribution is concentrated on the right of the figure. The distribution is said to be left-skewed, left-tailed, or skewed to the left.
Positive skew: the right tail is longer; the mass of the distribution is concentrated on the left of the figure. The distribution is said to be right-skewed, right-tailed, or skewed to the right.
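The two cases above can be checked numerically with the third standardized moment, the usual sample skewness statistic. The sketch below is a minimal illustration (the function name `skewness` and the example data are my own, not from the text): a symmetric sample scores near zero, a long right tail gives a positive value.

```python
import numpy as np

def skewness(x):
    """Sample skewness: the third standardized moment.

    Positive -> longer right tail (skewed to the right);
    negative -> longer left tail (skewed to the left);
    near zero -> roughly symmetric.
    """
    x = np.asarray(x, dtype=float)
    m = x.mean()
    s = x.std()  # population standard deviation
    return float(np.mean((x - m) ** 3) / s ** 3)

# Most of the mass sits on the left; the value 10 forms a long right tail.
right_tailed = np.array([1, 2, 2, 3, 3, 3, 4, 10], dtype=float)
print(skewness(right_tailed))            # positive: skewed to the right
print(skewness(np.array([1., 2., 3.])))  # ~0: symmetric sample
```

Mirroring the data (negating it) flips the sign of the statistic, matching the negative-skew description above.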

Kurtosis

Kurtosis is a measure of whether the data are peaked or flat relative to a normal distribution. The height and sharpness of the peak relative to the rest of the data are measured by a number called kurtosis.
Higher values indicate a higher, sharper peak; lower values indicate a lower, less distinct peak.
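Since the text measures peakedness "relative to a normal distribution", a natural sketch is the excess kurtosis (fourth standardized moment minus 3), which scores a normal sample near 0. The function name `excess_kurtosis` and the sample sizes are my own choices for illustration.

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (Fisher's definition),
    so a normal distribution scores ~0. Positive -> sharper peak
    and heavier tails than normal; negative -> flatter than normal."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    s = x.std()
    return float(np.mean((x - m) ** 4) / s ** 4 - 3.0)

rng = np.random.default_rng(0)
print(excess_kurtosis(rng.normal(size=100_000)))   # ~0: normal reference
print(excess_kurtosis(rng.uniform(size=100_000)))  # negative: flat, no peak
```

A uniform sample has no peak at all, so it lands well below the normal baseline, consistent with "lower values indicate a lower, less distinct peak".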

Entropy
Shannon entropy gives a measure of the uncertainty about an image's actual structure. Shannon's function is based on the concept that the information gained from an event is inversely related to its probability of occurrence. Several authors have used Shannon's concept for image processing and pattern recognition problems.
Entropy is an important factor in estimating whether a digital image is essentially the same as the original image; usually, the higher the resolution, the more similar the digital image is to the original one.
Entropy is also a widely accepted measure of diversity, and so can be used to measure the diversification of grey levels in an image.
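As a concrete sketch of the diversity idea, Shannon entropy can be computed from an image's grey-level histogram as H = -Σ pᵢ·log₂(pᵢ), where pᵢ is the relative frequency of grey level i. The function below and its toy "images" are my own illustration, not code from the text: a constant image has zero entropy (no uncertainty), while an image using all 256 levels equally reaches the 8-bit maximum.

```python
import numpy as np

def shannon_entropy(image, levels=256):
    """Shannon entropy (in bits) of an image's grey-level histogram.

    Each p_i is the relative frequency of grey level i; higher entropy
    means a more diverse spread of grey levels across the image.
    """
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist / hist.sum()
    p = p[p > 0]  # zero-probability levels contribute nothing to the sum
    return float(-np.sum(p * np.log2(p)))

flat = np.zeros((8, 8))                  # a single grey level everywhere
varied = np.arange(256).reshape(16, 16)  # all 256 levels, equally often
print(shannon_entropy(flat))    # 0.0 bits: no diversity at all
print(shannon_entropy(varied))  # 8.0 bits: maximum for 256 grey levels
```

The two extremes bracket real images: a natural scene typically falls somewhere between 0 and 8 bits, and higher values indicate a richer, more informative grey-level distribution.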