Tan, Steinbach, Kumar (4/18/2004)
Rule-Based Classifier

Classify records by using a collection of "if...then..." rules.

Rule: (Condition) → y

where Condition is a conjunction of attribute tests and y is the class label. The left-hand side is the rule antecedent or condition; the right-hand side is the rule consequent.
Rule-Based Classifier (Example)

Name            Blood Type   Give Birth   Can Fly   Live in Water   Class
human           warm         yes          no        no              mammals
python          cold         no           no        no              reptiles
salmon          cold         no           no        yes             fishes
whale           warm         yes          no        yes             mammals
frog            cold         no           no        sometimes       amphibians
komodo          cold         no           no        no              reptiles
bat             warm         yes          yes       no              mammals
pigeon          warm         no           yes       no              birds
cat             warm         yes          no        no              mammals
leopard shark   cold         yes          no        yes             fishes
turtle          cold         no           no        sometimes       reptiles
penguin         warm         no           no        sometimes       birds
porcupine       warm         yes          no        no              mammals
eel             cold         no           no        yes             fishes
salamander      cold         no           no        sometimes       amphibians
gila monster    cold         no           no        no              reptiles
platypus        warm         no           no        no              mammals
owl             warm         no           yes       no              birds
dolphin         warm         yes          no        yes             mammals
eagle           warm         no           yes       no              birds

R1: (Give Birth = no) ∧ (Can Fly = yes) → Birds
R2: (Give Birth = no) ∧ (Live in Water = yes) → Fishes
R3: (Give Birth = yes) ∧ (Blood Type = warm) → Mammals
R4: (Give Birth = no) ∧ (Can Fly = no) → Reptiles
R5: (Live in Water = sometimes) → Amphibians
Application of Rule-Based Classifier

A rule r covers an instance x if the attributes of the instance satisfy the condition of the rule.

Name           Blood Type   Give Birth   Can Fly   Live in Water   Class
hawk           warm         no           yes       no              ?
grizzly bear   warm         yes          no        no              ?

The rule R1 covers the hawk => Bird.
The rule R3 covers the grizzly bear => Mammal.
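A minimal sketch of how an ordered rule set like R1 to R5 could be applied in code; the dictionary encoding and the `classify` helper are illustrative, not part of the original slides.

```python
# Hypothetical encoding of rules R1-R5: (condition, consequent) pairs.
RULES = [
    ({"Give Birth": "no", "Can Fly": "yes"}, "birds"),          # R1
    ({"Give Birth": "no", "Live in Water": "yes"}, "fishes"),   # R2
    ({"Give Birth": "yes", "Blood Type": "warm"}, "mammals"),   # R3
    ({"Give Birth": "no", "Can Fly": "no"}, "reptiles"),        # R4
    ({"Live in Water": "sometimes"}, "amphibians"),             # R5
]

def classify(record, rules=RULES, default="unknown"):
    """Return the consequent of the first rule whose condition the record satisfies."""
    for condition, label in rules:
        if all(record.get(attr) == value for attr, value in condition.items()):
            return label
    return default

hawk = {"Blood Type": "warm", "Give Birth": "no",
        "Can Fly": "yes", "Live in Water": "no"}
print(classify(hawk))  # -> birds (covered by R1)
```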
How Does a Rule-Based Classifier Work?

Name            Blood Type   Give Birth   Can Fly   Live in Water   Class
lemur           warm         yes          no        no              ?
turtle          cold         no           no        sometimes       ?
dogfish shark   cold         yes          no        yes             ?

A lemur triggers rule R3, so it is classified as a mammal.
A turtle triggers both R4 and R5.
A dogfish shark triggers none of the rules.
Characteristics of Rule-Based Classifier

Mutually exclusive rules
- A classifier contains mutually exclusive rules if the rules are independent of each other.
- Every record is covered by at most one rule.

Exhaustive rules
- A classifier has exhaustive coverage if it accounts for every possible combination of attribute values.
- Each record is covered by at least one rule.
From Decision Trees To Rules

[figure: decision tree on Refund, Marital Status ({Single, Divorced} vs {Married}) and Taxable Income (< 80K vs > 80K), with each leaf yielding one classification rule]

The resulting rules are mutually exclusive and exhaustive; the rule set contains as much information as the tree.
Rules Can Be Simplified

Tid   Refund   Marital Status   Taxable Income   Cheat
1     Yes      Single           125K             No
2     No       Married          100K             No
3     No       Single           70K              No
4     Yes      Married          120K             No
5     No       Divorced         95K              Yes
6     No       Married          60K              No
7     Yes      Divorced         220K             No
8     No       Single           85K              Yes
9     No       Married          75K              No
10    No       Single           90K              Yes

Initial Rule: (Refund = No) ∧ (Status = Married) → No
Simplified Rule: (Status = Married) → No
Ordering Rule Set

Rules are rank ordered according to their priority (an ordered rule set is known as a decision list). When a test record is presented to the classifier, it is assigned to the class label of the highest-ranked rule it has triggered; if none of the rules fire, it is assigned to the default class.

Name     Blood Type   Give Birth   Can Fly   Live in Water   Class
turtle   cold         no           no        sometimes       ?

With the ordered rule set R1 to R5, the turtle is assigned the class of R4 (reptiles), the highest-ranked rule it triggers.
Rule Ordering Schemes

Rule-based ordering: individual rules are ranked based on their quality.
Class-based ordering: rules that belong to the same class appear together.
Building Classification Rules

Direct Method: extract rules directly from data (e.g., RIPPER, CN2, Holte's 1R).
Indirect Method: extract rules from other classification models such as decision trees or neural networks (e.g., C4.5rules).
Direct Method: Sequential Covering

1. Start from an empty rule.
2. Grow a rule using the Learn-One-Rule function.
3. Remove training records covered by the rule.
4. Repeat steps (2) and (3) until the stopping criterion is met.
Example of Sequential Covering

[figures: (i) original data; (ii) Step 1, the first rule is grown]
[figures: (iii) Step 2, instances covered by rule R1 are removed; (iv) Step 3, a second rule R2 is grown alongside R1]
Aspects of Sequential Covering

- Rule Growing
- Instance Elimination
- Rule Evaluation
- Stopping Criterion
- Rule Pruning
Rule Growing

Two common strategies: general-to-specific (start from an empty rule and add conjuncts) and specific-to-general (start from a specific rule and remove conjuncts).
Rule Growing (Examples)

CN2 Algorithm:
- Start from an empty conjunct: {}
- Add conjuncts that minimize the entropy measure: {A}, {A,B}, ...
- Determine the rule consequent by taking the majority class of instances covered by the rule.

RIPPER Algorithm:
- Start from an empty rule: {} => class
- Add conjuncts that maximize FOIL's information gain measure:
  R0: {} => class (initial rule)
  R1: {A} => class (rule after adding conjunct)
  Gain(R0, R1) = t × [ log(p1/(p1+n1)) − log(p0/(p0+n0)) ]
  where t: number of positive instances covered by both R0 and R1
        p0: number of positive instances covered by R0
        n0: number of negative instances covered by R0
        p1: number of positive instances covered by R1
        n1: number of negative instances covered by R1
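A small sketch of FOIL's information gain as defined above, assuming base-2 logarithms and that R1 is a specialization of R0 (so t = p1); the function name and example counts are illustrative only.

```python
import math

def foil_gain(p0, n0, p1, n1):
    """FOIL's information gain for extending rule R0 to rule R1.

    p0, n0: positive/negative instances covered by R0
    p1, n1: positive/negative instances covered by R1
    """
    t = p1  # a specialization of R0 covers only positives that R0 also covers
    return t * (math.log2(p1 / (p1 + n1)) - math.log2(p0 / (p0 + n0)))

# e.g. R0 covers 100 pos / 400 neg; adding a conjunct leaves 30 pos / 10 neg:
print(foil_gain(100, 400, 30, 10))  # ~ 57.2
```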
Instance Elimination

Why do we need to eliminate instances? Otherwise, the next rule would be identical to the previous rule.
Why do we remove positive instances? To ensure that the next rule is different.
Why do we remove negative instances? To prevent underestimating the accuracy of the next rule. Compare rules R2 and R3 in the diagram.
Rule Evaluation

Metrics:

Accuracy $= \dfrac{n_c}{n}$

Laplace $= \dfrac{n_c + 1}{n + k}$

M-estimate $= \dfrac{n_c + kp}{n + k}$

where
n: number of instances covered by the rule
$n_c$: number of instances covered by the rule that belong to the rule's class
k: number of classes
p: prior probability
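A direct transcription of the three metrics into Python; the example counts are invented for illustration.

```python
def accuracy(nc, n):
    """Fraction of covered instances the rule gets right."""
    return nc / n

def laplace(nc, n, k):
    """Laplace estimate: (nc + 1) / (n + k)."""
    return (nc + 1) / (n + k)

def m_estimate(nc, n, k, p):
    """M-estimate: (nc + k*p) / (n + k)."""
    return (nc + k * p) / (n + k)

# A rule covering 50 instances, 45 of its own class, in a 2-class problem:
print(accuracy(45, 50))            # 0.9
print(laplace(45, 50, 2))          # ~ 0.885
print(m_estimate(45, 50, 2, 0.5))  # ~ 0.885 (equals Laplace when p = 1/k)
```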
Stopping Criterion and Rule Pruning

Stopping criterion:
- Compute the gain.
- If the gain is not significant, discard the new rule.

Rule pruning: similar to post-pruning of decision trees.

Reduced Error Pruning:
- Remove one of the conjuncts in the rule.
- Compare the error rate on the validation set before and after pruning.
- If the error improves, prune the conjunct.
Summary of Direct Method

- Grow a single rule.
- Remove instances covered by the rule.
- Prune the rule (if necessary).
- Add the rule to the current rule set.
- Repeat.
Direct Method: RIPPER

For a 2-class problem, choose one of the classes as the positive class and the other as the negative class; learn rules for the positive class, with the negative class as the default. For a multi-class problem, order the classes by increasing prevalence, learn the rule set for the smallest class first (treating the rest as negative), and repeat with the next smallest class as the positive class.
Growing a rule:
- Start from an empty rule.
- Add conjuncts as long as they improve FOIL's information gain.
- Stop when the rule no longer covers negative examples.
- Prune the rule immediately using incremental reduced error pruning.
- Measure for pruning: v = (p − n) / (p + n)
  p: number of positive examples covered by the rule in the validation set
  n: number of negative examples covered by the rule in the validation set
Building a Rule Set:
- Use the sequential covering algorithm: find the best rule that covers the current set of positive examples, then eliminate both positive and negative examples covered by the rule.
Optimizing the rule set:
- For each rule r in the rule set, consider two alternatives: a replacement rule r* grown from scratch, and a revised rule r' obtained by adding conjuncts to r.
- Compare the rule set for r against the rule sets for r* and r'.
- Choose the rule set that minimizes the description length (MDL principle).
Indirect Methods

[figure: a rule set derived from the paths of a decision tree]
Example

Name            Give Birth   Lay Eggs   Can Fly   Live in Water   Have Legs   Class
human           yes          no         no        no              yes         mammals
python          no           yes        no        no              no          reptiles
salmon          no           yes        no        yes             no          fishes
whale           yes          no         no        yes             no          mammals
frog            no           yes        no        sometimes       yes         amphibians
komodo          no           yes        no        no              yes         reptiles
bat             yes          no         yes       no              yes         mammals
pigeon          no           yes        yes       no              yes         birds
cat             yes          no         no        no              yes         mammals
leopard shark   yes          no         no        yes             no          fishes
turtle          no           yes        no        sometimes       yes         reptiles
penguin         no           yes        no        sometimes       yes         birds
porcupine       yes          no         no        no              yes         mammals
eel             no           yes        no        yes             no          fishes
salamander      no           yes        no        sometimes       yes         amphibians
gila monster    no           yes        no        no              yes         reptiles
platypus        no           yes        no        no              yes         mammals
owl             no           yes        yes       no              yes         birds
dolphin         yes          no         no        yes             no          mammals
eagle           no           yes        yes       no              yes         birds
C4.5 versus C4.5rules versus RIPPER

[decision tree: Give Birth?: yes → Mammals; no → Live in Water?: yes → Fishes; sometimes → Amphibians; no → Can Fly?: yes → Birds; no → Reptiles]

C4.5rules:
(Give Birth = No, Can Fly = Yes) → Birds
(Give Birth = No, Live in Water = Yes) → Fishes
(Give Birth = Yes) → Mammals
(Give Birth = No, Can Fly = No, Live in Water = No) → Reptiles
( ) → Amphibians

RIPPER:
(Live in Water = Yes) → Fishes
(Have Legs = No) → Reptiles
(Give Birth = No, Can Fly = No, Live in Water = No) → Reptiles
(Can Fly = Yes, Give Birth = No) → Birds
( ) → Mammals
C4.5 versus C4.5rules versus RIPPER (continued)

[confusion matrices on the vertebrate data, rows = ACTUAL CLASS and columns = PREDICTED CLASS (Amphibians, Fishes, Reptiles, Birds, Mammals): one matrix for C4.5/C4.5rules and one for RIPPER]
Advantages of Rule-Based Classifiers

- As highly expressive as decision trees
- Easy to interpret
- Easy to generate
- Can classify new instances rapidly
- Performance comparable to decision trees
Instance-Based Classifiers

[figure: a training set with attributes Atr1, ..., AtrN and class labels (A, B, C) is stored verbatim; unseen records are matched against the stored set]

- Store the training records.
- Use the training records to predict the class label of unseen cases.
Examples:

Rote-learner: memorizes the entire training data and performs classification only if the attributes of a record match one of the training examples exactly.

Nearest neighbor: uses the k closest points (nearest neighbors) to perform classification.
Nearest Neighbor Classifiers

Basic idea: if it walks like a duck and quacks like a duck, then it's probably a duck.

[figure: compute the distance from the test record to the training records, then choose the k nearest records]
Nearest-Neighbor Classifiers

Requires three things:
- the set of stored records
- a distance metric to compute the distance between records
- the value of k, the number of nearest neighbors to retrieve

To classify an unknown record:
- compute its distance to the training records
- identify its k nearest neighbors
- use the class labels of the nearest neighbors to determine the class of the unknown record (e.g., by taking a majority vote)
1 nearest-neighbor

[figure: Voronoi diagram induced by the training points]
Compute the distance between two points, e.g. Euclidean distance:

$d(p, q) = \sqrt{\sum_i (p_i - q_i)^2}$

Determine the class from the nearest-neighbor list, e.g. by taking the majority vote of class labels among the k nearest neighbors, optionally weighting each vote by distance (e.g. $w = 1/d^2$).
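A minimal k-NN sketch combining the Euclidean distance above with a majority vote; the tiny training set is invented for illustration.

```python
import math
from collections import Counter

def euclidean(p, q):
    """d(p, q) = sqrt(sum_i (p_i - q_i)^2)"""
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

def knn_predict(train, x, k=3):
    """Majority vote among the k training records closest to x.
    `train` is a list of (feature_vector, class_label) pairs."""
    neighbors = sorted(train, key=lambda rec: euclidean(rec[0], x))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

train = [((1.0, 1.0), "+"), ((1.2, 0.8), "+"),
         ((5.0, 5.0), "-"), ((5.5, 4.5), "-")]
print(knn_predict(train, (1.1, 0.9), k=3))  # '+'
```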
Choosing the value of k:
- If k is too small, the classifier is sensitive to noise points.
- If k is too large, the neighborhood may include points from other classes.
Scaling issues

Attributes may have to be scaled to prevent distance measures from being dominated by one of the attributes (a small rescaling sketch follows this list). Example:
- height of a person may vary from 1.5 m to 1.8 m
- weight of a person may vary from 90 lb to 300 lb
- income of a person may vary from $10K to $1M
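One common fix is min-max scaling; this sketch (function name illustrative) rescales each attribute to [0, 1] before any distances are computed.

```python
def min_max_scale(column):
    """Rescale a list of values to [0, 1] so no attribute dominates the distance."""
    lo, hi = min(column), max(column)
    return [(v - lo) / (hi - lo) for v in column]

heights = [1.5, 1.6, 1.8]                # metres
incomes = [10_000, 500_000, 1_000_000]   # dollars
print(min_max_scale(heights))  # [0.0, 0.333..., 1.0]
print(min_max_scale(incomes))  # [0.0, 0.494..., 1.0]
```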
Problem with the Euclidean measure: high-dimensional data (curse of dimensionality) can produce counter-intuitive results:

111111111110 vs 011111111111 → d = 1.4142
100000000000 vs 000000000001 → d = 1.4142

Solution: normalize the vectors to unit length.
Example: PEBLS

PEBLS: Parallel Exemplar-Based Learning System (Cost & Salzberg)
- Works with both continuous and nominal features; for nominal features, the distance between two values is computed using the modified value difference metric (MVDM).
- Each record is assigned a weight factor.
- Number of nearest neighbors: k = 1.
Example: PEBLS

Using the same 10-record Tid table (Refund, Marital Status, Taxable Income, Cheat) as in the rule-simplification example above, the distance between two nominal attribute values is

$d(V_1, V_2) = \sum_i \left| \frac{n_{1i}}{n_1} - \frac{n_{2i}}{n_2} \right|$

Class counts:

Marital Status: Single (Cheat=Yes: 2, Cheat=No: 2), Married (Yes: 0, No: 4), Divorced (Yes: 1, No: 1)
Refund: Yes (Cheat=Yes: 0, Cheat=No: 3), No (Cheat=Yes: 3, Cheat=No: 4)

d(Single, Married) = |2/4 − 0/4| + |2/4 − 4/4| = 1
d(Single, Divorced) = |2/4 − 1/2| + |2/4 − 1/2| = 0
d(Married, Divorced) = |0/4 − 1/2| + |4/4 − 1/2| = 1
d(Refund=Yes, Refund=No) = |0/3 − 3/7| + |3/3 − 4/7| = 6/7
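A sketch of the MVDM computation using the class counts above; the dictionaries simply encode the counts from the table.

```python
def mvdm(counts1, counts2):
    """Modified value difference metric between two nominal values.

    counts1, counts2: per-class counts for each value, e.g. {"Yes": 2, "No": 2}.
    d(V1, V2) = sum_i | n1i/n1 - n2i/n2 |
    """
    n1, n2 = sum(counts1.values()), sum(counts2.values())
    classes = set(counts1) | set(counts2)
    return sum(abs(counts1.get(c, 0) / n1 - counts2.get(c, 0) / n2)
               for c in classes)

single   = {"Yes": 2, "No": 2}
married  = {"Yes": 0, "No": 4}
divorced = {"Yes": 1, "No": 1}
print(mvdm(single, married))    # 1.0
print(mvdm(single, divorced))   # 0.0
print(mvdm(married, divorced))  # 1.0
```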
Example: PEBLS

Distance between record X and record Y:

$\Delta(X, Y) = w_X\, w_Y \sum_{i=1}^{d} d(X_i, Y_i)^2$

where $w_X$ is a weight factor for record X (the number of times X is used for prediction divided by the number of times X predicts correctly): $w_X \approx 1$ if X makes accurate predictions most of the time, and $w_X > 1$ if X is not reliable.
Bayes Classifier

A probabilistic framework for solving classification problems.

Conditional probability:

$P(C \mid A) = \frac{P(A, C)}{P(A)}, \qquad P(A \mid C) = \frac{P(A, C)}{P(C)}$

Bayes theorem:

$P(C \mid A) = \frac{P(A \mid C)\, P(C)}{P(A)}$
Example of Bayes Theorem

Given: meningitis causes a stiff neck 50% of the time; the prior probability of a patient having meningitis is 1/50,000; the prior probability of a patient having a stiff neck is 1/20. If a patient has a stiff neck, what is the probability that they have meningitis?

$P(M \mid S) = \frac{P(S \mid M)\, P(M)}{P(S)} = \frac{0.5 \times 1/50000}{1/20} = 0.0002$
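The same arithmetic, spelled out in code:

```python
# Bayes theorem applied to the meningitis example above.
p_s_given_m = 0.5     # P(S|M): meningitis causes stiff neck 50% of the time
p_m = 1 / 50_000      # P(M): prior probability of meningitis
p_s = 1 / 20          # P(S): prior probability of stiff neck

p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)    # 0.0002
```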
Bayesian Classifiers

Consider each attribute and the class label as random variables. Given a record with attributes (A1, A2, ..., An), the goal is to predict class C; specifically, we want to find the value of C that maximizes P(C | A1, A2, ..., An).
Approach: compute the posterior probability P(C | A1, A2, ..., An) for all values of C using the Bayes theorem:

$P(C \mid A_1 A_2 \cdots A_n) = \frac{P(A_1 A_2 \cdots A_n \mid C)\, P(C)}{P(A_1 A_2 \cdots A_n)}$

Choose the value of C that maximizes P(C | A1, A2, ..., An), which is equivalent to choosing the value of C that maximizes P(A1 A2 ... An | C) P(C).
Naïve Bayes Classifier

Assume independence among the attributes Ai when the class is given:

$P(A_1, A_2, \ldots, A_n \mid C_j) = P(A_1 \mid C_j)\, P(A_2 \mid C_j) \cdots P(A_n \mid C_j)$

A new point is classified to $C_j$ if $P(C_j) \prod_i P(A_i \mid C_j)$ is maximal.
For continuous attributes:
- Discretize the range into bins: one ordinal attribute per bin (this violates the independence assumption).
- Probability density estimation: assume the attribute follows a normal distribution, use the data to estimate the parameters of the distribution (e.g., mean and standard deviation), then use it to estimate the conditional probability P(Ai | c).
Normal distribution:

$P(A_i \mid c_j) = \frac{1}{\sqrt{2\pi\sigma_{ij}^2}} \exp\!\left(-\frac{(A_i - \mu_{ij})^2}{2\sigma_{ij}^2}\right)$

one for each (attribute, class) pair. For (Income, Class = No), the sample mean is 110 and the sample variance is 2975:

$P(\text{Income} = 120 \mid \text{No}) = \frac{1}{\sqrt{2\pi}\,(54.54)} \exp\!\left(-\frac{(120 - 110)^2}{2(2975)}\right) = 0.0072$
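A sketch of the normal-density estimate above; `normal_pdf` is an illustrative helper, not a library call.

```python
import math

def normal_pdf(x, mu, var):
    """P(Ai | cj) under a normal distribution with mean mu and variance var."""
    return math.exp(-((x - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

# P(Income = 120 | No) with sample mean 110 and sample variance 2975:
print(normal_pdf(120, 110, 2975))  # ~ 0.0072
```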
Example of Naïve Bayes Classifier

Given a test record X = (Refund = No, Married, Income = 120K):

P(X | Class=No) = P(Refund=No | Class=No)
                × P(Married | Class=No)
                × P(Income=120K | Class=No)
                = 4/7 × 4/7 × 0.0072 = 0.0024

=> Class = No
If one of the conditional probabilities is zero, the entire product becomes zero. Alternative probability estimates:

Original: $P(A_i \mid C) = \frac{N_{ic}}{N_c}$
Laplace: $P(A_i \mid C) = \frac{N_{ic} + 1}{N_c + c}$
m-estimate: $P(A_i \mid C) = \frac{N_{ic} + mp}{N_c + m}$

c: number of classes
p: prior probability
m: parameter
Example of Naïve Bayes Classifier

Name            Give Birth   Can Fly   Live in Water   Have Legs   Class
human           yes          no        no              yes         mammals
python          no           no        no              no          non-mammals
salmon          no           no        yes             no          non-mammals
whale           yes          no        yes             no          mammals
frog            no           no        sometimes       yes         non-mammals
komodo          no           no        no              yes         non-mammals
bat             yes          yes       no              yes         mammals
pigeon          no           yes       no              yes         non-mammals
cat             yes          no        no              yes         mammals
leopard shark   yes          no        yes             no          non-mammals
turtle          no           no        sometimes       yes         non-mammals
penguin         no           no        sometimes       yes         non-mammals
porcupine       yes          no        no              yes         mammals
eel             no           no        yes             no          non-mammals
salamander      no           no        sometimes       yes         non-mammals
gila monster    no           no        no              yes         non-mammals
platypus        no           no        no              yes         mammals
owl             no           yes       no              yes         non-mammals
dolphin         yes          no        yes             no          mammals
eagle           no           yes       no              yes         non-mammals

Test record A: Give Birth = yes, Can Fly = no, Live in Water = yes, Have Legs = no, Class = ?

A: attributes, M: mammals, N: non-mammals

$P(A \mid M) = \frac{6}{7} \times \frac{6}{7} \times \frac{2}{7} \times \frac{2}{7} = 0.06$

$P(A \mid N) = \frac{1}{13} \times \frac{10}{13} \times \frac{3}{13} \times \frac{4}{13} = 0.0042$

$P(A \mid M)\, P(M) = 0.06 \times \frac{7}{20} = 0.021$

$P(A \mid N)\, P(N) = 0.0042 \times \frac{13}{20} = 0.0027$

P(A|M)P(M) > P(A|N)P(N) => Mammals
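The slide's arithmetic, reproduced in code under the naïve independence assumption:

```python
# Test record A: Give Birth = yes, Can Fly = no, Live in Water = yes, Have Legs = no.
p_a_given_m = (6/7) * (6/7) * (2/7) * (2/7)       # ~ 0.06
p_a_given_n = (1/13) * (10/13) * (3/13) * (4/13)  # ~ 0.0042

p_m, p_n = 7/20, 13/20  # class priors: 7 mammals, 13 non-mammals

print(p_a_given_m * p_m)  # ~ 0.021
print(p_a_given_n * p_n)  # ~ 0.0027
# P(A|M)P(M) > P(A|N)P(N)  =>  classify as mammal
```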
Naïve Bayes (Summary)

- Robust to isolated noise points.
- Handles missing values by ignoring the instance during probability estimate calculations.
- Robust to irrelevant attributes.
- The independence assumption may not hold for some attributes; in that case, use other techniques such as Bayesian Belief Networks (BBN).
Artificial Neural Networks (ANN)

$Y = I\left(0.3 X_1 + 0.3 X_2 + 0.3 X_3 - 0.4 > 0\right)$

where $I(z) = \begin{cases} 1 & \text{if } z \text{ is true} \\ 0 & \text{otherwise} \end{cases}$
The model is an assembly of inter-connected nodes and weighted links. The output node sums its input values according to the weights of its links and compares the result against a threshold t.

Perceptron Model:

$Y = I\left(\sum_i w_i X_i - t > 0\right)$ or $Y = \operatorname{sign}\left(\sum_i w_i X_i - t\right)$
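A sketch of the sign-form perceptron using the example weights from the slide above; the `perceptron` helper is illustrative.

```python
def perceptron(x, w, t):
    """Perceptron output Y = sign(sum_i w_i * x_i - t)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) - t
    return 1 if z > 0 else -1

# The example above: 0.3*X1 + 0.3*X2 + 0.3*X3 - 0.4
w, t = [0.3, 0.3, 0.3], 0.4
print(perceptron([1, 1, 0], w, t))  # 1   (0.6 - 0.4 > 0)
print(perceptron([1, 0, 0], w, t))  # -1  (0.3 - 0.4 < 0)
```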
Algorithm for Learning ANN

Initialize the weights (w0, w1, ..., wk), then adjust the weights so that the output of the ANN is consistent with the class labels of the training examples, e.g. by minimizing the objective function $E = \sum_i \left[ Y_i - f(w, X_i) \right]^2$ (for instance, with the backpropagation algorithm).
Support Vector Machines

Find a linear hyperplane (decision boundary) that will separate the data.
[figures: several possible separating hyperplanes (B1, B2, ...); the better boundary is the one that maximizes the margin]
The decision boundary and margin:

$\vec{w} \cdot \vec{x} + b = 0, \qquad \vec{w} \cdot \vec{x} + b = \pm 1$

$f(\vec{x}) = \begin{cases} 1 & \text{if } \vec{w} \cdot \vec{x} + b \ge 1 \\ -1 & \text{if } \vec{w} \cdot \vec{x} + b \le -1 \end{cases} \qquad \text{Margin} = \frac{2}{\|\vec{w}\|^2}$
We want to maximize: $\text{Margin} = \frac{2}{\|\vec{w}\|^2}$

which is equivalent to minimizing: $L(w) = \frac{\|\vec{w}\|^2}{2}$

subject to the constraints:

$f(\vec{x}_i) = \begin{cases} 1 & \text{if } \vec{w} \cdot \vec{x}_i + b \ge 1 \\ -1 & \text{if } \vec{w} \cdot \vec{x}_i + b \le -1 \end{cases}$

This is a constrained optimization problem that can be solved numerically (e.g., quadratic programming).
What if the problem is not linearly separable? Introduce slack variables $\xi_i$ and minimize

$L(w) = \frac{\|\vec{w}\|^2}{2} + C \sum_{i=1}^{N} \xi_i$

subject to:

$f(\vec{x}_i) = \begin{cases} 1 & \text{if } \vec{w} \cdot \vec{x}_i + b \ge 1 - \xi_i \\ -1 & \text{if } \vec{w} \cdot \vec{x}_i + b \le -1 + \xi_i \end{cases}$
Nonlinear Support Vector Machines

What if the decision boundary is not linear? Transform the data into a higher dimensional space.
Ensemble Methods

- Construct a set of classifiers from the training data.
- Predict the class label of previously unseen records by aggregating the predictions made by multiple classifiers.
General Idea

[figure: original training data D → create multiple data sets D1, ..., Dt → build a classifier on each → combine the classifiers into C*]
Why does it work?

Suppose there are 25 base classifiers, each with error rate $\epsilon = 0.35$, and assume the classifiers are independent. The probability that the ensemble makes a wrong prediction (a majority of the 25 are wrong) is

$\sum_{i=13}^{25} \binom{25}{i} \epsilon^i (1 - \epsilon)^{25-i} = 0.06$
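A sketch that reproduces the 0.06 figure, assuming independent base classifiers and majority voting; `ensemble_error` is an illustrative name.

```python
from math import comb

def ensemble_error(eps, n=25):
    """Probability that a majority of n independent base classifiers
    (each with error rate eps) are wrong."""
    return sum(comb(n, i) * eps**i * (1 - eps)**(n - i)
               for i in range(n // 2 + 1, n + 1))

print(ensemble_error(0.35))  # ~ 0.06: ensemble beats each base classifier
print(ensemble_error(0.5))   # 0.5: no gain when base classifiers are random
```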
Examples of Ensemble Methods

How to generate an ensemble of classifiers? Bagging and Boosting.
Bagging

Sampling with replacement:

Original Data       1   2   3   4   5   6   7   8   9   10
Bagging (Round 1)   7   8   10  8   2   5   10  10  5   9
Bagging (Round 2)   1   4   9   1   2   3   2   7   3   2
Bagging (Round 3)   1   8   5   10  5   5   9   6   3   7

Build a classifier on each bootstrap sample. Each record has probability $1 - (1 - 1/n)^n$ of being selected at least once in a given round (about 0.632 for large n). A sketch of the sampling step follows.
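A sketch of the bootstrap sampling step (one round of bagging); the output rows will differ from the table above, since sampling is random.

```python
import random

def bootstrap_sample(data, rng=random):
    """Sample len(data) records with replacement (one bagging round)."""
    return [rng.choice(data) for _ in data]

data = list(range(1, 11))
for r in range(1, 4):
    print(f"Bagging (Round {r}): {bootstrap_sample(data)}")

# Probability a given record is NOT chosen in a round, for n = 10:
print((1 - 1/10) ** 10)  # ~ 0.349 (tends to 1/e ~ 0.368 as n grows)
```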
Boosting

An iterative procedure that adaptively changes the distribution of the training data by focusing more on previously misclassified records. Initially, all N records are assigned equal weights; unlike bagging, the weights may change at the end of each boosting round.
Boosting

Records that are wrongly classified will have their weights increased; records that are classified correctly will have their weights decreased.

Original Data        1   2   3   4   5   6   7   8   9   10
Boosting (Round 1)   7   3   2   8   7   9   4   10  6   3
Boosting (Round 2)   5   4   9   4   2   5   1   7   4   2
Boosting (Round 3)   4   4   8   10  4   5   4   6   3   4

Example 4 is hard to classify, so its weight is increased and it is more likely to be chosen again in subsequent rounds.
Example: AdaBoost

Base classifiers: $C_1, C_2, \ldots, C_T$.

Error rate: $\epsilon_i = \frac{1}{N} \sum_{j=1}^{N} w_j\, \delta\big(C_i(x_j) \ne y_j\big)$

Importance of a classifier: $\alpha_i = \frac{1}{2} \ln \frac{1 - \epsilon_i}{\epsilon_i}$
Example: AdaBoost

Weight update:

$w_i^{(j+1)} = \frac{w_i^{(j)}}{Z_j} \times \begin{cases} \exp(-\alpha_j) & \text{if } C_j(x_i) = y_i \\ \exp(\alpha_j) & \text{if } C_j(x_i) \ne y_i \end{cases}$

where $Z_j$ is the normalization factor. If any intermediate round produces an error rate higher than 50%, the weights are reverted to $1/n$ and the resampling procedure is repeated.

Final classification: $C^*(x) = \arg\max_y \sum_{j=1}^{T} \alpha_j\, \delta\big(C_j(x) = y\big)$
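A sketch of one round of the weight update, with an invented 5-record example; alpha follows the importance formula above.

```python
import math

def adaboost_update(weights, correct, alpha):
    """One AdaBoost weight update: scale by exp(-alpha) if classified
    correctly, exp(+alpha) otherwise, then renormalize by Z_j."""
    new_w = [w * math.exp(-alpha if ok else alpha)
             for w, ok in zip(weights, correct)]
    z = sum(new_w)  # normalization factor Z_j
    return [w / z for w in new_w]

# 5 records with uniform initial weights; the classifier gets record 3 wrong.
eps = 0.2                                  # weighted error rate
alpha = 0.5 * math.log((1 - eps) / eps)    # importance ~ 0.693
w = adaboost_update([0.2] * 5, [True, True, False, True, True], alpha)
print([round(x, 3) for x in w])  # misclassified record now carries weight 0.5
```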
Illustrating AdaBoost

[figure: initial weights for each data point; data points for training]
Illustrating AdaBoost (continued)

[figure: subsequent boosting rounds and the final combined classifier]