Who are you?
You are either
- studying Automation and Robotics (Master of Science), module "Optimization",
or
- studying Informatik (computer science):
  - BA module "Einführung in die Computational Intelligence" (Introduction to Computational Intelligence)
  - elective lecture in the Hauptdiplom program (SPG 6 & 7)
Who am I?
Günter Rudolph
Fakultät für Informatik, LS 11
office hours: Tuesday, 10:30-11:30 am, and by appointment
Information:
http://ls11-www.cs.uni-dortmund.de/people/rudolph/teaching/lectures/CI/WS2009-10/lecture.jsp
Knowledge about mathematics, programming, and logic is helpful.
What is CI?
⇒ umbrella term for computational methods inspired by nature:
- fuzzy systems
- swarm intelligence
- artificial immune systems
- new developments: growth processes in trees
- ...
Biological Prototype
[Figure: biological neuron: dendrites, cell body with nucleus, axon, synapse]
Abstraction
Model
[Figure: abstract neuron: inputs x1, x2, ..., xn feed into a function f; output f(x1, x2, ..., xn)]
McCulloch-Pitts neuron (1943):
xi ∈ {0, 1} =: B
f: B^n → B
basic idea:
- neuron is either active or inactive
- skills result from connecting neurons
McCulloch-Pitts neuron
[Figure: neuron over inputs x1, ..., xn: threshold θ = 1 realizes OR, threshold θ = n realizes AND]
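A minimal sketch of this model in Python (the function name mcp_neuron is mine, not the lecture's): an unweighted McCulloch-Pitts neuron fires iff enough of its binary inputs are active, which directly yields the OR and AND gates above.

def mcp_neuron(inputs, threshold):
    # McCulloch-Pitts neuron: inputs are 0/1, no weights;
    # active (1) iff the number of active inputs reaches the threshold
    return 1 if sum(inputs) >= threshold else 0

# threshold 1 realizes OR, threshold n realizes AND:
assert mcp_neuron([0, 1, 0], 1) == 1    # OR fires
assert mcp_neuron([1, 1, 1], 3) == 1    # AND fires
assert mcp_neuron([1, 1, 0], 3) == 0    # AND does not fire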
McCulloch-Pitts neuron
- n binary input signals x1, ..., xn
- threshold θ > 0
- in addition: m binary inhibitory signals y1, ..., ym
- if any inhibitory signal is active, the neuron is blocked (output 0)
[Figure: NOT gate: x1 fed as inhibitory signal into a neuron with threshold θ = 0]
Analogues
Assumption: inputs are also available in inverted form, i.e. ∃ inverted inputs.
[Figure: neuron over inputs x1, x2] ⇒ x1 + x2 (boolean OR)
Theorem:
Every logical function F: B^n → B can be simulated with a two-layered McCulloch-Pitts net.
Example:
[Figure: two-layered McCulloch-Pitts net over inputs x1, ..., x4: first-layer neurons with thresholds 3, 3, and 2 feed an output neuron with threshold 1]
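The construction behind the theorem can be sketched as follows (a hedged illustration; simulate_boolean and the DNF-style layout are my choices, not necessarily the slide's): layer 1 holds one AND-like neuron per input pattern with F(x) = 1, using the inverted inputs assumed above for the 0-bits, and layer 2 is a single OR neuron with threshold 1.

from itertools import product

def simulate_boolean(F, n):
    # Two-layered MCP net for an arbitrary F: B^n -> B.
    patterns = [x for x in product([0, 1], repeat=n) if F(x) == 1]
    def net(inp):
        layer1 = []
        for p in patterns:
            # take the input bit where p_i = 1, the inverted bit where p_i = 0
            lits = [inp[i] if p[i] == 1 else 1 - inp[i] for i in range(n)]
            layer1.append(1 if sum(lits) >= n else 0)   # AND: threshold n
        return 1 if sum(layer1) >= 1 else 0             # OR: threshold 1
    return net

xor = simulate_boolean(lambda x: x[0] ^ x[1], 2)
assert [xor((a, b)) for a in (0, 1) for b in (0, 1)] == [0, 1, 1, 0]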
[Figure: weighted neuron: inputs x1, x2, x3 with weights 0.2, 0.4, 0.3, threshold 0.7]
fires 1 if 0.2 x1 + 0.4 x2 + 0.3 x3 ≥ 0.7
multiply by 10: 2 x1 + 4 x2 + 3 x3 ≥ 7
⇒ duplicate inputs!
[Figure: unweighted neuron with threshold 7: x1 duplicated 2 times, x2 4 times, x3 3 times]
⇒ equivalent!
Theorem:
Weighted and unweighted MCP-nets are equivalent for weights ∈ Q+.
Proof:
"⇒": Let N be a weighted MCP-net with weights ai/bi ∈ Q+ (i = 1, ..., n) and threshold a0/b0. Multiply all weights and the threshold by b0 · b1 · ... · bn: the weights become integers and the threshold becomes a0 · b1 · ... · bn. Duplicating each input according to its integer weight yields an unweighted MCP-net with identical behavior.
"⇐": trivial: an unweighted MCP-net is a weighted one with all weights equal to 1. ∎
(Nets with many duplicated inputs are tedious to construct; weights are the more convenient representation.)
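The proof's denominator-clearing step is easy to mirror in code (a sketch under my own naming; math.lcm keeps the integers small, though any common multiple works):

import math
from fractions import Fraction

def to_unweighted(weights, threshold):
    # multiply all weights and the threshold by a common denominator;
    # input i is then duplicated int_w[i] times in the unweighted net
    m = math.lcm(*(w.denominator for w in weights), threshold.denominator)
    int_w = [int(w * m) for w in weights]
    return int_w, int(threshold * m)

int_w, int_theta = to_unweighted(
    [Fraction(2, 10), Fraction(4, 10), Fraction(3, 10)], Fraction(7, 10))
assert (int_w, int_theta) == ([2, 4, 3], 7)   # matches the example above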
Example:
[Figure: unit square (axis ticks 0 and 1) cut by a separating line]
⇒ a separating line separates R² in 2 classes ☺
XOR
[Figure: the four corners of the unit square, labeled xor = 0 and xor = 1: no single line separates the two classes]

x1 | x2 | xor
 0 |  0 |  0   ⇒ 0 < θ
 0 |  1 |  1   ⇒ w2 ≥ θ
 1 |  0 |  1   ⇒ w1 ≥ θ
 1 |  1 |  0   ⇒ w1 + w2 < θ

θ > 0, so w1, w2 ≥ θ > 0 ⇒ w1 + w2 ≥ 2θ > θ, contradicting w1 + w2 < θ ⇒ contradiction!
⇒ no weights w1, w2 and threshold θ with output 1 iff w1 x1 + w2 x2 ≥ θ realize XOR.
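The same contradiction shows up empirically; this brute-force scan (purely my own illustration, not from the lecture) finds no weight/threshold combination on a grid that reproduces XOR with a single neuron.

xor_table = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
grid = [i / 10 for i in range(-30, 31)]
found = any(
    all((w1 * x1 + w2 * x2 >= theta) == bool(y)
        for (x1, x2), y in xor_table.items())
    for w1 in grid for w2 in grid for theta in grid
)
assert not found   # consistent with the contradiction above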
disillusioning result:
perceptrons fail to solve a number of trivial problems!
- XOR problem
- parity problem
- connectivity problem
1. Multilayer perceptrons:
[Figure: two-layered perceptron over x1, x2: hidden neurons with thresholds 2 and 1, output neuron with threshold 1] ⇒ realizes XOR
g(0,0) = −1
g(0,1) = +1
g(1,0) = +1
g(1,1) = −1
[Figure: unit square: with the hidden layer, the xor = 1 corners are separated from the xor = 0 corners]
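One standard way to wire this up (the concrete weights below are my choice of a common construction, not necessarily the slide's exact net): an AND neuron and an OR neuron in the first layer, with the output firing exactly when OR is active but AND is not.

def neuron(inputs, weights, theta):
    return 1 if sum(w * x for w, x in zip(weights, inputs)) >= theta else 0

def xor_net(x1, x2):
    h_and = neuron([x1, x2], [1, 1], 2)    # layer 1: AND (threshold 2)
    h_or  = neuron([x1, x2], [1, 1], 1)    # layer 1: OR  (threshold 1)
    # layer 2: OR minus twice AND reaches 1 only for (0,1) and (1,0)
    return neuron([h_or, h_and], [1, -2], 1)

assert [xor_net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 0]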
as yet: weights and thresholds found by construction
example: NAND gate

x1 | x2 | NAND
 0 |  0 |  1   ⇒ 0 ≥ θ
 0 |  1 |  1   ⇒ w2 ≥ θ
 1 |  0 |  1   ⇒ w1 ≥ θ
 1 |  1 |  0   ⇒ w1 + w2 < θ

⇒ requires solution of a system of linear inequalities (∈ P)
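Solving such a system mechanically can be sketched with a linear program (an illustration only; the use of scipy and the unit margin that stands in for the strict inequality are my choices, not the lecture's):

from scipy.optimize import linprog

# variables v = (w1, w2, theta); constraints written as A @ v <= b
A = [[0, 0, 1],     # (0,0) -> 1:            theta <= 0
     [0, -1, 1],    # (0,1) -> 1:       theta - w2 <= 0
     [-1, 0, 1],    # (1,0) -> 1:       theta - w1 <= 0
     [1, 1, -1]]    # (1,1) -> 0: w1 + w2 - theta <= -1 (strict, via margin 1)
b = [0, 0, 0, -1]
res = linprog(c=[0, 0, 0], A_ub=A, b_ub=b, bounds=[(None, None)] * 3)
w1, w2, theta = res.x   # any feasible point realizes NAND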
Perceptron Learning

Assumption: test examples with correct I/O behavior are available

Principle:
(1) choose initial weights in arbitrary manner
(2) feed in test pattern
(3) if output of perceptron is wrong, then change weights
(4) goto (2) until correct output for all test patterns
formally (P: set of positive examples, N: set of negative examples):
1. choose w0 at random, t = 0
2. choose arbitrary x ∈ P ∪ N
3. if x ∈ P and wt·x > 0 then goto 2 (I/O correct!)
   if x ∈ N and wt·x ≤ 0 then goto 2 (I/O correct!)
4. if x ∈ P and wt·x ≤ 0 (should be > 0!) then
   wt+1 = wt + x; t++; goto 2, since (wt + x)·x = wt·x + x·x > wt·x
5. if x ∈ N and wt·x > 0 (should be ≤ 0!) then
   wt+1 = wt − x; t++; goto 2, since (wt − x)·x = wt·x − x·x < wt·x
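Steps 1 to 5 in compact Python (a sketch; the names, random initialization, and fixed sweep order are my choices, and the loop may run forever on non-separable data, hence the iteration cap):

import random

def perceptron_learning(P, N, dim, max_sweeps=10_000):
    w = [random.uniform(-1, 1) for _ in range(dim)]          # step 1
    for _ in range(max_sweeps):
        errors = 0
        for x, positive in [(x, True) for x in P] + [(x, False) for x in N]:
            s = sum(wi * xi for wi, xi in zip(w, x))
            if positive and s <= 0:                           # step 4
                w = [wi + xi for wi, xi in zip(w, x)]
                errors += 1
            elif not positive and s > 0:                      # step 5
                w = [wi - xi for wi, xi in zip(w, x)]
                errors += 1
        if errors == 0:                                       # all I/O correct
            return w
    return w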
Example: threshold as a weight
w = (−θ, w1, w2), x = (1, x1, x2) ⇒ the test w1 x1 + w2 x2 ≥ θ becomes w·x ≥ 0
[Figure: neuron with constant input 1 (weight −θ) and inputs x1, x2 (weights w1, w2), threshold 0]
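In code, the augmentation is a one-liner (illustrative; it feeds directly into the learning sketch above):

def augment(x):
    # prepend the constant input 1; its weight plays the role of -theta
    return [1] + list(x)

# e.g. NAND patterns for the perceptron learning sketch above:
P = [augment(x) for x in [(0, 0), (0, 1), (1, 0)]]   # NAND = 1
N = [augment(x) for x in [(1, 1)]]                   # NAND = 0
# w = perceptron_learning(P, N, dim=3)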
A set X is convex if:
∀ a, b ∈ X: λ a + (1 − λ) b ∈ X for λ ∈ (0,1)