
Unsupervised Learning

Klinkhachorn:CpE520

Unsupervised Learning Introduction


Classification of ANN Paradigms

[Diagram: classification of ANN paradigms, highlighting the unsupervised branch]

Supervised Learning Paradigms


Training data consists of both the input and the desired output
Weights are adjusted according to the difference between the desired output and the actual output

The output for every input set is known before training starts
Guarantees that if you train successfully, you know what the network does!

Typical networks
Back Propagation, BAM

Unsupervised Learning Paradigms


Training data consists only of inputs to the network
Outputs for training sets are unknown until training is complete

Typical networks
Kohonen, Counter Propagation, Adaptive Resonance Theory

Unsupervised Learning: Example


Simple case
[Scatter plot of the six data points below]

Data:
x: .1 .8 .7 .2 .8 .3
y: .2 .9 .7 .1 .8 .1


Unsupervised Learning: Example


Complex case
10-dimensional data set!
0, .1, .3, .6, .3, .7, .6, .4, .9, .2
1, .3, .9, .5, .7, .1, .3, .2, .8, .1
..
..
What??

Might need Unsupervised Learning!



Unsupervised Learning Mechanisms


Competitive Learning
Neurons compete with each other to form the output
Uses include pattern classification

Lateral Inhibition
Neurons inhibit the output of nearby neurons
Uses include edge enhancement

Neighborhood Adaptations
Neurons and those nearby are adapted
Uses include self-organizing maps

Semi-Supervised Paradigms
A catch-all term for networks that are not cleanly supervised or unsupervised
Many possible forms:
Pre-processing of data to form clusters
Input/output specified, but internally some unsupervised learning takes place (Counter Propagation)
Post-processing of data

Unsupervised Neural Networks


Competitive Learning Network


[Diagram: an input pattern feeds the input layer, which feeds a competitive layer; neurons A, B, C give the output classification]

Competitive Learning Network


Output_j = 1, if S_j > S_k for all other units k
         = 0, otherwise

where S_j = Σ_i X_i · W_ji  (inputs X_i weighted by W_j1, W_j2, …, W_ji)
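The winner-take-all rule above can be sketched in NumPy (a minimal illustration; the function and variable names are my own):

```python
import numpy as np

def competitive_output(x, W):
    """Winner-take-all: S_j = sum_i X_i * W_ji; the largest S_j outputs 1, the rest 0."""
    s = W @ x                    # net input S_j for each output neuron j
    out = np.zeros(len(s))
    out[np.argmax(s)] = 1.0      # only the winning neuron fires
    return out

# Two output neurons with weight rows [W_j1, W_j2]
W = np.array([[0.3, 0.7],
              [0.6, 0.4]])
x = np.array([0.0, 1.0])
print(competitive_output(x, W))  # first neuron wins (0.7 > 0.4)
```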

Competitive Learning Algorithm


Assign random weights to all neurons (normalized: Σ_i W_ji = 1)
Calculate the output of each neuron
Determine the winner
Adjust the weights of the winner
Repeat for all input patterns until convergence

Competitive Learning Algorithm


Adjust weights of Winner
ΔW_ji = η · (X_i / m − W_ji)

Where
W_ji(t+1) = W_ji + ΔW_ji = new weight of neuron j from input i
η = learning coefficient (typically 0.1–0.3)
X_i = value of input i
m = squared magnitude of the input (for binary inputs, the sum of all 1s)
W_ji = old weight of neuron j from input i
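The winner's update ΔW_ji = η(X_i/m − W_ji) can be sketched as follows, assuming binary inputs so that m counts the active inputs:

```python
import numpy as np

def update_winner(w_j, x, eta=0.3):
    """Move the winner's weights toward the normalized input pattern."""
    m = x.sum()                       # for binary inputs: number of 1s
    return w_j + eta * (x / m - w_j)  # W_ji(t+1) = W_ji + eta * (X_i/m - W_ji)

# eta = 0.3, X = (0, 1): reproduces the worked update in the example slides
print(update_winner(np.array([0.3, 0.7]), np.array([0.0, 1.0])))
```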

Competitive Learning: Example


Two classes, A and B
Two inputs, X1 and X2

[Diagram: X1 and X2 connect to neurons A and B through weights WA1, WA2, WB1, WB2]

Competitive Learning: Example (Cont)

Let

WA1 = 0.3, WA2 = 0.7, WB1 = 0.6, WB2 = 0.4, and h = 0.3

If X1 = 0 and X2 = 1, then
A = 0.3*0 + 0.7*1 = 0.7
B = 0.6*0 + 0.4*1 = 0.4

Competitive Learning: Example (Cont)

Since A is the winner:
WA1 = 0.3 + 0.3(0/1 − 0.3) = 0.21
WA2 = 0.7 + 0.3(1/1 − 0.7) = 0.79

Repeat for all input pairs until convergence.

** Note: Σ_i ΔW_ji = η(Σ_i X_i / m − Σ_i W_ji) = 0, since Σ_i X_i = m and Σ_i W_ji = 1
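The note above (the winner's weight changes sum to zero, so the weights stay normalized) can be checked numerically with the values from the worked example:

```python
import numpy as np

eta = 0.3
w = np.array([0.3, 0.7])   # winner's weights, normalized to sum to 1
x = np.array([0.0, 1.0])
m = x.sum()

dw = eta * (x / m - w)     # per-input weight changes
print(dw.sum())            # ~0: the update redistributes weight, it adds none
print((w + dw).sum())      # ~1: normalization is preserved
```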

Competitive Learning: Note

This simple network is not guaranteed to converge if the input set is not separated into well-defined clusters, or if too few output neurons are provided! Fewer output neurons yield a more general classification; more output neurons yield more specialized classes.

Competitive Learning: A Simplified Pattern Classification Example

Training Set:
P1 = (101)
P2 = (100)
P3 = (010)
P4 = (011)

[Network Configuration: three inputs feeding two competitive output neurons, A and B]

Competitive Learning: A Simplified Pattern Classification Example

P1 = (101)
P2 = (100)
P3 = (010)
P4 = (011)

Hamming Distances:

     P1  P2  P3  P4
P1    0   1   3   2
P2    1   0   2   3
P3    3   2   0   1
P4    2   3   1   0

The network groups {P1, P2} into one class and {P3, P4} into the other (Class A / Class B).
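The Hamming-distance table can be reproduced directly from the four training patterns:

```python
# Pairwise Hamming distances between the training patterns
patterns = {"P1": "101", "P2": "100", "P3": "010", "P4": "011"}

def hamming(a, b):
    """Number of bit positions where the two patterns differ."""
    return sum(ca != cb for ca, cb in zip(a, b))

for name, p in patterns.items():
    print(name, [hamming(p, q) for q in patterns.values()])
# P1 and P2 differ in one bit, as do P3 and P4, so each pair clusters together
```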

A Simple Competitive Learning Example


Training Set

The training set consisted of all possible pairs of adjacent points

Resulting Classification by trained network


Classification of an irregular grid


Competitive Learning Notes


Useful as a simple pattern classifier
Can be used in more complex networks
Input patterns must be well organized
Unorganized input patterns can lead to unstable behavior

Competition and Inhibition Network


Max Network (like the Hamming network)
Only one unit has an output of 1; the rest are 0

Lateral Inhibition Network


Lateral Inhibition Notes


Inhibits nearby units only
Enhances peaks and edges of patterns
Requires tuning of inhibitory and excitatory parameters for desired results
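A minimal 1-D lateral inhibition sketch; the excitatory and inhibitory coefficients here are illustrative and, as the note says, would need tuning:

```python
import numpy as np

def lateral_inhibition(signal, excite=1.0, inhibit=0.4):
    """Each unit passes its own input but subtracts a fraction of its neighbors'."""
    out = np.empty_like(signal, dtype=float)
    for i in range(len(signal)):
        left = signal[i - 1] if i > 0 else 0.0
        right = signal[i + 1] if i < len(signal) - 1 else 0.0
        out[i] = excite * signal[i] - inhibit * (left + right)
    return out

# A step edge: the response overshoots on the bright side and undershoots on
# the dark side, enhancing the edge
step = np.array([1.0, 1.0, 1.0, 5.0, 5.0, 5.0])
print(lateral_inhibition(step))
```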
