
Lecture 4

Recursive identification algorithm


In off-line or batch identification, data up to some t = N is first collected, and then the model parameter vector is computed. In on-line or recursive identification, the model parameter vector θ(n) is required at every t = n. It is important that the memory and computation time needed to update θ(n) do not increase with t; this places constraints on how the estimates may be computed. The general form of recursive algorithms is

θ(n) = θ(n-1) + K(n)ε(n)

where K(n) is a gain vector and ε(n) a prediction error. In the following we will derive the recursive form of the least squares algorithm by modifying the off-line algorithm.
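As a toy illustration of this general update form (not part of the lecture's derivation: the data are made up, and the simple gain K(n) = 1/n is not the RLS gain), the Matlab sketch below recursively estimates a constant from noisy measurements. With this gain, θ(n) is exactly the running sample mean; note that the work and memory per update do not grow with n.

    % Simplest instance of theta(n) = theta(n-1) + K(n)*eps(n):
    % recursive estimate of a constant (the running sample mean).
    y = 3 + 0.1*randn(100,1);    % hypothetical noisy measurements
    theta = 0;
    for n = 1:numel(y)
        e = y(n) - theta;        % prediction error eps(n)
        K = 1/n;                 % simple gain; RLS derives its own K(n)
        theta = theta + K*e;     % cost independent of n
    end
    disp(theta)                  % close to 3 (exactly mean(y))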
Derivation of recursive least squares

Suppose we have collected all the data up to time n. Consider the formation of the regression matrix Φ(n),

Φ(n) = [φᵀ(1); φᵀ(2); ...; φᵀ(n)]

(whose rows are the regressors φᵀ(t)), and of the output vector y(n) = [y(1), ..., y(n)]ᵀ. The least squares solution is

θ(n) = [Φ(n)ᵀΦ(n)]⁻¹Φ(n)ᵀy(n)

When a new data point arrives, n increases by 1, and computing the least squares solution as above requires repeating the whole calculation, including the matrix inverse (expensive in computer time and storage). Let us look at the expressions Φ(n)ᵀΦ(n) and Φ(n)ᵀy(n), and define

P⁻¹(n) = Φ(n)ᵀΦ(n)
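Before turning this definition into a recursion, here is a minimal Matlab sketch of the off-line solution on made-up data for a hypothetical model y(t) = a·u(t) + b, so φ(t) = [u(t); 1]. It shows exactly which products and inverse must be recomputed at every new sample.

    % Off-line (batch) least squares on hypothetical data
    n   = 50;
    u   = randn(n,1);
    y   = 2*u + 0.5 + 0.05*randn(n,1);
    Phi = [u, ones(n,1)];              % row t of Phi is phi(t)'
    theta = (Phi'*Phi) \ (Phi'*y);     % [Phi'*Phi]^(-1)*Phi'*y
    disp(theta')                       % close to [2.0, 0.5]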

P⁻¹(n) = Φ(n)ᵀΦ(n)
       = [φ(1), φ(2), ..., φ(n)] [φᵀ(1); φᵀ(2); ...; φᵀ(n)]
       = Σ_{t=1}^{n} φ(t)φᵀ(t)
       = Σ_{t=1}^{n-1} φ(t)φᵀ(t) + φ(n)φᵀ(n)
       = P⁻¹(n-1) + φ(n)φᵀ(n)

Similarly, for the other term,

Φ(n)ᵀy(n) = [φ(1), φ(2), ..., φ(n)] [y(1); y(2); ...; y(n)]
          = Σ_{t=1}^{n} φ(t)y(t)
          = Σ_{t=1}^{n-1} φ(t)y(t) + φ(n)y(n)
          = Φ(n-1)ᵀy(n-1) + φ(n)y(n)
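These sums are easy to check numerically. A short sketch with the same kind of made-up data as above: accumulating the rank-one terms sample by sample reproduces the batch products exactly.

    % Accumulate P^(-1)(n) and Phi(n)'*y(n) one sample at a time
    u   = randn(50,1);
    y   = 2*u + 0.5 + 0.05*randn(50,1);
    Phi = [u, ones(50,1)];
    Pinv = zeros(2);  Fy = zeros(2,1);
    for t = 1:50
        phi  = Phi(t,:)';            % regressor phi(t) as a column
        Pinv = Pinv + phi*phi';      % P^(-1)(t) = P^(-1)(t-1) + phi*phi'
        Fy   = Fy + phi*y(t);        % Phi(t)'*y(t) updated likewise
    end
    disp(norm(Pinv - Phi'*Phi))      % 0 (up to rounding)
    disp(norm(Fy - Phi'*y))          % 0 (up to rounding)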
Collecting these results:

P⁻¹(n) = P⁻¹(n-1) + φ(n)φᵀ(n)    (1)

Φ(n)ᵀy(n) = Φ(n-1)ᵀy(n-1) + φ(n)y(n)    (2)

The least squares estimate at time n is

θ(n) = [Φ(n)ᵀΦ(n)]⁻¹Φ(n)ᵀy(n) = P(n)Φ(n)ᵀy(n) = P(n)[Φ(n-1)ᵀy(n-1) + φ(n)y(n)]    (3)

Because the least squares estimate at time n-1 is given by θ(n-1) = P(n-1)Φ(n-1)ᵀy(n-1), it follows that

P⁻¹(n-1)θ(n-1) = Φ(n-1)ᵀy(n-1)    (4)

Substituting (4) into (3):
θ(n) = P(n)[P⁻¹(n-1)θ(n-1) + φ(n)y(n)]

Applying (1),

θ(n) = P(n)[P⁻¹(n)θ(n-1) - φ(n)φᵀ(n)θ(n-1) + φ(n)y(n)]
     = θ(n-1) + P(n)φ(n)[y(n) - φᵀ(n)θ(n-1)]
     = θ(n-1) + K(n)ε(n)

Thus the recursive least squares (RLS) equations are

ε(n) = y(n) - φᵀ(n)θ(n-1)    (5)

P(n) = [P⁻¹(n-1) + φ(n)φᵀ(n)]⁻¹    (6)

K(n) = P(n)φ(n)    (7)

θ(n) = θ(n-1) + K(n)ε(n)    (8)
Yet we still require a matrix inverse to be calculated in (6). However, this can be avoided through the matrix inversion lemma (to be derived next week).
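Putting (5)-(8) together, here is a Matlab sketch on made-up first-order data y(t) = a1·y(t-1) + b1·u(t-1) + e(t), so φ(t) = [y(t-1); u(t-1)]. It computes the inverse in (6) explicitly at every step, which is exactly the cost the matrix inversion lemma removes. Initialising P⁻¹(0) to a small multiple of the identity (so P(0) is large, reflecting little prior trust) is a common convention, not something derived above.

    % Recursive least squares, equations (5)-(8), on hypothetical data
    N = 200;  a1 = 0.8;  b1 = 0.5;
    u = randn(N,1);  y = zeros(N,1);
    for t = 2:N
        y(t) = a1*y(t-1) + b1*u(t-1) + 0.01*randn;
    end
    theta = zeros(2,1);              % estimate of [a1; b1]
    Pinv  = 1e-6*eye(2);             % P^(-1)(0) small => P(0) large
    for n = 2:N
        phi   = [y(n-1); u(n-1)];
        e     = y(n) - phi'*theta;   % (5) prediction error
        Pinv  = Pinv + phi*phi';     % running P^(-1)(n), as in (6)
        P     = inv(Pinv);           % explicit inverse; the lemma avoids this
        K     = P*phi;               % (7) gain
        theta = theta + K*e;         % (8) update
    end
    disp(theta')                     % close to [0.8, 0.5]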
Assignment

http://www.personal.reading.ac.uk/sis01xh/teaching/CS3assignment.htm

The assignment has three parts, all relating to recursive least squares system identification. Part A develops software to do recursive least squares identification. Part B provides some real data and lets you identify a mechanical master robot of a master/slave tele-manipulator. Part C (worth 20% of the marks) extends the RLS algorithm to include instrumental variables, or a PLS noise model based on the Moving Average Model, or both; you can treat Part C as optional. It is recommended that you use Matlab for the assignment, although you will not be penalised for using other programming languages.
Files

Matlab and data files are available at the WWW address:

http://www.personal.rdg.ac.uk/sis01xh/teaching/Sysid/datafiles/Datasets.htm

Here you will also find the files

loadass.m (loads the data into a suitable vector)
crib.m (a skeleton for your program)
dord2.m (a way of generating a test system)

Copy the programs and the data set (as specified by your name) into an appropriate directory and run it under Matlab. Enter a file name and you will then have a variable y in the Matlab workspace containing the corresponding response data. Use the skeleton code for the recursive least squares estimator (see crib.m).