
Fitting a Curve to a Data Set with the New Gradient Operator

This article explains how to use the Mathcad function genfit to fit a general,
nonlinear function to a data set. It also shows how to use genfit with the new
gradient operator, introduced in Mathcad 14, to obtain a better fit in certain
cases.
Fitting a Curve to a Data Set
Curve fitting - or regression analysis - is the process of finding a curve that
best fits a set of data points. Typically, you specify the curve by a model
function with unknown parameters. Your goal is to find the values of the
parameters that minimize the overall difference between the data values and
the corresponding function values.
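The "overall difference" is usually measured as the sum of squared residuals. A minimal Python sketch of that objective, using hypothetical toy data and a hypothetical linear model (not the article's data or genfit itself):

```python
import numpy as np

def sse(params, x, y, f):
    """Sum of squared residuals between data y and model values f(x, *params)."""
    return float(np.sum((y - f(x, *params)) ** 2))

# hypothetical toy data and a hypothetical model y = A*x with A = 2
x = np.array([0.0, 1.0, 2.0])
y = np.array([0.0, 2.1, 3.9])
val = sse([2.0], x, y, lambda x, A: A * x)
print(val)  # 0.02 (residuals 0, 0.1, -0.1)
```

Curve fitting searches for the parameter values that make this quantity as small as possible.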
Start by looking at an example that uses the data in the table below:
data :=
     0      0
     0.2    0.1546
     0.4    0.49
     0.6    1.0249
     0.8    1.3897
     1      1.9376
     1.2    3.3416
     1.4    4.836
     1.6    5.9844
     1.8    6.1197
     2      8.4171
The first column contains the x-values of the data, while the second column
contains the y-values.
X := data⟨0⟩        Y := data⟨1⟩
Here is a plot of the data in the xy-plane. Each point corresponds to a pair of
numbers (x, y) in a single row of the table.
[Plot: the data points Y vs. X; x-axis from 0 to 2, y-axis from 0 to 10]
Suppose you believe that the data is modeled by a function of the form
A·x^b, where x is a variable and A and b are unknown parameters.
You can use the Mathcad function genfit to fit a nonlinear function like
A·x^b to the data. The function genfit has the following form:
genfit(X, Y, guess, f)
where
- X and Y are vectors containing the x-values and the y-values,
respectively, of the data.
- guess is a vector of initial guess values for the parameters.
- f is the model function.
In this example, define f as
f(x, A, b) := A·x^b
Notice that the parameters A and b are listed after x as arguments to the
function.
Next, create a vector of guess values for A and b:
guess :=
    ⎛ 5 ⎞
    ⎝ 2 ⎠
The first entry is the guess for A and the second is the guess for b. Now
apply genfit to return a vector containing the parameters A and b that give
the best fit.
⎛ A ⎞ := genfit(X, Y, guess, f)
⎝ b ⎠
A = 2.374        b = 1.799
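Outside Mathcad, the same fit can be reproduced with, for example, SciPy's curve_fit, an analogue of genfit (a hedged sketch, not the Mathcad function itself):

```python
import numpy as np
from scipy.optimize import curve_fit

# the article's data: x from 0 to 2 in steps of 0.2
X = np.linspace(0.0, 2.0, 11)
Y = np.array([0.0, 0.1546, 0.49, 1.0249, 1.3897, 1.9376,
              3.3416, 4.836, 5.9844, 6.1197, 8.4171])

def f(x, A, b):
    # model function A*x**b
    return A * x ** b

# same initial guesses as in the article: A = 5, b = 2
params, _ = curve_fit(f, X, Y, p0=[5.0, 2.0])
A, b = params
print(A, b)  # the article reports A = 2.374, b = 1.799
```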
The graph of the function f(x, A, b) = A·x^b with these parameter
values is shown below, along with the original data points.
[Plot: the data points (X, Y) and the fitted curve f(x, A, b) vs. x]
Using the Mathcad 14 Gradient Operator to Improve the Fit
In some cases, you can improve the results of genfit by supplying it
with the partial derivatives of the model function with respect to the
unknown parameters. This is often the case when the model function
is a rational function. The partial derivatives help guide the genfit
algorithm toward the optimal solution. The gradient operator
provides an easy way to compute the partial derivatives.
Here's an example.
data1 :=
    -0.6    0.526
    -0.5    0.369
    -0.4    0.3637
    -0.3    0.2878
    -0.2    0.2352
    -0.1    0.2197
     0      0.2347
     0.1    0.2241
     0.2    0.2212
     0.3    0.2199
     0.4    0.2056
X1 := data1⟨0⟩        Y1 := data1⟨1⟩
You can model the data with a rational function of the form
g(x, a) := (a₀ + a₁·x) / (a₂ + a₃·x)
in which the unknown parameters a₀, a₁, a₂, and a₃ are contained in a
single vector a. Note that g is undefined when its denominator is 0.
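The rational model can be written as a short Python function for reference (a sketch mirroring the Mathcad definition, with the parameters passed individually rather than as a vector):

```python
def g(x, a0, a1, a2, a3):
    # rational model (a0 + a1*x) / (a2 + a3*x);
    # undefined where the denominator a2 + a3*x is 0, i.e. at x = -a2/a3
    return (a0 + a1 * x) / (a2 + a3 * x)

val = g(0.0, 1.0, 1.0, 2.0, 2.0)
print(val)  # (1 + 0) / (2 + 0) = 0.5
```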
Now apply genfit as in the previous example.
guess1 :=
    ⎛ 1 ⎞
    ⎜ 1 ⎟
    ⎜ 2 ⎟
    ⎝ 2 ⎠
badparams := genfit(X1, Y1, guess1, g)
As you can see from the graph below, the resulting parameters do not give a
very good fit.
[Plot: the data points (X1, Y1) and the curve g(x, badparams) vs. x]
To get a better fit, you can supply the last argument to genfit as a vector G,
whose first entry is the model function g, and whose remaining entries are the
partial derivatives of g with respect to the unknown parameters.
The Gradient Operator
To compute the partial derivatives of g, press [Ctrl] [Shift] G to insert the
Mathcad 14 gradient operator.
Next, type g(x,a), the model function, in the upper-right placeholder, and type
a, the vector of parameters, in the lower-left placeholder.

∇ₐ g(x, a)
Press [Ctrl] [Shift] [.] to evaluate the result symbolically.

∇ₐ g(x, a) →
    ⎛  1/(a₂ + a₃·x)                ⎞
    ⎜  x/(a₂ + a₃·x)                ⎟
    ⎜ -(a₀ + a₁·x)/(a₂ + a₃·x)²     ⎟
    ⎝ -x·(a₀ + a₁·x)/(a₂ + a₃·x)²   ⎠
The result is a vector containing the partial derivatives of g with respect to the
unknown parameters in a. Note that the vector does not contain the partial
derivative of g with respect to x.
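The same symbolic gradient can be reproduced outside Mathcad with, for example, Python's sympy (an analogue of the gradient operator, not the Mathcad feature itself):

```python
import sympy as sp

x, a0, a1, a2, a3 = sp.symbols('x a0 a1 a2 a3')
g = (a0 + a1 * x) / (a2 + a3 * x)

# partial derivatives with respect to the parameters only (not x),
# in the same order as the vector a
grad = [sp.simplify(sp.diff(g, p)) for p in (a0, a1, a2, a3)]
for d in grad:
    print(d)
```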
To create the vector G for genfit, use the stack function to put the
model function g on top of the vector of partial derivatives.
G(x, a) := stack(g(x, a), ∇ₐ g(x, a))
G(x, a) →
    ⎛  (a₀ + a₁·x)/(a₂ + a₃·x)      ⎞
    ⎜  1/(a₂ + a₃·x)                ⎟
    ⎜  x/(a₂ + a₃·x)                ⎟
    ⎜ -(a₀ + a₁·x)/(a₂ + a₃·x)²     ⎟
    ⎝ -x·(a₀ + a₁·x)/(a₂ + a₃·x)²   ⎠
Now, apply genfit with the vector G as its last argument as follows:
1. Type
goodparams := genfit(X1, Y1, guess1, G)
2. Right-click the function genfit and select Levenberg-Marquardt
from the drop-down menu.
Note: For this problem, the Levenberg-Marquardt algorithm is more
effective than the default algorithm, Optimized Levenberg-Marquardt.
Unlike with the default algorithm, you must supply the partial derivatives
when using Levenberg-Marquardt.
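The analytic-Jacobian fit has a close analogue in SciPy: curve_fit with method='lm' (Levenberg-Marquardt) accepts a callable Jacobian, playing the role the vector G plays for genfit. A hedged sketch, not the Mathcad workflow itself:

```python
import numpy as np
from scipy.optimize import curve_fit

# the article's second data set
X1 = np.array([-0.6, -0.5, -0.4, -0.3, -0.2, -0.1,
                0.0, 0.1, 0.2, 0.3, 0.4])
Y1 = np.array([0.526, 0.369, 0.3637, 0.2878, 0.2352, 0.2197,
               0.2347, 0.2241, 0.2212, 0.2199, 0.2056])

def g(x, a0, a1, a2, a3):
    # rational model (a0 + a1*x) / (a2 + a3*x)
    return (a0 + a1 * x) / (a2 + a3 * x)

def jac(x, a0, a1, a2, a3):
    # analytic partial derivatives of g with respect to a0..a3,
    # one column per parameter (same expressions as the gradient above)
    d = a2 + a3 * x
    return np.stack([1 / d, x / d,
                     -(a0 + a1 * x) / d**2,
                     -x * (a0 + a1 * x) / d**2], axis=1)

params, _ = curve_fit(g, X1, Y1, p0=[1.0, 1.0, 2.0, 2.0],
                      jac=jac, method='lm', maxfev=5000)
ssr = float(np.sum((Y1 - g(X1, *params)) ** 2))
print(ssr)  # the article reports about 0.004 for the good fit
```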
The graph below clearly shows that the second set of parameters,
goodparams, provides a better fit to the data than badparams:
[Plot: the data points (X1, Y1) with the curves g(x, badparams) and
g(x, goodparams) vs. x]
You can confirm that goodparams provides a better fit by computing
the sums of squares of the residuals for the two fits.
goodresiduals := Y1 - g(X1, goodparams)        badresiduals := Y1 - g(X1, badparams)
i := 0 .. 10
Σᵢ (goodresidualsᵢ)² = 0.004        Σᵢ (badresidualsᵢ)² = 0.083