y = ax^b + 5
where a and b are the unknown coefficients of the equation. In this tutorial, we will discuss a method for determining these unknown coefficients using a Taylor series expansion, assuming the data points are provided.
In the above equation, y is the dependent variable and x is the independent variable. Let's assume that we know ten data points for this model, so that the following equations may be written:
y1 = ax1^b + 5
y2 = ax2^b + 5
y3 = ax3^b + 5
⋮
y10 = ax10^b + 5
This model has a nonlinear dependence on the parameters a and b, so nonlinear regression can be used to estimate them. Nonlinear regression determines the values of the parameters that minimize the sum of the squares of the residuals, and the solution proceeds in an iterative manner. The Gauss-Newton method is one of the algorithms that minimize the sum of the squares of the residuals between a nonlinear equation and the data. A Taylor series expansion is used to express the nonlinear equation in an approximate, linear form; the least-squares method is then applied to estimate the parameters in a way that moves the solution toward minimizing the residuals. The relationship between the measurements and the model can be expressed as,
yi = f(xi; a, b, c) + ei,   i = 1, 2, 3, …, 10
where yi is a measured value of the dependent variable; f(xi; a, b, c) is the nonlinear function of the independent variable xi, with two unknown parameters, a and b, and one given parameter, c, which is 5 according to the given correlation; and the last term, ei, is the random error. For convenience, the model may be represented in a short form without the parameters as,
yi = f(xi) + ei (1)
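Before moving on to the Taylor series, it may help to see equation (1) in code. The following is a minimal MATLAB sketch; the names f, e, x, and y are illustrative assumptions, with x and y standing for column vectors of the ten measured points:

```matlab
% Model with the two unknown parameters a and b; c = 5 is the given constant
f = @(x, a, b) a .* x.^b + 5;

% Residuals ei = yi - f(xi) for column vectors x, y of the 10 data points
e = @(x, y, a, b) y - f(x, a, b);
```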
For the two-parameter case, a Taylor series expansion is written for the first two terms as,

f(xi)j+1 = f(xi)j + (∂f(xi)j/∂a)∆a + (∂f(xi)j/∂b)∆b    (2)
Here, j is the initial guess, j+1 is the prediction, ∆a = aj+1 - aj, and ∆b = bj+1 - bj. Substituting equation (2) into equation (1) yields,

yi - f(xi)j = (∂f(xi)j/∂a)∆a + (∂f(xi)j/∂b)∆b + ei    (3)
In matrix form, equation (3) may be written as,

{D} = [Zj]{∆A} + {E}    (4)
where {E} is the vector of the random errors and [Zj] is the matrix of the partial derivatives of the function evaluated at the initial guess j,

[Zj] = [∂f(x1)/∂a   ∂f(x1)/∂b
        ∂f(x2)/∂a   ∂f(x2)/∂b
            ⋮            ⋮
        ∂f(x10)/∂a  ∂f(x10)/∂b]
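For the present model, f(x) = ax^b + 5, the two columns of [Zj] follow from differentiating with respect to a and b: ∂f/∂a = x^b and ∂f/∂b = a·x^b·ln(x). A minimal MATLAB sketch, assuming x holds the ten x-values as a column vector (with x > 0) and aj, bj hold the current guesses:

```matlab
% Columns of [Zj] at the current guess (aj, bj), assuming x > 0:
%   df/da = x^b        df/db = a * x^b * ln(x)
Z = [x.^bj, aj .* x.^bj .* log(x)];   % 10-by-2 matrix [Zj]
```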
The vector {D} contains the differences between the measurements and the function values at the 10 points,

{D} = {y1 - f(x1), y2 - f(x2), …, y10 - f(x10)}ᵀ
And the vector {∆A} contains the changes in the parameters' values,

{∆A} = {∆a, ∆b}ᵀ
Applying the least-squares method to equation (4) results in the following normal equation,

[Zjᵀ Zj]{∆A} = Zjᵀ{D}    (5)
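In MATLAB, equation (5) can be solved directly with the backslash operator. A sketch continuing from the Z above, with y again holding the measurements:

```matlab
D  = y - (aj .* x.^bj + 5);   % residual vector {D}
dA = (Z' * Z) \ (Z' * D);     % solves [Zj'Zj]{dA} = Zj'{D} for {dA}
```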
Equation (5) is solved for {∆A}, which is then used to estimate the improved values of the parameters a and b as,
aj+1 = aj + ∆a
bj+1 = bj + ∆b
This procedure is repeated until the solution converges.
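Putting the pieces together, the following is a minimal MATLAB sketch of the complete iteration for this model; the initial guesses, tolerance, and iteration cap are illustrative assumptions rather than prescribed values:

```matlab
% Gauss-Newton fit of y = a*x^b + 5 to 10 data points (a minimal sketch)
% x, y : column vectors of the measured points (assumed given, x > 0)
aj  = 1;  bj = 1;           % initial guesses (assumed)
tol = 1e-8;  maxit = 50;    % stopping tolerance and iteration cap (assumed)

for j = 1:maxit
    % Matrix of partial derivatives [Zj] at the current guess
    Z = [x.^bj, aj .* x.^bj .* log(x)];
    % Residual vector {D}
    D = y - (aj .* x.^bj + 5);
    % Normal equation (5): solve [Zj'Zj]{dA} = Zj'{D}
    dA = (Z' * Z) \ (Z' * D);
    % Improved parameter values
    aj = aj + dA(1);
    bj = bj + dA(2);
    if norm(dA) < tol       % stop when the corrections become negligible
        break
    end
end
```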
#TaylorSeries #Polynomial #NonlinearRegression #LeastSquare #Matlab #Blog #Blogger