Polynomial regression with R

In this post I’ll show you how simple it is to use R to fit a polynomial linear model to our data.

Let’s assume we have four points with the following X,Y coordinates:

> y <- c(10,0,3,50)
> x <- c(1,2,3,4)


Which, plotted, look like this:
> plot(x,y)

[Figure: scatter plot of the four points]

Now we can try to find a polynomial that fits those points.

> reg <- glm(y ~ I(x) + I(x^2) + I(x^3))
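
As a side note, the same cubic can be fitted in a couple of equivalent ways; this is just a sketch of alternatives, the rest of the post sticks with the glm() call above.

> # equivalent fits, shown only as a sketch - not part of the original example
> reg_lm   <- lm(y ~ I(x) + I(x^2) + I(x^3))   # lm() gives the same Gaussian fit as glm()
> reg_poly <- lm(y ~ poly(x, 3, raw = TRUE))   # poly() builds the raw powers of x for us
> coef(reg_lm)    # same estimates as the glm() fit
> coef(reg_poly)  # same estimates, just different coefficient names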

Calling summary() prints out the coefficients of our model, where the model is

y = a + bx + cx^2 + dx^3 + error

> summary(reg)

Call:
glm(formula = y ~ I(x) + I(x^2) + I(x^3))

Deviance Residuals:
[1] 0 0 0 0

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)    2.000         NA      NA       NA
I(x)          27.333         NA      NA       NA
I(x^2)       -24.500         NA      NA       NA
I(x^3)         5.167         NA      NA       NA

(Dispersion parameter for gaussian family taken to be NaN)

Null deviance: 1.6168e+03 on 3 degrees of freedom
Residual deviance: 3.3101e-27 on 0 degrees of freedom
AIC: -228.08

Number of Fisher Scoring iterations: 1
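
The standard errors come out as NA because we estimated four coefficients from four points: there are zero residual degrees of freedom left, and the cubic passes exactly through the data (the deviance residuals above are all 0). If you don’t feel like copying the estimates by hand, you can also pull them straight out of the fitted model; a small sketch, using the reg object from above:

> cf <- coef(reg)   # named vector with (Intercept), I(x), I(x^2), I(x^3)
> round(cf, 3)      # 2.000, 27.333, -24.500, 5.167 - the same values as in the table
> fitted(reg)       # reproduces y exactly, since the fit is perfect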


To draw the fitted curve on the same plot, we can use the curve() function:

> curve(2 + 27.333*x - 24.500*x^2 + 5.167*x^3, add=TRUE)

[Figure: the four points with the fitted cubic curve drawn through them]

Nice curve, eh? 😀
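
By the way, if you’d rather not type the coefficients in by hand, the same curve can be drawn from the fitted model itself. Here is a minimal sketch (the xs/ys names are just for this example), again assuming the reg, x and y objects from above:

> xs <- seq(1, 4, length.out = 200)                 # a fine grid over the x range
> ys <- predict(reg, newdata = data.frame(x = xs))  # predictions from the fitted cubic
> plot(x, y)                                        # the four points again
> lines(xs, ys)                                     # the fitted curve drawn through them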
