## Polynomial Regression using Python

### Brian H. Russell

In this Lab, I will discuss polynomial regression methods. Regression methods can be broken into two categories: linear and nonlinear. There is only one linear approach, and that is where we will begin. However, there are various nonlinear regression methods, such as polynomial regression and the radial basis function approach, also called kernel regression. Linear regression is the simplest polynomial regression approach, since it uses a first-order polynomial. I will compare linear regression with two higher-order polynomial regression approaches: quadratic and cubic. This will illustrate the concepts of under-fitting, exact-fitting, and over-fitting using polynomial regression. I will start with the primal solution, which is the well-known approach. I will then discuss the dual solution, which is not as well known. The advantage of the dual solution is that it leads directly to kernel regression, which I will discuss in Lab 23.
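To make the primal/dual distinction concrete, here is a minimal sketch in NumPy. The three data points and the quadratic fit are hypothetical, chosen only for illustration; the point is that the primal route solves a system whose size is the number of polynomial coefficients, while the dual route solves a system whose size is the number of data points, and both recover the same coefficients when the fit is exact.

```python
import numpy as np

# Hypothetical three-point data set (illustrative only, not the Lab's data).
x = np.array([-1.0, 0.0, 1.0])
y = np.array([1.0, 0.0, 2.0])

# Design matrix for a quadratic polynomial: columns are 1, x, x^2.
Phi = np.vander(x, N=3, increasing=True)

# Primal solution: solve (Phi^T Phi) w = Phi^T y, a system in the
# coefficient space (3 x 3 here because there are 3 coefficients).
w_primal = np.linalg.solve(Phi.T @ Phi, Phi.T @ y)

# Dual solution: solve (Phi Phi^T) alpha = y in the data space
# (3 x 3 here because there are 3 data points), then w = Phi^T alpha.
alpha = np.linalg.solve(Phi @ Phi.T, y)
w_dual = Phi.T @ alpha

# Both routes give the same polynomial coefficients.
print(w_primal, w_dual)
```

Note that the dual route only ever touches the data through the matrix of inner products `Phi @ Phi.T`, which is exactly the property that kernel regression exploits.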

These methods will all be illustrated with Python examples applied to a simple three-point problem and a more complex ten-point sine wave with additive noise.
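As a preview of the second example, the following sketch builds a ten-point noisy sine wave and fits polynomials of increasing order with `np.polyfit`. The seed, noise level, and chosen degrees are my own assumptions for illustration; the idea is that a first-order fit under-fits, while a ninth-order polynomial can pass exactly through all ten points, which is the over-fitting (exact-fitting) regime.

```python
import numpy as np

rng = np.random.default_rng(0)  # hypothetical seed, for reproducibility

# Ten samples of one cycle of a sine wave with additive Gaussian noise
# (the 0.2 noise level is an assumption, not the Lab's value).
x = np.linspace(0.0, 1.0, 10)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(10)

# Least-squares polynomial fits of increasing order.
# Degree 1 under-fits; degree 9 interpolates all ten points exactly.
fits = {deg: np.polyfit(x, y, deg) for deg in (1, 3, 9)}

for deg, coeffs in fits.items():
    residual = y - np.polyval(coeffs, x)
    rms = np.sqrt(np.mean(residual ** 2))
    print(f"degree {deg}: RMS misfit = {rms:.6f}")
```

The RMS misfit necessarily shrinks as the degree grows, reaching (numerically) zero at degree nine; the Lab's point is that a small misfit on the training points says nothing about how well the polynomial tracks the underlying sine wave between them.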