
# Polynomial-Regression-From-Scratch

Polynomial regression using the normal equation and gradient descent methods.

## Dependencies

## Licence

This code is licensed under the MIT License - see the LICENSE.md file for details.

## Synopsis

The goal of polynomial regression is to fit an nth-degree polynomial to data in order to establish a general relationship between the independent variable x and the dependent variable y. Polynomial regression is a special form of multiple linear regression, in which the objective is to minimize the cost function given by:

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

and the hypothesis is given by the linear model:

$$h_\theta(x) = \theta_0 + \theta_1 x + \theta_2 x^2 + \dots + \theta_n x^n$$

The PolynomialRegression class can perform polynomial regression using two different methods: the normal equation and gradient descent. The normal equation method uses the closed-form solution to linear regression:

$$\theta = (X^T X)^{-1} X^T y$$

and does not require iterative computation or feature scaling. Gradient descent is an iterative approach that repeatedly updates theta in the direction opposite the gradient of the cost function.
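The closed-form path can be sketched in a few lines of NumPy. This is an illustrative example only, not the repository's implementation; `design_matrix` and `fit_normal_equation` are hypothetical helper names:

```python
import numpy as np

def design_matrix(x, order):
    # Columns [1, x, x^2, ..., x^order] for each sample in x
    return np.vander(np.asarray(x, dtype=float), order + 1, increasing=True)

def fit_normal_equation(x, y, order):
    X = design_matrix(x, order)
    # Solve (X^T X) theta = X^T y directly instead of forming the
    # matrix inverse explicitly, which is cheaper and numerically safer.
    return np.linalg.solve(X.T @ X, X.T @ np.asarray(y, dtype=float))
```

Because the solution is exact (up to floating-point error), fitting noiseless data generated from a known polynomial recovers its coefficients.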

## Code Example 1: Normal Equation Method

```python
x_pts, y_pts = generatePolyPoints(0, 50, 100, [5, 1, 1],
                                  noiseLevel=2, plot=1)
PR = PolynomialRegression(x_pts, y_pts)
theta = PR.fit(method='normal_equation', order=2)
PR.plot_predictedPolyLine()
```

## Code Example 2: Gradient Descent Method

```python
x_pts, y_pts = generatePolyPoints(0, 50, 100, [5, 1, 1],
                                  noiseLevel=2, plot=1)
PR = PolynomialRegression(x_pts, y_pts)
theta = PR.fit(method='gradient_descent', order=2, tol=1e-3,
               numIters=100, learningRate=1e-4)
PR.plot_predictedPolyLine()
PR.plotCost()
```
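A gradient-descent fit of this kind can be sketched as follows. This is an assumed, self-contained illustration, not the repository's API; the parameter names mirror the example above, and the returned cost history is the kind of data a method like `plotCost` would display:

```python
import numpy as np

def fit_gradient_descent(x, y, order, learningRate=1e-4, numIters=100, tol=1e-3):
    # Design matrix with columns [1, x, x^2, ..., x^order]
    X = np.vander(np.asarray(x, dtype=float), order + 1, increasing=True)
    y = np.asarray(y, dtype=float)
    m = len(y)
    theta = np.zeros(order + 1)
    cost_history = []
    for _ in range(numIters):
        residual = X @ theta - y
        cost_history.append((residual @ residual) / (2 * m))  # J(theta)
        # Step opposite the gradient of the cost function
        theta -= learningRate * (X.T @ residual) / m
        # Stop early once the cost change falls below tol
        if len(cost_history) > 1 and abs(cost_history[-2] - cost_history[-1]) < tol:
            break
    return theta, cost_history
```

Note that, unlike the normal equation, the learning rate must suit the scale of the features: for x ranging up to 50, unscaled powers of x make the default `learningRate` too large or convergence very slow, which is why feature scaling is usually paired with gradient descent.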