Model for solving regression tasks, in which the objective is to fit a line to the data and make predictions on new values. The input of this model is the feature matrix `X`, and the output is a vector of predictions that should be as close as possible to the actual `y` values. The linear regression formula is the sum of the bias term ($w_0$), which is the prediction when all the features are zero, plus each feature value multiplied by its corresponding weight.

So the simple linear regression formula looks like:

$$g(x_i) = w_0 + x_{i1} \cdot w_1 + x_{i2} \cdot w_2 + x_{i3} \cdot w_3$$

And that can be further simplified as:

$$g(x_i) = w_0 + \sum_{j=1}^{n} w_j \cdot x_{ij}$$
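The summation in the simplified formula is just a dot product between the weight vector and the feature vector. As a minimal sketch (the bias, weights, and feature values below are hypothetical, chosen to match the example code in this section):

```python
import numpy as np

w0 = 7.1                          # bias term (hypothetical)
w = np.array([0.01, 0.04, 0.002])   # weights, one per feature (hypothetical)
xi = np.array([148, 24, 1385])      # feature values of one observation (hypothetical)

# g(xi) = w0 + sum_j w_j * x_ij  ==  w0 + w . xi
pred = w0 + np.dot(w, xi)
```

Expressing the sum as `np.dot` is what makes the vectorized form fast compared to an explicit Python loop.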
Here is a simple implementation of linear regression in Python:
```python
w0 = 7.1                   # bias term
w = [0.01, 0.04, 0.002]    # weights, one per feature

def linear_regression(xi):
    n = len(xi)
    pred = w0
    for j in range(n):
        pred = pred + w[j] * xi[j]
    return pred
```
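As a quick check, we can call the function with a hypothetical feature vector for one observation (the feature values below are illustrative, not from the dataset):

```python
w0 = 7.1                   # bias term
w = [0.01, 0.04, 0.002]    # weights, one per feature

def linear_regression(xi):
    pred = w0
    for j in range(len(xi)):
        pred = pred + w[j] * xi[j]
    return pred

xi = [148, 24, 1385]       # hypothetical feature values for one observation
print(linear_regression(xi))   # ≈ 12.31
```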
If we look at the predictions, they are still on the logarithmic scale that was applied to the target during training. We need to make sure the result is shown on the untransformed scale by applying the inverse function, `exp()`.
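A minimal sketch of this inverse transform, assuming the target was transformed with `np.log1p` before training (a common choice when the target can be zero; its inverse is `np.expm1`, i.e. `exp(x) - 1`):

```python
import numpy as np

y = np.array([13500.0, 21000.0, 7900.0])   # hypothetical raw target values
y_log = np.log1p(y)                         # transform applied before training

# a trained model produces predictions on the log scale;
# here we reuse y_log as a stand-in for those predictions
pred_log = y_log

pred = np.expm1(pred_log)                   # inverse: back to the original scale
```

If the plain `np.log` transform was used instead, the inverse is simply `np.exp`.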
The entire code of this project is available in this Jupyter notebook.
The notes are written by the community. If you see an error here, please create a PR with a fix.