Setting $m=300$ turns $A$ into a square matrix. In this case, the zero block in the $\Sigma$-matrix disappears.
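A minimal sketch of this point, assuming NumPy and using a random $300\times300$ stand-in for $A$ (the actual matrix comes from the task): for a square matrix the full SVD returns a square $\Sigma$, so no zero padding is needed.

```python
import numpy as np

# Hypothetical example: a random square matrix standing in for A.
A = np.random.rand(300, 300)

# Full SVD: U is 300x300, s holds the 300 singular values, Vt is 300x300.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# For a square A, Sigma is itself square, so there is no zero block
# below (or beside) the diagonal of singular values.
Sigma = np.diag(s)
print(U.shape, Sigma.shape, Vt.shape)   # (300, 300) (300, 300) (300, 300)

# Sanity check: A is reconstructed from U @ Sigma @ Vt.
print(np.allclose(A, U @ Sigma @ Vt))
```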
Plot the result for $\epsilon$ equal to $0.1$, $10^{-6}$, and $10^{-12}$.
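One possible reading of this step, sketched under the assumption that $\epsilon$ acts as the relative cutoff for small singular values when forming the pseudoinverse, and that a matrix `A` and right-hand side `b` from the previous steps are available (both are placeholders here):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical stand-ins for the matrix and right-hand side from the task.
A = np.random.rand(300, 300)
b = np.random.rand(300)

for eps in (0.1, 1e-6, 1e-12):
    # Singular values below eps * (largest singular value) are treated as zero.
    x = np.linalg.pinv(A, rcond=eps) @ b
    plt.plot(x, label=f"epsilon = {eps:g}")

plt.xlabel("index")
plt.ylabel("solution entry")
plt.legend()
plt.show()
```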
#### Model Complexity (Optional):
Another solution to the overfitting problem is reducing the complexity of the model.
To assess the quality of the polynomial fit to the data, compute and plot the Mean Squared Error (MSE), which measures how close the regression line is to the data points, for every polynomial degree up to 20.
MSE can be calculated using the following equation, where $N$ is the number of samples, $y_i$ is the original point, and $\hat{y}_i$ is the predicted output.
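$$\mathrm{MSE} = \frac{1}{N} \sum_{i=1}^{N} \left( y_i - \hat{y}_i \right)^2$$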
From the plot, estimate the optimal degree of the polynomial, fit a polynomial of this degree, and compare the resulting regression.
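A minimal sketch of this step, assuming the samples are available as NumPy arrays `x` and `y` (placeholder names, generated synthetically below) and that plain `np.polyfit`/`np.polyval` are used for the fits:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical sample data standing in for the dataset from the task.
x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x) + 0.1 * np.random.randn(x.size)

degrees = range(1, 21)
mse = []
for d in degrees:
    coeffs = np.polyfit(x, y, deg=d)        # fit polynomial of degree d
    y_hat = np.polyval(coeffs, x)           # predicted outputs on the same points
    mse.append(np.mean((y - y_hat) ** 2))   # MSE as defined above

plt.plot(list(degrees), mse, marker="o")
plt.xlabel("polynomial degree")
plt.ylabel("MSE")
plt.show()
```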
Plot the result. Compute the zero. When do the regression line and the x-axis intersect?
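For the intersection with the x-axis, a short sketch, assuming a straight-line fit obtained with `np.polyfit` (data and variable names here are placeholders):

```python
import numpy as np

# Hypothetical data; in the exercise these come from the dataset.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([-1.2, -0.1, 0.9, 2.1, 3.0])

slope, intercept = np.polyfit(x, y, deg=1)

# The regression line y = slope * x + intercept crosses the x-axis where y = 0.
x_zero = -intercept / slope
print(x_zero)
```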
#### Fitting a higher order Polynomial:
Re-using the code you wrote for the proof of concept task, fit a polynomial of degree 20 to the data. Before plotting, have a closer look at `datetime_stamps` and its values, and scale the axis appropriately.
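A sketch of one way to handle this, assuming `datetime_stamps` holds large raw timestamp values (e.g. seconds since the epoch) and `values` the corresponding measurements; shifting and rescaling the time axis before fitting is an assumption here, not prescribed by the task:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical stand-ins; in the exercise these come from the dataset.
datetime_stamps = np.linspace(1.6e9, 1.6e9 + 3.15e7, 200)  # ~one year of epoch seconds
values = np.random.randn(200).cumsum()

# Raw epoch seconds are huge numbers, which makes a degree-20 fit badly
# conditioned; shift and scale to, e.g., years relative to the first sample.
t = (datetime_stamps - datetime_stamps[0]) / (365.25 * 24 * 3600)

coeffs = np.polyfit(t, values, deg=20)
t_fine = np.linspace(t.min(), t.max(), 1000)

plt.plot(t, values, ".", label="data")
plt.plot(t_fine, np.polyval(coeffs, t_fine), label="degree-20 fit")
plt.xlabel("years since first sample")
plt.legend()
plt.show()
```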