Setting $m=300$ turns $A$ into a square matrix. In this case, the zero block in the sigma-matrix disappears.
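The effect on the decomposition can be sketched with numpy. This is a minimal illustration (using a small stand-in size rather than $m=300$): for a square $A$, the full SVD returns square $U$ and $V^T$, so $\Sigma = \mathrm{diag}(\sigma_1,\dots,\sigma_m)$ is itself square and needs no zero block to pad the dimensions.

```python
import numpy as np

m = 4  # small stand-in for m = 300
rng = np.random.default_rng(0)
A = rng.standard_normal((m, m))

# Full SVD of a square matrix: U and Vt are both m x m,
# so Sigma is square and the zero padding block disappears.
U, S, Vt = np.linalg.svd(A)
Sigma = np.diag(S)  # no zero rows or columns needed

# A is recovered exactly (up to floating-point error)
print(np.allclose(A, U @ Sigma @ Vt))
```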
Plot the result for $\varepsilon$ equal to $0.1$, $10^{-6}$, and $10^{-12}$.
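A possible shape for this comparison is sketched below. It assumes $\varepsilon$ acts as a threshold below which singular values are discarded when forming the pseudoinverse solution; the matrix `A` and right-hand side `b` here are hypothetical stand-ins for the exercise's data.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for saving to file
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
A = rng.standard_normal((300, 10))  # placeholder system matrix
b = rng.standard_normal(300)        # placeholder right-hand side

# Thin SVD: U is 300 x 10, S has 10 singular values, Vt is 10 x 10.
U, S, Vt = np.linalg.svd(A, full_matrices=False)

fig, ax = plt.subplots()
for epsilon in (0.1, 1e-6, 1e-12):
    # Invert only the singular values above the threshold epsilon;
    # the rest are treated as zero (truncated pseudoinverse).
    S_inv = np.where(S > epsilon, 1.0 / S, 0.0)
    x = Vt.T @ (S_inv * (U.T @ b))
    ax.plot(x, label=f"epsilon = {epsilon:g}")
ax.set_xlabel("coefficient index")
ax.legend()
fig.savefig("epsilon_comparison.png")
```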
#### Model Complexity (Optional):
Another solution to the overfitting problem is reducing the complexity of the model.
To assess the quality of the polynomial fit to the data, compute and plot the Mean Squared Error (MSE), which measures how close the regression line is to the data points, for every degree of polynomial up to 20.
MSE can be calculated using the following equation, where $N$ is the number of samples, $y_i$ is the original point and $\hat{y}_i$ is the predicted output:

$$\mathrm{MSE} = \frac{1}{N}\sum_{i=1}^{N}\left(y_i - \hat{y}_i\right)^2$$
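The degree sweep can be sketched as follows. The arrays `x` and `y` here are hypothetical placeholders for the exercise's data; on the real data only the fitting loop and the MSE helper carry over.

```python
import numpy as np

def mse(y, y_hat):
    """Mean Squared Error: (1/N) * sum((y_i - y_hat_i)^2)."""
    return np.mean((y - y_hat) ** 2)

# Hypothetical stand-in data; replace with the exercise's arrays.
rng = np.random.default_rng(2)
x = np.linspace(-1, 1, 50)
y = np.sin(3 * x) + 0.1 * rng.standard_normal(x.size)

degrees = list(range(1, 21))
errors = []
for d in degrees:
    coeffs = np.polyfit(x, y, d)    # least-squares polynomial fit of degree d
    y_hat = np.polyval(coeffs, x)   # predictions at the training points
    errors.append(mse(y, y_hat))

# Plot MSE against polynomial degree, e.g. with matplotlib:
# plt.plot(degrees, errors); plt.xlabel("degree"); plt.ylabel("MSE")
```

Note that this is the training error: on the same points used for fitting, the MSE can only decrease (or stay flat) as the degree grows, which is worth keeping in mind when answering the question below.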
Are the degree of the polynomial and the MSE linked?
Plot the result. Compute the zero. When do the regression line and the x-axis intersect?
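For a degree-1 fit, the zero has a closed form. A minimal sketch, assuming a linear regression line $y = ax + b$ (the data arrays here are hypothetical placeholders):

```python
import numpy as np

# Hypothetical stand-in data for a roughly linear trend.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([-1.9, -1.1, 0.1, 0.9, 2.1])

slope, intercept = np.polyfit(x, y, 1)  # fit y = slope * x + intercept
zero = -intercept / slope               # line crosses the x-axis where y = 0
print(zero)
```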
#### Fitting a higher order Polynomial:
Re-using the code you wrote for the proof of concept task, fit a polynomial of degree 20 to the data. Before plotting, have a closer look at `datetime_stamps` and its values, and scale the axis appropriately.
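The reason for that closer look can be sketched as follows. This assumes `datetime_stamps` holds POSIX timestamps in seconds (values on the order of $10^9$, which make a degree-20 Vandermonde matrix hopelessly ill-conditioned); the arrays here are hypothetical stand-ins for the exercise's data, and the fix is to shift and scale the time axis to a small range such as $[-1, 1]$ before fitting.

```python
import numpy as np

# Hypothetical stand-in for the exercise's `datetime_stamps`:
# POSIX timestamps in seconds spanning one year.
datetime_stamps = np.linspace(1.6e9, 1.6e9 + 86400 * 365, 200)
values = np.sin(np.linspace(0, 6, 200))  # placeholder measurements

# Raw timestamps ~1e9 raised to the 20th power ruin the conditioning,
# so map the time axis to [-1, 1] before fitting.
span = datetime_stamps.max() - datetime_stamps.min()
t = (datetime_stamps - datetime_stamps.mean()) / span * 2

coeffs = np.polyfit(t, values, 20)  # degree-20 fit on the scaled axis
fit = np.polyval(coeffs, t)
print(np.mean((fit - values) ** 2))  # training MSE on the scaled axis
```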