Commit bbd766a (parent: 9e76236)

math fixes and note about pegel axis

1 file changed: README.md (19 additions, 14 deletions)
@@ -34,13 +34,13 @@

With $m=2$,

$$\mathbf{A}_m^{\dagger}\mathbf{b} = \mathbf{x}$$

will produce the coefficients for a straight line. Evaluate your first-degree polynomial via $ax+b$.

Plot the result using `matplotlib.pyplot`'s `plot` function.

#### Fitting a Polynomial to a function:
The straight line above is insufficient to model the data. Using your
implementation of `set_up_point_matrix`, set $m=300$ (to set up a square matrix) and fit the polynomial
by computing

$$\mathbf{A}^{\dagger}\mathbf{b} = \mathbf{x}_{\text{fit}}.$$
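The README does not show `set_up_point_matrix` itself; a minimal sketch, assuming it builds a point matrix whose columns are the powers $x^0, x^1, \dots, x^{m-1}$ of the sample points, so that $\mathbf{A}^{\dagger}\mathbf{b}$ yields the polynomial coefficients:

```python
import numpy as np

def set_up_point_matrix(x: np.ndarray, m: int) -> np.ndarray:
    """Point matrix with A[i, j] = x_i ** j for j = 0 .. m-1 (an assumption
    about the task's layout; adjust the column order to your own convention)."""
    return np.stack([x**j for j in range(m)], axis=-1)

x = np.linspace(0.0, 1.0, 50)
b = 2.0 * x + 1.0                 # noise-free line, just for this sketch
A = set_up_point_matrix(x, 2)     # m = 2: columns [1, x]
coeffs = np.linalg.pinv(A) @ b    # x = A^dagger b
# coeffs[0] is the intercept, coeffs[1] the slope of the fitted line.
```

With noisy data the same two lines give the least-squares straight line instead of an exact fit.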
@@ -66,30 +66,35 @@

into the form:

$$ \mathbf{A} = \mathbf{U} \Sigma \mathbf{V}^T $$

In SVD form, computing the inverse is simple: swap $\mathbf{U}$ and $\mathbf{V}$ and replace each of the $m$ singular values with its inverse

$$1/\sigma_i .$$

This results in the matrix

```math
\Sigma^\dagger = \begin{pmatrix}
\sigma_1^{-1} & & & \\\\
& \ddots & \\\\
& & \sigma_m^{-1} \\\\ \hline
& 0 &
\end{pmatrix}
```

A solution to the overfitting problem is to filter the singular values.
Compute a diagonal for a filter matrix by evaluating:

$$f_i = \sigma_i^2 / (\sigma_i^2 + \epsilon)$$

The idea is to loop over $i$ for all of the $m$ singular values.
Roughly speaking, multiplication by $f_i$ will filter out a singular value when

$$\sigma_i \lt \epsilon .$$

Apply the regularization by computing:

$$
\mathbf{x}_r = \mathbf{V} \mathbf{F} \mathbf{\Sigma}^\dagger
\mathbf{U}^T \mathbf{b}
$$
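The regularized solve above can be sketched in NumPy; the diagonal matrices $\mathbf{F}$ and $\Sigma^{\dagger}$ reduce to element-wise products on the singular values (the random `A`, `b`, and the choice of `eps` are placeholders for the task's actual data):

```python
import numpy as np

rng = np.random.default_rng(0)
m = n = 300                         # square case: the zero block vanishes
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
eps = 1e-6

# A = U @ diag(sigma) @ Vt
U, sigma, Vt = np.linalg.svd(A)

f = sigma**2 / (sigma**2 + eps)     # filter factors f_i
# x_r = V F Sigma^dagger U^T b, with F and Sigma^dagger applied element-wise
x_r = Vt.T @ (f * (1.0 / sigma) * (U.T @ b))
```

For small `eps` this approaches the plain pseudoinverse solution; raising `eps` suppresses the directions belonging to small singular values.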
@@ -98,14 +103,14 @@

with

$$\mathbf{A} \in \mathbb{R}^{m,n}, \mathbf{U} \in \mathbb{R}^{m,m}, \mathbf{V} \in \mathbb{R}^{n,n}, \mathbf{F} \in \mathbb{R}^{m,m}, \Sigma^{\dagger} \in \mathbb{R}^{n,m} \text{ and } \mathbf{b} \in \mathbb{R}^{m,1}.$$

Setting $m=300$ turns $\mathbf{A}$ into a square matrix. In this case, the zero block in the $\Sigma$-matrix disappears.
Plot the result for $\epsilon$ equal to 0.1, 1e-6, and 1e-12.

#### Model Complexity (Optional):
Another solution to the overfitting problem is to reduce the complexity of the model.
To assess the quality of the polynomial fit to the data, compute and plot the mean squared error (MSE), which measures how close the regression line is to the data points, for every polynomial degree up to 20.

MSE can be calculated using the following equation, where $N$ is the number of samples, $y_i$ is the original point and $\hat{y_i}$ is the predicted output.

$$MSE=\frac{1}{N} \sum_{i=1}^{N} (y_i-\hat{y_i})^2$$

From the plot, estimate the optimal polynomial degree, fit the polynomial with this new degree, and compare the regressions.
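The degree sweep can be sketched as follows; the synthetic `x` and `y` stand in for the task's samples, and `np.polyfit` stands in for your own pseudoinverse-based fit:

```python
import numpy as np

def mse(y: np.ndarray, y_hat: np.ndarray) -> float:
    """Mean squared error: (1/N) * sum_i (y_i - y_hat_i)^2."""
    return float(np.mean((y - y_hat) ** 2))

# Hypothetical data for the sketch; substitute the task's samples.
x = np.linspace(-1.0, 1.0, 100)
y = np.sin(3.0 * x) + 0.1 * np.cos(20.0 * x)

errors = []
for degree in range(1, 21):
    coeffs = np.polyfit(x, y, degree)   # least-squares fit of this degree
    y_hat = np.polyval(coeffs, x)
    errors.append(mse(y, y_hat))
# Plot `errors` against the degree; the elbow suggests the optimal degree.
```

Note that this is the training error, which can only decrease with degree; the elbow, not the minimum, indicates where extra complexity stops paying off.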
@@ -126,7 +131,7 @@

Plot the result. Compute the zero. When do the regression line and the x-axis intersect?

#### Fitting a higher order Polynomial:

Reusing the code you wrote for the proof-of-concept task, fit a polynomial of degree 20 to the data. Before plotting, have a closer look at `datetime_stamps` and its values, and scale the axis appropriately.
Plot the result.
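The axis note matters numerically as well: assuming `datetime_stamps` holds large POSIX-style timestamps (an assumption; the synthetic stamps and water levels below are placeholders), a degree-20 Vandermonde matrix built from raw values around 1.6e9 is hopelessly ill-conditioned. A common sketch is to shift and scale the time axis to roughly $[-1, 1]$ before fitting:

```python
import numpy as np

# Hypothetical stand-in: one reading per day over four years, as POSIX seconds.
datetime_stamps = 1.6e9 + 86400.0 * np.arange(4 * 365)
levels = np.sin(np.linspace(0.0, 8.0 * np.pi, datetime_stamps.size))

# High powers of values near 1.6e9 ruin the conditioning, so center the
# timestamps and scale them into [-1, 1] before building the polynomial.
t = datetime_stamps - datetime_stamps.mean()
t = t / np.abs(t).max()
coeffs = np.polyfit(t, levels, 20)
fit = np.polyval(coeffs, t)
# Plot `fit` against the original `datetime_stamps` to keep readable ticks.
```

The same centering and scaling must be applied to any new timestamps before evaluating the fitted polynomial.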