Commit 9e1aa5d: Initial commit

0 parents, 27 files changed: +1122, -0 lines changed.

.flake8

Lines changed: 37 additions & 0 deletions
```ini
#########################
# Flake8 Configuration  #
# (.flake8)             #
#########################
[flake8]
ignore =
    S101
    S301
    S403
    S404
    S603
    W503
    E203
    E402
    DAR101
    DAR201
    N400
exclude =
    .tox,
    .git,
    __pycache__,
    docs/source/conf.py,
    build,
    dist,
    tests/fixtures/*,
    *.pyc,
    *.bib,
    *.egg-info,
    .cache,
    .eggs,
    data.
max-line-length = 120
max-complexity = 20
import-order-style = pycharm
application-import-names =
    seleqt
    tests
```

.github/workflows/test.yml

Lines changed: 23 additions & 0 deletions
```yaml
name: Tests

on: [ push, pull_request ]

jobs:
  lint:
    name: Lint
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [3.11.0]
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: pip install nox
      - name: Run flake8
        run: nox -s lint
      - name: Run mypy
        run: nox -s typing
```

.gitignore

Lines changed: 174 additions & 0 deletions
```
ffhq_style_gan/
ffhq_style_gan.zip

.vscode/
.pytest_cache/

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
.pybuilder/
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock

# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/#use-with-ide
.pdm.toml

# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# pytype static type analyzer
.pytype/

# Cython debug symbols
cython_debug/

# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/

classifier_weights.jpg
input_opt.jpg
mean_freq_difference.jpg
real_fake_mean-log_fft2.jpg
row_average_shifted_mean-log_fft2.jpg
integrated_gradients_*.jpg
out/
```

README.md

Lines changed: 75 additions & 0 deletions
# Exercise: Interpretable machine learning

# Task 1: Input optimization.

Open the `src/input_opt.py` file. The file `./data/weights.pth` contains network weights pre-trained on MNIST. Turn the network optimization problem around and find an input that makes a particular output neuron extremely happy. In other words, maximize

```math
\max_\mathbf{x} y_i = f(\mathbf{x}, \theta) .
```

Use `torch.func.grad` to compute the gradients with respect to the network input $\mathbf{x}$.
Start with a network input of shape `[1, 1, 28, 28]`. Compare a random initialization to starting from an array filled with ones, and iteratively optimize the input. Execute your script with `python src/input_opt.py`.
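The optimization loop could be sketched as follows. As a stand-in for the pre-trained weights in `./data/weights.pth`, this uses a small randomly initialized network; the network architecture and step size are assumptions, not the exercise's actual values:

```python
import torch
from torch.func import grad

# Stand-in network; in the exercise, load the pre-trained MNIST weights instead.
torch.manual_seed(0)
net = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(28 * 28, 10))

def neuron(x: torch.Tensor, index: int = 3) -> torch.Tensor:
    """Return the activation of a single output neuron as a scalar."""
    return net(x)[0, index]

x = torch.ones(1, 1, 28, 28)  # alternatively: torch.rand(1, 1, 28, 28)
step_size = 0.1
for _ in range(100):
    # Gradient ascent: move the input toward a higher activation.
    x = x + step_size * grad(neuron)(x)
```

Plotting `x` after the loop reveals the input pattern the chosen neuron responds to most strongly.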

# Task 2: Integrated Gradients (Optional)

Reuse your MNIST digit recognition code. Implement integrated gradients (IG) as discussed in the lecture. Recall the equation

```math
\text{IntegratedGrads}_i(x) = (x_i - x_i') \cdot \frac{1}{m} \sum_{k=1}^m \frac{\partial F (x' + \frac{k}{m} \cdot (x - x'))}{\partial x_i}.
```

$\frac{\partial F}{\partial x_i}$ denotes the gradient of $F$ with respect to input color channel $i$, $x'$ denotes a baseline black image, and $x$ symbolizes an input we are interested in.
Finally, $m$ denotes the number of summation steps from the black baseline image to the input of interest.

Follow the TODOs in `./src/mnist_integrated.py` and then run `scripts/integrated_gradients.slurm`.
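The sum above can be sketched as a straight Riemann-sum loop. `forward` is a hypothetical function mapping an input tensor to the scalar score $F$; the step count `m` matches the equation:

```python
import torch
from torch.func import grad

def integrated_gradients(forward, x, baseline, m: int = 50):
    """Approximate IG by averaging input gradients along the path from baseline to x."""
    total = torch.zeros_like(x)
    for k in range(1, m + 1):
        # Gradient of F at the k-th interpolation step between baseline and x.
        total = total + grad(forward)(baseline + (k / m) * (x - baseline))
    return (x - baseline) * total / m
```

As a sanity check, IG approximately satisfies completeness: the attributions sum to $F(x) - F(x')$.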

# Task 3: Deepfake detection (Optional)

In this exercise we will consider 128-by-128-pixel fake images from [StyleGAN](https://github.com/NVlabs/stylegan) and pictures of real people from the [Flickr-Faces-HQ](https://github.com/NVlabs/ffhq-dataset) dataset.

Flickr-Faces-HQ images depict real people, such as the person below:

![real person](./figures/real.png)

Generative adversarial networks allow the generation of fake images at scale. Does the picture below seem real?

![fake person](./figures/fake.png)

How can we identify the fake? Given that modern neural networks can generate hundreds of fake images per second, can we create a classifier to automate the process?

### 3.1 Getting started
1. Move to the `data` folder in your terminal. Download [ffhq_style_gan.zip](https://drive.google.com/uc?id=1MOHKuEVqURfCKAN9dwp1o2tuR19OTQCF&export=download) on bender using the command
```bash
gdown https://drive.google.com/uc?id=1MOHKuEVqURfCKAN9dwp1o2tuR19OTQCF
```
If `gdown` is not installed, type `pip install gdown` and then try again.
2. Type `export UNZIP_DISABLE_ZIPBOMB_DETECTION=TRUE` to make unzipping big archives possible.
3. Extract the image pairs here by executing `unzip ffhq_style_gan.zip` in the terminal.

The desired outcome is a folder called `ffhq_style_gan` in the project data folder.

### 3.2 Analyzing the data
The `load_folder` function from the `util` module loads both real and fake data.
Code to load the data is already present in the `deepfake_interpretation.py` file.

Compute log-scaled frequency-domain representations of samples from both sources via

```math
\mathbf{F}_I = \log_e (| \mathcal{F}_{2d}(\mathbf{I}) | + \epsilon ), \text{ with } \mathbf{I} \in \mathbb{R}^{h,w,c}, \epsilon \approx 0 .
```

Above, $h$, $w$ and $c$ denote image height, width and color channels. $\log_e$ denotes the natural logarithm, and the bars denote the absolute value. A small $\epsilon$ is added for numerical stability.

Use the NumPy functions `np.log`, `np.abs` and `np.fft.fft2`. By default, `fft2` transforms the last two axes, but here the last axis contains the color channels; we want to transform the rows and columns instead.
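One possible sketch, assuming the images arrive as a batch of shape `[n, h, w, c]` (the leading batch axis is an assumption about what `load_folder` returns):

```python
import numpy as np

def log_spectrum(images: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Log-scaled 2D Fourier magnitude over rows and columns of each image."""
    # axes=(1, 2) selects height and width; the channel axis is left untransformed.
    return np.log(np.abs(np.fft.fft2(images, axes=(1, 2))) + eps)
```

The mean spectrum of a source is then `log_spectrum(images).mean(axis=0)`.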

Plot the mean spectra of real and fake images, as well as their difference, over the entire validation or test set. To do so, complete the TODOs in `src/deepfake_interpretation.py` and run the script `scripts/train.slurm`.

### 3.3 Training and interpreting a linear classifier
Train a linear classifier consisting of a single `nn.Linear` layer on the log-scaled Fourier coefficients using Torch. Plot the result. What do you see?
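The training step might look roughly like this. Random tensors stand in for the log-scaled spectra and labels, and the `[4, 128, 128, 3]` batch shape is an assumption:

```python
import torch

# A single linear layer mapping flattened spectra to two classes (real vs. fake).
classifier = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(128 * 128 * 3, 2),
)
loss_fn = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(classifier.parameters(), lr=1e-3)

spectra = torch.randn(4, 128, 128, 3)  # stand-in for log-scaled Fourier coefficients
labels = torch.tensor([0, 1, 0, 1])    # 0 = real, 1 = fake
loss = loss_fn(classifier(spectra), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Reshaping the learned weight matrix back to `[128, 128, 3]` and plotting it shows which frequencies the classifier relies on.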

data/MNIST/t10k-images-idx3-ubyte

7.48 MB
Binary file not shown.
1.57 MB
Binary file not shown.

data/MNIST/t10k-labels-idx1-ubyte

9.77 KB
Binary file not shown.
4.44 KB
Binary file not shown.

data/MNIST/train-images-idx3-ubyte

44.9 MB
Binary file not shown.
9.45 MB
Binary file not shown.
