- Train a CNN model with an upper limit of 10M parameters.
- Download the CUB dataset from here: https://data.caltech.edu/records/65de6-vp158/files/CUB_200_2011.tgz?download=1
- Use the default train-test split for this task.
- Submissions will be evaluated on parameter efficiency, training-time efficiency (number of iterations), and accuracy.
- Submissions should include a report, code, and the final model checkpoint. A Drive link is fine for the checkpoint.
- The report must include architecture and training details. Training loss and accuracy curves should also be included along with the final results.
- External models: only ImageNet-pretrained models are allowed.
- Please use the Moodle forum or mail gnr638@googlegroups.com to post queries.
- Deadline: March 5, 2024, 11:59 PM
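Since the 10M-parameter cap is a hard constraint, it may help to verify a candidate model before training. A minimal sketch of such a check (the small CNN below is only a placeholder, not the assignment architecture):

```python
import torch.nn as nn

def count_parameters(model: nn.Module, trainable_only: bool = False) -> int:
    """Count model parameters; optionally only those requiring gradients."""
    return sum(p.numel() for p in model.parameters()
               if not trainable_only or p.requires_grad)

# Placeholder CNN just to demonstrate the check (not the submission model).
model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 200),  # CUB-200-2011 has 200 classes
)

assert count_parameters(model) <= 10_000_000, "over the 10M parameter limit"
```

Running the same check on the submitted model before training avoids discovering a limit violation at evaluation time.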
Reports and saved checkpoints for all commits are uploaded here
- Download the dataset from here
- Clone/Download the repo
- Extract the `.tgz` file in the repo folder
- `cnn_classification.ipynb` has all the util functions and hyperparams
- By default we use the default train/test split
- For data augmentation, call the `data_augmentation()` function before calling the `train_model` function
- To call the train function, use:

```python
train_model(model,
            criterion=nn.CrossEntropyLoss(),
            learning_rate=learning_rate,
            optimizer=None,
            schedular=None,
            num_epoch=num_epoch,
            save_checkpoint=False,
            time_start_from=0)
```

- Keep the above points in mind
- Open `efficientnet_no_data_augmentation.ipynb`
- Run all cells of the `.ipynb` file
The exact architecture can be found in `efficientnet_no_data_augmentation.ipynb`
| Weight | Acc@1 | Acc@5 | Params | GFLOPS |
|---|---|---|---|---|
| EfficientNet_B1_Weights.IMAGENET1K_V2 | 79.838 | 94.934 | 7.8M | 0.69 |
Changes in the classifier layer of EfficientNet:

```
Sequential(
  (0): Dropout(p=0.4, inplace=True)
  (1): Linear(in_features=1280, out_features=200, bias=True)
)
```

Number of training parameters (fine-tuning, classifier only) = 256,200

Number of training parameters (transfer learning, full model) = 6,769,384
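The 256,200 figure matches the parameter count of the replacement head itself (1280 × 200 weights + 200 biases). A small sketch that builds the same head and verifies the count:

```python
import torch.nn as nn

# Replacement classifier head for CUB-200, as shown above.
classifier = nn.Sequential(
    nn.Dropout(p=0.4, inplace=True),
    nn.Linear(in_features=1280, out_features=200, bias=True),
)

# 1280 * 200 weights + 200 biases = 256,200 trainable parameters,
# matching the fine-tuning count reported in the results.
n_params = sum(p.numel() for p in classifier.parameters() if p.requires_grad)
print(n_params)  # 256200
```

With the backbone frozen (`requires_grad = False` on its parameters), only these 256,200 parameters are updated during fine-tuning; unfreezing the whole model gives the 6,769,384 transfer-learning count.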

- Top 1 Accuracy: 71.48%
- Time required for training: 26 min 54 sec
- Total Number of Parameters: 6,769,384
Fine-Tuning:

- Total Number of Parameters: 256,200

Transfer Learning:

- Total Number of Parameters: 6,769,384
