- PrepLayer: Conv 3x3 (s1, p1) >> BN >> ReLU [64k]
- Layer 1:
  - X = Conv 3x3 (s1, p1) >> MaxPool2D >> BN >> ReLU [128k]
  - R1 = ResBlock(Conv-BN-ReLU-Conv-BN-ReLU)(X) [128k]
  - Add(X, R1)
- Layer 2:
  - Conv 3x3 [256k] >> MaxPool2D >> BN >> ReLU
- Layer 3:
  - X = Conv 3x3 (s1, p1) >> MaxPool2D >> BN >> ReLU [512k]
  - R2 = ResBlock(Conv-BN-ReLU-Conv-BN-ReLU)(X) [512k]
  - Add(X, R2)
- MaxPooling with kernel size 4
- FC layer
- SoftMax
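A minimal PyTorch sketch of this architecture (the repo's actual implementation is the Net class in models/model.py; the class and helper names below are illustrative):

```python
import torch.nn as nn


def conv_block(in_ch, out_ch, pool=False):
    # Conv 3x3 (stride 1, padding 1) >> optional MaxPool2D >> BN >> ReLU
    layers = [nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=1, padding=1, bias=False)]
    if pool:
        layers.append(nn.MaxPool2d(2))
    layers += [nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True)]
    return nn.Sequential(*layers)


class CustomResNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.prep = conv_block(3, 64)                     # PrepLayer [64k]
        self.layer1_x = conv_block(64, 128, pool=True)    # Layer 1 X [128k]
        self.layer1_res = nn.Sequential(conv_block(128, 128), conv_block(128, 128))
        self.layer2 = conv_block(128, 256, pool=True)     # Layer 2 [256k]
        self.layer3_x = conv_block(256, 512, pool=True)   # Layer 3 X [512k]
        self.layer3_res = nn.Sequential(conv_block(512, 512), conv_block(512, 512))
        self.pool = nn.MaxPool2d(4)                       # MaxPooling with kernel size 4
        self.fc = nn.Linear(512, num_classes)             # FC layer

    def forward(self, x):
        x = self.prep(x)
        x = self.layer1_x(x)
        x = x + self.layer1_res(x)    # Add(X, R1)
        x = self.layer2(x)
        x = self.layer3_x(x)
        x = x + self.layer3_res(x)    # Add(X, R2)
        x = self.pool(x)
        x = x.view(x.size(0), -1)
        return self.fc(x)             # raw logits; SoftMax is applied inside CrossEntropyLoss
```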
Uses One Cycle Policy such that:
- Total Epochs = 24
- Max at Epoch = 5
- LRMIN = FIND
- LRMAX = FIND
- No Annihilation
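A minimal sketch of how this schedule could be wired up with PyTorch's OneCycleLR (max_lr, div_factor, and the steps-per-epoch value below are placeholders/assumptions; LRMAX and LRMIN are meant to be found with an LR range test, and the actual wiring lives in the repo's notebook and utils code):

```python
import torch
from torch.optim.lr_scheduler import OneCycleLR

EPOCHS = 24
PEAK_EPOCH = 5
BATCH_SIZE = 512
STEPS_PER_EPOCH = (50000 + BATCH_SIZE - 1) // BATCH_SIZE  # ~98 batches over the CIFAR-10 train set

model = CustomResNet()                        # sketch class from the architecture section above
max_lr = 1e-2                                 # placeholder: use the LRMAX found by the LR range test
optimizer = torch.optim.Adam(model.parameters(), lr=max_lr / 10)

scheduler = OneCycleLR(
    optimizer,
    max_lr=max_lr,
    epochs=EPOCHS,
    steps_per_epoch=STEPS_PER_EPOCH,
    pct_start=PEAK_EPOCH / EPOCHS,            # LR peaks at epoch 5 of 24
    div_factor=10,                            # assumption: LRMIN = LRMAX / 10
    final_div_factor=1,                       # "No Annihilation": anneal back to LRMIN, not to ~0
    three_phase=False,
)
# scheduler.step() is called once per batch (see the training sketch further below).
```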
Uses these transforms: RandomCrop 32x32 (after padding of 4) >> FlipLR >> CutOut(8, 8)
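A sketch of that augmentation pipeline with albumentations (argument names follow albumentations 1.x; the normalization constants are the commonly used CIFAR-10 values and the cut-out fill value is an assumption; the repo's data_loader/albumentation.py is the authoritative version):

```python
import albumentations as A
from albumentations.pytorch import ToTensorV2

# Commonly used CIFAR-10 channel means/stds
MEAN = (0.4914, 0.4822, 0.4465)
STD = (0.2470, 0.2435, 0.2616)

train_transforms = A.Compose([
    A.PadIfNeeded(min_height=40, min_width=40, p=1.0),    # pad the 32x32 image by 4 on each side
    A.RandomCrop(height=32, width=32),                    # RandomCrop 32x32
    A.HorizontalFlip(p=0.5),                              # FlipLR
    A.CoarseDropout(max_holes=1, max_height=8, max_width=8,
                    min_holes=1, min_height=8, min_width=8,
                    fill_value=tuple(int(255 * m) for m in MEAN),  # grey-ish fill (assumption)
                    p=0.5),                               # CutOut(8, 8)
    A.Normalize(mean=MEAN, std=STD),
    ToTensorV2(),
])
```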
Batch size = 512
Uses the Adam optimizer and CrossEntropyLoss
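Putting the pieces together, a hedged sketch of the data loader, loss, and per-batch training step (`train_dataset` is assumed to be CIFAR-10 wrapped with the albumentations pipeline above, presumably handled by data_loader/load_data.py and utils/dataset.py in the repo; `model`, `optimizer`, and `scheduler` come from the earlier sketches):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader

device = "cuda" if torch.cuda.is_available() else "cpu"

# Assumption: `train_dataset` applies the albumentations train_transforms to CIFAR-10.
train_loader = DataLoader(train_dataset, batch_size=512, shuffle=True,
                          num_workers=2, pin_memory=True)

criterion = nn.CrossEntropyLoss()

def train_one_epoch(model, loader, optimizer, scheduler):
    model.train()
    for images, targets in loader:
        images, targets = images.to(device), targets.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), targets)
        loss.backward()
        optimizer.step()
        scheduler.step()        # OneCycleLR advances once per batch, not once per epoch
```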
Clone the project as shown below:
$ git clone [email protected]:pankaja0285/era_v1_session10_pankaja.git
$ cd era_v1_session10_pankaja
About the file structure:
|__ config
    |__ config.yaml
|__ data
|__ data_analysis
|__ data_loader
    |__ load_data.py
    |__ albumentation.py
|__ models
    |__ model.py
|__ utils
    |__ dataset.py
    |__ engine.py
    |__ helper.py
    |__ plot_metrics.py
    |__ test.py
    |__ train.py
|__ CiFAR_S10.ipynb
|__ README.md
NOTE: Required libraries include torch, torchsummary, and tqdm (for the progress bar); they are installed via requirements.txt as shown below.
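From the repository root, with your Python environment active:

$ pip install -r requirements.txt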
There are two ways to run any of the notebooks, for instance the CiFAR_S10.ipynb notebook:
- Using the Anaconda prompt: run it as an administrator, start Jupyter Notebook from the era_v1_session10_pankaja folder, and run the notebook off your localhost.
NOTE: Without admin privileges, the installs will not be correct and subsequent library imports will fail.
$ jupyter notebook
- Upload the notebook folder era_v1_session10_pankaja to Google Colab at colab.google.com and run it on Colab.
Files used: models/model.py (model with the Net class) and CiFAR_S10.ipynb
Target:
- Create a model with residual blocks
Results:
- Total parameters: 6,573,120
- Train accuracy of 88.14% and test accuracy of 89.38%
Analysis:
- Observe how the accuracy behaves when residual blocks are used.
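The parameter count and layer shapes can be inspected with torchsummary (listed in the requirements above); `model` and `device` here refer to the instantiated network and device string from the earlier sketches:

```python
from torchsummary import summary

# Prints the per-layer output shapes and the total parameter count
# for a 3x32x32 CIFAR-10 input.
summary(model.to(device), input_size=(3, 32, 32), device=device)
```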
For any questions, bugs (even typos), and/or feature requests, please do not hesitate to contact me or open an issue!