
adding resnet18 support #694 (Open)

wants to merge 1 commit into base: main

Conversation

@cl3m3nt commented Dec 10, 2020

Adding pre-trained Resnet18 model for transfer learning.
Model has been trained using CIFAR100 coarse (20 super classes).

Changes are the following:

  • adding keras_resnet18.py part (a full "manual" implementation of resnet18, as it's not available from keras applications)
  • updating keras.py part to load keras_resnet18 part & define Resnet18LinearKeras and Resnet18CategoricalKeras classes
  • updating tflite.py part to optionally take into account inference on TPU Edge
  • updating utils.py to take into account new model type (resnet18)
  • updating train.py to take into account new model type (resnet18)

@DocGarbanzo (Contributor) left a comment:

Hi @cl3m3nt,
Thanks for sending this over. It partially looks good, but some bits are missing:

  1. You don't seem to have the newest version of dev - can you please re-base your branch?
  2. Do you have an IDE which shows pep-8 compliance (like PyCharm etc.)? There are a lot of non-compliant formatting issues, which these IDEs will automatically fix for you.
  3. There is an import error in the CI; you should see it locally too. Which version of TF do you use locally?
  4. Can you please add a test to test_keras.py that shows the model is working?
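A test along the lines requested in point 4 might look like the sketch below. `Resnet18LinearKeras` and the `run()` contract are assumptions based on this PR; the model is replaced by a stub so only the shape of the test is shown, without loading TensorFlow:

```python
class StubResnet18Linear:
    """Stand-in for the proposed Resnet18LinearKeras part (hypothetical),
    so the test pattern can be shown without TensorFlow."""

    def run(self, img_arr):
        # A linear pilot returns steering and throttle independently.
        return 0.0, 0.5


def test_resnet18_linear_run_contract():
    model = StubResnet18Linear()
    img_arr = [[[0, 0, 0]] * 60] * 120  # dummy 120x60x3 image
    steering, throttle = model.run(img_arr)
    assert -1.0 <= steering <= 1.0
    assert -1.0 <= throttle <= 1.0


test_resnet18_linear_run_contract()
```

In the real test the stub would be replaced by the actual part, with a shape check on the model's output tensors.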


def resnet18_default_n_linear(num_outputs, input_shape=(120,60,3)):

# Instantiate a ResNet18 model
Contributor:
Instead of a comment, please rather provide a doctoring.

Author:
What is a doctoring :) ?

Contributor:
That's a typo (autocomplete :-) meant to be 'docstring'.
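For illustration, the comment at the top of `resnet18_default_n_linear` could become a docstring along these lines (a sketch only; the parameter descriptions are assumptions and the body is stubbed out here):

```python
def resnet18_default_n_linear(num_outputs, input_shape=(120, 60, 3)):
    """Build a linear Donkeycar model on top of a pre-trained ResNet18.

    :param num_outputs:  number of linear output nodes
    :param input_shape:  input image shape, default (120, 60, 3)
    :return:             the assembled model (stubbed in this sketch)
    """
    # The real body instantiates ResNet18 and adds the output head;
    # it is stubbed here so only the docstring style is shown.
    return None
```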

from tensorflow.keras.layers import Activation, Dropout, Flatten
from tensorflow.keras.layers import LSTM
from tensorflow.keras.layers import TimeDistributed as TD
from tensorflow.keras.layers import Conv3D, MaxPooling3D, Conv2DTranspose
from tensorflow.keras.backend import concatenate
from tensorflow.keras.models import Model, Sequential
from donkeycar.parts.keras_resnet18 import identity_block,conv_block,ResNet18
Contributor:
This line falls over in the CI; please also use pep-8 formatting.

Author:
Any guess why it falls over in CI?
It might be related to the keras_resnet18 import
'from keras_applications.imagenet_utils'?
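If the CI failure really does come from the `keras_applications` import (an assumption at this point), one common way to keep the module importable across TF versions is a guarded import, sketched here:

```python
# Guarded import: the standalone keras_applications package is not
# installed alongside every TF version, so fall back gracefully
# instead of letting the whole module fail to import.
try:
    from keras_applications import imagenet_utils  # older setups
    HAVE_KERAS_APPLICATIONS = True
except ImportError:
    imagenet_utils = None
    HAVE_KERAS_APPLICATIONS = False
```

Code that needs `imagenet_utils` can then check the flag and raise a clear error message rather than an opaque ImportError at load time.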

utils = tf.keras.utils


def resnet18_default_n_linear(num_outputs, input_shape=(120,60,3)):
Contributor:
As you already made a new module for the new model, please put those functions there.

Author:
What do you mean here? Do you recommend moving the resnet18 functions and classes to the keras_resnet18 module?

def resnet18_default_n_linear(num_outputs, input_shape=(120,60,3)):

# Instantiate a ResNet18 model
resnet18 = ResNet18(include_top=False,weights='cifar100_coarse',input_shape=input_shape,backend=backend,layers=layers,models=models,utils=utils)
Contributor:
Please use pep-8 line length and formatting.
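For reference, the long `ResNet18(...)` call can be wrapped to PEP 8 line length roughly like this; the builder and the module handles are stubbed here so only the call formatting is demonstrated:

```python
# Stub standing in for the real ResNet18 builder, so that only the
# PEP 8 call formatting is demonstrated.
def ResNet18(**kwargs):
    return kwargs


backend = layers = models = utils = None  # placeholders for keras modules

resnet18 = ResNet18(include_top=False,
                    weights='cifar100_coarse',
                    input_shape=(120, 60, 3),
                    backend=backend,
                    layers=layers,
                    models=models,
                    utils=utils)
```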

Author:
ok

img_arr = img_arr.reshape((1,) + img_arr.shape)
outputs = self.model.predict(img_arr)
steering = outputs[0]
return steering[0] , dk.utils.throttle(steering[0])
Contributor:
This is the behaviour of KerasInferred, KerasLinear returns steering and throttle independently.

Author:
Yes you are right, I'll fix this following KerasLinear.
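For comparison, a KerasLinear-style `run` returns steering and throttle as two independent model outputs, roughly along these lines (a sketch with a stubbed `predict`, not the actual Donkeycar code):

```python
class LinearPilotSketch:
    """Minimal sketch of the KerasLinear-style run() contract."""

    def __init__(self, model):
        self.model = model

    def run(self, img_arr):
        # predict() on a linear model yields two output tensors:
        # [[steering]] and [[throttle]], returned independently.
        outputs = self.model.predict([img_arr])
        steering = outputs[0]
        throttle = outputs[1]
        return steering[0][0], throttle[0][0]


class FakeModel:
    """Stand-in model with two outputs, for illustration only."""
    def predict(self, batch):
        return [[0.1]], [[0.7]]


steering, throttle = LinearPilotSketch(FakeModel()).run(None)
```

The key difference from the KerasInferred behaviour quoted above is that throttle comes from the model's second output head, not from `dk.utils.throttle(steering)`.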

name = conv_name_base + '2b')(x)
x = BatchNormalization(axis=bn_axis,name=bn_name_base+'2b')(x)
x = Activation('relu')(x)

Contributor:
here too.

Author:
Ok



x = Convolution2D(filters2, kernel_size,
padding='same',
Contributor:
Indention: pep-8.

Author:
Ok

pooling=None,
classes=20,
**kwargs):
global backend, layers, models, keras_utils
Contributor:
Please try to avoid global variables. If you need to share them between functions, it's fine to put them into a class.
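One way to follow this suggestion is to carry the shared keras handles in a small container class instead of module globals, sketched here (the class name and fields are illustrative):

```python
class KerasModules:
    """Bundles the backend/layers/models/utils handles that the
    ResNet18 builder functions need, instead of module-level globals."""

    def __init__(self, backend, layers, models, utils):
        self.backend = backend
        self.layers = layers
        self.models = models
        self.utils = utils


# Usage sketch: build one bundle and pass it to the builder functions,
# rather than mutating globals inside ResNet18(...).
modules = KerasModules(backend=None, layers=None, models=None, utils=None)
```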


def identity_block(input_tensor, kernel_size, filters, stage, block):
filters1, filters2 = filters
if backend.image_data_format() == 'channels_last':
Contributor:
This following code seems to re-appear a couple of times, maybe factor this into a small helper function.

Author:
Yes, same here: I followed what keras-team did for resnet50.
There are 2 occurrences of the same code, which could be acceptable, don't you think?
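If it were factored out, the duplicated channel-axis check could become a one-line helper along these lines (a sketch; a pure function, so no keras import is needed):

```python
def batchnorm_axis(image_data_format):
    """Return the BatchNormalization axis for the given data format:
    the channel axis is last (3) for 'channels_last', first (1) otherwise."""
    return 3 if image_data_format == 'channels_last' else 1
```

Both `identity_block` and `conv_block` could then call `batchnorm_axis(backend.image_data_format())` instead of repeating the if/else.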

@@ -4,136 +4,117 @@
Basic usage should feel familiar: python train_v2.py --model models/mypilot

Usage:
train.py [--tubs=tubs] (--model=<model>) [--type=(linear|inferred|tensorrt_linear|tflite_linear)]
train.py [--tubs=tubs] (--model=<model>) [--type=(linear|inferred|tensorrt_linear|tflite_linear|tflite_r18_lin|tflite_r18_cat)]

Options:
-h --help Show this screen.
"""

Contributor:
You don't have the HEAD revision of the dev branch, this file is now only a small script supported for backward compatibility.

Author:
OK, I'll double-check how you train within the HEAD revision then.

@DocGarbanzo (Contributor):
Hi @cl3m3nt - do you still want to pursue this? We have a resnet18 in pytorch now, but there is no harm in having this in tf, too.

@sctse999 (Contributor):
@cl3m3nt this looks great and I believe a lot of people would like to see this merged. Do you have time to work on this?

@tikurahul (Collaborator):
We already have Resnet18 with PyTorch now. FYI

@cl3m3nt (Author) commented Jan 27, 2021

Thanks @sctse999.
Yes, I could find some extra time to update it so that it meets @DocGarbanzo's guidance.
But I would also need to adapt it to the latest significant dev updates, so I need to evaluate the effort first. I was also considering making a PR to the keras team so that Resnet18 would be available natively from tf.keras.applications. Let me take some time to think that over.

@cl3m3nt (Author) commented Jan 27, 2021

We already have Resnet18 with PyTorch now. FYI

Thanks @tikurahul for the heads-up.
I'll figure out whether I can still update this PR so that Resnet18 is available from keras as well.
Meanwhile, do you think mobilenetv2 would also make sense? Resnet18 has 7 million parameters vs MobileNet's 2.2 million, so it might be interesting to have it as well to decrease latency when using transfer learning. What do you think?

@sctse999 (Contributor):
Great, I would love to see resnet18 or mobilenetv2 on keras as well.
