Modifying federated learning #1900
Replies: 2 comments
-
This is a general research question. @holgerroth @ZiyueXu77, could you share your thoughts on this discussion? Thanks!
-
Hi @Nicholas-B1, you can reduce the number of layers or use the SimpleCNN in the network definitions. Also, reducing the number of training samples loaded by the data loader should speed things up. The easiest option would be to break the loop after a certain number of iterations in training and validation.
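On the network side, here is a minimal sketch of what a cut-down CIFAR-10 model could look like. It only illustrates the idea of dropping layers; it is not the exact SimpleCNN class from the repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SmallCIFARNet(nn.Module):
    """A deliberately small CNN for 3x32x32 CIFAR-10 images.

    One conv block plus one linear layer keeps each local training
    round cheap, at the price of some accuracy.
    """

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.pool = nn.MaxPool2d(2, 2)  # 32x32 -> 16x16
        self.fc = nn.Linear(16 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.pool(F.relu(self.conv(x)))
        return self.fc(torch.flatten(x, 1))
```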
-
Hello, I would like to know how to go about cutting down the dataset, the features, or the CNN layers to make federated learning run faster. I want to test the speed of examples like the cifar10 example, but have them run faster even if it means sacrificing some accuracy.
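As a concrete illustration of the other two suggestions in the reply above, here is a minimal plain-PyTorch sketch that loads only a subset of CIFAR-10 and breaks the training and validation loops after a fixed number of iterations. The constants and the tiny stand-in model are arbitrary placeholders, not the actual settings, classes, or hook points used in the NVFlare cifar10 example's trainer.

```python
import torch
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, transforms

MAX_TRAIN_ITERS = 50      # stop the training loop after this many batches
MAX_VALID_ITERS = 20      # stop the validation loop after this many batches
NUM_TRAIN_SAMPLES = 2000  # load only a fraction of the 50k training images

transform = transforms.ToTensor()
train_set = datasets.CIFAR10("./data", train=True, download=True, transform=transform)
valid_set = datasets.CIFAR10("./data", train=False, download=True, transform=transform)

# Fewer samples in the data loader means shorter epochs.
train_loader = DataLoader(Subset(train_set, range(NUM_TRAIN_SAMPLES)),
                          batch_size=64, shuffle=True)
valid_loader = DataLoader(valid_set, batch_size=64)

# A tiny stand-in model; swap in whatever network the job actually uses.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.MaxPool2d(2),
    torch.nn.Flatten(),
    torch.nn.Linear(16 * 16 * 16, 10),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
loss_fn = torch.nn.CrossEntropyLoss()

# Training: break out of the loop after MAX_TRAIN_ITERS batches.
model.train()
for i, (images, labels) in enumerate(train_loader):
    if i >= MAX_TRAIN_ITERS:
        break
    optimizer.zero_grad()
    loss_fn(model(images), labels).backward()
    optimizer.step()

# Validation: break out of the loop after MAX_VALID_ITERS batches.
model.eval()
correct = total = 0
with torch.no_grad():
    for i, (images, labels) in enumerate(valid_loader):
        if i >= MAX_VALID_ITERS:
            break
        preds = model(images).argmax(dim=1)
        correct += (preds == labels).sum().item()
        total += labels.size(0)
print(f"accuracy on {total} validation samples: {correct / total:.3f}")
```

In a federated run, the iteration caps and the sample count bound the cost of each local round, so the whole job finishes faster at the expense of accuracy, which matches the trade-off described above.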