How did you decide the number of hidden units to be 32? Can you please explain this? #35
Comments
Having tried ANNs and RNNs on other datasets before, I knew that with this quantity of data, this number of hidden units would work well. In practice, I tried a few values before settling on one. You could use hyperparameter tuning to speed up the process of finding the best parameters: https://www.neuraxle.org/stable/hyperparameter_tuning.html
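To make the "try a few values before settling on one" approach concrete, here is a minimal random-search sketch. The `train_and_score` callback and the candidate sizes are assumptions for illustration (not the repo's actual tuning code); in practice you would plug in your real training loop, or use a library like the one linked above.

```python
import random

def random_search(train_and_score, candidates, n_trials=5, seed=0):
    """Try a few hidden-unit counts at random and keep the best.

    `train_and_score` is a placeholder for your own training routine:
    given a hidden size, it should return a validation score
    (higher is better).
    """
    rng = random.Random(seed)
    best_size, best_score = None, float("-inf")
    for _ in range(n_trials):
        size = rng.choice(candidates)
        score = train_and_score(size)
        if score > best_score:
            best_size, best_score = size, score
    return best_size, best_score

# Toy stand-in for a real training run: pretends 32 units is optimal,
# so the search converges toward it. Replace with actual model training.
def fake_score(n_hidden):
    return -abs(n_hidden - 32)

best, score = random_search(fake_score, [8, 16, 32, 64, 128], n_trials=20)
```

Each trial here is one full train-and-validate cycle, so keep `n_trials` small when training is expensive.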
The LSTM works on a 3D cube of data. Values are normalized and fed into the perceptrons of the LSTM cells. The time axis is kept separate as its own dimension of the 3D cube.
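As a rough sketch of that 3D cube: assuming 128 time steps per window and 9 sensor channels per reading (the figures used in the UCI HAR dataset; your counts may differ), the input tensor an LSTM consumes has shape `(n_windows, n_timesteps, n_features)`, with time as its own axis. The array values below are synthetic, just to show the reshape and normalization.

```python
import numpy as np

# Hypothetical dimensions for illustration: 4 windows of 128 time steps,
# each reading having 9 sensor channels.
n_windows, n_timesteps, n_features = 4, 128, 9

# Synthetic flat sensor stream, reshaped into the 3D cube the LSTM expects.
flat = np.arange(n_windows * n_timesteps * n_features, dtype=np.float32)
X = flat.reshape(n_windows, n_timesteps, n_features)

# Normalize per feature channel so values feed cleanly into the cells.
X = (X - X.mean(axis=(0, 1))) / (X.std(axis=(0, 1)) + 1e-8)

print(X.shape)  # (4, 128, 9): windows x time x features
```

At each of the 128 time steps, the LSTM receives one 9-valued feature vector from a window and updates its hidden state, so the whole window is consumed step by step rather than all at once.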
You have 128 readings in every window. How do these input values go into the LSTM network? I am trying to follow similar logic for my own generated dataset, so I need to understand the logic in order to implement it for my data.