ZeroDivisionError #16
Comments
I haven't touched TensorFlow in a while, but wouldn't this be for loading a checkpoint?
It doesn't look like a normal TensorFlow checkpoint.
This is the first mention of val_batches_text that I found in the code. Where are you loading the datasets, by the way?
I loaded a dataset that I cleaned on my own PC. I first ran my dataset through the preprocessing program you posted as well. For my project, I was trying to adapt your implementation so that I can generate a title based on the description of a given problem; I assume this would work for that application. The dataset I used was originally downloaded from the StackOverflow dataset found on Google Cloud API.
It should work, but the main error seems to be that len(val_batches_text) is 0. That means the source of the bug is in the code snippet where val_batches_text is created: for some reason no data is being loaded.
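One way to confirm this is to fail fast when a split comes back empty, instead of hitting the ZeroDivisionError later in the training loop. A minimal sketch (the name val_batches_text is from the thread; the helper function itself is an assumption, not code from this repo):

```python
def require_nonempty(batches, name):
    """Raise a clear error if a data split came back empty.

    Surfacing the empty split here makes the real bug (nothing was loaded)
    obvious, instead of letting a later division by len(batches) fail.
    """
    if len(batches) == 0:
        raise ValueError(
            f"{name} is empty: no data was loaded; "
            "check the dataset path and the preprocessing filters"
        )
    return len(batches)
```

Calling `require_nonempty(val_batches_text, "val_batches_text")` right after the batching step would point directly at the loading problem.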
I've tried what you said in the last comment, but the dataset is still not loading and it returns the same ZeroDivisionError.
Probably you have to figure out what max length/min length values are suitable for your dataset:
Trying your method on a different dataset, I am getting a ZeroDivisionError in the training and validation section. I assume something is not loading properly, because there should be no zero values.
Here is the code:
```python
import pickle
import random

import tensorflow as tf

with tf.Session() as sess:  # start the TensorFlow session
    display_step = 100
    patience = 5
```
I can get rid of the error with exception handling, but I was wondering if you had an idea of why it's not working in the first place.
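For what it's worth, catching the exception only hides the symptom. A hedged alternative (safe_average is an illustrative helper, not part of the repo) is to check the batch count before dividing:

```python
def safe_average(total, count):
    """Average a metric over batches, reporting an empty split explicitly
    rather than masking it with a try/except around the division."""
    if count == 0:
        raise RuntimeError(
            "no batches to average over; the data pipeline produced an "
            "empty split, which is the real bug behind the ZeroDivisionError"
        )
    return total / count
```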