False Positives #140
Tried the above with the nsfw.299x299.h5 model. Except for the first one, which it successfully identified as a drawing, the rest were all mistakenly identified as porn with a very high probability. In my test batch, I have a few fully clothed, not-at-all-sexy pictures of a person that were also classified as porn with a 1.0 probability. 🤷‍♂️
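For reference, here is a minimal sketch of the kind of prediction loop I'm running against the .h5 checkpoint, assuming the standard Keras workflow. The class labels and their ordering are assumptions (alphabetical, as commonly documented for this model), and the file paths are placeholders:

```python
import numpy as np
from tensorflow.keras.models import load_model
from tensorflow.keras.preprocessing import image

# Assumed class ordering; verify against the model's documentation.
CLASSES = ["drawings", "hentai", "neutral", "porn", "sexy"]

model = load_model("nsfw.299x299.h5")

def classify(path):
    # Inception v3 expects 299x299 inputs scaled to [0, 1].
    img = image.load_img(path, target_size=(299, 299))
    x = image.img_to_array(img) / 255.0
    probs = model.predict(np.expand_dims(x, axis=0))[0]
    return dict(zip(CLASSES, probs.round(4)))

print(classify("test.jpg"))  # placeholder image path
```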
I used the newest Inception v3 model.
Here is a collection of all the false positives I have gotten. I will let you know about further false positives.
Still, nice job! The model has predicted around 500 images (and rising) correctly!