
False Positives #140

Open
Pdzly opened this issue Sep 1, 2023 · 1 comment
Pdzly commented Sep 1, 2023

I used the newest Inception v3 model.

Here is a collection of all the false positives I have gotten so far. I will report any further false positives as they come up.

[Four attached screenshots of the misclassified images]

Still, nice job! The model has predicted around 500 images correctly so far (and counting)!


edwios commented Sep 5, 2023

I tried the images above with the nsfw.299x299.h5 model. Except for the first one, which it correctly identified as a drawing, the rest were all mistakenly identified as porn with very high probability. In my test batch, a few pictures of fully clothed people, with nothing sexy about them, were also classified as porn with a probability of 1.0. 🤷‍♂️
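For reference, this is roughly how I ran the check: a minimal sketch that loads the .h5 file with Keras and prints per-class probabilities. It assumes a standard Keras checkpoint with 299x299 RGB input scaled to [0, 1] and the usual five-class output; the label order and the file/image names here are my assumptions, not something taken from the repo.

```python
import numpy as np
from tensorflow import keras

# Load the saved Keras checkpoint (compile=False since we only need inference).
model = keras.models.load_model("nsfw.299x299.h5", compile=False)

# Preprocess one image to the model's 299x299 input, scaled to [0, 1].
img = keras.preprocessing.image.load_img("test.jpg", target_size=(299, 299))
x = keras.preprocessing.image.img_to_array(img) / 255.0
x = np.expand_dims(x, axis=0)

# Assumed class order for this model family (alphabetical).
labels = ["drawings", "hentai", "neutral", "porn", "sexy"]
probs = model.predict(x)[0]
for label, p in sorted(zip(labels, probs), key=lambda t: -t[1]):
    print(f"{label}: {p:.3f}")
```

With the fully clothed pictures mentioned above, this prints something close to 1.0 for the "porn" class, which is the behavior I'm reporting.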
