This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

should I transfer the label into one-hot? #9

Open
sxzy opened this issue Oct 25, 2018 · 2 comments
Comments


sxzy commented Oct 25, 2018

Hey,
I am trying to use your code to train my model, and I noticed that y_a and y_b should be one-hot. When I apply your code in my experiment, should I encode my integer labels as one-hot?
I converted them to one-hot and started training, but the accuracy is always zero.

@hongyi-zhang

Yes, y_a and y_b are assumed to be one-hot encodings of the labels. The CIFAR-10 code should serve as a working example -- you can print its variable type / size / etc. and make a side-by-side comparison with your implementation, which hopefully will give you enough information to debug your code.

Best,
Hongyi
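For reference, converting integer class indices into one-hot vectors is a one-liner; here is a minimal plain-Python sketch (the helper name `to_one_hot` is just illustrative, not part of the repo):

```python
def to_one_hot(labels, num_classes):
    # map each integer class index to a one-hot row vector
    return [[1.0 if j == y else 0.0 for j in range(num_classes)] for y in labels]

print(to_one_hot([2, 0, 1], 3))
# → [[0.0, 0.0, 1.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
```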

@guanxiongsun

> Yes, y_a and y_b are assumed to be one-hot encoding of labels. The CIFAR-10 code should serve as a working example -- you can print its variable type / size / etc. and make a side-by-side comparison with your implementation, which hopefully will give you enough information to debug your code.
>
> Best,
> Hongyi

Actually, it's not one-hot encoding. In this repo, the targets are used directly as the integer indices of the one-hot encoding. The original targets are kept as-is instead of mixing the labels as y = λ*y1 + (1-λ)*y2, which differs from what is described in the paper.
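The two formulations coincide for cross-entropy, which may explain the discrepancy: since cross-entropy is linear in the label vector, taking the loss against the mixed one-hot label y = λ*y1 + (1-λ)*y2 (as written in the paper) equals mixing the two losses computed on the integer targets (as the repo's CIFAR-10 code does). A minimal sketch of this equivalence, with made-up probabilities and λ:

```python
import math

def cross_entropy(probs, label_vec):
    # cross-entropy between predicted probabilities and a (possibly soft) label vector
    return -sum(t * math.log(p) for p, t in zip(probs, label_vec))

def one_hot(index, num_classes):
    return [1.0 if j == index else 0.0 for j in range(num_classes)]

lam = 0.7
probs = [0.6, 0.3, 0.1]   # hypothetical softmax output for one sample
y1, y2 = 0, 2             # integer class indices of the two mixed samples

# (a) mix the one-hot labels, then take cross-entropy (paper's formulation)
mixed_label = [lam * a + (1 - lam) * b
               for a, b in zip(one_hot(y1, 3), one_hot(y2, 3))]
loss_a = cross_entropy(probs, mixed_label)

# (b) mix the two losses computed against the unmixed targets (repo's formulation)
loss_b = (lam * cross_entropy(probs, one_hot(y1, 3))
          + (1 - lam) * cross_entropy(probs, one_hot(y2, 3)))

print(abs(loss_a - loss_b) < 1e-9)  # → True: the two losses agree
```

So using integer targets and mixing the losses avoids materializing one-hot labels while giving the same gradient as the paper's label-mixing formulation (for cross-entropy specifically).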
