Exercise 5&6 00_fundamentals - random seeds (solution spoiler) #943
Replies: 1 comment
-
Nevermind... I got it. The seed is set once, and all subsequent data is generated from that generator state. In the first example the seed is effectively reset before each call, so the same tensor comes out every time. In the second example no new seed is set between calls, so generation just continues from the state left by the initial seed. In the last example, no seed is set at all.
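The behavior described above can be sketched in a few lines (the tensor shapes and seed value here are illustrative, not taken from the original snippets):

```python
import torch

# Case 1: re-seed before each call -> identical tensors,
# because the generator state is reset each time.
torch.manual_seed(1234)
t1 = torch.rand(2, 2)
torch.manual_seed(1234)
t2 = torch.rand(2, 2)  # equal to t1

# Case 2: seed once -> the first tensor matches t1, but the second
# differs, because the generator state advanced after producing t3.
torch.manual_seed(1234)
t3 = torch.rand(2, 2)  # equal to t1
t4 = torch.rand(2, 2)  # different from t3
```

In other words, `torch.manual_seed` sets the generator's *starting* state; each call to `torch.rand` then advances that state, so only the first draw after seeding is pinned to the seed.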
-
I have a question regarding the seeds in torch. I have attached three different versions of the code, together with their output. As I understand it, when I call torch.manual_seed(1234) before every line, the same tensor is generated each time, as expected. However, if I only call it at the beginning, I would expect t1 to be created with seed 1234 and t2 to be random. This does not seem to be the case, apparently due to torch.cuda.manual_seed(1234), which seems to determine t2. If I instead remove torch.manual_seed(1234), I would expect the first tensor to have the content of t2; however, it then seems to be completely random.
I couldn't work out the correlation from the docs (I looked at both the manual seed docs and the cuda docs).
I attached the 3 code snippets below and marked the tensors I would expect to be the same in bold.
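One point that may clear up the confusion between the two seeding functions: per the PyTorch docs, `torch.manual_seed` seeds the RNG for *all* devices (CPU and CUDA), while `torch.cuda.manual_seed` seeds only the current GPU and leaves the CPU generator untouched. A minimal sketch of that independence (seed values and shapes here are illustrative; `torch.cuda.manual_seed` is documented as a silent no-op when CUDA is unavailable):

```python
import torch

# Seed the CPU generator and draw a CPU tensor.
torch.manual_seed(1234)
a = torch.rand(3)

# Re-seed the CPU generator identically, then seed only the CUDA
# generator with a *different* value. The CPU stream is unaffected,
# so the next CPU draw reproduces `a` exactly.
torch.manual_seed(1234)
torch.cuda.manual_seed(999)  # does not touch the CPU generator
b = torch.rand(3)            # same values as `a`
```

So if `torch.manual_seed` is removed, CPU tensors are not seeded at all, which would explain why they come out completely random even with `torch.cuda.manual_seed` in place.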