Update self.log #132
Labels: bug / fix, help wanted, won't fix, 🐛 Bug
Some tutorials still return non-detached tensors in `training_step` - this is deprecated in 1.6 and may cause memory leaks if people follow those patterns, e.g.

`output = OrderedDict({"loss": g_loss, "progress_bar": tqdm_dict, "log": tqdm_dict})`
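For context, here is a minimal sketch of the non-deprecated pattern (this is not taken from the tutorial; `LitExample`, the linear layer, and the MSE loss are placeholders I made up): call `self.log` for logging/progress-bar values and return the loss tensor directly.

```python
import torch
import pytorch_lightning as pl


class LitExample(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = torch.nn.functional.mse_loss(self.layer(x), y)
        # Replaces the deprecated "log"/"progress_bar" dict keys:
        # self.log sends the scalar to the logger, prog_bar=True shows it in the progress bar.
        self.log("train_loss", loss, prog_bar=True)
        # Returning the loss tensor is enough; no OrderedDict of extra tensors needed.
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```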
The deprecated pattern shows up in the autogenerated docs here: https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/basic-gan.html
I raised this in Lightning-Universe/lightning-bolts#793 too.
I see the examples in https://github.com/PyTorchLightning/pytorch-lightning/tree/master/pl_examples already defer to Lightning Bolts for more robust examples - wherever the canonical source of docs/best practices ends up living, I think this specific error should be fixed there (these examples are also discussed in #71).
Thanks, loving the library btw! :)