how to print loss every n steps without progress bar #6452
Unanswered
sun-peach asked this question in Lightning Trainer API: Trainer, LightningModule, LightningDataModule
Replies: 1 comment
-
If you can live with the metrics getting logged to a file, I highly recommend the `CSVLogger`:

```python
import os
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import CSVLogger

logdir = os.getcwd()  # this sets where the file will be stored
trainer = Trainer(logger=CSVLogger(logdir))
```

This will create a CSV file containing everything you logged.
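To illustrate, here is a minimal sketch of reading such a file back with only the stdlib. CSVLogger writes its metrics as CSV (a `metrics.csv` under the log directory by default); the column names below (`epoch`, `step`, `val_loss`) are assumptions that depend on what your module actually logged, and the `io.StringIO` stands in for the real file:

```python
import csv
import io

# Stand-in for the CSV file the logger would have written; in practice you
# would open the metrics file from your log directory instead.
metrics_csv = io.StringIO(
    "epoch,step,val_loss\n"
    "0,99,0.91\n"
    "1,199,0.64\n"
)

# Parse the rows and print the per-epoch validation loss.
rows = list(csv.DictReader(metrics_csv))
for row in rows:
    if row["val_loss"]:
        print(f"epoch {row['epoch']}: val_loss={row['val_loss']}")
```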
-
Hi, I am new to PL. I would like to print the loss every n steps (or once per epoch) without displaying the progress bar. Right now I just disable the progress bar, but then nothing gets printed at all, including the loss. My code has something like

```python
self.log('val_loss', reduced_loss, prog_bar=False, logger=True, on_step=False, on_epoch=True, sync_dist=True)
```

but I cannot find where the log goes; I just set `Trainer(logger=True)`. How can I get the log printed to the console or to a file? Thank you very much!
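One common approach for printing to the console (rather than a file) is a custom `Callback`. Below is a hedged sketch, not a verified recipe: it assumes a recent pytorch_lightning version where `on_train_batch_end` has the signature `(trainer, pl_module, outputs, batch, batch_idx)`, and that the module logs a metric named `train_loss` with `self.log(..., on_step=True)`; the class and metric names are made up for illustration. The `try/except` keeps the sketch importable even without Lightning installed:

```python
try:
    from pytorch_lightning import Callback
except ImportError:  # allow the sketch to load without Lightning installed
    Callback = object

class PrintLossEveryN(Callback):
    """Print the latest logged training loss every `every_n_steps` steps."""

    def __init__(self, every_n_steps: int = 100):
        self.every_n_steps = every_n_steps

    def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch_idx):
        step = trainer.global_step
        if step > 0 and step % self.every_n_steps == 0:
            # callback_metrics holds the most recent values passed to self.log
            loss = trainer.callback_metrics.get("train_loss")
            if loss is not None:
                print(f"step {step}: train_loss={float(loss):.4f}")

# Usage (hypothetical): pass the callback and disable the progress bar, e.g.
# Trainer(callbacks=[PrintLossEveryN(50)], enable_progress_bar=False)
# (older versions used progress_bar_refresh_rate=0 instead).
```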