How to use this together with wandb? Plus a question about class imbalance #489

Answered by KevinMusgrave
jasperhyp asked this question in Q&A

Hi Kevin, thanks for creating this wonderful repo! I was wondering what the best practice is for using it together with wandb, which I typically use for hyperparameter tuning. Basically, I need to log each epoch's loss, metrics, etc. to wandb. I guess I either have to skip the trainer and write the training code by hand in plain PyTorch, or perhaps use a custom hook.

Yes, you could pass in end_of_iteration_hook like this:

import wandb

def hook(trainer):
    # trainer.losses maps each loss name to its current value
    for k, v in trainer.losses.items():
        wandb.log({k: v})

trainer = MetricLossOnly(..., end_of_iteration_hook=hook)
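To make the hook mechanism concrete, here is a minimal self-contained sketch. ToyTrainer is a hypothetical stand-in (not the library's actual class) that only mimics the relevant behavior: the trainer exposes a losses dict and calls the hook once per iteration, passing itself as the argument.

```python
# Hypothetical minimal stand-in for the trainer loop, just to show
# when end_of_iteration_hook fires and what it receives.
class ToyTrainer:
    def __init__(self, end_of_iteration_hook=None):
        self.losses = {}  # the real trainers expose a dict like this
        self.end_of_iteration_hook = end_of_iteration_hook

    def train(self, num_iterations=3):
        for i in range(num_iterations):
            self.losses["metric_loss"] = 1.0 / (i + 1)  # fake loss value
            if self.end_of_iteration_hook is not None:
                self.end_of_iteration_hook(self)

history = []

def hook(trainer):
    # In real code, replace this append with: wandb.log(dict(trainer.losses))
    history.append(dict(trainer.losses))

ToyTrainer(end_of_iteration_hook=hook).train()
print(history[0])  # {'metric_loss': 1.0}
```

The hook receives the trainer object itself, so anything the trainer tracks (losses, the current iteration, the models) is available for logging.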

But instead of this, I would write my own training code, or use a framework like PyTorch Lightning, PyTorch Ignite…

Answer selected by jasperhyp