
Question: weird valid loss when re-scaling y #1013

Closed · Answered by BenjaminBossan
lfbittencourt asked this question in Q&A

First of all, I must say this project has been a fundamental part of my master's thesis, so thank you very much for that.

Happy to hear that, thanks.

I converted the issue into a discussion, I hope you don't mind.

Regarding your problem, I could reproduce it with a synthetic dataset. My first thought was that scaling y changes the order of magnitude of the loss, so the learning rate would need adjusting to prevent overfitting. But after experimenting a bit with this, I no longer believe that this is the problem (or at least not the whole problem).
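To make the magnitude effect concrete, here is a quick sketch (made-up numbers, not from your dataset): rescaling the targets by a constant k multiplies the MSE loss, and hence its gradients, by k², which is why the learning rate can need re-tuning after scaling y.

```python
import numpy as np

# Illustrative toy targets and predictions (not from the actual experiment).
y_true = np.array([10.0, 20.0, 30.0])
y_pred = np.array([12.0, 18.0, 33.0])

def mse(pred, true):
    # Mean squared error, as used by the default regression criterion.
    return np.mean((pred - true) ** 2)

k = 0.01
loss_raw = mse(y_pred, y_true)
# Scaling both targets and predictions by k scales MSE by k**2,
# so gradient magnitudes shrink by the same factor.
loss_scaled = mse(k * y_pred, k * y_true)

print(loss_raw, loss_scaled)  # loss_scaled == k**2 * loss_raw
```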

Interestingly, when I used the default loss (MSE), there was no such weird behavior. One reason could be the normalization step in MAPE, whi…
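As a sketch of why MAPE's normalization step could matter (again with made-up numbers): MAPE divides each error by the magnitude of the target, so a purely multiplicative rescaling of y cancels out, while a rescaling that shifts targets toward zero (e.g. min-max scaling) can blow the loss up.

```python
import numpy as np

def mape(pred, true):
    # Mean absolute percentage error: each error is normalized
    # by the magnitude of the corresponding target.
    return np.mean(np.abs((true - pred) / true))

y_true = np.array([10.0, 20.0, 30.0])
y_pred = np.array([12.0, 18.0, 33.0])

# A purely multiplicative rescaling cancels in the ratio,
# so the MAPE value is unchanged:
print(mape(y_pred, y_true))
print(mape(0.01 * y_pred, 0.01 * y_true))

# But a shift that pushes some targets toward zero makes the
# denominator tiny and the loss explode:
shift = 9.9
print(mape(y_pred - shift, y_true - shift))
```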

Replies: 1 comment · 3 replies (@lfbittencourt, @BenjaminBossan, @lfbittencourt)
Answer selected by lfbittencourt
Category: Q&A · Labels: none yet · 2 participants
This discussion was converted from issue #1012 on August 25, 2023 14:53.