
Weird spikes in the loss #84

Open
return-nihil opened this issue May 14, 2024 · 0 comments
return-nihil commented May 14, 2024

Hi!

I'm currently experimenting with a custom X-UNet to train a conditional model with my own embeddings, using classifier-free guidance with a conditioning-dropout probability of 0.1. I tested the whole pipeline in advance on a toy UNet.
Training seems to work fine, and the samples generated at checkpoints are indeed improving, also with respect to the conditioning signals.
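For reference, the conditioning-dropout step used during classifier-free guidance training can be sketched as below. This is a minimal, framework-agnostic illustration; `apply_cfg_dropout` and the null-embedding handling are hypothetical names, not part of the x-unet API:

```python
import random

def apply_cfg_dropout(cond_embeddings, null_embedding, p_uncond=0.1, rng=random):
    # With probability p_uncond, replace an item's conditioning embedding
    # with a "null" embedding, so the model also learns the unconditional
    # score that guidance interpolates against at sampling time.
    return [null_embedding if rng.random() < p_uncond else c
            for c in cond_embeddings]

# Over many items, roughly 10% end up training unconditionally.
rng = random.Random(0)
conds = list(range(10_000))
dropped = apply_cfg_dropout(conds, None, p_uncond=0.1, rng=rng)
frac = sum(d is None for d in dropped) / len(dropped)
```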

However, I noticed some strange behaviour in the training-loss curve. During the first few iterations the loss improves quickly and reaches a plateau, stabilizing at a mean value of approximately 0.02. After some time, though, weird spikes start to appear in the loss curve, with values around 0.25. They are non-periodic and not epoch-dependent: they begin at a seemingly random point and from there recur roughly every 200 steps on average.
I also tried restarting the training from the last checkpoint: the spikes initially disappeared, but reappeared after a further ~20k steps (see the attached image).
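A common, framework-agnostic mitigation for isolated loss spikes like these is to skip (or down-weight) optimizer steps whose loss is far above a recent running mean. A minimal sketch, with an illustrative `SpikeGuard` helper and thresholds that are assumptions, not part of any library:

```python
from collections import deque

class SpikeGuard:
    """Flag training steps whose loss is far above a running average.

    Hypothetical helper for illustration only; window and factor are
    arbitrary and would need tuning for a real training run.
    """
    def __init__(self, window=200, factor=5.0):
        self.history = deque(maxlen=window)
        self.factor = factor

    def should_skip(self, loss):
        # Warm up: always accept until the history window is full.
        if len(self.history) < self.history.maxlen:
            self.history.append(loss)
            return False
        mean = sum(self.history) / len(self.history)
        spike = loss > self.factor * mean
        if not spike:
            # Only non-spike losses update the running statistics.
            self.history.append(loss)
        return spike
```

In a training loop one would call `guard.should_skip(loss.item())` before `optimizer.step()` and skip the update (or clip gradients harder) when it returns True.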

Has anybody experienced similar behaviour? What could be the cause? Is it a problem?

Thank you!

[Attached image: training-loss curve, screenshot 2024-05-14 at 13:48:16]