Learn Rate, Loss, TensorBoard Discussion #1773
JustAnOkapi started this conversation in General
- In the case of DreamBooth, a constant learning rate for a small dataset is not ideal.
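  A non-constant schedule can be set up with diffusers' `get_scheduler`; a minimal sketch, assuming the diffusers optimization helpers, with illustrative step counts and a stand-in parameter list:

  ```python
  # Sketch: swapping a constant LR for a cosine schedule in a
  # DreamBooth-style training loop (names and values are illustrative).
  import torch
  from diffusers.optimization import get_scheduler

  params = [torch.nn.Parameter(torch.zeros(1))]  # stand-in for unet.parameters()
  optimizer = torch.optim.AdamW(params, lr=5e-6)

  lr_scheduler = get_scheduler(
      "cosine",                  # instead of "constant"
      optimizer=optimizer,
      num_warmup_steps=0,
      num_training_steps=1000,   # total steps for the run
  )

  for step in range(1000):
      # ... forward/backward would go here ...
      optimizer.step()
      lr_scheduler.step()
      optimizer.zero_grad()
  ```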
- Why is the loss so all over the place anyway? Even the average doesn't go down.
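  One likely reason the raw loss looks noisy is that each training step samples a random timestep and random noise, so per-step values vary a lot even when training is progressing. Smoothing, as TensorBoard's smoothing slider does, makes the trend easier to read. A minimal sketch of that kind of exponential moving average:

  ```python
  # Sketch: exponential moving average (EMA) smoothing of a noisy
  # per-step loss series, similar to TensorBoard's smoothing slider.
  def ema_smooth(values, weight=0.9):
      smoothed, last = [], values[0]
      for v in values:
          last = weight * last + (1 - weight) * v
          smoothed.append(last)
      return smoothed

  # Usage: losses = [...]  # per-step loss values read from your logs
  # trend = ema_smooth(losses)
  ```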
- A constant `lr_scheduler` is slightly better: using constant gives slightly lower loss and is easier to resume from. Same conclusion as this reddit post.
You can enable TensorBoard pretty easily with two blocks:
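A minimal sketch of those two blocks, assuming a notebook environment and a `logs` output directory:

```python
# Block 1 (sketch): load the TensorBoard notebook extension.
%load_ext tensorboard
```

```python
# Block 2 (sketch): launch TensorBoard pointed at the training logs;
# it keeps running in the background while you start training.
%tensorboard --logdir logs
```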
This will stay running in the background so you can start training.
I manually reorganize the log files into this structure: `(run) / (unet or text) / (steps)`
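A sketch of that reorganization, where the source and destination paths and the per-file labels are assumptions to fill in:

```python
# Sketch: move TensorBoard event files into (run)/(unet or text)/(steps)
# so related runs show up grouped in TensorBoard's run list.
import shutil
from pathlib import Path

src = Path("logs")           # where training wrote the event files
dst = Path("logs_by_run")    # reorganized layout for TensorBoard

for event_file in src.rglob("events.out.tfevents.*"):
    run, part, steps = "run1", "unet", "1000"  # fill in per file
    target = dst / run / part / steps
    target.mkdir(parents=True, exist_ok=True)
    shutil.move(str(event_file), str(target / event_file.name))
```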