My Progress


AI/ML Specialization

[Supervised ML] Gradient descent / Learning Rate - 6

ghwangbo 2023. 7. 28. 13:35

1. Checking Gradient descent for convergence


How can we check whether gradient descent is working well?

 

1.1 Graph

A plot of the cost against the iteration number is called a learning curve. As the number of iterations increases, the cost should decrease. If the curve stays (nearly) flat after enough iterations, we say that gradient descent has converged.
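The cost values used to draw this learning curve can be recorded while running gradient descent. A minimal sketch for linear regression (the toy data, function names, and hyperparameters here are my own illustration, not from the course):

```python
import numpy as np

def compute_cost(X, y, w, b):
    """Mean squared error cost for linear regression."""
    m = len(y)
    predictions = X @ w + b
    return np.sum((predictions - y) ** 2) / (2 * m)

def gradient_descent(X, y, w, b, alpha, num_iters):
    """Run gradient descent, recording the cost after every iteration."""
    m = len(y)
    cost_history = []
    for _ in range(num_iters):
        err = X @ w + b - y
        w -= alpha * (X.T @ err) / m       # update weights
        b -= alpha * np.sum(err) / m       # update bias
        cost_history.append(compute_cost(X, y, w, b))
    return w, b, cost_history

# Toy data generated from y = 2x + 1
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])
w, b, costs = gradient_descent(X, y, np.zeros(1), 0.0, alpha=0.05, num_iters=200)
# Plotting `costs` against the iteration index gives the learning curve;
# for a working run it should be non-increasing.
```

With a suitable learning rate the recorded costs decrease monotonically, which is exactly the shape a healthy learning curve should have.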

 

1.2 Epsilon / Automatic convergence test

 

Let epsilon be a small number, e.g. 0.001.

If the cost function decreases by less than epsilon in one iteration, declare convergence.
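As a sketch, this automatic convergence test can be folded directly into the update loop. The one-parameter cost J(w) = (w − 3)² below is a hypothetical example of mine, chosen only so the loop is self-contained:

```python
def automatic_convergence_test(alpha=0.1, epsilon=1e-3, max_iters=10_000):
    """Gradient descent on J(w) = (w - 3)**2 with an epsilon stopping rule."""
    w = 0.0
    prev_cost = float("inf")
    for i in range(max_iters):
        grad = 2 * (w - 3)          # dJ/dw
        w -= alpha * grad
        cost = (w - 3) ** 2
        # Automatic convergence test: stop as soon as the cost
        # decreases by less than epsilon in a single iteration.
        if prev_cost - cost < epsilon:
            return w, i
        prev_cost = cost
    return w, max_iters

w_final, n_iters = automatic_convergence_test()
# w_final ends up close to the true minimizer w = 3,
# long before max_iters is exhausted.
```

In practice choosing a good epsilon is not obvious, which is why inspecting the learning curve by eye is often the simpler check.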

 

2. Learning Rate


If the cost sometimes increases between iterations, either the learning rate is too large or there is a bug in our code.

 

We can diagnose the problem by plotting the cost at each iteration and inspecting the resulting learning curve.

If the learning rate is too small, gradient descent will need a very large number of iterations to reach the minimum cost.
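Both failure modes are easy to reproduce on a toy problem. Minimizing the hypothetical cost J(w) = w² (my own example), the update is w ← w − α·2w, so too large an α makes the cost blow up while too small an α barely moves it:

```python
def run(alpha, iters=20):
    """Minimize J(w) = w**2 from w = 1 and return the cost history."""
    w = 1.0
    costs = []
    for _ in range(iters):
        w -= alpha * 2 * w       # gradient of w**2 is 2w
        costs.append(w ** 2)
    return costs

too_large = run(alpha=1.1)    # overshoots: |1 - 2*alpha| > 1, cost diverges
too_small = run(alpha=0.001)  # converges, but extremely slowly
good      = run(alpha=0.1)    # cost decreases steadily toward zero
```

Comparing the three histories shows exactly the learning-curve shapes described above: rising (too large), nearly flat (too small), and smoothly decreasing (well chosen).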

 

Values of Learning rate to try:

0.001, 0.01, 0.1, ... — then pick the largest value for which the cost still decreases steadily.
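The search over candidate values can be sketched as a short trial run per candidate. The cost J(w) = (w − 3)² and the helper name below are my own illustration; the idea is just to compare the cost each candidate reaches after a fixed number of steps:

```python
def trial_run(alpha, iters=50):
    """A few gradient descent steps on J(w) = (w - 3)**2; return final cost."""
    w = 0.0
    for _ in range(iters):
        w -= alpha * 2 * (w - 3)   # gradient of (w - 3)**2 is 2*(w - 3)
    return (w - 3) ** 2

# Candidate learning rates spaced roughly a factor of 3 apart
candidates = [0.001, 0.003, 0.01, 0.03, 0.1, 0.3]
results = {a: trial_run(a) for a in candidates}

# Pick the candidate that reached the lowest cost in the trial run
best = min(results, key=results.get)
```

On this toy cost the larger candidates win, but on a real problem the largest values may diverge, which is why each one needs its own trial run and learning curve.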

 

 
