AI/ML Specialization
[Supervised ML] Gradient descent / Learning Rate - 6
ghwangbo 2023. 7. 28. 13:35
1. Checking Gradient descent for convergence
How can we check whether gradient descent is working well?
1.1 Graph
This graph is called the learning curve. As the number of iterations increases, the cost should decrease. If the curve stays flat after enough iterations, we say that gradient descent has converged.
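As a minimal sketch (not from the original post), here is one way to record and plot a learning curve for batch gradient descent on linear regression. The toy data, the learning rate `alpha`, and the iteration count are all assumptions for illustration:

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy 1-D linear regression data (assumed for illustration)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])
m = len(x)

def cost(w, b):
    # Squared error cost J(w, b) = (1 / 2m) * sum((w*x + b - y)^2)
    return np.sum((w * x + b - y) ** 2) / (2 * m)

w, b = 0.0, 0.0
alpha = 0.01          # assumed learning rate
costs = []
for _ in range(200):
    err = w * x + b - y
    # Simultaneous update of w and b (batch gradient descent)
    w -= alpha * np.sum(err * x) / m
    b -= alpha * np.sum(err) / m
    costs.append(cost(w, b))

# Learning curve: cost versus iteration number
plt.plot(costs)
plt.xlabel("iteration")
plt.ylabel("cost J(w, b)")
plt.title("Learning curve")
plt.show()
```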
1.2 Epsilon / Automatic convergence test
Let epsilon be 0.001.
If the cost function decreases by less than epsilon in one iteration, declare convergence.
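A minimal sketch of this automatic convergence test, assuming a cost history like the `costs` list recorded in the sketch above; the epsilon value follows the text:

```python
def has_converged(costs, epsilon=0.001):
    # Automatic convergence test: declare convergence when the cost
    # decreases by less than epsilon between consecutive iterations.
    if len(costs) < 2:
        return False
    return (costs[-2] - costs[-1]) < epsilon

# Example: check the cost history recorded in the sketch above
print(has_converged(costs))
```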
2. Learning Rate
When the learning rate is too large, or when there is a bug in our code, the cost may increase or bounce up and down from one iteration to the next.
We can identify the problem by looking at the plot of the cost function against the iteration number.
If the learning rate is too small, gradient descent will take a long time to reach the minimum cost.
Values of the learning rate to try (see the sketch below):
0.001, 0.01, 0.1, ...
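One way to compare these candidate values is to run a fixed number of iterations with each and watch whether the cost falls. The sketch below reuses the assumed toy data and update rule from the first example; the `run_gd` helper is hypothetical, not from the course:

```python
import numpy as np

# Same assumed toy data as in the first sketch
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])
m = len(x)

def run_gd(alpha, iters=100):
    # Run batch gradient descent with the given learning rate and
    # return the cost history so different alphas can be compared.
    w, b = 0.0, 0.0
    history = []
    for _ in range(iters):
        err = w * x + b - y
        w -= alpha * np.sum(err * x) / m
        b -= alpha * np.sum(err) / m
        history.append(np.sum((w * x + b - y) ** 2) / (2 * m))
    return history

for alpha in [0.001, 0.01, 0.1]:
    history = run_gd(alpha)
    # A rising cost suggests alpha is too large (or a bug in the code);
    # a very slowly falling cost suggests alpha is too small.
    print(f"alpha={alpha}: cost after 1 iter {history[0]:.3f}, "
          f"after {len(history)} iters {history[-1]:.3f}")
```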