Posts tagged "supervised ml" (1)
My Progress

1. Checking gradient descent for convergence. How can we check whether gradient descent is working well? 1.1 Graph: plot the cost against the number of iterations; this plot is called a learning curve. As the number of iterations increases, the cost should keep decreasing. If the curve stays flat after enough iterations, we say gradient descent has converged. 1.2 Epsilon / automatic convergence test: let epsilon be 0.001. If the cost function decreases by epsilon in one..
AI/ML Specialization
2023. 7. 28. 13:35
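
The excerpt above describes two convergence checks: reading the learning curve and the automatic (epsilon) test. Below is a minimal sketch of both; it is not code from the post, and the toy data, learning rate, and function names are illustrative assumptions. It runs batch gradient descent on a small univariate linear regression problem, records the cost at every iteration so it can be plotted as a learning curve, and stops early once the cost decreases by less than epsilon = 0.001 in a single iteration.

```python
import numpy as np
import matplotlib.pyplot as plt

# Toy data for univariate linear regression (illustrative values, not from the post).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
m = len(x)

def cost(w, b):
    """Mean squared error cost J(w, b) = (1 / 2m) * sum((w*x + b - y)^2)."""
    return np.sum((w * x + b - y) ** 2) / (2 * m)

def gradient_descent(alpha=0.01, max_iters=10_000, epsilon=1e-3):
    """Run gradient descent, recording the cost so a learning curve can be plotted.

    Stops early (automatic convergence test) when the cost decreases by less
    than `epsilon` in a single iteration.
    """
    w, b = 0.0, 0.0
    history = [cost(w, b)]
    for i in range(max_iters):
        err = w * x + b - y
        w -= alpha * np.dot(err, x) / m   # gradient dJ/dw
        b -= alpha * np.sum(err) / m      # gradient dJ/db
        history.append(cost(w, b))
        if history[-2] - history[-1] < epsilon:   # automatic convergence test
            print(f"Converged after {i + 1} iterations")
            break
    return w, b, history

w, b, history = gradient_descent()

# Learning curve: cost J versus iteration number.
plt.plot(history)
plt.xlabel("iteration")
plt.ylabel("cost J(w, b)")
plt.title("Learning curve")
plt.show()
```

If the recorded cost ever increases from one iteration to the next, the learning rate is likely too large (or the gradient has a bug), which is exactly the kind of problem the learning curve makes easy to spot.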