My Progress
1. Cost function

1.1 Intuition

We use logistic regression (the sigmoid function) to estimate a data point's label or category. How do we choose w and b? For linear regression, we used the squared error cost. The linear regression cost is convex, so the standard gradient descent update converges to the global minimum. With logistic regression, however, the squared error cost is non-convex and has multiple local minima. Thus, we ca..
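To make the contrast concrete, here is a minimal sketch (with a small hypothetical 1-D dataset) of the two costs discussed above: the squared error cost applied to the sigmoid, and the cross-entropy (log) loss that logistic regression uses instead because it stays convex.

```python
import math

def sigmoid(z):
    """Map any real z to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def squared_error_cost(w, b, xs, ys):
    """Squared error cost; non-convex in w, b once f is the sigmoid."""
    n = len(xs)
    return sum((sigmoid(w * x + b) - y) ** 2 for x, y in zip(xs, ys)) / (2 * n)

def logistic_cost(w, b, xs, ys):
    """Cross-entropy (log) loss; the convex cost used for logistic regression."""
    n = len(xs)
    total = 0.0
    for x, y in zip(xs, ys):
        f = sigmoid(w * x + b)
        total += -y * math.log(f) - (1 - y) * math.log(1 - f)
    return total / n

# Hypothetical toy data: label 1 when x is large.
xs = [0.5, 1.0, 2.5, 3.0]
ys = [0, 0, 1, 1]

# A reasonable decision boundary (w=1, b=-1.75) gives a lower log loss
# than a badly oriented one (w=-1, b=0).
print(logistic_cost(1.0, -1.75, xs, ys))
print(logistic_cost(-1.0, 0.0, xs, ys))
```

The parameter values here are illustrative, not from the original post; the point is only that the log loss rewards parameters that separate the labels correctly, and (unlike the squared error on a sigmoid) it has a single global minimum that gradient descent can reach.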
AI/ML Specialization
2023. 7. 31. 17:14