[Supervised ML] Review - 11

This is the overview of supervised learning. Based on the nature of the output, we can divide problems into regression and classification. To solve them, we use linear regression for regression problems and logistic regression for classification problems. In either case, we have to find the linear or logistic regression formula that best fits the dataset.

The tool for finding that formula is the cost function. It measures the gap between the regression formula's predictions and the answers in the dataset, and the parameters that minimize this cost give the best regression formula for the problem. But we cannot manually test every single w and b, so we find the best w and b through gradient descent, which repeatedly adjusts the parameters in the direction that lowers the cost.

We also face a different problem: bias and variance. If our formula underfits or overfits the data, we have to change it to be the right fit. One way to do this is regularization: by increasing the regularization parameter λ in the cost function, we can decrease the magnitudes of the parameters w, which keeps the model from overfitting.
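
To make the cost function and gradient descent concrete, here is a minimal NumPy sketch for linear regression. The function names (`compute_cost`, `gradient_descent`) and the tiny dataset are my own illustrative choices, not code from the course.

```python
import numpy as np

def compute_cost(X, y, w, b):
    """Squared-error cost: average gap between predictions and answers."""
    m = X.shape[0]
    err = X @ w + b - y
    return np.sum(err ** 2) / (2 * m)

def gradient_descent(X, y, w, b, alpha, num_iters):
    """Repeatedly step w and b against the gradient of the cost."""
    m = X.shape[0]
    for _ in range(num_iters):
        err = X @ w + b - y              # prediction error per example
        w = w - alpha * (X.T @ err) / m  # partial derivative of cost w.r.t. w
        b = b - alpha * np.sum(err) / m  # partial derivative of cost w.r.t. b
    return w, b

# Tiny example: recover y = 2x + 1 from four noise-free points.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([3.0, 5.0, 7.0, 9.0])
w, b = gradient_descent(X, y, np.zeros(1), 0.0, alpha=0.05, num_iters=2000)
print(w, b)  # close to [2.] and 1.0
```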
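
For classification, the same machinery applies with the sigmoid function and the logistic (cross-entropy) cost; a sketch under the same assumptions:

```python
import numpy as np

def sigmoid(z):
    """Squash a linear score into a probability between 0 and 1."""
    return 1.0 / (1.0 + np.exp(-z))

def compute_cost_logistic(X, y, w, b):
    """Logistic (cross-entropy) cost for binary labels y in {0, 1}."""
    m = X.shape[0]
    p = sigmoid(X @ w + b)  # predicted probability of class 1
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p)) / m
```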
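
Finally, a sketch of how regularization modifies the cost: an L2 penalty on w is added, so a larger λ drives the learned w toward smaller magnitudes, which counteracts overfitting. The function name and the penalty form (standard L2) are assumptions for illustration.

```python
import numpy as np

def compute_cost_regularized(X, y, w, b, lambda_):
    """Squared-error cost plus an L2 penalty on w (b is not penalized)."""
    m = X.shape[0]
    err = X @ w + b - y
    penalty = lambda_ * np.sum(w ** 2) / (2 * m)  # grows with |w|, so larger lambda_ shrinks w
    return np.sum(err ** 2) / (2 * m) + penalty
```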