[Supervised ML] Classification with logistic regression - 8
ghwangbo | 2023. 7. 30. 16:39

1. Intuition
Linear regression is not a good method for a classification problem.
Why?
A single outlier far to the right pulls the fitted line toward it, shifting the point where the prediction crosses the threshold. In other words, one outlier can move the decision boundary dramatically.
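A minimal sketch of this effect, assuming a 1-D toy dataset (the feature values and the outlier below are invented for illustration): fit a straight line to 0/1 labels with least squares, then see where it crosses the 0.5 threshold before and after adding one extreme point.

```python
import numpy as np

# Toy 1-D classification data: one feature, 0/1 labels (values invented).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([0, 0, 0, 1, 1, 1])

def boundary_from_linear_fit(x, y):
    # Fit y ~ w*x + b by least squares, then solve w*x + b = 0.5
    # for the point where the line crosses the 0.5 threshold.
    w, b = np.polyfit(x, y, deg=1)
    return (0.5 - b) / w

print(boundary_from_linear_fit(x, y))  # 3.5: sits between the two classes

# Add one extreme positive example far to the right.
x_out = np.append(x, 20.0)
y_out = np.append(y, 1)
print(boundary_from_linear_fit(x_out, y_out))  # ~4.3: the boundary shifts right
```

With the boundary at roughly 4.3, the positive example at x = 4 is now misclassified, even though the extra point only confirmed what the model already predicted.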
2. Logistic Regression
2.1 Formula
Unlike linear regression, the logistic function lets the model produce a curved, S-shaped output.

The sigmoid function, also known as the logistic function, is

g(z) = 1 / (1 + e^(-z)), where 0 < g(z) < 1

The x-axis represents the number z, and the output is always a value between 0 and 1.

Why do we use the sigmoid function? Because it squashes any real number into the range (0, 1), which makes the output easy to interpret and makes it easy to categorize predictions by setting a threshold.

The value z is the output of the linear part, z = w * x + b. Substituting it into the sigmoid gives the logistic regression model:

f(x) = g(w * x + b) = 1 / (1 + e^(-(w * x + b)))
2.2 Interpretation of logistic regression
If a patient comes in and the logistic function outputs 0.7, the model estimates that the patient's tumor has a 70% chance of being malignant.
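A short sketch of this interpretation, with made-up parameters and features (the w, b, and x values below are hypothetical, not values from the course):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Hypothetical trained parameters and one patient's feature vector.
w = np.array([0.7, 0.4])
b = -2.0
x_patient = np.array([3.0, 1.5])

# z = w . x + b = 0.7, so the model outputs sigmoid(0.7) ~= 0.67,
# i.e., an estimated ~67% chance the tumor is malignant.
p_malignant = sigmoid(np.dot(w, x_patient) + b)
print(p_malignant)
```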
2.3 Decision Boundary
We have to set a threshold to decide whether our prediction is 1 (yes) or 0 (no). A common choice is 0.5: predict 1 when f(x) >= 0.5, which happens exactly when z = w * x + b >= 0.
Example)
Suppose z = x1 + x2 - 3. Setting z = 0 gives x1 + x2 = 3, which is the decision boundary equation: points with x1 + x2 >= 3 are predicted as 1, and points with x1 + x2 < 3 are predicted as 0.
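A minimal sketch of this boundary in code, using the example's parameters (w1 = w2 = 1, b = -3):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Parameters from the example: z = x1 + x2 - 3
w = np.array([1.0, 1.0])
b = -3.0

def predict(x, threshold=0.5):
    # sigmoid(z) >= 0.5 exactly when z >= 0, i.e., x1 + x2 >= 3.
    return int(sigmoid(np.dot(w, x) + b) >= threshold)

print(predict(np.array([1.0, 1.0])))  # 0: x1 + x2 = 2 < 3, below the boundary
print(predict(np.array([2.0, 2.0])))  # 1: x1 + x2 = 4 >= 3, above the boundary
```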
3. Code
Sigmoid function
```python
import numpy as np

def sigmoid(z):
    # Logistic function: maps any real z to a value in (0, 1).
    # Works elementwise on scalars and numpy arrays.
    g = 1 / (1 + np.exp(-z))
    return g
```
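For example, evaluating it on a few values of z:

```python
z = np.array([-5.0, 0.0, 5.0])
print(sigmoid(z))  # [0.00669285 0.5        0.99330715]
```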
4. Understanding the Logistic function equation
To understand the logistic function, we first have to understand odds.
The odds of x happening are the ratio of the probability that x happens to the probability that it does not. The equation for that is p / (1 - p).
We often use the logarithm of the odds (the log-odds) because it maps probability onto a linear scale running from negative infinity to positive infinity. Logistic regression models this log-odds as a linear function of x: log(p / (1 - p)) = w * x + b.
However, what we want is the probability of x happening. Thus, we convert the log-odds equation into a probability equation, which yields the logistic regression equation above.
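Solving the log-odds equation for p makes this explicit (a standard derivation, spelled out here for completeness):

log(p / (1 - p)) = w * x + b
p / (1 - p) = e^(w * x + b)
p = e^(w * x + b) * (1 - p)
p * (1 + e^(w * x + b)) = e^(w * x + b)
p = e^(w * x + b) / (1 + e^(w * x + b)) = 1 / (1 + e^(-(w * x + b)))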