
AI/ML Specialization

[Supervised ML] Classification with logistic regression - 8

ghwangbo 2023. 7. 30. 16:39

1. Intuition


Linear regression is not a good method for classification problems.


Why?

A single outlier on the far right changes the fitted linear regression line, which shifts the decision boundary dramatically and causes previously well-classified examples to be misclassified.
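
A minimal sketch of this effect (the tumor-size data below is made up for illustration): fit a line with least squares, read off where it crosses 0.5, and watch the boundary move when one large outlier is added.

import numpy as np

def boundary_from_linear_fit(x, y):
    # Fit y = m*x + c by least squares, then solve m*x + c = 0.5 for x
    m, c = np.polyfit(x, y, 1)
    return (0.5 - c) / m

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # tumor sizes (made-up values)
y = np.array([0,   0,   0,   1,   1,   1])     # 0 = benign, 1 = malignant

print(boundary_from_linear_fit(x, y))                  # boundary at 3.5
print(boundary_from_linear_fit(np.append(x, 20.0),     # add one large malignant outlier
                               np.append(y, 1)))       # boundary moves past 4: x = 4 is now misclassified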

 

2. Logistic Regression


2.1 Formula

The logistic function produces an S-shaped curve, unlike linear regression, which can only fit a straight line.

 

This is the sigmoid function, also known as the logistic function: g(z) = 1 / (1 + e^(-z)). The x-axis represents the number z, and the function outputs a value between 0 and 1.

 

Why do we use the sigmoid function?

Because it maps any input to a value between 0 and 1, which can be read as a probability, making it easy to set a threshold and assign a category.

 

 

The value z is the output of the linear part, z = w · x + b. Feeding z into the sigmoid gives the logistic regression model: f(x) = g(w · x + b) = 1 / (1 + e^(-(w · x + b))).
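
A minimal sketch of that model in code (the weights, bias, and feature values below are assumed for illustration, not fitted to any data):

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def predict_proba(x, w, b):
    z = np.dot(w, x) + b      # linear part: z = w · x + b
    return sigmoid(z)         # squash z into (0, 1)

w = np.array([1.5, -0.5])     # assumed example weights
b = -1.0                      # assumed example bias
x = np.array([2.0, 1.0])      # one example with two features

print(predict_proba(x, w, b)) # estimated probability that y = 1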

 

 

2.2 Interpretation of logistic regression

If a patient comes in and the logistic function outputs 0.7, the model estimates a 70% probability that the tumor is malignant (and a 30% probability that it is benign).

 

3. Decision Boundary


We have to set a threshold to decide whether our prediction is 1 (yes) or 0 (no). A common choice is 0.5: predict 1 whenever f(x) ≥ 0.5, which happens exactly when z ≥ 0.

 

Example)

With two features and z = x1 + x2 - 3, the prediction flips where z = 0, so x1 + x2 = 3 is the decision boundary equation.
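
A small sketch of this boundary in code, using the same z = x1 + x2 - 3 example and a 0.5 threshold:

import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def predict(x1, x2, threshold=0.5):
    z = x1 + x2 - 3                        # w1 = w2 = 1, b = -3 as in the example
    return int(sigmoid(z) >= threshold)    # 1 exactly when x1 + x2 >= 3

print(predict(1.0, 1.0))   # x1 + x2 = 2 < 3   -> 0
print(predict(2.5, 1.0))   # x1 + x2 = 3.5 > 3 -> 1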

 

 

4. Code


Sigmoid function

import numpy as np

def sigmoid(z):
    # Sigmoid (logistic) function: maps any real z to a value in (0, 1)
    g = 1 / (1 + np.exp(-z))

    return g
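
A quick check of the sigmoid defined above (outputs are approximate):

print(sigmoid(np.array([-5, 0, 5])))
# roughly [0.0067, 0.5, 0.9933]: large negative z maps near 0,
# z = 0 maps to exactly 0.5, and large positive z maps near 1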

5. Understanding the logistic function equation


 

To understand the logistic function, we have to understand the odds.

The odds of x happening are the probability of x happening divided by the probability of x not happening: p / (1 - p).
We often use the log-odds, ln(p / (1 - p)), to put the relationship between the probability and the input on a linear scale that runs from negative infinity to positive infinity.

 

However, we want the probability of x happening itself. Setting the log-odds equal to the linear model, ln(p / (1 - p)) = w · x + b = z, and solving for p gives p = 1 / (1 + e^(-z)), which is the logistic regression equation above.
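
A small numerical check of that round trip (a sketch, not from the original post): start from a probability p, compute its log-odds z, and confirm the sigmoid maps z back to p.

import numpy as np

p = 0.7
z = np.log(p / (1 - p))           # log-odds of p (about 0.847)
p_back = 1 / (1 + np.exp(-z))     # sigmoid of the log-odds

print(z, p_back)                  # p_back is 0.7 again: the sigmoid inverts the log-odds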
