[Supervised ML] Feature Engineering - 7
1. What is Feature Engineering?
Feature engineering means using intuition to design new features by transforming or combining the original features.
Example) House prices.
If we are given the dimensions of a house, such as the length and width of its lot, we can create a new variable to include in the price prediction. For example, we can create an area feature by multiplying length and width.
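As a minimal sketch (with made-up numbers; the feature names are just illustrative), here is how the area feature could be derived and stacked next to the original inputs:

```python
import numpy as np

# Hypothetical lot dimensions for three houses
length = np.array([50.0, 40.0, 60.0])
width = np.array([30.0, 25.0, 45.0])

# Engineered feature: area = length * width
area = length * width

# The model's input matrix now includes the new feature
# alongside the original ones
X = np.column_stack([length, width, area])
print(X)
```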
2. Feature engineering in Polynomial Regression
We may want to fit a curve instead of a straight line.
We can engineer features by raising x to powers. But now x^2 and x^3 take on very different ranges of values than the original x. This is where feature scaling comes into play: we need to rescale the data for gradient descent to perform well.
Other than x^2 and x^3, there are other choices of features, such as the square root of x.
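A minimal sketch of this idea, using scikit-learn on hypothetical quadratic data: the pipeline engineers the polynomial features x, x^2, x^3, z-score scales them so their ranges are comparable, and then fits a linear model on top.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Hypothetical data: y follows a quadratic curve,
# so a plain straight line would fit poorly
x = np.linspace(0, 10, 50).reshape(-1, 1)
y = 1 + 2 * x.ravel() + 0.5 * x.ravel() ** 2

model = make_pipeline(
    PolynomialFeatures(degree=3, include_bias=False),  # adds x^2, x^3 columns
    StandardScaler(),                                  # rescales each feature
    LinearRegression(),
)
model.fit(x, y)
print(model.predict([[4.0]]))  # should be close to 1 + 2*4 + 0.5*16 = 17
```

Without the StandardScaler step, the x^3 column would dominate the others in magnitude, which is exactly the scaling problem described above.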