[Supervised ML] Feature Engineering - 7
1. What is Feature Engineering?
Feature engineering means using intuition to design new features by transforming or combining the original features.
Example) House price.

If we are given the dimensions of a house, such as its length and width, we can create a new variable to include in the price prediction. For example, we can create an area feature by multiplying length and width.
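A minimal sketch of this idea in Python (the numbers and variable names here are just illustrative):

```python
import numpy as np

# Hypothetical raw features: length and width of each house
length = np.array([20.0, 15.0, 30.0])
width = np.array([10.0, 12.0, 8.0])

# Feature engineering: combine the two original features into a new one
area = length * width

# Train the price model on [length, width, area] instead of [length, width]
X = np.column_stack([length, width, area])
print(X)
```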
2. Feature Engineering in Polynomial Regression
We may want to fit a curve to the data instead of a straight line.

We can use feature engineering by raising x to higher powers, creating features such as x^2 and x^3. But these new features have a very different range of values than the original x, and this is where feature scaling does its job: we need to rescale the data so that gradient descent performs well.
Other than x^2 and x^3, there are other choices of feature, such as sqrt(x).
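As a rough sketch of how this could look with scikit-learn (the data is made up, and SGDRegressor stands in for linear regression trained by gradient descent):

```python
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Hypothetical training data: house size x with a curved price relationship
x = np.arange(1, 21, dtype=float).reshape(-1, 1)
y = x.ravel() ** 2

# PolynomialFeatures expands x into [x, x^2, x^3]; StandardScaler then
# rescales each column, since x^3 has a far larger range than x and
# would otherwise make gradient descent converge slowly.
model = make_pipeline(
    PolynomialFeatures(degree=3, include_bias=False),
    StandardScaler(),
    SGDRegressor(max_iter=10000, tol=1e-6),  # linear model fit by gradient descent
)
model.fit(x, y)
print(model.predict(np.array([[15.5]])))  # predict for an unseen size
```

Without the StandardScaler step, the x^3 column dominates the gradient updates and SGD may fail to converge, which is exactly the motivation for feature scaling mentioned above.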