
My Progress


Posts tagged "Scikitlearn" (1)


[Supervised ML] Cost function/Gradient descent for logistic regression - 9

1. Cost function

1.1 Intuition

We use logistic regression (the sigmoid function) to estimate a data point's label or category. How do we choose w and b? For linear regression, we used the squared error cost. The squared error cost of linear regression is convex, so standard gradient descent reliably reaches the minimum. For logistic regression, however, the squared error cost is non-convex and has multiple local minima. Thus, we ca..

AI/ML Specialization 2023. 7. 31. 17:14
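The preview cuts off mid-sentence, but the cost it is building toward is the log loss used for logistic regression in the course. Below is a minimal NumPy sketch of that cost, assuming a feature matrix X, labels y in {0, 1}, weights w, and bias b; the function and variable names are illustrative, not taken from the post.

```python
import numpy as np

def sigmoid(z):
    # Sigmoid / logistic function: squashes any real value into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def logistic_cost(X, y, w, b):
    # Log-loss (binary cross-entropy) cost for logistic regression.
    # X: (m, n) features, y: (m,) labels in {0, 1}, w: (n,) weights, b: bias.
    m = X.shape[0]
    f = sigmoid(X @ w + b)                              # model output f_wb(x)
    loss = -(y * np.log(f) + (1 - y) * np.log(1 - f))   # per-example log loss
    return loss.sum() / m                               # average over m examples

# Tiny illustrative usage with made-up data:
X = np.array([[0.5], [1.5], [2.5]])
y = np.array([0, 0, 1])
print(logistic_cost(X, y, w=np.array([1.0]), b=-2.0))
```

Unlike the squared error cost mentioned in the excerpt, this log-loss cost is convex in w and b, so gradient descent converges to the global minimum.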

