My Progress



List: Supervised Learning (1)


[Supervised ML] Cost function/Gradient descent for logistic regression - 9

1. Cost function
1.1 Intuition
We use logistic regression / the sigmoid function to estimate a data point's label or category. How do we choose w and b? For linear regression, we used the squared error cost. That cost is convex for linear regression, so the standard gradient descent update finds the minimum. For logistic regression, however, the squared error cost is non-convex and has multiple local minima. Thus, we ca..

AI/ML Specialization 2023. 7. 31. 17:14
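The excerpt contrasts the squared error cost, which becomes non-convex once composed with the sigmoid, with the cost actually used for logistic regression (the standard log loss / binary cross-entropy). Here is a minimal NumPy sketch comparing the two costs on a tiny made-up dataset; the function names and data are illustrative assumptions, not taken from the post.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def squared_error_cost(X, y, w, b):
    # Squared error applied to the sigmoid output:
    # non-convex in (w, b), so gradient descent can stall in a local minimum.
    f = sigmoid(X @ w + b)
    return np.mean((f - y) ** 2) / 2

def logistic_cost(X, y, w, b):
    # Standard log loss (binary cross-entropy):
    # convex in (w, b), so gradient descent converges to the global minimum.
    f = sigmoid(X @ w + b)
    eps = 1e-15  # guard against log(0)
    return -np.mean(y * np.log(f + eps) + (1 - y) * np.log(1 - f + eps))

# Hypothetical 1-D example, just for illustration
X = np.array([[0.5], [1.5], [2.5], [3.5]])
y = np.array([0, 0, 1, 1])
w = np.array([1.0]); b = -2.0
print(squared_error_cost(X, y, w, b), logistic_cost(X, y, w, b))

Both functions evaluate the model f = sigmoid(w·x + b); only the loss applied to f differs, which is the point the excerpt is building toward.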
