My Progress

Posts tagged "supervised ml" (1)

[Supervised ML] Gradient descent / Learning Rate - 6

1. Checking gradient descent for convergence: how do you check whether gradient descent is working well? 1.1 Graph: this plot is called the learning curve. As the number of iterations increases, the cost should decrease. If the curve stays flat after enough iterations, we say gradient descent has converged. 1.2 Epsilon / automatic convergence test: let epsilon be 0.001. If the cost function decreases by less than epsilon in one..
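The automatic convergence test described in the excerpt can be sketched as follows. This is a minimal illustration on synthetic linear-regression data, not code from the post; the function name, learning rate, and data are assumptions.

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, epsilon=1e-3, max_iters=10000):
    """Batch gradient descent for linear regression with an automatic
    convergence test: stop once the cost drops by less than epsilon
    in a single iteration (hypothetical helper, for illustration)."""
    m, n = X.shape
    w = np.zeros(n)
    b = 0.0
    history = []  # cost per iteration; plotting this gives the learning curve
    for i in range(max_iters):
        err = X @ w + b - y
        cost = (err @ err) / (2 * m)
        history.append(cost)
        # Automatic convergence test: cost decreased by less than epsilon
        if i > 0 and history[-2] - history[-1] < epsilon:
            break
        w -= alpha * (X.T @ err) / m
        b -= alpha * err.mean()
    return w, b, history

# Synthetic data: y = 3x + 2 (assumed example, not from the post)
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(100, 1))
y = 3 * X[:, 0] + 2
w, b, history = gradient_descent(X, y, alpha=0.5, epsilon=1e-6)
```

With a suitably small learning rate the recorded costs decrease monotonically, so the learning curve flattens out and the epsilon test fires; a learning rate that is too large would instead make the costs oscillate or grow.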

AI/ML Specialization 2023. 7. 28. 13:35

Blog is powered by kakao / Designed by Tistory
