Optimizer

  • SGD: Stochastic Gradient Descent. Updates the parameters in small steps along the negative gradient. It is the baseline method, but when the gradient direction differs across dimensions the search path zigzags inefficiently, and it tends to converge much more slowly than the optimizers that came after it.
  • Momentum: adds a velocity variable to SGD, so the search accelerates along directions where the gradient stays consistent.
  • AdaGrad: adapts the learning rate for each parameter individually. Updates start large and shrink as squared gradients accumulate, so learning is fast early on and gradually slows down.
  • RMSProp: uses an exponential moving average (EMA) of squared gradients, so recent gradient information is weighted more heavily than the early gradients and the effective learning rate does not decay all the way to zero.
  • Adam: combines the strengths of Momentum and AdaGrad-style per-parameter scaling, and applies bias correction to its moment estimates.
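
The update rules above can be sketched in plain Python. This is a minimal illustration, not a framework implementation; the parameter names (`lr`, `beta`, `rho`, `eps`) and their default values are common conventions, not taken from the original post.

```python
import math

def sgd(w, grad, lr=0.01):
    # Vanilla SGD: step directly along the negative gradient.
    return w - lr * grad

def momentum(w, grad, v, lr=0.01, beta=0.9):
    # Momentum: accumulate a velocity term that accelerates consistent directions.
    v = beta * v - lr * grad
    return w + v, v

def adagrad(w, grad, h, lr=0.01, eps=1e-8):
    # AdaGrad: per-parameter step size shrinks as squared gradients accumulate in h.
    h = h + grad ** 2
    return w - lr * grad / (math.sqrt(h) + eps), h

def rmsprop(w, grad, h, lr=0.01, rho=0.9, eps=1e-8):
    # RMSProp: exponential moving average of squared gradients,
    # so recent gradients dominate and the step size does not decay to zero.
    h = rho * h + (1 - rho) * grad ** 2
    return w - lr * grad / (math.sqrt(h) + eps), h

def adam(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: momentum (m) plus RMSProp-style scaling (v), with bias correction
    # to compensate for the zero initialization of the moment estimates.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (math.sqrt(v_hat) + eps), m, v
```

For example, minimizing f(w) = (w - 3)^2 (gradient 2(w - 3)) with any of these rules drives w toward 3; the visualizations linked below animate exactly this kind of trajectory on harder loss surfaces.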

References and visualization sources

https://github.com/Jaewan-Yun/optimizer-visualization

https://emiliendupont.github.io/2018/01/24/optimization-visualization/
