almost stochastic

annals of computational statistics


2020/11/11

Convergence rate of gradient descent for convex functions

Suppose that, given a convex function $f: \mathbb{R}^d \to \mathbb{R}$, we would like to find its minimum by iterating \begin{align*} \theta_t = \theta_{t-1} - \gamma \nabla f(\theta_{t-1}). \end{align*} How fast do we converge to the minimum of $f$?
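The update rule above can be sketched in a few lines. This is a minimal illustration, not code from the post: the quadratic objective $f(\theta) = \tfrac{1}{2}\|\theta\|^2$ (whose gradient is $\theta$ itself), the step size `gamma`, and the iteration count are all assumptions chosen for the example.

```python
import numpy as np

def gradient_descent(grad, theta0, gamma=0.1, n_steps=200):
    """Iterate theta_t = theta_{t-1} - gamma * grad(theta_{t-1})."""
    theta = theta0
    for _ in range(n_steps):
        theta = theta - gamma * grad(theta)
    return theta

# Toy convex objective: f(theta) = 0.5 * ||theta||^2, minimized at the origin.
grad_f = lambda theta: theta

theta_star = gradient_descent(grad_f, theta0=np.ones(3), gamma=0.1, n_steps=200)
print(np.linalg.norm(theta_star))  # close to 0
```

For this quadratic the iterates contract geometrically, $\theta_t = (1 - \gamma)^t \theta_0$, so after 200 steps with $\gamma = 0.1$ the iterate is essentially at the minimizer.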

author: Deniz on Wednesday, November 11, 2020
categories: optimization





links

  • Francis Bach's Blog
  • I'm a bandit
  • Sam Power's Bluesky
  • Tricki
  • Xi'an's Og
 
"There are some enterprises in which a careful disorderliness is the true method." (Moby Dick, chapter 82)
by Ömer Deniz Akyıldız.