
Incremental (Stochastic) Gradient Descent



Batch mode Gradient Descent:

Do until satisfied

1.
Compute the gradient $\nabla E_{D}[\vec{w}]$
2.
$\vec{w} \leftarrow \vec{w} -\eta \nabla E_{D}[\vec{w}] $
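The batch update above can be sketched as follows for a linear unit, where the gradient of $E_{D}$ is computed over the whole training set before each weight update. The function name, the toy interface, and the default learning rate are illustrative assumptions, not part of the original.

```python
import numpy as np

def batch_gradient_descent(X, t, eta=0.01, epochs=100):
    """Batch-mode gradient descent for a linear unit o_d = x_d . w.

    Minimizes E_D[w] = 1/2 * sum_{d in D} (t_d - o_d)^2 by computing
    the gradient over ALL examples, then taking one step.
    (Illustrative sketch; eta and epochs are assumed defaults.)
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        o = X @ w                 # outputs on every example in D
        grad = -(X.T @ (t - o))   # nabla E_D[w] for squared error
        w -= eta * grad           # w <- w - eta * nabla E_D[w]
    return w
```

Note that each pass over the data produces exactly one weight update, in contrast to the incremental version below.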




Incremental mode Gradient Descent:

Do until satisfied

For each training example d in D

1.
Compute the gradient $\nabla E_{d}[\vec{w}]$
2.
$\vec{w} \leftarrow \vec{w} -\eta \nabla E_{d}[\vec{w}] $

where

\begin{displaymath}E_{D}[\vec{w}] \equiv \frac{1}{2}\sum_{d \in D}(t_{d} - o_{d})^{2} \end{displaymath}


\begin{displaymath}E_{d}[\vec{w}] \equiv \frac{1}{2}(t_{d} - o_{d})^{2} \end{displaymath}

Incremental Gradient Descent can approximate Batch Gradient Descent arbitrarily closely if $\eta$ is made small enough
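The incremental update can be sketched as follows: the weights are adjusted after every single example, using the per-example error $E_{d}$ rather than the sum $E_{D}$. As above, the function name and defaults are illustrative assumptions.

```python
import numpy as np

def incremental_gradient_descent(X, t, eta=0.01, epochs=100):
    """Incremental (stochastic) gradient descent for a linear unit.

    Updates w after EACH example d, descending the per-example error
    E_d[w] = 1/2 * (t_d - o_d)^2 instead of the full sum over D.
    (Illustrative sketch; eta and epochs are assumed defaults.)
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_d, t_d in zip(X, t):
            o_d = x_d @ w               # output on one example d
            grad = -(t_d - o_d) * x_d   # nabla E_d[w]
            w -= eta * grad             # update immediately
    return w
```

With a small $\eta$, each per-example step is tiny, so the sequence of incremental updates closely tracks a single batch step, which is the sense in which incremental descent approximates batch descent.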