18 Aug
Convergence in a Neural Network
Convergence is the point in neural-network training at which the weights have settled and the cost function has been driven to (or near) its minimum. Training can, however, be plagued by local minima and slow learning. Mini-batch gradient descent comes in handy here: it combines the advantages of batch gradient descent and stochastic gradient descent. The process […]
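To make the idea concrete, here is a minimal sketch of mini-batch gradient descent on a simple linear-regression cost (mean squared error). The data, function name, and hyperparameters are illustrative assumptions, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))             # 200 samples, 3 features
true_w = np.array([1.5, -2.0, 0.5])       # hypothetical "ideal" weights
y = X @ true_w + 0.1 * rng.normal(size=200)

def minibatch_gd(X, y, lr=0.1, batch_size=32, epochs=100):
    """Illustrative mini-batch gradient descent on the MSE cost."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        idx = rng.permutation(n)          # shuffle samples each epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]
            # gradient of mean squared error over just this batch
            grad = 2 * X[b].T @ (X[b] @ w - y[b]) / len(b)
            w -= lr * grad                # one update per batch
    return w

w = minibatch_gd(X, y)
```

Each epoch makes many small updates (one per batch), which is faster per step than full-batch descent and less noisy than single-sample stochastic descent; the estimated `w` converges close to `true_w`.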