Good for batch gradient descent, where the gradient is fairly stable from one iteration to the next. Less suitable for stochastic descent or mini-batch descent with small batch sizes, as those noisy gradients won't settle near zero even as the weights converge.
Inheritance: implements MCordingley\Regression\Algorithm\GradientDescent\StoppingCriteria\StoppingCriteria
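The idea behind this criterion can be sketched in a few lines: declare convergence once the Euclidean norm of the gradient falls below a threshold. A minimal Python sketch (the function name and structure here are illustrative, not the library's API):

```python
import math

def gradient_norm_converged(gradient, threshold):
    # Converged when the Euclidean (L2) norm of the gradient
    # drops below the threshold: a flat gradient suggests a
    # nearby minimum.
    norm = math.sqrt(sum(g * g for g in gradient))
    return norm < threshold

# [2, 2] has norm sqrt(8) ≈ 2.83, so not converged at threshold 1.0;
# [0.5, 0.5] has norm sqrt(0.5) ≈ 0.71, so converged.
print(gradient_norm_converged([2, 2], 1.0))
print(gradient_norm_converged([0.5, 0.5], 1.0))
```

This mirrors why the criterion suits batch descent: only when the gradient itself is a stable estimate does a small norm reliably indicate convergence.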
Code Example #1
 public function testConverged()
 {
     // Threshold of 1.0: converged once the gradient's Euclidean norm drops below 1.0.
     $criteria = new GradientNorm(1.0);

     // Norm of [2, 2] is sqrt(8) ≈ 2.83, above the threshold.
     static::assertFalse($criteria->converged([2, 2], []));

     // Norm of [0.5, 0.5] is sqrt(0.5) ≈ 0.71, below the threshold.
     static::assertTrue($criteria->converged([0.5, 0.5], []));
 }