SgdOptimizer

class SgdOptimizer @JvmOverloads constructor(lr: Double, momentum: Double = 0.0, weightDecay: Double = 0.0) : Optimizer

Stochastic Gradient Descent optimizer with optional momentum and weight decay.

Constructors

constructor(lr: Double, momentum: Double = 0.0, weightDecay: Double = 0.0)

Functions

open override fun addParameter(param: ModuleParameter<*, *>, applyWeightDecay: Boolean)

Register a raw module parameter to be optimized.

open override fun addParameter(param: Parameter, applyWeightDecay: Boolean)

Register a parameter to be optimized.

open override fun step()

Perform one optimization step, updating all registered parameters in place (reassigning their tensor values where needed).
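
For reference, the update that step applies can be sketched as follows. This is a standalone illustration of the standard SGD-with-momentum-and-weight-decay rule on plain arrays, not the library's internal implementation; the function and parameter names here are illustrative only.

```kotlin
// Sketch of one SGD step with momentum and weight decay, applied element-wise.
// Names (sgdStep, velocity) are illustrative, not part of this library's API.
fun sgdStep(
    param: DoubleArray,
    grad: DoubleArray,
    velocity: DoubleArray,
    lr: Double,
    momentum: Double = 0.0,
    weightDecay: Double = 0.0,
) {
    for (i in param.indices) {
        // Weight decay adds an L2 penalty term to the gradient.
        val g = grad[i] + weightDecay * param[i]
        // Momentum accumulates an exponentially decaying sum of past gradients.
        velocity[i] = momentum * velocity[i] + g
        // In-place parameter update.
        param[i] -= lr * velocity[i]
    }
}
```

With momentum = 0.0 and weightDecay = 0.0 this reduces to plain SGD: param -= lr * grad.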

open override fun zeroGrad()

Zero accumulated gradients on all registered parameters.
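
A typical training loop with this optimizer might look like the following sketch. Everything outside the SgdOptimizer calls (`model`, `computeLoss`, `loss.backward()`, `data`) is a placeholder for the surrounding training API, which this page does not document, and is an assumption here.

```kotlin
// Hypothetical usage sketch; model, computeLoss, loss.backward(), and data are
// placeholders for the surrounding training API, not documented on this page.
val optimizer = SgdOptimizer(lr = 0.01, momentum = 0.9, weightDecay = 1e-4)
model.parameters().forEach { p -> optimizer.addParameter(p, applyWeightDecay = true) }

for (batch in data) {
    optimizer.zeroGrad()                       // clear gradients from the previous step
    val loss = computeLoss(model, batch)
    loss.backward()                            // populate gradients on registered parameters
    optimizer.step()                           // apply the SGD update in place
}
```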