Property AdamAlpha

Namespace: LMKit.Finetuning
Assembly: LM-Kit.NET.dll


Gets or sets the learning rate (alpha) for the Adam optimizer. This parameter controls how much the model weights are adjusted with respect to the loss gradient.

public float AdamAlpha { get; set; }

Property Value

float

The default value is 0.001.

Remarks

The learning rate is a crucial hyperparameter in gradient-based optimization algorithms such as Adam. It determines the step size taken at each iteration while moving toward a minimum of the loss function. Smaller values lead to slower but more stable convergence, whereas larger values can speed up training but may cause the loss to oscillate or diverge. In Adam specifically, alpha effectively bounds the magnitude of each per-parameter update, because the raw gradient is normalized by a running estimate of its second moment.
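
To make the role of alpha concrete, the following self-contained sketch performs a single Adam update for one parameter. Nothing in it is part of the LM-Kit.NET API; it only illustrates how alpha scales each step.

using System;

// Illustrative single-parameter Adam update (not LM-Kit.NET code).
// alpha scales the final step: halving alpha halves every update.
static float AdamStep(float w, float grad, ref float m, ref float v, int t,
                      float alpha = 0.001f, float beta1 = 0.9f,
                      float beta2 = 0.999f, float eps = 1e-8f)
{
    m = beta1 * m + (1 - beta1) * grad;          // first-moment estimate
    v = beta2 * v + (1 - beta2) * grad * grad;   // second-moment estimate
    float mHat = m / (1 - MathF.Pow(beta1, t));  // bias-corrected moments
    float vHat = v / (1 - MathF.Pow(beta2, t));
    return w - alpha * mHat / (MathF.Sqrt(vHat) + eps);
}

float m = 0f, v = 0f;
float w = AdamStep(1.0f, 0.5f, ref m, ref v, 1);
Console.WriteLine(w); // ~0.999: the first step has magnitude close to alpha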
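
Examples

A minimal sketch of adjusting this property before starting a fine-tuning run. The LM and LoraFinetuning types and constructor signatures below are assumptions based on typical LM-Kit.NET usage, not verified API, and the model path is a placeholder.

using LMKit.Finetuning;
using LMKit.Model;

// Assumed setup; adapt the types and paths to your actual project.
var model = new LM("path/to/base-model.gguf");
var finetuning = new LoraFinetuning(model);

// Lower the learning rate from its 0.001 default for more conservative,
// stable weight updates during fine-tuning.
finetuning.AdamAlpha = 0.0001f;

Values in the 1e-5 to 1e-3 range are common starting points when adapting a pre-trained model; increase alpha for faster convergence only if training remains stable.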