Property AdamDecay

Namespace: LMKit.Finetuning
Assembly: LM-Kit.NET.dll

AdamDecay

Gets or sets the AdamW optimizer's weight decay factor. This parameter controls how strongly weights are shrunk toward zero at each update step, acting as a regularizer during training.

public float AdamDecay { get; set; }

Property Value

float

The default value is 0.1f. A value greater than zero activates the AdamW optimizer.
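
The sketch below shows how this property might be configured before starting a finetuning run. It is a minimal illustration only: the LoraFinetuning type and the way it is obtained are assumptions not confirmed by this page; only the AdamDecay property itself is documented here.

using LMKit.Finetuning;

static void ConfigureOptimizer(LoraFinetuning finetuning) // LoraFinetuning is assumed for illustration
{
    // Default is 0.1f; any value greater than zero enables the AdamW optimizer.
    finetuning.AdamDecay = 0.05f;

    // Setting it to 0f keeps the plain Adam optimizer (no weight decay).
    // finetuning.AdamDecay = 0f;
}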

Remarks

Setting this parameter to a value greater than zero switches the optimizer from Adam to AdamW, which decouples weight decay from the gradient update, helping to regularize the model and potentially improve generalization.
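
For intuition, the difference between the two update rules can be sketched as below. This is a conceptual illustration of Adam versus AdamW, not LM-Kit's internal code; AdamStep is a placeholder for the bias-corrected moment update, and decay corresponds to AdamDecay.

// Placeholder for the bias-corrected Adam moment update (illustration only).
static float AdamStep(float grad) => grad;

// Adam with an L2 penalty folds the decay term into the gradient:
static float AdamUpdate(float w, float grad, float lr, float decay)
    => w - lr * AdamStep(grad + decay * w);

// AdamW decouples the decay and applies it directly to the weight:
static float AdamWUpdate(float w, float grad, float lr, float decay)
    => w - lr * (AdamStep(grad) + decay * w);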