Property MaximumContextLength

Namespace
LMKit.TextAnalysis
Assembly
LM-Kit.NET.dll

MaximumContextLength

Gets or sets the maximum context length (in tokens) that can be used for the model input. Reducing this value can dramatically increase inference speed on CPUs, as the computation scales with context length. However, this comes at the cost of higher perplexity, which may reduce the quality of model outputs. The value is clamped to the model's inherent maximum context length.

public int MaximumContextLength { get; set; }

Property Value

int

The default context size is determined automatically at runtime based on hardware capabilities and the model's constraints; it commonly ranges from 2048 to 8192 tokens, though this range is not guaranteed.

Examples

extractor.MaximumContextLength = 2048;  // Reduce the maximum context length to speed up CPU inference
Console.WriteLine("Current Max Context Length: " + extractor.MaximumContextLength);
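Because the value is clamped to the model's inherent maximum, assigning a value larger than the model supports silently falls back to that maximum. A minimal sketch of this behavior (the 4096-token limit is a hypothetical model maximum, not a guaranteed value):

// Hypothetical: assumes the loaded model supports at most 4096 tokens of context.
extractor.MaximumContextLength = 100000;           // Request more than the model allows
Console.WriteLine(extractor.MaximumContextLength); // Reads back the clamped value (e.g. 4096)

Reading the property back after assignment is therefore the reliable way to discover the effective context length in use.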