Property MaximumContextLength

Namespace: LMKit.TextGeneration
Assembly: LM-Kit.NET.dll

MaximumContextLength

Gets or sets the maximum context length (in tokens) that can be used for the model input. Reducing this value can dramatically increase inference speed on CPUs, but may reduce output quality.

public int MaximumContextLength { get; set; }

Property Value

int

The default context size is determined automatically at runtime based on hardware capabilities and the model's constraints; it commonly ranges from 2048 to 8192 tokens, though this range is not guaranteed.

Examples

// Dynamically adjust the maximum context length based on user preference.
var model = new LMKit.Model.LM("my-model.gguf");
var summarizer = new LMKit.TextGeneration.Summarizer(model);

// Prompt the user for a desired context length.
Console.Write("Enter desired maximum context length: ");

// Apply the value only if the input is a positive integer;
// otherwise the runtime-determined default is kept.
if (int.TryParse(Console.ReadLine(), out int contextLength) && contextLength > 0)
{
    summarizer.MaximumContextLength = contextLength;
}

Console.WriteLine("Effective Maximum Context Length: " + summarizer.MaximumContextLength);