Property MaximumContextLength

Namespace
LMKit.Extraction
Assembly
LM-Kit.NET.dll

MaximumContextLength

Gets or sets the maximum context length (in tokens) allowed for the language model during text extraction.

public int MaximumContextLength { get; set; }

Property Value

int

An int representing the maximum number of tokens the language model is permitted to handle during extraction.

Examples

// Create a TextExtraction instance with a previously loaded LM model (myLM)
TextExtraction textExtraction = new TextExtraction(myLM);

// Override the default context length if you want a specific limit
textExtraction.MaximumContextLength = 4096;

// Proceed with extraction
textExtraction.SetContent("Your very long text to analyze...");
TextExtractionResult result = textExtraction.Parse();

// The extraction process will not exceed the defined maximum context length

Remarks

By default, this value is automatically determined at runtime based on your hardware capabilities, the model's constraints, and internal minimum requirements. Modern language models commonly support context windows in the range of 2,048 to 8,192 tokens or more, but the usable size varies with the specific model and the available hardware resources.

When a value is assigned to this property, it is clamped to ensure it does not exceed ContextLength (the model's maximum supported context size) and does not fall below MinContextSize (the library's internally defined minimum context size). Thus, attempting to set a value larger than the model supports or smaller than the library minimum will automatically be adjusted to these boundaries.
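The clamping rule can be sketched as follows. This is an illustrative snippet, not LM-Kit code: the minContextSize and modelContextLength parameters stand in for the library's MinContextSize and the model's ContextLength, and the sample values in the comments are assumptions for demonstration only.

```csharp
// Illustrative only: how an assigned value is bounded, per the rule above.
// minContextSize and modelContextLength stand in for the library's internal
// MinContextSize and the model's ContextLength, respectively.
static int ApplyBounds(int requested, int minContextSize, int modelContextLength)
{
    return Math.Clamp(requested, minContextSize, modelContextLength);
}

// With hypothetical boundaries minContextSize = 512, modelContextLength = 8192:
//   ApplyBounds(100000, 512, 8192) is adjusted down to 8192
//   ApplyBounds(16, 512, 8192)     is adjusted up to 512
//   ApplyBounds(4096, 512, 8192)   is kept as 4096
```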

This property controls how much text (in tokens) the language model can consider at once. If your content is significantly longer than the maximum context length, multiple parse calls or a chunking strategy may be required to process all of it.
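One possible chunking loop is sketched below. This is not an official LM-Kit recipe: it reuses the textExtraction instance from the Examples section, assumes a long input string longText, and splits on character count as a rough proxy for token count (chunkChars is a hypothetical value you would tune to your model and content).

```csharp
// Illustrative sketch: process text longer than the context window in chunks.
// chunkChars is a hypothetical budget (roughly 3-4 characters per token);
// adjust it so each chunk fits within MaximumContextLength.
const int chunkChars = 12000;

var results = new List<TextExtractionResult>();
for (int offset = 0; offset < longText.Length; offset += chunkChars)
{
    int length = Math.Min(chunkChars, longText.Length - offset);
    textExtraction.SetContent(longText.Substring(offset, length));
    results.Add(textExtraction.Parse());
}
// Merge or post-process the per-chunk results as needed.
```

A production version would split on token boundaries (or sentence boundaries) rather than raw character offsets to avoid cutting words or semantic units in half.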