Class PromptFilterContext
- Namespace: LMKit.TextGeneration.Filters
- Assembly: LM-Kit.NET.dll
Provides contextual information to IPromptFilter implementations.
public sealed class PromptFilterContext
- Inheritance
  - object
  - PromptFilterContext
Examples
Rewriting a prompt inside a filter:
public async Task OnPromptAsync(
    PromptFilterContext context,
    Func<PromptFilterContext, Task> next)
{
    // Append safety instructions to every prompt before inference runs
    context.Prompt += "\n\nRemember: be helpful, harmless, and honest.";
    await next(context);
}
Short-circuiting with a cached result:
public async Task OnPromptAsync(
    PromptFilterContext context,
    Func<PromptFilterContext, Task> next)
{
    var cached = _cache.TryGet(context.Prompt);
    if (cached != null)
    {
        context.Result = cached;
        return; // Skip inference entirely
    }

    // Capture the key now: a downstream filter may rewrite context.Prompt,
    // which would otherwise desynchronize the lookup and store keys.
    string cacheKey = context.Prompt;
    await next(context);

    // Store the result for next time. Result can still be null if a later
    // filter short-circuited without producing one, so guard before caching.
    if (context.Result != null)
    {
        _cache.Set(cacheKey, context.Result);
    }
}
Remarks
The context is created once per submission and flows through every registered prompt filter. Filters can modify Prompt to rewrite the user input before inference, or set Result to a non-null value to short-circuit inference entirely (e.g., for caching).
The Properties dictionary carries arbitrary state between filters in the same pipeline invocation. For example, a prompt filter can store a cache key that a later ICompletionFilter reads to populate the cache after inference.
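The hand-off described above might look like the following sketch. The `CacheKey` entry name, the `ComputeKey` and `_cache` members, and the `OnCompletionAsync`/`CompletionFilterContext` shape of the completion-filter callback are illustrative assumptions, not part of the documented API:

```csharp
// Prompt filter: stash a cache key for a later completion filter to read.
public async Task OnPromptAsync(
    PromptFilterContext context,
    Func<PromptFilterContext, Task> next)
{
    // "CacheKey" and ComputeKey are hypothetical; any key scheme works
    context.Properties["CacheKey"] = ComputeKey(context.Prompt);
    await next(context);
}

// Completion filter (hypothetical signature): populate the cache after inference.
public async Task OnCompletionAsync(
    CompletionFilterContext context,
    Func<CompletionFilterContext, Task> next)
{
    await next(context);

    // Read the key stored by the prompt filter in the same pipeline invocation
    if (context.Properties.TryGetValue("CacheKey", out var key) && context.Result != null)
    {
        _cache.Set((string)key, context.Result);
    }
}
```

Because both filters run in the same pipeline invocation, they share the same Properties dictionary, which is what makes this hand-off possible without external state.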
Properties
- CancellationToken
Gets the cancellation token for this operation.
- ChatHistory
Gets the full conversation history at the time of submission.
- IsToolResponse
Gets a value indicating whether this submission is a tool-result re-submission (i.e., the model is receiving tool call results, not a fresh user message).
- Prompt
Gets or sets the user prompt that will be submitted for inference.
- Properties
Gets the properties dictionary for passing arbitrary state between filters.
- Result
Gets or sets the inference result.
- SystemPrompt
Gets the system prompt configured on the conversation.
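IsToolResponse is useful for filters that should only act on fresh user input. A minimal sketch, assuming the `Sanitize` helper is your own code:

```csharp
public async Task OnPromptAsync(
    PromptFilterContext context,
    Func<PromptFilterContext, Task> next)
{
    // Tool-result re-submissions carry model-requested data, not user text,
    // so rewriting them would corrupt the tool-calling exchange.
    if (!context.IsToolResponse)
    {
        context.Prompt = Sanitize(context.Prompt); // hypothetical helper
    }
    await next(context);
}
```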