Method FitParameters

Namespace
LMKit.Hardware
Assembly
LM-Kit.NET.dll

FitParameters(string, uint, uint, DeviceConfiguration)

Fits model and context parameters to available device memory, determining the optimal context size and GPU layer count that can be allocated without running out of memory.

public static MemoryEstimation.FitResult FitParameters(string modelPath, uint contextSize = 0, uint minimumContextSize = 2048, LM.DeviceConfiguration deviceConfiguration = null)

Parameters

modelPath string

The file path to the model.

contextSize uint

The desired context size (in tokens). Set to 0 to let the fitting process determine the maximum context size that fits. If non-zero, the fitter will attempt to use this exact size and only reduce it if necessary.

minimumContextSize uint

The minimum acceptable context size (in tokens). The fitting process will not reduce the context size below this value. Defaults to 2048.

deviceConfiguration LM.DeviceConfiguration

Optional device configuration specifying GPU preferences (main GPU, layer count, tensor distribution). If null, the system default configuration is used.

Returns

MemoryEstimation.FitResult

A MemoryEstimation.FitResult indicating whether the fitting succeeded and the resulting context size and GPU layer count.

Examples

// Check if a model fits with a specific context size
var result = MemoryEstimation.FitParameters("model.gguf", contextSize: 8192);
if (result.Success)
{
    Console.WriteLine($"Fits with context={result.ContextSize}, GPU layers={result.GpuLayerCount}");
}

// Find the maximum context size that fits
var maxResult = MemoryEstimation.FitParameters("model.gguf", contextSize: 0);
Console.WriteLine($"Maximum context size: {maxResult.ContextSize}");
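// The optional deviceConfiguration and minimumContextSize parameters can be
// combined to constrain the fit. The sketch below assumes a DeviceConfiguration
// with a settable main-GPU property (the name MainGpu is an assumption; check
// the LM.DeviceConfiguration reference for the exact member).
var config = new LM.DeviceConfiguration { MainGpu = 0 };

var fit = MemoryEstimation.FitParameters(
    "model.gguf",
    contextSize: 0,           // let the fitter maximize the context size
    minimumContextSize: 4096, // but report failure rather than drop below 4,096 tokens
    deviceConfiguration: config);

if (!fit.Success)
{
    Console.WriteLine("Model does not fit with at least a 4096-token context.");
}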

Exceptions

ArgumentNullException

Thrown when modelPath is null or empty.

FileNotFoundException

Thrown when the model file does not exist at the specified path.

InvalidDataException

Thrown when the file is not a valid GGUF model.

FitParameters(LM, uint, uint)

Fits model and context parameters to available device memory using a loaded model instance.

public static MemoryEstimation.FitResult FitParameters(LM model, uint contextSize = 0, uint minimumContextSize = 2048)

Parameters

model LM

The loaded model instance.

contextSize uint

The desired context size (in tokens). Set to 0 to let the fitting process determine the maximum context size that fits.

minimumContextSize uint

The minimum acceptable context size (in tokens). Defaults to 2048.

Returns

MemoryEstimation.FitResult

A MemoryEstimation.FitResult indicating whether the fitting succeeded and the resulting context size and GPU layer count.

Examples

LM model = LM.LoadFromModelID("gemma3:12b");

var result = MemoryEstimation.FitParameters(model, contextSize: 16384);
if (result.Success)
{
    Console.WriteLine($"Context {result.ContextSize} fits with {result.GpuLayerCount} GPU layers");
}
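// When the fit fails, a common pattern is to report the shortfall and fall back
// to a smaller or more heavily quantized model. A sketch using only the
// FitResult members documented on this page (Success, ContextSize, GpuLayerCount):
var fit = MemoryEstimation.FitParameters(model, contextSize: 32768, minimumContextSize: 2048);
if (fit.Success)
{
    // Use the fitted values when configuring the inference context.
    Console.WriteLine($"Using context={fit.ContextSize}, gpuLayers={fit.GpuLayerCount}");
}
else
{
    // Even the 2048-token minimum did not fit in device memory.
    Console.WriteLine("Requested parameters do not fit; consider a smaller model.");
}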

Exceptions

ArgumentNullException

Thrown when model is null.
