What .NET Frameworks and Integrations Does LM-Kit.NET Support?


TL;DR

LM-Kit.NET targets .NET Standard 2.0, which means it works in virtually any .NET project: console apps, ASP.NET Core, MAUI, WPF, WinForms, Blazor Server, Windows Services, and more. It also ships official integration packages for Microsoft Semantic Kernel and Microsoft.Extensions.AI, so you can plug local inference into existing AI pipelines with minimal code changes.


.NET Framework Compatibility

| Framework / Platform | Compatible | Notes |
| --- | --- | --- |
| .NET 8.0, 9.0, 10.0 | Yes | First-class support. Recommended. |
| .NET Standard 2.0 | Yes | Broadest compatibility layer. |
| .NET Framework 4.6.1+ | Yes | Via .NET Standard 2.0 compatibility. |
| ASP.NET Core | Yes | Use for AI-powered web APIs and services. |
| .NET MAUI | Yes | Cross-platform mobile and desktop apps. See LM-Kit Maestro for a reference implementation. |
| WPF / WinForms | Yes | Windows desktop applications. |
| Blazor Server | Yes | Server-side Blazor with local inference. |
| Worker Services | Yes | Background processing, queue consumers, scheduled tasks. |
| Unity | Possible | Via .NET Standard 2.0; requires managing native binaries manually. |

Microsoft AI Ecosystem Integrations

LM-Kit.NET provides two official bridge packages that let you use local LM-Kit models as drop-in replacements for cloud AI services in the Microsoft AI ecosystem:

Microsoft.Extensions.AI

The LM-Kit.NET.Integrations.ExtensionsAI package implements IChatClient, the standard abstraction in the Microsoft.Extensions.AI library:

```csharp
using LMKit.Model;
using LMKit.Integrations.ExtensionsAI;
using Microsoft.Extensions.AI;

using LM model = LM.LoadFromModelID("qwen3.5:9b");

// Create an IChatClient backed by local inference
IChatClient client = new LMKitChatClient(model);

// Use the standard Microsoft.Extensions.AI interface
var response = await client.GetResponseAsync("What is retrieval-augmented generation?");
Console.WriteLine(response);
```

This means any code written against IChatClient (including libraries, middleware, and tools from the Microsoft ecosystem) works with LM-Kit.NET without modification.
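Streaming works through the same abstraction. The sketch below reuses the `LMKitChatClient` from the example above and calls `GetStreamingResponseAsync`, the standard streaming method on `IChatClient` in Microsoft.Extensions.AI:

```csharp
using LMKit.Model;
using LMKit.Integrations.ExtensionsAI;
using Microsoft.Extensions.AI;

using LM model = LM.LoadFromModelID("qwen3.5:9b");
IChatClient client = new LMKitChatClient(model);

// Print tokens as they are generated instead of waiting for the full reply
await foreach (ChatResponseUpdate update in
    client.GetStreamingResponseAsync("Explain vector embeddings in one paragraph."))
{
    Console.Write(update.Text);
}
```

Because streaming is part of the `IChatClient` contract, any middleware built on that interface (logging, caching, telemetry) composes with the streamed output unchanged.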

Microsoft Semantic Kernel

The LM-Kit.NET.Integrations.SemanticKernel package implements IChatCompletionService, plugging LM-Kit models into Semantic Kernel pipelines:

```csharp
using LMKit.Model;
using LMKit.Integrations.SemanticKernel;
using Microsoft.SemanticKernel;

using LM model = LM.LoadFromModelID("qwen3.5:9b");

var kernel = Kernel.CreateBuilder()
    .AddLMKitChatCompletion(model)
    .Build();

var result = await kernel.InvokePromptAsync("Summarize the benefits of local AI inference.");
Console.WriteLine(result);
```

Both integrations support streaming, tool/function calling, and configurable sampling parameters.
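As one illustration of tool calling on the Microsoft.Extensions.AI side, the sketch below exposes a local C# method to the model via `AIFunctionFactory` and enables automatic invocation with `UseFunctionInvocation()`, both standard Microsoft.Extensions.AI APIs. The `GetWeather` helper is hypothetical, and whether the model actually calls it depends on the loaded model's function-calling capability:

```csharp
using LMKit.Model;
using LMKit.Integrations.ExtensionsAI;
using Microsoft.Extensions.AI;
using System.ComponentModel;

using LM model = LM.LoadFromModelID("qwen3.5:9b");

// Wrap the client so tool calls emitted by the model are executed automatically
IChatClient client = new LMKitChatClient(model)
    .AsBuilder()
    .UseFunctionInvocation()
    .Build();

// Hypothetical local function exposed to the model as a tool
[Description("Gets the current weather for a city.")]
static string GetWeather(string city) => $"18°C and cloudy in {city}";

var options = new ChatOptions
{
    Tools = [AIFunctionFactory.Create(GetWeather)]
};

var response = await client.GetResponseAsync("What's the weather in Paris?", options);
Console.WriteLine(response);
```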


NuGet Packages

| Package | Purpose |
| --- | --- |
| LM-Kit.NET | Core SDK: inference, RAG, agents, tools, speech, and vision. |
| LM-Kit.NET.Integrations.ExtensionsAI | Microsoft.Extensions.AI bridge (`IChatClient`). |
| LM-Kit.NET.Integrations.SemanticKernel | Semantic Kernel bridge (`IChatCompletionService`). |
| LM-Kit.NET.Data.Connectors.Qdrant | Qdrant vector database connector for RAG. |
| LM-Kit.NET.Backend.Cuda12.* | CUDA 12 GPU backend (platform-specific). |
| LM-Kit.NET.Backend.Cuda13.* | CUDA 13 GPU backend (platform-specific). |
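These install like any other NuGet dependency. For example, from the dotnet CLI (package IDs as listed above):

```
dotnet add package LM-Kit.NET
dotnet add package LM-Kit.NET.Integrations.ExtensionsAI
dotnet add package LM-Kit.NET.Integrations.SemanticKernel
```

The bridge packages are optional; add only the ones matching the abstraction your application already uses.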
