What Is Function Calling and Tool Use in LM-Kit.NET?
TL;DR
Function calling lets AI models invoke .NET methods during a conversation. The model decides when to call a tool, generates the arguments as JSON, and LM-Kit.NET executes the method and feeds the result back to the model. You can create custom tools by implementing the ITool interface or by decorating methods with the [LMFunction] attribute. Tools work with both MultiTurnConversation and Agent.
How Function Calling Works
The execution loop follows four steps:
- Model generates a tool call. Based on the conversation and available tools, the model outputs a structured tool call with a name and JSON arguments.
- SDK parses and validates. LM-Kit.NET extracts the tool call, looks up the tool by name in the registry, and checks permission policies.
- Tool executes. The SDK calls InvokeAsync on the tool with the JSON arguments. The tool runs your .NET code (API calls, calculations, database queries, etc.) and returns a JSON result.
- Result fed back to model. The tool result is added to the conversation, and the model generates its next response (which may include additional tool calls).
This loop repeats until the model produces a final text response with no more tool calls.
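Concretely, each iteration of the loop revolves around a structured call emitted by the model. The exact wire format is internal to the SDK; the shape below is purely illustrative:

```json
{
  "name": "get_weather",
  "arguments": { "location": "Berlin", "units": "metric" }
}
```

The SDK routes this call to the registered tool named `get_weather`, and the JSON string the tool returns is appended to the conversation before the model continues.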
Creating a Custom Tool
Option 1: Implement ITool
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using LMKit.Agents.Tools;

public class WeatherTool : ITool
{
    public string Name => "get_weather";

    public string Description => "Get the current weather for a location.";

    public string InputSchema => """
        {
          "type": "object",
          "properties": {
            "location": { "type": "string", "description": "City name or coordinates" },
            "units": { "type": "string", "enum": ["metric", "us"], "default": "metric" }
          },
          "required": ["location"]
        }
        """;

    public async Task<string> InvokeAsync(string arguments, CancellationToken ct = default)
    {
        var args = JsonSerializer.Deserialize<WeatherArgs>(arguments)!;

        // Call your weather API, database, or any .NET code
        var weather = await GetWeatherFromApi(args.Location, args.Units, ct);
        return JsonSerializer.Serialize(weather);
    }
}
The InputSchema uses JSON Schema (Draft-07 subset) to define what arguments the tool accepts. The model reads this schema to know how to call the tool correctly.
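As a sketch of what happens on the tool side, the JSON arguments the model produces for this schema can be bound to a small DTO with System.Text.Json. `WeatherArgs` here is a hypothetical type matching the schema above; note how the schema's `default` for `units` is mirrored by a C# default parameter value:

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

// Hypothetical DTO matching the InputSchema above.
public record WeatherArgs(
    [property: JsonPropertyName("location")] string Location,
    [property: JsonPropertyName("units")] string Units = "metric");

public static class Demo
{
    public static void Main()
    {
        // Arguments as the model might emit them; "units" is omitted,
        // so the DTO's default value applies.
        var json = """{ "location": "Paris" }""";
        var args = JsonSerializer.Deserialize<WeatherArgs>(json)!;
        Console.WriteLine($"{args.Location} / {args.Units}"); // Paris / metric
    }
}
```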
Option 2: Use the LMFunction Attribute
For simpler tools, annotate a method directly:
using LMKit.Agents.Tools;
public class MyTools
{
    [LMFunction("get_time", "Get current time in a timezone")]
    public string GetTime(string timezone = "UTC")
    {
        var tz = TimeZoneInfo.FindSystemTimeZoneById(timezone);
        return TimeZoneInfo.ConvertTimeFromUtc(DateTime.UtcNow, tz).ToString("F");
    }

    [LMFunction("calculate", "Perform arithmetic on two numbers")]
    public double Calculate(double a, string operation, double b)
    {
        return operation switch
        {
            "+" => a + b,
            "-" => a - b,
            "*" => a * b,
            "/" => b != 0 ? a / b : throw new ArgumentException("Division by zero"),
            _ => throw new ArgumentException($"Unknown operation: {operation}")
        };
    }
}
The SDK automatically generates the input schema from the method signature. Parameter names become JSON property names. Optional parameters (with defaults) are not marked as required.
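For the GetTime method above, the generated schema would look roughly like this (illustrative only; the SDK's exact output may differ in detail):

```json
{
  "type": "object",
  "properties": {
    "timezone": { "type": "string", "default": "UTC" }
  },
  "required": []
}
```

Because `timezone` has a default value in the method signature, it does not appear in `required`, and the model may omit it.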
Registering Tools
With MultiTurnConversation
var chat = new MultiTurnConversation(model);
chat.Tools.Register(new WeatherTool());
chat.Tools.Register(new MyTools()); // Discovers all [LMFunction] methods
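Once registered, the tools are available to the model on its next turn. A minimal end-to-end sketch, assuming MultiTurnConversation exposes a Submit method for sending a user message:

```csharp
var chat = new MultiTurnConversation(model);
chat.Tools.Register(new WeatherTool());

// The model may decide to call get_weather before answering.
var response = chat.Submit("What's the weather like in Berlin right now?");
Console.WriteLine(response);
```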
With Agent
var agent = Agent.CreateBuilder(model)
    .WithTools(tools =>
    {
        tools.Register(new WeatherTool());
        tools.Register(new MyTools());
    })
    .Build();
From MCP Servers
var mcpClient = new McpClient("http://localhost:8080");
await mcpClient.ConnectAsync();
chat.Tools.Register(mcpClient); // Register all tools from the MCP server
Tool Events
Monitor and control tool execution with events:
chat.BeforeToolInvocation += (sender, e) =>
{
    Console.WriteLine($"Calling tool: {e.ToolName} with args: {e.Arguments}");
    // e.Cancel = true; // Optionally cancel the call
};

chat.AfterToolInvocation += (sender, e) =>
{
    Console.WriteLine($"Tool result: {e.Result}");
};

chat.ToolApprovalRequired += (sender, e) =>
{
    // Ask the user for approval before executing
    Console.Write($"Allow {e.ToolName}? (y/n): ");
    e.IsApproved = Console.ReadLine()?.Trim().ToLower() == "y";
};
Tools Are Fully Async
All tools implement Task<string> InvokeAsync(...), supporting long-running operations like HTTP requests, database queries, and file processing. The CancellationToken parameter allows cooperative cancellation.
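As an illustration, a tool that wraps an HTTP request can propagate the token all the way down to the network call. This is a sketch assuming the ITool interface shown earlier; the endpoint URL is a placeholder:

```csharp
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using LMKit.Agents.Tools;

public class HttpLookupTool : ITool
{
    private static readonly HttpClient Http = new();

    public string Name => "lookup";
    public string Description => "Fetch a resource by id from an internal service.";
    public string InputSchema => """
        {
          "type": "object",
          "properties": { "id": { "type": "string" } },
          "required": ["id"]
        }
        """;

    public async Task<string> InvokeAsync(string arguments, CancellationToken ct = default)
    {
        // Cancellation requested by the host flows into the HTTP call,
        // so a long-running request is abandoned promptly.
        using var response = await Http.GetAsync(
            "https://internal.example/api/resource", ct); // placeholder URL
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync(ct);
    }
}
```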
📚 Related Content
- What built-in tools does LM-Kit.NET provide for AI agents?: The constantly growing catalog of ready-to-use tools.
- What is the difference between an AI agent and a chatbot?: When to use tools with conversations vs full agent orchestration.
- Build a Function-Calling Agent: Step-by-step guide for building agents that use tools.
- Connect to MCP Servers from Your Application: Import tools from external MCP-compatible services.