Class LoraFinetuning

Namespace
LMKit.Finetuning
Assembly
LM-Kit.NET.dll

Provides an engine specifically designed to fine-tune existing models using the Low-Rank Adaptation (LoRA) technique.

public sealed class LoraFinetuning : IDisposable
Inheritance
object → LoraFinetuning

Implements
IDisposable

Remarks

The fine-tuning process involves training an existing model using a training dataset, enabling the adaptation of pre-trained models to specific tasks or data.
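
As a rough orientation, a minimal end-to-end sketch follows. The member names are taken from this page; the file paths, the FinetuningIntent.Chat value, and the sample delimiter are illustrative assumptions, not verified values.

using System.Text;
using LMKit.Finetuning;

// Hypothetical model path and intent value, for illustration only.
using var finetuning = new LoraFinetuning("model.gguf", FinetuningIntent.Chat);

// Load delimiter-separated training samples from a plain text file
// (delimiter semantics assumed from the LoadTrainingDataFromText description).
finetuning.LoadTrainingDataFromText("samples.txt", "<sample>", Encoding.UTF8);

// Configure a couple of training hyperparameters (arbitrary example values).
finetuning.BatchSize = 8;
finetuning.Iterations = 16;

// Produce a LoRA adapter at a hypothetical output path.
finetuning.Finetune2Lora("adapter.lora");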

Constructors

LoraFinetuning(LLM, FinetuningIntent)

Initializes a new instance of the LoraFinetuning class using a specified model.

LoraFinetuning(string, FinetuningIntent)

Initializes a new instance of the LoraFinetuning class using specified paths.
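
A brief sketch of both overloads; the model path and the FinetuningIntent.Chat value are hypothetical, and the way the LLM instance is obtained is an assumption.

using LMKit.Finetuning;

// From a model file path (path and intent value are hypothetical).
using var fromPath = new LoraFinetuning("model.gguf", FinetuningIntent.Chat);

// From an already-loaded model instance, assuming an LLM can be
// constructed from a model path.
var model = new LLM("model.gguf");
using var fromModel = new LoraFinetuning(model, FinetuningIntent.Chat);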

Properties

BatchSize

Gets or sets the size of the batches used for parallel training.

ContextSize

Gets or sets the context size used during model training.

EnableSampleProcessing

Gets or sets a value indicating whether imported samples should be pre-processed by the engine. When enabled, preprocessing optimizes prompt formatting.

Iterations

Gets or sets the number of iterations that the Adam optimization algorithm performs on each training batch.
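
These properties can be set before training starts; a short sketch with arbitrary example values, not recommended defaults.

finetuning.BatchSize = 8;                  // samples trained in parallel
finetuning.ContextSize = 2048;             // context window used during training
finetuning.EnableSampleProcessing = true;  // let the engine optimize prompt formatting
finetuning.Iterations = 16;                // Adam iterations per batch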

LoraTrainingParameters

Provides a reference to the parameters used for training and fine-tuning AI models with the Low-Rank Adaptation (LoRA) approach.

SampleAvgLength

Gets the average length, in tokens, of the samples within the training data.

SampleCount

Gets the number of training samples derived from the training data.

SampleMaxLength

Gets the length, in tokens, of the longest sample within the training data.

SampleMinLength

Gets the length, in tokens, of the shortest sample within the training data.
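
These read-only statistics can help validate a dataset before training; a small sketch, assuming training data has already been loaded into a LoraFinetuning instance.

using System;

Console.WriteLine($"Samples:    {finetuning.SampleCount}");
Console.WriteLine($"Min length: {finetuning.SampleMinLength} tokens");
Console.WriteLine($"Avg length: {finetuning.SampleAvgLength} tokens");
Console.WriteLine($"Max length: {finetuning.SampleMaxLength} tokens");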

ThreadCount

Gets or sets the number of threads to be used for processing.
Ensures that the thread count is always at least 1 to prevent invalid configurations.

TrainingCheckpoint

Gets or sets the file path of the training checkpoint used to resume a LoRA training session.

TrainingSeed

Gets or sets the seed value used for training.
The seed is used to randomize the order of samples during the training process, ensuring varied training sequences.

UseGradientCheckpointing

Determines whether gradient checkpointing is enabled. Gradient checkpointing can reduce memory usage by approximately 50% at the cost of increased runtime. Disabling checkpointing may accelerate fine-tuning if sufficient RAM is available.
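
A sketch of how these options might be combined; the checkpoint path and the specific values are assumptions.

using System;

finetuning.ThreadCount = Environment.ProcessorCount; // clamped to at least 1 by the engine
finetuning.TrainingSeed = 42;                        // fixed seed for reproducible sample shuffling
finetuning.UseGradientCheckpointing = true;          // trade runtime for roughly 50% lower memory usage
finetuning.TrainingCheckpoint = "train.checkpoint";  // hypothetical path to resume a session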

Methods

CheckpointToLora(LLM, string, string)

Converts a LoRA training checkpoint into a LoRA adapter and saves it to the specified path.
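
A sketch of converting a checkpoint; whether the method is static, and the exact meaning of the two string parameters (assumed here to be checkpoint path and output path), are assumptions based on the signature.

// Assumed to be a static method; 'model' is an LLM instance and the
// paths (checkpoint in, adapter out) are hypothetical.
LoraFinetuning.CheckpointToLora(model, "train.checkpoint", "adapter.lora");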

ClearTrainingData()

Clears the training data.

Dispose()

Releases this instance and all associated unmanaged resources.

FilterSamplesBySize(int, int)

Filters the training samples by size, removing those that do not fall within the specified range.
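
For example, assuming the two int parameters are the minimum and maximum sample length in tokens (an assumption based on the description):

// Keep only samples between 16 and 1024 tokens (parameter order assumed).
finetuning.FilterSamplesBySize(16, 1024);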

Finetune2Lora(string)

Runs the fine-tuning process with the configured training parameters and model weights, saving the resulting LoRA adapter to the specified path.

Finetune2Model(string, float, MetadataCollection)

Executes the fine-tuning process and merges the resulting LoRA adapter into a new model.
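
A sketch of both entry points; the output paths are hypothetical, the 1.0f value assumes the float parameter is a LoRA merge scale, and passing null metadata is an assumption.

// Produce a standalone LoRA adapter (output path hypothetical).
finetuning.Finetune2Lora("adapter.lora");

// Or fine-tune and merge the adapter into a new model in one step.
finetuning.Finetune2Model("finetuned-model.gguf", 1.0f, null);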

GetSample(int)

Retrieves a training sample from the dataset based on the provided sample index.

LoadTrainingDataFromChatHistory(ChatHistory)

Loads a training dataset from a ChatHistory object, extracting the conversation data and preparing it for the training process.
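
A sketch, assuming a ChatHistory can be constructed and populated with conversation turns (how ChatHistory is built is not documented on this page):

var history = new ChatHistory();    // construction details are an assumption
// ... populate the history with conversation turns ...
finetuning.LoadTrainingDataFromChatHistory(history);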

LoadTrainingDataFromText(Stream, string, Encoding)

Loads a training dataset from a plain-text stream. Reads training data samples separated by a specified delimiter, processes the data, and prepares it for the training process.

LoadTrainingDataFromText(string, string, Encoding)

Loads a training dataset from a plain text file.
Reads a text file containing training data samples separated by a specified delimiter, processes the data, and prepares it for the training process.
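
A sketch of both overloads; the "<sample>" delimiter is hypothetical, and the parameter order (source, delimiter, encoding) is inferred from the signatures.

using System.IO;
using System.Text;

// From a file path.
finetuning.LoadTrainingDataFromText("samples.txt", "<sample>", Encoding.UTF8);

// From a stream.
using var stream = File.OpenRead("samples.txt");
finetuning.LoadTrainingDataFromText(stream, "<sample>", Encoding.UTF8);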

RemoveSample(int)

Removes a training sample from the dataset based on the provided sample index.
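
For example, together with GetSample, and assuming indices are zero-based (not stated on this page):

// Inspect the first sample, then remove it from the dataset.
var sample = finetuning.GetSample(0);
finetuning.RemoveSample(0);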

SaveTrainingData(string, bool, Encoding, string)

Saves the training data to the specified file path.
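
A sketch; the meaning of the bool parameter (assumed: overwrite) and the trailing string parameter (assumed: sample delimiter) is inferred from the signature, not confirmed.

// Save the current dataset (parameter semantics assumed).
finetuning.SaveTrainingData("samples-backup.txt", true, Encoding.UTF8, "<sample>");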

Events

FinetuningProgress

Occurs when there is progress in the fine-tuning process.
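
Subscribing might look like this; the delegate signature and the event-args members are assumptions, so only the subscription pattern is shown.

finetuning.FinetuningProgress += (sender, e) =>
{
    // Inspect 'e' for progress details (member names not documented here).
    Console.WriteLine("Fine-tuning progress event raised.");
};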