Today, I want to show you how you can use Dependency Injection (DI) with Semantic Kernel. I'm pretty busy with a move right now, so this post will be short and sweet. If you don't know what Semantic Kernel is, here's the short version: it's Microsoft's open-source SDK for building LLM-powered applications in .NET, and its plugin model lets you expose your own native code as functions the model can call.
Here's an example of a native plugin:
using System.ComponentModel;
using Microsoft.SemanticKernel;

public interface IKernelPlugin;

public interface IMyTimeKernelPlugin : IKernelPlugin
{
    DateTimeOffset Time();
}

public class MyTimeKernelPlugin : IMyTimeKernelPlugin
{
    [KernelFunction, Description("Get the current time")]
    public DateTimeOffset Time() => DateTimeOffset.Now;
}
I wrote an extension method, AddSemanticKernel(this IServiceCollection services), that scans the assembly for plugin types and registers those plugins, their supporting services, and Semantic Kernel itself.
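In outline, the method looks roughly like this; the static class name is arbitrary, and each numbered step is filled in throughout the rest of the post:

public static class SemanticKernelExtensions
{
    public static IServiceCollection AddSemanticKernel(this IServiceCollection services)
    {
        // 1. Locate the plugin implementations via reflection (the Scanner below).
        // 2. Register each plugin against its matching interface.
        // 3. Register the Kernel itself, wiring the plugins into it.
        // 4. Register the chat service wrapper (shown at the end of the post).
        return services;
    }
}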
Let's take a look at the assembly scanner.
private static List<Type> Scanner()
{
    var interfaceType = typeof(IKernelPlugin);

    // Find every concrete class in the executing assembly that implements IKernelPlugin
    return Assembly.GetExecutingAssembly().GetTypes().Where(type =>
        type is { IsInterface: false, IsAbstract: false, IsClass: true } &&
        interfaceType.IsAssignableFrom(type)
    ).ToList();
}
My convention is that all of my plugins implement an interface that inherits from IKernelPlugin so they can be located via reflection. Thanks to Mike Tucker for suggesting the IKernelPlugin change.
The types can then be registered as services.
var pluginTypes = Scanner();
var pluginInterfaces = new List<Type>();
pluginTypes.ForEach(type =>
{
    var serviceType = type.GetInterfaces()
        .FirstOrDefault(it => it.Name == $"I{type.Name}");
    if (serviceType == null) return;
    pluginInterfaces.Add(serviceType);
    services.AddSingleton(serviceType, type);
});
For each plugin that was found, the matching interface is located by name (I{TypeName}). I then verify that the interface is not null as a sanity check. The interface type is added to the list, which I will use later when I register the plugins with Semantic Kernel.
The plugins are then registered with dotnet's DI container. This allows the plugins to have injected services such as an HttpClient.
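For example, a plugin that needs to make HTTP calls can just ask for an HttpClient in its constructor. This is a hypothetical sketch (the weather plugin and URL are made up), and it assumes HttpClient is resolvable from the container (for example via services.AddHttpClient()):

public interface IMyWeatherKernelPlugin : IKernelPlugin
{
    Task<string> GetForecastAsync(string city);
}

public class MyWeatherKernelPlugin(HttpClient httpClient) : IMyWeatherKernelPlugin
{
    [KernelFunction, Description("Get the weather forecast for a city")]
    public async Task<string> GetForecastAsync(string city) =>
        // The interesting part isn't the call itself; it's that HttpClient arrives via DI
        await httpClient.GetStringAsync($"https://example.com/forecast?city={Uri.EscapeDataString(city)}");
}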
Next, the kernel is registered.
services.AddTransient<Kernel>(serviceProvider =>
{
    // Allows us to get our config keys
    var configuration = serviceProvider.GetRequiredService<IConfiguration>();
    var kernelBuilder = Kernel.CreateBuilder();
    kernelBuilder.Services.AddLogging(l => l
        .SetMinimumLevel(LogLevel.Trace)
        .AddSemanticKernelLogging()
    );

    // For each of the plugin types, add them to the kernel plugins
    pluginInterfaces.ForEach(type =>
    {
        kernelBuilder.Plugins.AddFromObject(serviceProvider.GetRequiredService(type));
    });

    // If you want to make use of the built-in, experimental web search plugin
    var bingKey = configuration.GetValue<string>("Azure:Bing")!;
#pragma warning disable SKEXP0050
    var webSearchEnginePlugin = new WebSearchEnginePlugin(new BingConnector(bingKey));
#pragma warning restore SKEXP0050
    kernelBuilder.Plugins.AddFromObject(webSearchEnginePlugin, "bing");

    // Get the Azure Configuration
    var options = serviceProvider.GetRequiredService<IOptions<AzureOpenAIConfiguration>>().Value;

    // Register the Azure OpenAI service
    kernelBuilder.Services.AddAzureOpenAIChatCompletion(
        options.Deployment,
        options.Endpoint,
        options.Key);

    // Build and return the kernel
    var kernel = kernelBuilder.Build();
    return kernel;
});
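One assumption in the factory above: AzureOpenAIConfiguration is an options class bound to configuration elsewhere in startup. It only needs the three properties used above; a minimal version (the configuration section name is just an example) looks like this:

public class AzureOpenAIConfiguration
{
    public string Deployment { get; set; } = string.Empty;
    public string Endpoint { get; set; } = string.Empty;
    public string Key { get; set; } = string.Empty;
}

// Bound during startup, for example:
// services.Configure<AzureOpenAIConfiguration>(configuration.GetSection("Azure:OpenAI"));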
There you have it. You can now inject the kernel, and it will be able to use the provided plugins.
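For instance, a consumer class can take the Kernel as a constructor dependency and let the model auto-invoke the registered functions. A minimal sketch (the service and prompt handling are just for illustration; it uses Microsoft.SemanticKernel.Connectors.OpenAI for the execution settings):

public class TimeAnswerService(Kernel kernel)
{
    public async Task<string> AskAsync(string question)
    {
        // Let the model call the registered kernel functions (e.g. MyTimeKernelPlugin.Time) on its own
        var settings = new OpenAIPromptExecutionSettings
        {
            ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
        };
        var result = await kernel.InvokePromptAsync(question, new KernelArguments(settings));
        return result.ToString();
    }
}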
As a bonus, I'll share my IChatCompletionService wrapper.
public class KernelChatService(Kernel kernel, ISemanticKernelLoggingProvider loggingProvider) : IKernelChatService
{
    private readonly OpenAIPromptExecutionSettings _openAIPromptExecutionSettings = new()
    {
        ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
    };

    private readonly IChatCompletionService _chatService =
        kernel.GetRequiredService<IChatCompletionService>();

    public async Task<(IReadOnlyList<ChatMessageContent> chatResult, string diagnoseResult)>
        GetChatMessageContentsAsync(string messageInput, bool log = false)
    {
        var correlationKey = string.Empty;
        if (log)
            correlationKey = loggingProvider.StartCorrelation();

        var chatResult =
            await _chatService.GetChatMessageContentsAsync(messageInput, _openAIPromptExecutionSettings, kernel);

        if (!log)
            return (chatResult, string.Empty);

        var diagnoseResult = loggingProvider.DiagnoseToString(correlationKey);
        return (chatResult, diagnoseResult);
    }

    public async Task<(ChatMessageContent chatResult, string diagnoseResult)> GetChatMessageContentAsync(
        string messageInput, bool log = false)
    {
        var (chatResult, diagnoseResult) = await GetChatMessageContentsAsync(messageInput, log);
        return (chatResult[0], diagnoseResult);
    }
}
This service gets registered in the same extension method as above. This does two things for me. First, it keeps me from needing to provide the prompt execution settings whenever I use the chat service. Second, it allows me to use a diagnostic logger that I found. I will (probably) cover the logger in next month's post.
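The registration itself is a single line inside AddSemanticKernel; something like this (assuming IKernelChatService is the interface the class above implements):

services.AddTransient<IKernelChatService, KernelChatService>();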
I hope this was helpful to you and you learned something. I'd love to hear from you if you did.