I'm developing an Azure Function and I've integrated Semantic Kernel (SK) into it, injecting the kernel as seen in many starter projects.
// Microsoft Semantic Kernel configuration build
var skBuild = Kernel.Builder
    .WithLoggerFactory(loggerFactory)
    .WithAzureTextEmbeddingGenerationService("text-embedding-ada-002", azureOpenAIOptions.Endpoint, azureOpenAIOptions.ApiKey)
    .WithAzureTextCompletionService("gpt-35-turbo-instruct", azureOpenAIOptions.Endpoint, azureOpenAIOptions.ApiKey)
    .WithAzureChatCompletionService("gpt-35-turbo", azureOpenAIOptions.Endpoint, azureOpenAIOptions.ApiKey)
    .WithMemoryStorage(memoryStore)
    .Build();
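For context, this is roughly how the kernel ends up registered and injected (a simplified sketch of an in-process Functions startup; the Startup class, the AzureOpenAIOptions binding, and the volatile memory store stand in for my actual setup):

// Startup.cs - simplified sketch of the in-process Functions registration;
// the Startup pattern, AzureOpenAIOptions binding, and VolatileMemoryStore stand in for my actual setup
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Memory;

[assembly: FunctionsStartup(typeof(Startup))]

public class Startup : FunctionsStartup
{
    public override void Configure(IFunctionsHostBuilder builder)
    {
        builder.Services.AddSingleton<IKernel>(sp =>
        {
            var loggerFactory = sp.GetRequiredService<ILoggerFactory>();
            var azureOpenAIOptions = sp.GetRequiredService<IOptions<AzureOpenAIOptions>>().Value; // placeholder options class

            // Same builder configuration as above
            return Kernel.Builder
                .WithLoggerFactory(loggerFactory)
                .WithAzureTextEmbeddingGenerationService("text-embedding-ada-002", azureOpenAIOptions.Endpoint, azureOpenAIOptions.ApiKey)
                .WithAzureTextCompletionService("gpt-35-turbo-instruct", azureOpenAIOptions.Endpoint, azureOpenAIOptions.ApiKey)
                .WithAzureChatCompletionService("gpt-35-turbo", azureOpenAIOptions.Endpoint, azureOpenAIOptions.ApiKey)
                .WithMemoryStorage(new VolatileMemoryStore()) // stand-in for my actual memory store
                .Build();
        });
    }
}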
Even though I specified different models for each service,
.WithAzureTextCompletionService("gpt-35-turbo-instruct", ..., ...)
and
.WithAzureChatCompletionService("gpt-35-turbo", ..., ...)
SK only uses the model specified for the chat completion service. What I expect is for SK to use the correct model depending on what is being invoked.
For example, gpt-35-turbo-instruct when running a semantic function:
var result = await kernel.RunAsync(context, skill["Joke"]);
Console.WriteLine(result);
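Here skill is a semantic function imported from a prompt directory and context carries the input, roughly like this (the directory, skill name, and input value are illustrative, not my exact layout):

// Minimal sketch - directory, skill name, and input are illustrative
// ContextVariables comes from Microsoft.SemanticKernel.Orchestration
var skill = kernel.ImportSemanticSkillFromDirectory("Skills", "FunSkill");
var context = new ContextVariables("dinosaurs");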
and gpt-35-turbo when using the chat completion service:
var chat = chatCompletionService.CreateNewChat("You are an AI assistant that helps people find information.");
chat.AddMessage(AuthorRole.User, "Hi, what information can you provide for me?");
string response = await chatCompletionService.GenerateMessageAsync(chat, new ChatRequestSettings());
Console.WriteLine(response);
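where chatCompletionService comes from the same kernel, roughly like this:

// Resolve the registered chat completion service from the kernel
// (IChatCompletion lives in Microsoft.SemanticKernel.AI.ChatCompletion)
var chatCompletionService = kernel.GetService<IChatCompletion>();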
Am I missing something?