# LLM Provider Refactoring Plan
This document outlines a plan to refactor the LLM provider integration to make it more extensible and configuration-driven.
## Goals

- Make adding new `LLMProvider` implementations easier, with minimal changes to core components.
- Centralize LLM provider configuration.
- Decouple the provider registry and its consumers (such as the Slack client) from specific provider implementations.
- Remove the forced "LangChain as gateway" pattern, allowing direct use of any configured provider.
## Proposed Strategy
1. **Configuration-Driven Initialization**
   - Introduce a dedicated `LLMProviders map[string]map[string]interface{}` section in `internal/config/config.go` to hold settings for each potential provider (OpenAI, Ollama, Anthropic, etc.).
   - The main `LLMProvider` field in the config will specify which configured provider to use.
   - Example `config.yaml` structure:
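A sketch of what that structure could look like, assuming YAML keys `llm_provider` and `llm_providers` and hypothetical per-provider fields (the exact field names are illustrative, not final):

```yaml
# Which configured provider the application should use.
llm_provider: openai

# One settings block per provider; keys match registered factory names.
llm_providers:
  openai:
    model: gpt-4o
    temperature: 0.7
  ollama:
    base_url: http://localhost:11434
    model: llama3
```

API keys are deliberately absent here; per the plan, they should continue to come from environment variables.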
2. **Provider Factory Pattern**
   - Define a standard factory function signature in `internal/llm/provider.go`:
     `type ProviderFactory func(config map[string]interface{}, logger *logging.Logger) (LLMProvider, error)`
   - Maintain a package-level map `providerFactories map[string]ProviderFactory` within the `llm` package.
   - Create a registration function `RegisterProviderFactory(name string, factory ProviderFactory)` in `registry.go` (or `provider.go`) to populate this map.
3. **Self-Registration using `init()`**
   - In each provider implementation file (e.g., `internal/llm/openai.go`), add an `init()` function.
   - Inside `init()`, call `RegisterProviderFactory("provider-name", NewConcreteProviderFactory)` to register the provider's factory function.
4. **Registry Initialization Logic**
   - Modify `NewProviderRegistry` to iterate through the `cfg.LLMProviders` configuration map.
   - For each configured provider, it looks up the corresponding factory function in the `providerFactories` map.
   - It calls the factory, passing the provider's specific configuration block and a logger.
   - If the factory returns a valid provider, it is added to the registry's `providers` map.
5. **Simplify Provider Implementations**
   - Update provider constructors (e.g., `NewOpenAIProvider`) to accept `map[string]interface{}` and a logger. They should parse this map for their settings instead of reading environment variables directly (except potentially for API keys, which should still come from the environment for security).
   - Each provider handles `ProviderOptions` (like `Temperature`) appropriately for its API.
6. **Simplify Slack Client Usage**
   - In `internal/slack/client.go`'s `callLLM` function, read the desired provider name from `c.cfg.LLMProvider`.
   - Retrieve the provider instance using `c.llmRegistry.GetProvider(providerName)`.
   - Remove the hardcoded "langchain" usage.
7. **Update `main.go`**
   - Ensure the new `LLMProviders` config section is loaded.
8. **Update Example Configuration**
   - Reflect the new `llm_providers` structure in `mcp-servers.json.example` and any other relevant examples.