#ifndef FORESTHUB_CONFIG_CONFIG_HPP
#define FORESTHUB_CONFIG_CONFIG_HPP
Client and provider configuration types.
Top-level namespace for the ForestHub SDK.
Minimal Optional<T> polyfill for C++14 compatibility.
Definition optional.hpp:21
Top-level client configuration controlling which providers are created.
Definition config.hpp:48
std::vector< LocalConfig > local
Zero or more local model configurations.
Definition config.hpp:50
RemoteProviders remote
Remote provider configurations.
Definition config.hpp:49
Configuration for a local LLM execution engine.
Definition config.hpp:39
std::string model_id
Unique identifier for routing.
Definition config.hpp:41
std::string model_path
Path to the model weights file.
Definition config.hpp:40
int context_size
Context window size in tokens.
Definition config.hpp:42
bool use_gpu
Enable GPU acceleration.
Definition config.hpp:44
int n_threads
CPU threads for inference.
Definition config.hpp:43
Shared configuration for any remote LLM provider.
Definition config.hpp:24
std::string base_url
API base URL (empty = provider default).
Definition config.hpp:26
std::vector< std::string > supported_models
Models available through this provider.
Definition config.hpp:27
std::string api_key
Authentication token.
Definition config.hpp:25
Container for all remote provider configurations.
Definition config.hpp:31
foresthub::Optional< ProviderConfig > openai
OpenAI (native Responses API).
Definition config.hpp:35
foresthub::Optional< ProviderConfig > anthropic
Anthropic Claude (native Messages API).
Definition config.hpp:32
foresthub::Optional< ProviderConfig > gemini
Google Gemini (native generateContent API).
Definition config.hpp:34