ForestHub SDK 0.1.0
C++14 LLM SDK for PC and embedded platforms

A platform-agnostic C++14 framework for building LLM-powered applications, from cloud servers to microcontrollers. Write your code once and deploy it on PC (Linux/macOS/Windows) or embedded devices (ESP32, Portenta H7) without changing a line.

Built on a Hardware Abstraction Layer (HAL) that abstracts network, console, time, crypto, and GPIO – the LLM SDK, agent framework, and RAG system run on top of it unchanged across platforms.

Features

  • Platform-agnostic via HAL – same application code runs on PC and embedded targets; platform-specific implementations (CPR on PC, ArduinoHttpClient on ESP32) are injected at build time
  • Multi-provider LLM client – unified interface for ForestHub, OpenAI, Gemini, and Anthropic; routes requests to the right provider based on model ID
  • Agent framework – tool calling, multi-turn conversations, agent handoffs, and web search
  • RAG retriever – semantic document search via ForestHub backend
  • Embedded-safe – C++14, no exceptions, no RTTI; targets any C++14-capable toolchain including embedded compilers

Installation

PC (CMake)

Add to your project's CMakeLists.txt:

include(FetchContent)
FetchContent_Declare(
  fh-sdk
  GIT_REPOSITORY https://github.com/ForestHubAI/fh-sdk.git
  GIT_TAG v0.1.1
)
FetchContent_MakeAvailable(fh-sdk)
target_link_libraries(your_app PRIVATE foresthub_core)

Requirements: CMake 3.14+, C++14 compiler (GCC 7+, Clang 5+, MSVC 2017+). Dependencies (CPR, nlohmann/json) are fetched automatically.

Embedded (PlatformIO)

Add to your platformio.ini:

lib_deps =
    foresthubai/fh-sdk@^0.1.1

Requirements: PlatformIO CLI (pip install platformio).
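For a fresh project, a minimal platformio.ini could look like the sketch below. The platform, board, and framework values are illustrative (an ESP32 DevKit target); adapt them to your hardware.

```ini
[env:esp32dev]
platform = espressif32
board = esp32dev
framework = arduino
lib_deps =
    foresthubai/fh-sdk@^0.1.1
build_flags =
    -DFORESTHUB_ENABLE_NETWORK
    -DFORESTHUB_ENABLE_CRYPTO
```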

See Embedded Guide for detailed setup.

Usage

PC – Agent with Tools

#include "platform_setup.hpp"
#include <cstdio>
#include <cstdlib>
#include <memory>
using json = nlohmann::json;

// 1. Define tool: argument struct, deserializer, handler
struct WeatherArgs { std::string city; };
void from_json(const json& j, WeatherArgs& a) { a.city = j.value("city", ""); }
json GetWeather(const WeatherArgs& a) { return {{"temp", "22C"}, {"city", a.city}}; }

int main() {
  // 2. Platform + HTTP client via HAL
  auto platform = app::SetupPlatform();
  foresthub::platform::HttpClientConfig http_cfg; // call-time HTTP config (platform.hpp)
  http_cfg.host = "fh-backend-368736749905.europe-west1.run.app";
  auto http_client = platform->CreateHttpClient(http_cfg);

  // 3. Configure provider + create client
  foresthub::config::ProviderConfig fh_cfg;
  fh_cfg.base_url = "https://fh-backend-368736749905.europe-west1.run.app";
  fh_cfg.api_key = std::getenv("FORESTHUB_API_KEY");
  fh_cfg.supported_models = {"gpt-4.1", "gpt-4o"};
  foresthub::config::ClientConfig cfg;
  cfg.remote.foresthub = fh_cfg;
  auto client = foresthub::Client::Create(cfg, http_client);

  // 4. Build agent with tool
  json schema = json::parse(R"({"properties":{"city":{"type":"string"}}})", nullptr, false);
  auto agent = std::make_shared<foresthub::agent::Agent>("weather-bot");
  agent->WithInstructions("You help with weather queries.")
      .WithOptions(foresthub::core::Options().WithTemperature(0.7f).WithMaxTokens(1024))
      .WithTool(foresthub::agent::NewFunctionTool<json, WeatherArgs>(
          "get_weather", "Get current weather for a city", schema, GetWeather));

  // 5. Run agent
  auto runner = std::make_shared<foresthub::agent::Runner>(client, "gpt-4o");
  auto input = std::make_shared<foresthub::core::InputString>("Weather in Berlin?");
  foresthub::agent::RunResultOrError result = runner->Run(agent, input);
  if (!result.HasError()) {
    printf("%s\n", result.result->final_output.dump().c_str());
  }
}

Embedded – Chat on ESP32

The same SDK, but with Arduino setup()/loop() and explicit WiFi + time sync:

#include <Arduino.h>
#include "env.hpp" // WiFi credentials + API key

static std::shared_ptr<foresthub::platform::PlatformContext> platform;

void setup() {
  // 1. Create platform (WiFi, Serial, NTP, TLS)
  foresthub::platform::PlatformConfig config;
  config.network.ssid = kWifiSsid;
  config.network.password = kWifiPassword;
  platform = foresthub::platform::CreatePlatform(config);

  // 2. Connect network + sync time (required for TLS)
  platform->console->Begin();
  platform->network->Connect();
  platform->time->SyncTime();

  // 3. Create client (same API as PC)
  foresthub::platform::HttpClientConfig http_cfg; // call-time HTTP config (platform.hpp)
  http_cfg.host = "fh-backend-368736749905.europe-west1.run.app";
  auto http_client = platform->CreateHttpClient(http_cfg);
  foresthub::config::ProviderConfig fh_cfg;
  fh_cfg.base_url = "https://fh-backend-368736749905.europe-west1.run.app";
  fh_cfg.api_key = kForesthubApiKey;
  fh_cfg.supported_models = {"gpt-4.1", "gpt-4.1-mini"};
  foresthub::config::ClientConfig cfg;
  cfg.remote.foresthub = fh_cfg;
  auto client = foresthub::Client::Create(cfg, http_client);

  // 4. Send chat request
  foresthub::core::ChatRequest req;
  req.model = "gpt-4.1-mini";
  req.input = std::make_shared<foresthub::core::InputString>("What is the capital of France?");
  auto response = client->Chat(req);
  if (response) {
    platform->console->Printf("Response: %s\n", response->text.c_str());
  }
}

void loop() {
  platform->time->Delay(10000);
}

Note how steps 3-4 (client setup and chat request) are identical on both platforms – only the platform initialization differs.

Multi-Provider Support

Route requests to any supported provider – the client selects the right one based on model ID:

Provider    Config                            Env Variable
ForestHub   cfg.remote.foresthub = fh_cfg;    FORESTHUB_API_KEY
OpenAI      cfg.remote.openai = oai_cfg;      OPENAI_API_KEY
Gemini      cfg.remote.gemini = gem_cfg;      GEMINI_API_KEY
Anthropic   cfg.remote.anthropic = ant_cfg;   ANTHROPIC_API_KEY

See Provider Guide for detailed configuration.

For complete examples see examples/pc/ and examples/embedded/.

Documentation

Architecture

Application Layer -- examples/pc/, examples/embedded/
|
v
HAL (foresthub::platform) -- Network, Console, Time, Crypto
|
v
Core (foresthub::core) -- LLM client, Agent framework, Tools

The Core layer defines abstract interfaces (e.g., HttpClient). The HAL layer provides platform-specific implementations (CPR on PC, ArduinoHttpClient on ESP32). Applications create a PlatformConfig (with WiFi credentials on embedded) and call CreatePlatform(config) to initialize the platform, then inject dependencies into Core.

Optional Subsystems

HAL subsystems are opt-in via compile-time macros. Add to your platformio.ini:

build_flags =
    -DFORESTHUB_ENABLE_NETWORK ; WiFi + HTTP client
    -DFORESTHUB_ENABLE_CRYPTO  ; TLS/HTTPS support
    -DFORESTHUB_ENABLE_GPIO    ; Digital/analog/PWM pin I/O

Console and Time are always available. Omitting macros saves significant Flash:

Configuration            ESP32 Flash    Portenta Flash
Full (all subsystems)    908 KB         473 KB
Minimal (Console+Time)   389 KB         270 KB
Savings                  519 KB (57%)   203 KB (43%)

Development

Everything below is for contributors and maintainers working on the SDK itself.

Building from Source

# PC
cmake -S . -B build -DBUILD_TESTING=ON
cmake --build build -j4
# Embedded (pattern: pio run -d examples/embedded/<provider>/<example> -e esp32dev)
pio run -d pio/build_test -e esp32dev

See Embedded Guide for detailed setup.

Testing

cd build && ctest --output-on-failure

Run a specific test: ./build/bin/Debug/run_core_tests --gtest_filter="InputTest.*"

Tests use hand-rolled mocks in tests/mocks/ (GMock is incompatible with -fno-rtti). See THIRD_PARTY_NOTICES for dependency license details.

License

This project is dual-licensed:

  • AGPL-3.0 for open-source use
  • A separate commercial license for proprietary use

By using this software, you agree to the terms of the AGPL-3.0 license unless you have obtained a separate commercial license.