
Timber

Compile XGBoost, LightGBM, scikit-learn, CatBoost & ONNX models to native C99. Serve any model — local file or remote URL — in one command.

pip install timber-compiler
then run:
$ timber serve https://yourhost.com/model.json
↳ downloads · compiles · serves — all in one command
336× Faster P50 Latency
533× Higher Throughput
91 µs HTTP Inference
48 KB Compiled Artifact
5 Frameworks

One command.
Native speed.

Point Timber at any URL and it downloads, compiles, and serves immediately. No separate load step. No configuration. Python never touches the hot path.

Or load a local file and serve by name — your choice.

Serve from URL (recommended)
# Option 1: serve directly from a URL — no pre-download needed
pip install timber-compiler
timber serve https://yourhost.com/fraud_model.json
Serve from local file
# Option 2: load a local file, then serve by name
pip install timber-compiler
timber load fraud_model.json --name fraud-detector
timber serve fraud-detector

# Query either way
curl http://localhost:11434/api/predict \
-d '{"model": "fraud-detector", "inputs": [[1.0, 2.0, ...]]}'

Why Timber?

5 Framework Parsers

XGBoost, LightGBM, scikit-learn, CatBoost, and ONNX. Auto-detected from file extension and content.

6 Optimizer Passes

Dead leaf elimination, constant folding, threshold quantization, branch sorting, pipeline fusion, vectorization analysis.

3 Code Backends

C99 for servers & embedded, WebAssembly for browsers & edge, MISRA-C for safety-critical (automotive, medical).

Ollama-Style Serving

timber load → timber serve. REST API on port 11434. Same developer experience as Ollama, but for classical ML.

Zero Dependencies

Generated code needs only a C99 compiler. No runtime libraries, no dynamic allocation, no recursion. Thread-safe by design.

Audit Trails

Every compilation produces a deterministic JSON audit report with SHA-256 hashes, pass logs, and timing. Built for regulated industries.

How It Works

Timber treats your trained model as a program specification and compiles it through a classical compiler pipeline.

Model Artifact → Parse → Timber IR → Optimize → Optimized IR → Emit → C99 / WASM → Compile → .so / .dylib → Serve → HTTP API
Read the Docs →