kumiko.so beta
Technical details

BYOK + local LLM — no AI vendor lock-in

All AI calls go through a single OpenAI-compatible adapter. Anthropic, OpenAI, Llama, vLLM: interchangeable via environment variables, no code changes.

[Diagram: cloud providers (Anthropic, OpenAI) and on-prem options favored in DACH (Ollama, vLLM) connect through a single OpenAI-compatible adapter, configured via LLM_ENDPOINT, LLM_MODEL, and API_KEY, to your kumiko app (AI Builder, Designer, MCP). Switching providers is an env-var change, no code change.]
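A minimal sketch of how such an adapter can read its configuration from the three environment variables named above. The function name `build_chat_request` and the default values are illustrative assumptions, not kumiko's actual implementation; only the env-var names (`LLM_ENDPOINT`, `LLM_MODEL`, `API_KEY`) come from the diagram.

```python
import json
import os
import urllib.request

# Illustrative defaults: a local Ollama server exposing the OpenAI-compatible API.
DEFAULTS = {
    "LLM_ENDPOINT": "http://localhost:11434/v1",
    "LLM_MODEL": "llama3",
    "API_KEY": "unused-for-local",
}


def build_chat_request(messages):
    """Build an OpenAI-compatible chat-completion request from env vars.

    Switching from a local model to Anthropic or OpenAI means changing
    LLM_ENDPOINT / LLM_MODEL / API_KEY in the environment, not the code.
    """
    cfg = {key: os.environ.get(key, default) for key, default in DEFAULTS.items()}
    url = cfg["LLM_ENDPOINT"].rstrip("/") + "/chat/completions"
    body = json.dumps({"model": cfg["LLM_MODEL"], "messages": messages}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {cfg['API_KEY']}",
        },
    )


req = build_chat_request([{"role": "user", "content": "ping"}])
print(req.full_url)
```

With no env vars set, the request targets the local default; exporting `LLM_ENDPOINT=https://api.openai.com/v1` and `LLM_MODEL=gpt-4o` redirects the same code to OpenAI.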

Other technical USPs

Next step

Let's talk.

Pilot program open to European mid-market companies and indie hackers. Source access on request.

Built with itself · publicstatus.eu in production