# Router Configuration

## Environment Variables

| Variable | Default | Description |
| --- | --- | --- |
| `ROUTER_PORT` | `50052` | gRPC server port |
| `REDIS_URL` | `redis://localhost:6379/0` | Service discovery and persistence (auto-registers Router) |
| `REDIS_SECRET_KEY` | | Encryption key for discovery registry |
| `CU_BRAIN_GRPC_URL` | | Optional override for contextunity.brain host |
| `CU_SHIELD_GRPC_URL` | | Optional override for contextunity.shield host |
| `CU_ROUTER_DEFAULT_LLM` | `openai/gpt-5-mini` | Router baseline LLM (used internally or if graph provides none) |
| `CU_ROUTER_FALLBACK_LLMS` | | Comma-separated global fallback chain for the Router |
| `CU_ROUTER_ALLOW_GLOBAL_FALLBACK` | `false` | Allow Router's fallback chain to apply to projects missing `fallback_keys` |
| `OPENAI_API_KEY` | | OpenAI API key (global fallback) |
| `ANTHROPIC_API_KEY` | | Anthropic API key (global fallback) |
| `INCEPTION_API_KEY` | | Inception Labs (Mercury-2) key (global fallback) |
| `GOOGLE_PROJECT_ID` | | Google Cloud project for Vertex AI |
| `GROQ_API_KEY` | | Groq API key (global fallback) |
| `PERPLEXITY_API_KEY` | | Perplexity API key (global fallback) |
| `DEBUG_PIPELINE` | `0` | Enable pipeline debug logs |
| `DEBUG_WEB_SEARCH` | `0` | Enable web search debug logs |
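For example, a minimal local-development configuration might look like this (values are illustrative; only the Redis URL and port match the defaults above):

```shell
# .env -- illustrative local-dev Router configuration
ROUTER_PORT=50052
REDIS_URL=redis://localhost:6379/0
CU_ROUTER_DEFAULT_LLM=openai/gpt-5-mini
# Global fallback key, used when Shield has no per-tenant key
OPENAI_API_KEY=sk-...
DEBUG_PIPELINE=1
```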

## API Key Resolution

Router resolves API keys for LLM providers using a two-tier fallback chain:

| Priority | Source | Path / Variable | When |
| --- | --- | --- | --- |
| 1 | contextunity.shield | `{tenant}/api_keys/{provider}` | Project-synced keys at startup |
| 2 | Router env | `OPENAI_API_KEY`, `INCEPTION_API_KEY`, etc. | Shield unavailable or key not found |

How it works:

  1. Each project manages its own model + API key pairs in its .env file.
  2. At startup, the project’s apps.py registers tools with Router and receives shield_url.
  3. Project syncs its API keys to Shield via PutSecret({tenant}/api_keys/{provider}, key).
  4. At runtime, Router calls shield_get_secret({tenant}/api_keys/{provider}) to resolve the key.
  5. If Shield is unavailable or returns no key, Router falls back to its own env keys (e.g. INCEPTION_API_KEY).
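The two-tier lookup above can be sketched as follows. This is a minimal sketch, not Router's actual code: `shield_get_secret` is passed in as a stand-in for the Shield gRPC call, and the real function names and error handling inside Router may differ.

```python
import os


def resolve_api_key(tenant: str, provider: str, shield_get_secret=None):
    """Resolve an API key: try Shield first, then fall back to Router env vars.

    `shield_get_secret` stands in for the Shield gRPC lookup; pass None to
    simulate Shield being unavailable.
    """
    # Tier 1: per-tenant key stored in contextunity.shield
    if shield_get_secret is not None:
        try:
            key = shield_get_secret(f"{tenant}/api_keys/{provider}")
            if key:
                return key
        except ConnectionError:
            pass  # Shield unreachable -- fall through to env fallback
    # Tier 2: Router's own global env key, e.g. OPENAI_API_KEY
    return os.environ.get(f"{provider.upper()}_API_KEY")


# With Shield: the tenant's own key wins
secrets = {"acme/api_keys/openai": "sk-tenant-key"}
print(resolve_api_key("acme", "openai", secrets.get))  # sk-tenant-key

# Without Shield: falls back to the Router env key
os.environ["INCEPTION_API_KEY"] = "sk-global-key"
print(resolve_api_key("acme", "inception"))  # sk-global-key
```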

This means:

  • With Shield: Full tenant isolation. Each project has its own keys per provider.
  • Without Shield: All projects share Router’s global env keys. Fine for dev or single-project setups.
  • Mixed: Some providers resolved from Shield, others from Router env.

## Agent Configuration

Agents are configured via AgentConfig objects stored in contextunity.view’s database. Each agent specifies:

  • LLM provider — which model to use
  • Allowed tools — which tools the agent can invoke
  • System prompt — base personality and instructions
  • Temperature — LLM temperature setting
  • Fallback models — backup providers if primary fails
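As a rough illustration, such a record could be modeled like this. The field and model names below are assumptions for illustration; the actual `AgentConfig` schema is defined in contextunity.view.

```python
from dataclasses import dataclass, field


@dataclass
class AgentConfig:
    # Illustrative fields only -- the real schema lives in contextunity.view
    name: str
    llm: str                                            # LLM provider/model id
    allowed_tools: list[str] = field(default_factory=list)
    system_prompt: str = ""
    temperature: float = 0.7
    fallback_llms: list[str] = field(default_factory=list)  # backups if primary fails


agent = AgentConfig(
    name="researcher",
    llm="openai/gpt-5-mini",
    allowed_tools=["web_search"],
    system_prompt="You are a careful research assistant.",
    fallback_llms=["anthropic/claude-sonnet"],  # hypothetical fallback id
)
print(agent.llm)  # openai/gpt-5-mini
```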

## Running

```shell
# Start gRPC server
uv run python -m contextunity.router

# With custom port
ROUTER_PORT=50099 uv run python -m contextunity.router
```