Offline Client

File: main_logic/omni_offline_client.py

The OmniOfflineClient provides text-based LLM conversation as a fallback when the Realtime API is unavailable.

When it's used

  • When the selected provider doesn't support Realtime API
  • When using local LLM deployments (Ollama, etc.)
  • When voice input is disabled and text-only mode is preferred
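The conditions above can be sketched as a simple selection function. This is an illustrative sketch only, assuming a hypothetical `pick_client` helper and provider set; it is not the project's actual selection logic.

```python
# Hypothetical sketch: fall back to the offline client when the selected
# provider lacks Realtime API support or when voice input is disabled.
REALTIME_CAPABLE = {"openai"}  # assumption: providers with Realtime support


def pick_client(provider: str, voice_enabled: bool) -> str:
    """Return which client to use; names here are illustrative."""
    if not voice_enabled or provider not in REALTIME_CAPABLE:
        return "offline"  # would route to OmniOfflineClient
    return "realtime"


print(pick_client("ollama", voice_enabled=True))   # a local provider falls back
print(pick_client("openai", voice_enabled=False))  # text-only mode falls back
```

Local deployments such as Ollama would always take the offline path here, since they expose only an OpenAI-compatible HTTP API rather than the Realtime WebSocket API.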

Capabilities

  • Text-in, text-out conversation
  • Compatible with any OpenAI-compatible API endpoint
  • Uses LangChain for LLM integration
  • Supports conversation history and system prompts
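To make the history and system-prompt handling concrete, here is a minimal sketch of how turns can be accumulated into the `messages` list expected by any OpenAI-compatible `/v1/chat/completions` endpoint. The `ChatHistory` class and its method names are hypothetical, not the actual OmniOfflineClient implementation.

```python
from dataclasses import dataclass, field


@dataclass
class ChatHistory:
    """Illustrative sketch of conversation-history handling for an
    OpenAI-compatible chat endpoint (hypothetical, simplified)."""
    system_prompt: str
    turns: list = field(default_factory=list)

    def add_user(self, text: str) -> None:
        self.turns.append({"role": "user", "content": text})

    def add_assistant(self, text: str) -> None:
        self.turns.append({"role": "assistant", "content": text})

    def as_messages(self) -> list:
        # System prompt first, then the alternating user/assistant turns,
        # matching the `messages` field of a chat-completions request.
        return [{"role": "system", "content": self.system_prompt}, *self.turns]


history = ChatHistory("You are a helpful assistant.")
history.add_user("Hello")
history.add_assistant("Hi! How can I help?")
print(len(history.as_messages()))  # system prompt + two turns
```

Because the payload is plain chat-completions JSON, the same structure works against OpenAI, Ollama, or any other OpenAI-compatible server; only the base URL and model name change.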

Differences from Realtime Client

Feature             Realtime Client                   Offline Client
Audio I/O           Native                            Requires separate STT/TTS
Streaming           WebSocket bidirectional           HTTP streaming
Multi-modal         Native (audio + images)           Text only
Latency             Lower (persistent connection)     Higher (per-request)
Provider support    Limited (Realtime API required)   Any OpenAI-compatible

Released under the MIT License.