Connections give you control over where AI runs and how it integrates. You can:
• Use your own models — Connect to your preferred LLM provider. With connections, you choose which AI powers your workspace.
• Create your own intelligence service — Host AI logic elsewhere and expose it via a Hub Protocol endpoint. Connections let Synap talk to your custom service.
• Integrate third-party tools — Bring in APIs, webhooks, or middleware for an extended, personalized AI experience.
• Workspace-wide setup — Connections are shared across your workspace so everyone benefits from the same capabilities.
Implement the Hub Protocol to provide chat, tool calls, and streaming, then register your service in Synap and link it to your workspace. Your custom logic runs where you choose — on your own infrastructure, with your own models — while Synap handles the UI and orchestration.
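As a rough illustration of the three capabilities a service would provide (chat, tool calls, and streaming), here is a minimal sketch in Python. The request/response field names (`role`, `content`, the tool registry, the chunked reply) are assumptions for illustration only — the actual Hub Protocol wire format is defined by Synap, and the "model" below just echoes input rather than calling a real LLM.

```python
import json
from typing import Iterator

# Hypothetical tool registry: maps tool names to callables.
# A real service would register whatever tools it wants to expose.
TOOLS = {
    "echo": lambda args: args.get("text", ""),
}

def handle_tool_call(name: str, args: dict) -> dict:
    """Dispatch a tool call and wrap the result in a response envelope."""
    if name not in TOOLS:
        return {"error": f"unknown tool: {name}"}
    return {"result": TOOLS[name](args)}

def stream_chat(messages: list[dict]) -> Iterator[str]:
    """Yield a reply in chunks, as a streaming chat endpoint would.
    Placeholder logic: echo the last user message instead of calling a model."""
    last = messages[-1]["content"]
    reply = f"You said: {last}"
    for i in range(0, len(reply), 8):
        yield reply[i:i + 8]

if __name__ == "__main__":
    chunks = list(stream_chat([{"role": "user", "content": "hello"}]))
    print("".join(chunks))
    print(json.dumps(handle_tool_call("echo", {"text": "hi"})))
```

In practice these handlers would sit behind an HTTP endpoint that Synap calls; the point is only that chat, tool dispatch, and incremental streaming are separate concerns your service implements however it likes.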