LLM Gateways for Linux

Browse free open source LLM Gateways and projects for Linux below. Use the toggles on the left to filter open source LLM Gateways by OS, license, language, programming language, and project status.

  • 1
    Kong

    The Cloud-Native API Gateway

    Kong is a next-generation, cloud-native API platform for multi-cloud and hybrid organizations. Whether you are building for the web, mobile, or the Internet of Things, you need common functionality to run your software, and Kong provides it. Kong acts as a gateway, connecting microservice requests and APIs natively while also providing load balancing, logging, monitoring, authentication, rate limiting, and much more through plugins. Kong is highly extensible and platform agnostic, connecting APIs across different environments, platforms, and patterns. Achieve architectural freedom with Kong today. (A minimal configuration sketch follows below.)
    Downloads: 7 This Week
    Last Update:
    See Project
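
To make the plugin-driven gateway model concrete, the sketch below registers an upstream service, exposes it on a route, and attaches a rate-limiting plugin through Kong's Admin API. The Admin API address, upstream URL, and limit values are assumptions for illustration; adjust them to your deployment.

```python
import requests

# Assumed Kong Admin API address (8001 is Kong's default Admin port).
ADMIN = "http://localhost:8001"

# Register an upstream service behind the gateway.
requests.post(f"{ADMIN}/services", json={
    "name": "llm-backend",
    "url": "https://api.openai.com",  # hypothetical upstream
}).raise_for_status()

# Expose the service on a public route.
requests.post(f"{ADMIN}/services/llm-backend/routes", json={
    "name": "llm-route",
    "paths": ["/llm"],
}).raise_for_status()

# Attach a rate-limiting plugin (60 requests per minute) to the service.
requests.post(f"{ADMIN}/services/llm-backend/plugins", json={
    "name": "rate-limiting",
    "config": {"minute": 60},
}).raise_for_status()
```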
  • 2
    APIPark

    APIPark is the #1 open-source AI Gateway and Developer Portal

    APIPark is an open-source, all-in-one AI gateway and API developer portal that helps developers and enterprises easily manage, integrate, and deploy AI services. No matter which AI model you use, APIPark provides a one-stop integration solution: it unifies the management of all authentication information and tracks the cost of API calls. It also standardizes the request data format across all AI models, so switching models or modifying prompts does not affect your apps or microservices, simplifying AI usage and reducing maintenance costs. You can quickly combine AI models and prompts into new APIs; for example, using OpenAI GPT-4 and custom prompts, you can create sentiment-analysis, translation, or data-analysis APIs (a client-side sketch follows below). API lifecycle management standardizes how APIs are managed, including traffic forwarding, load balancing, and versioning of publicly accessible APIs, which improves API quality and maintainability.
    Downloads: 4 This Week
    Last Update:
    See Project
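
To illustrate the "model plus prompt published as a new API" idea, here is a minimal client-side sketch. The gateway address, route, API key, and request/response schema are all hypothetical; the real shape depends on how the API is composed in the APIPark portal.

```python
import requests

# Hypothetical gateway address, route, and API key: the actual path and payload
# schema depend on how the sentiment-analysis API was defined in the portal.
GATEWAY = "http://localhost:9999"
API_KEY = "your-apipark-api-key"

resp = requests.post(
    f"{GATEWAY}/sentiment-analysis",  # hypothetical published route
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"text": "The onboarding flow was confusing, but support was great."},
    timeout=30,
)
resp.raise_for_status()
# The model choice and prompt live in the gateway, so the client only sends text.
print(resp.json())
```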
  • 3
    Bifrost

    The Fastest LLM Gateway with built-in OTel observability

    Bifrost is an LLM gateway that presents a unified, OpenAI-compatible API in front of many different model providers. It abstracts away the complexity of working directly with multiple backends (OpenAI, Anthropic, AWS Bedrock, Google Vertex, etc.), letting you plug in providers and switch between them without touching your client code (see the sketch below). It is built for high performance: in benchmark tests at 5,000 requests per second, it reportedly adds only microseconds of overhead while maintaining a 100% success rate. Bifrost supports automatic fallback (failover between providers), load balancing across API keys and providers, and semantic caching to reduce latency and cost. It also ships with built-in observability (metrics, tracing, logging) and governance features such as rate limiting, access control, and cost budgeting. The architecture is modular: a core engine, plugin layers, and transport layers (HTTP APIs).
    Downloads: 3 This Week
    Last Update:
    See Project
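
Because Bifrost fronts providers with an OpenAI-compatible API, an existing OpenAI SDK client can be pointed at it by changing only the base URL. The endpoint, port, and model name below are assumptions; provider credentials are expected to live in the gateway rather than in the client.

```python
from openai import OpenAI

# Assumed local Bifrost endpoint; check the project docs for the actual
# OpenAI-compatible route and port it exposes.
client = OpenAI(
    base_url="http://localhost:8080/v1",  # hypothetical gateway address
    api_key="unused",                     # provider keys live in the gateway, not the client
)

# Switching providers later is just a different model string; the client
# code and SDK stay exactly the same.
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize what an LLM gateway does."}],
)
print(resp.choices[0].message.content)
```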
  • 4
    MagicAPI AI Gateway

    Built for demanding AI workflows

    An AI Gateway proxy written in Rust and optimized for maximum performance. This high-performance API gateway routes requests to various AI providers (OpenAI, Groq) with streaming support, making it a good fit for developers who need reliable, low-latency access to AI APIs (a streaming sketch follows below).
    Downloads: 3 This Week
    Last Update:
    See Project
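
A minimal streaming sketch, assuming the proxy exposes an OpenAI-compatible chat-completions route; the listen address, path prefix, and model name are placeholders to check against the MagicAPI configuration.

```python
from openai import OpenAI

# Hypothetical proxy address and path prefix; the actual listen port and
# provider routing depend on how the gateway is configured.
client = OpenAI(base_url="http://localhost:3000/v1", api_key="unused")

# Tokens are relayed through the proxy as they arrive from the upstream provider.
stream = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Explain streaming responses in two sentences."}],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```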
  • 5
    LLM Gateway

    Route, manage, and analyze your LLM requests across multiple providers

    LLM Gateway is an open-source middleware that consolidates interactions with multiple LLM providers—such as OpenAI, Anthropic, Google Vertex AI—behind a single, unified API compatible with OpenAI's spec. Designed for both self-hosted and cloud use, it enables developers to route requests dynamically, secure and manage API keys, monitor token usage and costs, and analyze performance metrics. With optional UI, telemetry, and Docker deployment, it's ideal for teams aiming to centralize LLM orchestration and gain visibility into AI usage.
    Downloads: 2 This Week
    Last Update:
    See Project
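
Since LLM Gateway follows OpenAI's spec, any HTTP client can talk to it directly. The sketch below posts a standard chat-completions payload to a self-hosted instance; the URL, API key, and the model-to-provider routing are assumptions for illustration.

```python
import httpx

# Assumed self-hosted LLM Gateway instance; URL and key are placeholders.
# The request body follows the OpenAI chat-completions shape regardless of
# which provider the gateway routes the request to.
GATEWAY_URL = "http://localhost:4000/v1/chat/completions"

payload = {
    "model": "claude-sonnet",  # hypothetical name routed to Anthropic by the gateway
    "messages": [{"role": "user", "content": "One sentence on token budgeting."}],
}

resp = httpx.post(
    GATEWAY_URL,
    headers={"Authorization": "Bearer <gateway-api-key>"},
    json=payload,
    timeout=30.0,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```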
  • 6
    LangDB AI Gateway

    Govern, secure, and optimize your AI traffic

    AI Gateway is a high-performance, open-source API gateway optimized for managing and monitoring LLM traffic at scale. Developed by the LangDB team, AI Gateway acts as an intermediary between clients and backend LLMs, providing advanced features like caching, rate limiting, prompt management, and observability. It helps teams secure and optimize their LLM deployments, whether using local models or external APIs like OpenAI or Anthropic. With native support for multi-tenant environments and low-latency inference routing, AI Gateway is an essential tool for companies building production-grade generative AI services.
    Downloads: 1 This Week
    Last Update:
    See Project
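
With LangDB's AI Gateway, caching and rate limiting are configured on the gateway side, so the client stays unchanged. The sketch below issues the same request twice against an assumed local endpoint; if response caching is enabled on the gateway, the second call should return noticeably faster. The endpoint, port, and model name are placeholders.

```python
import time
from openai import OpenAI

# Assumed local AI Gateway endpoint; caching policy lives in the gateway.
client = OpenAI(base_url="http://localhost:8080/v1", api_key="unused")

def timed_call(prompt: str) -> float:
    """Return wall-clock seconds for one chat-completion request."""
    start = time.perf_counter()
    client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return time.perf_counter() - start

# If the gateway caches responses, the second identical request should come
# back much faster and avoid another call to the upstream provider.
print("first call :", timed_call("Define prompt caching in one sentence."))
print("second call:", timed_call("Define prompt caching in one sentence."))
```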
  • 7
    Portkey AI Gateway

    A blazing-fast AI Gateway with integrated guardrails

    Portkey AI Gateway aims to offer a blazing-fast, secure, and flexible gateway for interacting with a wide variety of models while enforcing guardrails. It presents a single, friendly API through which you can route to 200+ LLMs, applying configurable input/output guardrails to enforce policies or restrict certain content. It supports automatic retries, fallbacks, load balancing across providers or keys, and request timeouts to avoid latency spikes (a fallback-routing sketch follows below). The gateway is multimodal: it can handle text, vision, audio, and image models under a common interface. It also offers governance features: role-based access control, compliance with standards (SOC 2, HIPAA, GDPR), secure key management, and logging/analytics of usage, latency, errors, and cost. The system integrates with agent frameworks such as LangChain, AutoGen, and others, making it easier to build more complex AI applications. It is lightweight and optimized for low latency with a small footprint.
    Downloads: 1 This Week
    Last Update:
    See Project
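
A sketch of fallback routing through a locally running Portkey gateway. The port, header name, and config shape below follow Portkey's commonly described pattern, but treat them as assumptions and verify against the current documentation; provider keys are carried in the config rather than in the client.

```python
import json
from openai import OpenAI

# Assumed routing config shape: try OpenAI first, fall back to Anthropic on failure.
# Verify the header name and schema against the current Portkey documentation.
routing_config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"provider": "openai", "api_key": "<openai-key>"},
        {"provider": "anthropic", "api_key": "<anthropic-key>"},
    ],
}

client = OpenAI(
    base_url="http://localhost:8787/v1",  # assumed port of a local gateway instance
    api_key="unused",                     # provider keys travel in the config above
    default_headers={"x-portkey-config": json.dumps(routing_config)},
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Name one use for an output guardrail."}],
)
print(resp.choices[0].message.content)
```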