Kong Gateway 3.6 Released
February 15, 2024

Kong announced a suite of open-source AI plugins for Kong Gateway 3.6 that can turn any Kong Gateway deployment into an AI Gateway, offering broad support for multi-LLM (Large Language Model) integration.

By upgrading to Kong Gateway 3.6, available today, users gain access to a suite of six new plugins focused entirely on AI and LLM usage. Developers who want to integrate one or more LLMs into their products can be more productive and ship AI capabilities faster, while architects and platform teams get a secure solution that provides visibility, control, and compliance for every AI request their teams send. Thanks to the tight integration with Kong Gateway, AI flows can now be orchestrated easily in the cloud or on self-hosted LLMs with industry-leading performance and the low latency that AI-based applications depend on.

“Today marks a significant milestone in our journey towards democratizing AI for developers and enterprises worldwide. By open-sourcing this suite of innovative AI capabilities, including no-code AI plugins, we’re removing the barriers to AI adoption and making it possible for developers to leverage multiple LLMs effortlessly and ship AI powered applications faster. At the same time, we’re providing governance and visibility to all the AI traffic that is being generated by an organization,” said Marco Palladino, Chief Technology Officer and Co-Founder, Kong Inc.

With Kong Gateway 3.6, AI builders can access this new suite of plugins focused entirely on AI and LLM usage. The suite of open-source plugins delivers a range of new capabilities, including:

- Multi-LLM Integration: Kong Inc.'s "ai-proxy" plugin enables seamless integration of multiple Large Language Model (LLM) implementations, offering native support for industry leaders including OpenAI, Azure AI, Cohere, Anthropic, Mistral, and Llama. The standardized interface allows applications to switch between LLMs without modifying application code, facilitating the use of diverse models and rapid prototyping (a minimal configuration sketch follows this list).

- Central AI Credential Management: The "ai-proxy" plugin stores AI credentials securely and centrally within Kong Gateway. This design removes the need to keep credentials in applications and streamlines credential rotation and updates directly from the gateway.

- Layer 7 AI Metrics Collection: Leveraging the "ai-proxy" plugin, users can now capture detailed Layer 7 AI analytics, including metrics such as request and response token counts and usage data per LLM provider and model. These metrics can be exported to third-party platforms such as Datadog and New Relic, or via existing Kong Gateway logging plugins such as TCP Log, Syslog, and Prometheus, enriching observability and offering insights into developer preferences.

- No-Code AI Integrations: With the "ai-request-transformer" and "ai-response-transformer" plugins, AI capabilities are injected into API requests and responses without a single line of code. This allows for on-the-fly transformations, such as real-time translation of API responses for internationalization, effortlessly enriching and converting API traffic.

- AI Prompt Decoration: The "ai-prompt-decorator" plugin allows for the consistent configuration of AI prompt contexts, automating the inclusion of rules and instructions with each AI request to enforce organizational compliance and restrict discussions of sensitive topics (see the governance sketch at the end of this article).

- AI Prompt Firewall: The "ai-prompt-guard" plugin offers a governance layer, establishing rules to authorize or block free-form prompts created by applications. This helps ensure that prompts adhere to approved standards before they are transmitted to LLM providers.

- Comprehensive AI Egress with Extensive Features: Integrating these AI capabilities within Kong Gateway centralizes the management, security, and monitoring of AI traffic and leverages more than 1,000 existing official and community plugins for robust access control, rate limiting, and advanced traffic-control rules. The AI Gateway is equipped from day one with all Kong Gateway features, making it, we believe, the most capable in the AI ecosystem.
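To make the multi-LLM and credential-management items above concrete, here is a minimal sketch of enabling "ai-proxy" on an existing route via Kong's Admin API and then calling an LLM through the gateway. The addresses, route name, model, and credential are placeholders, and the config field names follow our reading of the Kong 3.6 ai-proxy documentation, so they should be verified against your gateway version.

```python
import requests

# Defaults for a local Kong deployment; adjust for your environment.
ADMIN_URL = "http://localhost:8001"   # Kong Admin API
PROXY_URL = "http://localhost:8000"   # Kong proxy listener
ROUTE = "ai-chat"                     # hypothetical route already defined in Kong, e.g. path /ai-chat

# Enable ai-proxy on the route. The provider credential is stored in the
# gateway configuration, so the calling application never handles it.
plugin = {
    "name": "ai-proxy",
    "config": {
        "route_type": "llm/v1/chat",
        "auth": {
            "header_name": "Authorization",
            "header_value": "Bearer <OPENAI_API_KEY>",   # placeholder credential
        },
        "model": {
            "provider": "openai",    # or azure, cohere, anthropic, mistral, llama2, ...
            "name": "gpt-4",
            "options": {"max_tokens": 256, "temperature": 0.7},
        },
    },
}
requests.post(f"{ADMIN_URL}/routes/{ROUTE}/plugins", json=plugin).raise_for_status()

# The application sends a provider-agnostic, OpenAI-style chat payload to the
# Kong proxy; ai-proxy injects the credential and translates the request for
# the configured provider.
chat = {
    "messages": [
        {"role": "user", "content": "Summarize the Kong Gateway 3.6 release in one sentence."}
    ]
}
response = requests.post(f"{PROXY_URL}/ai-chat", json=chat)
print(response.json())
```

Because the application only ever sends the provider-agnostic payload, switching from OpenAI to, say, Mistral is a matter of updating the plugin's model and auth settings in the gateway; no application code changes are required.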

With this release, Kong's expertise in modern API infrastructure now extends to AI-driven use cases.
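For the governance plugins described above, the following sketch enables "ai-prompt-decorator" and "ai-prompt-guard" on the same hypothetical route. The prepended system message and the deny patterns are illustrative, and the config field names are, again, our reading of the plugin documentation and should be checked against your gateway version.

```python
import requests

ADMIN_URL = "http://localhost:8001"   # Kong Admin API; adjust for your deployment
ROUTE = "ai-chat"                     # the same hypothetical route used above

# ai-prompt-decorator: prepend an organizational system message to every chat
# request so that compliance instructions reach the LLM on each call.
decorator = {
    "name": "ai-prompt-decorator",
    "config": {
        "prompts": {
            "prepend": [
                {
                    "role": "system",
                    "content": "You are an internal assistant. Do not discuss customer PII or pricing.",
                }
            ]
        }
    },
}
requests.post(f"{ADMIN_URL}/routes/{ROUTE}/plugins", json=decorator).raise_for_status()

# ai-prompt-guard: block free-form prompts that match disallowed patterns
# before they are forwarded to the LLM provider.
guard = {
    "name": "ai-prompt-guard",
    "config": {
        "deny_patterns": [
            ".*credit card.*",
            ".*social security number.*",
        ]
    },
}
requests.post(f"{ADMIN_URL}/routes/{ROUTE}/plugins", json=guard).raise_for_status()
```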
