Kong announced the launch of the latest version of Kong AI Gateway, which introduces new features to provide the AI security and governance guardrails needed to make GenAI and Agentic AI production-ready.
Tabnine announced the ability for users to select the underlying large language model (LLM) that powers its software development chat tool, Tabnine Chat.
Engineering teams can now select from a catalog of models to use the one best suited to their situation, and can switch between them at will.
“Engineering teams should have the freedom to choose the most suitable model for their use case without the need for multiple AI assistants or vendor switches. They should be able to choose based on each model’s specific performance, privacy policies, and the code it was trained on,” said Eran Yahav, co-founder and CTO of Tabnine. “To date, this has not been possible with the best-of-breed AI coding assistants — but we are changing that. Tabnine eliminates any worry about missing out on AI innovations, and makes it simple to use new models as they become available.”
Announced in June 2023 and now generally available, Tabnine Chat is the enterprise-grade, code-centric chat application that allows developers to interact with Tabnine AI models using natural language. Today’s update enables users to further adapt Tabnine Chat to the way they work and make it an integral part of the software development lifecycle (SDLC). It gives users the control and flexibility to select the LLMs that best fit their needs, and allows them to take advantage of the capabilities of newer LLMs without getting locked into a specific model.
Tabnine also provides transparency into the performance, privacy, and protection characteristics of each available model, making it easier to decide which model is right for each specific use case, project, or team. Developers can now choose any of the following models with Tabnine Chat and switch at any time:
- Tabnine Protected: Tabnine’s original model, designed to deliver high performance without the risks of intellectual property violations or exposing your code and data to others.
- Tabnine + Mistral: Tabnine’s newest offering, built to deliver the highest class of performance while still maintaining complete privacy.
- GPT-3.5 Turbo and GPT-4 Turbo: The industry’s most popular LLMs, proven to deliver the highest levels of performance for teams willing to share their data externally.
Switchable models offer engineering teams the combined advantages of LLMs' comprehensive understanding of programming languages and software development methods, coupled with Tabnine’s extensive efforts in crafting a developer experience and AI agents tailored to fit seamlessly into the SDLC. Tabnine employs advanced proprietary techniques — including prompt engineering, contextual awareness of local and global codebases, and task-specific fine-tuning within the SDLC — to maximize the performance, relevance, and quality of LLM output. By applying these methods to each LLM individually, Tabnine delivers an AI coding assistant optimized for each model. Regardless of the model selected, engineering teams get the full capability of Tabnine — including code generation, code explanations, documentation generation, and AI-created tests — and Tabnine is always highly personalized to each engineering team through both local and full codebase awareness.
Furthermore, with ongoing support for popular integrated development environments (IDEs) and integrations with common development tools, Tabnine ensures compatibility within existing engineering ecosystems.
Industry News
Traefik Labs announced significant enhancements to its AI Gateway platform along with new developer tools designed to streamline enterprise AI adoption and API development.
Zencoder released its next-generation AI coding and unit testing agents, designed to accelerate software development for professional engineers.
Windsurf (formerly Codeium) and Netlify announced a new technology partnership that brings seamless, one-click deployment directly into the developer's integrated development environment (IDE).
The Cloud Native Computing Foundation® (CNCF®), which builds sustainable ecosystems for cloud native software, is making significant updates to its certification offerings.
CNCF® also announced the Golden Kubestronaut program, a distinguished recognition for professionals who have demonstrated the highest level of expertise in Kubernetes, cloud native technologies, and Linux administration.
Red Hat announced new capabilities and enhancements for Red Hat Developer Hub, Red Hat’s enterprise-grade internal developer portal based on the Backstage project.
Platform9 announced that Private Cloud Director Community Edition is generally available.
Sonatype expanded support for software development in Rust via the Cargo registry to the entire Sonatype product suite.
CloudBolt Software announced its acquisition of StormForge, a provider of machine learning-powered Kubernetes resource optimization.
Mirantis announced the k0rdent Application Catalog – with 19 validated infrastructure and software integrations that empower platform engineers to accelerate the delivery of cloud-native and AI workloads wherever they need to be deployed.
Traefik Labs announced its Kubernetes-native API Management product suite is now available on the Oracle Cloud Marketplace.
webAI and MacStadium announced a strategic partnership that will revolutionize the deployment of large-scale artificial intelligence models using Apple's cutting-edge silicon technology.
Development work on the Linux kernel — the core software that underpins the open source Linux operating system — has a new infrastructure partner in Akamai. The company's cloud computing service and content delivery network (CDN) will support kernel.org, the main distribution system for Linux kernel source code and the primary coordination vehicle for its global developer network.