Red Hat Enterprise Linux AI 1.3 Released
December 16, 2024

Red Hat announced the latest release of Red Hat Enterprise Linux AI (RHEL AI), Red Hat’s foundation model platform for more seamlessly developing, testing and running generative artificial intelligence (gen AI) models for enterprise applications.

RHEL AI 1.3 brings support for the latest advancements in the Granite large language model (LLM) family and incorporates open source advancements for data preparation while still maintaining expanded choice for hybrid cloud deployments, including the underlying accelerated compute architecture.

RHEL AI forms a key pillar for Red Hat’s AI vision, bringing together the open source-licensed Granite model family and InstructLab model alignment tools, based on the Large-scale Alignment for chatBots (LAB) methodology. These components are then packaged as an optimized, bootable Red Hat Enterprise Linux image for individual server deployments anywhere across the hybrid cloud.

- Support for Granite 3.0 LLMs: RHEL AI 1.3 extends Red Hat’s commitment to Granite LLMs with support for Granite 3.0 8b for English language use cases. Granite 3.0 8b is a converged model, supporting not only English but a dozen other natural languages, as well as code generation and function calling. Non-English language use cases, along with code generation and function calling, are available as a developer preview within RHEL AI 1.3, with the expectation that these capabilities will be supported in future RHEL AI releases.

- Simplifying data preparation with Docling: Recently open sourced by IBM Research, Docling is an upstream community project that helps parse common document formats and convert them into formats like Markdown and JSON, preparing this content for gen AI applications and training. RHEL AI 1.3 now incorporates this innovation as a supported feature, enabling users to convert PDFs into Markdown for simplified data ingestion for model tuning with InstructLab (a minimal conversion sketch follows this list). Through Docling, RHEL AI 1.3 also includes context-aware chunking, which takes into account the structure and semantic elements of the documents used for gen AI training. This helps resulting gen AI applications maintain greater coherency and deliver more contextually appropriate responses to questions and tasks, which would otherwise require further tuning and alignment. Future RHEL AI releases will continue to support and refine Docling components, including additional document formats and integration with retrieval-augmented generation (RAG) pipelines alongside InstructLab knowledge tuning.

- Broadening the gen AI ecosystem: Choice is a fundamental component of the hybrid cloud, and with gen AI serving as a signature workload for hybrid environments, this optionality needs to start with the underlying chip architectures. RHEL AI already supports leading accelerators from NVIDIA and AMD, and the 1.3 release adds Intel Gaudi 3 as a technology preview. Beyond chip architecture, RHEL AI is supported across major cloud providers, including the AWS, Google Cloud and Microsoft Azure consoles, as a “bring your own subscription” (BYOS) offering. The platform will also soon be available as an optimized and validated solution on Azure Marketplace and AWS Marketplace. RHEL AI is available as a preferred foundation model platform on accelerated hardware offerings from Red Hat partners, including Dell PowerEdge R760xa servers and Lenovo ThinkSystem SR675 V3 servers.

- Model serving improvements with Red Hat OpenShift AI: As users look to scale out the serving of LLMs, Red Hat OpenShift AI now supports parallelized serving across multiple nodes with vLLM runtimes, providing the ability to handle multiple requests in real time. Red Hat OpenShift AI also allows users to dynamically change an LLM’s serving parameters, such as sharding the model across multiple GPUs or quantizing the model to a smaller footprint (a serving sketch follows this list). These improvements are aimed at speeding up response times for users, increasing customer satisfaction and lowering churn.

- Supporting Red Hat AI: RHEL AI, along with Red Hat OpenShift AI, underpins Red Hat AI, Red Hat’s portfolio of solutions that accelerate time to market and reduce the operational cost of delivering AI solutions across the hybrid cloud. RHEL AI supports individual Linux server environments, while Red Hat OpenShift AI powers distributed Kubernetes platform environments and provides integrated machine learning operations (MLOps) capabilities. The two solutions are compatible with each other, and Red Hat OpenShift AI will incorporate all of RHEL AI’s capabilities so they can be delivered at scale.
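
As a rough illustration of the Docling workflow described above, the following sketch converts a PDF into Markdown for InstructLab knowledge tuning. It assumes the upstream docling Python package and its DocumentConverter API, plus a local sample.pdf; the exact tooling packaged with RHEL AI 1.3 may differ.

```python
# Minimal sketch: convert a PDF to Markdown with the upstream Docling library.
# Assumes the "docling" package is installed and sample.pdf exists locally.
from docling.document_converter import DocumentConverter

converter = DocumentConverter()
result = converter.convert("sample.pdf")               # parse the PDF into a structured document
markdown_text = result.document.export_to_markdown()   # serialize that structure as Markdown

# Write the Markdown out so it can be fed into InstructLab data preparation.
with open("sample.md", "w", encoding="utf-8") as f:
    f.write(markdown_text)
```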

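As a rough illustration of the serving options called out above, the sketch below uses the upstream vLLM library to load a Granite 3.0 8B Instruct checkpoint sharded across two GPUs. The model ID, GPU count, quantization option and sampling settings are illustrative assumptions, not RHEL AI or OpenShift AI defaults.

```python
# Minimal sketch: multi-GPU, optionally quantized serving with the vLLM runtime.
# Assumes two local GPUs and access to the Hugging Face checkpoint named below.
from vllm import LLM, SamplingParams

llm = LLM(
    model="ibm-granite/granite-3.0-8b-instruct",  # assumed Granite 3.0 8b checkpoint ID
    tensor_parallel_size=2,                       # shard the model across two GPUs
    dtype="bfloat16",
    # quantization="fp8",                         # optionally serve a quantized variant for a smaller footprint
)

params = SamplingParams(temperature=0.2, max_tokens=256)
outputs = llm.generate(
    ["Summarize the new data preparation features in RHEL AI 1.3."],
    params,
)
print(outputs[0].outputs[0].text)
```
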
RHEL AI 1.3 is now generally available.
