JFrog Integrates with NVIDIA NIM
September 17, 2024

JFrog announced a new product integration with NVIDIA NIM microservices, part of the NVIDIA AI Enterprise software platform.

The integration, which pairs the JFrog Artifactory model registry within the JFrog Platform with NVIDIA NIM, is expected to combine GPU-optimized, pre-approved AI models with centralized DevSecOps processes in an end-to-end software supply chain workflow. This would allow organizations to bring secure machine learning (ML) models and large language models (LLMs) to production quickly, with increased transparency, traceability, and trust.

“As organizations rapidly adopt AI technology, it’s essential to implement practices that ensure their efficiency and safety, and that incorporate AI responsibly,” said Gal Marder, EVP Strategy, JFrog. “By integrating DevOps, security, and MLOps processes into an end-to-end software supply chain workflow with NVIDIA NIM microservices, customers will be able to efficiently bring secure models to production while maintaining high levels of visibility, traceability, and control throughout the pipeline.”

“As enterprises scale their generative AI deployments, a central repository can help them rapidly select and deploy models that are approved for development,” said Pat Lee, Vice President, Enterprise Strategic Partnerships, NVIDIA. “The integration of NVIDIA NIM microservices into the JFrog Platform can help developers quickly get fully compliant, performance-optimized models running in production.”

JFrog Artifactory provides a single solution for housing and managing all the artifacts, binaries, packages, files, containers, and components used throughout software supply chains. The JFrog Platform’s integration with NVIDIA NIM is expected to incorporate containerized AI models as software packages into existing software development workflows. By coupling NVIDIA NGC – a hub for GPU-optimized deep learning, ML, and HPC models – with the JFrog Platform and the JFrog Artifactory model registry, organizations will be able to maintain a single source of truth for all software packages and AI models, while leveraging enterprise DevSecOps best practices to gain visibility, governance, and control across their software supply chain.

The integration between the JFrog Platform and NVIDIA NIM is anticipated to deliver multiple benefits, including:

- Unified Management: Centralized access control and management of NIM microservice containers alongside all other assets, including proprietary artifacts and open-source software dependencies, with JFrog Artifactory serving as the model registry and enabling seamless integration with existing DevSecOps workflows.

- Comprehensive Security and Integrity: Continuous scanning at every stage of development – including containers and dependencies – delivering contextual insights across NIM microservices with JFrog auditing and usage statistics that drive compliance.

- Exceptional Model Performance and Scalability: Optimized AI application performance using NVIDIA accelerated computing infrastructure, offering low latency and high throughput for scalable deployment of LLMs to large-scale production environments.

- Flexible Deployment: Multiple deployment options via JFrog Artifactory, including self-hosted, multi-cloud, and air-gapped environments.

