Spectro Cloud completed a $75 million Series C funding round led by Growth Equity at Goldman Sachs Alternatives with participation from existing Spectro Cloud investors.
A recent MIT/BCG study revealed that 84% of those surveyed feel AI is critical to obtaining or sustaining competitive advantage, and three out of four believe that Machine Learning provides an opportunity to enter new businesses and that AI will be the basis for new entrants into their industry. That shouldn't come as a surprise to anyone: advances in GPU/TPU technology and the development of new platforms and frameworks have enabled an explosion in AI and Machine Learning, while offerings from Amazon, Microsoft, and others have put pre-built frameworks firmly within the grasp of developers. Despite all this movement, however, we are still very early in the transition to using AI to transform software development, commonly referred to as Software 2.0 or AIOps.
Tesla is one shining example of how early we are, and of just how much in-house expertise an organization needs before it reaches the level of maturity required to take on this advanced, yet still esoteric, technology. Tesla uses computer vision and other Machine Learning algorithms to enable its vehicles to make thousands of decisions a millisecond. Most companies have nowhere near the expertise in Artificial Intelligence or Machine Learning to take on this level of complexity on their own. But we remain optimistic, since Tesla's success so far shows what's possible in the near future.
The difficulty inherent in the transformation from DevOps to AIOps is that the two methodologies are not even close to being the same thing. Algorithmia, a company intent on "building the future of Machine Learning infrastructure," is another organization that has already developed a flagship DevOps platform for AI. This tweet from Diego Oppenheimer, CEO and founder of Algorithmia (quoting Mike Anderson, also of Algorithmia), illustrates what I mean when I say DevOps and AIOps are not one and the same: "Expecting your engineering and DevOps teams to deploy ML models well is like showing up to SeaWorld with a giraffe, since they are already handling large mammals."
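The quip lands because shipping a model really is a different kind of cargo. As a rough, hypothetical illustration (a Swift sketch, and not Algorithmia's platform or API), the snippet below shows one concern an ML deployment carries that an ordinary stateless service does not: the model is a versioned, measured artifact that has to clear a quality gate before it can go live.

```swift
import Foundation

// Hypothetical sketch, not any vendor's API. It shows one concrete way
// deploying an ML model differs from deploying ordinary stateless code:
// the deployable artifact carries a version, training metadata, and a
// quality gate that must pass before it is promoted to serve traffic.
struct ModelArtifact {
    let name: String
    let version: String
    let trainedOn: Date
    let validationAccuracy: Double
}

struct ModelRegistry {
    private(set) var live: ModelArtifact?
    let minimumAccuracy: Double

    init(minimumAccuracy: Double) {
        self.minimumAccuracy = minimumAccuracy
    }

    // Promote a candidate only if it clears the quality gate; otherwise keep
    // serving the previous version. Encoding this rollback path is routine
    // in ML deployment and rarely needed for a stateless service.
    mutating func promote(_ candidate: ModelArtifact) -> Bool {
        guard candidate.validationAccuracy >= minimumAccuracy else { return false }
        live = candidate
        return true
    }
}

var registry = ModelRegistry(minimumAccuracy: 0.90)
let candidate = ModelArtifact(name: "churn-predictor",
                              version: "2.1.0",
                              trainedOn: Date(),
                              validationAccuracy: 0.93)
print(registry.promote(candidate))       // true: the candidate clears the gate
print(registry.live?.version ?? "none")  // "2.1.0"
```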
The low-code Lego models may be faster to assemble, but that doesn't mean the result is optimized or efficient once you piece all the Legos together into a full-blown application, though it's possible these components will improve over time. Some of the advantages of this approach can also be achieved (though perhaps without the continuous improvement that comes from evaluating the quality of the code) through Reusable Component Libraries.
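As a rough illustration of the Reusable Component Library idea, here is a minimal Swift sketch; the CurrencyFormatter component and its API are hypothetical, but they show how a single well-tested piece of logic can be published once and reused across applications instead of being rebuilt in each one.

```swift
import Foundation

// A minimal sketch of a Reusable Component Library entry: one well-tested
// piece of logic, published once in a shared library and consumed by many
// applications. The CurrencyFormatter type and its API are hypothetical.
public struct CurrencyFormatter {
    public let currencyCode: String

    public init(currencyCode: String) {
        self.currencyCode = currencyCode
    }

    /// Formats an amount the same way everywhere the component is reused.
    public func string(from amount: Decimal) -> String {
        let formatter = NumberFormatter()
        formatter.numberStyle = .currency
        formatter.currencyCode = currencyCode
        return formatter.string(from: amount as NSDecimalNumber) ?? "\(amount)"
    }
}

// Two different applications reuse the same component instead of each
// re-implementing (and re-testing) the formatting logic on its own.
let checkout = CurrencyFormatter(currencyCode: "USD")
let invoicing = CurrencyFormatter(currencyCode: "EUR")
print(checkout.string(from: 19.99))   // e.g. "$19.99", depending on locale
print(invoicing.string(from: 42))     // e.g. "€42.00", depending on locale
```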
Many companies eager to start down the AI path will necessarily rely on the familiar platform providers immediately available to them to improve and optimize code, such as Microsoft IntelliCode. We've also seen Apple launch SwiftUI, CreateML, and Reality Composer, all products aimed at reducing coding effort, backed by a significant investment in Swift (a far more efficient, declarative syntax that intrinsically requires less code) and in the underlying ML and AR frameworks to pull it off. But as with the Microsoft example, this is being led by the platform providers.
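To illustrate the "less code" claim, here is a minimal, made-up SwiftUI view; the example itself is invented, but the declarative pattern is what SwiftUI provides: you describe the UI for a given state and the framework handles the creation, layout, and updates.

```swift
import SwiftUI

// A minimal, made-up SwiftUI view. The code declares what the UI should look
// like for the current state; the framework takes care of creating, laying
// out, and refreshing the views whenever `count` changes.
struct CounterView: View {
    @State private var count = 0

    var body: some View {
        VStack(spacing: 12) {
            Text("Taps: \(count)")
                .font(.title)
            Button("Tap me") {
                count += 1
            }
        }
        .padding()
    }
}
```

An equivalent UIKit implementation would need a view controller, explicit label and button creation, target-action wiring, and manual label updates whenever the count changes.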
Industry News
The Cloud Native Computing Foundation® (CNCF®), which builds sustainable ecosystems for cloud native software, has announced significant momentum around cloud native training and certifications with the addition of three new project-centric certifications and a series of new Platform Engineering-specific certifications.
Red Hat announced the latest version of Red Hat OpenShift AI, its artificial intelligence (AI) and machine learning (ML) platform built on Red Hat OpenShift that enables enterprises to create and deliver AI-enabled applications at scale across the hybrid cloud.
Salesforce announced agentic lifecycle management tools to automate Agentforce testing, prototype agents in secure Sandbox environments, and transparently manage usage at scale.
OpenText™ unveiled Cloud Editions (CE) 24.4, presenting a suite of transformative advancements in Business Cloud, AI, and Technology to empower the future of AI-driven knowledge work.
Red Hat announced new capabilities and enhancements for Red Hat Developer Hub, Red Hat’s enterprise-grade developer portal based on the Backstage project.
Pegasystems announced the availability of new AI-driven legacy discovery capabilities in Pega GenAI Blueprint™ to accelerate the daunting task of modernizing legacy systems that hold organizations back.
Tricentis launched enhanced cloud capabilities for its flagship solution, Tricentis Tosca, bringing enterprise-ready end-to-end test automation to the cloud.
Rafay Systems announced new platform advancements that help enterprises and GPU cloud providers deliver developer-friendly consumption workflows for GPU infrastructure.
Apiiro introduced Code-to-Runtime, a new capability using Apiiro’s deep code analysis (DCA) technology to map software architecture and trace all types of software components, including APIs, open source software (OSS), and containers, to code owners while enriching them with business impact.
Zesty announced the launch of Kompass, its automated Kubernetes optimization platform.
MacStadium announced the launch of Orka Engine, the latest addition to its Orka product line.
Elastic announced its AI ecosystem to help enterprise developers accelerate building and deploying their Retrieval Augmented Generation (RAG) applications.
Red Hat introduced new capabilities and enhancements for Red Hat OpenShift, a hybrid cloud application platform powered by Kubernetes, as well as the technology preview of Red Hat OpenShift Lightspeed.
Traefik Labs announced API Sandbox as a Service to streamline and accelerate mock API development, and Traefik Proxy v3.2.