Backslash Security announced significant adoption of the Backslash App Graph, the industry’s first dynamic digital twin for application code.
Over the past couple of years, enterprise developers have experienced an unprecedented pace of learning and new demands as generative AI has emerged and related projects have piled up on their desks. Developers are using the new technology, but they are also being asked to build it: to create custom applications that harness generative AI.
In 2025, this trend will only accelerate: Nearly two-thirds of organizations in a recent IBM survey plan to increase their AI investments this year. While many developers spent 2023 and 2024 experimenting with large language models and other generative AI, enterprises now want to implement the technology at scale. They are particularly keen to introduce AI agents to their organizations.
Enterprises have good reason to invest in generative AI. But the developers leading the charge are often apprehensive. That's because while enterprise AI applications eventually make workflows simpler, building those applications can be complex and challenging. Indeed, a new survey published in January by IBM and Morning Consult reveals several hurdles AI developers are facing, from an overwhelming number of dev tools to a dearth of skills.
The survey found that many AI developers are overwhelmed by "tool sprawl," the need to master and harness an untenable number of apps. These same developers also face skills gaps: Less than one quarter (24%) of traditional application developers surveyed ranked themselves as "experts" in generative AI. Despite this, many application developers are still expected to lead the charge on generative AI adoption. Meanwhile, the community lacks alignment around frameworks and governance: Survey respondents cited the need for a standardized AI development process and an AI lifecycle that ensures transparency and traceability of data.
What can enterprises and their developers do in 2025 to clear these hurdles?
Below are four tips, informed by IBM's recent survey and our own experience in the field, to simplify the AI stack and unlock generative AI's potential.
Choose versatile tools
Almost three quarters of developers surveyed use between five and 15 tools when creating an enterprise AI application, and 13% use more than 15. That's an unmanageable number, especially since these tools can change their features, availability, or APIs at a moment's notice. In 2025, point solutions will remain relevant, but AI developers should also seek integrated solutions that can cover everything from SDKs and RAG frameworks to agentic workflows and advanced tuning.
Less time spent learning multiple tools, and less effort spent integrating and maintaining point solutions, mean more time spent building meaningful AI applications. IBM's survey also reveals that developers prize four qualities when selecting an AI development tool: performance, flexibility, ease of use, and integration.
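To make the "integrated solutions" point concrete, here is a minimal, hypothetical sketch of a single stack object that covers both RAG-style retrieval and a simple agent-style tool call. The class name, method names, and toy keyword search are placeholders for illustration only, not any particular vendor's SDK.

```python
# Minimal sketch: one hypothetical "integrated stack" object covering retrieval (RAG)
# and a simple agent-style tool call. All names and logic are illustrative placeholders.

from dataclasses import dataclass, field

@dataclass
class IntegratedAIStack:
    """Stands in for a single SDK that bundles RAG and agent workflows."""
    documents: list[str] = field(default_factory=list)

    def index(self, docs: list[str]) -> None:
        # A real framework would embed and store vectors; here we keep raw text.
        self.documents.extend(docs)

    def retrieve(self, query: str, k: int = 2) -> list[str]:
        # Naive keyword-overlap scoring as a stand-in for vector search.
        scored = sorted(
            self.documents,
            key=lambda d: len(set(query.lower().split()) & set(d.lower().split())),
            reverse=True,
        )
        return scored[:k]

    def run_agent(self, task: str, tools: dict) -> str:
        # Toy "agent" loop: pick the first tool whose name appears in the task,
        # hand it the task plus retrieved context.
        for name, fn in tools.items():
            if name in task.lower():
                context = self.retrieve(task)
                return fn(task, context)
        return "No matching tool found."

if __name__ == "__main__":
    stack = IntegratedAIStack()
    stack.index(["Quarterly revenue grew 12%.", "Support tickets dropped in Q3."])
    answer = stack.run_agent(
        "summarize quarterly revenue",
        tools={"summarize": lambda task, ctx: f"Summary based on: {ctx}"},
    )
    print(answer)
```

The toy implementation is beside the point; what matters is that one interface handles indexing, retrieval, and agent orchestration, which is the kind of consolidation that reduces tool sprawl.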
Leverage coding assistants
Developers should use existing generative AI applications when building new ones, specifically coding assistants, which provide a major productivity boost. This potential isn't lost on developers: IBM's survey found that 99% of AI developers already use coding assistants in some capacity, and about one quarter say the tools save them three hours or more each day. In 2025, developers who haven't yet adopted coding assistants should do so to save time and increase their productivity.
Opt for openness
One reliable way to simplify the AI stack is to make it more open, introducing the transparency and traceability that developers crave. Openness also helps eliminate the anxiety that building atop "black box" proprietary models can generate: When a model is open, everything from data provenance to security becomes easier to document.
Skill up, but manage expectations
Traditional application developers shouldn't bristle at the idea of developing enterprise AI applications. These applications are the future of enterprise software, so it's crucial for developers to expand their skill sets in LLM lifecycle management, fine-tuning, data stewardship, and coding productivity generally. However, it is also important for developers to manage their enterprises' expectations: building effective enterprise AI applications requires skills accrued over time, not a day's work.
The rise of generative AI for the enterprise and the desire for AI agents will keep developers busy in 2025. But busy doesn't have to mean frustrated and overwhelmed. In the coming months, the most prudent developers will prepare themselves for this trend by choosing the right tools, tactics, and training — setting themselves and their enterprises up for success.
Industry News
SmartBear launched API Hub for Test, a new capability within the company’s API Hub, powered by Swagger.
Akamai Technologies introduced App & API Protector Hybrid.
Veracode has been granted a United States patent for its generative artificial intelligence security tool, Veracode Fix.
Zesty announced that its automated Kubernetes optimization platform, Kompass, now includes full pod scaling capabilities, with the addition of Vertical Pod Autoscaler (VPA) alongside the existing Horizontal Pod Autoscaler (HPA).
Check Point® Software Technologies Ltd. has emerged as a leading player in Attack Surface Management (ASM) with its acquisition of Cyberint, as highlighted in the recent GigaOm Radar report.
GitHub announced the general availability of security campaigns with Copilot Autofix to help security and developer teams rapidly reduce security debt across their entire codebase.
DX and Spotify announced a partnership to help engineering organizations achieve higher returns on investment and business impact from their Spotify Portal for Backstage implementation.
Appfire announced its launch of the Appfire Cloud Advantage Alliance.
Salt Security announced API integrations with the CrowdStrike Falcon® platform to enhance and accelerate API discovery, posture governance and threat protection.
Lucid Software has acquired airfocus, an AI-powered product management and roadmapping platform designed to help teams prioritize and build the right products faster.
StackGen has partnered with Google Cloud Platform (GCP) to bring its platform to the Google Cloud Marketplace.
Tricentis announced its spring release of new cloud capabilities for the company’s AI-powered, model-based test automation solution, Tricentis Tosca.
AutonomyAI announced its launch from stealth with $4 million in pre-seed funding.