For at least the last ten years, every year has been declared the year of Java's demise. Still, Java is here and as strong as ever. Not only that, it also fits in well alongside state-of-the-art technologies such as cloud platforms, containers, and resource managers.
However, running Java, or any other JVM language, in the cloud comes with hurdles. Microservice architectures, rapid scaling (up or down) to balance load against cost, and environments such as Kubernetes (k8s) all bring their own complexity for Java.
Java and the Cloud
That said, one of the biggest obstacles with Java-based microservices is the slow startup time of Java applications. A lot of this can be mitigated by using microservice frameworks like Quarkus, but the JVM still has heavy lifting to do in class loading and initial Just-in-Time (JIT) compilation.
When talking about startup time and Java, it would be more accurate to say warmup time. Warming up a JVM means feeding real-world workloads to the application so the JVM gathers enough profiling data to optimize the application for top speed under that specific performance profile. This adds to the load time of the JVM and the application itself, and it happens every single time you start the application, even if another instance of the same application is already running.
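You can observe this warmup effect directly. The following minimal sketch (class and method names are illustrative, and the exact timings depend on your JVM and hardware) times the same workload repeatedly; the first rounds run in the interpreter or with lightly optimized code, while later rounds benefit from full JIT optimization:

```java
// Minimal sketch: observe JIT warmup by timing the same workload repeatedly.
// Early rounds are noticeably slower than later, JIT-optimized ones.
public class WarmupDemo {

    // A small CPU-bound workload the JIT can optimize once it becomes "hot".
    static long work(int n) {
        long sum = 0;
        for (int i = 1; i <= n; i++) {
            sum += (long) i * i % 7;
        }
        return sum;
    }

    public static void main(String[] args) {
        for (int round = 1; round <= 10; round++) {
            long start = System.nanoTime();
            long result = work(5_000_000);
            long micros = (System.nanoTime() - start) / 1_000;
            System.out.printf("round %2d: %6d us (result=%d)%n", round, micros, result);
        }
    }
}
```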
It also works against the increasingly popular "let it crash" mentality. For decades we, as engineers, learned to handle exception and failure scenarios, and everyone knows how hard it can be to recover from these issues and get the application back into a meaningful, consistent state. The opposite of that sentiment is to simply let the application crash and restart it. With the JVM, however, this brings us right back to the warmup time issue.
We see a similarly problematic situation when scaling up and down quickly or frequently. When running in the cloud, one of the major cost-efficiency factors is keeping the provisioned resources in line with the current load profile: scale up when load rises, scale down when it drops. Introducing an additional instance of our Java application causes a delay. The warmup delay may be only a minute, but some applications take much longer to warm up before they can serve requests at full speed.
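A common work-around is to run a synthetic warmup loop at startup before the new instance reports itself ready, for example to a Kubernetes readiness probe. The sketch below (class names, workload, and iteration counts are purely illustrative) shows the shape of that pattern and why every scaled-up replica pays the warmup cost again:

```java
// Minimal sketch: a synthetic warmup pass before the instance flags itself ready.
// Every new replica repeats this work, which is exactly the scale-up delay
// described above. Names and thresholds here are illustrative only.
import java.util.concurrent.atomic.AtomicBoolean;

public class WarmupGate {
    private static final AtomicBoolean READY = new AtomicBoolean(false);

    // Stand-in for the real request-handling path we want the JIT to optimize.
    static int handleRequest(int payload) {
        return Integer.rotateLeft(payload * 31, 7) ^ payload;
    }

    static void warmUp(int iterations) {
        int sink = 0;
        for (int i = 0; i < iterations; i++) {
            sink ^= handleRequest(i);
        }
        // Use the result so the loop is not optimized away entirely.
        if (sink == 42) System.out.println("unlikely");
        READY.set(true);
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        warmUp(10_000_000); // time and CPU spent before serving real traffic
        System.out.printf("warmed up in %d ms, ready=%s%n",
                (System.nanoTime() - start) / 1_000_000, READY.get());
    }
}
```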
The Cloud Native Compiler (CNC) Makes Compilation More Efficient
This issue can be mitigated by moving the JIT compiler outside the JVM and sharing its compilation results. Such compilers exist, including the Azul Cloud Native Compiler (CNC).
The JIT compiler runs as its own process, deployed as a separate pod in Kubernetes and available to all running JVM instances. Instead of each JVM repeating the same profiling and compilation work separately, the cloud compiler executes it once, caches the result, and makes it available to multiple JVMs at once, preventing recompilation.
Most importantly, a cloud compiler solves the issue of slow JVM warmups. The profile of the application is known from previous instances, so a newly started JVM has instant access to code already compiled according to that known profile. No JIT compiling. No warming up.
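Conceptually, this is a compile-once, cache, and share scheme. The toy sketch below is not Azul CNC's API or implementation; it only illustrates why a shared cache removes repeated compilation work: the first requester pays the compilation cost, and every later JVM instance gets the cached result immediately.

```java
// Toy sketch of the "compile once, cache, share" idea behind a cloud compiler.
// NOT the Azul CNC API; names and types are illustrative only.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SharedCompileCache {
    // Cache keyed by method identity plus profile; values stand in for compiled code.
    private final Map<String, byte[]> cache = new ConcurrentHashMap<>();

    byte[] getOrCompile(String methodAndProfile) {
        return cache.computeIfAbsent(methodAndProfile, key -> {
            System.out.println("compiling " + key + " (done once, shared afterwards)");
            return simulateExpensiveCompilation(key);
        });
    }

    private byte[] simulateExpensiveCompilation(String key) {
        // Stand-in for profile-guided JIT compilation.
        return key.getBytes();
    }

    public static void main(String[] args) {
        SharedCompileCache compiler = new SharedCompileCache();
        // The first "JVM instance" triggers compilation; the second reuses the result.
        compiler.getOrCompile("OrderService.process#profileA");
        compiler.getOrCompile("OrderService.process#profileA"); // served from cache
    }
}
```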
That not only saves precious startup time; it also avoids wasting CPU and RAM on the same compilation work every single time.
Less Anxiety with Java and the Cloud
What this all comes down to is less anxiety when running Java microservices in the cloud. Fast scaling of services is no longer an issue, and thanks to the pre-cached, pre-compiled code, we save valuable resources for what our containers are actually meant to do: run the service.
That in turn means easier maintainability (no warmup processes), simpler deployments, and cost savings on many levels (people hours, resources, time). The speed and volume of transactions demand the cloud's limitless resources, and cloud native JVMs harness the cloud to great effect.