Speaker Details

Simon Ritter

Azul

Simon Ritter is the Deputy CTO of Azul Systems. Simon has been in the IT business since 1984 and holds a Bachelor of Science degree in Physics from Brunel University in the U.K. 


Simon joined Sun Microsystems in 1996 and started working with Java technology from JDK 1.0; he has spent time working in both Java development and consultancy. Having moved to Oracle as part of the Sun acquisition, he managed the Java Evangelism team for the core Java platform. Now at Azul, he continues to help people understand Java as well as Azul’s JVM technologies and products. Simon has twice been awarded Java Rockstar status at JavaOne and is a Java Champion. He represents Azul on the Java SE Expert Group, JCP Executive Committee, OpenJDK Vulnerability Group and Adoptium Steering Committee.

Java on CRaC: Superfast JVM Application Startup

Conference
Java

One of the key benefits of a microservice architecture is the ability to respond dynamically to changes in load by spinning up new instances as required. However, when deploying JVM-based services, the warmup time of JIT compilation can limit the effectiveness of this technique.


One approach to solving this problem is using native images, where the service is statically compiled using an ahead-of-time (AOT) compiler. Although this does reduce startup time and footprint, it does so at the cost of overall performance.


A new OpenJDK project called CRaC (Coordinated Restore at Checkpoint) has been proposed and accepted. Its goal is to research the coordination of Java programs with mechanisms to checkpoint (snapshot) a running application. Restoring from the snapshot could solve some of the problems with startup and warmup times, especially in microservices.


In this session, we’ll look at the main challenges for such a project, potential solutions and the results from a proof-of-concept implementation.
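The coordination CRaC proposes can be sketched as a standalone model: resources register callbacks so they can release OS-level state (sockets, file handles) before a checkpoint and reacquire it after restore. The names below deliberately echo the org.crac API (`Resource`, `beforeCheckpoint`, `afterRestore`), but this is a simplified, self-contained illustration, not the real library.

```java
import java.util.ArrayList;
import java.util.List;

// A resource that must participate in checkpoint/restore coordination.
interface Resource {
    void beforeCheckpoint() throws Exception; // e.g. close a connection pool
    void afterRestore() throws Exception;     // e.g. reopen it in the new process
}

// Tracks registered resources and drives the callbacks around a checkpoint.
class CheckpointContext {
    private final List<Resource> resources = new ArrayList<>();

    void register(Resource r) { resources.add(r); }

    // Called just before the process image is snapshotted.
    void beforeCheckpoint() throws Exception {
        for (Resource r : resources) r.beforeCheckpoint();
    }

    // Called after the image has been restored in a new process.
    void afterRestore() throws Exception {
        for (Resource r : resources) r.afterRestore();
    }
}

public class CracSketch {
    static final List<String> events = new ArrayList<>();

    public static void main(String[] args) throws Exception {
        CheckpointContext ctx = new CheckpointContext();
        ctx.register(new Resource() {
            public void beforeCheckpoint() { events.add("closed connection pool"); }
            public void afterRestore()     { events.add("reopened connection pool"); }
        });
        ctx.beforeCheckpoint(); // the process snapshot would be taken here...
        ctx.afterRestore();     // ...and this runs after restoring from it
        events.forEach(System.out::println);
    }
}
```

The key design point is that the JVM alone cannot know which application state is safe to snapshot; the callbacks let each resource prepare itself, which is the "coordinated" part of the project's name.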


Cloud Native Compiler: JIT-as-a-Service

Byte Size
Java

Adaptive, just-in-time (JIT) compilation provides a massive performance improvement to JVM-based applications compared to only using an interpreter. The downside is that applications must compile frequently used methods while they are running, which can reduce throughput and slow response times. Another drawback is that each time an application starts, it must repeat the same analysis to identify hot methods and compile them.
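The warmup effect described above can be made visible with a tiny benchmark: the first call to a method runs interpreted, and after enough invocations HotSpot's JIT compiles it, so later calls typically run much faster. The method and iteration counts here are illustrative, and the printed timings are machine-dependent.

```java
public class WarmupDemo {
    // A small hot method for the JIT to optimize.
    static long sumTo(long n) {
        long s = 0;
        for (long i = 1; i <= n; i++) s += i;
        return s;
    }

    public static void main(String[] args) {
        // Cold: the very first call runs interpreted.
        long t0 = System.nanoTime();
        sumTo(10_000_000L);
        long cold = System.nanoTime() - t0;

        // Repeated calls push the method past HotSpot's invocation
        // thresholds, triggering JIT compilation.
        for (int i = 0; i < 50; i++) sumTo(10_000_000L);

        // Warm: the same call after compilation.
        long t1 = System.nanoTime();
        sumTo(10_000_000L);
        long warm = System.nanoTime() - t1;

        System.out.printf("cold=%dus warm=%dus%n", cold / 1_000, warm / 1_000);
    }
}
```

Running this with `-XX:+PrintCompilation` shows when `sumTo` is compiled; the point of the abstract is that this entire analyze-and-compile cycle is repeated from scratch on every fresh JVM start.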


In this session, we'll look at Azul's work to move the JIT compiler into a centralized service that can be shared by many JVMs.  This provides many advantages, such as caching compiled code for instant delivery when restarting the same application or spinning up new instances of the same service.  In addition, it removes the workload from individual JVMs, allowing them to deliver more transactions per second of application work.  Finally, there is the opportunity to apply considerably more compute resources to enable complex optimizations to be used that wouldn't be practical in a single JVM.