Project Loom: Understand the new Java concurrency model

The Loom docs present the example seen in Listing 3, which provides a good mental picture of how this works. However, forget about automagically scaling up to a million virtual threads in real-life scenarios without knowing what you are doing. The problem with real applications is that they do things like calling databases, working with the file system, executing REST calls, or talking to some sort of queue or stream.

  • Without structured concurrency, multi-threaded applications are more error-prone when subtasks are shut down or canceled in the wrong order, and harder to understand, he said.
  • For example, if a request takes two seconds and we limit the thread pool to 1,000 threads, then a maximum of 500 requests per second could be answered.
  • Project Loom features a lightweight concurrency construct for Java.
  • One feature introduced by Project Loom is virtual threads.
  • Concurrency is about dealing with many tasks at the same time, whereas parallelism is about performing a task faster by using more resources, such as multiple processing units.
  • Virtual threads are a software construct that’s built into the JVM, or that will be built into the JVM; a minimal example follows this list.
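A minimal sketch of starting such a virtual thread with the standard JDK API (Java 21+); nothing beyond the platform is assumed:

```java
// Minimal sketch: starting a single virtual thread with the standard JDK API (Java 21+).
public class VirtualThreadHello {
    public static void main(String[] args) throws InterruptedException {
        Thread vt = Thread.startVirtualThread(() ->
                System.out.println("running in " + Thread.currentThread()));
        vt.join(); // wait for the virtual thread to finish before the JVM exits
    }
}
```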

To demo it, we use a very simple task that waits for one second before printing a message to the console. We are creating this task to keep the example simple so we can focus on the concept. Let us understand the difference between the two kinds of threads when they are submitted with the same executable code; a sketch of such a comparison follows. In this case, too, the exception is not propagated to the parent thread.
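Since the article’s own listing is not reproduced here, the following is a hedged sketch of that comparison: the same one-second task submitted once to a classic fixed platform-thread pool and once to a virtual-thread-per-task executor (Java 21+). The task count and pool size are illustrative assumptions.

```java
// Hedged sketch: the same blocking one-second task on platform threads vs. virtual threads.
import java.time.Duration;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SleepingTaskDemo {
    static final Runnable TASK = () -> {
        try {
            Thread.sleep(Duration.ofSeconds(1));   // simulate blocking work
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        System.out.println("done on " + Thread.currentThread());
    };

    public static void main(String[] args) {
        // Platform threads: 10,000 tasks either need a huge pool or take a long time.
        try (ExecutorService platform = Executors.newFixedThreadPool(200)) {
            for (int i = 0; i < 10_000; i++) platform.submit(TASK);
        }
        // Virtual threads: one cheap thread per task, same submission code.
        try (ExecutorService virtual = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 10_000; i++) virtual.submit(TASK);
        }
    }
}
```

The try-with-resources blocks close each executor, which waits for the submitted tasks to complete before the next block starts.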

2. Avoid using Thread-local Variables
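The heading above states the recommendation without showing why. As a hedged illustration: a ThreadLocal keeps one value per thread, so with potentially millions of virtual threads the per-thread copies add up. The cached buffer below is a hypothetical example of the pattern to be careful with, not code from the article.

```java
// Hedged illustration: per-thread ThreadLocal state multiplied by a million virtual threads.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ThreadLocalCaution {
    // One 8 KB buffer per thread: cheap with a small pool, costly with a million threads.
    static final ThreadLocal<byte[]> BUFFER =
            ThreadLocal.withInitial(() -> new byte[8 * 1024]);

    public static void main(String[] args) {
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 1_000_000; i++) {
                executor.submit(() -> BUFFER.get().length);  // each task gets its own copy
            }
        }
    }
}
```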

This article discusses the problems in Java’s current concurrency model and how Project Loom aims to change them. We also explored tasks and schedulers in threads, and how the Fiber class and pluggable user-mode schedulers can be an excellent alternative to traditional threads in Java. More importantly, every platform thread you create in your Java Virtual Machine consumes roughly 1 megabyte of memory, and that memory lives outside the heap. No matter how much heap you allocate, you have to factor in the extra memory consumed by your threads. This is a significant cost you pay every time you create a thread, and it is why we have thread pools.
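A rough back-of-the-envelope sketch of that memory point, using the article’s ~1 MB-per-thread approximation (actual stack sizes depend on the operating system and the -Xss setting):

```java
// Rough sketch of the off-heap cost of platform-thread stacks; the 1 MB figure is
// the article's approximation, not a measured value.
public class ThreadMemoryEstimate {
    public static void main(String[] args) {
        long perThreadBytes = 1L << 20;          // ~1 MB of stack per platform thread
        long threads = 10_000;
        System.out.printf("~%d MB of stack memory for %d platform threads%n",
                (perThreadBytes * threads) >> 20, threads);
        // This cost is independent of -Xmx, which only bounds the heap; it is one
        // reason classic designs cap thread counts with a fixed-size pool.
    }
}
```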


Already, Java and its primary server-side competitor Node.js are neck and neck in performance. An order of magnitude boost to Java performance in typical web app use cases could alter the landscape for years to come. It will be fascinating to watch as Project Loom moves into the main branch and evolves in response to real-world use. As this plays out, and the advantages inherent in the new system are adopted into the infrastructure that developers rely on (think Java app servers like Jetty and Tomcat), we could see a sea change in the Java ecosystem. At a high level, a continuation is a representation in code of the execution flow. In other words, a continuation allows the developer to manipulate the execution flow by calling functions.
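To make the notion of a continuation concrete, here is a hedged sketch using Loom’s internal jdk.internal.vm.Continuation API. This API is not public and may change between JDK builds; compiling and running it requires flags along the lines of --add-exports java.base/jdk.internal.vm=ALL-UNNAMED.

```java
// Hedged sketch of Loom's internal continuation API; not a public, supported interface.
import jdk.internal.vm.Continuation;
import jdk.internal.vm.ContinuationScope;

public class ContinuationDemo {
    public static void main(String[] args) {
        ContinuationScope scope = new ContinuationScope("demo");
        Continuation cont = new Continuation(scope, () -> {
            System.out.println("step 1");
            Continuation.yield(scope);   // suspend here, control returns to the caller
            System.out.println("step 2");
        });

        cont.run();   // prints "step 1", then suspends at yield
        System.out.println("back in the caller");
        cont.run();   // resumes after yield, prints "step 2"
    }
}
```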


In response to these drawbacks, many asynchronous libraries have emerged in recent years, for example using CompletableFuture. As have entire reactive frameworks, such as RxJava, Reactor, or Akka Streams. While they all make far more effective use of resources, developers need to adapt to a somewhat different programming model. Many developers perceive the different style as “cognitive ballast”. Instead of dealing with callbacks, observables, or flows, they would rather stick to a sequential list of instructions.
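A hedged sketch of that contrast follows. The fetchUser and fetchOrders calls are hypothetical placeholders, stubbed so the sketch compiles; the point is only the shape of the code in each style.

```java
// Hedged sketch: asynchronous callback style vs. plain sequential style on virtual threads.
import java.util.List;
import java.util.concurrent.CompletableFuture;

public class StylesComparison {
    record User(String id) {}

    // Asynchronous style: the flow is expressed as a chain of callbacks.
    CompletableFuture<List<String>> ordersAsync(String userId) {
        return fetchUserAsync(userId)
                .thenCompose(user -> fetchOrdersAsync(user))
                .exceptionally(ex -> List.of());
    }

    // Sequential style: with cheap virtual threads, plain blocking calls are fine again.
    List<String> ordersBlocking(String userId) {
        User user = fetchUser(userId);   // blocks, but only parks the virtual thread
        return fetchOrders(user);
    }

    // --- hypothetical helpers, stubbed so the sketch compiles ---
    CompletableFuture<User> fetchUserAsync(String id) { return CompletableFuture.completedFuture(new User(id)); }
    CompletableFuture<List<String>> fetchOrdersAsync(User u) { return CompletableFuture.completedFuture(List.of("order-1")); }
    User fetchUser(String id) { return new User(id); }
    List<String> fetchOrders(User u) { return List.of("order-1"); }
}
```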


It can hardly be faster, because every task waits one second. However, anyone who has had to maintain code in that style knows that reactive code is many times more complex than sequential code – and absolutely no fun. Virtual threads can utilize the CPU more efficiently, and overall resource utilization is much better.


By the way, this effect has become relatively worse with modern, complex CPU architectures that have multiple cache layers (“non-uniform memory access”, or NUMA for short). With virtual threads, on the other hand, it is no problem to start a whole million threads. Listing 2 will run on the Project Loom JVM without any problems.
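Listing 2 itself is not reproduced in this text. As a hedged approximation, Loom demos of this kind usually look something like the following: a million virtual threads that each sleep for a second, started without exhausting memory.

```java
// Hedged approximation of a typical "million virtual threads" Loom demo.
import java.time.Duration;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.stream.IntStream;

public class MillionThreads {
    public static void main(String[] args) {
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            IntStream.range(0, 1_000_000).forEach(i ->
                    executor.submit(() -> {
                        Thread.sleep(Duration.ofSeconds(1));
                        return i;
                    }));
        } // close() waits for all tasks; with platform threads this would not get far
    }
}
```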

RestTemplate underneath uses the Apache HTTP client, which uses sockets, and the socket implementation has been rewritten so that every time you block, or wait to read or write data, you are actually suspending your virtual thread. It seems like RestTemplate, or any other blocking API, is exciting again; a sketch follows below. At least that’s what we might think: you no longer need reactive programming and all these WebFluxes, RxJavas, Reactors, and so on. User threads and kernel threads aren’t actually the same thing. User threads are created by the JVM every time you call new Thread().start(). In the very prehistoric days, at the very beginning of the Java platform, there used to be a mechanism called the many-to-one model.
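A hedged sketch of that idea: plain blocking RestTemplate calls submitted to a virtual-thread-per-task executor. The URL and the surrounding setup are illustrative assumptions, not code from the article, and Spring is assumed to be on the classpath.

```java
// Hedged sketch: blocking HTTP client calls on virtual threads.
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.springframework.web.client.RestTemplate;

public class BlockingClientOnVirtualThreads {
    public static void main(String[] args) {
        RestTemplate restTemplate = new RestTemplate();
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < 1_000; i++) {
                executor.submit(() -> {
                    // Blocks on socket I/O, but only parks the virtual thread,
                    // freeing its carrier thread for other work.
                    String body = restTemplate.getForObject(
                            "https://example.org/api/resource", String.class);
                    System.out.println(body == null ? 0 : body.length());
                });
            }
        }
    }
}
```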


Apart from the number of threads, latency is also a big concern. If you look closely, in today’s world of microservices a request is served by fetching and updating data on multiple systems and servers. While the application waits for the information from other servers, the current platform thread remains in an idle state. This is a waste of computing resources and a major hurdle to achieving a high-throughput application. Continuations have a justification beyond virtual threads and are a powerful construct to influence the flow of a program.
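A hedged sketch of that microservices fan-out: one request that needs data from several downstream systems, with each blocking call parking only a cheap virtual thread instead of idling a platform thread. The downstream calls are hypothetical stand-ins.

```java
// Hedged sketch: fanning out to several downstream services on virtual threads.
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class FanOutHandler {
    String handleRequest() throws Exception {
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            List<Callable<String>> calls = List.of(
                    this::fetchInventory,   // each of these would block on I/O
                    this::fetchPricing,
                    this::fetchCustomer);
            List<Future<String>> results = executor.invokeAll(calls);
            StringBuilder response = new StringBuilder();
            for (Future<String> f : results) response.append(f.get()).append(' ');
            return response.toString();
        }
    }

    // hypothetical downstream calls, stubbed so the sketch compiles
    String fetchInventory() { return "inventory"; }
    String fetchPricing()   { return "pricing"; }
    String fetchCustomer()  { return "customer"; }
}
```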


Hence, the biggest gains should be seen in I/O-heavy systems, while CPU-heavy applications won’t see much improvement from using Loom. A prevalent issue with the current thread implementation is that it can limit the application’s throughput to well below what modern hardware can handle. Consider a main function that calls foo, and foo in turn calls bar; a sketch of this call chain running on a virtual thread follows.
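A minimal sketch of that call chain on a virtual thread. The method names main, foo, and bar come from the text; the one-second sleep inside bar is an illustrative assumption showing that a blocking call deep in the chain suspends the whole stack as a unit.

```java
// Minimal sketch: main -> foo -> bar on a virtual thread, blocking deep in the chain.
public class CallChainDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread vt = Thread.startVirtualThread(CallChainDemo::foo);
        vt.join();
    }

    static void foo() {
        bar();
        System.out.println("back in foo");
    }

    static void bar() {
        try {
            Thread.sleep(1_000);   // parks the virtual thread; the carrier is freed
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        System.out.println("bar resumed");
    }
}
```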

Exceptions and structured concurrency

You have coroutines or goroutines in languages like Kotlin and Go. All of these are actually very similar concepts, which are finally being brought into the JVM. It used to be simply a function that blocks your current thread, so that the thread still exists on your operating system but no longer runs until the operating system wakes it up again. There is now a new version that takes advantage of virtual threads: notice that if you’re currently running a virtual thread, a different piece of code is run.
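To tie this back to the section heading and the earlier bullet about subtasks being shut down or canceled in the wrong order, here is a hedged sketch of structured concurrency as it appeared in the JDK 21 preview (it requires --enable-preview, and the API has kept changing in later releases). If either subtask fails, the other is cancelled and the exception is propagated to the parent. findUser and fetchOrder are hypothetical placeholders.

```java
// Hedged sketch of structured concurrency (JDK 21 preview API).
import java.util.concurrent.StructuredTaskScope;

public class StructuredDemo {
    record Response(String user, String order) {}

    Response handle() throws Exception {
        try (var scope = new StructuredTaskScope.ShutdownOnFailure()) {
            var user  = scope.fork(this::findUser);
            var order = scope.fork(this::fetchOrder);

            scope.join()            // wait for both subtasks
                 .throwIfFailed();  // rethrow the first failure, if any

            return new Response(user.get(), order.get());
        } // leaving the scope guarantees both subtasks are done or cancelled
    }

    // hypothetical stand-ins for blocking calls
    String findUser()   { return "user-42"; }
    String fetchOrder() { return "order-7"; }
}
```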

Using a virtual thread based executor is a viable alternative to Tomcat’s standard thread pool. The benefits of switching to a virtual thread executor are marginal in terms of container overhead. At high levels of concurrency when there were more concurrent tasks than processor cores available, the virtual thread executor again showed increased performance. This was more noticeable in the tests using smaller response bodies. An unexpected result seen in the thread pool tests was that, more noticeably for the smaller response bodies, 2 concurrent users resulted in fewer average requests per second than a single user.
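One common, hedged way to try this in a Spring Boot application is sketched below: replacing the embedded Tomcat’s worker pool with a virtual-thread-per-task executor via a TomcatProtocolHandlerCustomizer bean. The class and bean names here are illustrative, and newer Spring Boot versions also expose a spring.threads.virtual.enabled property that achieves the same thing.

```java
// Hedged sketch: serving each request on its own virtual thread in embedded Tomcat.
import java.util.concurrent.Executors;
import org.apache.coyote.ProtocolHandler;
import org.springframework.boot.web.embedded.tomcat.TomcatProtocolHandlerCustomizer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class VirtualThreadTomcatConfig {
    @Bean
    TomcatProtocolHandlerCustomizer<ProtocolHandler> virtualThreadExecutor() {
        // Each incoming request gets its own cheap virtual thread.
        return handler -> handler.setExecutor(Executors.newVirtualThreadPerTaskExecutor());
    }
}
```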

Why Do We Need Virtual Threads?

However, there’s a whole bunch of APIs, most importantly the file API, that do not play well with Project Loom, so it’s easy to shoot yourself in the foot; a sketch of how to spot such cases follows. Virtual threads under Project Loom also require minimal changes to code, which will encourage adoption in existing Java libraries, Hellberg said.
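A hedged sketch of one way of shooting yourself in the foot: a virtual thread that blocks while holding a monitor gets pinned to its carrier thread. In the JDK 19–21 time frame this could be traced with the jdk.tracePinnedThreads system property, for example java -Djdk.tracePinnedThreads=full PinningDemo; the class below is illustrative only.

```java
// Hedged sketch: blocking inside a synchronized block pins the virtual thread to its carrier.
public class PinningDemo {
    static final Object LOCK = new Object();

    public static void main(String[] args) throws InterruptedException {
        Thread vt = Thread.startVirtualThread(() -> {
            synchronized (LOCK) {          // holding a monitor while blocking...
                try {
                    Thread.sleep(500);     // ...pins the virtual thread; with tracing on,
                } catch (InterruptedException e) {   // the JDK reports the pinned stack
                    Thread.currentThread().interrupt();
                }
            }
        });
        vt.join();
    }
}
```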
