38 How to Choose the Right Blocking Queue for Yourself #

In this lesson, we will mainly discuss how to choose the right blocking queue for yourself.

When it comes to choosing the most suitable blocking queue, thread pools have set a good example for us. There are many types of thread pools, and each type chooses a blocking queue that is suitable for its own characteristics.

So let’s first review how these classic thread pools select their blocking queues. After learning from their experiences, we can summarize a set of rules to consider when choosing a blocking queue for ourselves.

Thread Pool’s Selection of Blocking Queues #

| Thread pool | Blocking queue |
| --- | --- |
| FixedThreadPool | LinkedBlockingQueue |
| SingleThreadExecutor | LinkedBlockingQueue |
| CachedThreadPool | SynchronousQueue |
| ScheduledThreadPool | DelayedWorkQueue |
| SingleThreadScheduledExecutor | DelayedWorkQueue |

Let’s take a look at how each thread pool selects its blocking queue. The table above lists the thread pools on the left and their corresponding blocking queues on the right. You can see that only 3 types of blocking queues cover the 5 types of thread pools. Let’s introduce them one by one.

  • FixedThreadPool (the same applies to SingleThreadExecutor) chooses LinkedBlockingQueue

Unlike ArrayBlockingQueue, whose capacity is fixed, LinkedBlockingQueue is a linked-list-based queue whose default capacity is Integer.MAX_VALUE, so in practice it behaves as an unbounded queue.

Since the number of threads in FixedThreadPool is fixed, it cannot create additional threads to help process tasks when the workload increases. It therefore needs an effectively unbounded queue like LinkedBlockingQueue to hold the tasks that have not yet been processed.

If all the corePoolSize threads are busy, new tasks enter the blocking queue and wait. Since the queue is effectively unbounded, it will not fill up in practice. This ensures that, for FixedThreadPool and SingleThreadExecutor, new tasks will not be rejected and no tasks will be lost.
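
To make this concrete, here is roughly how the JDK’s Executors factory builds a fixed-size pool. This is a simplified sketch (the exact code may vary by JDK version): both corePoolSize and maximumPoolSize are set to the same value, and the work queue is an unbounded LinkedBlockingQueue.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class FixedPoolSketch {
    // Roughly equivalent to Executors.newFixedThreadPool(nThreads):
    // a fixed number of threads backed by an effectively unbounded queue.
    static ExecutorService newFixedThreadPool(int nThreads) {
        return new ThreadPoolExecutor(
                nThreads, nThreads,                   // core == max, so the pool never grows
                0L, TimeUnit.MILLISECONDS,            // idle core threads are kept alive
                new LinkedBlockingQueue<Runnable>()); // default capacity Integer.MAX_VALUE
    }
}
```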

  • CachedThreadPool chooses SynchronousQueue

For CachedThreadPool, in order to avoid rejecting newly submitted tasks, it uses an effectively unlimited maximumPoolSize (Integer.MAX_VALUE). Because it can always create a new thread for an incoming task, it does not need extra space to store pending tasks.

SynchronousQueue directly hands over the tasks to the threads without saving them separately, which is more efficient. Therefore, CachedThreadPool uses SynchronousQueue as its queue.
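
For comparison, a cached pool can be built the same way. Again, this is a simplified sketch of what Executors.newCachedThreadPool does: maximumPoolSize is Integer.MAX_VALUE, and the work queue is a SynchronousQueue, so every task is either handed to an idle thread immediately or triggers the creation of a new thread.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.SynchronousQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class CachedPoolSketch {
    // Roughly equivalent to Executors.newCachedThreadPool():
    // no storage in the queue; tasks are handed directly to worker threads.
    static ExecutorService newCachedThreadPool() {
        return new ThreadPoolExecutor(
                0, Integer.MAX_VALUE,                // threads are created on demand, without limit
                60L, TimeUnit.SECONDS,               // idle threads are reclaimed after 60 seconds
                new SynchronousQueue<Runnable>());   // zero-capacity, direct hand-off
    }
}
```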

  • ScheduledThreadPool (the same applies to SingleThreadScheduledExecutor) chooses DelayedWorkQueue

For ScheduledThreadPool, the queue used is DelayedWorkQueue, an internal delay queue. The characteristic of a delay queue is that it does not follow first-in-first-out order; instead, it sorts tasks by their delay time, so the task to be executed next is always at the front of the queue.

Let’s take an example: suppose we put a task with a delay of 10 minutes into this queue, and then put another task with a delay of 10 seconds. In an ordinary first-in-first-out queue, the 10-minute task, having been placed first, would sit at the front. But because we are using a delay queue, tasks are ordered by the length of their remaining delay, so the 10-second task that was placed later actually ends up in front of the 10-minute task, because its execution time is earlier.

We choose a delay queue because ScheduledThreadPool deals with tasks that are executed based on time. The delay queue has the ability to sort tasks according to their execution time, which is exactly what we need.
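
DelayedWorkQueue itself is an internal class of ScheduledThreadPoolExecutor, but the ordering behavior described above can be observed with the public DelayQueue. The sketch below (class and variable names are made up for illustration) enqueues a 10-minute task before a 10-second task and shows that the 10-second task comes out first.

```java
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

// A task whose position in the queue is determined by its remaining delay,
// not by insertion order.
class DelayedTask implements Delayed {
    private final String name;
    private final long triggerTimeNanos;

    DelayedTask(String name, long delay, TimeUnit unit) {
        this.name = name;
        this.triggerTimeNanos = System.nanoTime() + unit.toNanos(delay);
    }

    @Override
    public long getDelay(TimeUnit unit) {
        return unit.convert(triggerTimeNanos - System.nanoTime(), TimeUnit.NANOSECONDS);
    }

    @Override
    public int compareTo(Delayed other) {
        return Long.compare(getDelay(TimeUnit.NANOSECONDS),
                            other.getDelay(TimeUnit.NANOSECONDS));
    }

    @Override
    public String toString() {
        return name;
    }
}

public class DelayQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        DelayQueue<DelayedTask> queue = new DelayQueue<>();
        queue.offer(new DelayedTask("10-minute task", 10, TimeUnit.MINUTES));
        queue.offer(new DelayedTask("10-second task", 10, TimeUnit.SECONDS));

        // take() blocks until the head's delay has expired; the 10-second task
        // comes out first even though it was added second.
        System.out.println(queue.take()); // prints "10-second task" after about 10 seconds
    }
}
```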

ArrayBlockingQueue #

Besides the 3 types of blocking queues selected by thread pools, there is another commonly used blocking queue called ArrayBlockingQueue, which is often used in manually created thread pools.

This blocking queue is implemented with an array. Its capacity must be passed in when the object is created and cannot be changed afterwards, so the most notable feature of ArrayBlockingQueue is that its capacity is bounded and fixed. When we use ArrayBlockingQueue together with a reasonable maximum number of threads for the thread pool, then once the task queue is full and the number of threads has reached the maximum, the pool will reject newly submitted tasks according to its rejection policy instead of accumulating ever more tasks or threads, which effectively prevents resource exhaustion.
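
As an illustration, here is a hand-built pool of the kind described above. The sizing numbers (4 core threads, 8 maximum threads, a queue capacity of 100) are made-up values for the sketch; with these settings, once 100 tasks are queued and all 8 threads are busy, further submissions are rejected by the AbortPolicy rather than consuming more memory.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedPoolSketch {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                4,                                    // corePoolSize (illustrative value)
                8,                                    // maximumPoolSize (illustrative value)
                60L, TimeUnit.SECONDS,                // keep-alive time for threads above the core size
                new ArrayBlockingQueue<>(100),        // bounded queue: capacity fixed at creation
                new ThreadPoolExecutor.AbortPolicy()  // reject new tasks once queue and threads are saturated
        );
        pool.shutdown();
    }
}
```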

Summary #

Let’s summarize the experience below. Generally, we can consider the following 5 aspects to choose a suitable blocking queue:

  • Functionality

The first aspect to consider is functionality. For example, do we need a blocking queue to help us with sorting, such as priority sorting or delayed execution? If so, we must choose a blocking queue with sorting capabilities, such as PriorityBlockingQueue.
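
A small sketch of this sorting behavior: PriorityBlockingQueue returns elements in priority order (natural ordering here) rather than insertion order.

```java
import java.util.concurrent.PriorityBlockingQueue;

public class PrioritySketch {
    public static void main(String[] args) throws InterruptedException {
        PriorityBlockingQueue<Integer> queue = new PriorityBlockingQueue<>();
        queue.put(5);
        queue.put(1);
        queue.put(3);
        // Elements come out smallest-first, regardless of insertion order.
        System.out.println(queue.take()); // 1
        System.out.println(queue.take()); // 3
        System.out.println(queue.take()); // 5
    }
}
```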

  • Capacity

The second aspect to consider is capacity, or whether there is a storage requirement at all or just “direct pass-through”. The blocking queues mentioned earlier differ widely here. Some have a fixed capacity, such as ArrayBlockingQueue; some have a default capacity of Integer.MAX_VALUE and are therefore effectively unbounded, such as LinkedBlockingQueue; and some have no capacity at all, such as SynchronousQueue. As for DelayQueue, it is unbounded as well: its backing priority queue grows as needed.

Therefore, the capacity of different blocking queues varies widely. We need to calculate the appropriate capacity based on the number of tasks to select the appropriate BlockingQueue.
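
In code, these capacity differences show up directly in the constructors (the capacity value 1000 below is just an example):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.SynchronousQueue;

public class CapacitySketch {
    public static void main(String[] args) {
        // Fixed capacity: must be chosen up front and never changes.
        BlockingQueue<Runnable> bounded = new ArrayBlockingQueue<>(1000);

        // Default capacity is Integer.MAX_VALUE, effectively unbounded
        // (a bound can also be passed to the constructor if desired).
        BlockingQueue<Runnable> unbounded = new LinkedBlockingQueue<>();

        // No capacity at all: every put must wait for a matching take.
        BlockingQueue<Runnable> handoff = new SynchronousQueue<>();
    }
}
```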

  • Expandability

The third aspect to consider is expandability. Sometimes, we may not be able to accurately estimate the size of the queue at the beginning because the business may have peak and off-peak periods.

If we fix a capacity at the beginning, it may not be able to handle every situation, and dynamic expansion may be needed. If dynamic expansion is required, then ArrayBlockingQueue is ruled out, because its capacity is determined at creation and cannot grow. In contrast, PriorityBlockingQueue expands automatically when needed, even if an initial capacity is specified.

Therefore, we can select the appropriate queue based on whether expansion is needed.
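
For instance, PriorityBlockingQueue’s initial capacity is only a starting size for its backing array; inserting more elements than that neither blocks nor fails (the numbers below are arbitrary):

```java
import java.util.concurrent.PriorityBlockingQueue;

public class ExpandSketch {
    public static void main(String[] args) {
        // The initial capacity of 2 is only a hint; the backing array grows as needed.
        PriorityBlockingQueue<Integer> queue = new PriorityBlockingQueue<>(2);
        for (int i = 0; i < 10; i++) {
            queue.put(i); // never blocks or throws for capacity reasons
        }
        System.out.println(queue.size()); // 10
    }
}
```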

  • Memory structure

The fourth aspect to consider is memory structure. In the previous lesson, we analyzed the source code of ArrayBlockingQueue and saw that its internal structure is in the form of an “array”.

LinkedBlockingQueue, by contrast, is implemented with a linked list. This means ArrayBlockingQueue has higher space utilization, because it does not need a separate “node” object per element the way a linked list does. So if we have memory or performance requirements, we can also consider the problem from the perspective of memory structure.

  • Performance

The fifth aspect is performance. For example, LinkedBlockingQueue uses two locks (one for putting and one for taking), so its locking granularity is finer, and under high concurrency its performance can be better than that of ArrayBlockingQueue, which uses only one lock.

In addition, SynchronousQueue often performs better than other implementations because it only needs to hand elements over directly, without storing them. If our scenario only requires this direct pass-through, we can give priority to SynchronousQueue.
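
The “direct pass-through” nature is easy to observe: an offer with no waiting consumer fails immediately because there is nowhere to store the element, while put blocks until a consumer arrives. A minimal sketch:

```java
import java.util.concurrent.SynchronousQueue;

public class HandoffSketch {
    public static void main(String[] args) throws InterruptedException {
        SynchronousQueue<String> queue = new SynchronousQueue<>();

        // No consumer is waiting, and the queue has no storage, so offer fails.
        System.out.println(queue.offer("task")); // false

        // Start a consumer; put then succeeds once the hand-off can happen.
        Thread consumer = new Thread(() -> {
            try {
                System.out.println("took: " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();
        queue.put("task"); // blocks until the consumer is ready to take
        consumer.join();
    }
}
```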

In this lesson, we first reviewed the rules for selecting a blocking queue for thread pools. Then, we learned about the characteristics of ArrayBlockingQueue. Next, we summarized the usual considerations from the perspectives of functionality, capacity, expandability, memory structure, and performance. We should choose the most suitable blocking queue for our business based on these considerations.