13 What Blocking Queues Are Commonly Used in Thread Pools #

In this lesson, we look at the internal structure of thread pools and the blocking queues most commonly used in them.

Internal Structure of Thread Pools #

(Diagram: internal structure of a thread pool — manager, worker threads, task queue, and tasks)

The internal structure of a thread pool mainly consists of four parts, as shown in the diagram.

  • The first part is the thread pool manager, which is responsible for creating and destroying the thread pool and for adding tasks to it.
  • The second part is the worker threads, represented by threads t0~t9 in the diagram. These threads diligently retrieve tasks from the task queue and execute them.
  • The third part is the task queue, which acts as a buffer: the thread pool places tasks that are not yet being processed into it. Because multiple threads retrieve tasks from this queue concurrently, it must be thread-safe, so thread pools use a BlockingQueue for this purpose.
  • The fourth part is the tasks themselves, which must implement a unified interface (Runnable in Java) so that the worker threads can execute them.
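The four parts above map directly onto the constructor of `ThreadPoolExecutor`. The following minimal sketch (pool sizes and queue capacity are illustrative choices, not values from the lesson) shows the manager, the worker-thread counts, the task queue, and a task:

```java
import java.util.concurrent.*;

public class PoolStructureDemo {
    public static void main(String[] args) throws Exception {
        // The manager: a ThreadPoolExecutor coordinating the other three parts.
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4,                              // worker threads: core and maximum counts
                60L, TimeUnit.SECONDS,             // idle timeout for non-core threads
                new LinkedBlockingQueue<>(100));   // task queue: a thread-safe BlockingQueue

        // A task: implements the unified Runnable interface.
        Runnable task = () ->
                System.out.println("running on " + Thread.currentThread().getName());

        pool.execute(task);  // the manager buffers or dispatches the task
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```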

Blocking Queue #

| Thread pool                   | Blocking queue      |
| ----------------------------- | ------------------- |
| FixedThreadPool               | LinkedBlockingQueue |
| SingleThreadExecutor          | LinkedBlockingQueue |
| CachedThreadPool              | SynchronousQueue    |
| ScheduledThreadPool           | DelayedWorkQueue    |
| SingleThreadScheduledExecutor | DelayedWorkQueue    |

The most important part of the thread pool is the blocking queue, as shown in the table. Different thread pools use different blocking queues.

On the left side of the table are the thread pools, and on the right side are their corresponding blocking queues. You can see that the 5 types of thread pools correspond to 3 types of blocking queues. Let’s introduce them one by one.

LinkedBlockingQueue #

FixedThreadPool and SingleThreadExecutor both use a LinkedBlockingQueue with a capacity of Integer.MAX_VALUE, which can be regarded as unbounded. Because the number of threads in a FixedThreadPool is fixed, the pool cannot add extra threads to absorb a burst of tasks, so it needs an effectively unlimited queue to hold them. Note that since this queue never becomes full, the pool only ever creates as many threads as the core pool size. The maximum pool size is therefore meaningless for these pools: they will never grow beyond the core thread count.
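We can observe both claims directly on a pool returned by `Executors.newFixedThreadPool`: the queue is an unbounded LinkedBlockingQueue, and the core and maximum pool sizes are equal, so the maximum adds nothing. A small sketch (the pool size 3 is an arbitrary choice):

```java
import java.util.concurrent.*;

public class FixedPoolQueueDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor fixed =
                (ThreadPoolExecutor) Executors.newFixedThreadPool(3);

        // The task queue is a LinkedBlockingQueue with capacity Integer.MAX_VALUE,
        // so an empty queue reports Integer.MAX_VALUE remaining slots.
        System.out.println(fixed.getQueue().getClass().getSimpleName()); // LinkedBlockingQueue
        System.out.println(fixed.getQueue().remainingCapacity());        // 2147483647

        // core == maximum (both 3): the queue never fills, so the pool never
        // creates threads beyond the core count.
        System.out.println(fixed.getCorePoolSize() + " " + fixed.getMaximumPoolSize()); // 3 3
        fixed.shutdown();
    }
}
```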

SynchronousQueue #

The second type of blocking queue is SynchronousQueue, which corresponds to the CachedThreadPool. The maximum number of threads in a CachedThreadPool is Integer.MAX_VALUE, meaning its thread count can grow almost without bound. Its situation is the mirror image of FixedThreadPool: there the queue capacity is unlimited, while here the number of threads is. As a result, the CachedThreadPool does not need a queue to store tasks: once a task is submitted, it is either handed directly to an idle thread or a new thread is created to execute it, so tasks never need to be held separately.

When we create our own thread pool using a SynchronousQueue, if we don't want tasks to be rejected, we need to set the maximum number of threads as large as possible. Since tasks cannot wait in the queue, this ensures that when submissions outpace the running threads, enough new threads can still be created to execute them.
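The "no storage" behavior can be seen on a pool returned by `Executors.newCachedThreadPool`: its SynchronousQueue always reports size 0, and a task is only accepted by the queue if a thread is already waiting to take it. A minimal sketch:

```java
import java.util.concurrent.*;

public class CachedPoolQueueDemo {
    public static void main(String[] args) {
        ThreadPoolExecutor cached =
                (ThreadPoolExecutor) Executors.newCachedThreadPool();

        // SynchronousQueue has no internal capacity: it only hands a task
        // directly from producer to a waiting consumer thread.
        System.out.println(cached.getQueue().getClass().getSimpleName()); // SynchronousQueue
        System.out.println(cached.getQueue().size());                     // 0

        // 0 core threads; the maximum is effectively unlimited.
        System.out.println(cached.getCorePoolSize());                          // 0
        System.out.println(cached.getMaximumPoolSize() == Integer.MAX_VALUE);  // true
        cached.shutdown();
    }
}
```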

DelayedWorkQueue #

The third type of blocking queue is DelayedWorkQueue, which corresponds to ScheduledThreadPool and SingleThreadScheduledExecutor. The most distinctive feature of these two thread pools is that they can delay the execution of tasks, such as running a task after a certain time or at regular intervals. The defining feature of DelayedWorkQueue is that its internal elements are sorted not by insertion time but by delay time, using a "heap" data structure. ScheduledThreadPool and SingleThreadScheduledExecutor choose DelayedWorkQueue because they execute tasks based on time, and a queue sorted by delay makes it convenient to always take the next task that is due.
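The delay-ordering can be demonstrated with `ScheduledExecutorService.schedule`: below, the task with the longer delay is submitted first, yet the task with the shorter delay runs first, because the DelayedWorkQueue heap orders tasks by delay rather than insertion order (the delays 500 ms and 100 ms are arbitrary illustrative values):

```java
import java.util.concurrent.*;

public class DelayedQueueDemo {
    public static void main(String[] args) throws Exception {
        ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(1);

        // Submit the longer-delay task FIRST; the shorter-delay task still
        // executes first, since the queue is sorted by remaining delay.
        scheduler.schedule(() -> System.out.println("second (500 ms delay)"),
                500, TimeUnit.MILLISECONDS);
        scheduler.schedule(() -> System.out.println("first (100 ms delay)"),
                100, TimeUnit.MILLISECONDS);

        scheduler.shutdown();
        scheduler.awaitTermination(2, TimeUnit.SECONDS);
        // prints "first (100 ms delay)" then "second (500 ms delay)"
    }
}
```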