
23 Single-Thread Model: How to Ensure the UI Runs Smoothly #

Hello, I’m Chen Hang.

In the previous article, I showed you how to implement animations in Flutter. For component animations, Flutter separates the animation’s state from its rendering: we need an Animation to generate the animation curve, an AnimationController to control the animation’s state, and listeners on the animation’s progress to update the UI. For cross-page animations, Flutter provides the Hero component, which enables shared element transitions between pages.

In previous chapters, we explored the excellent rendering and interaction capabilities of the Flutter framework. Behind these complex capabilities is Dart, which is built on a single-thread model. So, compared with the multi-threaded approach of native Android and iOS, how does Dart keep the Flutter UI smooth, both in its language design and in the way code actually runs?

Today, I will use several small examples to walk you through the event loop mechanism in the Dart language, along with the principles and usage of asynchronous processing and concurrent programming. This will show you, from both a language-design and a practical perspective, how code really runs under Dart’s single-thread model, and it will help you use Future and Isolate to optimize your projects later on.

Event Loop Mechanism #

First of all, we need to establish the concept that Dart is single-threaded. What does single-threaded mean? It means that Dart code is executed in order, one statement after another, starting from the main function, without being interrupted by other code. In addition, as the key technology underpinning the Flutter UI framework, Dart naturally supports asynchronous operations. It is important to note that single-threaded and asynchronous are not contradictory.

Why can asynchronous operations still work in a single-threaded environment?

There is an important premise here: our apps spend most of their time waiting. For example, waiting for user taps, waiting for network responses, waiting for file I/O results, and so on. And these waits are not blocking: for network requests, sockets provide the select model for asynchronous polling; for file I/O, the operating system provides event-based callback mechanisms.

Based on these characteristics, a single-threaded model can do other things while waiting, and only handle a response when it actually arrives. Because the waiting is non-blocking, it feels as if multiple things are happening at once, yet there is always only one thread processing your tasks.

This waiting behavior is driven by the Event Loop. The Event Queue stores events that have completed elsewhere (for example, on a socket) and now need to be handled by the main thread. Like other languages, Dart has an event loop that constantly polls the event queue, retrieves events (keyboard events, I/O events, network events, and so on), and synchronously executes their callbacks on the main thread, as shown in the following diagram:

Figure 1: Simplified Event Loop
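
To make this model concrete, here is a minimal conceptual sketch in plain Dart of what such a loop does. This is an illustration only, not the real engine code; the queue and the callbacks are made up for the example:

import 'dart:collection';

main() {
  // A stand-in for the real event queue: just a queue of callbacks.
  final eventQueue = Queue<void Function()>();

  // Pretend these callbacks were enqueued by I/O, timers, or user input.
  eventQueue.add(() => print('keyboard event handled'));
  eventQueue.add(() => print('network response handled'));

  // The "loop": take one event at a time and run its callback
  // synchronously on the single main thread.
  while (eventQueue.isNotEmpty) {
    final callback = eventQueue.removeFirst();
    callback();
  }
}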

Asynchronous Tasks #

In fact, the event loop shown in Figure 1 is just a simplified version. In Dart, there are actually two queues: an Event Queue and a Microtask Queue. In each iteration of the event loop, Dart first checks the microtask queue for executable tasks; only when it is empty does it move on to the event queue.

Therefore, the complete flowchart of the Event Loop should be as follows:

Figure 2: Microtask Queue and Event Queue

Next, let’s take a look at the characteristics and use cases of these two queues separately.

First, let’s examine the microtask queue. As the name suggests, a microtask is an asynchronous task that completes in a very short time. As the flowchart above shows, the microtask queue has the highest priority in the event loop: as long as there are tasks in the queue, it keeps monopolizing the event loop.

Microtasks are created with scheduleMicrotask (from dart:async). As shown below, this code will output a string in the next event loop:

scheduleMicrotask(() => print('This is a microtask'));

However, ordinary asynchronous tasks usually do not need to run ahead of the event queue, so they do not require such a high priority. As a result, we rarely use the microtask queue directly; even inside Flutter itself it is used in only seven places (scenarios such as gesture recognition, text input, scroll views, and saving page effects, where tasks need to execute with high priority).

The asynchronous tasks we use most often are the lower-priority ones in the event queue. I/O, rendering, and timers, for example, are all asynchronous events driven by the event queue on the main thread.
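
For instance, a timer callback reaches your code through the event queue, so it only runs after the current synchronous code has finished. A minimal sketch using Timer from dart:async:

import 'dart:async';

main() {
  // The callback is delivered through the event queue, so it runs
  // only after the synchronous code below has completed.
  Timer(Duration(seconds: 1), () => print('timer fired'));
  print('timer scheduled');
}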

Dart provides a layer of encapsulation for establishing tasks in the event queue, called Future. As the name implies, it represents a task that will be completed in the future.

Wrapping a function body in a Future completes the transformation from a synchronous task to an asynchronous task. Future also provides the ability for chaining calls, allowing other function bodies in the chain to be executed after the asynchronous task is completed.

Next, let’s look at a specific code example: we declare two asynchronous tasks that each print a string in a later event loop, and after the second task completes, two more strings are printed:

Future(() => print('Running in Future 1')); // Output a string in the next event loop

Future(() => print('Running in Future 2'))
  .then((_) => print('and then 1'))
  .then((_) => print('and then 2')); // Output three strings in succession after the previous event loop ends

Of course, the execution priority of these two Future asynchronous tasks is lower than that of microtasks.
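
A quick way to see this priority difference is to schedule both kinds of tasks side by side. Even though the Future is declared first, the microtask runs before it; a small demo (assuming it runs as a standalone Dart program):

import 'dart:async';

main() {
  Future(() => print('event task'));           // goes into the event queue
  scheduleMicrotask(() => print('microtask')); // goes into the microtask queue
  print('synchronous code');
  // Output order: synchronous code, microtask, event task
}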

Under normal circumstances, the execution of a Future asynchronous task is relatively simple: when we declare a Future, Dart puts the function body of the asynchronous task into the event queue and returns immediately, allowing the subsequent code to continue executing synchronously. When the synchronous code is completed, the event queue retrieves events in the order they were added (i.e., the order of declaration) and executes the function body of the Future as well as the subsequent then callbacks synchronously.

This means that then shares the same event loop as the Future function body. If a Future has multiple then callbacks, they will be executed synchronously in the order of the chain, sharing the same event loop as well.

If the Future function body has already completed execution, but you have a reference to this Future and add a then callback to it, how will Dart handle this situation? In such a case, Dart will put the subsequent then callbacks into the microtask queue for immediate execution.

The following code demonstrates these execution rules of Future: the order in which tasks are added to the event queue (that is, the order of declaration) determines the execution order, and then is executed as soon as the Future’s body has finished.

  • In the first example, since f1 is declared before f2, it is added to the event queue first, so f1 is executed before f2.
  • In the second example, since the Future function body shares the same event loop as then, when f3 is executed, then 3 is immediately executed synchronously.
  • In the last example, the Future function body is null, which means it does not require an event loop. Therefore, the subsequent then callbacks cannot share the event loop with it. In this scenario, Dart puts the subsequent then callbacks into the microtask queue for execution in the next event loop.
// `f1` is executed before `f2`, since it is added to the event queue first
Future(() => print('f1'));
Future(() => print('f2'));

// `then 3` is executed synchronously right after `f3`, sharing the same event loop
Future(() => print('f3')).then((_) => print('then 3'));

// the execution body is null, so `then 4` is added to the microtask queue and executed as soon as possible
Future(() => null).then((_) => print('then 4'));

That was a lot of rules, and you may not have absorbed them all yet. So let’s use a comprehensive example to string together the execution rules above and study them in one place.

In the example below, we declare several Future async tasks and microtasks, and inside some of the Futures we nest further Future and microtask declarations:

Future(() => print('f1')); // declare an anonymous Future
Future fx = Future(() => null); // declare Future fx with null execution body

// declare an anonymous Future and register two `then` methods, in the first `then` callback, start a microtask
Future(() => print('f2')).then((_) {
  print('f3');
  scheduleMicrotask(() => print('f4'));
}).then((_) => print('f5'));

// declare an anonymous Future and register two `then` methods, the first `then` is a Future
Future(() => print('f6'))
  .then((_) => Future(() => print('f7')))
  .then((_) => print('f8'));

// declare an anonymous Future
Future(() => print('f9'));

// register a `then` method on fx with a null execution body
fx.then((_) => print('f10'));

// start a microtask
scheduleMicrotask(() => print('f11'));
print('f12');

After running the code, the internal execution results of the above async tasks will be printed in sequence:

f12
f11
f1
f10
f2
f3
f5
f4
f6
f9
f7
f8

At this point, you may already be confused. Don’t worry. Let’s first look at how the event queue and the microtask queue change while this code executes, and analyze why the output appears in this order:

Figure 3: Changes in the Event Queue and Microtask Queue

  • Because all the other statements declare async tasks, the synchronous print runs first, so f12 is printed.
  • Among the remaining async tasks, the microtask queue has the highest priority, so f11 is printed next; then, following the order in which the Futures were declared, f1 is printed.
  • Next comes fx. Since the execution body of fx is null, it completes immediately, and Dart puts fx.then into the microtask queue. As the microtask queue has the highest priority, fx.then still runs first, printing f10.
  • Then comes the Future after fx: f2 is printed, and its then runs, printing f3. f4 is a microtask, which will only run in the next event loop, so the remaining then runs synchronously and prints f5. When this event loop ends, the next one picks up the microtask and f4 is printed.
  • Then comes the Future below f2: f6 is printed, and its then runs. Note that this then is itself a Future async task, so it and the subsequent then are put into the event queue.
  • f9, declared below f6, is printed next.
  • In the last event loops, f7 is printed, followed by f8.

The code above is confusing, but fortunately we rarely run into such convoluted code when developing with Flutter, so you can relax. You just need to remember one thing: then is executed as soon as the Future’s execution body finishes, whether that means sharing the same event loop or entering the microtask queue.
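
If you keep only that one rule in mind, the two typical cases look like this (a small illustrative snippet; run the statements inside main to try it):

// Case 1: the body needs an event loop, so `then` runs right after it,
// sharing the same event loop.
Future(() => print('body')).then((_) => print('then, same loop'));

// Case 2: the Future is already completed, so `then` is put into the
// microtask queue and runs as soon as possible.
Future.value('done').then((v) => print('then via microtask: $v'));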

After understanding the execution rules of Future async tasks in depth, let’s take a look at how to encapsulate an async function.

Asynchronous Functions #

For an asynchronous function, its internal actions are not completed when it returns, so it needs to return a Future object for the caller to use. The caller can decide what to do based on the Future object: register a then on the Future object to do asynchronous processing after the execution is completed, or wait synchronously for the execution of the Future object to finish.

For the Future object returned by an asynchronous function, if the caller decides to wait synchronously, the await keyword needs to be used at the calling site, and the function body at the calling site needs to be marked with the async keyword.

In the following example, the asynchronous method fetchContent returns “Hello 2019” after a delay of 3 seconds. At the calling site, we use await to wait for it to return before printing the result:

// Declare a Future that returns "Hello" after a delay of 3 seconds, and register a then to return "Hello 2019" after the execution is finished
Future<String> fetchContent() =>
  Future<String>.delayed(Duration(seconds:3), () => "Hello")
    .then((x) => "$x 2019");

main() async {
  print(await fetchContent()); // Wait for the return of "Hello 2019"
}

Perhaps you have noticed that when we use await to wait, we mark the async keyword on the main function at the calling site. Why do we need to add this keyword?

That’s because await in Dart does not block execution; it is an asynchronous wait. Dart treats the function containing the await as an asynchronous function and puts the context of the await statement into the event queue. Once the result is available, the event loop takes it out of the event queue and the code after the await continues to run.

Next, to help reinforce your understanding, I have prepared two specific examples.

Let’s first take a look at this code. Judging by the order of declaration, you might expect the print statements to run in the order f1, f2, f3, f4.

Future(() => print('f1'))
  .then((_) async => await Future(() => print('f2')))
  .then((_) => print('f3'));
Future(() => print('f4'));

But when you run this code, you will find that the actual output is f1, f4, f2, f3!

Let me analyze the execution order of this code for you:

  • In the order of declaration, f1 and f4 are added to the event queue one after the other.
  • f1 is taken out and printed; then execution moves to the then. The then’s body, f2, is itself a Future, so it is put into the event queue, and then the await is put into the event queue as well.
  • Now pay attention: f4 is still in the event queue, and our await does not block it. So the event loop first takes out f4 and prints it; then it can take out and print f2; finally the pending await is taken out and the following f3 is executed.

Because await waits using the mechanism of the Event Queue, f4, which is ahead of it in the Event Queue, will not be blocked by it.

Next, let’s look at another example: in the main function, calling an asynchronous function to print a sentence, and in this asynchronous function, we use await and async to synchronously wait for the return of another asynchronous function that returns a string:

// Declare a Future that returns "Hello" after a delay of 2 seconds, and register a then to return "Hello 2019" after the execution is finished
Future<String> fetchContent() =>
  Future<String>.delayed(Duration(seconds:2), () => "Hello")
    .then((x) => "$x 2019");
// The async function waits synchronously for the return of "Hello 2019" and prints it
func() async => print(await fetchContent());

main() {
  print("func before");
  func();
  print("func after");
}

When running this code, we find that the final output order is “func before”, “func after”, “Hello 2019”. The await statement in the func function seems to have no effect. Why is that?

Similarly, let me analyze the execution order of this code for you:

  • First, the first line of code is synchronous, so “func before” is printed first.
  • Then we enter the func function. func calls the asynchronous function fetchContent and waits for it with await, so fetchContent and the context of the await statement in func are put into the event queue.
  • The context of the await does not include main’s call stack, so main continues to execute the code after the call to func and prints “func after”.
  • After 2 seconds, the fetchContent async task returns “Hello 2019”, so the await in func is taken out of the event queue and “Hello 2019” is printed.

From the analysis above, what did you notice? await and async only take effect within the function that contains them; they are not propagated up to the caller. So in this case, the await inside func is a synchronous wait only from func’s own perspective. If we want main to wait synchronously as well, we need to await the call to the asynchronous function and mark main with async.
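
As a minimal sketch of that change, reusing fetchContent and func from the example above, main now waits for func to finish before printing the last line:

main() async {
  print("func before");
  await func(); // main now waits for func (and thus fetchContent) to finish
  print("func after");
}
// Output: func before, Hello 2019, func after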

After introducing asynchrony, let’s take a look at how concurrency can be achieved through multithreading in Dart.

Isolate #

Although Dart is based on a single-thread model, in order to make better use of multi-core CPUs and to isolate CPU-intensive computations, Dart also provides a multi-threading mechanism: the Isolate. Isolates are strictly isolated from each other: each Isolate has its own event loop and queues, Isolates share no resources, and they can only communicate by passing messages, so there is no resource contention.

Just like in other languages, creating an Isolate is very simple. We just need to provide a function entry point and pass a parameter when creating it. As shown below, we declare an entry function for an Isolate and start it in the main function, passing a string parameter:

import 'dart:isolate';

doSth(msg) => print(msg);

main() {
  Isolate.spawn(doSth, "Hi");
  ...
}

However, in most cases, our requirements are not that simple. We not only want concurrency, but also want the Isolate to inform the main Isolate of its current execution result during concurrent execution.

To report its result, an Isolate communicates by passing messages through send ports (SendPort). When starting a concurrent Isolate, we pass the main Isolate’s send port to it as a parameter, so that the concurrent Isolate can use this port to send a message back to us once its task is finished.

Next, let’s illustrate this with an example: in the main Isolate, we create a concurrent Isolate and pass the sending port of the main Isolate as a parameter to it. Then we wait for the concurrent Isolate to return a message. In the concurrent Isolate, we use this port to send a Hello string to the main Isolate:

import 'dart:isolate';

Isolate? isolate; // nullable so it can be cleared after the Isolate is killed

start() async {
  ReceivePort receivePort = ReceivePort(); // Create a port
  // Create a concurrent Isolate and pass it the main Isolate's send port
  isolate = await Isolate.spawn(getMsg, receivePort.sendPort);
  // Listen for messages arriving on the port
  receivePort.listen((data) {
    print('Data: $data');
    receivePort.close(); // Close the port
    isolate?.kill(priority: Isolate.immediate); // Kill the concurrent Isolate
    isolate = null;
  });
}
// The concurrent Isolate sends a string back through the port
getMsg(sendPort) => sendPort.send("Hello");

Please note that an Isolate’s send port is unidirectional: we start an Isolate to perform a task, and it notifies us by sending a message when the task is done. But what if the Isolate needs the main Isolate to send it parameters for the task, and then needs to send the result back once the task is completed? How can we achieve this bidirectional communication? The answer is simple: have the concurrent Isolate also return a send port of its own.

Next, let’s explain how to achieve bidirectional communication with an example of concurrently calculating factorial.

In the example below, we create an asynchronous function to calculate the factorial. In this asynchronous function, we create a concurrent Isolate and pass the main Isolate’s sending port to it. The concurrent Isolate also returns a sending port. After receiving the returned port in the main Isolate, we send the parameter N to the concurrent Isolate and immediately return a Future. The concurrent Isolate uses the parameter N to call a synchronous factorial calculation function and returns the execution result. Finally, the main Isolate prints the returned result:

import 'dart:isolate';

// Concurrently calculate the factorial
Future<dynamic> asyncFactorial(n) async {
  final response = ReceivePort(); // Create a port
  // Create a concurrent Isolate and pass the port
  await Isolate.spawn(_isolate, response.sendPort);
  // Wait for the Isolate to return the port
  final sendPort = await response.first as SendPort;
  // Create another port for answer
  final answer = ReceivePort();
  // Send the parameter to the Isolate through the returned port and pass the answer port as well
  sendPort.send([n, answer.sendPort]);
  return answer.first; // Wait for the Isolate to return the execution result through the answer port
}

// Isolate function, with the main Isolate's port as a parameter
_isolate(initialReplyTo) async {
  final port = ReceivePort(); // Create a port
  initialReplyTo.send(port.sendPort); // Send the port back to the main Isolate
  final message = await port.first as List; // Wait for the main Isolate to send a message (parameters and the port for returning the result)
  final data = message[0] as int; // Parameter
  final send = message[1] as SendPort; // Port for returning the result
  send.send(syncFactorial(data)); // Call the synchronous factorial calculation function and return the result through the port
}

// Synchronous factorial calculation function
int syncFactorial(n) => n < 2 ? n : n * syncFactorial(n - 1);
main() async => print(await asyncFactorial(4)); // Wait for the concurrent factorial calculation result

What do you feel after reading this code? We just want to calculate a factorial concurrently, isn’t it too cumbersome?

Yes, it is indeed too cumbersome. In Flutter, we can use a simpler approach to perform concurrent calculation tasks like this. Flutter provides a compute function that supports concurrent computation. It encapsulates and abstracts the creation of Isolate and bidirectional communication, hiding many low-level details. When calling it, we only need to pass the function entry point and function parameters to achieve concurrent computation and message notification.

Let’s try to refactor the code for concurrent factorial calculation using the compute function:

import 'package:flutter/foundation.dart'; // compute lives in Flutter's foundation library

// Synchronous factorial calculation function
int syncFactorial(n) => n < 2 ? n : n * syncFactorial(n - 1);
// Use the compute function to encapsulate Isolate creation and result passing
main() async => print(await compute(syncFactorial, 4));

As you can see, after refactoring with the compute function, the entire code becomes only two lines, making the code for concurrent factorial calculation much cleaner.

Summary #

Alright, that’s it for today’s sharing on Dart’s asynchronous and concurrent mechanisms and implementation principles. Let’s briefly review the main points.

Dart is single-threaded, but it achieves asynchrony through the event loop. Future is the encapsulation of an asynchronous task, and with await and async we can wait for it without blocking, relying on the event loop. Isolate is Dart’s multi-threading mechanism: each Isolate runs concurrently with its own event loop and queues and does not share resources with others. Isolates communicate by passing messages, and the received messages drive the other side’s event loop to process them asynchronously.

In UI programming, asynchronous and multithreading are two closely related terms, and they can be easily confused. For asynchronous method calls, the code does not need to wait for the result to return, but instead actively (or passively) receives the execution result at some point in the future through other means such as notification, callback, event loop, or multithreading.

Therefore, from a dialectical perspective, asynchronous and multithreading are not equal: asynchronous is the goal, and multithreading is just one of the means to achieve it. In Flutter, with the help of the event loop provided by the UI framework, we can wait for multiple asynchronous tasks without blocking, so there is no need to create multiple threads. We must remember this point.

I have put the sample code covering today’s topics on GitHub. You can download it and run it a few times to deepen your understanding.

Discussion Questions #

Finally, I leave you with two discussion questions.

  1. In the example of calculating factorial using concurrent Isolates, I sent two SendPorts to the concurrent Isolate in the asyncFactorial method. Can you explain the reason for doing this? Can we send only one SendPort?

  2. Please modify the following code to output the result as f1, f2, f3, f4, without changing the overall asynchronous structure.

    Future(() => print('f1'))
      .then((_) async => await Future(() => print('f2')))
      .then((_) => print('f3'));
    Future(() => print('f4'));

Feel free to share your thoughts in the comments section. I’ll be waiting for you in the next article! Thank you for listening and feel free to share this article with more friends to read together.