In this section we will explore the Rust concurrency features in detail. First of all, let us explain what exactly we mean by the term concurrency. Concurrency refers to sections of code in our application running alongside other sections of code. This means that if we have some code blocks or segments residing inside the same file, concurrency allows these segments to run concurrently, or in other words in parallel. The term concurrency is also often referred to as multithreading. In most current operating systems, an executed program's code runs in a process, and the operating system manages multiple processes at once. Within your program, you can also have independent parts that run simultaneously. The feature that runs these independent parts is called threads. In summary, concurrency means the execution of different parts of our code simultaneously, and the mechanism through which this is made possible is known as threads. All the programs we have seen so far in this course run their code sequentially, section after section.
By using threads, we can run sections of our code in parallel. This means that each subsequent section does not need to wait for the section above it to complete fully before it can be executed. There are many essential advantages to this. Consider a case where one part of our program waits for some input, thereby unnecessarily blocking the rest of the code. We can create threads where the input/output is assigned to one thread and the remaining computation to another. This way, the program will not unnecessarily block the rest of the code from executing while it waits for the input to be received. With this background, let us start coding with threads, and things will become easier to comprehend.

To use threads, we need to include the thread module from the standard library. Let us add a few print statements before including a thread in our main program. To create a thread, we use the thread::spawn function, which takes a closure as input.
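A minimal sketch of what this looks like (the exact print messages are my own, not necessarily the ones typed in the video):

```rust
use std::thread;

fn main() {
    println!("main: statement 1");
    println!("main: statement 2");

    // thread::spawn takes a closure; the closure is the body
    // that the new thread will execute.
    thread::spawn(|| {
        // thread body goes here
    });
}
```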
Let us add some code to the body of the thread, inside the closure. I will add a few print statements. Next, I will add a couple of print statements to the body of the main function, outside the thread.

Okay, now let us explain some important points. The print statements at the start of the program will always be printed to the terminal. This is because we create the thread after these lines, so concurrency in the program has not yet started; therefore, these print statements will be printed in sequential order. Next, we have spawned a thread. Each program has one thread by default, which is the main thread. At this point, we have created another thread in addition to the main thread. After this point, the remaining code belongs either to the main thread or to this newly created thread. Threads can execute in parallel; therefore, the print statements inside the thread and the two print statements in the remainder of the main function will execute in parallel.
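At this stage the program might look like the following sketch (the specific messages and the loop count are illustrative):

```rust
use std::thread;

fn main() {
    // These statements run before the thread is spawned,
    // so they always appear first, in order.
    println!("main: statement 1");
    println!("main: statement 2");
    println!("main: statement 3");

    // From here on, the spawned thread and the rest of main
    // run concurrently, so their output can interleave.
    thread::spawn(|| {
        for i in 1..=4 {
            println!("spawned thread: print {i}");
        }
    });

    println!("main: statement 4");
    println!("main: statement 5");
}
```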
The specific order in which they will be scheduled is not deterministic and is handled by the underlying operating system. Therefore, we cannot guarantee any order in which they will be executed. Let us execute now, and then we will explain some important points. Let me execute a few more times.

You may note that the first three lines are always printed, because concurrency starts only after the first three lines in the main function. Next, there is no fixed order in which the print statements inside the thread and in main are executed. Each time we execute, we obtain a different result. Sometimes the thread executes partially and then control switches to main; at other times main executes first and then the thread is given a chance; and at still other times multiple switches may take place between main and the created thread. As explained earlier, this is because thread scheduling is handled by the operating system, and there is no deterministic order in which the threads will be executed.
We may also note that in all the executions, the print statements at the end of main are always executed, while some of the print statements inside the thread may not execute. This is because the main thread always completes its execution before the program terminates, while the remaining threads may or may not. Once the main thread of a Rust program completes, all spawned threads are shut down, whether or not they have finished their execution. For this reason, some of the print statements were not executed during our runs of the program.

To give your thread more of a chance to run to completion before the end of the main thread, you may use a function called thread::sleep. A call to thread::sleep forces the current thread to stop its execution for a short duration, allowing a different thread to run. To use this function, I will first include the relevant module from the standard library. Next, I will use it in main after creating the thread.
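A sketch of this step, assuming a one-millisecond pause (the messages are again illustrative):

```rust
use std::thread;
use std::time::Duration;

fn main() {
    thread::spawn(|| {
        for i in 1..=4 {
            println!("spawned thread: print {i}");
        }
    });

    // Pause the main thread for 1 ms, giving the spawned
    // thread a chance to run before main reaches its end.
    thread::sleep(Duration::from_millis(1));

    println!("main: done");
}
```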
This will now block the main thread for one millisecond, thereby giving the spawned thread slightly more of a chance to execute. Let us cargo run to see the result. We can increase the sleep time to give the thread more of a chance; the threads will probably take turns, but that isn't guaranteed. It depends on how your operating system schedules the threads. If you run this code and only see output from the main thread, or don't see any overlap, try increasing the sleep time.

Now let us learn how to ensure that the spawned thread runs to completion before the end of the main program and does not end prematurely. This can be done by making use of the join function on the join handle. The join handle is the return value of the thread::spawn function, which we can store in a variable. Let me define a variable and assign to it the result of thread::spawn. You may note that the variable's type is JoinHandle. Now we can call join on the handle variable, so let me call it.
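The steps above can be sketched as follows (variable name `t` as used shortly, messages illustrative):

```rust
use std::thread;

fn main() {
    // thread::spawn returns a JoinHandle; storing it in a
    // variable lets us wait for the thread later.
    let t = thread::spawn(|| {
        for i in 1..=4 {
            println!("spawned thread: print {i}");
        }
    });

    // join blocks the current (main) thread until the
    // spawned thread has run to completion.
    t.join().unwrap();

    println!("main: the spawned thread has finished");
}
```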
Calling this function for a specific thread will halt, or block, the execution until that thread is complete. This means that if I execute this program, the thread will complete first, and after that the remaining statements in main will get a chance to execute. So let us cargo run this to confirm.

Now, what will happen if I move the call to join to the end of the main function? Let me move it. In this case, the thread and main will keep getting chances to run, but the main thread waits at the call to t.join and does not end until the spawned thread is finished. Let us execute to confirm this. Please note that the join function returns a Result: it will return an Ok variant when the thread does not panic, and it will return an Err variant if the thread panics for some reason.

Okay, before we end, there are a couple of important points to note.
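Before we get to those points, the Result returned by join can be seen in a short sketch; the deliberately panicking closure here is my own illustration, not code from the video:

```rust
use std::thread;

fn main() {
    // A thread that completes normally: join returns Ok
    // carrying the closure's return value.
    let ok = thread::spawn(|| 42).join();
    assert_eq!(ok.unwrap(), 42);

    // A thread that panics: join returns Err instead of
    // crashing the main thread (the panic message still
    // appears on stderr).
    let err = thread::spawn(|| -> i32 { panic!("something went wrong") }).join();
    assert!(err.is_err());

    println!("main: observed both Ok and Err from join");
}
```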
Firstly, there are two terms, concurrency and parallelism, which are closely related but distinct within the context of computer science. Concurrency is about multiple tasks which start, run, and complete in overlapping time periods, in no specific order. Parallelism is about multiple tasks, or sub-tasks of the same task, that literally run at the same time on hardware with multiple computing resources, such as multicore processors. From the programming perspective, we are interested in concurrency; from the hardware perspective, that is, the underlying architecture involving the physical design and components, we are interested in parallelism. The programming community, however, uses the terms interchangeably, as it is understood that they are referring to concurrency and not to parallelism, since parallelism is beyond their scope.
Secondly, threads are often associated with problems such as race conditions, where threads access data or a resource in an inconsistent order; deadlocks, where two threads wait for each other to finish using a resource, preventing both threads from continuing; and, finally, bugs that happen only in certain situations and are hard to reproduce and fix reliably. Rust attempts to mitigate the negative effects of using threads, as will become evident in the upcoming tutorials.

We end this tutorial here. See you again with more concepts, and until the next tutorial, enjoy Rust programming.