
Threads

A thread is an independent stream of instructions that can be scheduled by the operating system. From a programmer's perspective, a thread of execution is best described as a function that runs independently of the main program; a parallel program (with multiple threads) can then be seen as a collection of such functions that the operating system can schedule to run simultaneously and independently.
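To make this concrete, here is a minimal sketch (not taken from the lab text) that assumes POSIX threads (pthreads) on a Unix-like system: `worker()` runs independently of `main()`, and the operating system decides when each of them gets to run.

```c
#include <pthread.h>
#include <stdio.h>

/* Function executed by the new thread, independently of main(). */
static void *worker(void *arg)
{
    const char *name = (const char *)arg;
    printf("Hello from thread %s\n", name);
    return NULL;
}

int main(void)
{
    pthread_t tid;

    /* Start a new thread; the scheduler may interleave it with main() in any order. */
    if (pthread_create(&tid, NULL, worker, "A") != 0) {
        perror("pthread_create");
        return 1;
    }

    printf("Hello from the main program\n");

    /* Wait for the thread to finish before exiting. */
    pthread_join(tid, NULL);
    return 0;
}
```

Compiled with `gcc -pthread`, the two printed lines may appear in either order between runs, which illustrates that the scheduler (not the programmer) decides when each thread executes.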

caution

There is a very important distinction between the concept of a process and that of a thread! You will explore the details in other courses, but for now it is crucial to remember that a process is an instance of a running program (so two distinct processes do not share an address space, which includes the program stack, variables, data, etc.), whereas a thread is a unit of work within a process (so multiple threads can share access to variables and other data).

Because threads of the same process share resources, changes made by one thread to those resources (such as closing a file) are observed by all threads of that process. Furthermore, two pointers with the same value refer to the same data, so threads can read from and write to the same memory area, but doing so safely requires explicit synchronization by the programmer (you will learn what synchronization means and how it is achieved in lab 2).
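As a small illustration of this shared-memory behaviour (again assuming pthreads; the `counter` variable and `increment()` function are invented for the example), the sketch below has two threads incrementing the same global variable. Without the mutex, concurrent increments could be lost; with it, the final value is always 200000.

```c
#include <pthread.h>
#include <stdio.h>

static long counter = 0;                          /* shared by all threads of the process */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *increment(void *arg)
{
    (void)arg;
    for (int i = 0; i < 100000; i++) {
        /* The mutex ensures that each increment happens atomically. */
        pthread_mutex_lock(&lock);
        counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;

    pthread_create(&t1, NULL, increment, NULL);
    pthread_create(&t2, NULL, increment, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    printf("counter = %ld\n", counter);           /* always 200000 with the mutex */
    return 0;
}
```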

In general, programs that can benefit from a multi-threaded implementation have several common traits:

  • they contain computational components that can run in parallel
  • they have data that can be operated on in parallel
  • they occasionally block while waiting for I/O
  • they need to respond to asynchronous events
  • certain execution components have higher priority than others