Task Parallel Library (TPL) and Parallel Programming in C#
Learn Task Parallel Library and parallel programming in C# using Task, Parallel, and concurrency patterns with practical examples.
In .NET, the Task Parallel Library (TPL) simplifies parallel execution by taking advantage of multi-core processors.
With TPL, you can use tools such as Task, Parallel, concurrent collections, and PLINQ to speed up CPU-bound operations.
This article covers the fundamental concepts, appropriate use cases, cancellation/exception handling, and practical examples.
What is TPL? Getting Started with Task
A Task represents a unit of work. Task.Run dispatches CPU-bound operations to the thread pool.
You can start multiple tasks simultaneously and wait for all of them using Task.WhenAll.
using System;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        Task t1 = Task.Run(() => Work("A", 1000));
        Task t2 = Task.Run(() => Work("B", 800));
        Task t3 = Task.Run(() => Work("C", 1200));

        await Task.WhenAll(t1, t2, t3);
        Console.WriteLine("All tasks completed.");
    }

    static void Work(string name, int ms)
    {
        Console.WriteLine($"{name} started");
        Task.Delay(ms).Wait(); // simulated wait (not CPU-bound)
        Console.WriteLine($"{name} finished");
    }
}
Note: For real I/O waits, prefer async/await; Task.Run is intended for CPU-bound operations.
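To make the distinction concrete, here is a minimal sketch contrasting the two styles. The temporary file is used purely for illustration; the point is that I/O-bound work is awaited directly, while CPU-bound work goes through Task.Run.

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        // I/O-bound: await the asynchronous API directly -- no Task.Run needed.
        string path = Path.GetTempFileName(); // temp file, for illustration only
        await File.WriteAllTextAsync(path, "hello");
        string text = await File.ReadAllTextAsync(path);
        Console.WriteLine(text);
        File.Delete(path);

        // CPU-bound: offload to the thread pool with Task.Run.
        int sum = await Task.Run(() =>
        {
            int s = 0;
            for (int i = 0; i < 1_000_000; i++) s += i % 7;
            return s;
        });
        Console.WriteLine($"Sum: {sum}");
    }
}
```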
Parallel.For / Parallel.ForEach
The Parallel class automatically partitions loops and distributes work across threads.
When accessing shared variables, use thread-safe techniques such as Interlocked.
using System;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        int total = 0;

        Parallel.For(0, 1_000_000, i =>
        {
            // Use Interlocked to prevent race conditions
            Interlocked.Add(ref total, 1);
        });

        Console.WriteLine($"Total: {total}");
    }
}
Parallel.ForEach applies the same idea to any collection:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        var numbers = new List<int> { 1, 2, 3, 4, 5 };

        Parallel.ForEach(numbers, n =>
        {
            Console.WriteLine($"{n} processed (Thread: {Environment.CurrentManagedThreadId})");
        });
    }
}
ParallelOptions: Degree of Parallelism and Cancellation
With MaxDegreeOfParallelism you can control concurrency, and with CancellationToken you can support cancellation.
using System;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        var cts = new CancellationTokenSource();
        var po = new ParallelOptions
        {
            // Leave one core free, but never drop below 1.
            MaxDegreeOfParallelism = Math.Max(1, Environment.ProcessorCount - 1),
            CancellationToken = cts.Token
        };

        // cancel after 2 seconds
        Task.Run(async () => { await Task.Delay(2000); cts.Cancel(); });

        try
        {
            Parallel.For(0, 1000, po, i =>
            {
                po.CancellationToken.ThrowIfCancellationRequested();
                DoCpuWork(i); // CPU-bound work
            });
        }
        catch (OperationCanceledException)
        {
            Console.WriteLine("Operations were canceled.");
        }
    }

    static void DoCpuWork(int i)
    {
        // example workload
        double x = 0;
        for (int k = 0; k < 50_000; k++) x += Math.Sqrt(k + i);
    }
}
Aggregating with Local Init/Finally
Instead of locking a shared variable, use local accumulators per thread and combine them at the end.
using System;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        long globalTotal = 0;

        Parallel.For<long>(0, 10_000_000,
            () => 0, // local total for each thread
            (i, loop, localTotal) =>
            {
                localTotal += i % 10;
                return localTotal;
            },
            localTotal => Interlocked.Add(ref globalTotal, localTotal));

        Console.WriteLine($"Total: {globalTotal}");
    }
}
Concurrent Collections
In producer/consumer scenarios, ConcurrentBag<T>, ConcurrentQueue<T>, and
ConcurrentDictionary<TKey,TValue> provide lock-free or low-lock access for thread safety.
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        var bag = new ConcurrentBag<int>();
        Parallel.For(0, 1000, i => bag.Add(i));

        int count = 0;
        while (bag.TryTake(out _)) count++;
        Console.WriteLine($"Items taken: {count}");
    }
}
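ConcurrentDictionary<TKey,TValue> is the usual choice when many threads update keyed state. A small sketch counting remainders mod 10 across parallel iterations:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        var counts = new ConcurrentDictionary<int, int>();

        Parallel.For(0, 1000, i =>
        {
            // AddOrUpdate is atomic per key, so no explicit lock is needed.
            counts.AddOrUpdate(i % 10, 1, (_, old) => old + 1);
        });

        Console.WriteLine(counts[0]); // each remainder 0..9 occurs 100 times
    }
}
```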
PLINQ (Parallel LINQ)
Use AsParallel() to execute queries over large collections in parallel.
If ordering matters, use AsOrdered(); otherwise, unordered execution improves performance.
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        var array = Enumerable.Range(1, 1_000_000).ToArray();

        // Without AsOrdered, Take(10) returns some ten matching elements,
        // not necessarily the first ten in source order.
        var result = array
            .AsParallel()
            .Where(x => x % 3 == 0)
            .Select(x => x * x)
            .Take(10)
            .ToArray();

        Console.WriteLine(string.Join(", ", result));
    }
}
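When the first ten results in source order are actually required, AsOrdered() makes the output deterministic (at some cost in throughput):

```csharp
using System;
using System.Linq;

class Program
{
    static void Main()
    {
        var array = Enumerable.Range(1, 1_000_000).ToArray();

        // AsOrdered preserves source order, so these are the squares of
        // the first ten multiples of 3.
        var ordered = array
            .AsParallel()
            .AsOrdered()
            .Where(x => x % 3 == 0)
            .Select(x => x * x)
            .Take(10)
            .ToArray();

        Console.WriteLine(string.Join(", ", ordered));
    }
}
```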
Exception Handling
In Parallel calls, multiple exceptions can be thrown; handle them with AggregateException.
In Task-based workflows, exceptions surface during await.
using System;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        try
        {
            Parallel.Invoke(
                () => throw new InvalidOperationException("A"),
                () => throw new ApplicationException("B")
            );
        }
        catch (AggregateException ex)
        {
            foreach (var e in ex.InnerExceptions)
                Console.WriteLine($"Error: {e.GetType().Name} - {e.Message}");
        }
    }
}
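By contrast, awaiting a faulted Task rethrows the original exception directly (await unwraps the AggregateException), so an ordinary catch clause works:

```csharp
using System;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        Task faulted = Task.Run(() => Fail());
        try
        {
            // The exception stored in the Task is rethrown here, unwrapped.
            await faulted;
        }
        catch (InvalidOperationException ex)
        {
            Console.WriteLine($"Caught: {ex.Message}");
        }
    }

    static void Fail() => throw new InvalidOperationException("boom");
}
```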
async/await vs Parallel
- async/await: Used for I/O-bound operations (file, network, database) to keep the UI responsive.
- Parallel/TPL: Used for CPU-bound work that benefits from multi-core parallelism.
- Avoid using Task.Run for I/O tasks unnecessarily; real performance gains occur with CPU-bound workloads.
Performance Tips and Best Practices
- Minimize shared state; combine results using local accumulators when possible.
- Avoid unnecessary locks; use Interlocked or concurrent collections when needed.
- Use MaxDegreeOfParallelism to prevent excessive context switching.
- In UI applications, avoid blocking calls like .Wait() and .Result.
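A minimal sketch of the last tip: instead of blocking with .Result (which can deadlock in UI synchronization contexts), stay asynchronous end to end. ComputeAsync here is a made-up placeholder for any async operation.

```csharp
using System;
using System.Threading.Tasks;

class Program
{
    // Risky in UI code: int n = ComputeAsync().Result;  // blocks, can deadlock
    static async Task Main()
    {
        // Preferred: await keeps the calling thread free.
        int n = await ComputeAsync();
        Console.WriteLine(n);
    }

    static async Task<int> ComputeAsync()
    {
        await Task.Delay(100); // simulated asynchronous work
        return 42;
    }
}
```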
Example: Parallelizing Image Processing
The following example simulates processing image files in a folder in parallel. In a real scenario, CPU-intensive operations like filtering or thumbnail generation could be performed here.
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        string folder = @"C:\Images";
        var files = Directory.Exists(folder)
            ? Directory.GetFiles(folder, "*.jpg")
            : Array.Empty<string>();

        var po = new ParallelOptions { MaxDegreeOfParallelism = Environment.ProcessorCount };

        Parallel.ForEach(files, po, file =>
        {
            // Simulated CPU-intensive image filtering
            var name = Path.GetFileName(file);
            Console.WriteLine($"Processing: {name}");
            Thread.SpinWait(3_000_000); // simulation
            Console.WriteLine($"Done: {name}");
        });

        Console.WriteLine("All files processed.");
    }
}
TL;DR
- TPL simplifies parallel programming: Task, Parallel, PLINQ.
- Parallel.For/ForEach spreads loops across processor cores.
- MaxDegreeOfParallelism and CancellationToken provide control and cancellation.
- Thread safety is essential: use Interlocked, local accumulators, and concurrent collections.
- async/await vs Parallel: choose based on I/O-bound vs CPU-bound workloads.
Related Articles
Asynchronous Programming Basics in C# (async/await)
Learn async and await in C# to build responsive applications with asynchronous tasks, non-blocking code, and practical examples.
Asynchronous Streams in C# (IAsyncEnumerable)
Learn asynchronous streams in C# with IAsyncEnumerable to process data step by step using modern async iteration patterns.
Process and Thread Management in C#
Learn process and thread management in C# to control execution flow, system resources, and multithreaded applications.