Transcript
Page 1: Patterns of parallel programming

Patterns of Parallel Programming

Prepared by Yan [email protected]

@ydrugalya

Page 2: Patterns of parallel programming

Agenda

• Why parallel?
• Terms and measures
• Building Blocks
• Patterns overview
  – Pipeline and data flow
  – Producer-Consumer
  – Map-Reduce
  – Other

Page 3: Patterns of parallel programming

Why Moore's law is not working anymore

• Power consumption
• Wire delays
• DRAM access latency
• Diminishing returns of more instruction-level parallelism

Page 4: Patterns of parallel programming

Power consumption

[Chart: power density (W/cm²) vs. year, ’70–’10 — from the 8080 through Pentium® processors, power density climbs past "hot plate" toward "nuclear reactor", "rocket nozzle", and "Sun’s surface" levels.]

Page 5: Patterns of parallel programming

Wire delays

Page 6: Patterns of parallel programming

Diminishing returns

• ’80s: 10 CPI → 1 CPI
• ’90s: 1 CPI → 0.5 CPI
• ’00s: multicore

Page 7: Patterns of parallel programming

No matter how fast processors get, software consistently finds new ways to eat up the extra speed.

Herb Sutter

Page 8: Patterns of parallel programming

To scale performance, put many processing cores on the microprocessor chip.

The new edition of Moore’s law is about doubling the number of cores.

Survival

Page 9: Patterns of parallel programming

Terms & Measures

• Work = T1
• Span = T∞
• Work Law: Tp >= T1/P
• Span Law: Tp >= T∞
• Speedup: T1/Tp
  – Linear: Θ(P)
  – Perfect: P
• Parallelism: T1/T∞
• Greedy scheduling bound: Tp <= (T1 − T∞)/P + T∞
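To make these measures concrete, here is a small worked example with assumed numbers (not from the slides): take T1 = 100, T∞ = 10, and P = 4.

```latex
% Work law:   T_4 \ge T_1 / P = 100/4 = 25
% Span law:   T_4 \ge T_\infty = 10
% Greedy bound: T_4 \le (T_1 - T_\infty)/P + T_\infty = 90/4 + 10 = 32.5
% Parallelism: T_1 / T_\infty = 100/10 = 10
T_4 \in [25,\; 32.5], \qquad \text{speedup} = T_1/T_4 \in [3.08,\; 4]
```

So on 4 processors this workload can never finish faster than 25 time units, and a greedy scheduler is guaranteed to finish within 32.5.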

Page 10: Patterns of parallel programming

Definitions

• Concurrent – several things happening at the same time

• Multithreaded – multiple execution contexts

• Parallel – multiple simultaneous computations

• Asynchronous – not having to wait

Page 11: Patterns of parallel programming

Dangers

• Race conditions
• Starvation
• Deadlock
• Livelock
• Optimizing compilers
• …

Page 12: Patterns of parallel programming

Data parallelism

Parallel.ForEach(letters, ch => Capitalize(ch));
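The slide's Capitalize call is a placeholder. As a hedged, self-contained sketch of the same Parallel.ForEach shape, the example below does a parallel sum instead, using the thread-local-state overload so partitions aggregate without contention (the data and names here are illustrative, not from the slides):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

class DataParallelDemo
{
    static void Main()
    {
        var numbers = new List<int>();
        for (int i = 1; i <= 1000; i++) numbers.Add(i);

        long total = 0;
        // Each worker folds elements into a private subtotal,
        // which is merged once per worker via Interlocked.Add.
        Parallel.ForEach(
            numbers,
            () => 0L,                                   // per-thread initial state
            (n, state, subtotal) => subtotal + n,       // body: fold element in
            subtotal => Interlocked.Add(ref total, subtotal)); // merge step

        Console.WriteLine(total); // 500500
    }
}
```

The merge step is what keeps the shared `total` safe; writing `total += n` directly in the body would be a race condition.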

Page 13: Patterns of parallel programming

Task parallelism

Parallel.Invoke(() => Average(), () => Minimum() …);
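Average and Minimum on the slide are placeholders. A minimal runnable sketch of the same Parallel.Invoke pattern, with assumed data, could look like this:

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

class TaskParallelDemo
{
    static void Main()
    {
        int[] data = { 4, 1, 9, 6 };
        double average = 0;
        int minimum = 0;

        // Independent computations run as separate tasks;
        // Invoke blocks until all of them have completed.
        Parallel.Invoke(
            () => average = data.Average(),
            () => minimum = data.Min());

        Console.WriteLine($"{average} {minimum}"); // 5 1
    }
}
```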

Page 14: Patterns of parallel programming

Fork-Join

• Additional work may be started only when specific subsets of the original elements have completed processing
• All elements should be given the chance to run even if one invocation fails (Ping)

[Diagram: Fork → Compute Mean / Compute Median / Compute Mode → Join]

Parallel.Invoke(
    () => ComputeMean(),
    () => ComputeMedian(),
    () => ComputeMode());

static void MyParallelInvoke(params Action[] actions)
{
    var tasks = new Task[actions.Length];
    for (int i = 0; i < actions.Length; i++)
        tasks[i] = Task.Factory.StartNew(actions[i]);
    Task.WaitAll(tasks);
}
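The slide's note that every invocation should get a chance to run even if one fails can be sketched as follows: Task.WaitAll lets all tasks finish before surfacing failures in an AggregateException. The failing "Ping" action below is a hypothetical stand-in:

```csharp
using System;
using System.Threading.Tasks;

class ForkJoinFaults
{
    static void Main()
    {
        var results = new int[2];
        Action[] actions =
        {
            () => results[0] = 1,
            () => throw new InvalidOperationException("Ping failed"),
            () => results[1] = 2
        };

        var tasks = new Task[actions.Length];
        for (int i = 0; i < actions.Length; i++)
            tasks[i] = Task.Factory.StartNew(actions[i]);

        try
        {
            // WaitAll waits for every task, then rethrows all
            // failures together as one AggregateException.
            Task.WaitAll(tasks);
        }
        catch (AggregateException ae)
        {
            foreach (var ex in ae.InnerExceptions)
                Console.WriteLine(ex.Message); // Ping failed
        }

        Console.WriteLine(results[0] + results[1]); // 3
    }
}
```

Both healthy actions complete even though the middle one throws, which is exactly the guarantee the slide asks for.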

Page 15: Patterns of parallel programming

Pipeline pattern

Task 1 → Task 2 → Task 3

Task<int> T1 = Task.Factory.StartNew(() => { return result1(); });

Task<double> T2 = T1.ContinueWith((antecedent) => { return result2(antecedent.Result); });

Task<double> T3 = T2.ContinueWith((antecedent) => { return result3(antecedent.Result); });
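result1/result2/result3 on the slide are placeholders; a minimal runnable version of that continuation chain, with assumed stage functions, could look like this:

```csharp
using System;
using System.Threading.Tasks;

class PipelineDemo
{
    static int Stage1() => 21;
    static double Stage2(int x) => x * 2.0;
    static double Stage3(double x) => x + 0.5;

    static void Main()
    {
        // Each ContinueWith schedules the next stage once its
        // antecedent completes, forming a three-stage pipeline.
        Task<int> t1 = Task.Factory.StartNew(Stage1);
        Task<double> t2 = t1.ContinueWith(a => Stage2(a.Result));
        Task<double> t3 = t2.ContinueWith(a => Stage3(a.Result));

        Console.WriteLine(t3.Result); // 21 * 2.0 + 0.5 = 42.5
    }
}
```

Note how the element type can change between stages (int → double), since each continuation carries its own Task<TResult>.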

Page 16: Patterns of parallel programming

Producer/Consumer

BlockingCollection<T>

[Diagram: Disk/Net → Read 1 / Read 2 / Read 3 → BlockingCollection<T> → Process / Process / Process]
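A minimal sketch of this pattern with BlockingCollection<T> (the reader/processor here just move integers; the real slides imply disk or network reads):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ProducerConsumerDemo
{
    static void Main()
    {
        // Bounded buffer: producers block when it is full,
        // consumers block when it is empty.
        using var queue = new BlockingCollection<int>(boundedCapacity: 4);
        long sum = 0;

        var producer = Task.Run(() =>
        {
            for (int i = 1; i <= 100; i++)
                queue.Add(i);          // stands in for Read from disk/net
            queue.CompleteAdding();    // signals "no more items"
        });

        var consumer = Task.Run(() =>
        {
            // GetConsumingEnumerable ends once the collection is
            // marked complete and fully drained.
            foreach (int item in queue.GetConsumingEnumerable())
                sum += item;           // stands in for Process
        });

        Task.WaitAll(producer, consumer);
        Console.WriteLine(sum); // 5050
    }
}
```

The bounded capacity is the point of the pattern: a fast producer cannot run arbitrarily far ahead of slow consumers, so memory use stays flat.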

Page 17: Patterns of parallel programming
Page 18: Patterns of parallel programming

Other patterns

• Speculative Execution
• APM (IAsyncResult, Begin/End pairs)
• EAP (Operation/Callback pairs)

Page 19: Patterns of parallel programming

References

• Patterns for Parallel Programming: Understanding and Applying Parallel Patterns with the .NET Framework 4
• Pluralsight:
  – Introduction to Async and Parallel Programming in .NET 4
  – Async and Parallel Programming: Application Design
• The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software
• Chapter 27, “Multithreaded Algorithms”, from Introduction to Algorithms, 3rd edition
