

Synchronization

Parallel processing allows two or more processes to run simultaneously. In various
cases, the order of events does not matter. In other cases, to ensure the correctness of
a program, we have to ensure that events occur in a specific order. Synchronization
constructs are introduced to enforce constraints in these cases.
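
As a small illustration of such an ordering constraint (a minimal sketch, not one of
the examples used later in this thesis; the names ready, producer and consumer are
made up), the .NET construct ManualResetEventSlim can be used from F# to make sure a
value is written before it is read:

open System.Threading

// The consumer must not read 'result' before the producer has signalled
// that the value is ready.
let ready = new ManualResetEventSlim(false)
let mutable result = 0

let producer =
    Thread(fun () ->
        result <- 42        // produce the value first
        ready.Set())        // ...then signal that it is available

let consumer =
    Thread(fun () ->
        ready.Wait()        // block until the producer has signalled
        printfn "result = %d" result)

consumer.Start(); producer.Start()
consumer.Join(); producer.Join()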

Race condition

This is a typical kind of error in parallel programs. It occurs when a program runs
on the same system with the same data but produces different results from run to
run. One common cause of race conditions is missing or incorrect synchronization:
reads and writes to shared variables have to be done in a correct order; otherwise,
results are unpredictably wrong.
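
A minimal F# sketch of such a race (illustrative only; the counter names and the
iteration count are arbitrary): unsynchronized parallel increments of a shared
variable lose updates, whereas guarding the read-modify-write with a lock makes the
result deterministic.

open System.Threading.Tasks

// Unsynchronized read-modify-write: the final value varies between runs and
// is usually smaller than the number of increments.
let mutable counter = 0
Parallel.For(0, 100000, fun (_: int) -> counter <- counter + 1) |> ignore
printfn "unsynchronized: %d" counter

// The same loop with a lock around the conflicting accesses is deterministic.
let gate = obj ()
let mutable safeCounter = 0
Parallel.For(0, 100000, fun (_: int) ->
    lock gate (fun () -> safeCounter <- safeCounter + 1)) |> ignore
printfn "synchronized:   %d" safeCounter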

Deadlocks

Deadlocks usually occur in parallel programs when complex coordination between tasks
is employed. When tasks block and wait for each other in a cycle, a deadlock occurs.
In general, deadlocks are not difficult to detect statically, and some constructs are
made for resolving deadlocks when they happen.
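
For illustration (a hypothetical sketch, not one of the examples used later), the
classic way to form such a cycle with only two tasks is to acquire two locks in
opposite orders:

open System.Threading
open System.Threading.Tasks

let lockA = obj ()
let lockB = obj ()

// Each task takes one lock and then tries to take the other; with the sleeps
// in place, each usually ends up waiting forever for the lock the other holds.
let task1 =
    Task.Run(fun () ->
        lock lockA (fun () ->
            Thread.Sleep 100
            lock lockB (fun () -> printfn "task1 done")))

let task2 =
    Task.Run(fun () ->
        lock lockB (fun () ->
            Thread.Sleep 100
            lock lockA (fun () -> printfn "task2 done")))

// Task.WaitAll(task1, task2)   // with the lock orders above, this may never return

Always acquiring the locks in one fixed global order removes the cycle and hence the
deadlock.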

Parallel slowdown

When a task is divided into more and more subtasks, these subtasks spend more and
more time communicating with each other; eventually, overheads of communication
dominate the running time, and further parallelization increases rather than decreases
total running time. Therefore, good parallel efficiency requires careful management of
task creation and task partitioning.
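
A minimal F# sketch of the problem and of the usual remedy (the functions and the
threshold are illustrative, not part of the benchmarks below): a naively parallel
Fibonacci spawns a task for every tiny subproblem, whereas a sequential cutoff stops
creating tasks once subproblems are too small to pay for the overhead.

open System.Threading.Tasks

// One task per recursive call: task-creation overhead dominates the work.
let rec fibNaive (n: int) : int =
    if n < 2 then n
    else
        let left = Task.Run(fun () -> fibNaive (n - 1))
        fibNaive (n - 2) + left.Result

// Only parallelize large subproblems; below the threshold, recurse sequentially.
let rec fibSeq (n: int) : int =
    if n < 2 then n else fibSeq (n - 1) + fibSeq (n - 2)

let rec fibCutoff threshold (n: int) : int =
    if n <= threshold then fibSeq n
    else
        let left = Task.Run(fun () -> fibCutoff threshold (n - 1))
        fibCutoff threshold (n - 2) + left.Result

The threshold trades available parallelism against task-creation overhead and usually
has to be tuned experimentally for the machine and workload at hand.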

2.2 Multicore parallelism on .NET framework<br />

In this section, we shall introduce the parallelism constructs of the .NET platform. To
make the introduction more interesting, we demonstrate some specific examples which
directly use those parallelism constructs. All benchmarks here are done on an 8-core
2.40 GHz Intel Xeon workstation with 8 GB of shared physical memory.
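
As a first taste of what using these constructs directly from F# looks like (an
illustrative sketch, not one of the benchmarks reported in this thesis):

open System.Threading.Tasks

// Two independent computations run concurrently as TPL tasks.
let t1 = Task.Run(fun () -> Array.sum [| 1 .. 10000 |])
let t2 = Task.Run(fun () -> Array.sum [| 10001 .. 20000 |])
printfn "total = %d" (t1.Result + t2.Result)

// Data parallelism via Array.Parallel in FSharp.Core, built on the same thread pool.
let squares = Array.Parallel.map (fun x -> x * x) [| 1 .. 10000 |]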

2.2.1 Overview<br />

Parallel programming has been supported by the .NET framework for quite some time,
and it became really mature with the Parallel Extensions (PFX) in .NET 4.0. An overall
picture of the parallel constructs is illustrated in Figure 2.2. In general, a parallel
program is written in an imperative way or in a functional way and compiled in any
language on the .NET platform. Here we are particularly interested in F# and how to
use parallel constructs in F#. Some programs may employ PLINQ execution

