
2. Context and Motivation

tasks with automatic and efficient load-balancing. When the execution model provides only time shared execution, the overhead of fork/join is in most implementations prohibitive [Kumar et al., 2012], especially when compared to a sequential recursive implementation. Furthermore, the problem of load-balancing that is solved by work-stealing does not exist in the first place. To conclude, parallel programming needs to solve problems that do not exist for concurrent programs, and parallel programs require solutions that are not necessarily applicable to concurrent programs.
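The fork/join overhead can be illustrated with a minimal sketch: the classic naive Fibonacci, computed once by plain recursion and once as a fork/join task on Java's ForkJoinPool. The class and method names are chosen for this illustration; for such fine-grained tasks, creating and scheduling task objects typically costs far more than the computation itself, which is the overhead referred to above.

```java
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Sketch: the same naive Fibonacci, sequentially and as a fork/join task.
// For tasks this fine-grained, task creation and scheduling dominate the
// actual work, illustrating the overhead discussed in the text.
public class FibDemo {
    static long fibSeq(int n) {
        return n < 2 ? n : fibSeq(n - 1) + fibSeq(n - 2);
    }

    static class FibTask extends RecursiveTask<Long> {
        final int n;
        FibTask(int n) { this.n = n; }
        @Override protected Long compute() {
            if (n < 2) return (long) n;
            FibTask left = new FibTask(n - 1);
            left.fork();                         // make subtask available for stealing
            long right = new FibTask(n - 2).compute();
            return right + left.join();
        }
    }

    public static void main(String[] args) {
        long seq = fibSeq(20);
        long par = new ForkJoinPool().invoke(new FibTask(20));
        System.out.println(seq + " " + par);     // both yield the same result
    }
}
```

Under a time shared execution model, the work-stealing machinery behind `ForkJoinPool` buys nothing, since there are no idle cores to balance load onto.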

Conversely, low-level atomic operations such as compare-and-swap have been conceived in the context of few-core systems. One important use case was to ensure that an operation is atomic with respect to interrupts on the same core. One artifact of the design for time shared systems is that Intel’s compare-and-swap operation in the IA-32 instruction set (CMPXCHG) requires an additional LOCK prefix to be atomic in multicore environments [Intel Corporation, 2012].
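A typical use of compare-and-swap is a lock-free update loop, sketched here in Java with `AtomicInteger.compareAndSet` (the class and method names below are illustrative choices, not taken from the text). On IA-32, JVMs typically implement this operation with the LOCK-prefixed CMPXCHG instruction mentioned above.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch: a lock-free increment built on compare-and-swap. On IA-32,
// compareAndSet is typically compiled to LOCK CMPXCHG; without the LOCK
// prefix, CMPXCHG is only atomic with respect to interrupts on one core.
public class CasIncrement {
    static final AtomicInteger counter = new AtomicInteger(0);

    static int increment() {
        int old;
        do {
            old = counter.get();
        } while (!counter.compareAndSet(old, old + 1)); // retry if another thread interfered
        return old + 1;
    }

    public static void main(String[] args) {
        increment();
        increment();
        System.out.println(counter.get());
    }
}
```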

Similar to the argument that parallel programming concepts do not necessarily apply to time shared systems, the usefulness of low-level atomic operations originating in concurrent systems is limited. Used naively, they restrict scalability, and their usefulness diminishes with a rising degree of parallelism [Shavit, 2011; Ungar, 2011]. They are designed to solve a particular set of problems in concurrent programs but are not necessarily applicable to the problems in parallel programs.
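The scalability limitation can be made concrete with a sketch, under the assumption that all threads update a single shared location: every failed compare-and-swap is wasted work, and the number of retries grows with the number of contending threads.

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicLong;

// Sketch: several threads hammering one CAS-based counter. The result
// stays correct, but each failed compareAndSet is wasted work, and the
// retry count grows with the degree of parallelism -- the scalability
// limitation referred to above.
public class CasContention {
    static final AtomicInteger counter = new AtomicInteger(0);
    static final AtomicLong retries = new AtomicLong(0);

    static void increment() {
        while (true) {
            int old = counter.get();
            if (counter.compareAndSet(old, old + 1)) return;
            retries.incrementAndGet();  // another thread won the race
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread[] ts = new Thread[8];
        for (int i = 0; i < ts.length; i++) {
            ts[i] = new Thread(() -> { for (int j = 0; j < 100_000; j++) increment(); });
            ts[i].start();
        }
        for (Thread t : ts) t.join();
        System.out.println("count: " + counter.get());  // correct despite contention
        System.out.println("retries: " + retries.get());
    }
}
```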

Concluding from these examples, it would be beneficial to treat concurrent programming and parallel programming separately to properly reflect the characteristics and applicability of the corresponding programming concepts.

2.3.2. Concurrent Programming and Parallel Programming

This section defines the notions of concurrent programming and parallel programming to create two disjoint sets of programming concepts. Instead of focusing on the execution model, as earlier definitions do, the proposed definitions concentrate on the aspect of programming, i.e., the process of formalizing an algorithm using a number of programming concepts with a specific intent and goal. The distinction between the execution model and the act of programming is made explicitly to avoid confusion with the common usage of the terms concurrency and parallelism.

One inherent assumption of these definitions is that they relate the notions of concurrent and parallel programming to each other on a fixed level of abstraction. Without assuming a fixed level of abstraction, the discussion easily becomes confusing, because higher-level programming abstractions are typically built
