
2.3. Concurrent vs. Parallel Programming: Definitions

on top of lower-level abstractions, which can fall into a different category. For instance, as discussed after these definitions, parallel programming abstractions are often implemented in terms of concurrent programming abstractions. Thus, these definitions have to be interpreted on a fixed and common abstraction level.

Definition 1. Parallel programming is the art of devising a strategy to coordinate collaborating activities to contribute to the computation of an overall result by employing multiple computational resources.

Definition 2. Concurrent programming is the art of devising a strategy to coordinate independent activities at runtime to access shared resources while preserving the resources’ invariants.

This means that parallel programming is distinct from concurrent programming because it provides techniques to employ multiple computational resources, while concurrent programming provides techniques to preserve semantics, i. e., the correctness of computations done by independent interacting activities that use shared resources.

Furthermore, an important aspect of parallel programming is the decomposition of a problem into cooperating activities that can execute in parallel to produce an overall result. Therefore, the related concepts include mechanisms to coordinate activities and communicate between them. This coordination can be done by statically planning out interactions, for instance to reduce communication; however, it usually also needs to involve a strategy for the communication at runtime, i. e., the dynamic coordination.
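To make this notion of decomposition and coordination concrete, the following minimal sketch (in Java, used here purely for illustration; the class name ParallelSum, the threshold, and the array size are hypothetical and not part of the original text) decomposes a summation into subtasks that execute on multiple computational resources and then coordinates them to combine their partial results:

    import java.util.concurrent.ForkJoinPool;
    import java.util.concurrent.RecursiveTask;

    // Illustrative sketch: decompose a summation into cooperating subtasks.
    public class ParallelSum extends RecursiveTask<Long> {
      private static final int THRESHOLD = 10_000;
      private final long[] data;
      private final int from, to;

      public ParallelSum(long[] data, int from, int to) {
        this.data = data; this.from = from; this.to = to;
      }

      @Override
      protected Long compute() {
        if (to - from <= THRESHOLD) {        // small enough: compute sequentially
          long sum = 0;
          for (int i = from; i < to; i++) { sum += data[i]; }
          return sum;
        }
        int mid = (from + to) / 2;           // decompose the problem into two halves
        ParallelSum left  = new ParallelSum(data, from, mid);
        ParallelSum right = new ParallelSum(data, mid, to);
        left.fork();                         // execute the left half in parallel
        long rightSum = right.compute();     // compute the right half in this activity
        return left.join() + rightSum;       // coordinate: combine the partial results
      }

      public static void main(String[] args) {
        long[] data = new long[1_000_000];
        java.util.Arrays.fill(data, 1);
        long result = ForkJoinPool.commonPool()
                                  .invoke(new ParallelSum(data, 0, data.length));
        System.out.println(result);          // prints 1000000
      }
    }

The decomposition here is static (split at the midpoint until a threshold is reached), while the coordination of the resulting activities, i. e., forking and joining, happens at runtime.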

In contrast, concurrent programming concepts include techniques to protect resources, for instance by requiring the use of locks and monitors, or by enforcing properties such as isolation at runtime, preventing undesirable access to shared resources. The notion of protecting invariants, i. e., resources, is important because the interacting activities are independent. They only interact based on conventions such as locking protocols or via constrained interfaces such as messaging protocols to preserve the invariants of the shared resources.
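As a minimal sketch of such protection of invariants (again in Java, purely for illustration; the Account class, its invariant, and the amounts are hypothetical), a monitor can guard a shared resource so that independent activities interact only through the locking convention it imposes:

    // Illustrative sketch: a shared resource whose invariant (balance >= 0)
    // is preserved by a monitor; independent activities interact only via
    // the locking convention imposed by the synchronized methods.
    public class Account {
      private long balance = 0;              // invariant: balance >= 0

      public synchronized void deposit(long amount) {
        balance += amount;
        notifyAll();                         // wake activities waiting to withdraw
      }

      public synchronized void withdraw(long amount) throws InterruptedException {
        while (balance < amount) {           // withdrawing now would break the invariant
          wait();                            // so wait until a deposit restores it
        }
        balance -= amount;
      }

      public static void main(String[] args) throws InterruptedException {
        Account account = new Account();
        Thread producer = new Thread(() -> {
          for (int i = 0; i < 100; i++) { account.deposit(10); }
        });
        Thread consumer = new Thread(() -> {
          try {
            for (int i = 0; i < 100; i++) { account.withdraw(10); }
          } catch (InterruptedException e) { Thread.currentThread().interrupt(); }
        });
        producer.start(); consumer.start();
        producer.join();  consumer.join();
        System.out.println("final balance: " + account.balance);  // prints 0
      }
    }

Note that the two threads never coordinate a joint computation; they merely access the shared account under a convention (the monitor) that preserves its invariant, which is exactly the concern of Definition 2.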

The nature of activities remains explicitly undefined. An activity can therefore be represented for instance by a light-weight task, a thread, or an operating system process, but it could as well be represented by the abstract notion of an actor.

Note that these definitions do not preclude the combination of concurrent and parallel programming. Neither do they preclude the fact that programming concepts can build on each other, as discussed in the beginning of this section.

