
we’re using it to pass the method delegate to the completion callback, so we can call EndInvoke on it.
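For instance, the pattern being described here (passing the method delegate as the user state object, then recovering it in the callback to call EndInvoke) might look roughly like this; the Work method and Done callback are made-up names, and the sketch assumes the .NET Framework, where delegates expose BeginInvoke/EndInvoke:

using System;

class Program
{
  static void Main()
  {
    Func<string, int> method = Work;
    // Pass the delegate itself as the user state object (the last argument),
    // so the callback can retrieve it and call EndInvoke on it.
    method.BeginInvoke ("test", Done, method);
    Console.ReadLine();            // Keep the process alive for the pool-thread callback.
  }

  static int Work (string s) { return s.Length; }

  static void Done (IAsyncResult cookie)
  {
    var target = (Func<string, int>) cookie.AsyncState;   // Recover the delegate...
    int result = target.EndInvoke (cookie);               // ...and complete the call.
    Console.WriteLine ("String length is: " + result);
  }
}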

Optimizing the Thread Pool

The thread pool starts out with one thread in its pool. As tasks are assigned, the pool manager “injects” new threads to cope with the extra concurrent workload, up to a maximum limit. After a sufficient period of inactivity, the pool manager may “retire” threads if it suspects that doing so will lead to better throughput.

You can set the upper limit of threads that the pool will create by calling ThreadPool.SetMaxThreads; the defaults are:

• 1023 in Framework 4.0 in a 32-bit environment
• 32768 in Framework 4.0 in a 64-bit environment
• 250 per core in Framework 3.5
• 25 per core in Framework 2.0

(These figures may vary according to the hardware and operating system.) The reason for there being that many is to ensure progress should some threads be blocked (idling while awaiting some condition, such as a response from a remote computer).
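To see which limits actually apply on a particular machine, you can query the pool; this is just an illustrative sketch (GetMaxThreads and GetMinThreads report worker threads and I/O completion threads separately):

using System;
using System.Threading;

class QueryPoolLimits
{
  static void Main()
  {
    int maxWorker, maxIO, minWorker, minIO;
    ThreadPool.GetMaxThreads (out maxWorker, out maxIO);   // Upper limits
    ThreadPool.GetMinThreads (out minWorker, out minIO);   // Lower limits

    Console.WriteLine ("Max worker threads: {0}, max I/O threads: {1}", maxWorker, maxIO);
    Console.WriteLine ("Min worker threads: {0}, min I/O threads: {1}", minWorker, minIO);
  }
}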

You can also set a lower limit by calling ThreadPool.SetMinThreads. The role of the lower limit is subtler: it’s an advanced optimization technique that instructs the pool manager not to delay in the allocation of threads until reaching the lower limit. Raising the minimum thread count improves concurrency when there are blocked threads (see sidebar).

The default lower limit is one thread per processor core, which is the minimum that allows full CPU utilization. On server environments, though (such as ASP.NET under IIS), the lower limit is typically much higher: as much as 50 or more.
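A rough sketch of how both knobs might be applied follows; the numbers are arbitrary and chosen purely for illustration. Both methods take separate worker and I/O completion thread counts and return false if the request is refused:

using System;
using System.Threading;

class TunePool
{
  static void Main()
  {
    // Cap the pool at 500 worker and 500 I/O completion threads (illustrative values).
    bool maxOk = ThreadPool.SetMaxThreads (500, 500);

    // Ask the pool manager to allocate up to 50 threads without its usual injection delay,
    // which helps when many pool threads block on slow operations.
    bool minOk = ThreadPool.SetMinThreads (50, 50);

    Console.WriteLine ("SetMaxThreads succeeded: " + maxOk);
    Console.WriteLine ("SetMinThreads succeeded: " + minOk);
  }
}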

Synchronization

So far, we’ve described how to start a task on a thread, configure a thread, and pass data in both directions. We’ve also described how local variables are private to a thread and how references can be shared among threads, allowing them to communicate via common fields.

The next step is synchronization: coordinating the actions of threads for a predictable outcome. Synchronization is particularly important when threads access the same data; it’s surprisingly easy to run aground in this area.

Synchronization constructs can be divided into four categories:

Simple blocking methods
These wait for another thread to finish or for a period of time to elapse. Sleep, Join, and Task.Wait are simple blocking methods.
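A brief sketch showing those three blocking calls side by side; the work each thread or task performs here is invented purely for illustration:

using System;
using System.Threading;
using System.Threading.Tasks;

class BlockingExamples
{
  static void Main()
  {
    // Thread.Sleep blocks the current thread for a given period.
    Thread.Sleep (TimeSpan.FromMilliseconds (100));

    // Thread.Join blocks until another thread ends.
    Thread worker = new Thread (() => Console.WriteLine ("Worker thread done"));
    worker.Start();
    worker.Join();                      // Wait for the worker to finish.

    // Task.Wait blocks until a task completes (rethrowing any exception it threw).
    Task task = Task.Factory.StartNew (() => Console.WriteLine ("Task done"));
    task.Wait();
  }
}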

