TGQR 2010Q2 Report.pdf - Teragridforum.org

Further details can be found in §6.2.1.

To support the great diversity of research activities and their wide range of resource needs, our user support and operations teams leverage expertise across the eleven TeraGrid Resource Providers (§5, §8). In addition, users benefit from our coordinated education, outreach, and training activities (§9).

1.3 TeraGrid’s Integrated, Distributed Environment

TeraGrid’s diverse set of HPC resources provides a rich computational science environment. TeraGrid RPs operate more than 25 highly reliable HPC resources and 10 storage resources that are available via a central allocations and accounting process for the national academic community. More detailed information is available on compute resources at www.teragrid.org/userinfo/hardware/resources.php and on storage resources at www.teragrid.org/userinfo/hardware/dataresources.php.

In Q2 2010, TeraGrid again saw increases in delivered NUs. TeraGrid compute resources delivered 12.1 billion NUs to users, 0.3% of them as TeraGrid Roaming NUs (Figure 8-11). Roaming NUs continued to decline as that allocation feature is phased out. The 12.1 billion NUs represent a 20% increase over the NUs delivered in Q1 2010, and nearly 100% of those NUs were delivered to allocated users. The Q2 2010 NUs delivered are 1.7x the NUs delivered in Q2 2009. Further details can be found in §8.11.
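The growth figures above imply the totals for the comparison quarters, which the report does not state directly. A minimal arithmetic sketch (the Q1 2010, Q2 2009, and Roaming totals below are derived from the reported ratios, not quoted from the report):

```python
# Sanity-check of the reported NU (Normalized Unit) delivery figures.
q2_2010_nus = 12.1e9      # NUs delivered in Q2 2010 (reported)
q1_growth = 0.20          # reported 20% increase over Q1 2010
yoy_factor = 1.7          # reported 1.7x the Q2 2009 total
roaming_share = 0.003     # reported 0.3% delivered as Roaming NUs

# Implied comparison-quarter totals (derived, not stated in the report)
q1_2010_nus = q2_2010_nus / (1 + q1_growth)   # ~10.1 billion NUs
q2_2009_nus = q2_2010_nus / yoy_factor        # ~7.1 billion NUs
roaming_nus = q2_2010_nus * roaming_share     # ~36 million NUs
```

These derived values are rounded estimates; the authoritative quarterly totals are in §8.11.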

1.4 Organizational Architecture

The coordination and management of the TeraGrid partners and resources requires organizational and collaboration mechanisms that are different from a classic organizational structure for single organizations. The existing structure and practice has evolved from many years of collaborative arrangements between the centers, some predating the TeraGrid. As the TeraGrid moves forward, the inter-relationships continue to evolve in the context of a persistent collaborative environment.

Figure 1-1: TeraGrid Facility Partner Institutions

The TeraGrid team (Figure 1-1) is composed of eleven RPs and the GIG, which in turn has subawards to the RPs plus six additional Software Integration partners. The GIG provides coordination, operations, software integration, management, and planning. GIG area directors (ADs) direct project activities involving staff from multiple partner sites, coordinating and maintaining TeraGrid central services.

TeraGrid policy and governance rests with the TeraGrid Forum (TG Forum), comprising the eleven RP principal investigators and the GIG principal investigator. The TG Forum is led by an

