TGQR 2010Q4 Report.pdf - Teragridforum.org
NSF Extensible Terascale Facility
TeraGrid Report for
Q4: October 1, 2010 through December 31, 2010

Principal Investigators (GIG and RPs)
Ian Foster (GIG) – University of Chicago/Argonne National Laboratory (UC/ANL)
Phil Andrews – University of Tennessee (UT-NICS)
Jay Boisseau – Texas Advanced Computing Center (TACC)
John Cobb – Oak Ridge National Laboratory (ORNL)
Michael Levine – Pittsburgh Supercomputing Center (PSC)
Rich Loft – National Center for Atmospheric Research (NCAR)
Honggao Liu – Louisiana Optical Network Initiative/Louisiana State University (LONI/LSU)
Richard Moore – San Diego Supercomputer Center (SDSC)
Carol Song – Purdue University (PU)
Rick Stevens – University of Chicago/Argonne National Laboratory (UC/ANL)
Craig Stewart – Indiana University (IU)
John Towns – National Center for Supercomputing Applications (NCSA)
TeraGrid Forum (TGF)
John Towns (NCSA)
Matt Heinzel (UC/ANL)
Phil Andrews (UT-NICS)
Jay Boisseau (TACC)
John Cobb (ORNL)
Honggao Liu (LONI/LSU)
Michael Levine (PSC)
Rich Loft (NCAR)
Richard Moore (SDSC)
Mike Papka (UC/ANL)
Carol Song (PU)
Craig Stewart (IU)
Grid Infrastructure Group
Ian Foster (GIG) – PI of TeraGrid GIG
Matt Heinzel (UC) – Director of the TeraGrid GIG
Tim Cockerill (NCSA) – Project Management Working Group
Kelly Gaither (TACC) – Visualization
Christopher Jordan (TACC) – Data Analysis
Daniel S. Katz (UC/ANL) – GIG Director of Science
Scott Lathrop (UC/ANL) – Education, Outreach and Training; External Relations
Elizabeth Leake (UC) – External Relations
Lee Liming (UC/ANL) – Software Integration and Scheduling
Amit Majumdar (SDSC) – Advanced User Support
J.P. Navarro (UC/ANL) – Software Integration and Scheduling
Mike Northrop (UC) – GIG Project Manager
Jeff Koerner (UC/ANL) – Networking, Operations and Security
Jeff Koerner (UC/ANL) – User Facing Projects
Sergiu Sanielevici (PSC) – User Services and Support
Nancy Wilkins-Diehr (SDSC) – Science Gateways
Carolyn Peters (ANL) – Event Coordinator/Administration
Working Group Leaders
Accounting – David Hart (SDSC)
Advanced User Support – Amit Majumdar (SDSC)
Allocations – Kent Milfield (TACC)
Common User Environment – Shawn Brown (PSC)
Core Services – Jeff Koerner (UC/ANL)
Data – Christopher Jordan (TACC)
Education, Outreach and Training – Scott Lathrop (UC/ANL)
External Relations – Elizabeth Leake (UC)
Extreme Scalability – Nick Nystrom (PSC)
Networking – Linda Winkler (ANL)
Operations – Jeff Koerner (UC/ANL)
Project Management – Tim Cockerill (NCSA)
Quality Assurance – Kate Ericson (SDSC), Shava Smallen (SDSC)
Scheduling – Warren Smith (TACC)
Science Gateways – Nancy Wilkins-Diehr (SDSC)
Security – Jim Marsteller (PSC)
Software – Lee Liming (UC/ANL), JP Navarro (UC/ANL)
User Facing Projects – Jeff Koerner (UC/ANL)
User Services – Sergiu Sanielevici (PSC)
Visualization – Kelly Gaither (TACC), Mike Papka (UC/ANL)
TERAGRID QUARTERLY REPORT

Contents

1 Overview
   1.1 Scope
   1.2 Communities Served
   1.3 TeraGrid’s Integrated, Distributed Environment
   1.4 Organizational Architecture
2 Science and Engineering Highlights
   2.1 List of Science and Engineering Highlights
   2.2 Science and Engineering Highlights
3 Software Integration and Scheduling
   3.1 Highlights
   3.2 Coordination and Outreach
   3.3 Capability Expansion and Maintenance
   3.4 Advanced Scheduling Capabilities
   3.5 Information Services Enhancements
   3.6 Operational Issues
4 Science Gateways
   4.1 Highlights
   4.2 Targeted Support
   4.3 Gateway Infrastructure and Services
   4.4 RP Operations: SGW Operations
5 Users and User Support
   5.1 Highlights
   5.2 User Engagement
   5.3 Frontline User Support
   5.4 RP Operations: User Services
   5.5 Advanced User Support
6 User Facing Projects and Core Services
   6.1 Highlights
   6.2 Enhanced TeraGrid User Access
   6.3 Allocations Process and Allocation Management
   6.4 RP Integration and Information Sharing
   6.5 RP Operations: Accounting/Core Services
   6.6 User Information Presentation
   6.7 RP Operations: User Facing Projects
   6.8 Information Production and Quality Assurance
7 Data and Visualization
   7.1 Highlights
   7.2 Data
   7.3 RP Operations: Data
   7.4 Visualization
   7.5 RP Operations: Visualization
8 Network Operations and Security
   8.1 Highlights
   8.2 Networking
   8.3 RP Operations: Networking
   8.4 INCA
   8.5 Grid Services Usage
   8.6 RP Operations: HPC Operations
   8.7 Security
   8.8 RP Operations: Security
   8.9 Common User Environment
   8.10 System Performance Metrics
9 Evaluation, Monitoring, and Auditing
   9.1 XD – TIS
   9.2 Quality Assurance
   9.3 XD – TAS
10 Education, Outreach, and Training; and External Relations
   10.1 Joint EOT-ER Outreach Highlights
   10.2 EOT Impact Stories
   10.3 EOT Highlights
   10.4 Training, Education and Outreach
   10.5 RP Operations: Training
   10.6 RP Operations: Education
   10.7 RP Operations: Outreach
   10.8 External Relations
   10.9 Enhancement of Diversity
   10.10 International Collaborations
   10.11 Broader Impacts
11 Project Management
   11.1 Highlights
   11.2 Project Management Working Group
12 Science Advisory Board Interactions
13 Collaborations with Non-TeraGrid Institutions
   13.1 Data Systems
   13.2 Science Projects
14 TeraGrid/Open Science Grid (OSG) Joint Activity
   14.1 ExTENCI
1 Overview
The TeraGrid is an open cyberinfrastructure that enables and supports leading-edge scientific discovery and promotes science and technology education. The TeraGrid comprises supercomputing and massive storage systems, visualization resources, data collections, and science gateways, connected by high-bandwidth networks, integrated by coordinated policies and operations, and supported by computational science and technology experts.

TeraGrid’s objectives are accomplished via a three-pronged strategy: to support the most advanced computational science in multiple domains (deep impact), to empower new communities of users (wide impact), and to provide resources and services that can be extended to a broader cyberinfrastructure (open infrastructure). This “deep, wide, and open” strategy guides the development, deployment, operations, and support activities to ensure maximum impact on science research and education across communities.
1.1 Scope
TeraGrid is an integrated, national-scale computational science infrastructure operated in a partnership comprising the Grid Infrastructure Group (GIG), eleven Resource Provider (RP) institutions, and six Software Integration partners, with funding from the National Science Foundation’s (NSF) Office of Cyberinfrastructure (OCI). Initially created as the Distributed Terascale Facility through a Major Research Equipment (MRE) award in 2001, the TeraGrid began providing production computing, storage, and visualization services to the national community in October 2004. In August 2005, NSF funded a five-year program to operate, enhance, and expand the capacity and capabilities of the TeraGrid to meet the growing needs of the science and engineering community through 2010, and then extended the TeraGrid an additional year into 2011 to provide an extended planning phase in preparation for TeraGrid Phase III eXtreme Digital (XD).
Accomplishing this vision is crucial for the advancement of many areas of scientific discovery, ensuring US scientific leadership, and increasingly, for addressing important societal issues. TeraGrid achieves its purpose and fulfills its mission through a three-pronged strategy:

Deep: ensure profound impact for the most experienced users, through provision of the most powerful computational resources and advanced computational expertise;

Wide: enable scientific discovery by broader and more diverse communities of researchers and educators who can leverage TeraGrid’s high-end resources, portals and science gateways; and

Open: facilitate simple integration with the broader cyberinfrastructure through the use of open interfaces, partnerships with other grids, and collaborations with other science research groups delivering and supporting open cyberinfrastructure facilities.
The TeraGrid’s deep goal is to enable transformational scientific discovery through leadership in HPC for high-end computational research. The TeraGrid is designed to enable high-end science utilizing powerful supercomputing systems and high-end resources for the data analysis, visualization, management, storage, and transfer capabilities required by large-scale simulation and analysis. All of this requires an increasingly diverse set of leadership-class resources and services, and deep intellectual expertise in the application of advanced computing technologies.

The TeraGrid’s wide goal is to increase the overall impact of TeraGrid’s advanced computational resources to larger and more diverse research and education communities through user interfaces and portals, domain specific gateways, and enhanced support that facilitate scientific discovery by people without requiring them to become high performance computing experts. The complexity of using TeraGrid’s high-end resources will continue to grow as systems increase in scale and evolve with new technologies. TeraGrid broadens the scientific user base of its resources via the development and support of simpler but powerful interfaces to resources, ranging from establishing common user environments to developing and hosting Science Gateways and portals. TeraGrid also provides focused outreach and collaboration with science domain research groups, and conducts educational and outreach activities that help inspire and educate the next generation of America’s leading-edge scientists.

TeraGrid’s open goal is twofold: to enable the extensibility and evolution of the TeraGrid by using open standards and interfaces; and to ensure that the TeraGrid is interoperable with other, open standards-based cyberinfrastructure facilities. While TeraGrid only provides (integrated) high-end resources, it must enable its high-end cyberinfrastructure to be more accessible from, and even federated or integrated with, cyberinfrastructure of all scales. That includes not just other grids, but also campus cyberinfrastructures and even individual researcher labs/systems. The TeraGrid leads the community forward by providing an open infrastructure that enables, simplifies, and even encourages scaling out to its leadership-class resources by establishing models in which computational resources can be integrated both for current and new modalities of science. This openness includes open standards and interfaces, but goes further to include appropriate policies, support, training, and community building.
The TeraGrid’s integrated resource portfolio includes 19 high-performance computing (HPC) systems, several massive storage systems, remote visualization resources, and a dedicated interconnection network. This infrastructure is integrated at several levels: policy and planning, operational and user support, and software and services. Policy and planning integration facilitates coordinated management and evolution of the TeraGrid environment and allows us to present—as much as possible—a single cyberinfrastructure to the user community. Operational and user support integration allows the user community to interact with one or more of the many distinct resources and HPC centers that comprise TeraGrid, through a common service, training, and support organization—masking the complexity of a distributed organization. In addition, this allows the national user community to request allocations through a single national review process and use the resources of the distributed facility with a single allocation. Software and services integration creates a user environment with standard service interfaces, lowering barriers to porting applications, enabling users to readily exploit the many TeraGrid resources to optimize their workload, and catalyzing a new generation of scientific discovery through distributed computing modalities. Science Gateways, web-based portals that present TeraGrid to specific communities, represent a new paradigm in high-performance computing with the potential to impact a wide audience. Gateways benefit significantly from the integration of software and services across the TeraGrid.
1.2 Communities Served
The national, and global, user community that relies on TeraGrid has grown tremendously during the past four years, from fewer than 1,000 users in October 2005 to nearly 5,000 active users and more than 10,000 total lifetime users at the end of 2009.

The fourth quarter has typically seen a slight dip in TeraGrid user metrics, and Q4 2010 was no exception. Even so, 515 new individuals joined the ranks of TeraGrid users, and in October, 1,373 individuals charged jobs on TeraGrid resources, a new monthly high. TeraGrid users numbered 6,505 at the end of the quarter, including 5,563 individuals with current TeraGrid accounts. Of the 5,563 individuals, 1,904 (34%) were associated with jobs recorded in the TeraGrid central accounting system; an additional 942 users submitted jobs via science gateways—that is, nearly one-third of TeraGrid users submitting jobs worked through gateways. Further details can be found in §6.2.1.

To support the great diversity of research activities and their wide range of resource needs, our user support and operations teams leverage the expertise across the eleven TeraGrid Resource Providers (§5, §8). In addition, users benefit from our coordinated education, outreach, and training activities (§10).
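The account and job-submission percentages quoted above reduce to simple arithmetic. The sketch below is illustrative only; the variable names are ours and not part of any TeraGrid accounting tooling:

```python
# Back-of-the-envelope check of the Q4 2010 user figures quoted above.
current_accounts = 5563    # individuals with current TeraGrid accounts
direct_job_users = 1904    # users with jobs in the central accounting system
gateway_job_users = 942    # additional users who submitted jobs via gateways

# Share of current account holders with jobs in central accounting.
direct_share = direct_job_users / current_accounts

# Of everyone who submitted jobs, the fraction working through gateways.
all_job_users = direct_job_users + gateway_job_users
gateway_share = gateway_job_users / all_job_users

print(f"{direct_share:.0%} of account holders ran jobs directly; "
      f"{gateway_share:.0%} of job submitters used gateways")
```

Both ratios round to the figures given in the text (34% and nearly one-third, respectively).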
1.3 TeraGrid’s Integrated, Distributed Environment
TeraGrid’s diverse set of HPC resources provides a rich computational science environment. TeraGrid RPs operate more than 25 highly reliable HPC resources and 10 storage resources that are available via a central allocations and accounting process for the national academic community. More detailed information is available on compute resources at https://www.teragrid.org/web/user-support/resources and on storage resources at https://www.teragrid.org/web/user-support/data_and_viz.

TeraGrid saw a slight dip in delivered NUs for Q4 2010. TeraGrid compute resources delivered 13.2 billion NUs to users, down 0.2 billion, or about 1.5%, from Q3 (Figure 8-11). However, the 13.2 billion NUs represent a 1.7x increase over the NUs delivered in Q4 2009. Nearly 100% of those NUs were delivered to allocated users. Further details can be found in §8.10.
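The quarter-over-quarter and year-over-year comparisons above can be made explicit with a short sketch. The Q3 2010 and Q4 2009 totals below are derived from the figures quoted in this section, not quoted directly in the report:

```python
# Illustrative check of the delivered normalized-unit (NU) figures above.
q4_2010_nus = 13.2e9      # NUs delivered in Q4 2010
drop_from_q3 = 0.2e9      # quarter-over-quarter decline

q3_2010_nus = q4_2010_nus + drop_from_q3        # implied Q3 2010 total
decline = drop_from_q3 / q3_2010_nus            # ~1.5% decline

# The 1.7x year-over-year growth implies the Q4 2009 total.
implied_q4_2009 = q4_2010_nus / 1.7             # ~7.8 billion NUs

print(f"implied Q3 2010: {q3_2010_nus/1e9:.1f}B NUs, "
      f"decline {decline:.1%}, "
      f"implied Q4 2009: {implied_q4_2009/1e9:.1f}B NUs")
```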
1.4 Organizational Architecture
The coordination and management of the TeraGrid partners and resources requires organizational and collaboration mechanisms that differ from a classic organizational structure for single organizations. The existing structure and practice has evolved from many years of collaborative arrangements between the centers, some predating the TeraGrid. As the TeraGrid moves forward, the inter-relationships continue to evolve in the context of a persistent collaborative environment.

Figure 1-1: TeraGrid Facility Partner Institutions
The TeraGrid team (Figure 1-1) is composed of eleven RPs and the GIG, which in turn has subawards to the RPs plus six additional Software Integration partners. The GIG provides coordination, operations, software integration, management and planning. GIG area directors (ADs) direct project activities involving staff from multiple partner sites, coordinating and maintaining TeraGrid central services.

TeraGrid policy and governance rests with the TeraGrid Forum (TG Forum), comprising the eleven RP principal investigators and the GIG principal investigator. The TG Forum is led by an elected Chairperson, currently John Towns. This position facilitates the functioning of the TG Forum on behalf of the overall collaboration.
TeraGrid management and planning is coordinated via a series of regular meetings, including weekly TeraGrid AD meetings, biweekly project-wide Round Table meetings, biweekly TG Forum meetings, and quarterly face-to-face internal project meetings. Coordination of project staff in terms of detailed technical analysis and planning is done through two types of technical groups: working groups and Requirement Analysis Teams (RATs). Working groups are persistent coordination teams and in general have participants from all RP sites; RATs are short-term (6-10 weeks) focused planning teams that are typically small, with experts from a subset of both RP sites and GIG partner sites. Both types of groups make recommendations to the TeraGrid Forum or, as appropriate, to the GIG management team.
2 Science and Engineering Highlights
The TeraGrid publishes science impact stories to inform and educate the public about the importance and impact of high-end computing technologies in advancing scientific research. Below we provide 19 examples of TeraGrid-enabled science: first in a summary list, then as short summaries of each project. These highlights have been selected from the much larger collection of important scientific research activity supported on the TeraGrid to demonstrate the breadth of scope across scientific disciplines, research institutions, and sizes of research groups.
2.1 List of Science and Engineering Highlights
2.1.1 Astronomical Sciences: Detecting HI In Emission At z = 1: The Effect of Substructure (PI: Tiziana Di Matteo, Carnegie Mellon)
2.1.2 Astrophysics: The Formation of Active Regions on the Sun (PI: Matthias Rempel, NCAR)
2.1.3 Atmospheric Sciences: Rapid Intensification of Hurricanes Related to Vertical Velocity Distributions and Microphysical Processes using WRF Simulations in the Context of Observations (PI: Greg McFarquhar, U. Illinois)
2.1.4 Atomic, Molecular, and Optical Physics: Optical Absorption Enhancement in Silicon Solar Cells Using Photonic Crystals (PI: Sang Eon Han, MIT)
2.1.5 Biology: Water under the BAR (PI: Gregory A. Voth, U. Chicago)
2.1.6 Biology & Environmental Science: North American Bird Distribution Map Input for 2011 State of the Birds (PIs: Steve Kelling, John Cobb, Dan Fink; Cornell and ORNL)
2.1.7 Cell Biology: Complex Formation and Binding Affinity of NHERF1 to C-terminal Peptides (PI: Peter A. Friedman, U. Pittsburgh School of Medicine)
2.1.8 Chemistry: Molecular Simulation of Adsorption Sites in Metal-Organic Frameworks (PI: Randall Snurr, Northwestern)
2.1.9 Computer and Computation Research: eaviv—Distributed Visualization (PI: Andrei Hutanu, Louisiana State U.)
2.1.10 Computational Mathematics: Residual Based Turbulence Models for Large Eddy Simulation on Unstructured Tetrahedral Meshes (PI: Arif Masud, U. Illinois)
2.1.11 Earth Sciences: A TeraGrid Community Climate System Modeling (CCSM) Portal (PI: Matthew Huber, Purdue)
2.1.12 Fluid, Particulate, and Hydraulic Systems: Fluid and Solid Interaction (FSI) in Engineering, Biomedical and Other Applications (PI: Cyrus Aidun, Georgia Tech)
2.1.13 Fluid, Particulate, and Hydraulic Systems: Large Eddy Simulation Investigations of Aviation Environmental Impacts (PI: Sanjiva Lele, Stanford)
2.1.14 Human Computer Interactions: A Window on the Archives of the Future (PIs: Robert Chadduck, National Archives and Records Administration; Maria Esteva and Weijia Xu, TACC)
2.1.15 Molecular and Cellular Biosciences: Computational Approaches to Understanding Ion Channel and Transporter Function (PI: Michael Grabe, U. Pittsburgh)
2.1.16 Phylogenetic Research: The CIPRES Science Gateway (PI: Mark Miller, UCSD)
2.1.17 Physical Chemistry: Gas Phase Chemistry of Molecules and Ions of Astrophysical Interest (PI: Zhibo Yang, U. Colorado)
2.1.18 Physics: The Geophysical High Order Suite for Turbulence (GHOST) (PI: Annick Pouquet, NCAR)
2.1.19 Student Training: Use of TeraGrid Systems in the Classroom and Research Lab (PI: Andrew Ruether, Swarthmore College)
2.2 Science and Engineering Highlights

2.2.1 Astronomical Sciences: Detecting HI In Emission At z = 1: The Effect of Substructure (PI: Tiziana Di Matteo, Carnegie Mellon)
Observations show that the cosmic star formation rate has declined by more than one order of magnitude since redshift z = 1 (about 7.7 billion years ago). However, a combined census of the cold gas (the fuel for star formation) and stellar components is largely missing in observations. This cold gas fraction is a crucial ingredient in models of galaxy formation and will suggest how galaxies obtain gas and subsequently convert it to stars. A census of neutral hydrogen (HI) will put tight constraints on different models of galaxy formation.

Based on current observations, Di Matteo’s group modeled the distribution of HI using dark matter simulations with 14.6 billion particles at z = 1, using the hybrid MPI/Pthreads, massively parallel code GADGET on ~22k cores of TeraGrid’s Kraken system at NICS. This simulation is around 1.5 times larger than the Millennium Simulation (Springel et al. 2005) with 3 times finer mass resolution. This was made possible by their NSF PetaApps project with PSC and the University of Washington, which has scaled GADGET to ~100k cores on Kraken.
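The “around 1.5 times larger” comparison can be checked against the Millennium Simulation’s particle count of 2160³ ≈ 10.1 billion, a figure recalled from Springel et al. 2005 rather than stated in this report:

```python
# Hypothetical check of the simulation-size comparison above.
di_matteo_particles = 14.6e9       # particles in the group's dark matter run
millennium_particles = 2160 ** 3   # Millennium Simulation (Springel et al. 2005)

# ~1.45, consistent with "around 1.5 times larger"
size_ratio = di_matteo_particles / millennium_particles
print(f"size ratio vs. Millennium: {size_ratio:.2f}")
```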
A slice from Di Matteo’s group’s simulation illustrating the rich network of nodes, filaments and voids in the distribution of dark matter at z = 1 is shown in the left side of the figure. The high resolution and large volume allow the group to resolve the smallest subhalos that host HI, residing in the largest dark matter halo. The group makes predictions for existing surveys and radio interferometers to determine the HI mass function at this redshift. They predict that small galaxies (undetected in an optical survey) embedded in their parent halo can be detected through their contribution to the total HI signal when seen in a radio telescope. This is shown for one of the many large halos in their simulation in the right side of the figure.

Figure 2.1. (Left) A slice from the dark matter simulation at z = 1 enclosing the largest halo. (Right) Contribution of subhalos to the HI signal: the measured signal (red line) of a single large halo and the modeled signal (black line). Data points are the contributions of subhalos within each pixel, the large halo giving rise to a discrepancy between the theoretical and the observed signal.
Even though individual halos cannot be detected in HI at z = 1, the group can detect the effect of<br />
substructures by stacking thousands of halos whose redshifts have been determined in an optical<br />
survey. They find that the contribution of substructure can be seen in the cumulative HI signal<br />
and that they will be able to determine the HI mass function with existing surveys and radio<br />
telescopes with a detection significance of 5-10σ across models.<br />
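The stacking approach can be illustrated with a toy signal-to-noise calculation: averaging N noisy per-halo measurements suppresses the noise by roughly √N, which is how a per-halo signal far below the single-object noise floor becomes detectable in the stack. A minimal sketch (all numbers are illustrative, not taken from the simulation):<br />

```python
import numpy as np

rng = np.random.default_rng(0)

n_halos = 4000       # optically identified halos to stack (illustrative)
signal = 0.05        # per-halo HI signal, well below the per-halo noise
noise_sigma = 1.0    # per-halo noise level (arbitrary units)

# One noisy measurement per halo: the signal is buried in noise.
measurements = signal + noise_sigma * rng.normal(size=n_halos)

# Stacking: averaging suppresses the noise by ~sqrt(n_halos).
stacked = measurements.mean()
stacked_noise = noise_sigma / np.sqrt(n_halos)
significance = stacked / stacked_noise

print(f"per-halo S/N: {signal / noise_sigma:.2f}")
print(f"stacked detection significance: {significance:.1f} sigma")
```

With these illustrative numbers the expected stacked significance is signal·√N/σ ≈ 3σ; the 5-10σ figures quoted above come from the group’s full analysis, not from this sketch.<br />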
2.2.2 Astrophysics: The Formation of Active Regions on the Sun (PI: Matthias Rempel, NCAR)<br />
Rempel and his team research the formation of active regions of<br />
the sun, which are areas with an especially strong magnetic field.<br />
Their recent work has concentrated on two closely related aspects<br />
of solar active region formation: 1) the sensitivity of flux<br />
emergence processes to initial conditions and 2) the dependence<br />
of penumbral fine structure on the imposed top boundary<br />
condition.<br />
Solar flux emergence processes include the transport of a<br />
magnetic field toward the stellar surface, the formation of<br />
starspots (i.e., sunspots), and the expansion of the magnetic field<br />
into the stellar corona. Using a new boundary condition<br />
artificially enhancing the horizontal magnetic field at the top<br />
boundary of the computational domain, the team was able to<br />
simulate sunspots with extended penumbrae and strong Evershed<br />
effect (outflow of gas from sunspots).<br />
Using Kraken at NICS, the team investigated how active region<br />
formation and evolution depend on the degree of magnetic twist<br />
present. In order to investigate the flux emergence, the team ran 150,000 iterations using 1,536<br />
CPUs, leading to 24 hours of simulated solar time. In order to investigate penumbral fine<br />
structure, the team ran 60,000 iterations using 1,536 CPUs to generate 2.5 hours of solar time.<br />
The highest resolution run was restarted from a previous run and evolved an additional 40<br />
minutes of solar time requiring 4,608 CPUs for 20,000 iterations. The increase of CPUs by a<br />
factor of four leads to a threefold decrease in time-to-solution and helped identify parts of the<br />
code requiring further improvement. The team used information from these runs on Kraken for a<br />
long-duration simulation in which the focus was wave propagation through sunspots to test and<br />
improve helioseismic inversion methods. This simulation was started on Ranger (at TACC),<br />
while the last 25% was finished on Kraken.<br />
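The scaling statement above corresponds to a strong-scaling (parallel) efficiency of 75%: the quoted factor-of-four increase in CPUs reduced time-to-solution threefold rather than fourfold. A one-line check of that arithmetic:<br />

```python
# Strong-scaling efficiency = observed speedup / ideal speedup.
# Numbers as quoted in the report: 4x more CPUs gave a 3x speedup.
cpu_increase = 4.0
observed_speedup = 3.0

efficiency = observed_speedup / cpu_increase
print(f"parallel efficiency: {efficiency:.0%}")  # prints "parallel efficiency: 75%"
```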
The new boundary condition allowed the team to investigate the robustness of penumbral fine<br />
structure and of the underlying magneto-convection mechanism with respect to the numerical<br />
resolution. Identifying the minimum resolution required for properly resolving the physics<br />
underlying large-scale outflows around sunspots is essential for future simulation runs, since it<br />
allows the team to optimize the use of the required computing resources.<br />
Figure 2.2. Numerical simulation explains sunspot fine structure as a consequence of magneto-convective energy transport in umbra and penumbra.<br />
2.2.3 Atmospheric Sciences: Rapid Intensification of Hurricanes Related to Vertical Velocity Distributions and Microphysical Processes using WRF simulations in the context of observations (PI: Greg McFarquhar, U. Illinois)<br />
It is well known that transformations between phases of water—vapor, liquid, ice—result in<br />
latent heating that ultimately affects tropical cyclone intensity. A quantitative understanding of<br />
the distribution of cloud microphysical processes releasing this energy and relating to the<br />
development of updrafts and downdrafts, however, has not been developed, which challenges<br />
forecasters.<br />
Greg McFarquhar and his colleagues at the University of Illinois at Urbana-Champaign<br />
configured the Weather Research and Forecasting model with one kilometer grid spacing and<br />
two-minute output frequency to simulate Hurricane Dennis (2005), which was observed during<br />
the NASA Tropical Cloud Systems and Processes campaign. NCSA’s Abe provided the<br />
computational resources for this simulation, which spanned from<br />
Dennis’s tropical storm through mature hurricane stage and<br />
second landfall over Cuba. In particular, 256 processors were<br />
used for integration of the model and an additional 16 processors<br />
for writing the standard and added latent heat output. It took about<br />
100 hours of real time to complete the 66-hour simulation. NCSA<br />
project space has since proven particularly useful for analyzing<br />
the approximately 12 terabytes of data post-processed from this<br />
simulation.<br />
To identify potential precursors to the rapid intensification of<br />
Dennis, statistical distributions of simulated vertical velocity and<br />
latent heat within various horizontal, vertical, and temporal<br />
frameworks were constructed. The team found that during the<br />
approximately 24-hour period preceding the onset of the most<br />
rapid intensification, upper-tropospheric vertical velocity<br />
frequency distributions continually broadened while the outliers<br />
of these distributions converged toward the simulated tropical<br />
cyclone center. A clear increase and concentration of latent heating toward the tropical cyclone<br />
center occurred, however, only after the onset of and during this most rapid intensification,<br />
when lower-to-mid-level vertical velocity frequency distributions broadened and more<br />
effectively forced the phase changes of water.<br />
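The diagnostic described above, frequency distributions of vertical velocity built within successive temporal windows, can be sketched as follows. The data here are synthetic stand-ins for WRF output (Gaussian samples whose spread grows with time, mimicking the broadening the team observed); variable names and values are illustrative.<br />

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for upper-tropospheric vertical velocity (m/s)
# sampled in three successive temporal windows with growing spread.
n_points, n_times = 50_000, 3
w = np.stack([rng.normal(0.0, 0.5 * (t + 1), size=n_points)
              for t in range(n_times)])

bins = np.linspace(-10.0, 10.0, 81)
for t in range(n_times):
    # Frequency distribution of vertical velocity in this window.
    freq, _ = np.histogram(w[t], bins=bins, density=True)
    # A simple breadth measure for the window's distribution.
    print(f"window {t}: std = {w[t].std():.2f} m/s")
```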
The team presented their work at many conferences, including the 29th Conference on Hurricanes<br />
and Tropical Meteorology in 2010.<br />
2.2.4 Atomic, Molecular, and Optical Physics: Optical Absorption Enhancement in Silicon Solar Cells Using Photonic Crystals (PI: Sang Eon Han, MIT)<br />
The cost of crystalline silicon solar<br />
cells could be reduced significantly if<br />
the thickness of the silicon layer<br />
could be decreased from a few<br />
hundred micrometers to just a few<br />
micrometers. The material cost of<br />
crystalline silicon takes up about half<br />
the module cost of solar cells; hence<br />
thinner silicon would reduce costs.<br />
However, the poor light absorption of<br />
native crystalline silicon is a major<br />
challenge. Thus, an effective<br />
technique for light trapping in thin<br />
active layers needs to be developed.<br />
Figure 2.3. Hurricane Dennis bears down on Cuba in 2005 in this image from NASA. Researchers simulated this phase of the hurricane as part of their study of the cloud microphysical processes that caused the storm to later intensify, as part of a project on forecasting hurricane intensity.<br />
Figure 2.4. The image shows that the amount of silicon can be reduced by two orders of magnitude by using photonic crystals to achieve the same absorption as a flat silicon substrate does. Optical interference concentrates light in silicon and this achieves absorption that exceeds the conventional light trapping limit at long wavelengths.<br />
The scattering from metal nanoparticles near localized plasmon resonance—oscillations that exist at the interface between any two materials—is a promising way of increasing the light absorption in thin-film solar cells.<br />
Enhancements in photocurrent have been observed for a wide range of semiconductors and solar<br />
cell configurations. Photonic crystals, optical nanostructures designed to act on photons in the<br />
same way semiconductors act on electrons, can be used to enhance optical absorption in this way.<br />
Photonic crystals are structures periodic on an optical length scale, meaning they contain<br />
regularly repeating internal regions that allow light waves to propagate or not depending on<br />
wavelengths. This periodicity can modify the density of photon states and absorption can be<br />
altered. Han is investigating silicon photonic crystals as light-absorbing structures for solar<br />
photovoltaics via simulations using the high-throughput Condor pool TeraGrid resource at<br />
Purdue.<br />
Charge carriers (electrons and holes) recombine at crystal defects, so a thinner layer yields better<br />
charge-carrier collection because it contains fewer defects. In photonic crystals, optical diffraction<br />
can occur, which can be understood as an optical interference phenomenon. Constructive<br />
interference increases the electromagnetic field intensity, and this can lead to stronger light<br />
absorption. If constructive interference can be induced in the regions of the photonic crystal that<br />
silicon occupies, enhanced absorption is expected. Accordingly, thin films might be used<br />
both to reduce the cost of solar cells and increase their efficiency. Results from this research<br />
using TeraGrid resources have recently been published in Nano Letters, “Toward the Lambertian<br />
Limit of Light Trapping in Thin Nanostructured Silicon Solar Cells,” Nano Lett., 2010, 10 (11),<br />
pp 4692–4696.<br />
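The interference argument in this section can be made concrete with a toy two-beam calculation: two equal-amplitude fields that add in phase give four times the single-beam intensity, twice the incoherent sum, and local absorption scales with that intensity. This is a generic optics sketch, not the group’s simulation:<br />

```python
import numpy as np

E0 = 1.0  # amplitude of each of the two interfering fields

for dphi in np.linspace(0.0, np.pi, 5):
    # Coherent sum of two equal-amplitude fields with phase difference dphi.
    intensity = np.abs(E0 + E0 * np.exp(1j * dphi)) ** 2
    print(f"dphi = {dphi:.2f} rad -> intensity = {intensity:.2f}"
          f" (incoherent sum would be 2.00)")
```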
2.2.5 Biology: Water under the BAR (Gregory A. Voth, U. Chicago)<br />
Many cellular processes require the<br />
generation of highly curved regions of<br />
cell membranes by interfacial<br />
membrane proteins. Several<br />
mechanisms of curvature generation<br />
have been suggested, but a quantitative<br />
understanding of the importance of<br />
various potential mechanisms remains<br />
elusive.<br />
Voth’s research in this area is<br />
concentrated on the electrostatic<br />
attraction underlying the scaffold<br />
mechanism of membrane bending in<br />
the context of the N-BAR domain of<br />
the protein amphiphysin. Analysis of<br />
atomistic molecular dynamics<br />
simulations reveals considerable water between the membrane and the positively charged<br />
concave face of the BAR, even when it is as tightly bound as physically possible.<br />
Figure 2.5. (Upper-left) Top view of the starting configuration of the six-amphiphysin N-BAR SBCG simulation (counterions omitted for clarity). (Upper-right) Side view of the starting configuration of the SBCG simulations, showing counterions. (Lower-left) Final configuration of the SBCG simulation with the relative dielectric set to 1. (Lower-right) Final configuration of the SBCG simulation with the relative dielectric set to 14.<br />
This results in significant screening (reduction in strength) of electrostatic interactions, suggesting<br />
that electrostatic attraction is not the main driving force behind curvature sensing. These results<br />
also emphasize the need for care when building coarse-grained models of protein-membrane<br />
interactions. In the coarse-grained simulations, Voth finds that the induced curvature depends<br />
strongly on the dielectric screening.<br />
Calculations were performed on BigBen at PSC, Ranger at TACC, and Kraken at NICS. Phil<br />
Blood and Richard Swenson provided assistance with the simulations. This work has been<br />
published in Biophysical Journal v.99, 1783-1790, Sept. 2010.<br />
2.2.6 Biology &amp; Environmental Science: North American bird distribution map input for 2011 State of the Birds (PIs: Steve Kelling, John Cobb, Dan Fink; Cornell and ORNL)<br />
The Cornell Lab of Ornithology’s eBird (http://www.ebird.org) project has collected 620,000 bird observations reported from 107,000 unique locations. Collaborating with the scientific exploration, visualization, and analysis working group of the DataONE DataNet project, the team has developed a new, improved statistical analysis method to predict the presence or absence of bird species according to environmental covariates. During 2010, the team collected the covariates into a data warehouse and, with a TeraGrid startup allocation (used mostly on Lonestar at TACC), was able to calculate bird migration maps with unprecedented detail and accuracy. An example showing the presence of the Wood Thrush during spring is given in the figure. Additional<br />
occurrence maps from these calculations are available from eBird<br />
(http://ebird.org/content/ebird/about/occurrence-maps/occurrence-maps). Maps of this detail are<br />
fine enough to be used in making land use and other policy decisions that affect habitat for<br />
important bird species, including decisions about crucial time periods at specific locations<br />
important for migratory species. Data and occurrence maps from these computations will be used<br />
in the annual State of the Birds report (http://www.stateofthebirds.org/), a high-level policy document used to set multi-agency and NGO research and conservation policy agendas. The scale of computation required for this analysis is much larger than any the eBird team had ever before attempted; without a TeraGrid resource such as Lonestar, the work would have been much more difficult. Expert assistance at TeraGrid sites at Cornell, ORNL, and TACC was critical to the success of the computational campaign. In particular, the data-intensive nature of the calculation requires many large-memory nodes and considerable I/O. Future plans include analyzing six years of data instead of a single year, with a future TeraGrid allocation, in order to understand the annual variability of migration patterns, including as a “canary in the coal mine” indicator of climate change.<br />
2.2.7 Cell Biology: Complex Formation and Binding Affinity of NHERF1 to C-terminal Peptides (PI: Peter A. Friedman, U. Pittsburgh School of Medicine)<br />
The Na/H Exchange Regulatory Factor-1 (NHERF1) is a protein that is<br />
essential for intracellular signaling and trafficking. Recently, three NHERF1<br />
mutations were found in patients who exhibited renal phosphate wasting<br />
and bone thinning (osteopenia). Two of these mutations (R153Q, E225K)<br />
are located at the NHERF1 subdomain PDZ2. NMR and unfolding studies<br />
show that these mutations decrease the thermodynamic and conformational<br />
stability of the protein. Friedman et al. performed molecular dynamics and<br />
thermodynamics simulations with AMBER on 32 PEs on Pople at PSC.<br />
These simulations made it possible to determine the different interactions of<br />
the wild type and mutated NHERF1 proteins. They show that the<br />
parathyroid hormone 1 receptor (PTH1R) has higher affinity for the mutant<br />
(R153Q) PDZ2 than for the wild type PDZ2, which may explain the<br />
mechanisms of renal disease caused by the mutation.<br />
Figure 2.6. Migration of Wood Thrush Occurrence 2009.<br />
Corrected for variation in detectability. Detailed predictions<br />
reveal both broad-scale patterns and fine-resolution detail.<br />
Figure 2.7. A snapshot from<br />
the MD simulation of bound<br />
PDZ2.<br />
2.2.8 Chemistry: Molecular Simulation of Adsorption Sites in Metal-Organic Frameworks (PI: Randall Snurr, Northwestern)<br />
Metal-organic frameworks (MOFs) are a new class of nanoporous materials that could be<br />
designed for a variety of purposes. For example, nanoscale pores tailored to attract a crowd of<br />
hydrogen gas molecules, which normally like plenty of elbow room, might be one way to pack<br />
more fuel into the tank of a hydrogen-driven vehicle or a fuel cell-powered mobile phone. Snurr<br />
likens nanoporous materials in this case to a sponge, one self-assembled Tinker Toy-fashion from<br />
minuscule metal linker pieces and organic struts that join to form a 3-D structure laced with tiny<br />
channels. Snurr and colleagues simulate adsorption sites in MOFs to predict how the materials<br />
will perform.<br />
A focus in 2010 was trying to develop or screen porous<br />
materials for capturing CO2 from the exhaust gases of<br />
coal-fired power plants. Carbon capture and sequestration<br />
are widely viewed as a possible way to reduce CO2<br />
emissions. The idea is to separate the carbon dioxide out<br />
of flue gas from a coal-fired plant and then store the CO2<br />
far underground in geologic formations. Snurr’s group is<br />
modeling porous materials for their ability to separate<br />
CO2 from the other gases in the flue gas stream. They<br />
have developed a consistent molecular modeling method<br />
and validated it against experimental data for 14<br />
representative MOFs. With this validation, they are<br />
screening large numbers of existing and hypothetical<br />
MOFs.<br />
In addition, the materials might be designed for potential applications in catalysis, chemical<br />
separations, and reducing auto emissions, among other things. Such<br />
materials could lower the energy demands—and cost—of<br />
energy-intensive processes, such as distillation, making<br />
them “greener.” The materials could be useful in chiral<br />
chemistry applications as well, for instance separating<br />
active and inactive forms of molecules for use in drugs.<br />
The researchers employ Monte Carlo simulations to<br />
understand the materials and their properties, such as the<br />
amount of molecules they adsorb, at an atomic level.<br />
Snurr and colleagues likewise can predict the position and<br />
orientation of adsorption sites. Their predictions match well with published experimental results<br />
using X-ray diffraction.<br />
Figure 2.8. The structure of a metal-organic framework (MOF) with potential for use in carbon dioxide capture systems superimposed behind a silhouette of a coal-fired power plant. Specially designed MOFs might be used to separate CO2 from flue gas for storage deep underground, keeping the CO2 out of the atmosphere.<br />
The Monte Carlo<br />
technique employed is particularly useful for evaluating a model with a large number of inputs<br />
and more than a few uncertain parameters over and over again to yield an aggregate result, a<br />
description that fits, say, an atomic-level representation of a metal-organic framework and its<br />
guest molecules. The work uses computing resources at several TeraGrid sites, including Purdue,<br />
NCSA and TACC. Recent papers covering the research include “Screening of Metal-Organic<br />
Frameworks for Carbon Dioxide Capture from Flue Gas Using a Combined Experimental and<br />
Modeling Approach," J. Am. Chem. Soc. 131, 18198-18199 (2009).<br />
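The Monte Carlo idea the paragraph describes, repeated random trial moves accepted with Boltzmann-weighted probabilities and averaged into an aggregate result, can be illustrated with a toy grand-canonical lattice-gas model of adsorption. This is a generic sketch under arbitrary parameter values, not the group’s actual simulation method or code:<br />

```python
import math
import random

random.seed(42)

n_sites = 200     # adsorption sites in the toy framework
eps = -2.0        # adsorption energy per occupied site (units of kT)
mu = -1.0         # chemical potential of the gas reservoir (units of kT)

occupied = [False] * n_sites
n_ads = 0
loading_sum = 0.0
n_steps = 100_000

for _ in range(n_steps):
    i = random.randrange(n_sites)
    if occupied[i]:
        # Trial deletion: accept with the Metropolis probability.
        if random.random() < min(1.0, math.exp(eps - mu)):
            occupied[i] = False
            n_ads -= 1
    else:
        # Trial insertion from the gas reservoir.
        if random.random() < min(1.0, math.exp(mu - eps)):
            occupied[i] = True
            n_ads += 1
    loading_sum += n_ads

mean_loading = loading_sum / n_steps / n_sites
print(f"mean fractional loading: {mean_loading:.3f}")
```

For independent sites this toy model should converge to the Langmuir occupancy 1/(1 + exp(eps - mu)) ≈ 0.73; real GCMC studies of MOFs use detailed force fields and continuous molecular coordinates instead.<br />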
2.2.9 Computer and Computation Research: eaviv—Distributed visualization (PI: Andrei Hutanu, Louisiana State U.)<br />
Scientific research is increasingly dependent on simulation and analysis requiring high<br />
performance computers, distributed large-scale data, and high-speed<br />
networks. Andrei Hutanu of LSU leads a team addressing fundamental<br />
issues in distributed visualization design and implementation, where<br />
network services represent a first-class resource. They built a distributed<br />
visualization system, eaviv, and a distributed visualization application<br />
optimized for large data, high-speed networks and interaction. eaviv<br />
connects Spider at LSU’s Center for Computation & Technology<br />
(CCT), Lincoln and accelerator clusters at NCSA, and a cluster at the<br />
Laboratory of Advanced Networking Technologies (SITOLA) at<br />
Masaryk University in the Czech Republic. The Internet2 interoperable<br />
On-demand Network (ION) provides wide-area connectivity to support<br />
dynamic network circuit services, and NCSA’s mass storage system<br />
provided permanent data storage. Applications request and reserve<br />
point-to-point circuits between sites as needed using automated control<br />
software. eaviv supports distributed collaboration, letting multiple users<br />
in physically distributed locations interact with the same visualization to<br />
communicate their ideas and explore the datasets cooperatively. The team implemented a<br />
ray-casting parallel volume renderer called Pcaster as the rendering component to demonstrate the<br />
distributed pipeline’s workflow. Compared to parallel volume renderers in existing software,<br />
Pcaster is a purely GPU-based volume renderer and image compositor supporting high-resolution<br />
rendering. Pcaster asynchronously couples with parallel data servers for network-streamed data<br />
input. The team tested Pcaster with data sets up to 64 gigabytes per timestep and achieved<br />
interactive frame rates of five to 10 frames per second on Lincoln, producing rendered images at<br />
1,024 x 1,024 resolution. The eaviv system’s configurable architecture also allows it to run with a<br />
local data server and a single renderer, making it a desktop tool for small-scale local data.<br />
This research was published in 2010 in Scalable Computing: Practice and Experience and<br />
Computing in Science and Engineering, and was presented at the TeraGrid 10 conference.<br />
2.2.10 Computational Mathematics: Residual Based Turbulence Models for Large Eddy Simulation on Unstructured Tetrahedral Meshes (PI: Arif Masud, U. Illinois)<br />
Development of computational methods for modeling turbulent<br />
flows is considered a formidable challenge due to the plethora of<br />
associated spatial and temporal scales. Professor Arif Masud and<br />
graduate student Ramon Calderer from the Department of Civil<br />
and Environmental Engineering at the University of Illinois at<br />
Urbana-Champaign used Abe at NCSA to develop residual-based<br />
methods for modeling turbulence in complex fluid flows and<br />
fluid-structure interactions. Their new codes are mathematically<br />
consistent and robust and provide high-fidelity solutions at<br />
reduced computational costs compared to the more traditional<br />
Direct Numerical Simulations (DNS).<br />
Figure 2.9. eaviv is a visualization application with three distributed components connected by high-speed networks: data, rendering, and viewer.<br />
Figure 2.10. Vortex shedding created by an oscillating cylinder: the cylinder was kept motionless and the Reynolds number increased in order to study the vortical structures in the cylinder's wake.<br />
In an effort to model flow-induced vibrations in off-shore oil platforms, the team used their new method to investigate fluid-structure interaction and turbulence around rigid and oscillating cylinders. Underwater currents trigger periodic vortices around risers, the oil pipelines extending from the floating platforms to the seafloor. Vortex shedding causes energy transfer from fluid to the structure and leads to<br />
high-amplitude vibrations that can trigger fatigue failure of the structural systems. These new<br />
methods will help accurately evaluate, both qualitatively and quantitatively, the fatigue failure<br />
phenomena triggered by the turbulence in the natural convective flows around risers, thus<br />
providing a clearer and more precise picture of when failure can happen. This will help in taking<br />
preventive measures in time, and will therefore be important not only for the design of these<br />
components, but also as a method to monitor these deep-sea oil pipelines in the Gulf of Mexico.<br />
To visualize the 3D nature of turbulent flows, the team turned to NCSA’s Advanced Applications<br />
Support visualization team. Visualization programmer Mark Van Moer used ParaView, a parallel<br />
renderer, with custom VTK scripts to create visualizations for the team. “These visualizations<br />
have tremendously benefited us in the method development phase,” says Masud.<br />
This work will be presented at the 16th International Conference on Finite Elements in Flow<br />
Problems, Munich, Germany, in March 2011.<br />
2.2.11 Earth Sciences: A TeraGrid Community Climate System Modeling (CCSM) Portal (PI: Matthew Huber, Purdue)<br />
Graduate student Aaron Goldner used the<br />
TeraGrid Community Climate System<br />
Modeling Portal (developed at Purdue)<br />
and TeraGrid computational resources to<br />
show potential for wetter, rather than<br />
drier, conditions in the American<br />
Southwest (and, indeed, the entire<br />
American West and East) with projected<br />
climate change. The research, presented<br />
at the American Geophysical Union fall<br />
meeting in December 2010, offered<br />
another indicator of where current<br />
climate models can be refined. Goldner<br />
was able to use TeraGrid resources for<br />
the computationally demanding task of<br />
including tropical cyclone winds in his<br />
simulations. That yielded a different<br />
potential picture of future water cycles<br />
over North America than current<br />
modeling, which doesn’t include<br />
cyclonic winds and projects drought-like,<br />
rather than wetter, conditions in areas<br />
such as the American Southwest.<br />
The graphical user interface of the<br />
CCSM portal made it easier for Goldner<br />
to set up his modeling initially, while<br />
TeraGrid hardware enabled modeling at a<br />
level of detail that still required more<br />
than a month to complete. The CCSM<br />
portal also has been used in combined<br />
political science/earth and atmospheric<br />
sciences classes at Purdue. The goal is to<br />
Figure 2.11. Enhanced tropical cyclone activity within ocean models can produce feedbacks that affect El Niño cycles. When these changes in El Niño are used as boundary conditions within a global atmospheric model they cause increased rainfall over the entire United States by shifting atmospheric circulation, thus altering hydrologic cycles.
educate students in the use and misuse of modeling across disciplines, so they will be better<br />
builders and consumers of integrated, complex computer models. Besides climate science, such<br />
models are employed in business, engineering, medicine, the military and many other fields for<br />
purposes ranging from predicting terrorist activity to predicting the structure of protein molecules<br />
affecting human health and disease. Students in interdisciplinary teams (including at least one<br />
liberal arts and one science student) received hands-on experience running the NCAR<br />
Community Climate System Model via the TeraGrid. All students were able to perform 100-year<br />
simulations.<br />
That would not have been possible without the TeraGrid CCSM portal, which eliminates many<br />
barriers for class use. Purdue’s TeraGrid staff built in features to speed up secure access to real<br />
climate data and enable online data analysis and visualization. For instance, the portal’s features<br />
include the ability to tap TeraGrid resources through a community account so each student need<br />
not apply for TeraGrid computational time. A set of multimedia, web-based tutorials helps bring<br />
novice users up to speed on using the portal and the climate model itself. Students say the portal<br />
allows them to get results faster and spend more time working with those results in their<br />
interdisciplinary teams, while offering hands-on experience beyond the classroom norm. The<br />
class use also wouldn’t have been possible without the hardware available on the TeraGrid.<br />
Climate modeling is demanding computationally and typical model runs by a class full of<br />
students would be difficult, if not impossible, with the computers and data storage available on<br />
most campuses. The CCSM Portal draws on Purdue’s Steele supercomputer, which is partially<br />
dedicated to TeraGrid use, and on TeraGrid resources at LONI and SDSC. Other schools are now<br />
looking at integrating the CCSM Portal and climate modeling into the classroom.<br />
2.2.12 Fluid, Particulate, and Hydraulic Systems: Fluid and Solid Interaction (FSI) in Engineering, Biomedical and Other Applications (PI: Cyrus Aidun, Georgia Tech)<br />
Problems in which fluid and solid interact dynamically occur in various engineering and<br />
biological applications. In such problems, the solid and fluid phases are tightly coupled requiring<br />
the simultaneous solution of both phases. In addition, the high spatial and temporal resolution required for such problems tends to make traditional numerical methods computationally expensive. Using the lattice-Boltzmann (LB) method and finite element (FE) analysis for modeling the fluid and solid phases, respectively, can provide a number of unique features making the solution of large problems in this field feasible. Still, the calculations require extensive data processing and storage.<br />
Figure 2.12. A numerical model to experiment comparison for eight channels, such as those in<br />
heart valves. The lighter shade right-side bars are from simulations and the darker shade left-side<br />
bars are from experiments. The modeling and experimental results agree well for all channels.<br />
In 2010, Aidun and colleagues made use of TeraGrid resources at Purdue, LONI, NCSA and<br />
TACC. The researchers have examined exceedingly complex noncolloidal suspensions, work<br />
which lends itself to varied applications such as biological systems (blood flow), paper<br />
manufacturing (wood fiber and coating flows), mining and petroleum industries (waste<br />
tailings), and home products (paint and cosmetics). For example, blood is composed of elements<br />
such as red blood cells and platelets suspended in plasma. Most computational modeling of blood<br />
flow has simplified blood as pure fluid flow without the presence of actual suspended particles.<br />
Understanding the physics of blood flow requires accurate simulation of the suspended solid<br />
particles to capture the suspension structure and the blood elements on a microscale. This kind of<br />
understanding could be important in, among other things, treating pathological blood clots, a<br />
common cause of heart attacks and strokes, which strike more than 2 million Americans annually.<br />
Clot formation and detachment also play into such medical issues as the failure of popular<br />
bileaflet mechanical heart valves. Aidun’s group published results from such simulations more<br />
than once in 2010, including “Parallel Performance of a Lattice-Boltzmann/Finite Element<br />
Cellular Blood Flow Solver on the IBM Blue Gene/P Architecture,” Computer Physics<br />
Communications, 181:1013-1020, 2010. Their paper “Numerical Investigation of the Effects of<br />
Channel Geometry on Platelet Activation and Blood Damage,” accepted by Annals of Biomedical<br />
Engineering in October 2010, reports results that may help optimize the design of prosthetic<br />
replacement heart valves. The researchers also are looking at extremely complex flows through<br />
saturated deformable media, work that could provide insight into material properties of foams and<br />
felts used in industrial applications and lead to improvement in the designs of these materials.<br />
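As a loose illustration of the collide-and-stream structure underlying lattice-Boltzmann solvers (vastly simpler than the group’s 3D LB/FE suspension code), here is a minimal 1D two-population LB scheme that relaxes toward a diffusive equilibrium. With relaxation parameter omega = 1 it reduces to simple diffusion of the initial mass pulse, and mass is conserved exactly; all parameters are illustrative.<br />

```python
import numpy as np

nx, omega, steps = 200, 1.0, 500
rho = np.zeros(nx)
rho[nx // 2] = 1.0            # initial unit mass pulse
f_plus = rho / 2.0            # population streaming right
f_minus = rho / 2.0           # population streaming left

for _ in range(steps):
    rho = f_plus + f_minus
    # Collision: BGK relaxation toward the diffusive equilibrium rho/2.
    f_plus += omega * (rho / 2.0 - f_plus)
    f_minus += omega * (rho / 2.0 - f_minus)
    # Streaming: shift each population one site along its lattice velocity
    # (periodic boundaries via np.roll).
    f_plus = np.roll(f_plus, 1)
    f_minus = np.roll(f_minus, -1)

rho = f_plus + f_minus
print(f"total mass after {steps} steps: {rho.sum():.6f}")
```

The collide-and-stream pattern is the same one that production LB codes parallelize across thousands of cores; only the lattice, the equilibrium distribution, and the coupling to the FE solid solver differ.<br />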
2.2.13 Fluid, Particulate, and Hydraulic Systems: Large Eddy Simulation Investigations of Aviation Environmental Impacts (PI: Sanjiva Lele, Stanford)
One of the impacts about which the climate research community is very uncertain is the extent to<br />
which aircraft change cloud cover. That’s why Sanjiva Lele and his team at Stanford University,<br />
including PhD candidate Alexander Naiman, are studying the impact of commercial aviation on<br />
global climate. The direct effect is known: airplanes produce
contrails, which are clouds of ice particles, under certain atmospheric<br />
conditions. But Lele and Naiman have shown there’s also an indirect<br />
effect, where jet contrails induce cirrus cloud formation where clouds<br />
might not have formed naturally otherwise. This is a concern, as<br />
increased cloud cover may reduce a region’s temperature as well as<br />
the amount of solar energy a region receives.<br />
The portion of the research that's being conducted on Abe at NCSA is<br />
a Large Eddy Simulation (LES) of contrails using a highly scalable<br />
parallelized LES code developed at Stanford. The code implements a new model of aircraft exhaust plume dilution that is used as a subgrid-scale model within a large-scale atmospheric simulation, providing predictive equations for the evolution of individual exhaust plumes based on parameters provided by the large-scale simulation. The
simulations start with the wake of a commercial airliner at a cruise<br />
condition, one second after the airplane has passed. In addition to<br />
solving the LES fluid equations, they also compute the deposition of<br />
water vapor onto the ice particles that form a contrail, leading to<br />
cirrus cloud formation. The outputs of their simulations include 3D<br />
fields of all the simulation variables at certain time intervals, which<br />
was not previously possible.<br />
Current simulations continue for 1200 seconds (20 minutes), and the team plans to extend these into the two- to three-hour range. The end goal is to understand how atmospheric conditions and aircraft types affect the properties of a contrail that are important for calculating climate impacts.
Figure 2.13. This image shows 3D isosurfaces of two quantities. The opaque surfaces on the inside are vorticity magnitude colored by streamwise vorticity; these show the development of the aircraft wake vortices, which give contrails their interesting dynamics. The vortices are initially parallel, then they form loops and interact chaotically until they dissipate. The translucent green/blue outer surface shows the aircraft jet exhaust scalar, which essentially defines the outer extent of the contrail.
Their results have been published in Atmospheric Chemistry and Physics in 2010 and also<br />
presented in 2009 at meetings of the American Physical Society Division of Fluid Dynamics and<br />
the American Geophysical Union.<br />
2.2.14 Human Computer Interactions: A Window on the Archives of the Future (PIs: Robert Chadduck, National Archives and Records Administration; Maria Esteva and Weijia Xu, TACC)
With the proliferation of digital records, the task of the archivist<br />
has grown exponentially more complex. This problem is<br />
especially acute for the National Archives and Records<br />
Administration (NARA), the government agency responsible for<br />
managing and preserving the nation’s historical records.<br />
In 2014, NARA expects to receive 8.8 Petabytes (10¹⁵ bytes) of
electronic records, bringing their accumulated volume to more<br />
than 35 Petabytes.<br />
To find innovative and scalable solutions for managing large-scale and heterogeneous electronic records collections, NARA turned to the
TeraGrid and TACC, drawing on the expertise of the center’s<br />
digital archivists and data experts.<br />
Using data from NARA’s Cyberinfrastructure for a Billion<br />
Electronic Records (“CI-BER”) research collaboration, a testbed<br />
of documents collected by NARA, the researchers created a<br />
visualization based on the “treemap” and relational database<br />
systems to represent the collection's arrangement and to show its properties at different levels of aggregation, classification and abstraction. The NARA treemap
presents a view of the 3-million-file collection that allows one to identify the differences between<br />
groups of records through visual clues and distinguish patterns and outliers that would be difficult<br />
to spot otherwise. Moreover, it allows archivists to learn by visually comparing and contrasting<br />
groups of records.<br />
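A treemap of the kind described renders each group of records as a rectangle whose area is proportional to its file count, with subgroups nested inside. A generic slice-and-dice layout (a sketch of the general technique, not TACC's implementation) can be written as:

```python
def treemap(sizes, x, y, w, h, vertical=True):
    """Slice-and-dice treemap layout: split the rectangle (x, y, w, h)
    into strips whose areas are proportional to the given sizes.
    Returns a list of (x, y, width, height) rectangles."""
    total = float(sum(sizes))
    rects, offset = [], 0.0
    for s in sizes:
        frac = s / total
        if vertical:   # slice the rectangle top-to-bottom
            rects.append((x, y + offset * h, w, frac * h))
        else:          # slice left-to-right
            rects.append((x + offset * w, y, frac * w, h))
        offset += frac
    return rects
```

Nested calls with alternating orientation per directory level produce the familiar treemap nesting; color (shades of yellow in the NARA view) then encodes a second attribute such as file count.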
TACC’s experts are currently building a multi-touch enabled tiled display system to improve<br />
interactivity and to enhance the collaborative aspects of visual analysis. The group is also<br />
experimenting with cloud-computing methods, using the distributed storage of Longhorn at<br />
TACC with the open-source Hadoop package to scale data<br />
analysis methods.<br />
The collaboration between NARA and the TeraGrid is leading to<br />
the development of tools that combine the power of advanced<br />
computing with the experience and skills of archivists and data<br />
curators and providing a window on the archives of the future.<br />
Figure 2.14. Treemap view of electronic records of 125 Federal Agencies in the CI-BER testbed collection developed for the National Archives and Records Administration. The different shades of yellow represent ranges of the number of files, from more (bright yellow) to fewer (light yellow).
2.2.15 Molecular and Cellular Biosciences: Computational Approaches to Understanding Ion Channel and Transporter Function (PI: Michael Grabe, U. Pittsburgh)
Figure 2.15. Image of the sodium-galactose cotransporter vSGLT (gray). The image shows several snapshots of galactose (colored molecule) coming unbound from vSGLT. The different colors show the time progression of unbinding (red to purple). Membrane lipids and water are not shown.
Work by the Grabe lab at PSC and TACC has established previously elusive mechanisms of substrate transport in an important class of cellular membrane channels that are key targets
in treating diabetes and obesity. Their results were published recently in Nature ("The mechanism of sodium and substrate release from the binding pocket of vSGLT," Nature, 468: 988–991).
Although many protein structures have been solved at different stages of substrate transport, the<br />
atomic-level insight required to understand the dynamics of this process has been missing. To<br />
address this issue, the Grabe group performed several long timescale molecular dynamics (MD)<br />
simulations (up to 200 ns) of a sugar-sodium cotransporter, vSGLT, on BigBen at PSC and<br />
Ranger at TACC. Their simulations reveal specific conformational changes that occur in the<br />
protein upon the release of sodium from its binding site that result in the release of galactose into<br />
the intracellular space. These results are exciting because they provide specific atomistic<br />
information about the mechanism of transport that can be used in developing strategies for<br />
treating diseases linked to the function of these types of channels, such as diabetes.<br />
2.2.16 Phylogenetic Research: The CIPRES Science Gateway (PI: Mark Miller, UCSD)
For centuries malaria has mystified physicians and terrified patients, claiming more childhood<br />
lives than any other infectious disease across large sections of the world. Though much has been<br />
learned about the genetics of various Plasmodium parasites, which cause malaria across<br />
vertebrate species, key aspects of the evolutionary relationships of these parasites have been<br />
elusive.<br />
Now, with the aid of a portal linking them to TeraGrid expertise and computational resources,<br />
researchers led by the University of Maryland and the University of South Carolina have clarified<br />
the evolutionary history of Plasmodium by analyzing an unprecedented 45 distinct genes from<br />
genomes of eight recently sequenced Plasmodium species.<br />
The results, published online in the journal Parasitology on December 1, 2010, offer the first<br />
comprehensive dating of the divergence of these individual Plasmodium species and provide new<br />
insights into the complex relationship between Plasmodium species and their mammalian hosts.<br />
Figure 2.16. Alternative hypotheses for the phylogenetic relationships among Plasmodium species. Pfal = P. falciparum;<br />
Prei=P. reichenowi; Pgal=P. gallinaceum; Pviv=P. vivax; Pkno=P. knowlesi; Pyoe=P. yoelii; Pber=P. berghei; Pcha =P.<br />
chabaudi. (Credit: Joana C. Silva, University of Maryland, et al.)<br />
“The results clarify the ancient association between malaria parasites and their primate hosts,<br />
including humans,” said James B. Munro, a researcher from the University of Maryland School<br />
of Medicine, Baltimore, Md. “Indeed, even though the data is somewhat noisy due to issues<br />
related to nucleotide composition, the signal is still strong enough to obtain a clear answer.”<br />
A major finding of the research is that humans are likely to serve as a reservoir for P. falciparum—
that is, humans are likely to transmit this most prevalent and most dangerous of all the parasitic<br />
infections to great apes, and not the other way around. This finding contradicts previous studies,<br />
which suggested that humans can be infected with P. falciparum derived from apes. The results<br />
obtained in this study argue that “if P. falciparum infections in great apes are derived from<br />
humans, [there may be a] need to establish refuges for the great apes that are safe from human<br />
intrusion.”<br />
The research builds on the unveiling of the genome sequences of the two most widespread human<br />
malaria parasites—P. falciparum and P. vivax—and the monkey parasite P. knowlesi, together<br />
with the draft genomes of the chimpanzee parasite P. reichenowi; three rodent parasites, P. yoelii
yoelli, P. berghei and P. chabaudi chabaudi; and one avian parasite, P. gallinaceum. To examine<br />
the association between malaria parasites and their primate hosts, the researchers sought to<br />
compare genetic variations found in 45 highly conserved nuclear genes for which sequences are<br />
available from all eight Plasmodium species.<br />
The evolutionary relationships were inferred using, among others, the software package MrBayes,<br />
which consumed about 200,000 CPU hours on Abe at NCSA. The researchers accessed this resource
via the CIPRES Science Gateway, a browser interface developed at SDSC that permits access to<br />
TeraGrid compute resources. “Without CIPRES, this work, and the other projects I am working<br />
on, would not go as quickly or as smoothly,” said Munro. “CIPRES is a fantastic resource.”<br />
Other phylogenetic or divergence time analyses were conducted on the Brazos HPC cluster at<br />
Texas A&M University, and on the cluster at the Institute for Genome Sciences at the University<br />
of Maryland School of Medicine. Further studies are expected to be run on Trestles, a new TG<br />
data-intensive HPC resource housed at SDSC.<br />
2.2.17 Physical Chemistry: Gas Phase Chemistry of Molecules and Ions of Astrophysical Interest (PI: Zhibo Yang, U. Colorado)
Space is not empty. In fact, says Zhibo Yang, a post-doctoral researcher at the University of<br />
Colorado, Boulder, there’s an area of the universe called the<br />
interstellar medium (ISM) that is filled with molecules, atoms,<br />
ions, and grains of dust. And these things are conducting some<br />
very interesting chemical activity.<br />
Among these reactions, ion-molecule reactions are dominant.<br />
Yang and his colleagues are studying reactions that involve<br />
recently detected negative ions with the help of Abe at NCSA.<br />
The team is looking at reactions between ions and neutral molecules, measuring the rate constants of these reactions and the charge and mass of the negative ions using mass spectrometry. But mass spectrometry yields no information regarding the ion structure. Getting information about the ion structures and energies requires very large calculations. Yang says, "the typical calculation will last for 12 hours on our PC, but we can do the same calculation on Abe in about two to three hours."
So far, the group has completed experimental and theoretical<br />
studies of reactions between hydrogen atoms and carbon chain<br />
anions, and computational studies of gas phase reactions of<br />
carbon chain anions with nitrogen and oxygen. They are providing some fundamental, general rules to help scientists understand other related reactions.
Figure 2.17. This figure explains the low reactivity between a nitrogen atom and some typical molecular anions (deprotonated acetonitrile is used as an example here). The reaction is spin-forbidden, and the energy available for the spin conversion is very small, so the reaction is very slow.
There are other scientists who are doing modeling, explains Yang, and they want to use the team's reaction rate constants in their models.
Five oral presentations have been made at international and domestic conferences and six papers<br />
were published in 2010, in the Journal of the American Chemical Society, the Astrophysical<br />
Journal, and in Physical Chemistry Chemical Physics.<br />
2.2.18 Physics: The Geophysical High Order Suite for Turbulence (GHOST) (PI: Annick Pouquet, NCAR)
The Geophysical High Order Suite for Turbulence (GHOST) is a framework for studying<br />
fundamental turbulence via direct numerical simulation (DNS) and modeling. Running on<br />
TeraGrid computers in 2010, GHOST became the first known successful attempt to combine<br />
OpenMP and MPI in a large-scale hybrid scheme for pseudo-spectral fluid dynamics. This<br />
hybridization enables highly efficient use of computing resources both within each node and<br />
between nodes across a network. Using a TeraGrid allocation at NICS, NCAR researchers were able to do a 3,072³ simulation of a rotating flow—the largest such simulation in the world—on nearly 20,000 processor cores. The efficiency achieved with
GHOST allowed them to study flows at a higher resolution and<br />
with greater fidelity by using all the cores on each node. Such<br />
efficiency gains are allowing researchers to solve problems they<br />
could not do before by significantly reducing the processing time<br />
on large core-count computer systems. Pablo Mininni, Duane<br />
Rosenberg, Raghu Reddy, and Annick Pouquet presented their<br />
paper on the topic, "Investigation of Performance of a Hybrid MPI-OpenMP Turbulence Code," at TG'10, and it received the conference award for best paper in the technology track.
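Pseudo-spectral codes like GHOST evaluate spatial derivatives with FFTs rather than finite differences. As a minimal, generic illustration of that core idea (this is not the GHOST code, and it omits the MPI/OpenMP hybrid layering), a one-dimensional spectral derivative looks like:

```python
import numpy as np

def spectral_derivative(u, L=2 * np.pi):
    """Differentiate a periodic signal u of length n on a domain of
    length L by multiplying its Fourier transform by i*k and
    transforming back (the pseudo-spectral core operation)."""
    n = len(u)
    k = np.fft.fftfreq(n, d=L / n) * 2 * np.pi   # angular wavenumbers
    return np.real(np.fft.ifft(1j * k * np.fft.fft(u)))
```

For smooth periodic fields this is accurate to machine precision, which is why pseudo-spectral DNS codes favor it; the hybrid MPI-OpenMP scheme then distributes the transposes required by the multidimensional FFTs across nodes while threading within each node.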
2.2.19 Student Training: Use of TeraGrid Systems in the Classroom and Research Lab (PI: Andrew Ruether, Swarthmore College)
Swarthmore College uses the TeraGrid for both student<br />
coursework and faculty research. When Ruether taught<br />
Swarthmore’s Distributed and Parallel Computing course, all<br />
students were given TeraGrid accounts, and they used TeraGrid<br />
systems for class projects. Two of computer science Professor Tia<br />
Newhall’s students extended the work done in the class to a<br />
summer project and presented their results at the TeraGrid 2010<br />
conference. The students developed and tested a novel<br />
parallelization technique for solving the K-Nearest Neighbor<br />
problem. The algorithm is used to classify objects from a large set. A few examples of its use include discovering medical images that contain tumors in a large medical database, recognizing fingerprints from a national fingerprint database, and finding certain types of astronomical objects in a large astronomical database. The TeraGrid allowed the students to run the large-scale experiments necessary to demonstrate that their solution worked well for real-world problems.
Figure 2.18. These four turbulence visualizations show Taylor-Green flow at increasing resolutions using 64³, 256³, 1,024³, and 2,048³ grid points. This modeling advancement using a hybrid MPI-OpenMP code is only possible on systems with significantly more processor cores than NCAR's most powerful supercomputer. The GHOST code has been run successfully on Kraken and Ranger, and it scales well on these and other platforms for hydrodynamic runs with up to 3,072³ grid points. These results demonstrate that pseudo-spectral codes such as GHOST – using FFT algorithms that are prevalent in many numerical applications – can scale linearly to a large number of processors.
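The k-nearest-neighbor (KNN) classification the students parallelized is, in its serial form, a compact algorithm. A plain sketch of the serial version follows (the students' own parallel decomposition is not reproduced here; a common approach is noted after the code):

```python
from collections import Counter

def knn_classify(train, labels, query, k=3):
    """Label a query point by majority vote among its k nearest
    training points, using squared Euclidean distance."""
    dist = [(sum((a - b) ** 2 for a, b in zip(p, query)), lab)
            for p, lab in zip(train, labels)]
    nearest = sorted(dist)[:k]                 # k smallest distances
    votes = Counter(lab for _, lab in nearest) # majority vote
    return votes.most_common(1)[0][0]
```

A common parallelization, consistent with the description above, partitions the training set across workers, has each worker return its local k nearest candidates, and merges those candidate lists before voting, which is what makes the problem scale to the large databases mentioned.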
Swarthmore also has three faculty members—in computer science, physics and chemistry—doing publishable work using TeraGrid resources. Because Swarthmore does not have a high-performance computing system, the TeraGrid has been a key resource for faculty researchers who have reached the limits of desktop computing. In one case, this reduced the run time of a simulation from one week on a desktop computer to one hour using Purdue's Condor pool, which offers nearly 37,000 processors and is available as a high-throughput TeraGrid resource. This allowed the research group to run many more simulations than it would otherwise have had time to complete.
Figure 2.19. The gray lines are magnetic field lines in plasma and the colored dotted lines are charged particle orbits through the magnetic field. TeraGrid resources were used to model many particle orbits to get a sense of the confinement in this configuration. Knowledge gained from the modeling could be useful in designing fusion reactors based on magnetic confinement.
In addition, Swarthmore has
used high-performance TeraGrid systems at LONI, NCSA and TACC. For example, chemistry<br />
Professor Paul Rablen has been using the TeraGrid for an investigation of rearrangements in<br />
certain carbenes, highly reactive carbon molecules widely used as efficient catalysts in the pharmaceutical industry, among other places. Rablen and a collaborator from Colby College have
submitted a paper describing their project to the Journal of Organic Chemistry. In addition,<br />
Michael Brown, professor of physics, is working on ways to design fusion reactors and uses the<br />
TeraGrid to model the behavior of the plasma in a magnetic containment field. Swarthmore<br />
senior Dan Dandurand, one of Brown’s students, calculated the complex orbits of more than a<br />
billion energetic protons, calculations that help shed light on magnetic confinement fusion. In<br />
another set of calculations, Dandurand determined the fraction of energetic protons collected by a<br />
simulated probe in the plasma. These calculations helped to calibrate and understand an actual<br />
probe used in experiments. The researchers have submitted a paper using some TeraGrid-produced results, "Calibrated Cylindrical Mach Probe in a Plasma," to the Review of Scientific Instruments.
3 Software Integration and Scheduling<br />
The software integration area covers the operational tasks associated with keeping the TeraGrid’s<br />
coordinated software up-to-date and consistently configured across all the HPC, HTC, and storage resources, as well as a number of small capability improvement projects. In PY6, these
projects include improving our advanced scheduling capabilities, enhancing our information<br />
services, supporting NSF’s software cyber-infrastructure development and maintenance programs<br />
(SDCI, STCI), improving documentation, and formalizing our application hosting capabilities.<br />
3.1 Highlights<br />
TeraGrid began moving its GRAM5 services (used for remote job submission, primarily by<br />
science gateways) to production status, replacing older GRAM4 services that had been used for<br />
more than a year and in some cases as much as two years. These updates will enable science<br />
gateways to begin reporting end user identities to HPC systems when jobs are submitted under<br />
community (gateway-wide) credentials.<br />
We also made modest improvements to our Automatic Resource Selection capabilities and added<br />
resource descriptions (both HPC and data resources) and science gateway Web service<br />
descriptions to our integrated information services.<br />
3.2 Coordination and outreach<br />
The GIG Software Integration team participated in various TeraGrid internal coordination<br />
activities and in several significant external outreach and collaboration activities.<br />
Notable internal coordination activities included working with the Science Gateways areas on the<br />
rollout of Globus Toolkit Version 5 (GT5) related capabilities, and ongoing discussions with the<br />
Common User Environment (CUE) team on the CUE implementation.<br />
3.3 Capability Expansion and Maintenance<br />
Capability expansion and maintenance include the coordinated delivery, deployment,<br />
maintenance, and support of Coordinated TeraGrid Software and Service (CTSS) capabilities.<br />
This includes Advanced Scheduling and Information Services capabilities that are presented in<br />
their own sections below.<br />
The most notable capability expansion this quarter was the delivery of improved Science<br />
Gateway Support and Remote Computation capabilities with the new Globus Toolkit GRAM5<br />
service. In addition to GRAM5, we are also updating other capability kits to include the latest<br />
versions of Globus software, including the data movement capabilities (GridFTP servers) and<br />
several client capabilities (Globus, UberFTP, GSI OpenSSH, and MyProxy client tool updates).<br />
This activity started with GRAM5 Alpha testing in the summer of 2009 and culminates this<br />
quarter with production deployments by RPs and testing by various Science Gateways. This<br />
deployment activity will continue through 2011Q1.<br />
We are also delivering updates to TeraGrid’s Resource Integration capability (to enable<br />
publishing to the new Resource Description Repository and publishing TeraGrid storage resource<br />
information), our Science Workflow Support capability (updating Condor to version 7.4.4), and<br />
our Data Movement capability (adding support for TeraGrid’s long-term data archive replication<br />
mechanism). We also released the first official TeraGrid Visualization Software and Services<br />
(VTSS) registration capability enabling Viz resources to advertise visualization software<br />
information.<br />
3.4 Advanced Scheduling Capabilities<br />
SDSC improved the reliability of TeraGrid’s Automatic Resource Selection capability by adding<br />
persistent scheduling data to the TGCDB (central database). We also began work toward making<br />
MCP (one of our automatic resource selection tools) available from TeraGrid resource login<br />
nodes in addition to the TeraGrid user portal. To support MCP—and perhaps other applications—<br />
we reviewed the practice and gave approval to issue proxy credentials with up to 11-day<br />
lifetimes. We also developed a method for transferring MCP job files from the user portal to<br />
remote login hosts.<br />
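MCP's selection logic is not described in this report; purely to illustrate what automatic resource selection involves, a toy scorer that picks the qualifying resource with the shortest estimated queue wait (names, fields, and policy all hypothetical, not the MCP algorithm) might look like:

```python
def pick_resource(resources, cores_needed):
    """Return the name of the candidate resource with the shortest
    estimated queue wait among those that can satisfy the request.
    `resources` maps name -> (free_cores, est_wait_hours); this is
    purely illustrative, not how MCP actually scores resources."""
    candidates = {name: wait for name, (free, wait) in resources.items()
                  if free >= cores_needed}
    if not candidates:
        raise ValueError("no resource can satisfy the request")
    return min(candidates, key=candidates.get)  # shortest estimated wait
```

A real selector such as MCP also weighs factors like allocation balances and data locality; persisting the scheduling data in the TGCDB, as described above, is what keeps such estimates available across sessions.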
3.5 Information Services Enhancements<br />
We took the opportunity this quarter to improve the overall accuracy and consistency of the data<br />
in our information services. TeraGrid’s information services are populated by data that is<br />
published independently by each RP, which creates opportunities for inconsistent terminology<br />
and uneven accuracy across systems. To measure the current quality of the data, we ran several<br />
data consistency sweeps, checking for inconsistent naming, versions of software that were<br />
inconsistent with documented expectations, and data that simply didn’t match our understanding<br />
of reality. The Software WG dedicated one meeting to reviewing these inconsistencies and<br />
assigning fixes to individual publishers, and we later checked that fixes had been made.<br />
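Such a sweep amounts to diffing what each publisher reports against the documented expectation. A simplified sketch of that check (field names and version strings hypothetical):

```python
def consistency_sweep(expected, published):
    """Compare per-resource published software versions against the
    documented expectations; return a list of discrepancy records
    (resource, package, description) for publishers to fix."""
    issues = []
    for resource, packages in published.items():
        for pkg, version in packages.items():
            want = expected.get(pkg)
            if want is None:
                issues.append((resource, pkg, "unknown package name"))
            elif version != want:
                issues.append((resource, pkg,
                               f"published {version}, expected {want}"))
    return issues
```

Running such a sweep periodically, and re-running it after fixes are assigned, is essentially the verification loop the Software WG followed.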
We are now using TeraGrid’s information services to publish information from the TeraGrid<br />
Resource Description Repository (RDR). Each RP includes resource description information in<br />
RDR, and information services pulls that information and publishes it for unified presentation to<br />
users and staff.<br />
Led by personnel at RENCI, the Science Gateway team is beginning to publish data in our<br />
information services that enumerates and describes the Web services offered by science<br />
gateways. This is an important example of how non-TeraGrid organizations (remember that
science gateways are not operated by TeraGrid itself, but by science teams with independent<br />
funding) can use the TeraGrid infrastructure to disseminate useful scientific tools.<br />
3.6 Operational Issues<br />
The SDSC RP continued preparing the new Gordon resource for TeraGrid service, including<br />
preparing CTSS capabilities and software. The Gordon team reported that they would like to see<br />
more comprehensive and easier-to-use installation documentation for CTSS capabilities. The GIG<br />
software integration team is working to improve this documentation per this feedback.<br />
The TACC RP prepared Globus GRAM5 software for use on multiple resources at TACC (see<br />
“Capability Expansion and Maintenance” above), and they reported that it has been difficult to<br />
get GRAM5 from Globus 5.0.2 (the most recent update) to work uniformly on all of their<br />
resources, particularly in regard to parallel (MPI) jobs that use multiple processes. The Globus<br />
development team (funded separately from TeraGrid) has been working with TACC personnel to<br />
resolve these issues.<br />
CTSS’s remote login, data movement, application and parallel application development and<br />
runtime support, and TeraGrid core integration capabilities are all used on most TeraGrid<br />
systems. Use of the CTSS science gateways capability (which allows remote job submission by science gateways) and the remote compute capability (which allows remote job submission by all other users) varies widely: these capabilities are used heavily at three sites, frequently at three sites, and infrequently at two sites. Only two resource providers offer the data visualization capability, but
both report infrequent use. The data management capability is used relatively infrequently at the<br />
two sites where it is available. The science workflow support capability is available on most<br />
resources and is used relatively infrequently, but each use results in many job submissions,<br />
making it difficult to compare to the other capabilities.<br />
4 Science Gateways<br />
4.1 Highlights<br />
2.2M CPU hours were used via community accounts this quarter, serving 942 gateway users via<br />
12 gateways. 32% of all users charging jobs to the TeraGrid this quarter did so using community<br />
accounts via a gateway. These figures are achieved primarily through the explosion of user<br />
interest in the CIPRES portal, with 751 users. The portal sees use throughout the US, including<br />
17 EPSCOR states and throughout the world. It is used in many classes, but also used in research<br />
resulting in publications in major journals such as Nature, Cell and the Proceedings of the<br />
National Academy of Sciences.<br />
Figure 4-1. Q4 Science Gateway usage by machine.
Figure 4-2. Q4 usage by gateway.
The Indiana University Science Gateway team worked with Prof. Borries Demeler and Dr. Emre<br />
Brookes of the University of Texas Health Science Center to transition their UltraScan science<br />
gateway infrastructure from TACC’s Lonestar to Ranger systems. During the current reporting<br />
quarter, UltraScan's Ranger allocation usage went from 0 to 57,747 SUs, using the GFAC service of the Open Gateway Computing Environments (OGCE) project to manage its ultracentrifugation experimental data analysis and to mask differences between Globus GRAM versions. This work also involved significant testing of the new GRAM5 services on Ranger, which greatly accelerated TeraGrid's deployment schedule.
4.2 Targeted Support<br />
CIPRES.<br />
The CIPRES gateway can now support individual user allocations. Heavy users of the gateway<br />
can apply for their own allocation and access it directly through the same familiar gateway<br />
interface. With the high demand for the gateway, this is a method for extending the allocation, but it will also enable individual researchers to accomplish their own research goals unfettered by limited computational time.
Tools were moved from Lonestar to Abe due to the temporary decommissioning of Lonestar during its upgrade. Improvements to the gateway user interface include bulk file upload and download. Next quarter's plans include porting the RAxML, MrBayes and MAFFT codes to the new Trestles system.
Cactus.<br />
The basic configuration and simulation commands for SimFactory now work with Adcirc, a<br />
coastal circulation and storm surge model. SimFactory (http://simfactory.org) unifies middleware for remote access and job management. The additional commands --adcprep-schedule, --enable-adcprep-submission, and --adcprep-walltime were added to address Adcirc's specific needs.
Several additional capabilities have been developed for SimFactory. Instead of submitting directly to a queue, it is now possible to request an interactive job using SimFactory's Run/Interactive submit feature and then manually run the job inside the interactive shell. This facilitates running on private clusters that don't have a queuing system. It is also now possible to launch TotalView on the TeraGrid machines Queenbee, Abe, and Kraken, and on the local machine Eric, using SimFactory; this also relies on the Run/Interactive submit feature. DDT debugging support will be available next quarter.
Advance Reservation will be available soon on Queenbee and on other machines that use Moab.
GridAMP.<br />
This gateway has been taking a leadership role working with sites that are using the community<br />
shell commsh. Commsh restricts the directories where commands can be accessed, thus adding a<br />
level of security. GridAMP is working with Kraken on their commsh implementation.<br />
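The restriction commsh enforces, permitting commands only when they come from approved directories, can be sketched as follows (illustrative only; the whitelist shown is hypothetical, not an actual site policy or the commsh source):

```python
import os.path

# Hypothetical policy: only commands installed in these directories run.
ALLOWED_DIRS = ["/usr/local/gateway/bin", "/usr/bin"]

def is_permitted(command_path, allowed=ALLOWED_DIRS):
    """Permit a command only if its normalized directory is one of the
    whitelisted directories (the kind of restriction commsh enforces)."""
    directory = os.path.dirname(os.path.normpath(command_path))
    return directory in allowed
```

Normalizing the path first matters: without it, a path like `/usr/bin/../../tmp/evil` would appear to live in an approved directory.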
Maintenance work next quarter includes rewriting a daemon to use GRAM2/5 instead of WS-<br />
GRAM. The team is also considering an ssh/gsissh-based method, like that of the CCSM Portal, to utilize the PBS scheduler's "job chaining" ability.
GridAMP staff have also co-authored a paper with security staff entitled “Securing TeraGrid<br />
Gateways”. The paper includes a survey of TeraGrid resource providers on their approaches to<br />
securing community accounts and lays the groundwork for a more standardized approach<br />
throughout the program. This work is described in more detail in the Gateway Infrastructure and<br />
Services section below.<br />
The gateway has automated TeraGrid central database (TGCDB) accounting feedback using<br />
REST-based services developed by gateway accounting staff (section 4.3). Future plans include<br />
updates to use new (January 2010) versions of the interface. The accounting data from TGCDB<br />
now feeds back all the way to the gateway for examination by the individual researchers and the<br />
PIs. Job lifecycle is tracked from submission through execution and completion, and now all the<br />
way through the accounting record. Every job is tracked from start to finish, and every SU is<br />
accounted for.<br />
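The reconciliation described above can be sketched as follows; the JSON field names and job IDs are invented for illustration and do not reflect the actual TGCDB REST schema:

```python
import json

# Hypothetical accounting records as a REST service might return them;
# the field names are illustrative, not the actual TGCDB schema.
response_body = json.dumps([
    {"job_id": "gw-101", "user": "alice", "status": "completed", "su_charged": 128.0},
    {"job_id": "gw-102", "user": "bob",   "status": "completed", "su_charged": 64.5},
    {"job_id": "gw-103", "user": "alice", "status": "running",   "su_charged": 0.0},
])

submitted_jobs = {"gw-101", "gw-102", "gw-103"}
records = json.loads(response_body)

# Reconcile: every submitted job should appear in the accounting feed.
unaccounted = submitted_jobs - {r["job_id"] for r in records}

# Roll up charged SUs per user for display to researchers and PIs.
su_by_user = {}
for r in records:
    su_by_user[r["user"]] = su_by_user.get(r["user"], 0.0) + r["su_charged"]

print(unaccounted or "every job accounted for")
print(su_by_user)
```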
PET.<br />
Work has begun on a gateway prototype, leveraging the efforts of a new NCAR-funded software engineer for science gateway development. The project will first focus on automating the execution portion of the workflow, allowing execution on grid-based and remote resources. Community members will be able to upload full experiment definitions, and the system will automate the execution of the models and return results to the researcher. Once this works, it will be used as the back-end for the new gateway work. The gateway web development is separately funded by NCAR; it will take the grid-automated workflow and overlay an interface that allows web-based users to run selected prototype experiments while varying the input parameters within constraints.
The PET model is a global economy model that simulates the actions of various sectors of the economy for regions of the world. Economic inputs and outputs are balanced within a region and between regions through trade, given prespecified growth rates of quantities such as population.
The goal of the gateway project is to make it possible for people to experiment with how<br />
changing parameters changes the behavior of the economy. For example, how changing the<br />
interest rate, the savings rate, the capital tax rate, or introducing carbon taxes in a certain region<br />
affects sectors of the economy such as energy resource use and capital savings.<br />
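A toy stand-in for such a parameter sweep is sketched below; the growth rule and coupling constant are purely illustrative and are not taken from the PET model:

```python
# Toy stand-in for a "what if" sweep over one policy parameter.
# This is NOT the PET model; the growth rule below is purely illustrative.

def sector_output(years, growth_rate, carbon_tax):
    """Compound a sector's output, damped by a hypothetical tax drag."""
    output = 100.0  # arbitrary base-year output
    effective_growth = growth_rate - 0.5 * carbon_tax  # invented coupling
    for _ in range(years):
        output *= 1.0 + effective_growth
    return output

# Sweep the carbon tax within allowed bounds, as a gateway user might.
for tax in (0.00, 0.02, 0.04):
    print(f"carbon tax {tax:.2f}: output after 10 years = {sector_output(10, 0.03, tax):.1f}")
```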
The desired outcome is a gateway that can quickly be used to help address the “but what if” comments that always arise when people encounter results from a model such as this. Upon hearing about the experiments, many people start off by saying, “Well, this is interesting, but what if (assumption) was (some slightly different value) instead of what you used?” The gateway should allow users to run moderately computationally intensive experiments to address these questions.
This work is in the early prototype stage.<br />
Social Informatics Data Grid (SIDGrid).<br />
The team has improved the portability of existing gadgets (tunneling requests through the container; gadget-to-gadget communication using HTML5 facilities) and tested these in an iGoogle container. Client-side script and server-side support have been developed to simplify the introduction of new applications into the portal. Work began on exposing multiple local and remote compute resources through the portal.
SIDGrid developer Tom Uram presented demonstrations of his work at the Argonne booth at SC10 and also served as a committee member for the Gateway Computing Environments workshop at the conference.
Support continues for OOPS (protein simulation) users of the portal, including the addition of new scripts to the portal.
Next quarter plans include implementation of a data management front-end to allow users to<br />
upload, manage, and process data using gadget-based applications defined in the portal.<br />
Alternatives will be investigated for managing the applications available to the portal, including integration with source code management systems used by scientists. Work will continue on gadget portability, including testing in third-party containers such as iGoogle and OGCE.
Approaches will be evaluated for single sign-on in gadget context so that users won’t have to<br />
authenticate with each gadget in a container page.<br />
Support for groups using the OOPS (Open Protein System) portal will continue.<br />
GCE Workshop.<br />
The IU Science Gateway team organized the sixth Gateway Computing Environments (GCE) workshop at Supercomputing 2010. This year’s workshop featured presentations of 13 peer-reviewed papers, to be published in IEEE Digital Proceedings. The one-day workshop was attended by approximately 50 people. Presentations and additional information are available from the workshop website, http://www.collab-ogce.org/gce10/index.php/Main_Page.
UltraScan.<br />
The Indiana University Science Gateway team worked with Prof. Borries Demeler and Dr. Emre Brookes of the University of Texas Health Science Center to transition their UltraScan science gateway infrastructure from TACC’s Lonestar system to the Ranger system. This involved removing
UltraScan dependencies on Globus GRAM4 and significant testing of Ranger’s new GRAM5 job<br />
manager (documented in previous quarterly reports). UltraScan now uses the GFAC service<br />
developed by the Open Gateway Computing Environments (OGCE) to manage its<br />
ultracentrifugation experimental data analysis. GFAC’s capabilities include masking the<br />
differences between GRAM versions. During the current reporting quarter, UltraScan’s Ranger<br />
allocation usage went from 0 to 57,747 SUs. The IU team is also assisting the UltraScan team as<br />
it develops its next generation software (UltraScan3, led by Dr. Bruce Dubbs). Both UltraScan2<br />
and UltraScan3 use IU’s Gateway Hosting Service for significant testing and development. IU<br />
work was led and performed by Suresh Marru and Raminder Singh, with assistance from<br />
Patanachai Tangchaisin (student intern). As part of this overall effort, the Indiana University Science Gateway team worked closely with Globus developers and TACC & LONI system administrators in testing and debugging problems with GRAM5 deployments and job manager integration. Through its testing and bug fixing, the IU team has assisted the TeraGrid Software Working Group in rolling out the new GRAM job submission into the CTSS release.
In the upcoming quarter, the advanced support will focus on resource scheduling and job queue<br />
predictions.<br />
BioVLAB.<br />
BioVLAB is a set of projects led by Prof. Sun Kim at IU that investigates problems in cancer<br />
research that can be realized as bioinformatics workflows. Kim implements the workflows using<br />
the OGCE workflow suite of tools (including the GFAC, the XBaya workflow composer, and the<br />
Workflow Interpreter service). Prototyping work is done on Amazon EC2 cloud infrastructure,<br />
but the size of the full research problems requires the use of the TeraGrid. Suresh Marru of the IU Science Gateway team assisted Prof. Kim in requesting and obtaining a TeraGrid allocation (reported last quarter). During the upcoming quarter, Marru will work with Dr. Kim’s graduate student to support long-running applications and conditional loops within the workflows.
Ocean Land Atmosphere Simulation (OLAS).<br />
The IU Science Gateway Team worked with Mike Lowe to acquire new data storage hardware to meet the real-time data needs of the OLAS gateway for hurricane simulation. Suresh Marru from the gateways team has set up a Quarry gateway hosting node for the OLAS gateway. The Unidata LDM data ingesting tools are deployed to fetch real-time data streams from the National Hurricane Center and other NOAA sources. The project PI, Dr. Craig Mattocks, is in a job transition from the University of North Carolina at Chapel Hill. This project will slow down in the upcoming quarter.
GridChem/ParamChem.<br />
The IU gateways team, including student interns Ye Fan and Patanachai Tangchaisin, has built an OGCE-GridChem bridge service and has integrated the OGCE workflow suite with the GridChem middleware services. As reported in the previous year, the gateways team provided advanced support to the GridChem gateway to enhance its single-job submission into coupled workflows. The newly developed OGCE-GridChem bridge has closed the loop on the workflow integration: GridChem users will now be able to monitor workflow progress directly from the GridChem clients. Within the current quarter, the gateway team will assist Sudhakar Pamidigantam and Jayeeta Gosh from GridChem in incorporating the new developments into the upcoming release.
Neutron Science TeraGrid Gateway.<br />
The ORNL Neutron Science TeraGrid Gateway (NSTG) spent a significant amount of time this quarter continuing its outreach to the neutron science community, specifically at Oak Ridge. The NSTG is working with the neutron science directorate at Oak Ridge to transition the gateway to full community operations, with support of the gateway and the NSTG RP/SP on a TeraGrid contributory basis moving forward. This quarter built upon the prior quarter's successful physical infrastructure move to the Spallation Neutron Source (SNS) site. Labor effort was spent on building SNS in-house familiarity with TeraGrid operations and culture. This has been a challenge, but not one without progress, and the effort is continuing. Additional effort has been spent developing and presenting strategies that explain the mutual benefit and enable both the continuation of current activities and the realization of opportunities not yet explored. In particular, the ability to move data at high bandwidth and low latency from the scientific instruments to large-scale cyberinfrastructure facilities is understood and appreciated by the SNS and will, most likely, form the core of the continuing effort.
John Cobb presented the capstone talk at the NOBUGS 2010 conference describing the NSTG<br />
activities and integration with SNS as a “Best practice” for the integration of national<br />
cyberinfrastructure with an experimental user facility. Cobb was also on the conference organizing committee.
GISolve.<br />
This team continues to explore new applications and communities for high performance<br />
computing. Recent interactions have been with the land use science community. Future plans<br />
include working with a public health community. Several publications from the group are listed<br />
in this report’s appendices.<br />
CCSM/ESG.<br />
The joint team is making progress as planned in all three major project areas. Next quarter,<br />
overarching goals include completion of work on software licensing and the completion of phase<br />
I of the prototype CESM1 gateway. This will support running CESM1 simulations on Steele,<br />
automatic production of attributes, and automatic publishing of model output to ESG.<br />
Task 1 is to provide interfaces for remote model run invocation. CESM1 is being tested on Steele<br />
at Purdue. The team has partially resolved the MPI communication problem, but there are still<br />
some problems scaling up. CESM-WS web service interfaces have been deployed to a production<br />
server, including bug fixes and performance improvements. Incremental changes support new<br />
requirements from the CESM1 gateway client. The CESM1 gateway directly invokes the CESM-<br />
WS interfaces and currently supports case creation/configuration/submission and status tracking.
Next quarter the team will address the CESM1 scaling issues, complete model validation and add<br />
support for data access and publishing of model output from the prototype CESM1 gateway to<br />
ESG.<br />
Task 2 is to publish Purdue CCSM archive data to ESG. The memory of the Purdue ESG data node has been upgraded. The FUSE interface to the ESG test gateway has been tested, and an iRODS CCSM3 data collection was successfully published (manually) through this interface.
Next quarter the team will automate publishing of model output from the CESM1 gateway to<br />
ESG test gateway.<br />
Task 3 is to collect and publish CESM1 model data and metadata back to ESG. Work continues<br />
on instrumenting CESM such that it can produce a METAFOR-compliant CIM document<br />
(WCRP/CMIP5 metadata). Support was added for outputting ESMF attributes from the CESM<br />
components and driver, and the attribute suite was further analyzed. The team tested the system and provided code back to the CESM team for integration. The team also completed development of an initial
version of an ATOM feed for ingesting CIM documents into ESG. Integration of the feed was<br />
tested with ESG, publishing model metadata back into it.<br />
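A minimal Atom feed of the kind described can be sketched with the standard library; the IDs, titles, and URL below are made up, and the real feed would carry METAFOR CIM content:

```python
import xml.etree.ElementTree as ET

ATOM_NS = "http://www.w3.org/2005/Atom"
ET.register_namespace("", ATOM_NS)

def atom(tag):
    """Qualify a tag name with the Atom namespace."""
    return "{%s}%s" % (ATOM_NS, tag)

# Build a minimal Atom feed carrying one CIM document reference.
# The id, title, and href values are invented for illustration.
feed = ET.Element(atom("feed"))
ET.SubElement(feed, atom("title")).text = "CESM1 CIM documents"
ET.SubElement(feed, atom("updated")).text = "2010-12-31T00:00:00Z"

entry = ET.SubElement(feed, atom("entry"))
ET.SubElement(entry, atom("id")).text = "urn:example:cim:run-0001"
ET.SubElement(entry, atom("title")).text = "CESM1 simulation metadata (CIM)"
link = ET.SubElement(entry, atom("link"))
link.set("href", "http://example.org/cim/run-0001.xml")

print(ET.tostring(feed, encoding="unicode"))
```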
Next quarter the team plans to review metadata content and completeness at Purdue, add code to CESM to automatically set attributes from within the CESM code, integrate it as a CESM tagged release, and test it at Purdue. Packaging of the ATOM feed will be refined and released as a v1.0.0 open
source version. Basic installation and usage documentation will be completed and deployed at<br />
Purdue.<br />
Purdue Environmental Data Portal.<br />
The Environmental Data Portal has integrated three new data sets: the Indiana precipitation dataset, St. Joseph watershed water quality and stream flow data, and a historical time series of average precipitation and temperature data for the Midwest.
Purdue has also created a data publishing web application so individual researchers can self-publish and share their datasets. Both file-based and table-based data are supported. Completion is expected next quarter.
The Springboard hub extends access to HPC beyond the command line. Access to Purdue’s Wispy cloud has been integrated, and version 1 of the Cloud Dashboard, a user interface hosted in Springboard, allows TeraGrid users to launch, control, and run applications on Wispy. This was demonstrated at CloudCom in December 2010. Springboard has also been upgraded with the latest HUBzero software.
4.3 Gateway Infrastructure and Services<br />
Accounting Tools.<br />
A new job monitoring portlet has been deployed in the TeraGrid user portal and is running at all<br />
sites. The portlet includes start-time predictions for queued jobs. The User Profile Service caches job information from the TeraGrid central database. It now includes an expanded search interface, a view of real-time jobs, attribute support, and performance improvements, responding to user requests.
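One simple way to produce such start-time predictions is a percentile over recent queue waits for similar jobs; this heuristic is illustrative and not necessarily the portlet's actual algorithm:

```python
# A simple start-time estimate from historical queue waits (in minutes).
# This percentile heuristic is illustrative only; it is not necessarily
# the algorithm the TeraGrid portlet actually uses.

def estimate_wait(historical_waits, percentile=0.75):
    """Return the wait below which `percentile` of similar past jobs started."""
    waits = sorted(historical_waits)
    index = min(int(len(waits) * percentile), len(waits) - 1)
    return waits[index]

recent_waits = [5, 12, 7, 30, 9, 45, 11, 8]  # made-up samples
print(estimate_wait(recent_waits))  # 30, the 75th-percentile wait
```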
Improved standardization of community accounts.<br />
A survey of TeraGrid RP sites regarding gateway security items has been completed. Results are being written up in a white paper.
Security work also includes installation and testing of GRAM5 and its interoperation with commsh. The discovery of incomplete configuration instructions resulted in rerunning tests for the white paper. Subsequently, improvements have been made to the configuration instructions as well as to the GRAM5 installation instructions.
Next quarter, several projects will be completed: GRAM4 installation on Kraken (including integration with commsh and gramAudit, and testing with the GridAMP gateway), commsh documentation updates, and the white paper on securing TeraGrid science gateways.
Work will begin to convert the GridAMP gateway to GRAM5 on Kraken.
Virtual machine support.<br />
During this quarter, Indiana University added virtual machines to support failover for the<br />
TeraGrid Mobile Portal and TeraGrid Forum.<br />
SimpleGrid.<br />
Job management and visualization Open Social gadgets have been developed for SimpleGrid this<br />
quarter. These were demonstrated at SC10. SimpleGrid gadgets allow scientists to use TeraGrid<br />
in iGoogle and other open social containers, contributing to broad use and easy integration of<br />
TeraGrid capabilities through social networking.<br />
SimpleGrid online courses are being developed in CI-Tutor, which is used extensively in<br />
TeraGrid’s education, outreach and training activities. Previous online courses have been<br />
extended to include more examples and better instructions.<br />
Next quarter, the team will deploy SimpleGrid as a cloud resource for new science gateway<br />
development support. Users will be able to request a virtual machine image with SimpleGrid<br />
installed.<br />
Helpdesk support.<br />
The gateway helpdesk team continued to help new gateway developers including some from<br />
National University through the use of the SimpleGrid tutorial.<br />
Work initiated by a helpdesk ticket from the CMMAP gateway has evolved into policy discussions that have had positive security implications. This work may lead to improved standardization across Resource Providers in the treatment of community accounts. Helpdesk staff also provided advice to CMMAP on the setup of a gateway-side GridFTP server.
A helpdesk ticket inquiring about support for a socket-based application with web services from<br />
NCAR has resulted in an initial meeting with the security-wg to discuss the usage model and<br />
determine whether it can be supported by any Resource Providers.<br />
The helpdesk staff maintain a FAQ at http://gw3.quarry.iu.teragrid.org/simplegrid2/docs/faq/
which will be incorporated into the TeraGrid Knowledge Base and documentation.<br />
4.4 RP Operations: SGW Operations<br />
During this quarter RPs worked with developers and providers of Scientific Gateways on specific<br />
RP resources in order to ensure that operation of each gateway is secure and that there is a means<br />
for tracking accountability for gateway usage.<br />
All labor funding in the extension period comes through the GIG and so those activities are all<br />
captured in the above sections.<br />
5 Users and User Support<br />
User support in the TeraGrid comprises interlocking activities in several project areas:<br />
documentation, allocations and accounting services in the User Facing Projects and Core<br />
Services area (Section 6 of this report); the TeraGrid Operations Center in the Networking,<br />
Operations and Security area (Section 8); user outreach and training in the Education, Outreach<br />
and Training/External Relations area (Section 10); and the Frontline User Support and Advanced<br />
User Support services described in the current section. In addition, specialized user support<br />
services are provided by staff from the Science Gateways (Section 4) and Data and Visualization<br />
(Section 7) areas. The synergy of all these aspects of user support is coordinated by the User<br />
Interaction Council (UIC) which consists of the GIG Director of Science and the directors of the<br />
above mentioned areas.<br />
5.1 Highlights<br />
The User Support (US) and Advanced User Support (AUS) staff continued their highly rated<br />
services for TeraGrid users at every level. As needed, US and AUS staff collaborated with the<br />
Data and Visualization, Science Gateways, EOT and UFC staff to provide comprehensive support<br />
for TeraGrid users.<br />
The UIC and the Extreme Scalability Working Group initiated the planning of the 5 th annual<br />
TeraGrid/Blue Waters Workshop. This will be held in Pittsburgh, PA April 14-15, 2011 with the<br />
topic “Data-Intensive Analysis, Analytics, and Informatics”.<br />
Successful collaborations between various Advanced User Support staff and TeraGrid PIs<br />
resulted in joint journal publications (Phillip Blood, PSC) and conference presentations (Wayne<br />
Pfeiffer, SDSC; Raghu Reddy, PSC), including Gordon Bell finalists’ presentation (Yifeng Cui,<br />
Amit Chourasia, SDSC) at SC10. One of the EOT highlights this quarter was the teaching of an introductory HPC workshop in Puerto Rico (at the University of Puerto Rico at Rio Piedras and the University of Puerto Rico at Mayaguez) in December. Several AUS staff, including Marcela Madrid (PSC), Mahidhar Tatineni (SDSC), Amit Majumdar (SDSC), and Sergiu Sanielevici (PSC), participated in this workshop.
User support activities focused on the introduction of the new systems to be fielded by several<br />
RPs. Friendly use of Blacklight at PSC has already resulted in remarkable scientific progress,<br />
notably by Dr. Sarah Young from the University of Arizona and the Broad Institute of Harvard<br />
and MIT, working with PSC senior scientific specialist Dr. Phillip Blood. This group, led by Dr.<br />
Steve Rounsley (U. Arizona), is comparing different assembly techniques, codes, and genomes on<br />
Blacklight using simulated genome data, in order to inform the genomics community of the<br />
optimal computational strategies to run their assemblies. As Blacklight has been built up, they<br />
have run assemblies of up to 600 megabases (the Sorghum genome) with various data sets and
levels of coverage using the assembly codes ABySS, SOAPdenovo, and Velvet. Early access to<br />
Blacklight has enabled them to quickly try different assembly parameters and different assembly<br />
codes to find optimal ways of assembling these large genomes. They are currently working on<br />
analyzing the data they have generated. They publish the results of their study for the community at http://www.plantagora.org/.
SDSC user services staff members installed software, ran benchmarks, and created a user guide in<br />
preparation for putting the new Trestles resource into production. The machine was made<br />
available for friendly use in early December. Detailed benchmarking was performed for several<br />
applications, including AMBER, NAMD, ABAQUS, Velvet, SOAPdenovo, and ABySS, and the final results will be made available online. They also continued to enhance software and services
on Dash. A new framework was developed to enable users to run Hadoop on Dash and the<br />
service is currently being tested by friendly users. In addition, the production vSMP software was<br />
upgraded and tested. Early user experiences with Dash were presented at the “Grand Challenges<br />
in Data-Intensive Discovery” conference organized and hosted by SDSC in October 2010.
We have also focused on assisting potential and new TeraGrid users, whom we recruit in<br />
increasing numbers and who present novel challenges different from those of the traditional HPC<br />
community. For example, Purdue support staff showed a potential TeraGrid user that it is<br />
possible to run his Delphi code on a Windows environment via the Condor Pool. They assisted<br />
another user to run his software on the Condor Pool under Linux instead of the initial request of<br />
running on Windows; and helped another to explore whether he could install and use Numerical<br />
software on TeraGrid machines.<br />
5.2 User Engagement<br />
User engagement is done by various means. Under the User Champions program, RP consultants<br />
are assigned to each TRAC award right after the results of an allocations meeting become known.<br />
The assignment takes place by discussion in the user services working group, taking into account<br />
the distribution of an allocation across RP sites and machines, and the affinity between the group<br />
and the consultants based on expertise, previous history, and institutional proximity. The assigned<br />
consultant contacts the user group as their champion within the TeraGrid, and seeks to learn about<br />
their plans and issues. We leverage the EOT area’s Campus Champions program (Section 10) to<br />
fulfill this same contact role with respect to users on their campuses, especially for Startup and<br />
Education grants. Campus Champions are enrolled as members of the user services working group, and are thus trained to become “on-site consultants,” extending the reach of TeraGrid
support. To help users efficiently scale their codes and ancillary data processing flows to the tens<br />
of thousands of processes (on Ranger and Kraken), the Advanced User Support and User Support<br />
areas jointly operate the extreme scalability working group. Led by Nick Nystrom (PSC), this<br />
team brings together staff experts from all RP sites, external partners working to develop<br />
petascale applications and tools, as well as TeraGrid users and awardees of the NSF PetaApps,<br />
SDCI and STCI programs.<br />
Some examples of issues raised by users and actions taken in response are as follows. In the Fall<br />
2010 NICS User Survey, several respondents noted that the weekly NICS Capability (full-<br />
machine) runs limit use of the system by smaller jobs. In response, NICS has decided to target the<br />
new cores that will be added to Kraken (15% of the total) to such jobs.<br />
At PSC, suggestions and requests made by production users of Pople and by friendly users of Blacklight resulted in the installation of new versions of R, AMBER, and LS-DYNA; making Gaussian easier to run; resolving issues encountered when launching Gaussian jobs via the GridChem Science Gateway; and improving the instructions for changing the PSC password for users who prefer to log into PSC systems directly.
In response to users’ requests to set up and run Python on Steele, Purdue staff created and
documented specific examples. They also documented how to run codes in the Condor pool using<br />
static libraries, and how to run RMPI on Steele. The latter was done based on requests from the<br />
Campus Champions group.<br />
5.3 Frontline User Support<br />
In this section we describe the frontline user support provided by TeraGrid user support staff. In<br />
addition, TeraGrid provides online documentation, a knowledgebase and training support for<br />
“self-serve” help and this is explained in detail in §6. By means of the UIC, Frontline and<br />
Advanced support functions are operationally integrated with the TeraGrid Operations Center<br />
(TOC) and the “self-serve” online help team.<br />
The TOC help desk is the first tier of user support for the TeraGrid. It is staffed 7x24 at NCSA.<br />
All TeraGrid users are asked to submit problem reports to the TOC via email to<br />
help@teragrid.<strong>org</strong>, by web form from the TeraGrid User Portal, or via phone (866-907-2382).<br />
The TOC creates a trouble ticket for each problem reported, and tracks its resolution until it is<br />
closed. The user is automatically informed that a ticket has been opened and advised of the next<br />
steps, as well as of a ticket number enabling follow-up.<br />
If a ticket cannot be resolved within one hour at the TOC itself, it gets passed on to the second<br />
tier of user support. It is referred to the RP where the problem has been experienced, or, if the<br />
problem appears to be TeraGrid-wide, it is reported to the full user services working group.<br />
There, it is assigned to a user consultant who begins by discussing the matter with the user, in<br />
order to narrow the diagnosis and to understand if the issue lies with the user’s application, with<br />
the RP or TeraGrid-wide infrastructure, or both. Tiger teams are formed by the user services<br />
working group as necessary, to diagnose and solve complex problems that implicate several RP<br />
sites or fields of expertise. The personal contacts assigned to TRAC awarded projects (via the<br />
User Champions program), as well as the Campus Champions, proactively seek to develop<br />
porting and optimization plans with their assigned users. They act as proxies for “their” users<br />
with respect to managers, systems administrators and vendors at their local RP site and (via the<br />
user services working group) at other RP sites, ensuring support for any special arrangements that<br />
may be required for debugging, testing and benchmarking new or upgraded user codes.<br />
As needed, consultants may request the assistance of advanced user support staff (the third tier of<br />
user support), systems, or vendor experts. The immediate goal is to ensure that the user can<br />
resume his or her scientific work as soon as possible, even if addressing the root cause requires a<br />
longer-term effort. When a proposed root cause solution becomes available, we contact the<br />
affected users again and request their participation in its testing. When a user’s problem requires active assistance from an M.S.- or Ph.D.-level staff member exceeding approximately one person-month of effort, the issue is referred to the third tier of user support, i.e., Advanced User Support, described in §5.5, following the guidance of the allocations process. In terms of experience and
qualifications, frontline support consultants range from the B.S. level to the Ph.D. level, with<br />
many of the latter dividing their effort between frontline and advanced user support.<br />
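The triage flow described above can be sketched as a routing function; the one-hour and one-person-month thresholds come from the text, while the function signature and return strings are invented for illustration:

```python
# Sketch of the ticket escalation flow described above. The one-hour and
# one-person-month thresholds come from the text; the function shape,
# argument names, and return strings are made up.

def route_ticket(hours_at_toc, teragrid_wide, estimated_person_months=0.0):
    """Return the support tier that should own a ticket next."""
    if hours_at_toc <= 1:
        return "TOC help desk (first tier)"
    if estimated_person_months > 1.0:
        return "Advanced User Support (third tier)"
    if teragrid_wide:
        return "user services working group (second tier)"
    return "RP consultant (second tier)"

print(route_ticket(0.5, False))                            # resolved at the TOC
print(route_ticket(3, True))                               # TeraGrid-wide issue
print(route_ticket(3, False, estimated_person_months=2))   # needs advanced support
```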
The framework for coordinating frontline support across the TeraGrid is the user services<br />
working group which assembles staff from all the RP sites under the leadership of the Area<br />
Director for User Support. It enables each site’s staff to leverage the experience and expertise of all the others, and enables a concerted approach to solving cross-site user problems based on sharing and
adopting best practices. The working group functions by means of an email list and weekly<br />
conference calls. It coordinates the User Champions and Campus Champions programs that, as<br />
we mentioned above, provide user engagement at the personal level. By involving the Champions<br />
in resolving tickets submitted by “their” users, we can improve the promptness and quality of<br />
ticket resolution, and often prevent problems from occurring in the first place or from recurring.
5.3.1 Helping users port software into a TeraGrid environment
In table 5.1 below, we provide some instances of assistance provided by our consultants to enable<br />
or improve users’ work on TeraGrid resources, which did not rise to the one FTE-month<br />
involvement level that defines “advanced” user support.<br />
Table 5.1. Examples of Frontline User Consulting: Porting Software on TeraGrid Resources

PI Name | RP(s) Involved | Name of software | Nature of improvement
Juri Toomre, U. Colorado | PSC, NICS | SLASH, gridFTP | End-to-end transfer of 3TB from the PSC archive to NICS
Yannick Pouquet, NCAR | PSC, NICS | MHD Code | Troubleshot a problem with the code and helped the group benchmark it on Kraken
Paul Hargrove, UC Berkeley | PSC | Berkeley UPC | Support for development of the Berkeley UPC compiler during the build-up of Blacklight
Noah Smith, Carnegie Mellon U. | PSC | Java | Investigated and solved performance problems during the build-up of Blacklight
Paul Reynolds, U. Virginia | PSC | Java | Installed OpenJDK and solved performance problems during the build-up of Blacklight
Sameer Shende, U. Oregon | PSC | TAU | Tested new and advanced features during the build-up of Blacklight
Nick Wright, UCSD | PSC | IPM | Solved performance problems during the build-up of Blacklight
Mark Ellison, Ursinus College | PSC | Gaussian | The PI estimated his Gaussian job would take 800 hours to complete on Pople and sought help speeding up the calculation. A PSC consultant determined the optimal parameters for running the calculation, speeding up the job by ~3.5x
Joseph Hargitai, Albert Einstein College of Medicine | SDSC | Hadoop | SDSC staff worked to integrate Hadoop with the global filesystem, local SSDs, and the scheduling infrastructure. The PI is currently testing the implementation.
Catherine Cooper, Washington State U. | TACC | Underworld | TACC staff member Yaakoub El Khamra worked with the research team and the code developer to develop a version of the code that works on Ranger. This version is the bleeding-edge version, and modules are still being modified and tested. The PI plans to submit an ASTA request in order to receive continued support.
Neutron Science TeraGrid Gateway | ORNL | Amber | Consulted with the user on the desired Amber version; initiated arrangements to install Amber11 on the NSTG cluster
Given our user community's strong interest in exploiting the potential of GPGPU systems, the assistance given by TACC staff members to three research groups on Longhorn is particularly noteworthy:
• Lucas Wilcox (PI Omar Ghattas) ran a GPU-accelerated discontinuous Galerkin (DG) seismic wave propagation code on up to 478 GPUs of Longhorn, achieving excellent strong scaling and near-perfect weak scaling. His largest run, on 478 GPUs, used a mesh with 6 billion nodes and 56 billion degrees of freedom.
• Tom Fogal (PI Hank Childs) ran a GPU-accelerated volume rendering code on an 8192-cubed helium flame combustion dataset on 256 GPUs. This is among the largest data ever to be directly visualized, and the GPU-based code achieved performance on 256 GPUs that would require hundreds of thousands of CPU cores to match (they performed a similar run on Jaguar for comparison).
• David LeBard ran a GPU-accelerated fast-analysis molecular dynamics code on 128 GPUs of Longhorn, achieving speedups of 10x to 100x over CPU-based versions.

5.3.2 More Frequent User Support Issues/Questions
Among the system- or site-specific user issues referred to the RPs for resolution, the most frequent concerned login/access and account management (e.g., TeraGrid Portal password versus resource-specific passwords, password resets, security credentials, adding and removing users, locating environment variables, and allocations on machines leaving and joining the TeraGrid). In decreasing order of frequency, we also encountered issues with job queuing (e.g., which queues to use for various types of jobs, throughput, scripting and execution); software availability and use (finding, setting up, building, optimizing and running); system availability and performance (e.g., memory limits, compute node failures, I/O timeouts, requests to kill jobs); file systems (e.g., permissions and quotas, corrupted files); and wide-area data transfer and archive systems.
5.3.3 Trouble Ticket Statistics
Table 5.2 shows how long tickets remained open during this quarter, that is, how long it took for<br />
TeraGrid user support staff to provide a diagnostic, workaround or solution.<br />
Table 5.2. Number of Q4, 2010 tickets that remained “open” for the indicated time bins.<br />
Open for: 1 hour, 1 day, 2 days, 1 week, 2 weeks, 4 weeks: 140
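The duration-to-bin mapping behind Table 5.2 can be sketched as a simple lookup. This is an illustrative helper only (function and bin names are ours); it assumes each bin is read cumulatively, i.e. a ticket falls into the first bin wide enough to cover its open time.

```python
# Illustrative sketch of the Table 5.2 time-bin assignment.
# Bin labels follow the table; the cumulative interpretation is assumed.

BIN_EDGES_HOURS = [
    ("1 hour", 1),
    ("1 day", 24),
    ("2 days", 48),
    ("1 week", 7 * 24),
    ("2 weeks", 14 * 24),
    ("4 weeks", 28 * 24),
]

def time_bin(open_hours: float) -> str:
    """Return the first Table 5.2 bin that covers the ticket's open duration."""
    for label, edge_hours in BIN_EDGES_HOURS:
        if open_hours <= edge_hours:
            return label
    return "> 4 weeks"
```

For example, a ticket open for 30 hours lands in the "2 days" bin, and one open for 40 days falls past the last bin.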
5.4 RP Operations: User Services<br />
5.4.1 Consulting Services
Ongoing daily operational user support was conducted at the RPs. Consulting is the second tier of<br />
TG’s three-tier user support, with the first tier being the TG Operations Center and the third tier<br />
being the TG Advanced User Support. TG RP Consulting staff provide real-time, frontline<br />
support for their local RP’s resources in response to users’ requests to the TeraGrid Operations<br />
Center and RP consulting systems. This support includes assistance with allocation issues, initial login for new users, job debugging, data transfer, and compilers. Consulting
support typically engages a user for a relatively short period of time, and may pass a user on to<br />
Advanced Support if longer-term assistance is required. Some examples of site-specific assistance<br />
achievements and their impact on users’ research progress follow.<br />
PSC has operated a friendly user program for community participation in successive stages of<br />
building up Blacklight, its very large shared memory SGI Altix UV system scheduled to enter<br />
TeraGrid production in January 2011. The PSC consulting staff has worked with 31 friendly user<br />
groups, with a focus both on ensuring good performance for existing applications, and on<br />
fostering novel and innovative research projects that depend on the unique capabilities of<br />
Blacklight. Examples of the latter include: assembling the Rhesus macaque genome; assembling
the Sorghum genome and other projects related to the iPlant Collaborative; model fitting and
optimization for machine translation and semantic parsing; and network analysis. Special effort<br />
has been devoted to programming environments that have not been traditionally supported on<br />
supercomputing systems, such as Java, Python and R, as well as to the deployment of UPC and of<br />
code performance analysis tools.<br />
In support of Jacobo Bielak (Civil and Environmental Engineering, Carnegie Mellon University),<br />
John Urbanic (PSC) developed analysis procedures using MATLAB to quantify differences<br />
produced by different algorithms and their implementations, for example between 64-bit and 32-<br />
bit floating-point versions, in the Hercules seismology code. Urbanic is now completing the<br />
production version of the single/double precision selectable code, along with updating the rupture<br />
code and addressing coordinate transforms that other groups are using.<br />
There are a number of ongoing support efforts by the Purdue RP staff to help current and potential TG users. They worked with Brian Henry (a potential TeraGrid user at the University of Illinois at Chicago) to investigate which system to use for access to Fluent, and whether it would be possible to run his Delphi code in a Windows environment via the Condor Pool. They assisted Charles Moseley (Southwestern Oklahoma State University) in looking into how to run specific software on the Condor Pool under Linux instead of the initial request of running on Windows, and helped Robert Coridan (UIUC) explore whether he could install and use Lumerical software on TeraGrid machines.
SDSC staff worked on developing the software stack, documentation, and user environment for the new Trestles resource. Extensive benchmarking was performed for several applications including AMBER, NAMD, ABAQUS, Velvet, SOAPdenovo, and ABySS. User services staff also ran synthetic benchmarks such as HPCC on the system. The High Performance Linpack (HPL) result was submitted to the Top500 list (Trestles is at #111) and was also used to compute the SU conversion factors with other TG machines (TG documentation has been updated with this information). SDSC staff worked with early users to port their codes and run jobs during the friendly user phase of operations. SDSC user services also assisted in the integration of the Lustre parallel filesystem (Data Oasis) and conducted extensive I/O testing and benchmarking from Trestles. SDSC staff worked to develop a framework to run Hadoop via the PBS scheduler, using persistent global storage in combination with local SSD storage on Dash. This setup is currently being tested by friendly users. SDSC hosted the "Grand Challenges in Data-Intensive Discovery" conference in October 2010. As part of the conference, user services staff presented and participated in a panel session detailing early user experiences with the Dash system.
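One plausible reading of the SU conversion-factor calculation mentioned above is as a ratio of per-core HPL performance between two machines. The sketch below assumes that interpretation; the function names are ours and all numeric values in the example are placeholders, not the actual measured Trestles figures.

```python
# Illustrative sketch: SU conversion factor between two TeraGrid machines,
# assuming the factor is the ratio of per-core HPL (Linpack) performance.
# All numbers used with these helpers should be treated as placeholders.

def per_core_gflops(rmax_tflops: float, cores: int) -> float:
    """Per-core HPL performance in GFLOPS from an Rmax (TFLOPS) result."""
    return rmax_tflops * 1000.0 / cores

def su_conversion(rmax_a: float, cores_a: int,
                  rmax_b: float, cores_b: int) -> float:
    """How many SUs on machine B are equivalent to one SU on machine A."""
    return per_core_gflops(rmax_a, cores_a) / per_core_gflops(rmax_b, cores_b)
```

With made-up inputs, a machine whose cores are twice as fast per HPL core yields a conversion factor of 2.0.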
Under the User Champions program (section 5.2), TACC staff contacted 68 PIs with new or<br />
renewal awards granted for the allocation period beginning October 1, 2010. Responses were<br />
received from 10 PIs and TACC staff members resolved any user issues or referred them to the<br />
appropriate site for resolution. TACC also initiated contact with 87 “startup” principal<br />
investigators whose allocations began in September, October, and November of 2010. Responses<br />
were received from 24 PIs and TACC staff members resolved any user issues or referred them to<br />
the appropriate site for resolution. TACC staff continued to facilitate special user requests,<br />
including resource usage outside normal queue limits, high priority in order to meet deadlines,<br />
and special allocation requests to prevent user research projects from coming to a halt.<br />
5.4.2 Helpdesk
During this quarter, TG RPs provided local helpdesk first-tier support that integrates with the TeraGrid Operations Center. This support provides rapid email and phone responses to questions and trouble reports from the user community. Helpdesk staff direct users'
queries to Consulting or Advanced user support in cases where second-tier or third-tier expertise<br />
is required.<br />
5.5 Advanced User Support<br />
Advanced User Support (AUS) staff are located at the RP sites and are coordinated by the AUS<br />
Area Director. The highest level of long-term user support for a TeraGrid user or user group is provided by the AUS staff. The overall advanced support effort comprises three sub-efforts: Advanced Support for TeraGrid Applications (ASTA), Advanced Support for Projects (ASP), and Advanced Support for EOT (ASEOT).
Operation of the AUS area is coordinated by the AUS Area Director jointly with the AUS Points of Contact (POCs), who are in most cases the managers or group leaders of the scientific computing applications groups at the RP sites. This group handles all management and coordination issues, such as analyzing TRAC ASTA review reports, contacting PIs to better understand ASTA plans, matching appropriate AUS staff to ASTAs, discussing and initiating new ASPs, matching AUS staff to ASPs, discussing ASEOT topics, and preparing the program plan and annual reports. They hold biweekly teleconferences for management and coordination issues and continue discussions via email as needed.
On alternate weeks a technical tele/web-conference is hosted using ReadyTalk for all the AUS technical staff as well as the AUS POCs. At this conference, one or two technical presentations are made by AUS technical staff discussing the progress of ASTA projects, the progress of ASPs, and any other technical issues of interest to the AUS technical staff. ReadyTalk allows all the AUS staff attending these biweekly tele/web-conferences to view the slides and to listen and participate in the technical presentations. This fosters an environment for collaboration among the AUS staff located at the various RP sites and allows sharing of technical insight that may have impact on multiple ASTAs or ASPs. The presentation slides and audio recordings are archived and made publicly available for interested staff as well as users at: http://teragridforum.org/mediawiki/index.php?title=Technical_presentations
5.5.1 Advanced Support for TeraGrid Applications (ASTA)
Advanced Support for TeraGrid Applications (ASTA) efforts allow AUS staff to work with a user<br />
for a period of a few months to a year. Activities include porting applications, implementing<br />
algorithmic enhancements, implementing parallel programming methods, incorporating math<br />
libraries, improving the scalability of codes to higher core counts, optimizing codes to utilize<br />
specific resources, enhancing scientific workflows, providing help for science gateways and<br />
performing visualization and data analysis projects. To receive ASTA support, TeraGrid users<br />
submit a request as a part of their annual, Supplemental, or Startup resource allocation proposal.<br />
The recommendation score provided by the reviewers is taken into account and upon discussion<br />
with the user regarding a well-defined ASTA workplan, AUS staff provide ASTA support to the<br />
user. This support often impacts a larger number of users who are part of the PI’s team. Since late<br />
2008, in addition to the quarterly TRAC allocations, ASTA support is also available to those<br />
requesting Supplements or Startup allocations. The TeraGrid-wide AUS effort improves the<br />
ability to optimally match AUS staff to an ASTA project by taking into account the reviewers’<br />
recommendation score, the AUS staff(s) expertise in a domain science/HPC/CI, the ASTA project<br />
workplan, and the RP site where the user has a resource allocation. In addition to providing long-term benefits to the user or the user's team, projects are also beneficial to the TeraGrid as a whole: results of ASTA projects provide insights and exemplars for the general TeraGrid user community, and these are included in documentation, training and outreach activities. In Table 5.3 we list 48 ASTA projects, of which 7 were completed this quarter. (It should be noted that many ASTA projects are renewed yearly based on the recommendation provided by the TRAC and hence are not noted as completed at the end of an allocation period.) This list also includes Startup and Supplemental ASTA projects. Quarterly updates of ASTA work for most of these projects are provided after the table.
Table 5.3. List of continuing and completed ASTA projects for the quarter.

| ASTA Project | Site(s) Involved | Status | Notes |
|---|---|---|---|
| DNS of Spatially Developing Turbulent Boundary Layers, PI Ferrante, U. Washington | SDSC, NCSA, TACC, NICS | Continuing through 03/11 | |
| Multiscale Analysis of Size Dependence of Deformation and Fracture of Hierarchy Protein Materials, PI Buehler, MIT | SDSC | Completed 12/10 | |
| An Earthquake System Science Approach to Physics-based Seismic Hazard Research, PI Jordan, USC | SDSC, TACC, PSC, NICS, NCSA | Continuing through 06/11 | |
| Modeling Studies of Nano and Biomolecular Systems, PI Roitberg, U. Florida | SDSC, NICS | Continuing through 09/11 | |
| Efficient Implementation of Novel MD Simulation Methods in Optimized MD Codes, PI Voth, U. Chicago | PSC, TACC, NCSA, NICS | Continuing through 09/11 | |
| Flow and Nutrient Transport in 3D Porous Scaffolds Used for Bone Tissue Growth, PI Papavassiliou, U. Oklahoma | PSC | Continuing through 06/11 | |
| Biomechanics of Stem Cells, PI Finol, CMU | PSC | Continuing through 09/11 | |
| Global Kinetic Simulations of the Magnetosphere, PI Karimabadi, UCSD | SDSC, NICS | Continuing through 06/11 | |
| EpiSims, PI Roberts, RTI International; NekTar-G2, PI Karniadakis, Brown U.; GENIUS, PI Coveney, UCL | NIU, NCSA, TACC, Purdue, SDSC, ANL, IU | Continuing through 09/11 | |
| Enabling Neutron Science Research with the Neutron Science TeraGrid Gateway, PI Cobb, ORNL | ORNL | Continuing through 03/11 | |
| Simulation of Liquid Fuel Combustors, PI Mashayek, UIC | NCSA | Continuing through 06/11 | |
| Multichannel Scattering Theory via the Modified Faddeev Equation, PI Hu, CSULB | TACC | Completed 09/10 | |
| Turbulence Simulations Towards the PetaScale: Intermittency, Mixing, Reaction and Stratification, PI Yeung, Georgia Tech | NICS, TACC, SDSC | Completed 09/10 | |
| First-Principles Molecular Dynamics for Petascale Computers, PI Gygi, UCD | NICS, TACC | Continuing through 12/10 | |
| UNRES Force-field for Simulation of Large Molecular Systems, PI Scheraga, Cornell | PSC | Completed 12/10 | |
| Petascale Adaptive Computational Fluid Dynamics, PI Jansen, RPI | NICS, PSC, TACC | Completed 12/10 | |
| Insight into Biomolecular Structure, Dynamics, Interactions and Energetics from Simulation, PI Cheatham, U. Utah | SDSC | Continuing through 12/11 | |
| Large Eddy Simulation of Particle-Turbulence Interactions in Complex Flows, PI Apte, Oregon State | TACC | Continuing through 03/11 | |
| Nonlinear Evolution of the Universe, PI Cen, Princeton | TACC, NICS, SDSC | Continuing through 09/11 | |
| Modeling Global Climate Variability with the Multi-scale Modeling Framework: The Boundary-layer Cloud Problem, PI Helly, SIO/UCSD | Purdue, SDSC | Continuing through 06/11 | |
| Lattice Gauge Calculation of Hadronic Physics, PI Liu, University of Kentucky | PSC, NICS | Continuing through 03/11 | |
| Leveraging Supercomputing for Large-scale Game-theoretic Analysis, PI Sandholm, CMU | PSC | Continuing through 03/11 | |
| Astrophysical Applications of Numerical Relativity: Coalescing Binary Systems and Stellar Collapse, PI Schnetter, LSU | PSC, SDSC, NCSA | Continuing through 03/11 | |
| Fractals in Elasto-Plasticity, PI Ostoja-Starzewski, UIUC | NCSA | Continuing through 01/11 | |
| The CIPRES Science Gateway, PI Miller, SDSC/UCSD | SDSC | Continuing through 06/11 | |
| High-Resolution Modeling of Hydrodynamic Experiments, PI Demeler, U. Texas Health Science Center | IU, Purdue, PSC | Continuing through 03/11 | Also see the Science Gateways section of the quarterly report (UltraScan) |
| Center for Integrated Space Weather Modeling, PI Quinn, Boston U. | NICS, TACC | Completed 12/10 | |
| Computational Modeling of Thermal Striping in Nuclear Reactors, PI Kimber, U. Pittsburgh | PSC | Completed 12/10 | |
| Simulation of Complex Organic Mixture Biotransformations in Natural Systems, PI VanBriesen, CMU | PSC | Continuing through 04/11 | |
| Villous Motility as a Critical Mechanism for Efficient Nutrient Absorption in the Small Intestine, PI Brasseur, Penn State | NICS, SDSC | Continuing through 03/11 | |
| Systematics of Nuclear Surface Vibrations in Deformed Nuclei, PI Engel, U. North Carolina | NICS, TACC | Continuing through 03/11 | |
| Simulations of Molecular Clouds and Protoplanetary Disks, PI Mac Low, American Museum of Natural History | PSC | Continuing through 03/11 | |
| Structural and Functional Characterization of Glu-Plasminogen and Lys-Plasminogen, PI Kim, CMU | PSC | Continuing through 03/11 | |
| Design of Communicating Colonies of Biomimetic Microcapsules, PI Kolmakov, Univ. of Pittsburgh | TACC, NCSA, SDSC | Continuing through 03/11 | |
| Computational Modeling of Ultracold Molecular Collisions and Reactions Involving Clusters and Nanoparticles, PI Naduvalath, UNLV | NICS, PSC | Continuing through 03/11 | |
| Development of Novel HIV Entry Inhibitors Using the ROCS Shape-Based Matching Algorithm and Molecular Dynamics Studies of gp120 Envelope Proteins, PI LaLonde, Bryn Mawr College | PSC | Continuing through 03/11 | |
| Simulation and Data Analysis of Macro-Molecular Systems, PI Blaisten-Barojas, George Mason University | PSC | Continuing through 03/11 | |
| Applications of Bis-peptide Nanostructures, PI Schafmeister, Temple University | NICS | Continuing through 04/11 | |
| Finite-elements Modeling of Single- and Poly-Crystalline Material Removal Including Crystallographic Effects, PI Ozdoganlar, CMU | PSC | Continuing through 03/11 | |
| Modeling Geological Sequestration of Carbon Dioxide in a Deep Saline Reservoir Using the TOUGH2 Simulator, PI Valocchi, UIUC | NCSA | Continuing through 03/11 | |
| Multi-Scale and Isotropy in Rotating Turbulence, PI Pouquet, NCAR | PSC, NICS | Continuing through 06/11 | |
| Variational Approach to Reservoir Stimulation for Enhanced Geothermal Systems, PI Bourdin, LSU | TACC | Continuing through 06/11 | |
| DREAM Assessment Using Octave Open Source Software, PI Surfleet, Oregon State University | Purdue | Continuing through 06/11 | |
| Cyberinfrastructure to Support Science and Data Management of the Dark Energy Survey, PI Mohr, UIUC | NCSA, Purdue | Continuing through 06/11 | |
| Kinetic Simulations of Collisional Ionospheric Waves and Turbulence, PI Oppenheim, Boston Univ. | NICS | Continuing through 06/11 | |
| Research Allocation for the Large Synoptic Survey Telescope Data Challenge 3b, PI Axelrod, University of Arizona | NCSA, TACC, PSC | Continuing through 08/11 | |
| Numerical Modeling of Double-diffusive Convection, PI Radko, Naval Postgraduate School | TACC | Continuing through 09/11 | |
| Determination of the Conformation of Polypeptides in Solution as well as on a Surface Using Molecular Dynamics Simulation, PI Narsimhan, Purdue University | SDSC | Continuing through 10/11 | |
PI: Ferrante (U Washington, Fluids, Particulate and Hydraulic Systems). DNS of Spatially<br />
Developing Turbulent Boundary Layers. Continuing through 03/11. The ASTA team for this<br />
project consists of Darren Adams (NCSA), David Bock (NCSA), John Peterson (TACC), Lonnie<br />
Crosby (NICS) and Dmitry Pekurovsky (SDSC).<br />
In this quarter David Bock has been integrating a new data reader, based on the new HDF5 library module created by Darren Adams for this PI, into his visualization rendering system. Jay Alameda (PI of an NSF SI2-funded project on Eclipse) has identified ways of using the Eclipse parallel tool to address some of the issues the ASTA PI is interested in. A staff member (Al Rossi, NCSA), funded by the SI2 grant, is participating in this project jointly with Adams in this regard.
PI: Buehler (MIT, Mechanics and Materials). Multiscale Analysis of Size Dependence of Deformation and Fracture of Hierarchy Protein Materials. Completed 12/10. The project was led by Ross Walker (SDSC). There is no update for this quarter and the ASTA was completed.
PI: Jordan (University of Southern California, Earth Sciences). SCEC PetaScale Research: An<br />
Earthquake System Science Approach to Physics-based Seismic Hazard Research. Continuing<br />
through 06/11. The ASTA team consists of Yifeng Cui (SDSC), John Urbanic (PSC), Byoung-Do<br />
Kim (TACC), Kwai Wong (NICS), Mark Vanmoer (NCSA), and Amit Chourasia (SDSC).<br />
During this quarter Cui prepared and performed a two-day, data-intensive 1-Hz M8 simulation using 30K Kraken cores, which generated 120 TB of volume data for analysis with the visualization tool GlyphSea. Optimization of the MPI-I/O scheme was done on the upgraded Cray XT5. Cui contributed significantly to the slides presented at the Gordon Bell finalist presentation by PI Tom Jordan. Cui and Chourasia were co-authors of the Gordon Bell finalist SC10 paper.
Chourasia continued extensive visualization work for M8 simulations. Several animations were<br />
created for analysis and presentation at the SC10 conference Gordon Bell and Master Works talks<br />
by PI Tom Jordan. The visualizations were performed on the Jaguar, Kraken and Nautilus machines. Movies and images are available from:
http://visservices.sdsc.edu/projects/scec/m8/1.0/<br />
Preliminary building response visualizations were also created in collaboration with SCEC and<br />
CalTech researchers.<br />
Urbanic continued substantial optimizations regarding single and double precision switches for<br />
the Hercules code this quarter.<br />
Figure 5.1 Instantaneous X velocity component from M8 simulations (every 100th step). (Visualization by Amit Chourasia, SDSC.)

Figure 5.2 Instantaneous Y velocity component from M8 simulations (every 100th step). (Visualization by Amit Chourasia, SDSC.)
PI: Roitberg (U. Florida, Biophysics). Modeling Studies of Nano and Biomolecular Systems.<br />
Continuing through 09/11. The ASTA team consists of Ross Walker (SDSC) and Christian<br />
Halloy (NICS).<br />
During this quarter Walker worked on further improving the PMEMD GPU implementation in AMBER, improving performance by 5 to 10%. He removed the dependencies on the external CUDPP library for carrying out radix sorts, which improves stability and works around bugs found in several NVIDIA compiler versions. These updates were released as part of bugfix.12 for AMBER and are being used directly by the PI. Specifically, the problems addressed are as follows:
1. Removes the dependence on CUDPP which could cause crashes on certain large<br />
simulations due to bugs in the CUDPP library.<br />
2. Fixes problems with large simulations, 400K+ atoms crashing randomly with an<br />
allocation failure.<br />
3. Fixes the "ERROR: max pairlist cutoff must be less than unit cell max sphere radius!" bug, allowing cutoffs larger than 8.0 to be used for both small and large systems.
4. Writes final performance info to the mdout file.
5. Possible (not fully tested) workaround for NVIDIA GTX4XX and GTX5XX series cards to avoid possible random hangs during PME simulations.
6. Tests for the use of a non-power-of-2 number of MPI tasks when running parallel GPU PME calculations; the code now quits with a suitable error message.
7. Minor performance improvements, mostly due to the use of a faster radix sort.
Walker also made arrangements for Prof. Roitberg's graduate student to spend two months from<br />
the middle of March onwards working with him directly at SDSC. Prof. Roitberg will be covering<br />
the student’s expenses.<br />
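The task-count guard described in item 6 above can be sketched as follows. The function names are ours and the actual AMBER check may be implemented differently; this only illustrates the kind of validation described.

```python
# Sketch of the guard described in item 6: parallel GPU PME runs require a
# power-of-2 number of MPI tasks, and the code should quit with a clear
# error message rather than misbehave. Function names are illustrative.

def is_power_of_two(n: int) -> bool:
    """True if n is a positive power of two (bit trick: n & (n-1) clears the low set bit)."""
    return n > 0 and (n & (n - 1)) == 0

def check_gpu_pme_tasks(ntasks: int) -> None:
    """Quit with a suitable error message on a non-power-of-2 task count."""
    if not is_power_of_two(ntasks):
        raise SystemExit(
            f"ERROR: parallel GPU PME requires a power-of-2 MPI task count, got {ntasks}"
        )
```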
PI: Voth (University of Chicago, Physical Chemistry). Efficient implementation of novel MD<br />
simulation methods in optimized MD codes. Continuing through 09/11. The ASTA team consists<br />
of Phil Blood (PSC), John Peterson (TACC), and Lonnie Crosby (NICS).<br />
In this quarter Blood has continued to work with the Voth group to find the best strategy for<br />
improving parallel performance of their multistate empirical valence bond method for modeling<br />
proton transport in classical molecular dynamics simulations. In past quarters the group<br />
implemented their method in LAMMPS, but the implementation suffers from severe scaling<br />
challenges due to the many all-to-all communications required each time step. In previous<br />
quarters Blood ported their implementation to a new hybrid OpenMP/MPI LAMMPS code that<br />
reduces some communication bottlenecks caused by the all-to-all communication.<br />
In this quarter, the Voth group released a new version of their LAMMPS-MS-EVB code with<br />
increased functionality, which Blood then ported to the faster hybrid LAMMPS code and ran<br />
several benchmarks to test performance of the new code. This code matched the performance of<br />
the previous hybrid LAMMPS-MS-EVB code. To explore other ways to improve performance,<br />
members of the Voth group tried a new multiple-program implementation of the regular<br />
LAMMPS code (without MS-EVB), with the calculations requiring all-to-all communication<br />
running only on a small subset of the available processors. Blood performed in-depth benchmarking and performance analysis of this new implementation. Through these tests and
discussions with the Voth group it was found that the performance of this code could be improved<br />
by careful placement of the two types of processes (processes with or without all-to-all) at<br />
runtime. Benchmark tests showed that the new multiple-program code with careful process<br />
placement outperformed the previous hybrid approach (see Figure 5.3). In addition, the<br />
performance analysis demonstrated that the remaining scaling issues were due to load imbalance<br />
between the two types of process groups. Through ongoing discussions with the Voth group<br />
Blood has identified possible ways to further optimize these calculations and reduce load<br />
imbalance. These optimizations, as well as porting MS-EVB to the multiple-program LAMMPS<br />
code, will be pursued in subsequent quarters.<br />
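The "careful placement of the two types of processes" described above can be expressed as a multiple-program (MPMD) launch line. The sketch below builds a Cray aprun-style command, using aprun's colon-separated MPMD syntax with separate rank counts (`-n`) and per-node placement (`-N`) for the all-to-all group and the main group; the executable name and all counts are illustrative, not the actual run configuration.

```python
# Sketch: build an aprun MPMD command that places a small all-to-all process
# group alongside the main group, as in the multiple-program LAMMPS runs
# described above. Executable name and counts are illustrative.

def mpmd_command(alltoall_ranks: int, alltoall_per_node: int,
                 main_ranks: int, main_per_node: int,
                 exe: str = "./lmp_multiprog") -> str:
    """Return an aprun MPMD invocation with two explicitly placed groups."""
    return (f"aprun -n {alltoall_ranks} -N {alltoall_per_node} {exe} : "
            f"-n {main_ranks} -N {main_per_node} {exe}")
```

Spreading the all-to-all ranks thinly across nodes (small `-N`) while packing the main ranks densely is one way such placement can reduce contention; the best split would come from benchmarking, as in Blood's tests.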
Figure 5.3 Speedup plot of ns/day achieved with various implementations of the LAMMPS code. (Plot by Phil Blood, PSC.)
PI: Papavassiliou (U. Oklahoma, Chemical, Thermal Systems). Investigation of Flow and Nutrient Transport in Porous Scaffolds Used for Bone Tissue Growth. Continuing through 06/11. The ASTA staff member for this project is Raghu Reddy (PSC). There is no update on this project for this quarter.
PI: Finol (CMU, Mechanical and Biomedical Engineering). Biomechanics of Stem Cells.<br />
Continuing through 09/11. The ASTA consultant is Anirban Jana (PSC). One of Prof. Finol's PhD students, Samarth Raut, successfully presented his PhD proposal to his advisory committee, which includes Jana. All the work on Pople with ADINA was crucial to the formulation of his PhD proposal. Jana also pointed out faster ways to transfer simulation results from Pople to the group's local machines, using multiple scp processes. This sped up transfers significantly compared with what some members of the group were achieving before (e.g., transfers that previously took 1-2 days now take a few hours).
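The multi-stream transfer approach can be sketched as splitting the file list across several concurrent scp processes. This is only an illustration of the idea; the host, user, and destination path below are hypothetical, and the `chunk` helper name is ours.

```python
# Sketch: speed up a bulk transfer by splitting the file list across several
# concurrent scp processes instead of one serial copy.
# Host, user, and destination path are hypothetical placeholders.
import subprocess

def chunk(files, n):
    """Split a file list into n roughly equal interleaved chunks."""
    return [files[i::n] for i in range(n)]

def parallel_scp(files, dest="user@local-machine:/data/results/", streams=4):
    """Launch one scp per chunk and wait for all of them to finish."""
    procs = [
        subprocess.Popen(["scp", "-q", *group, dest])
        for group in chunk(files, streams)
        if group  # skip empty chunks when streams > len(files)
    ]
    return [p.wait() for p in procs]
```

Each stream gets its own TCP connection, which is what lets the aggregate throughput exceed that of a single scp on high-latency wide-area links.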
PI: Karimabadi (UCSD, Magnetospheric Physics). Global Kinetic Simulations of the Magnetosphere. Continuing through 06/11. The ASTA staff for this project are Mahidhar Tatineni (SDSC) and Glenn Brook (NICS). Glenn Brook and Lonnie Crosby (NICS) continued to discuss the I/O study with Burlen, a user on the PI's team, in this quarter.
PI: Roberts (RTI International, Computer Science) EpiSims; PI: Karniadakis (Brown U., CFD)<br />
NekTar-G2; PI Coveney (University College of London, Chemistry) GENIUS. Continuing<br />
through 09/11. The lead for MPIg research and implementation on these ASTA projects is Nick Karonis (NIU, ANL). He has continued to work with Karniadakis and Grinberg and has identified key portions of their application that are computational bottlenecks and might also be good candidates for multithreading. On the advice of Dr. Grinberg (from the Karniadakis group), Karonis started working on the application's preconditioner and has made significant progress there, introducing multithreading through OpenMP. The plan is to continue introducing multithreading, through OpenMP, throughout the rest of the application as directed by Dr. Grinberg and by computational bottleneck analysis.
PI: Cobb (ORNL, Neutron Science). Enabling Neutron Science Research with the Neutron<br />
Science TeraGrid Gateway. Continuing through 03/11. Cobb’s team is providing their own<br />
ASTA support for this project and continued to deploy software tools and work processes to<br />
leverage experimental neutron science.<br />
PI: Mashayek (Univ. of Illinois Chicago, Applied Mathematics). Simulation of Liquid Fuel<br />
Combustors. Continuing through 06/11. ASTA staff for this project is Ahmed Taha (NCSA). Simulations using ANSYS Fluent are in progress; results are being analyzed, processed, and communicated to the PI. A modified configuration will be used and the simulations will then be resumed.
PI: Gygi (UCD, Materials Science). First-Principles Molecular Dynamics for Petascale<br />
Computers. Continuing through 12/11. The ASTA staff for this project are BD Kim (TACC) and<br />
Bilel Hadri (NICS). Hadri helped the PI’s team to use OpenMP. As a result, the PI’s team did an initial assessment of performance with partial use of OpenMP in a limited set of functions. They observed that the MPI-only and mixed MPI/OpenMP versions achieve about the same performance, and in some cases the MPI/OpenMP version obtains a 5-10% speedup. This is encouraging since there are many more opportunities for thread parallelism in the rest of the code, and the team is pursuing them.
PI: Scheraga (Cornell U., Biophysics). UNRES Force-field for Simulation of Large Molecular<br />
Systems. Completed 12/10. The ASTA team consists of Phil Blood and Mahin Mahmoodi (PSC).<br />
Not much interaction occurred with the PI’s team this quarter, although the PI’s team expressed interest in continuing work with the ASTA team. Based on the work done earlier, the following paper was accepted and published:
A. Liwo, S. Ołdziej, C. Czaplewski, D. S. Kleinerman, P. Blood and H. A. Scheraga, J. Chem. Theory Comput., 2010, 6:890-909.
PI: Jansen (RPI, Fluids). Petascale Adaptive Computational Fluid Dynamics. Completed 12/10.<br />
ASTA team for this project consists of Glenn Brook (NICS), David O’Neal (PSC), and John<br />
Peterson (TACC). There is no major update for this quarter.<br />
PI: Cheatham (U. Utah, Biochemistry). Insight into Biomolecular Structure, Dynamics,<br />
Interaction and Energetics from Simulation. Continuing through 12/10. The ASTA team consists<br />
of Ross Walker (SDSC). Over the last three months Walker has worked with Prof. Cheatham under this ASTA project to provide finalized benchmark information for Prof. Cheatham's use in LRAC requests. Walker has also worked to help improve the I/O performance of the parallel
analysis package ptraj. As part of this he has worked with Prof. Cheatham and collaborators to<br />
begin developing a C++ version of this package in order to make future improvements and<br />
feature additions easier.<br />
PI: Apte (Oregon State University, Fluid, Particulate, and Hydraulic Systems). Large Eddy<br />
Simulation of Particle-Turbulence Interactions in Complex Flows. Continuing through 03/11.<br />
The ASTA staff for this project is Carlos Rosales (TACC). There was no interaction with the PI’s team this quarter.
PI: Cen (Princeton University, Astrophysics and Cosmology). Nonlinear Evolution of the<br />
Universe. Continuing through 09/11. The ASTA team consists of Yaakoub El Khamra (TACC), Bilel Hadri (NICS) and Amit Chourasia (SDSC). There is no major update for this quarter, although Chourasia met with the PI at the SC10 conference and discussed future ASTA effort.
PI: Helly (UCSD, SIO, Atmospheric Sciences). Modeling Global Climate Variability with the<br />
Multi-scale Modeling Framework: New Parameterizations of Cloud Micro-physics and Developing a Community Accounts Portal for Running MMF. Continuing through
06/11. Phil Cheeseman (Purdue), Adam Jundt (SDSC) and Mahidhar Tatineni (SDSC) are the<br />
ASTA team for this project. AUS and Gateways staff from multiple TeraGrid sites are working to<br />
get remote job submission and data movement pieces of the project working.<br />
PI: Liu (University of Kentucky, Elementary Particle Physics). Lattice Gauge Calculation of<br />
Hadronic Physics. Continuing through 03/11. The ASTA team consists of Raghu Reddy (PSC)<br />
and Haihang You (NICS). You worked with the PI’s team and helped them with account issues<br />
and in submitting jobs.<br />
PI: Sandholm (CMU, Computer and Computation Research). Leveraging Supercomputing for<br />
Large-scale Game-theoretic Analysis. Continuing through 03/11. Joel Welling and David O’Neal<br />
(PSC) are working with the PI on this project. No update is available for this quarter.<br />
PI: Schnetter (LSU, Gravitational Physics). Astrophysical Applications in Numerical Relativity:<br />
Coalescing Binary Systems and Stellar Collapse. Continuing through 03/11. The ASTA team<br />
consists of Mahin Mahmoodi (PSC), Rick Kufrin and Rui Liu (NCSA), and Wayne Pfeiffer<br />
(SDSC). There is no update on this project for this quarter.<br />
PI: Ostoja-Starzewski (UIUC, Mechanics and Materials) Fractals in Elasto-Plasticity.<br />
Continuing through 01/11. The ASTA staff for this project is Seid Koric (NCSA). The following updates, covering work done by the graduate student with Koric, are provided for this quarter:
1. Size 80³ vs. size 100³: Done. Verified that 80³ gives a reliable estimate.
2. Displacement BC vs. traction BC: Done. Verified boundary-condition independence and the RVE.
3. Study of different randomness configurations: Done. Three aspects: Gaussian or uniform distribution types; different random variants; randomness on elastic moduli and/or yield limit.
4. Study of different material hardening configurations: Done. Comparison of six materials to show hardening effects.
5. Study of different loading configurations: In progress. The study of the yield surface (whether the associated flow rule holds) needs at least five more simulations.
PI: Miller (SDSC/UCSD, System and Population Biology) The CIPRES Science Gateway.<br />
Continuing through 06/11. ASTA support is being provided by Wayne Pfeiffer (SDSC).<br />
A paper titled "Creating the CIPRES Science Gateway for Inference of Large Phylogenetic<br />
Trees" was presented at the GCE10 workshop in conjunction with SC10. The authors are Mark<br />
A. Miller, Wayne Pfeiffer, and Terri Schwartz, all of SDSC.<br />
Pfeiffer installed the MrBayes and RAxML phylogenetics codes on Trestles, the newest<br />
supercomputer at SDSC. He also helped Adam Jundt of SDSC make these codes accessible via<br />
modules.<br />
Benchmark runs showed that, for the same core count, MrBayes and RAxML run slightly faster<br />
on Trestles than on Abe at NCSA, where most CIPRES gateway runs currently execute.<br />
However, the increased availability of computer time on Trestles suggests the use of more cores<br />
for some MrBayes and RAxML runs on Trestles than were used on Abe. Such runs will execute<br />
appreciably faster on Trestles than before, albeit at lower parallel efficiency. In addition, runs of<br />
as long as two weeks are allowed on Trestles versus one week on Abe. Thus most gateway runs<br />
of MrBayes and RAxML will be moved from Abe to Trestles.<br />
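The tradeoff described above, shorter wall-clock time at lower parallel efficiency, can be illustrated with Amdahl's law. The 5% serial fraction below is a purely hypothetical figure for illustration, not a measured property of MrBayes or RAxML:

```python
def amdahl(serial_frac, cores):
    """Amdahl's-law speedup for a code with the given serial fraction."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / cores)

# Hypothetical code that is 5% serial: more cores still shorten the run,
# but each added core contributes less (lower parallel efficiency).
s = 0.05
for p in (8, 16, 32):
    sp = amdahl(s, p)
    print(f"{p:3d} cores: speedup {sp:5.2f}, efficiency {sp / p:.0%}")
```

With abundant allocation on a machine, trading efficiency for turnaround in this way is often the right call, exactly as planned for the Trestles runs.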
To help users with special requirements, expedited phylogenetic analyses were manually run on<br />
Dash and Trestles at SDSC. Specifically, MrBayes jobs were run on Dash for<br />
• Joel Ledford of the California Academy of Sciences and UC Berkeley,<br />
and RAxML jobs were run on Trestles for<br />
• Joel Guenther of UC Berkeley.<br />
The parallel version of the MAFFT multiple sequence alignment code was installed on Abe,<br />
Dash, and Trestles. Parallelization is via Pthreads and gives a parallel efficiency that is fairly<br />
high at 8 threads, but much lower above that.<br />
Benchmark runs on 8 cores gave best run times that were
• 1.89x slower on Abe than on Dash and
• 1.05x slower on Trestles than on Dash.
The times vary appreciably from run to run because the parallel search algorithm uses random<br />
numbers in a way that is not reproducible. Thus the number of iterations and even the final<br />
alignment vary from run to run, though the resulting alignments seem to be of similar quality.<br />
In light of the preceding results, it is planned to make CIPRES gateway runs of MAFFT on<br />
Trestles. Currently such runs are made with a serial version of MAFFT on Triton, a non-<br />
TeraGrid computer at SDSC.<br />
PI: Demeler (U. Texas Health Science Center, Biochemistry and Molecular Structure and<br />
Function) High-Resolution Modeling of Hydrodynamics Experiments. Continuing through 03/11.<br />
ASTA support is provided by Suresh Marru, Raminder Singh (IU), Roberto Gomez (PSC), and<br />
Phil Cheeseman (Purdue). The ASTA team consists of staff from both the Science Gateway area<br />
and the Advanced User Support area since the PI’s project requires advanced support for<br />
gateways as well as MPI optimization. Further description of activities is provided in the Science<br />
Gateways section.<br />
PI: Quinn (Boston University, Solar Terrestrial Research). Center for Integrated Space Weather<br />
Modeling. Continuing through 12/10. The ASTA staff for this project are Justin Whitt, Glenn<br />
Brook (NICS), and Carlos Rosales-Fernandez (TACC). The ASTA project completed in 12/10. Glenn Brook and Justin Whitt have continued to advise the group via bi-weekly teleconferences. They
began examining the performance characteristics of their latest skeletal communications model<br />
and verified the gains realized by the message buffering previously suggested by Glenn Brook.<br />
They also examined the effects of different MPI rank orderings and of the MPI memory-copy optimization on the current model and saw no significant improvements. They completed weak and strong scaling studies of the new communications model, compared its performance on multiple architectures, and determined the new communications model to be bound by both latency and contention for network resources. This led them to suggest an alternate communications stencil that sends fewer, larger messages.
Work by the Quinn team continues toward this end.<br />
PI: Kimber (University of Pittsburgh, Thermal Systems). Computational Modeling of Thermal<br />
Striping in Nuclear Reactors. Completed 12/10. Anirban Jana from PSC is the ASTA staff for<br />
this project. On the turbulence modeling of thermal striping in jets, the realizable k-epsilon model was found to be a good candidate turbulence model for the jets under isothermal conditions.
FLUENT simulations with this model provided close agreement with experimentally measured<br />
axial velocity decay. Currently non-isothermal jet validations are being pursued.<br />
On LBM, several critical physics phenomena related to boiling, namely liquid-vapor equilibrium, the surface tension of a vapor bubble, the establishment of an exponential density gradient in a fluid due to gravity, and the establishment of a horizontal interface between liquid and vapor under gravity, were demonstrated (with rigorous quantitative validation using theoretical formulas). LBM runs have consumed most of the Pople SUs for this project.
PI: VanBriesen (CMU, Chemical and Reaction Processes). Simulation of Complex Organic<br />
Mixture Biotransformations in Natural Systems. Continuing through 04/11. Anirban Jana from<br />
PSC is collaborating on this ASTA project. The graduate student, Amanda Hughes, successfully defended her PhD last quarter, based on her Star-P runs on Pople. She is still finishing up papers. Jana continued to answer occasional small queries from her this quarter.
PI: Brasseur (Penn State, Biophysics). Villous Motility as a Critical Mechanism for Efficient
Nutrient Absorption in the Small Intestine. Continuing through 03/11. Lonnie Crosby (NICS), for<br />
computational work, and Amit Chourasia (SDSC), for visualization work, are involved in this<br />
ASTA project. Chourasia started preliminary work on visualization of Intestine3D. Chourasia also met the PI in person to understand the data and what to look for in the visualization. At first look, the PI found interesting features in the visualization, contrary to the group's original hypothesis.
PI. Engel (U. North Carolina, Physics). Systematics of Nuclear Surface Vibrations in Deformed<br />
Nuclei. Continuing through 03/11. ASTA staff for this project are Meng-Shiou Wu (NICS) and<br />
Victor Eijkhout (TACC). Wu’s major effort in the fourth quarter of the project was to assist the team in solving a scalability issue in their code. The original request from the team was as follows:
Test calculation using 80k cores: The purpose of this test was to investigate the efficiency of the<br />
part of our calculation that is difficult to parallelize and time consuming: reading 0.4 GB of input<br />
data into a single core and distributing the data to 80k cores. The test had not ended after three<br />
hours.<br />
After looking into their code, it was found that the performance issue came from their file vbc1.17k_t.f90, lines 650-782. The code segment reads a file of about 550 MB and distributes the data to 80K cores.
The code segment appeared to have been written for machines of smaller scale, with less physical memory available. It reads a section of the data file, broadcasts it to the rest of the nodes (cores) sequentially, then reads the next section and repeats the broadcast. This process was repeated 4533 times.
Once the cause of the performance issue was determined, it was not difficult to come up with a solution for this broadcast problem. While the data file is about 550 MB, it holds real numbers stored in text format; after reading the file, the memory required on each core is only 128 MB, and this amount of memory per core is not a problem on Kraken. The process that was repeated 4533 times can therefore be finished in one step.
The original program reads a segment of data, copies it to another buffer, and then broadcasts it. This extra copy was removed; a larger buffer is now filled with a single read and broadcast with Cray’s MPI_Bcast, which is optimized for SeaStar2.
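The restructuring can be sketched in a language-neutral way. The snippet below contrasts the two patterns using a counting stand-in for the MPI communicator; the real fix was applied to the team's Fortran code and used Cray's MPI_Bcast, and the byte string here is only a stand-in for the text-format input file:

```python
import io

class CountingComm:
    """Stand-in for an MPI communicator that just counts broadcasts."""
    def __init__(self):
        self.bcasts = 0
    def bcast(self, buf):
        self.bcasts += 1  # a real communicator would send buf to all ranks
        return buf

def read_and_bcast_chunked(f, comm, chunk=4096):
    """Original pattern: read a section, broadcast it, repeat (4533 times)."""
    out = bytearray()
    while True:
        section = f.read(chunk)
        if not section:
            break
        out += comm.bcast(bytes(section))  # one broadcast per section
    return bytes(out)

def read_and_bcast_once(f, comm):
    """Optimized pattern: one large read followed by a single broadcast."""
    return comm.bcast(f.read())

data = b"3.14 2.72 1.41 " * 4096  # stand-in for the text-format input file
old, new = CountingComm(), CountingComm()
chunked = read_and_bcast_chunked(io.BytesIO(data), old)
single = read_and_bcast_once(io.BytesIO(data), new)
assert chunked == single and old.bcasts > 1 and new.bcasts == 1
```

The payoff in the real code comes from replacing thousands of small collective operations, each with its own latency, by one large, well-optimized collective.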
The modified code shows a significant performance improvement over the original code, as shown in Figure 5.4. The figure compares repeated send/receive with a single MPI_Bcast of a large buffer; no I/O was involved. The performance comparison used up to 24,000 cores for the modified code (30 seconds, not shown in Figure 5.4), which is still better than the original code on 2,400 cores (470 seconds).
Figure 5.5 shows the performance comparison after I/O mechanisms were added. The first I/O mechanism is the original method, which reads a section of data repeatedly; the second reads the whole file once into a single buffer.
The team also mentioned that they had trouble verifying the result of the broadcast. An example verification procedure using extra I/O was added and shown to the team. Instructions on using performance tools to collect performance data and conduct analyses on their code were also given to the team.
While the solution to this performance problem is not complex, the team did not appear to have optimization experience, so extensive instructions were prepared for them, including: (1) a set of example programs (based on their code structure) for modifying their code step by step; (2) a slide deck of more than 10 slides showing, step by step, how the problem was solved; and (3) performance data, with instructions on how those data were collected and generated.
Figure 5.4 Performance comparison of repeated send/recv and a single MPI_Bcast: time in seconds (Y axis) vs. number of cores (X axis). (Plot by Meng-Shiou Wu, NICS)
Figure 5.5 Performance comparison with I/O mechanisms: time in seconds (Y axis) vs. number of cores (X axis). (Plot by Meng-Shiou Wu, NICS)
PI. Mac Low (American Museum of Natural History, Planetary Astronomy). Simulations of<br />
Molecular Clouds and Protoplanetary Disks. Continuing through 03/11. Roberto Gomez (PSC) is<br />
working with the PI’s team on this ASTA project. No update is available for this quarter.
PI. Kim (CMU, Biochemistry and Molecular Structure and Function). Structural and Functional<br />
Characterization of Glu-Plasminogen and Lys-Plasminogen. Continuing through 03/11. ASTA<br />
effort is provided by Phil Blood (PSC). In this quarter the group started exploring some of the<br />
accelerated sampling methods that Blood suggested to the team the previous quarter.<br />
PI. Kolmakov (Univ. of Pittsburgh, Materials Research). Design of Communicating Colonies of
Biomimetic Microcapsules. Continuing through 03/11. Carlos Rosales-Fernandez (TACC), Jay<br />
Alameda (NCSA), and Dongju Choi (SDSC) are providing ASTA support for this project. No<br />
update available for this quarter.<br />
PI. Naduvalath (UNLV, Chemistry). Computational Modeling of Ultracold Molecular Collisions
and Reactions Involving Clusters and Nanoparticles. Continuing through 03/11. ASTA effort is<br />
provided by Bilel Hadri (NICS) and Mahin Mahmoodi (PSC). There has been no interaction from<br />
the PI’s side this quarter.<br />
PI. LaLonde (Bryn Mawr College, Biochemistry and Molecular Structure and Function).<br />
Development of Novel HIV Entry Inhibitors Using the ROCS Shape-Based Matching Algorithm
and Molecular Dynamics Studies of gp120 Envelope Proteins. Continuing through 03/11. ASTA staff for this project is Phil Blood (PSC). The PI is currently continuing successful runs on the warhol machine at PSC.
PI. Blaisten-Barojas (George Mason University, Physical Chemistry). Simulation and Data
Analysis of Macro-Molecular Systems. Continuing through 03/11. ASTA staff for this project is<br />
Yang Wang (PSC). The user was ready to port the code to Lincoln. Wang provided the PI with the makefile for compiling the source code and creating the executable. It is expected that the PI will be able to run the code and produce science results.
PI. Schafmeister (Temple University, Organic and Macromolecular Chemistry). Applications of<br />
Bis-peptide Nanostructures. Continuing through 04/11. ASTA staff for this project is Bilel Hadri<br />
(NICS). There is no update for this quarter.<br />
PI. Ozdoganlar (CMU, Manufacturing Processes and Equipment). Finite-Elements Modeling of
Single- and Poly-Crystalline Material Removal Including Crystallographic Effects. Continuing<br />
through 03/11. ASTA staff for this project is Anirban Jana (PSC). There was one instance when the MPI version of ABAQUS did not seem to be working on Pople, and Jana helped the graduate student resolve this.
PI. Valocchi (UIUC, Earth Sciences). Modeling Geological Sequestration of Carbon Dioxide in a<br />
Deep Saline Reservoir Using the TOUGH2 Simulator. Continuing through 03/11. David Bock<br />
(NCSA) is providing ASTA support for the project and there is no update for this quarter.<br />
PI. Pouquet (NCAR, Physics). Multi-Scaling and Isotropy in Rotating Turbulence. Continuing
through 06/11. ASTA staff for this project are Raghu Reddy (PSC), Mahin Mahmoodi (PSC), and<br />
Christian Halloy (NICS). Reddy’s work with this group on the hybrid code resulted in the following presentations and publications:
1. An AGU presentation (poster): 'A hybrid MPI-OpenMP scheme for scalable parallel pseudospectral computations for fluid turbulence', AGU 2010. (This paper also served as one of the bases for a convened session called 'Large Scale Geosciences Applications Using GPU and Multicore Architectures', sponsored by the Earth and Space Science Informatics section of AGU and co-sponsored by its Nonlinear Geophysics and Planetary Sciences sections; this paper was the basis of the 'multicore' part.)
2. A TeraGrid '10 presentation: 'Investigation of performance of a hybrid MPI-OpenMP turbulence code', Aug 2010; winner of Best Paper, Technology Track.
PI Bourdin (LSU, Applied Mathematics). Variational Approach to Reservoir Stimulation for<br />
Enhanced Geothermal Systems. Continuing through 06/11. ASTA staff for this project is<br />
Yaakoub El Khamra (TACC). In this quarter El Khamra helped with C and compilation questions and set up a storage space. He also taught the group about existing profiling tools.
PI. Surfleet (Oregon State Univ., Engineering Systems). DREAM Assessment Using Octave Open<br />
Source Software. Continuing through 06/11. ASTA staff for this project is Phil Cheeseman<br />
(Purdue). The PI’s team was provided some help regarding the usage of scratch space.
PI. Mohr (UIUC, Astronomical Sciences). Cyberinfrastructure to Support Science and Data<br />
Management for the Dark Energy Survey. Continuing through 06/11. Jay Alameda (NCSA) and<br />
Phil Cheeseman (Purdue) are the ASTA staff for this project. There has been no significant<br />
interaction with the PI’s team this quarter.<br />
PI. Oppenheim (Boston University, Aeronomy). Kinetic Simulations of Collisional Ionospheric<br />
Waves and the Turbulence. Continuing through 06/11. ASTA staff for this project is Haihang<br />
You from NICS. In this quarter scaling studies were done using the PI’s code EPPIC, the Electrostatic Parallel Particle-in-Cell simulator. Below is a quick summary of the scaling tests. Job 1 was run on Kraken; Job 2 is 2x larger in volume and number of particles (so particles per volume is the same) and it scales fairly well.
Machine   # Processors   Domains   Grids per domain   # e- per proc.   Wall clock (sec)
Kraken    1024           128       8x512x512          262144           31441
Kraken    4096           512       4x1024x1024        524288           67115
PI. Axelrod (Univ. of Arizona, Physics). Research Allocation for the Large Synoptic Survey<br />
Telescope Data Challenge 3b. Continuing through 08/11. Darren Adams (NCSA), Raghu Reddy<br />
(PSC), and Chris Jordan (TACC) are providing ASTA support for this PI. There is no update this<br />
quarter for this project.<br />
PI. Radko (Naval Postgraduate School, Physical Oceanography). Numerical Modeling of Double-diffusive Convection. Continuing through 09/11. ASTA staff for this project is Gabriele Jost. The goal of this ASTA effort is to increase the performance and scalability of Prof. Radko's 2D and 3D double-diffusive convection (DDC) implementations, TILT2D and TILT3D. The codes are currently parallelized based on OpenMP, which limits their scalability on SMP clusters to one node (e.g., 16 threads on Ranger).
The ASTA staff and the PI’s team are pursuing three possibilities:
1. Develop an MPI implementation of TILT3D
2. Investigate the use of GPGPUs (e.g., on Longhorn at TACC) for TILT2D
3. Explore the use of very large-scale SMP nodes (e.g., SGI Altix UV)
During the first quarter they focused on the development of an MPI-based implementation of TILT3D. The work items below were performed.
Item 1: Planning and discussing a strategy with Prof. Radko<br />
=============================================================<br />
The application consists of two main programs: one generates initial input data for numerical experiments, and the other performs the DDC calculations. It was decided to use a 1D domain decomposition of the 3D arrays in the third dimension.
The code that generates the input data currently writes a single binary file. For the distributed version they decided to keep this approach: each MPI process will read its part of the input data from the binary file.
The DDC code uses FFTW to perform 3D real-to-complex FFTs. In order to perform the FFT on distributed data, the structure of the code will be changed to:
1. perform a 2D real-to-complex FFT, using FFTW, in the xy-plane
2. use MPI_Alltoall to redistribute the data
3. perform a 1D complex-to-complex FFT in the z dimension
4. use MPI_Alltoall to restore the original distribution in the z dimension
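The four steps above can be sketched serially with NumPy, using array slices to emulate the per-rank slabs and a concatenate/re-split standing in for MPI_Alltoall. The grid size and process count below are arbitrary illustrations, not the TILT3D configuration:

```python
import numpy as np

P = 4                                  # number of emulated MPI processes
nx, ny, nz = 8, 8, 8                   # arbitrary illustrative grid size
a = np.random.default_rng(0).standard_normal((nx, ny, nz))

# 1D decomposition in the third (z) dimension: each "rank" owns nz/P planes.
slabs = [a[:, :, r * nz // P:(r + 1) * nz // P] for r in range(P)]

# Step 1: each rank performs a 2D real-to-complex FFT in the xy-plane.
slabs = [np.fft.rfft2(s, axes=(0, 1)) for s in slabs]

# Step 2: redistribute the data (serial stand-in for MPI_Alltoall):
# gather the z-slabs, then re-split along y so each rank owns full z-columns.
xy = np.concatenate(slabs, axis=2)     # shape (nx, ny//2 + 1, nz)
m = ny // 2 + 1
cols = [xy[:, r * m // P:(r + 1) * m // P, :] for r in range(P)]

# Step 3: each rank performs a 1D complex-to-complex FFT in the z dimension.
cols = [np.fft.fft(c, axis=2) for c in cols]

# (Step 4, the reverse MPI_Alltoall, would restore the z decomposition.)
result = np.concatenate(cols, axis=1)

# Check against a direct 3D FFT (half-spectrum along y, as rfft2 produces).
ref = np.fft.fftn(a)[:, :m, :]
assert np.allclose(result, ref)
```

Because the complex transforms along the remaining axes commute, the two-stage transform with an intervening transpose reproduces the full 3D FFT, which is exactly what makes the slab-decomposed MPI version possible.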
Item 2: Start implementation<br />
============================<br />
Change the input-generation code from a sequential code into an MPI code. Input generation was changed to use direct access to the binary file; this supports reading by multiple MPI processes from different locations in the file.
Test the code running on 1 MPI process.
Implement several test cases for performing a 3D FFTW in two steps: a 2D real-to-complex (R2C) FFT in xy and a 1D complex-to-complex (C2C) FFT in the z dimension.
Implement test cases for MPI-based versions of the restructured FFTW.
Change the original TILT3D into an MPI program, currently running on only 1 MPI process.
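The direct-access reading described above can be sketched as follows: each rank seeks to its own byte offset in the single binary file and reads only its slice. The ranks are emulated serially here, and the file name and sizes are hypothetical:

```python
import os
import tempfile
import numpy as np

P = 4                                   # emulated MPI processes
n_per_rank = 1000                       # elements owned by each rank
full = np.arange(P * n_per_rank, dtype=np.float64)

# Write the single binary input file, as the generator program would.
path = os.path.join(tempfile.mkdtemp(), "tilt3d_input.bin")
full.tofile(path)

def read_my_part(path, rank, count, itemsize=8):
    """Each rank seeks to its own byte offset and reads only its slice."""
    with open(path, "rb") as f:
        f.seek(rank * count * itemsize)
        return np.fromfile(f, dtype=np.float64, count=count)

# Emulate the P ranks serially and verify the pieces tile the whole file.
parts = [read_my_part(path, r, n_per_rank) for r in range(P)]
assert np.array_equal(np.concatenate(parts), full)
```

With a fixed-size binary record format, each process can compute its offset independently, so no rank ever has to read (or broadcast) data belonging to another rank.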
Item 3: Miscellaneous<br />
======================<br />
Ported TILT2D and TILT3D to Ranger and tested execution of the OpenMP-based implementations.
Plans for next quarter:<br />
=======================<br />
- Continue working on MPI implementation of TILT3D<br />
- Port the TILT2D code to Ember (the SGI Altix UV at NCSA) to explore the scalability of OpenMP for a large number of threads
PI. Narsimhan (Purdue University, Molecular Biosciences). Determination of the Conformation of
Polypeptides in Solution as well as on a Surface Using Molecular Dynamics Simulation.<br />
Continuing through 10/11. The ASTA staff for this project are Ross Walker (SDSC) and Kimberly Dillman (Purdue). Walker has been interacting with Hector Chang (a student of the PI) and Kimberly Dillman to initially characterize problems that Hector has been having running the group's 600,000+ atom system with PMEMD on multiple TeraGrid systems. The initial conclusion is that it may be an issue with the input topology file rather than a bug in the dynamics code itself. Walker is waiting to receive copies of the input files in order to debug this issue further. Once this is done they will proceed to optimizing performance for the PI's specific problem set.
5.5.2 Advanced Support for Projects (ASP)
Under the ASP sub-area, AUS staff work on projects that directly benefit a large number of TeraGrid users. This is in contrast to ASTA projects, for which AUS staff work with an individual user on his or her applications, following the guidance of the TRAC review process, so that TeraGrid resources can be used effectively by that user. The different categories of ASPs are:
• Installation and maintenance of domain science codes and software, as well as HPC- and CI-related software and tools, on suitable HPC machines located at the RP sites
• Projects, decided in collaboration with users, that have the potential to impact a large number of users
• Creation of software and an associated database to monitor the libraries being used by users
• Implementation and testing of software allowing cross-site MPIg-based simulations to run, and associated interaction with cloud computing
Among these four categories, the second one involves more effort from AUS staff than the other three.
5.5.2.1 Installation and maintenance of domain science codes and software as well as HPC and CI related software and tools on all the HPC machines located at the RP sites
For this category of ASP tasks, AUS staff from all the RP sites continued to participate. This is considered foundational work, performed on an ongoing basis, that is needed by TeraGrid users who routinely use these codes and software for their science simulations. Many of these codes and software packages are also needed and used by AUS staff as part of various ASTA projects. Examples include scientific applications in domain sciences such as chemistry, biochemistry, materials science, engineering, numerical mathematics, visualization and data, and HPC/CI. AUS effort continued this quarter on the maintenance and proper installation of these codes and software, optimization of the software for each architecture, debugging, and interaction with and training of users regarding the usage of these codes and software. This work is carried out by AUS staff who have Ph.D.- and M.S.-level expertise in specific domain sciences and computer science.
5.5.2.2 Projects, decided in collaboration with users, that have the potential to impact a large number of users
Under the second category of ASP efforts, AUS staff work on projects that have the potential to benefit a large number of users (at least 10 user groups) within a specific domain science, or TeraGrid users in general, utilizing the unique expertise and experience that AUS staff hold. The projects are identified based on the impact they will have on the user community, input from users, and input from AUS staff and other TeraGrid experts. Expertise is available within the AUS staff pool, across the RP sites, to undertake such projects with broader impact. AUS staff from multiple RP sites jointly collaborate on these projects, and continued to pursue five such projects. Two of these were pursued by AUS staff who have domain science as well as HPC expertise in chemistry/biochemistry and materials science and who are users and, in some cases, developers of molecular dynamics and materials science codes; large numbers of TeraGrid users run these codes on various TeraGrid machines. The third project, investigating hybrid programming models, was pursued by AUS staff with expertise in HPC programming paradigms. The fourth project involves investigation of PGAS languages and comparison of their performance with MPI-based programming models; initially, kernels for linear system solvers are being explored, and other kernels and codes will be added later. AUS staff from other sites can join as the project progresses. The fifth project involves developing tutorials on profiling and tracing tools for users.
The first project involved porting, optimizing (using the best compiler options and machine-specific environment variables), benchmarking, and profiling widely used molecular dynamics (MD) codes on various TeraGrid machines. AUS staff from PSC, SDSC, and TACC are participating in this project. AMBER and NAMD were benchmarked on Ranger and Kraken for various input cases, utilizing various compiler and environment options. Benchmarks were also done using all the cores within a node and using a partial number of cores within a node, to see the impact on performance and scaling. The results were presented on the AUS benchmark webpage (https://www.teragrid.org/web/user-support/aus_projects), and feedback was received from some users of these codes. User news was posted to inform general users about this webpage. As newer machines, such as the DCL machines, come into production, further benchmarking of MD codes on them is ongoing.
We believe that making such detailed MD benchmark data available to TeraGrid MD users will greatly benefit this large community. In addition to always having optimally installed MD codes on various TG machines, this project will enable the MD community to (1) gain knowledge about the performance and scaling characteristics of MD codes on various machines, (2) know the optimal compilers and libraries to use on TG machines, (3) install their own versions of MD codes on TeraGrid machines, and (4) justify resource requests in allocation proposals. It will also help TRAC reviewers cross-check users' resource requests in their allocation proposals. The profiling results will be made available to the MD code developers, enabling them to tune future releases of the MD codes for TeraGrid machines. Feedback regarding the usefulness of this project was sought from about ten prominent long-time TeraGrid MD users, who expressed strong support for it.
The second project is similar to the MD project described above and targets codes used more for materials science simulations, such as VASP and CPMD. AUS staff from NCSA, LONI, PSC, and NICS who have expertise in these codes and in the field of materials science carried out this second project. No further activities were needed for this project in this quarter.
User news was posted to inform general users about this webpage. Feedback received from materials science users likewise showed strong support for this project.
The third project involves evaluation and benchmarking of the hybrid (MPI-OpenMP/pthreads) programming model. With the emergence of multi-core processors, hybrid programming, using MPI tasks with multiple OpenMP threads or pthreads per task, is of interest to users. This model may provide better performance than straight MPI programming. In addition, hybrid programming may allow some simulations that are otherwise not possible due to the limited memory available per core, and hence provide a science capability not achievable by straight MPI codes on multi-core architectures. To investigate these issues, AUS staff from PSC, NICS, TACC, and SDSC compared the performance and usage of codes written in hybrid mode versus straight MPI mode. The codes being targeted are an astrophysics code, a hybrid benchmarking code, phylogenetics codes, FFTW, and a Monte Carlo code, and work is ongoing. Raghu Reddy (PSC) wrote a draft whitepaper based on his hybrid programming work, and other AUS staff participating in this project also worked on whitepapers based on their hybrid code work. These whitepapers will be made available to TeraGrid users via the AUS project webpage.
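The memory argument above can be illustrated with a small sketch. This is not taken from any of the whitepapers; the node parameters below are hypothetical, and the point is only that consolidating cores under fewer MPI ranks increases the memory each rank can address:

```python
def mem_per_rank_gb(node_mem_gb, cores_per_node, threads_per_rank):
    """Memory available to each MPI rank when every rank drives
    `threads_per_rank` OpenMP threads (or pthreads) on one node."""
    ranks_per_node = cores_per_node // threads_per_rank
    return node_mem_gb / ranks_per_node

# Hypothetical node: 32 GB of memory, 8 cores.
# Straight MPI, one rank per core:
print(mem_per_rank_gb(32, 8, 1))  # 4.0 GB per rank
# Hybrid, one rank per node with 8 threads:
print(mem_per_rank_gb(32, 8, 8))  # 32.0 GB per rank
```

A simulation whose per-rank working set exceeds the per-core share can therefore run in hybrid mode even when a straight MPI decomposition would exhaust node memory.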
As mentioned earlier, the fourth project involves investigating PGAS languages. No significant update is available for this project this quarter.
The fifth project involved developing a TAU tutorial, which was done by AUS staff from PSC. The tutorial and documentation are available from the AUS projects webpage. The TAU tutorial gives users information about the TAU installations on Ranger, Kraken, and other TeraGrid resources, and serves as a practical guide to TAU. It will allow users to learn the commonly used features of the TAU software tool and enable them to start using TAU successfully on TeraGrid resources.
5.5.2.3 Creation of software and an associated database to monitor libraries being used by users
For the third category of ASP, AUS staff at NICS implemented and tested "ld" and "aprun" as part of the libraries database project on Kraken. This database gives user support staff knowledge about library usage, such as which libraries are used the most and how frequently. The information obtained will be used to improve software management, procurement, and maintenance requirements, and may also prove useful in the planning and development stages of new resources as they are brought into production. This project was completed this quarter and has now moved into a production phase. The database is growing based on regularly collected software usage data, which is being used to identify the software used least and most often. This information is used to tailor support for software at NICS.
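The core of such a usage database is a tally of which libraries appear in users' link lines. The following is a minimal sketch, not the NICS implementation: the log format, `tally_libraries` function, and library names are all hypothetical illustrations of counting `-l<name>` flags captured from an instrumented linker:

```python
from collections import Counter
import re

def tally_libraries(link_lines):
    """Count occurrences of -l<name> flags across logged link commands."""
    counts = Counter()
    for line in link_lines:
        counts.update(re.findall(r"-l(\w+)", line))
    return counts

# Hypothetical log entries captured by a wrapped "ld".
log = [
    "ld main.o -lfftw3 -lm",
    "ld solver.o -lscalapack -lfftw3 -lm",
]
usage = tally_libraries(log)
print(usage["fftw3"], usage["scalapack"])  # 2 1
```

Aggregating such counts over time is what lets support staff see which libraries are used most and least often.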
5.5.2.4 Implementation and testing of software allowing cross-site MPIg-based simulations to run and associated interaction with cloud computing
Under the fourth category of ASP, the project being done involves both ASTA and ASP and has been described in §5.5.1 (PI Roberts (RTI International, Computer Science), EpiSims; PI Karniadakis (Brown U., CFD), NekTar-G2; PI Coveney (University College London, Chemistry), GENIUS). In this quarter this project also transitioned into an operational phase and will continue as an ASTA project.
5.5.3 Advanced Support for EOT (ASEOT)
AUS staff from all the RP sites contribute to preparing and delivering advanced HPC and CI technical content for training sessions and workshops offered throughout the year at all the RP sites. This is done in coordination with the EOT area; the workshops offered are listed in the EOT section of this report. One of the EOT highlights this quarter was the teaching of an introductory HPC workshop in Puerto Rico (University of Puerto Rico at Rio Piedras and University of Puerto Rico at Mayaguez) in December. Several AUS staff, including Marcela Madrid (PSC), Mahidhar Tatineni (SDSC), Amit Majumdar (SDSC), and Sergiu Sanielevici (PSC), participated in this workshop.
Under the ASEOT effort, AUS staff continued to participate in the Extreme Scalability (XS) working group, which focuses specifically on architecture, performance, tools, and software related to petascale-level simulation. AUS staff are participating in projects identified by the XS working group, and in many cases these projects allow interaction with researchers and users funded by NSF PetaApps, SDCI, and STCI grants. Interactions within the XS working group also help AUS staff identify ASPs that may impact a large number of users. The hybrid MPI-OpenMP programming model project and the TAU tools project, described in §5.5.2.2, are joint projects between the AUS area and the XS working group.
AUS experts from all the RP sites regularly present technical talks at the bi-weekly AUS technical tele/web-conferences. These technical talks, based on ASTA work or other AUS work, are recorded, and the slides and audio recordings are available from:
http://teragridforum.org/mediawiki/index.php?title=Technical_presentations.
Multiple AUS staff are regular members of the TRAC review committee, and in addition many AUS staff review TRAC proposals that request smaller amounts of SUs and hence do not require review by regular members of the committee.
6 User Facing Projects and Core Services<br />
The User-Facing Projects and Core Services (UFC) area continued its work of providing the central access and information dissemination points for TeraGrid users and presenting a single interface to the collective efforts of the TeraGrid RPs. The GIG supports and coordinates effort in this area from staff at IU, NCSA, PSC, SDSC, TACC, and UC/ANL.
6.1 Highlights<br />
In a project spanning multiple RPs, staff from NCSA, PSC, and SDSC, coordinated as a team through the GIG, successfully upgraded the TGCDB's primary and secondary servers to a new version of PostgreSQL. This upgrade provides significant performance enhancements as well as an actively supported version of the database for TeraGrid.
TeraGrid continues to set new records for use: in October, 1,373 individuals charged jobs on TeraGrid resources, a new monthly high. Contributing to this increase was a rise in science gateway use; nearly one third of all TeraGrid users who submitted jobs did so through a science gateway.
SDSC staff members installed Trestles documentation and integrated Trestles into AMIE in preparation for the January 1 production date.
TACC staff members Katie Cohen and Kent Milfeld coordinated the TRAC review process and organized the allocation review meeting held within the reporting period. In addition, the TACC team organized and chaired the weekly allocations working group meetings and processed all 275 advance, startup, and supplement allocation requests during the reporting period.
6.2 Enhanced TeraGrid User Access<br />
This objective supports the creation and distribution of new user logins across TeraGrid RPs for<br />
the various resources, as well as enhancements to this process to integrate with campus<br />
authentication infrastructures. Within the TeraGrid User Portal, users are also provided ways to<br />
maintain their user profiles, customize their interaction with TeraGrid, and access TeraGrid<br />
resources and services remotely.<br />
Active TGUP development continued on the vetted/unvetted user portal account effort, which is nearing release, and on GridShib authentication to enable users to easily gain access to TeraGrid and its resources.
6.2.1 Operational Activities
The fourth quarter typically sees a slight dip in TeraGrid user metrics, and Q4 2010 was no exception. Even so, 515 new individuals joined the ranks of TeraGrid users, and in October 1,373 individuals charged jobs on TeraGrid resources, a new monthly high. This continued elevated workload is made possible by prior efforts to streamline the new-user process and reduce the central staff effort.
TeraGrid users numbered 6,505 at the end of the quarter, including 5,563 individuals with current<br />
TeraGrid accounts. Of the 5,563 individuals, 1,904 (34%) were associated with jobs recorded in<br />
the TeraGrid central accounting system; an additional 942 users submitted jobs via science<br />
gateways—that is, nearly 1/3 of TeraGrid users submitting jobs worked through gateways. Figure<br />
6-1 shows the growth of the TeraGrid community since 2004. (Active Users = Users Charging<br />
Jobs + Gateway Users; TeraGrid Users = Current Accounts + Gateway Users.)<br />
The Knowledge base and the TGUP’s access capabilities continue to be among the most highly<br />
used features, with users accessing the file manager portlet 12,973 times, the knowledge base<br />
11,420 times, and allocations usage 8,980 times this quarter. Table 6-1 shows the complete listing<br />
of the most highly visited TGUP applications for this quarter.<br />
Figure 6-1. TeraGrid User Trends, Q4 2010<br />
Table 6-1. Q4 2010 Most Visited TGUP Applications<br />
Application                          # Users   Q4 2010 Visits
File Manager 64 12,973<br />
Knowledge Base (via TGUP alone) 10,959 11,420<br />
Allocation Usage 1,039 8,980<br />
GSI-SSH 491 6,460<br />
System Accounts 671 5,783<br />
Data Collections 2,228 4,238<br />
System Monitor 486 3,150<br />
Science Gateways 28 1,397<br />
Consulting 216 1,169<br />
Add/Remove User Form 189 722<br />
QBETS Wait Time 40 625<br />
Karnak Prediction 46 620<br />
My Jobs 154 510<br />
QBETS Queue Prediction 395 395<br />
Ticketing System 143 320<br />
User Profile 184 280<br />
Feedback 177 177<br />
6.3 Allocations Process and Allocation Management<br />
This objective encompasses the allocations process, both for Startup/Education allocations and for the merit-reviewed TRAC Research request process; the POPS system for request handling and management; the mechanisms by which allocation PIs manage allocations through transfers, extensions, and so on; and the interfaces by which allocation PIs manage the users authorized to use their allocations. Operationally, this objective includes the TRAC review process, the Startup allocations review and decision process, and the maintenance and operations of the POPS system.
The allocations staff continues to receive more than two new or renewal Startup requests each day. TeraGrid received 206 Startup requests (168 approved) and 12 Education requests (12 approved). Table 6-2 shows the overall allocations management activity handled by POPS and the allocations staff. Note that for Transfers, the table shows only the positive side of each transaction, to reflect the total transfer level; there is a corresponding negative amount, adjusted for resource exchange rates.
Table 6-2. POPS Requests and Awards, Q4 2010<br />
Research Startup Education<br />
# Req  SUs Requested  # Awd  SUs Awarded    # Req  SUs Requested  # Awd  SUs Awarded    # Req  SUs Requested  # Awd  SUs Awarded
New 58 266,335,446 52 65,028,339 177 35,790,440 144 24,470,244 9 1,343,026 9 1,143,029<br />
Prog. Report 7 10,874,082 7 8,180,451 n/a n/a
Renewal 61 343,166,885 58 209,659,993 29 6,438,085 24 5,501,386 3 193,090 3 193,090<br />
Advance 23 28,700,768 23 11,658,507 n/a n/a<br />
Justification 5 8,225,558 4 4,726,478 0 0 0 0 0 0 0 0<br />
Supplement 16 42,771,601 13 13,336,601 22 1,905,034 19 1,580,004 1 60,000 1 60,000<br />
Transfer 106 48,982,063 97 46,917,813 30 943,360 27 787,526 3 120,000 2 70,000<br />
Extension 43 n/a 40 n/a 44 n/a 41 n/a 0 n/a 0 n/a<br />
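The exchange-rate adjustment noted above can be sketched as follows. This is purely illustrative: the resource names, rates, and `transfer_credit` function are hypothetical, not TeraGrid's actual conversion tables. The idea is that the SUs debited on the source resource are converted through a common normalized unit before being credited on the destination:

```python
# Hypothetical exchange rates: normalized SUs per local SU on each resource.
RATES = {"ResourceA": 1.0, "ResourceB": 0.5}

def transfer_credit(sus_debited, src, dst):
    """SUs credited on `dst` for a debit of `sus_debited` SUs on `src`,
    converted via a common normalized unit."""
    normalized = sus_debited * RATES[src]
    return normalized / RATES[dst]

# Debiting 10,000 SUs on ResourceA credits 20,000 SUs on ResourceB.
print(transfer_credit(10_000, "ResourceA", "ResourceB"))  # 20000.0
```

This is why the positive and negative sides of a transfer in Table 6-2 need not be equal in raw SUs.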
6.4 RP Integration and Information Sharing<br />
The TGCDB continues to be monitored successfully via Inca, and the failover system continues to sync with the production TGCDB server. In a GIG-coordinated project spanning multiple RPs, staff from NCSA, PSC, and SDSC working as a team successfully upgraded the TGCDB and failover servers to a new version of PostgreSQL. In doing so, significant performance enhancements were made available, along with an actively supported version of the database for TeraGrid.
6.5 RP Operations: Accounting/Core Services<br />
TG RP staff provided local allocations and account management support for users, including account creation/deletion, password resets, and local system usage accounting. These staff interacted with the TG Account Management transaction system (AMIE), a transaction-based grid account management system that is central to all users having access to the TeraGrid.
6.5.1 NCSA
NCSA staff continues to support and maintain AMIE and contributed to the recent upgrade of the<br />
TGCDB.<br />
6.5.2 PSC
PSC continues to house, support, and maintain the secondary instance of the TGCDB. PSC staff continue to ensure that replication between the secondary and the primary proceeds, and were central to the success of the recent upgrade of the TGCDB.
6.5.3 SDSC
SDSC continues to house, support, and maintain the primary instance of the TGCDB. SDSC staff continue to ensure replication between the secondary and the primary, and were central to the success of the recent upgrade of the TGCDB. SDSC staff also built the local accounting infrastructure to track Trestles CPU usage, and the Trestles resource was integrated into the TeraGrid via AMIE in December.
6.5.4 TACC
TACC staff members continued to contribute to activities related to User Facing Projects and Core Services. Maytal Dahan led TGUP portal development activities, including the development of vetted/unvetted user portal accounts and GridShib authentication to enable users to easily gain access to TeraGrid resources. Furthermore, this quarter the user portal team released expanded job service features: an interface to the Karnak prediction service offering start-time and wait-time predictions on TG resources, and a "My Jobs" feature, which lists the status of a user's jobs on the TeraGrid and integrates Karnak by enabling prediction capabilities on queued jobs. The team continued to support the software and systems for both the TeraGrid User Portal and the TeraGrid Web Site. Katie Cohen and Kent Milfeld coordinated the TRAC review process and organized the allocation review meeting held within the reporting period. In addition, the TACC team organized and chaired the weekly allocations working group meetings and processed all 275 advance, startup, and supplement allocation requests during the reporting period.
6.6 User Information Presentation<br />
This objective ensures that users are provided with current, accurate information from across the<br />
TeraGrid in a dynamic environment of resources, software and services. Operationally, this<br />
objective ensures high reliability of the presentation mechanisms, including quarterly operational<br />
support for the TGUP and TeraGrid web site. Support for TeraGrid internal web environments,<br />
notably the Wiki, is included here.<br />
Members of both the TGUP and Documentation teams received advanced Liferay training at the Liferay corporate offices in California.
The TGUP and TeraGrid Web presence continued to increase their user base and support large numbers of visitors and page hits. Table 6-3 shows hits, visitors, and logins for the TGUP, Web site, and Wiki.
Table 6-3. TeraGrid Web traffic, Q4 2010.
      Web                        Portal                                    Wiki
      Unique Visitors  Hits      Unique Visitors  Unique Logins  Hits      Unique Visitors
Oct   7,616            650,744   3,519            1,191          365,191   2,745
Nov   7,741            610,412   3,165            1,173          304,410   3,054
Dec   5,831            499,191   3,035            1,058          308,510   2,470
Wiki Pages: 8,136; Wiki Edits: 48,419
6.7 RP Operations: User Facing Projects<br />
TG RP staff maintained and updated local documentation for their resources. Some of this<br />
activity resulted from user feedback and interaction with the TeraGrid Operations Center, RP help<br />
desk support personnel, and members of the advanced user support team. Hardware and software<br />
upgrades on HPC, visualization, and storage resources also resulted in local documentation<br />
activities.<br />
The SDSC Documentation Group added the entire User Guide section for the Trestles resource.<br />
This included seven new pages of User Support information integrated with the existing site,<br />
located at: http://www.sdsc.edu/us/resources/trestles/. Existing documentation was also updated<br />
for the Dash resource and the Visualization Services section.<br />
6.8 Information Production and Quality Assurance<br />
This objective includes efforts to provide documentation for TeraGrid activities, provide a<br />
knowledgebase of answers to user questions, and maintain and update general web site content,<br />
all within the TeraGrid Web environment. This activity requires close interaction with RPs and<br />
other GIG areas as subject-matter experts and sources of information. As part of UFC operations,<br />
this objective focuses on updates and Quality Assurance (QA). These ongoing processes and efforts ensure that existing user information remains correct, is coordinated across both the GIG and RP subject-matter experts, is in sync with the evolution of the TG environment (via collaboration with the CUE working group), and contains minimal errors.
6.8.1 IU
2010 Report for the TeraGrid Knowledge Base
Summary statistics for the TeraGrid Knowledge Base (TGKB):
Total number of documents available in the TGKB at the end of 2010 = 760
Total number of TGKB documents, including those archived and in draft = 851
Number of new documents created during 2010 = 100
Number of documents modified during 2010 = 646
Number of documents archived during 2010 = 34
Total hits (documents retrieved) = 1,005,645
Relevant hits (documents retrieved, minus bots, crawlers, scrapers, and link checkers) = 360,231
Relevant hits excluding searches on "common Inca error" and "CTSS core errors" = 217,440
Note that the total repository of TGKB documents is 851, of which 71 are archived, primarily<br />
because of the retirement of resources, and 20 are in a draft state. This leaves 760 documents that<br />
are currently in active use.<br />
Most new documentation originated as a direct result of conversations within the User Services group. The ticket system and the forums in Liferay have also spawned documentation. As a general practice, the growth of the TGKB is now focused on support-driven documentation and less on quota-based contributions.
Not reflected in these numbers are 119,821 hits that originated from a utility that searches for<br />
“common Inca error” and 29,970 hits on “CTSS core errors”.<br />
An analysis of the “referrer” field in the web logs shows that a significant proportion of the<br />
relevant hits (non-spam/bot/crawler) that originated from the TeraGrid server were initiated by<br />
the Inca and CTSS support tools.<br />
From an analysis of the hits for which we have referrer information, excluding those that<br />
originate from support tools, it is clear that the search interfaces on the IU and TeraGrid servers<br />
account for a relatively small proportion of the searches. People are getting to TGKB<br />
documentation primarily through Google searches. This phenomenon increased as the year<br />
progressed.<br />
Some of the most frequently accessed documents during 2010 were:<br />
How do I get started using the TeraGrid<br />
Who is eligible to apply for a TeraGrid allocation<br />
How do I apply for a new TeraGrid allocation<br />
On the TeraGrid, what is Queen Bee<br />
What's the quickest way for new users to access the TeraGrid<br />
What is a TeraGrid-wide login<br />
How do I request supplemental service units for a TeraGrid allocation<br />
Getting started on Big Red<br />
On a TeraGrid system, how can I use SoftEnv to customize my software environment<br />
How do I request additional calendar time for my TeraGrid allocation<br />
On the TeraGrid, what is Big Red
What is Lustre<br />
About applying for a Research allocation on the TeraGrid<br />
On the TeraGrid, what is Steele<br />
How do I transfer an allocation from one TeraGrid system to another<br />
How do I create a login for the TeraGrid's Partnerships Online Proposal System<br />
On the TeraGrid, how are compute jobs charged<br />
On the TeraGrid, what types of allocations are available<br />
How can I add or remove a user from an existing TeraGrid allocation<br />
Using TREE-PUZZLE on Big Red at IU<br />
What is Globus, and where can I find out how to use it on the TeraGrid<br />
TeraGrid overview<br />
How do I access IU's MDSS/HPSS from my TeraGrid account<br />
FASTA programs on Big Red<br />
On the TeraGrid, what is Single Sign-On
On the TeraGrid, what is the NSTG<br />
What is ClustalW, and where is it installed on the TeraGrid<br />
On the TeraGrid, what is Pople<br />
On the TeraGrid, what is the Longhorn Visualization Portal<br />
On the TeraGrid, what is Lonestar<br />
How do I set up authentication with SSH keys on a TeraGrid resource<br />
On Big Red, how do I transfer files using Grid-FTP (globus-url-copy)<br />
On the TeraGrid, what is Spur<br />
What is HPSS<br />
What is MPI, and where can I find information about using it on TeraGrid resources<br />
How do I compare queues on TeraGrid resources at different sites<br />
How can I access TeraGrid account and resource information<br />
What visualization and analysis resources are available on the TeraGrid<br />
What is the TeraGrid User Portal, and how do I access it<br />
What is Python, and where is it installed on the TeraGrid<br />
7 Data and Visualization<br />
7.1 Highlights<br />
SDSC continued visualization support for the latest SCEC simulations, with selected visualizations featured in the SC'10 Gordon Bell and Masterworks presentations and in the SC'10 plenary video presentation.
Members of the TACC Visualization group facilitated the efforts of TeraGrid user Rob Farber to<br />
scale his code to 500 GPUs using the TeraGrid resource Longhorn. Farber successfully ran the<br />
code during dedicated time and presented the results at SC10.<br />
7.2 Data<br />
7.2.1 Data Movement
Data Movement continues to be a stable, slowly evolving area of TeraGrid infrastructure. The<br />
most recent deployment of GridFTP includes the ability to initiate connections using SSH, thus<br />
providing more flexibility to users in how they access GridFTP servers. The “WANpage” has<br />
now been deployed at PSC, providing information on WAN file system performance analogous to<br />
the “Speedpage”, and new information services registration patterns have been developed to<br />
support the Albedo Lustre-WAN file system and facilitate automated testing of client data<br />
movement performance.<br />
7.2.2 Wide-Area Filesystems
The Albedo Lustre-based WAN file system is now substantially complete, with all servers and storage deployed and all sites except SDSC having added their storage to the file system. Security policies for providing access to the file system from non-TeraGrid resources have been developed in consultation with the Security working group. Several improvements to the identity mapping code, including security patches, have been made by staff at PSC and Indiana, and infrastructure to track client mounts and the associated identity mapping data has been developed and is already deployed at some sites.
Indiana University has worked with Brock Palen at the University of Michigan to mount DC-WAN there. Palen approached IU for a DC-WAN mount to ease the difficulty his TeraGrid users have been experiencing getting data local to Ann Arbor onto the TeraGrid. IU has successfully brought all of the head nodes at Michigan online and is making plans to bring the compute nodes online as well (via a Lustre router).
7.2.3 Data Collections
The Data Replication Service remains the primary mechanism of support for TeraGrid-wide data<br />
collections, and approximately one third of the total resources in the replication service have now<br />
been allocated. Evaluations have been performed on various mechanisms for web-based and<br />
desktop access to the DRS, such as WebDAV, CGI-based interfaces, and native graphical<br />
interfaces. Work to facilitate staging of data into and out of large tape archives used by the DRS<br />
is ongoing. The DRS has also been federated with several more specific data grids, including the<br />
iPlant Collaborative data grid, the TACC center-wide data grid, and the SDSC data grid; this<br />
facilitates the movement of data from campus- or project-level resources into national-scale<br />
resources.<br />
ORNL has been working in collaboration with the DataONE project to explore and define, with<br />
the data area director and the TeraGrid Forum, ways for DataONE and TeraGrid to interoperate.<br />
Specifically, a plan has been defined and is being executed for a pilot DataONE member node<br />
deployment with a mounted TeraGrid storage substrate. This effort will produce a kind of “Data<br />
Collections Gateway” and will also involve the Science Gateways group. Allocation of these<br />
resources will be requested in the Q1CY11 TRAC process.<br />
7.2.4 Archival Storage
SDSC continued the HPSS-to-SAM-QFS migration of users with numerous small files; repacking continues at roughly 60-70 volumes per week. TACC has completed a set of upgrades to the Ranch/SAM-QFS archive infrastructure, including a substantial increase in the available disk cache that should improve overall data throughput and stability.
7.2.5 Data Architecture
The recommendations made in the data architecture document have now been implemented as completely as possible within the context of the TeraGrid effort. Current and future efforts in data architecture are focused on planning for and implementing a transparent transition to XD, ensuring continuous access to data for the user community through the transition, and compiling and communicating lessons learned for the benefit of XD and other large-scale cyberinfrastructure efforts.
7.3 RP Operations: Data<br />
7.3.1 Distributed Data Management
The PSC-wide central Lustre file system, Bessemer, has been upgraded to Lustre version 1.8. During this upgrade, storage was re-provisioned to Albedo, TeraGrid's wide-area Lustre project. By upgrading to Lustre 1.8, PSC will support a broader range of modern clients mounting Bessemer. Bessemer has been mounted on Blacklight, and is also mounted again on Pople, Salk, and several utility servers around the Center.
Josephine Palencia and Kathy Benninger of PSC, along with Yujun Wu of the University of Florida, have made good progress on the ExTENCI Lustre WAN project, and the tasks are on schedule. Palencia and Wu installed the necessary Lustre WAN and Kerberos software on a system at the University of Florida, and Benninger is helping tune the 10GigE interface. Enabling Kerberos authentication on the file system has increased its security. PSC created a private wiki for the system administrators from the partner sites; all project members who need access have been given PSC IDs and can authenticate to it using Kerberos. PSC leads weekly teleconference calls on this part of ExTENCI. The ExTENCI Lustre WAN project was highlighted in two events at SC10: first, J. Ray Scott gave a talk at the High Performance Cluster File System (HPCFS) Software Foundation meeting, which included a slide on the ExTENCI file system, and second, the file system was demonstrated with clients in the Caltech and PSC booths.
DC-WAN is now mounted on Steele's login nodes, and Purdue continued to work on mounting DC-WAN on Steele's compute nodes. A December maintenance downtime allowed Purdue staff to make configuration changes. After testing in January and February and working with IU staff, Purdue hopes to release this to production.
SDSC's Albedo servers and storage have been provisioned and should be ready for addition to the file system by the end of January 2011.
LONI has ordered a 200TB storage upgrade for Queen Bee.<br />
7.3.2 Data Collections<br />
Purdue added three new datasets for access through its multidisciplinary data portal: (1) Indiana<br />
precipitation dataset, (2) St Joseph Watershed water quality data and stream flow data, and (3)<br />
historical time series of average precipitation and temperature data. New features were also<br />
implemented that allow users to preview remote sensing and other image data before download.<br />
Purdue also continued to develop a data publishing web application that aims to enable<br />
individual researchers to self-publish and share their datasets. The component supports two types<br />
of data management backend: iRODS for file-based data and MySQL for table-based data. It<br />
provides templates and wizards to guide users through uploading, searching for, and downloading data.<br />
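The two-backend design described above can be sketched as follows. This is a hypothetical illustration of the routing idea only; the class and method names are invented, not Purdue's actual code, and in-memory stores stand in for the real iRODS and MySQL backends:

```python
# Hypothetical sketch: route datasets to a backend by type --
# file-based data to an iRODS-style store, table-based data to a
# MySQL-style store. All names here are illustrative inventions.

class DictStore:
    """In-memory stand-in for an iRODS or MySQL backend."""
    def __init__(self):
        self.items = {}

    def store(self, name, payload):
        self.items[name] = payload
        return name

class DataPublisher:
    def __init__(self, file_store, table_store):
        # "file" -> iRODS-like backend, "table" -> MySQL-like backend
        self.backends = {"file": file_store, "table": table_store}

    def publish(self, dataset_type, name, payload):
        backend = self.backends.get(dataset_type)
        if backend is None:
            raise ValueError(f"unsupported dataset type: {dataset_type!r}")
        return backend.store(name, payload)

publisher = DataPublisher(DictStore(), DictStore())
publisher.publish("file", "indiana_precip.csv", b"...")
publisher.publish("table", "stream_flow", [("site1", 3.2)])
```

Keeping the type-to-backend mapping in one place means new storage backends can be added without touching the upload/search/download front end.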
TACC completed the addition of a new collection of data from the University of California<br />
Curation Center, representing a large collection of scholarly publications archived by the<br />
University of California. This collection is being used to explore the opportunity for integrated<br />
infrastructure supporting both traditional scholarly publishing activities and publication of large<br />
scientific datasets.<br />
7.4 Visualization<br />
NCAR develops and distributes the visual data analysis package, VAPOR, which is funded in<br />
part through GIG PY5&6 funds. Progress was made on two of the major milestones of the<br />
PY5&6 award: 1) Development work on a lossy compression library for scientific data has been<br />
completed and made available as part of the VAPOR 2.0 release. 2) Performance optimization of<br />
the parallel API to the VAPOR progressive access data model was completed, and friendly users<br />
for further testing are being gathered.<br />
The NCAR VAPOR team has also developed a next-generation Progressive Data Access (PDA)<br />
scheme based on their promising research results in wavelet coefficient prioritization. This new<br />
scheme offers far greater data reduction than the current multi-resolution approach at comparable<br />
fidelity, or improved fidelity at the same level of data reduction. The prototype PDA research<br />
code was hardened and integrated into VAPOR, and made publicly available in version 2.0 of the<br />
package, released in December, 2010. VAPOR 2.0 provides command line tools and a serial C++<br />
API for transforming raw binary data into and out of the VDC2 format. Moreover, VAPOR’s<br />
visual data analysis tool, vaporgui, is capable of reading and displaying VDC2 format data.<br />
Considerable progress was also made on both the development and support for a public, parallel<br />
API for writing Cartesian gridded data into the next-generation PDA format (now known as<br />
VDC2) and the evaluation of the functionality, scalability and performance of the parallel API for<br />
writing Cartesian gridded data into the VDC2 format. A prototype, Fortran-callable parallel<br />
interface for writing VDC2 data was developed, and specifications for a public API completed.<br />
The public API will be offered as an extension to NCAR’s own Parallel I/O (PIO) library<br />
(http://web.ncar.teragrid.org/~dennis/pio_doc/html/). Existing Fortran PIO codes will be able to<br />
output VDC2 data with minimal modification, calling only two or three configuration functions.<br />
New codes will benefit from PIO’s abstraction of high-performance parallel IO, and the ability to<br />
easily toggle between output to one of PIO’s native, uncompressed formats (currently netCDF<br />
and raw MPI-IO), and compressed VDC2 data.<br />
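The toggling idea above, one write path whose backing format is a configuration choice, can be illustrated with a small sketch. This is not the PIO or VDC2 API; every name below, and the "compression" stand-in, is hypothetical:

```python
# Hypothetical illustration of the design idea above: the application
# calls a single write path, and one configuration choice selects
# either an uncompressed native format or a compressed VDC2-style
# format. None of these names come from the real PIO library.

def write_uncompressed(varname, values):
    # stand-in for a native netCDF / MPI-IO write
    return ("raw", varname, len(values))

def write_compressed(varname, values):
    # stand-in for a wavelet-compressed VDC2 write
    return ("vdc2", varname, len(values) // 2)

WRITERS = {"netcdf": write_uncompressed, "vdc2": write_compressed}

class Output:
    def __init__(self, fmt="netcdf"):
        # the only "configuration call" the application must change
        self.writer = WRITERS[fmt]

    def write(self, varname, values):
        return self.writer(varname, values)

out = Output(fmt="vdc2")
print(out.write("temperature", list(range(8))))  # ('vdc2', 'temperature', 4)
```

The point of the abstraction is that the `write` call site is identical for both formats; only the constructor argument changes.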
The prototype parallel interface has been tuned for scalability and high performance. The plot<br />
below demonstrates the scalability of the library on NCAR’s IBM Power6 system with synthetic<br />
data sets with 2048^3 and 4096^3 degrees of freedom.<br />
Work on the public API to VDC2 should be wrapped up early in the next quarter. Once<br />
completed, the NCAR VAPOR team will evaluate the new capabilities by working with<br />
numerical modelers to integrate the API into an actual simulation code. Discussions with two<br />
candidate friendly user groups have already begun: one group at the National Renewable Energy<br />
Lab (NREL); and an internal group working at NCAR, but computing at NICS.<br />
Figure 2: Throughput in Megabytes per second (MBps) vs. number of MPI tasks for<br />
2048^3 and 4096^3 synthetic data sets.<br />
7.5 RP Operations: Visualization<br />
During this quarter, visualization services at the TG RPs included the maintenance,<br />
administration, and operation of visualization resources, including visualization-specific systems.<br />
The UChicago/Argonne RP continued its collaboration with PI Michael Norman on his Projects<br />
in Astrophysical and Cosmological Structure Formation. A recent simulation computed by the<br />
Enzo team, done on Kraken at NICS, uses a flux-limited diffusion solver to explore the radiation<br />
hydrodynamics (rad-hydro) of early galaxies, in particular, the ionizing radiation created by<br />
Population III stars. The simulation volume is 11.2 co-moving megaparsecs, and has a uniform<br />
grid of 1024^3 cells, with over 1 billion dark matter and star particles, the largest calculation of this<br />
type known to date. The UChicago/Argonne RP team produced an animation of multiple<br />
variables of this new data using vl3, a hardware-accelerated volume rendering application<br />
developed at UChicago/Argonne, visualizing these previously unseen results. This animation<br />
was showcased in both the Argonne and SDSC booths at SC10. The Enzo team has since been<br />
using this animation to promote their work, posting it in various Web outlets, including SciVee<br />
(http://www.scivee.tv/node/26565) and Vimeo (http://vimeo.com/17771397). Enhancements were<br />
also made to the vl3 Enzo reader to enable the calculation of derived quantities from multiple<br />
variables as the data is read from disk. The Enzo and UChicago/Argonne teams also collaborated<br />
on a demonstration that took place in the SDSC booth at SC10. It included real-time streaming of<br />
volume rendered images from multiple instances of vl3 at Argonne to a tiled display on the<br />
exhibit floor in New Orleans. The researchers were able to simultaneously visualize multiple<br />
variables from the rad-hydro simulation and compare results. One invaluable improvement in this<br />
demonstration over a similar one at SC09 is that control over the remote visualization was put in<br />
the hands of the scientists. Using an interface largely refactored from previous work done on the<br />
TeraGrid Visualization Gateway, the researchers were able to control the view, as well as<br />
manipulate color and transfer functions from a standard Web browser. Another improvement<br />
includes incorporating the Celeritas data-streaming library, which is optimized for high<br />
bandwidth, wide-area networks. This enabled interactive, uncompressed streaming of the images,<br />
eliminating artifacts due to compression, while also reducing latency. In addition, a Python client<br />
was developed, which integrates remote streaming to the desktop with the visualization controls<br />
in a single client. Beyond just a one-time demo, the tools developed are now in the hands of the<br />
Enzo scientists, who are able to use them to conduct their research after returning home to SDSC.<br />
Feedback from the Enzo team will help guide further development of the tools.<br />
The UChicago/Argonne RP continues to work with the Parallel Simulations of Blood Flow in the<br />
Human Cranial Tree project (PI: George Karniadakis), and in collaboration with a PetaApps<br />
award, on the visualization of their data, using a combination of ParaView and custom-built code.<br />
The RP team continues development of their custom ParaView reader plug-in, which loads data<br />
in native NekTar format, computes derived quantities and curved surfaces at higher resolutions,<br />
where appropriate, and pushes it into the ParaView pipeline. Support for time-variant data has<br />
now been integrated into the reader, enabling researchers to animate through time steps of the<br />
simulation results. Data caching has also been implemented in the reader, alleviating the need to<br />
continually re-read data from disk. The team will next investigate techniques for pre-fetching<br />
data in order to further reduce delays incurred during animation while waiting for data to be read<br />
from disk. The NekTar team has integrated LAMMPS code with their NekTar simulations,<br />
producing multi-scale data sets. The RP team continues to develop utilities for converting the<br />
resulting red blood cell (RBC) and velocity continuum and various types of particle data to<br />
formats suitable for visualizing. The latest calculations simulate a single diseased red blood cell,<br />
and the flow around it, as it is pushed along and adheres to the boundary wall. Several<br />
animations of both the RBC data and the NekTar continuum data were produced, and showcased<br />
in the Argonne booth at SC10, as well as included in the SC10 Plenary Session Video loop. The<br />
two teams also collaborated on an electronic poster submission for SC10 highlighting the various<br />
aspects of this work, and presented it at the conference.<br />
Mike Norman describes the radiation-hydrodynamics<br />
simulation data during streaming visualization<br />
demonstration at SC10.<br />
Still frame from an animation of four variables of the<br />
radiation-hydrodynamics simulation data.<br />
NCAR operates the dual-node DAV cluster named Twister. NCAR also supports the visual data<br />
analysis package, VAPOR. Version 2.0 (previously called version 1.6) of VAPOR was released<br />
late this quarter. Nearly 300 copies of the software have been downloaded in the first month since<br />
its release.<br />
SDSC continued visualization support for the latest SCEC simulations. A few animations of M8<br />
visualizations were featured in the SC10 Gordon Bell and Masterworks presentations and in the<br />
SC10 plenary video. SDSC continued development of the GlyphSea software and presented a poster at the<br />
2010 AGU Fall Meeting showcasing volumetric visualization of seismic data. A new collaboration to<br />
visualize flow in the small intestine is in progress with PI James Brasseur of Pennsylvania<br />
State University.<br />
The Purdue RP released its 3D visualization gadget for super-resolution NEXRAD Level II radar data<br />
in December 2010. The gadget is available from the RP web site and the Google gadget<br />
directory. The figure below is a screenshot of the gadget showing Oct. 26, 2010 storm data in 3D<br />
from multiple radar stations.<br />
The TACC RP continues to operate and support both Spur and Longhorn. Spur is fully<br />
allocated, and the TACC RP has been working to ensure that visualization and data analysis usage<br />
on Longhorn continues to increase.<br />
8 Network Operations and Security<br />
8.1 Highlights<br />
In November 2010, IU, along with four research partners, announced that they had succeeded in<br />
using the Lustre file system over a wide area network (WAN) to saturate the world's first<br />
commercial 100-gigabit link. Using the full-duplex capability of the link, the team<br />
achieved an aggregate average transfer rate of 21.904 gigabytes per second, over 87% of<br />
the theoretical maximum allowed by the link. The longer-term goal is to explore the possibility of<br />
having one or more wide-area file systems that provide Dresden and Freiberg with massive data<br />
transfer capabilities, enabling faster exchange of shared data and insights.<br />
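The 87% figure follows from back-of-the-envelope arithmetic: a full-duplex 100-gigabit link carries 100 Gbit/s in each direction, for an aggregate of 200 Gbit/s, or 25 gigabytes per second in decimal units:

```python
# Verify the quoted link utilization: 21.904 GB/s aggregate against a
# full-duplex 100 Gbit/s link (200 Gbit/s aggregate = 25 GB/s).
aggregate_gbits = 100 * 2                # both directions of the link
theoretical_gbps = aggregate_gbits / 8   # 25.0 gigabytes per second
measured_gbps = 21.904                   # aggregate average from the test
utilization = measured_gbps / theoretical_gbps
print(f"{utilization:.1%}")              # 87.6%, i.e. "over 87%"
```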
NCSA's SGI Altix UV system, Ember, went into production on November 1st, replacing the SGI<br />
Altix system Cobalt.<br />
SDSC installed Trestles, a DCL-awarded 100 TFlop system. SDSC began friendly-user testing on<br />
the Trestles system in early December and began TeraGrid production on Jan 1, 2011.<br />
8.2 Networking<br />
The TeraGrid continued operation of the backbone network consisting of core routers in Chicago,<br />
Denver and Los Angeles, connected with dedicated 10 Gigabit/sec links and two levels of<br />
redundancy in the form of alternative paths via NLR. TeraGrid resource providers maintain their<br />
respective site border routers and dedicated links to one of the three core routers. The dedicated<br />
links consist of at least one 10 Gb/s link.<br />
SDSC has been continuing to work with CENIC engineering on planning the re-routing of<br />
TeraGrid waves in preparation for the removal of the Los Angeles TeraGrid core router. In mid-<br />
January 2011 circuits that are currently terminating in the Los Angeles router will be re-homed to<br />
the San Diego router.<br />
TACC installed a new border router. The Juniper EX8208 enables the use of routing instances<br />
(in lieu of having multiple physical routers) to keep the end-to-end peering separate from the<br />
public routing peering (NLR, I2, UTNET, and Internet).<br />
The following two graphs show total usage and 20-second average peaks on the TeraGrid<br />
backbone between Chicago and Denver.<br />
Note: Due to a data collection anomaly during early October (the first eighth of the graphs), lower-than-actual levels<br />
were recorded.<br />
Figure 8-1: Daily peak bandwidth on<br />
Chicago-Denver link.<br />
Bandwidth is measured at 20s granularity.<br />
Figure 8-2: Total transfer (GB) per day on<br />
Chicago-Denver link.<br />
Total theoretical capacity is ~100,000 GB/day.<br />
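The caption's ~100,000 GB/day figure is consistent with a single 10 Gb/s backbone link running flat out:

```python
# Theoretical daily capacity of one 10 Gb/s link, in decimal gigabytes.
gb_per_second = 10 / 8           # 10 gigabits/s = 1.25 gigabytes/s
seconds_per_day = 24 * 60 * 60   # 86,400
capacity_gb_per_day = gb_per_second * seconds_per_day
print(capacity_gb_per_day)       # 108000.0, i.e. ~100,000 GB/day
```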
8.3 RP Operations: Networking<br />
During this quarter, TG RP staff provided support and operation of the local RPs' network<br />
connectivity to the TeraGrid backbone. Duties include daily monitoring and maintenance of the<br />
network, planning and/or implementing changes to RP connections to the TG backbone, direct<br />
user support for network-specific issues, switch firmware upgrades, and HPC system<br />
interconnects. TG RP staff also kept the user community apprised of interruptions in network<br />
connectivity via local and TeraGrid user news.<br />
The ORNL NSTG operations continue to run without difficulty at the SNS location. As of this<br />
period, the network move from the ORNL main campus is considered successfully complete.<br />
SDSC network engineers worked to facilitate an InfiniBand-to-Ethernet bridging service to enable<br />
TeraGrid users to have access to the new Trestles resource.<br />
8.4 INCA<br />
The SDSC Inca team continued to support and maintain the Inca deployment on TeraGrid and<br />
helped system administrators to debug failures on their machines. In response to IIS changes and<br />
requests from the GIG-pack and QA groups, existing Inca test suites were modified. Four<br />
existing TeraGrid reporters were also modified and one new reporter was created and deployed.<br />
At the time of this report 2,976 pieces of test data are being collected across TeraGrid platforms.<br />
NCSA’s Ember was added to the Inca TeraGrid deployment during this time. NCSA’s Cobalt<br />
was removed from Inca testing and TACC's Lonestar was removed from testing until the machine<br />
returns in 2011.<br />
The SDSC Inca team also updated TeraGrid's Inca deployment to the 2.6 release. The<br />
two most relevant features of this release are data mirroring, so that monitoring data<br />
and notifications will still be available if SDSC goes down (not yet configured), and historical<br />
graphing of performance data. The new graphing capabilities will allow the Information<br />
Services team to view the response times of their Inca-monitored server.<br />
8.5 Grid Services Usage<br />
The Globus usage listener for GT5 (GRAM5) continues to gather Grid Service usage for the<br />
quarter. The following graphs detail this usage.<br />
8.5.1 Data Movement (GridFTP) Statistics<br />
Figure 8-3: Total GridFTP transfers (GB) for the quarter<br />
per site.<br />
Figure 8-4: Total GridFTP transfers (GB) per day per<br />
site.<br />
8.5.2 Job Submission (GRAM) Statistics<br />
Figure 8-5: Number of GRAM submissions per<br />
resource per day.<br />
Figure 8-6: Total number of GRAM submissions<br />
per resource for the quarter.<br />
8.6 RP Operations: HPC Operations<br />
TG RPs continued to provide the personnel and infrastructure support for the local TG RP<br />
resources that are made available to the user community. System administration duties include<br />
daily monitoring of the health of the system, 24 hour on-call where provided, basic filesystem<br />
support, machine-specific parallel filesystem support, and operating system software maintenance<br />
and upgrades. Additionally, this activity provides for CTSS kit (grid middleware) deployment,<br />
administration and support. RP staff also assisted local and TeraGrid user services personnel to<br />
provide timely announcements to the user community related to the status of all TG resources.<br />
8.6.1 LONI<br />
5.1M SUs were delivered to 14.1K TeraGrid jobs serving 102 to 156 unique users per month. The<br />
Queen Bee system had 100% availability, without outage, in Q4 2010. A planned storage upgrade<br />
for Queen Bee was delayed due to issues with state purchasing. The purchase<br />
order for 200TB of storage was issued in early January 2011, and we expect the new storage to be<br />
available in Q1 2011.<br />
8.6.2 NCSA<br />
NCSA's SGI Altix UV system (Ember) went into production on November 1st. The system has a<br />
peak performance of 16 TFlops, 8 TB of memory in total, and 450 TB of GPFS file<br />
space. Each of the four UVs contains 384 cores and 2 TB of memory, and each ran the Top500 HPL<br />
benchmark at 87.5% efficiency, achieving 3.5 TFlops.<br />
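The per-UV numbers quoted above are mutually consistent: a 16 TFlop system split across four UVs gives a 4 TFlop peak per UV, and 87.5% HPL efficiency of that is 3.5 TFlops:

```python
# Cross-check the Ember figures quoted above.
peak_total_tflops = 16.0
n_uv = 4
peak_per_uv = peak_total_tflops / n_uv   # 4.0 TFlops per UV
hpl_per_uv = peak_per_uv * 0.875         # 87.5% HPL efficiency
print(hpl_per_uv)                        # 3.5
```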
Cobalt was retired on November 15th. All user accounts and allocations were transferred to Ember<br />
for seamless support of shared-memory-intensive applications.<br />
8.6.3 ORNL<br />
The largest operational activity for the ORNL NSTG environment continues to be training new<br />
SNS embedded systems administration staff. The learning curve for new staff to absorb<br />
TeraGrid's accumulated body of knowledge is imposing. NSTG staff are climbing this learning<br />
curve, but operational performance has not matched prior norms. The current<br />
emphasis is on getting new staff acquainted and comfortable with this environment, so more<br />
weight has been placed on learning the environment quickly than on rigid uptime goals. More<br />
rapid integration is seen as more important, particularly as NSTG looks to update the systems<br />
used to facilitate data movement between TeraGrid HPC resources and neutron science facilities.<br />
To that end, two new system administration staff attended SC10, taking in many<br />
TeraGrid-relevant educational events (such as a cluster construction tutorial).<br />
8.6.4 PSC<br />
The 512-node version of Anton, a special purpose machine provided to PSC’s National Resource<br />
for Biomedical Supercomputing (NRBSC) by special arrangement through D.E. Shaw Research<br />
and the National Institute of General Medical Sciences of the National Institutes of Health, went<br />
into production on Thursday 07-Oct-2010. Biomedical researchers were awarded time on Anton<br />
through a competitive peer review process. Designed to dramatically increase the speed of<br />
molecular dynamics (MD) simulations compared with the previous state of the art, Anton has been<br />
running without problems for the researchers who received computing time in the first<br />
phase of the project. PSC will host two one-day training workshops, one on<br />
Monday 21-Feb-2011 and one on Wednesday 23-Feb-2011, for the 32 PIs who received an<br />
allocation for the second phase of the Anton project, which starts on 01-Apr-2011.<br />
Markus Dittrich and graduate student Jun Ma of PSC's National Resource for Biomedical<br />
Supercomputing presented a research poster entitled "Computational study of active zone structure<br />
and function at the mammalian neuromuscular junction" at the Annual Meeting of the Society for<br />
Neuroscience in San Diego, Nov 13-17. Co-authors of the poster are J.R. Stiles of PSC and T.<br />
Tarr and S.D. Meriney of the Department of Neuroscience and the Center for Neuroscience at the<br />
University of Pittsburgh. The abstract of the poster says, “In our poster we present a<br />
computational model of synaptic function at the mouse neuromuscular junction using MCell. Our<br />
model reproduces established experimental results and we predict the arrangement of presynaptic<br />
calcium channels and investigate possible mechanism for paired pulse facilitation.”<br />
8.6.5 SDSC<br />
SDSC installed Trestles, a DCL awarded 100TFlop system. SDSC began friendly user testing on<br />
the Trestles system in early December and began TeraGrid production on Jan 1, 2011. SDSC<br />
brought up the Trestles system using Rocks 5.4/CentOS 5.5. SDSC updated the Dash vSMP<br />
software from 3.0.45.36 to 3.0.45.47 to improve performance and stability of the vSMP system.<br />
The Dash development system was upgraded through a series of versions (vSMP-3.0.45.10,<br />
vSMP-3.0.45.36, vSMP-3.0.45.42, vSMP-3.0.45.45, vSMP-3.0.45.47, vSMP-3.0.70.0, vSMP-<br />
3.0.110.0, vSMP-3.5.29.0) to test performance and new features of vSMP.<br />
8.6.6 TACC<br />
A new TeraGrid resource, Lonestar4, is scheduled to commence production on February 1, 2011.<br />
In order to begin installation of the system, the “old” Lonestar was removed from production<br />
status on December 1, 2010. TeraGrid PIs with an active Lonestar allocation balance on<br />
December 1, 2010 (and not expiring on January 1, 2011) will be granted an allocation on the new<br />
system. The new system will be configured with 1,888 Dell M610 blades, each with two 6-core<br />
3.33 GHz Westmere processors (12 cores per blade), 24 GB of memory, and 146 GB of storage.<br />
Compute nodes will have access to a petabyte Lustre parallel file system. The network interconnect<br />
will be QDR InfiniBand, a fully non-blocking fat-tree topology providing a point-to-point<br />
unidirectional bandwidth of 40 Gb/sec.<br />
8.7 Security<br />
8.7.1 Security Working Group<br />
A new proposal to support MPC on the portal was reviewed by the Security Working Group. The<br />
new design removes MPC from the portal and moves it to TeraGrid resources. One concern was<br />
the protection of the proxy credential while in transit between the portal and the resource. The<br />
group recommended that the portal delegate the credential via GRAM rather than GridFTP.<br />
Securing Community Accounts Survey: Victor Hazlewood reviewed the results of a survey sent<br />
to RP security leads in early November. Although not all responses have been received, he felt there<br />
was enough information to share with the working group. Nancy opened the discussion by stating that the<br />
goal of this exercise is a more uniform development experience across RPs for Science<br />
Gateways, building on the information gathered at the 2008 "Security Summit". The results of<br />
the survey and suggested policy recommendations for RPs are in draft status and available upon<br />
request.<br />
The security team spent a significant amount of time responding to serious Linux vulnerabilities.<br />
At one point these were being announced at a rate of about one per week, with no vendor patches<br />
available at announcement time to mitigate the risk. This created an awkward situation in which<br />
RPs had to find ways to protect against these threats until patches were forthcoming. For each of<br />
these vulnerabilities, the Security Working Group created a wiki page so the vulnerability status<br />
of all RPs could be tracked. This included identifying which of the production systems were at<br />
risk and what controls could be applied to mitigate the risks until an official patch was released.<br />
During the time of this report there were approximately 11 compromised user accounts and one<br />
login node compromise.<br />
The annual TeraGrid assessment project was completed and accepted during this quarter. This<br />
year's effort focused on an assessment of the TeraGrid User Portal (TGUP) operations and<br />
technologies. The TeraGrid User Portal has become an increasingly important piece of the<br />
TeraGrid infrastructure: a common place for many TeraGrid users to get live<br />
information as well as pointers to static information on the portal, POPS, or other<br />
TeraGrid-maintained web presences. In addition, a username/password login to the TGUP can be<br />
used to generate short-term proxy credentials granting access to TeraGrid resources,<br />
central or at the RPs, including queries of properly authorized user records in the TGCDB,<br />
orchestration of file transfers on the TeraGrid, and even command-line access to TeraGrid sites. The<br />
assessment found that TeraGrid staff were well aware of, and taking measures to prevent, common<br />
web-based application vulnerabilities. In addition, the TGUP has defined and addressed the issues of<br />
properly handling and proxying user credentials while accessing the portal. However, the<br />
assessment did identify issues that warranted further vigilance, including<br />
additional requirements for systems hosted by third parties (i.e., not a TeraGrid RP). During this<br />
quarter, previous drafts were reviewed and discussed in several forums, including the Security<br />
Working Group, and appropriate modifications were made.<br />
8.7.2 Expanded TeraGrid Access<br />
An update of the GSI-SSHTerm software to use the latest jGlobus 2.0 and BouncyCastle TLS<br />
libraries was completed. GSI-SSHTerm is one of the most popular applications in the TGUP, but<br />
the currently deployed version relies on out-of-date security libraries that do not support current<br />
recommended security algorithms such as SHA-256. In Q1 2011 we will be testing this new GSI-<br />
SSHTerm version for production roll-out in the TGUP. Jim Basney assisted the TGUP team with<br />
the necessary Kerberos integration to support the new Vetted/Unvetted account management<br />
process, and also assisted the TGUP team with InCommon/Shibboleth testing in<br />
preparation for rolling out the production InCommon login capability in the TGUP.<br />
8.8 RP Operations: Security<br />
During this quarter, TG RPs provided expert staff in support of local and TG-wide collaborative<br />
security. TG RP local security operations include intrusion detection, examining log data,<br />
investigating unusual use patterns, and incident response for all hardware and networks on a 24-<br />
hour on-call basis. All TG RP partners are required to participate in the TG-wide Security<br />
Working Group which coordinates security operations and incident response across the project.<br />
During this quarter there was an incident of a user losing control of his access credentials. This is<br />
not unusual; the security working group and incident response teams respond to a handful of<br />
these each year. However, during this quarter it happened that TGCDB access was not<br />
available when the incident surfaced. The absence of TGCDB access made<br />
connecting this user with his/her local username on each TeraGrid resource more difficult and<br />
consequently more time-consuming. While this did not create a serious problem for TeraGrid, as<br />
it was a single-user compromise and exploitation of TeraGrid hosts was not aggressive, it did<br />
highlight that a TGCDB outage has some impact on security incident handling capabilities for<br />
TeraGrid.<br />
8.8.1 NCAR<br />
NCAR was not directly involved in the multiple-account compromise security incidents in this<br />
quarter. Accounts for the affected users either had not been requested on NCAR resources or had<br />
been closed due to inactivity well before the incidents occurred. The prompt detection and<br />
response at other RPs prevented adversaries from compromising additional accounts that were<br />
active at NCAR. The ability of other RPs to share network connection details from adversary<br />
activity was crucial to ensure that other accounts at NCAR were not also compromised.<br />
NCAR’s network monitoring system is operating at full capacity with bro and argus.<br />
8.8.2 NCSA<br />
The NCSA security team responded to five different incidents involving TeraGrid users or<br />
resources in the fourth quarter of 2010. All five incidents involved remotely<br />
compromised user accounts, and one of the five resulted in the intruders escalating to root<br />
privileges on a TeraGrid resource. Two of the compromised accounts were reported by remote<br />
sites; the accounts were disabled, and no malicious activity was discovered for<br />
those accounts. The other three incidents were detected through the security monitoring done<br />
within NCSA, and other TeraGrid sites were then notified so they could take proactive<br />
mitigation steps. The downtime for the compromised machine was less than a day. NCSA<br />
worked with the users in each incident to get their remote systems cleaned and access to NCSA re-enabled.<br />
8.8.3 ORNL<br />
During the fourth quarter of calendar 2010, there were no root compromises on the NSTG nodes.<br />
There were no users who lost control of their credentials from activities on NSTG machines.<br />
Accounts of TeraGrid users who were the victims of stolen credentials at other TeraGrid sites did<br />
not log on to NSTG machines during suspect periods. NSTG staff took timely action to identify<br />
and ensure deactivation of accounts of users whose credentials were stolen and users whose<br />
credentials may have been exposed during a known incident.<br />
During this period, the ORNL RP, along with all of the TeraGrid RPs, continued to participate in<br />
TeraGrid-wide security activities including the security working group meetings and the weekly<br />
incident reporting calls.<br />
8.8.4 PSC<br />
During October, the PSC security team spent a significant amount of time responding to serious<br />
Linux vulnerabilities that had been announced. These announcements were occurring about once<br />
per week, and when announced, there were no vendor patches available. This created an awkward<br />
situation where we had to find ways to protect against these threats until patches were made<br />
available. For each of these vulnerabilities PSC’s Jim Marsteller created a wiki page so the status<br />
of all production systems could be tracked. This included identifying which of the production<br />
systems were at risk and what controls could be applied to mitigate the risks until an official<br />
patch would be released. This same process was duplicated for TeraGrid in order to coordinate<br />
responses from all resource providers. To date there have been no security incidents at PSC<br />
related to these vulnerabilities.<br />
PSC had no other security incidents during the quarter.<br />
After conducting an audit for accounts with no activity in the past six months, PSC disabled a<br />
number of inactive user accounts, leaving 1,200 fewer idle accounts for attackers to target. There<br />
has been only one call from a user whose account was incorrectly disabled (the account was used<br />
only to access a wiki maintained by PSC). We have modified our procedures so that access to a<br />
wiki is considered active use of an account.<br />
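An audit of this kind amounts to comparing each account's most recent activity, drawn from every source, against a six-month cutoff. A minimal sketch in Python; the data layout and function name are hypothetical, not PSC's actual tooling:

```python
from datetime import datetime, timedelta

SIX_MONTHS = timedelta(days=182)

def find_idle_accounts(accounts, now):
    """Return accounts with no activity in the past six months.

    accounts: dict mapping username -> list of activity timestamps
    gathered from every source (login records, wiki access, etc.), so
    that wiki-only users are still counted as active.
    """
    idle = []
    for user, activity in accounts.items():
        last_seen = max(activity) if activity else None
        if last_seen is None or now - last_seen > SIX_MONTHS:
            idle.append(user)
    return sorted(idle)

# Illustrative data: 'carol' is active only through a wiki, which
# under the revised procedure still counts as active use.
now = datetime(2010, 12, 31)
accounts = {
    "alice": [datetime(2010, 12, 1)],
    "bob":   [datetime(2009, 8, 15)],
    "carol": [datetime(2010, 1, 2), datetime(2010, 11, 20)],
}
print(find_idle_accounts(accounts, now))  # → ['bob']
```

The key design point reflected in the incident above: the activity list must aggregate all access channels, or legitimate users (such as wiki-only users) get disabled by mistake.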
Jim Marsteller held the annual “Security 4 Lunch” on 08-Nov-2010 for all PSC staff and students.<br />
The one-hour presentation covered general security best practices, including material for PSC<br />
staff traveling to SC10. This year the staff was very engaged, asking a number of security-related<br />
questions; in fact, the VPN demo ran over the allotted hour.<br />
8.8.5<br />
Purdue<br />
Purdue disabled 11 TG accounts due to compromises reported at other RP sites. Purdue also<br />
deleted SSH keys for one account due to a compromise at another RP site. Purdue patched several<br />
Linux kernel and libc 0-day vulnerabilities.<br />
8.8.6<br />
SDSC<br />
SDSC had five minor incidents this quarter, stemming from the compromise of multiple user<br />
accounts elsewhere. Six user accounts were proactively suspended at SDSC as a result. No<br />
evidence of abuse of these accounts was found prior to their deactivation.<br />
SDSC performed and completed a security audit of Trestles. Minor changes were made to the<br />
system configuration to enhance its security posture and improve the quality of information<br />
available for incident detection and response.<br />
8.8.7<br />
TACC<br />
A security compromise of the previous Lonestar system occurred on its two login nodes on<br />
October 27, 2010. A then-recent Linux kernel vulnerability was used by an intruder to gain root<br />
privileges and replace the ssh binary in order to collect user passwords. This tripped the automatic<br />
change-detection scripts on Lonestar, which notified admins of the compromise. TACC admins<br />
immediately cut off access to Lonestar, notified the TG security working group of the<br />
compromised user account, and began forensic analysis. During the investigation, the file the<br />
intruder used to collect passwords was found to contain three user accounts; however, the<br />
intruder may not have had time to retrieve those passwords before remote access was disabled.<br />
Those users were immediately notified to change their passwords as a precaution. A patch was<br />
applied to the Lonestar kernel to remove the vulnerability.<br />
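Change-detection scripts of the kind that caught this intrusion generally work by comparing current digests of critical binaries against a known-good baseline. A minimal sketch of that principle, not TACC's actual tooling; paths and names are illustrative:

```python
import hashlib

def digest(path):
    """SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def check(baseline):
    """Compare files against a baseline of known-good digests.

    baseline: dict mapping path -> expected hex digest.
    Returns the paths whose contents changed or disappeared,
    e.g. a system ssh binary replaced by a trojaned copy.
    """
    changed = []
    for path, expected in baseline.items():
        try:
            if digest(path) != expected:
                changed.append(path)
        except FileNotFoundError:
            changed.append(path)
    return changed
```

In practice such checks run periodically against a baseline stored off-host, since an intruder with root privileges could otherwise update the baseline along with the binaries.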
In addition, three user accounts compromised at other locations were disabled on TACC<br />
systems.<br />
8.9 Common User Environment<br />
The CUE working group continues to work toward releasing the CUE to the TeraGrid user<br />
community. The group wrapped up the beta testing period and is working with the TG Public<br />
Relations team to announce the development. The CUE implementation and procedures have<br />
been finalized on all current TG architectures, and documentation has been created explaining its<br />
use. The group also continues to work on improving the commonality of queuing and testing<br />
throughout the TG.<br />
8.10 System Performance Metrics<br />
8.10.1<br />
Resource Usage Summary<br />
TeraGrid saw a slight dip in delivered NUs for Q4 2010. TeraGrid compute resources delivered<br />
13.2 billion NUs to users, down 0.2 billion, or about 1%, from Q3 (Figure 8-7). However, the<br />
13.2 billion NUs represent a 1.7x increase over the NUs delivered in Q4 2009. Nearly 100% of<br />
those NUs were delivered to allocated users.<br />
Figure 8-7. TeraGrid Computational Resource Usage<br />
8.10.2 Summary of Usage / Users / Allocations by Discipline<br />
Figure 8-8 shows the relative fraction of PIs, current accounts, active users (individuals charging<br />
jobs), allocations, and NUs used, by discipline. The nine disciplines with more than 2% of<br />
NUs used are listed in the figure. Staff project usage is excluded, and users who appear on more<br />
than one allocated project may be counted more than once if those projects are associated with<br />
different fields of science. Consistent with past quarters, the same disciplines dominate the<br />
NUs-used column, though there have been some changes in the ordering. As in past quarters,<br />
more than 20% of PIs and current accounts, as well as 15% of active users, are associated with<br />
the 19 “other” disciplines, collectively representing about 4% of total quarterly usage.<br />
Figure 8-8. TeraGrid Usage Summary by Discipline<br />
8.10.3 Summary of Top PIs / Projects, Overall and by Discipline<br />
A total of 873 different projects charged usage on TeraGrid computational resources, down just<br />
slightly from 882 in Q3 2010. Figure 8-9 shows the top 20 PIs by total usage in NUs. The top 20<br />
PIs consumed 51% of the NUs used, and the remaining 853 projects consumed the other 49%.<br />
Table 8-1 shows the top PIs (>2% of the discipline’s usage) in each discipline with more than 2%<br />
of total usage; the final section shows the top PIs for all other disciplines. The overall rank of<br />
each PI among the 873 projects charged is also listed.<br />
Figure 8-9. Top 20 TeraGrid PIs, Q4 2010.<br />
Table 8-1. Top PIs by Discipline for Q4 2010<br />
Physics - 4,434,534,994<br />
PI, Institution NUs Rank<br />
Robert Sugar, UC Santa Barbara 1,638,923,621 1<br />
Colin Morningstar, CMU 640,211,020 2<br />
Martin Savage, U Washington 428,451,797 4<br />
Annick Pouquet, NCAR 341,273,022 6<br />
Erik Schnetter, Louisiana State U 224,922,154 10<br />
Gregory Howes, U Iowa 163,168,914 18<br />
Klaus Bartschat, Drake U 114,348,314 27<br />
Keh-Fei Liu, U Kentucky 112,853,570 28<br />
All 58 Others 838,673,711<br />
Molecular Biosciences - 2,658,196,474<br />
PI, Institution NUs Rank<br />
Klaus Schulten, UIUC 345,370,971 5<br />
Gregg Beckham, NREL 231,860,063 8<br />
Thomas Cheatham, U Utah 227,306,075 9<br />
Aleksei Aksimentiev, UIUC 224,136,759 11<br />
Michael Klein, Temple U 205,449,920 12<br />
Wonpil Im, U Kansas 157,946,455 19<br />
Michael Feig, Michigan State U 130,993,491 24<br />
Adrian Roitberg, U Florida 107,003,924 29<br />
Adrian Elcock, U Iowa 82,770,951 35<br />
Emad Tajkhorshid, UIUC 74,511,642 39<br />
Joan-Emma Shea, UC Santa Barbara 65,880,698 44<br />
Shantenu Jha, Louisiana State U 59,760,050 47<br />
Benoit Roux, U Chicago 57,868,742 48<br />
All 146 Others 685,819,292<br />
Astronomical Sciences - 1,685,868,532<br />
PI, Institution NUs Rank<br />
Adam Burrows, Princeton U 170,416,566 16<br />
Matthias Rempel, NCAR 148,447,510 21<br />
Thomas Quinn, U Washington 139,205,725 23<br />
Dean Townsley, U Alabama 115,117,819 25<br />
John Hawley, U Virginia 89,879,233 32<br />
Juri Toomre, U Colorado 86,428,294 33<br />
Tiziana Di Matteo, CMU 47,615,465 56<br />
Alexei Kritsuk, UC San Diego 45,032,479 60<br />
Kristian Beckwith, U Colorado 38,884,415 66<br />
Patrick Fragile, College of Charleston 38,223,664 68<br />
All 64 Others 456,883,341<br />
Atmospheric Sciences - 1,509,180,022<br />
PI, Institution NUs Rank<br />
Homayoun Karimabadi, UC San Diego 566,654,690 3<br />
James Kinter, COLA 204,125,064 13<br />
Amy McGovern, U Oklahoma 194,388,010 15<br />
Jon Linker, Predictive Science, Inc. 169,502,106 17<br />
Cecilia Bitz, U Washington 149,774,821 20<br />
Frank Bryan, NCAR 140,617,805 22<br />
Fuqing Zhang, Penn State U 99,263,199 30<br />
Ming Xue, U Oklahoma 60,618,290 46<br />
Nikolai Pogorelov, U Alabama, Huntsville 39,230,549 65<br />
All 42 Others 127,474,964<br />
Chemical, Thermal Systems - 705,619,623<br />
PI, Institution NUs Rank<br />
Daniel Bodony, UIUC 92,044,666 31<br />
Stephen Pope, Cornell U 79,483,414 36<br />
Olivier Desjardins, U Colorado 74,007,481 40<br />
George Karniadakis, Brown U 56,358,256 49<br />
Rajat Mittal, Johns Hopkins U 47,801,999 55<br />
Madhusudan Pai, Stanford U 38,363,095 67<br />
Kenneth Jansen, U Colorado 29,716,357 81<br />
Tengfei Luo, MIT 26,382,524 87<br />
Krishnan Mahesh, U Minn 21,218,059 104<br />
Rigoberto Hernandez, Georgia Tech 19,726,039 109<br />
Antonino Ferrante, U Washington 15,620,546 127<br />
Andreas Heyden, U South Carolina 15,493,910 129<br />
Prosenjit Bagchi, Rutgers U 14,115,376 134<br />
All 71 Others 179,349,917<br />
Materials Research - 701,534,288<br />
PI, Institution NUs Rank<br />
Juana Moreno, Louisiana State U 84,569,902 34<br />
Chris Van de Walle, UC Santa Barbara 48,495,399 54<br />
Stephen Paddison, U Tennessee 47,312,719 57<br />
Ivan Oleynik, U South Florida 37,949,397 69<br />
James Chelikowsky, UT Austin 37,422,656 70<br />
Stefano Curtarolo, Duke U 36,784,789 74<br />
Lubos Mitas, NC State U 28,679,061 83<br />
Dallas Trinkle, UIUC 26,921,164 85<br />
Marvin L. Cohen, UC Berkeley 24,977,598 91<br />
Jacek Jakowski, U Tennessee 24,071,540 93<br />
Boris Yakobson, Rice U 19,850,871 107<br />
Gerbrand Ceder, MIT 19,007,395 110<br />
Bilge Yildiz, MIT 18,783,043 111<br />
Peter Cummings, Vanderbilt U 16,399,990 122<br />
John D. Joannopoulos, MIT 16,098,219 125<br />
Dane Morgan, U Wisconsin-Madison 14,480,280 133<br />
All 96 Others 198,965,451<br />
Chemistry - 605,512,370<br />
PI, Institution NUs Rank<br />
Gregory A. Voth, U Chicago 253,092,527 7<br />
Carlos Simmerling, SUNY Stony Brook 70,610,828 41<br />
Guillermina Estiu, U Notre Dame 41,112,637 64<br />
Arun Yethiraj, U Wisconsin-Madison 22,711,862 97<br />
B. Montgomery Pettitt, U Houston 21,431,386 101<br />
Kendall N. Houk, UCLA 15,505,351 128<br />
Jennifer Wilcox, Stanford U 14,050,308 135<br />
All 123 Others 166,707,676<br />
Advanced Scientific Computing - 229,391,250<br />
PI, Institution NUs Rank<br />
Ali Uzun, Florida State U 115,023,030 26<br />
Bruce Boghosian, Tufts U 50,598,758 52<br />
Stefan Boeriu, UC Santa Barbara 21,263,524 103<br />
Martin Berzins, U Utah 12,905,492 142<br />
Xiaowen Wang, UCLA 8,646,245 183<br />
Richard Loft, NCAR 8,094,723 188<br />
All 33 Others 12,438,757<br />
Biological and Critical Systems - 217,030,681<br />
PI, Institution NUs Rank<br />
Thomas Jordan, USC 203,515,374 14<br />
Tony Keaveny, UC Berkeley 10,012,261 160<br />
All 3 Others 949,158<br />
All Others - 471,977,749<br />
PI, Institution (Discipline) NUs Rank<br />
Mark Miller, SDSC (Environmental Biology) 43,578,333 62<br />
Clinton N. Dawson, UT Austin (Mathematical Sciences) 37,173,926 71<br />
Narayana Aluru, UIUC (Cross-Disciplinary Activities) 36,860,594 72<br />
Leung Tsang, U Washington (Earth Sciences) 34,595,141 76<br />
Omar Ghattas, UT Austin (Earth Sciences) 32,853,585 78<br />
Baylor Fox-Kemper, U Colorado (Ocean Sciences) 25,077,517 90<br />
Morgan Moschetti, USGS (Earth Sciences) 16,987,228 119<br />
Alexa Griesel, SIO (Ocean Sciences) 15,990,957 126<br />
Ronald E. Cohen, CIW (Earth Sciences) 12,465,931 143<br />
Shanhui Fan, Stanford U (Electrical and Comm. Systems) 11,473,088 151<br />
Lynn Wright, SURA (Ocean Sciences) 11,179,288 154<br />
Liwen Shih, U Houston-Clear Lake (Computer and Computation Research) 9,800,482 164<br />
All 141 Others 184,399,738<br />
8.10.4 Usage by Institution<br />
Users from 319 institutions across the U.S. and abroad were associated with the usage delivered<br />
by TeraGrid resources. The top 47 institutions (by NUs consumed) were responsible for 80% of<br />
the total NUs delivered. Table 8-2 lists the 319 institutions, the number of users responsible for<br />
the usage at each site, and the NUs consumed.<br />
Table 8-2. TeraGrid Usage by Institution, Q4 2010<br />
Institution Users NUs<br />
Indiana U 14 1,209,917,563<br />
UIUC 241 947,274,596<br />
NCAR 18 803,008,614<br />
U Washington 35 773,850,669<br />
Jefferson Lab 1 583,245,165<br />
LANL 9 575,785,646<br />
U Utah 22 379,976,674<br />
BNL 1 326,051,936<br />
U Colorado 47 224,511,529<br />
Princeton U 19 223,859,648<br />
UC San Diego 90 205,965,605<br />
U Chicago 43 187,560,066<br />
U Wisconsin-Madison 41 184,085,630<br />
SUNY Stony Brook 15 177,031,382<br />
NREL 10 170,118,242<br />
U Kansas 15 161,622,666<br />
Louisiana State U 41 159,911,762<br />
U Iowa 16 155,173,281<br />
UC Santa Barbara 45 149,052,069<br />
U Maryland, College Park 25 147,531,049<br />
Atmospheric Tech Services Co. 1 142,054,709<br />
U Arizona 9 136,287,533<br />
Michigan State U 11 135,586,126<br />
Georgia Tech 39 130,820,561<br />
U Oklahoma 39 129,222,144<br />
Temple U 14 121,647,604<br />
CMU 23 117,688,814<br />
Florida State U 4 115,054,042<br />
U Virginia 19 114,394,924<br />
MIT 40 114,175,543<br />
Arizona State U 14 110,404,033<br />
Caltech 28 110,398,424<br />
Boston U 18 108,077,635<br />
U Florida 23 107,731,519<br />
SDSC 11 107,401,874<br />
Cornell U 36 107,279,390<br />
Penn State U 15 105,150,433<br />
UC Berkeley 49 101,429,549<br />
Predictive Science, Inc. 2 96,627,908<br />
University College, London (United Kingdom) 10 88,686,385<br />
Purdue U 41 83,293,637<br />
UT Arlington 6 82,759,053<br />
Vanderbilt U 13 82,375,139<br />
UNC 11 75,952,289<br />
Stanford U 26 74,903,474<br />
Drake U 2 74,019,013<br />
College of Wm & Mary 6 73,457,068<br />
Brown U 17 69,772,189<br />
UT Austin 31 69,228,821<br />
MPI Gravitationsphysik (Germany) 3 68,792,486<br />
Duke U 3 67,436,232<br />
UC Davis 35 66,442,525<br />
U Wyoming 1 63,923,890<br />
Florida Atlantic U 5 61,836,421<br />
Lock Haven U 3 57,808,300<br />
U Tennessee 17 56,636,866<br />
U Rochester 6 53,664,639<br />
UCLA 42 52,849,964<br />
George Washington U 1 52,431,067<br />
Johns Hopkins U 37 52,181,285<br />
U Notre Dame 9 52,088,050<br />
Baylor College of Med 3 50,059,787<br />
Harvard U 16 47,396,464<br />
USC 8 46,285,740<br />
U Pittsburgh 31 45,881,069<br />
RIT 6 45,782,967<br />
Albert Einstein Institute (Germany) 5 45,400,150<br />
Weill Cornell Med College 4 44,136,642<br />
U South Florida 9 44,067,247<br />
UC Irvine 15 40,780,159<br />
U Alabama, Huntsville 7 39,440,356<br />
NYU 13 38,269,787<br />
College of Charleston 4 38,223,664<br />
U Minn 7 33,739,652<br />
Columbia U 19 29,373,799<br />
U Kentucky 5 29,092,242<br />
COLA 2 28,831,540<br />
U Illinois, Chicago 18 28,606,508<br />
U New Hampshire 8 27,452,549<br />
U Michigan 32 25,826,136<br />
SIO 3 24,684,192<br />
Slovak Acad of Sci (Slovakia) 2 24,591,584<br />
Tulane University 5 23,124,481<br />
San Diego State U 6 22,929,779<br />
Drexel U 12 22,909,409<br />
NC Central U 2 21,605,418<br />
Colorado College 1 21,333,888<br />
RPI 8 21,302,205<br />
U Illes Balears (Spain) 1 21,013,662<br />
Duquesne U 7 20,948,678<br />
Texas A&M 6 20,900,934<br />
ANL 2 20,681,907<br />
Inst HEP (China) 1 20,652,667<br />
USGS 2 20,588,221<br />
Rice U 5 20,020,985<br />
NC State U 24 19,046,770<br />
Toyota Tech Inst, Chicago 3 17,118,826<br />
Desert Res Inst 1 16,858,090<br />
U Houston-Downtown 2 16,645,898<br />
Rutgers U 16 15,970,517<br />
UT El Paso 13 15,941,729<br />
Missouri S&T 6 15,805,150<br />
U South Carolina 6 15,594,044<br />
U Babes-Bolyai (Romania) 1 15,315,566<br />
Washington U 23 14,095,068<br />
Perimeter Inst (Canada) 1 13,197,121<br />
University of Sussex 1 13,072,599<br />
Lehigh U 3 12,525,876<br />
CIW 3 12,465,930<br />
U Toronto (Canada) 2 11,930,190<br />
LLNL 3 11,416,587<br />
Iowa State U 9 11,392,361<br />
Wesleyan U 4 10,878,061<br />
AMNH 4 10,789,837<br />
S-Star Alliance 2 10,332,690<br />
U Pennsylvania 6 10,303,691<br />
Univ of Vienna 1 10,158,154<br />
Monash U (Australia) 5 10,147,380<br />
U Delaware 15 10,047,796<br />
U Texas Hlth Sci Ctr Houston 7 9,921,915<br />
U Tenn, Oak Ridge 3 9,917,531<br />
WPI 2 9,910,015<br />
Rockefeller U 1 9,879,235<br />
U Houston-Clear Lake 1 9,800,482<br />
U Central Florida 5 9,442,235<br />
Mt Sinai Sch of Med 4 9,345,555<br />
CWRU 2 8,582,542<br />
SUNY at Binghamton 4 8,063,447<br />
Michigan Tech 33 7,922,384<br />
U Jena (Germany) 2 7,791,485<br />
U Heidelberg (Germany) 2 7,641,263<br />
Yale U 5 7,246,761<br />
UC Merced 2 7,054,908<br />
Aerospace Corp. 1 7,026,930<br />
UC Riverside 4 7,008,793<br />
Colorado State U 11 7,001,508<br />
Venter Inst 1 6,811,981<br />
Polish Acad of Sci (Poland) 2 6,489,834<br />
UT San Antonio 4 6,308,729<br />
Thomas Jefferson University 1 6,141,766<br />
Georgetown U 6 6,115,811<br />
CSU-Fullerton 2 6,084,006<br />
UC San Francisco 7 6,039,313<br />
Fermilab 1 5,954,681<br />
Northwestern U 11 5,800,669<br />
U Mass, Amherst 11 5,609,537<br />
Montana State U 6 5,221,324<br />
Oregon State U 6 5,216,266<br />
Indiana U, Kokomo 2 4,894,259<br />
Florida A&M U 3 4,884,468<br />
UNLV 2 4,867,971<br />
U Maryland, Baltimore Co 13 4,775,739<br />
Corning, Inc. 1 4,714,311<br />
U Houston 3 4,679,787<br />
St Louis U 7 4,678,418<br />
University of Montana 3 4,653,138<br />
US DoE 1 4,415,527<br />
U New Mexico 3 4,150,006<br />
UNC System Office 1 4,081,758<br />
Oxford U (United Kingdom) 1 4,003,403<br />
Washington State U 8 3,661,784<br />
LSST Corp. 2 3,396,236<br />
Naval Postgrad School 2 3,380,823<br />
Chosun U (South Korea) 1 3,256,671<br />
Central Michigan U 2 3,218,427<br />
IIMCB (Poland) 1 3,098,208<br />
U Mass, Dartmouth 3 2,988,924<br />
U Zurich (Switzerland) 2 2,981,723<br />
UT Southwestern Med Ctr 2 2,762,290<br />
SciberQuest Inc 1 2,716,447<br />
Boise State U 3 2,582,013<br />
Old Dominion U 2 2,448,650<br />
NIST 1 2,322,942<br />
U Nebraska, Lincoln 1 2,320,162<br />
U Hawaii, Honolulu 1 2,195,680<br />
U Puerto Rico System 9 2,193,279<br />
PSC 3 2,164,350<br />
Brandeis U 6 2,155,126<br />
Clark Atlanta University 2 2,076,030<br />
Eastern Michigan U 4 2,064,236<br />
Kansas State U 5 1,998,763<br />
NETL 1 1,888,765<br />
CSU-Long Beach 1 1,768,921<br />
U Vermont 1 1,660,730<br />
U Cincinnati 3 1,644,887<br />
San Francisco State U 1 1,642,179<br />
Ill. Geological Survey 1 1,639,597<br />
U Nevada-Reno 1 1,629,641<br />
Institute for Advanced Study 1 1,626,099<br />
University of Wisconsin-Eau Claire 2 1,599,091<br />
ORNL 12 1,595,000<br />
U Georgia 8 1,481,844<br />
Bucknell U 1 1,374,194<br />
Emory U 7 1,364,824<br />
Long Island U 3 1,317,116<br />
Kingsborough Comm College 2 1,306,818<br />
CSU-Northridge 3 1,303,083<br />
Florida Tech 3 1,301,897<br />
U North Dakota 8 1,272,509<br />
Northeastern U 4 1,245,598<br />
Lamar University-Beaumont 2 1,160,282<br />
Virginia Tech 7 1,078,605<br />
SUNY at Albany 3 1,073,476<br />
Princeton Plasma Physics Lab 1 1,063,815<br />
Virginia Commonwealth U 1 1,027,087<br />
Albany College of Pharmacy & Health Sci 2 1,016,926<br />
Clarkson U 1 1,016,256<br />
La Jolla Bioeng Inst 1 927,707<br />
Wayne State U 4 904,128<br />
UC Santa Cruz 4 893,570<br />
UT System 1 847,592<br />
Southwest Res Inst 3 841,579<br />
Computational Sci & Eng 2 828,196<br />
New College of Fla. 1 826,954<br />
U Michigan, Flint 1 656,518<br />
U Oregon 6 602,263<br />
U Paris 1 587,040<br />
ESSC (United Kingdom) 1 576,754<br />
U Puerto Rico, Mayaguez 2 555,170<br />
CUNY 2 546,644<br />
Bangor U (United Kingdom) 2 518,043<br />
U Hawaii, Manoa 4 509,838<br />
U North Texas 2 501,365<br />
WHOI 1 473,179<br />
LBNL 2 468,742<br />
Auburn U 5 443,256<br />
U Arkansas 1 434,564<br />
Clark U 1 408,661<br />
Santa Clara U 1 402,033<br />
Colorado Sch of Mines 2 395,388<br />
Einstein College of Med 1 374,052<br />
NASA Langley 1 364,696<br />
US Army 1 342,581<br />
US FDA 2 340,654<br />
SD State U 2 333,678<br />
Southern Methodist U 2 331,404<br />
Kent State U 1 300,034<br />
Barnard Coll 1 295,275<br />
Vienna U of Technology (Austria) 2 288,265<br />
Washington U School of Med 1 282,344<br />
Oregon Health & Sci U 2 281,518<br />
Vorcat, Inc. 1 269,436<br />
Albany State U 2 241,109<br />
SIU Carbondale 1 236,143<br />
BRI City of Hope 1 221,516<br />
Ursinus College 1 220,499<br />
Wichita State U 2 200,544<br />
Bryn Mawr College 1 194,120<br />
Ohio U 2 189,413<br />
Dartmouth College 2 172,468<br />
U Maryland School of Medicine 2 163,026<br />
U Alaska, Fairbanks 1 161,490<br />
Natl Space Sci & Tech Ctr 1 157,459<br />
SUNY ESF 2 149,160<br />
Queens U, Belfast (United Kingdom) 1 141,696<br />
Alfred University 1 120,718<br />
U Maine 1 113,646<br />
Keldysh Inst of Applied Math (Russia) 1 112,734<br />
RENCI 1 98,145<br />
N Ga College & State U 5 84,916<br />
Climate Central 1 73,975<br />
Francis Marion U 1 72,880<br />
UNC, Charlotte 1 71,942<br />
Monell Chemical Senses Ctr 1 68,266<br />
Jackson State U 2 65,952<br />
TU Ilmenau (Germany) 1 48,414<br />
IIT 2 48,267<br />
KAIST (South Korea) 1 46,701<br />
Dickinson College 3 46,669<br />
NJIT 1 45,487<br />
U Split (Croatia) 1 43,575<br />
CTC 1 43,397<br />
Bloomsburg U 1 39,959<br />
Brigham and Women's Hospital 1 37,286<br />
Macalester College 3 35,752<br />
Mississippi State U 3 35,397<br />
PNL 1 32,836<br />
Marquette University 3 22,496<br />
Cold Spring Harbor Lab 1 19,892<br />
Siena College 4 15,554<br />
SUNY at Buffalo 3 14,139<br />
Illinois State U 1 13,891<br />
CUNY City College 2 12,005<br />
Marian College 1 11,893<br />
Oklahoma State U 3 8,607<br />
Connecticut College 1 7,747<br />
U Hawaii Research Corp 1 7,512<br />
JPL 1 6,993<br />
U Minn, Duluth 1 6,953<br />
U Provence (France) 1 5,481<br />
U Detroit Mercy 2 5,189<br />
Fort Hay State U 1 4,828<br />
Kitware 1 4,699<br />
WV State College 1 4,295<br />
HDF Group 1 4,070<br />
Tufts U 1 3,191<br />
Texas State U 1 2,964<br />
Rose-Hulman Inst of Tech 1 2,563<br />
Howard U 1 2,313<br />
U Mass, Lowell 1 2,185<br />
Clemson U 2 1,569<br />
NW Research Associates 2 564<br />
U Pittsburgh Sch of Med 1 564<br />
ETH Zurich (Switzerland) 1 324<br />
SLAC 1 297<br />
U Paul Sabatier (France) 1 283<br />
Ohio State U 1 252<br />
Julich Supercomputing Centre (Germany) 1 248<br />
Grand Valley State U 1 229<br />
Allegheny College 1 142<br />
National U 2 76<br />
SSSU (India) 2 69<br />
Salk Institute 1 56<br />
U Miami 1 40<br />
CENTRA - Inst Sup Tecnico (Portugal) 1 39<br />
ND State U 1 17<br />
Widener U 1 11<br />
Inst Bioorganic Chem, Polish Acad Sci (Poland) 1 2<br />
NASA Ames 1 2<br />
9 Evaluation, Monitoring, and Auditing<br />
9.1 XD – TIS<br />
9.1.1 Technology Identification and Tracking<br />
Update<br />
The initial version of the XTED database and UI was released in time for SC 2010 with the<br />
functionality outlined below. Having achieved the base set of functionality, we have been<br />
working toward implementing the next batch of use cases to extend the functionality of the<br />
XTED website. In addition to various bug fixes and schema modifications resulting from<br />
feedback on the initial release, we converted the UI code from a servlet to a portlet<br />
implementation in order to better integrate with the existing features of the Liferay portal server.<br />
We worked with the Technology Evaluation Process (TEP) group to develop use cases to support<br />
the functionality needed by that part of the TIS project. Finally, we have begun similar<br />
conversations with the Usage, Satisfaction, and Impact Evaluation team to identify and document<br />
the use cases needed to support their activities, to be completed in Q1 2011.<br />
Development<br />
Use Cases<br />
The following provides descriptions of the functionality released during the quarter and the<br />
functionality planned for the next release. Additionally, we highlight the use cases developed in<br />
conjunction with the TEP team.<br />
Released for SC 2010<br />
1. Search Technologies - Any user can view the list of technologies available via the XD<br />
user portal and can filter the list by keywords.<br />
2. View Technology Details - Any user can view detailed information about a<br />
technology.<br />
3. Create Technology Record - An authenticated user adds a new technology which will<br />
be reviewed by TIS staff.<br />
4. Edit Technology Record - An authenticated user with role of PI or Point of Contact<br />
(POC) for a specific technology can edit an existing technology.<br />
Next Release (Feb 2011)<br />
1. Approve Technology - TIS staff member must approve new technologies before they<br />
become publicly viewable.<br />
2. Request Technology - Allows an authenticated user to request a technology that may<br />
or may not exist in XTED.<br />
3. Respond to Request - TIS staff responds to a technology request, adding a new<br />
technology entry if applicable with a follow-up email sent to requester.<br />
4. Review Suggested Keywords - TIS staff cleans up user submitted keywords to create<br />
an evolving controlled vocabulary.<br />
5. Delete Technology - Technology record creators and TIS staff can soft/hard delete a<br />
record.<br />
Technology Evaluation Process Use Cases (Q1 2011 release)<br />
1. Add Review Content – The Review Contact adds the review content causing the Tech<br />
status to be updated from 'Under Review' to 'Reviewed'<br />
2. Edit Review Content - Update information for a Reviewed Tech<br />
3. Assign Reviewer - Assign or reassign a 'Review Contact' for an existing Tech record<br />
4. Update Review Status - Set Review status from 'Not Reviewed' to 'Selected for<br />
Review' or 'Under Review'<br />
5. TIS Staff Create Technology Record - TIS staff can create a tech record that doesn't yet<br />
exist, but cannot edit the 'general details'.<br />
9.1.2 Technology Evaluation Laboratory<br />
The Technology Evaluation Laboratory is composed of a combination of reserved resources<br />
housed at TACC, the FutureGrid systems, and resource provider (RP) systems. These systems<br />
represent the HPC computing environment found in XD and will facilitate the Technology<br />
Evaluation Process (TEP) conducted by the TIS team. The test cluster housed at TACC is<br />
currently being installed and prepared to emulate as closely as possible the different environments<br />
found in the TeraGrid today with plans to extend the environments to match those systems found<br />
in XD as well. This system provides the TEP group unrestricted access to hardware when other<br />
systems are either not available or are unable to perform the required testing. Underway is the<br />
creation of documentation explaining how the TEP team will utilize this hardware. The Track 2D<br />
grid test bed, FutureGrid, brought most of its systems online in the last quarter and can now<br />
support multiple virtual machines (VMs), some of which are maintained by the FutureGrid team<br />
and others of which are supplied by users. Current work with FutureGrid has involved testing<br />
cloud technologies, including Eucalyptus and Nimbus, to determine the strengths and weaknesses<br />
of each and to decide which technology the TEP team will use in its evaluations. The team is<br />
working to document how we intend to use FutureGrid for TIS activities. In addition to<br />
documentation regarding the test cluster and FutureGrid, we are compiling a list of all RP<br />
hardware available for testing, recording information such as access procedures, points of<br />
contact, and type of access.<br />
9.1.3 Technology Evaluation Process<br />
The Technology Evaluation Process team is charged with developing an evaluation process for<br />
the various software and hardware candidates that TIS will be asked to evaluate. We are also<br />
tasked to coordinate with other groups as necessary to achieve our goals and theirs.<br />
Progress on the Process<br />
The Technology Evaluation Process (TEP) team has developed a formal process for testing<br />
candidate software; it is under final review by all members of TIS. The process consists of six<br />
steps:<br />
• Develop an initial test plan<br />
o A brief description of the team members involved, time frames and duties<br />
o A listing of the requirements the software needs to meet<br />
• Perform an initial study<br />
o A subset of the team will perform an in-depth study of the candidate<br />
• Create initial documents<br />
o The same subset of the team will generate several documents that will be used in the testing and insertion processes<br />
• Define the final test plan<br />
o A formal test plan will be created by all team members that includes metrics, scripts and features to be evaluated<br />
o Subject matter experts (SMEs) will be used as available so that the tests are appropriate and accurate<br />
• Perform the tests<br />
o The testing will be performed on multiple systems by more than one team member for rigor<br />
• Generate the final reports<br />
o Internal reports will be generated on the testing. External reports will be generated as the testing of the candidate merits<br />
o Any final report will be publicly available through the XTED database<br />
Evaluation of the TEP<br />
The evaluation tests were completed and demonstrated some areas that needed improvement.<br />
Those changes have been made to the process. We have coordinated with the TIS group on the<br />
information they require, and have documented that. We are coordinating with the XTED group<br />
on the changes that we need in the database.<br />
Goals<br />
Our goal for this quarter is to have the formal version of the process accepted by all of TIS in<br />
January and to start testing real candidates thereafter. We are also coordinating with the rest of<br />
the group on the software candidate identification process, so that we receive a preliminary set of<br />
candidates, and we will obtain a list of available resources and personnel for testing. During this<br />
quarter, we will track the rate of testing so that we can accurately estimate how much future<br />
testing can be done with TIS resources. We will also identify any changes needed in the process<br />
which are noted during “real-world” testing.<br />
9.1.4 Technology Insertion Support<br />
The Technology Insertion Support (YNT) team facilitates transition of supported and<br />
recommended technologies from evaluation to deployment by TeraGrid and XD service providers<br />
and users. YNT works with the evaluation teams to ensure that deployment, design, installation,<br />
configuration, maintenance, and usage methods for supported and recommended technologies are<br />
sufficiently documented, together with validation tests and usage examples.<br />
YNT publications are intended for use by systems administration staff at TeraGrid and XD<br />
service providers to guide deployment of technologies for use with their resources. For<br />
technologies intended for end-user installation and use, YNT publications are intended for use by<br />
systems administration staff supporting end-users. Documentation for usage examples will be<br />
prepared for use by end-users directly.<br />
The YNT team has drafted an initial definition for deployment objects, which include components<br />
necessary to sufficiently document the deployment design, installation, configuration,<br />
maintenance, testing and usage methods for a supported and recommended technology, along<br />
with its deployment profiles, which document deployment instances and their current status per<br />
service provider. Deployment profiles include defined deployment roles and associated contact<br />
information per deployment object per service provider.<br />
The TIS team continues to work with the evaluation teams to develop procedures and<br />
instrumentation in the evaluation process to support documenting deployment object components,<br />
including validation tests and usage examples. Additional useful metadata from the evaluation<br />
process, such as the time, expertise, and effort required to perform each evaluation, will be collected<br />
during evaluations to inform prospective implementers of the deployment effort and expertise<br />
required. The TIS team continues to review existing deployment models in TeraGrid and other<br />
CI projects for guidance in establishing appropriate TIS procedures.<br />
To assist the overall evaluation process, the TIS team has initiated work on a metric to guide<br />
selection and to establish priorities among technology evaluation candidates. The objective of this<br />
metric is to establish a consistent procedure by which to vet candidate technologies for scope,<br />
applicability, community interest and availability of required evaluation resources and expertise.<br />
The calculated metric value for each candidate technology will rank it relative to all others under<br />
consideration. The rank-ordered list of technologies will establish the work priorities for the<br />
evaluation teams, and will be reviewed and updated regularly to ensure that teams are working on<br />
the most important technology evaluations, as metric factors change over time.<br />
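The ranking procedure described above can be sketched as a simple weighted score. The factor names, weights, and 0-5 reviewer scores below are illustrative assumptions, not the actual TIS metric, which had not been finalized at the time of this report:

```python
# Hypothetical sketch of a candidate-ranking metric: weighted factor
# scores produce a single value used to order the evaluation queue.
WEIGHTS = {
    "scope": 0.20,                  # breadth of the TG/XD audience served
    "applicability": 0.30,          # fit to documented user requirements
    "community_interest": 0.30,     # demand expressed by users and SPs
    "resource_availability": 0.20,  # evaluation resources and expertise on hand
}

def metric(scores: dict) -> float:
    """Weighted sum of factor scores for one candidate technology."""
    return sum(WEIGHTS[f] * scores[f] for f in WEIGHTS)

def rank(candidates: dict) -> list:
    """Candidate names ordered from highest to lowest metric value."""
    return sorted(candidates, key=lambda name: metric(candidates[name]), reverse=True)

candidates = {
    "wide-area file transfer tool": {"scope": 4, "applicability": 5,
                                     "community_interest": 5, "resource_availability": 3},
    "workflow engine":              {"scope": 3, "applicability": 4,
                                     "community_interest": 4, "resource_availability": 4},
}
# Re-running rank() as factor scores change keeps the work queue current.
priorities = rank(candidates)
```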
9.1.5 Usage, Satisfaction, and Impact Evaluation<br />
Led by the TIS user advocate, the Usage, Satisfaction and Impact Evaluation team is charged<br />
with engaging the user community in the evaluation and testing process, by various methods<br />
including focus group teleconferences aimed at finding out what functional requirements are not<br />
being met. We have conducted additional meetings with users representing the groups and topics of<br />
interest we identified last quarter, continuing the process of learning how best to prepare,<br />
organize, and conduct such meetings, and how to feed their outcomes into the TIS identification<br />
and tracking, evaluation, and testing activities described above.<br />
As a result of the five meetings with groups of active users and Campus Champions that we have<br />
conducted so far, we have formulated specific requirements and some candidate technologies to<br />
be evaluated in the areas of wide-area file transfers, workflow engines, and single sign-on to<br />
TeraGrid resources.<br />
We have focused on the best ways to integrate community input into the TIS process and to<br />
enable community input and feedback via XTED. In Figure 9.1, which was used as a basis for<br />
discussion at the TIS meeting held at SC10, red denotes user roles that must be enabled by XTED<br />
(green represents TIS staff roles and blue represents technology developer community roles). The<br />
figure illustrates that XTED’s role-based access controlled tools should allow our staff to enter<br />
the requirements obtained via our user engagement processes, as well as allowing (in a later<br />
stage) users to enter requirements on their own. Furthermore, we may ask some of the users to<br />
participate in the evaluation and testing of the technologies that, based on TIS staff evaluation,<br />
have emerged as the leading candidates to fulfill the functional requirements these users have<br />
identified as being critical to their work. Finally, the community of current and potential<br />
TeraGrid/XD users (as well as the developer community) needs to be empowered to comment on<br />
the evaluation and testing results we will publish via XTED. The process is cyclical, reflecting<br />
the dynamic nature of the XD cyberinfrastructure, of its stakeholders’ needs and expectations,<br />
and of the potentially relevant technologies.<br />
Figure 9.1: Stakeholder-driven TIS processes to be supported via XTED<br />
9.2 Quality Assurance<br />
Between October and December 2010, the QA group tested the new GridFTP 5<br />
release. GridFTP 5 was deployed to virtual machines on FutureGrid’s Sierra (UCSD) and Foxtrot<br />
(University of Florida) resources, and the group verified that several of the new features, such as<br />
data set synchronization and the offline mode for the server, worked as expected. No major problems<br />
were detected in this testing, though a bug related to the new data set synchronization feature was<br />
reported. The results are summarized on the TeraGrid Wiki at<br />
http://www.teragridforum.org/mediawiki/index.php?title=GridFtp5_Testing. GridFTP 5 was also<br />
deployed to all TACC machines without any problems.<br />
As GRAM5 has not yet been recommended for production deployment across TeraGrid,<br />
members of the QA group continued to debug it. Two issues related to runaway<br />
globus-job-manager processes and a problem using jobtype=multiple were raised by TACC and are being<br />
investigated by Globus developers. A problem with PBS.pm not allowing GRAM MPI jobs to<br />
execute was resolved at LSU.<br />
In response to feedback from the Science Gateway group, QA is also investigating what is<br />
required to submit Globus MPI jobs on various TeraGrid machines. We’d like to use these<br />
findings to improve documented user examples and would ideally like to have standards for MPI<br />
jobs and an automated tool for submitting them. Also, based on correspondence with the<br />
CIPRES Gateway project, documentation will be drafted for using GRAM clients to launch jobs.<br />
Other QA activities included discussions with members of the CUE regarding details of their<br />
deployments and how to best test them. A basic CUE test was written and more will be created<br />
when CUE deployments are registered in IIS. Work also continued to create a sustainable<br />
process for collecting and analyzing usage data in order to determine which services are most<br />
important to users. A manual graphical analysis was made of existing process accounting reports,<br />
which could be used as an example for future automated reports. Finally, a new test was written<br />
for the Karnak metascheduling service.<br />
9.3 XD – TAS<br />
In this TAS quarterly progress report we discuss the infrastructure put in place to facilitate<br />
software development, organizational meetings, and technical progress made toward initial software<br />
releases.<br />
9.3.1 Technical Progress<br />
XDMoD User Interface:<br />
Substantial progress has been made on the XDMoD interface in terms of the underlying<br />
infrastructure, look-and-feel of the interface, and its functionality. We begin with a discussion of<br />
the underlying infrastructure (programming constructs).<br />
The XDMoD portal is a web-based application developed using open source tools including the<br />
LAMP (Linux, Apache, MySQL, PHP) software stack and the ExtJS user interface toolkit.<br />
XDMoD is developed using the Model-View-Controller software design pattern where the View<br />
is implemented entirely using the ExtJS user interface toolkit to provide the look-and-feel and<br />
advanced features of a desktop application within a web browser. The Controller serves as a<br />
router where service requests are received and sent to the appropriate Model (e.g., charting, report<br />
building, application kernels, etc.) for handling. Multiple Controllers can be created to handle<br />
specific types of requests such as the generation of charts, creation of customized reports, and<br />
queries for information such as available resources to feed the user interface. We have chosen to<br />
use authenticated RESTful services as the interface to all backend data services (Controllers and<br />
Models) and to completely de-couple the user interface (View) from the backend services. This<br />
de-coupling allows us to immediately achieve the goal of providing a service API that can be<br />
utilized by 3rd-party data ingestion or display tools such as Google gadgets or other portal<br />
infrastructures.<br />
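The controller-as-router pattern described above can be sketched in miniature. This is an illustrative Python sketch, not the actual XDMoD backend (which is PHP); the controller and model names are invented:

```python
# Minimal sketch of the MVC routing idea: the Controller receives a
# service request and forwards it to the appropriate Model. The view
# (ExtJS, a gadget, or any 3rd-party tool) only ever sees the dispatch
# entry point, never the models themselves.

class ChartModel:
    """Illustrative model handling chart-generation requests."""
    def handle(self, params):
        return {"chart": "total_su_by_directorate", "year": params.get("year")}

class ReportModel:
    """Illustrative model handling report-building requests."""
    def handle(self, params):
        return {"report": "queued"}

class Controller:
    """Routes each incoming request to the model registered for it."""
    def __init__(self):
        self.models = {"chart": ChartModel(), "report": ReportModel()}

    def dispatch(self, resource: str, params: dict) -> dict:
        if resource not in self.models:
            return {"error": "unknown resource"}
        return self.models[resource].handle(params)

api = Controller()
response = api.dispatch("chart", {"year": 2010})
```

Because the view talks only to `dispatch`, the user interface stays fully de-coupled from the backend, which is the design goal named above.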
ExtJS is an advanced cross-browser JavaScript user interface toolkit that provides a rich set of<br />
functionality that one would typically expect to find in a desktop application. This includes<br />
drag-and-drop functionality, event handling, callbacks, and customizable widgets. This immediately<br />
provides XDMoD developers with a wide variety of components from which to choose when<br />
developing the portal. This includes tree views for organizing hierarchical data, tabbed display<br />
panels for organizing and displaying different types of information, a mechanism for easily<br />
utilizing RESTful services, and components for automatically parsing data returned from those<br />
services.<br />
Figure 1 shows the login page for XDMoD. The plot window in Figure 2, a screen capture of the<br />
XDMoD portal interface, shows a plot of the Total SU usage by NSF directorate for 2010, and is<br />
available under the TG Usage tab (which is highlighted). To the left of the plot window is the<br />
Chart Menu Window, which shows the wide range of plots available, including plots of TG<br />
utilization broken down by Job Size, Resource Provider, NSF Directorate, Principal Investigators,<br />
Institutions, etc. Some of the important features of the portal are described below.<br />
Figure 1. XDMoD Portal login page.<br />
Figure 2. XDMoD Portal<br />
The 9 Navigation Tabs along the top of the window have the following functionality.<br />
1. My Summary: Displays an overview of the individual user’s jobs.<br />
2. TG Summary: Displays a system-wide overview of all users’ jobs.<br />
3. My Usage: Displays a large number of metrics related to the individual’s usage.<br />
4. TG Usage: Displays a large number of metrics related to the system-wide TG<br />
usage.<br />
5. Allocations: Displays a summary of the status of the individual’s TG allocations.<br />
6. App Kernels: Provides information on application kernel performance on various<br />
resources.<br />
7. Report Generator: Provides a mechanism for generating user-specified reports.<br />
8. User Like Me: Provides an evaluation of resources based upon user specifications.<br />
9. Search: Allows authorized users to search for data on users, institutions, etc.<br />
Other key functional features include: the “Add Charts” button that adds the displayed chart to<br />
the report queue, the “Report Queue” button that aids the user in managing and exporting reports<br />
that are a compilation of user-selected charts, the “Export” button that allows plot data to be<br />
output in a variety of formats (XLS, CSV, XML, PNG), the “Chart/Dataset” button that allows<br />
the user to toggle between the display of a given chart or the data set used in its generation, the<br />
Date Selection pull-down menu that allows the user to select different time ranges for the desired<br />
plot, a search feature that helps the user in locating charts, and a series of buttons that allow the<br />
user to resize the displayed charts and control which information fields are visible.<br />
Based on the feedback received at SC10 for the beta version of XDMoD as well as input from the<br />
TG Quarterly Meetings, we have substantially revised the functionality planned for the initial<br />
release of XDMoD 1.0, which is scheduled for the summer of 2011. For example, XDMoD has<br />
been enhanced by the addition of time-based plots similar to those included in the TG Quarterly<br />
Reports. Plots such as Number of Jobs, Total CPU Consumption, and Total SU Charged over<br />
user-specified time periods are now readily available. Three example plots of TG<br />
utilization are shown below as Figures 3-5.<br />
Figure 3. Total TG CPU hours on all TG resources over a one-year period by resource. Hovering<br />
over a given color on the plot brings up a tab that indicates which resource corresponds to that<br />
particular color.<br />
Figure 4. Time history of all TG resources from 2006-2010 in terms of total CPU hours delivered<br />
by resource.<br />
Figure 5. Total number of TG jobs run over a two-year period by resource.<br />
In addition to the time-history plots, a series of new data plots has been added, including<br />
utilization plots by NSF status, by person, and by institution. Examples are given below for Total<br />
CPU usage by NSF status (Figure 6), Total CPU usage per Principal Investigator (Figure 7), and<br />
Total CPU usage per institution (Figure 8). In addition to the pie charts, the user will be able to<br />
choose to view the data as a bar chart or in tabular form, thereby facilitating their inclusion in<br />
reports. Figure 8 also shows the entire XDMoD Portal interface (as opposed to just the plots<br />
shown in Figures 6 and 7) and is included to give a better indication of the rich feature set of<br />
metrics being developed within the portal.<br />
Figure 6. Total CPU Consumption by NSF Status obtained using the XDMoD Portal.<br />
Figure 7. Total CPU Consumption per PI obtained using the XDMoD Portal.<br />
Figure 8. Total CPU Consumption per institution. Included in the figure is a screen shot of the<br />
entire XDMoD portal.<br />
Role-based Access to XDMoD: The various roles that will be defined in XDMoD were specified<br />
for the initial XDMoD 1.0 release this summer at TG’11. It was decided to have five distinct roles:<br />
NSF Program Officer, Center Director, Principal Investigator, User, and Public. Further detail on<br />
the access levels assigned to each role is given in the attached Appendix 1. It is anticipated that<br />
the specific information available in each of the roles will be further refined based on discussion<br />
with NSF and XD leadership.<br />
XDMoD Additional Progress: Developing a client for a complex and fast-changing software API<br />
can be challenging. To assist 3rd-party developers (as well as internal developers needing to<br />
interface with XDMoD components), we have created a tool to make this task easier. The<br />
XDMoD REST API Call Builder, shown in Figure 9, provides a mechanism for a developer to<br />
iteratively and hierarchically construct a RESTful API URL to perform their desired query or<br />
action. Using a programming language feature called Reflection, the API Call Builder inspects<br />
the publicly available API interfaces for all available Controllers and Models to guide a<br />
developer through the construction of a valid API URL. Starting with the selection of the<br />
appropriate controller, the developer is provided with a list of available choices for each level in<br />
the URL hierarchy and steps through the process of constructing a valid URL for the desired API<br />
call. The URL is initially displayed with each required component in red and changes these to<br />
green as the developer makes the required choices. A test is included that checks for errors and<br />
the proper completion of the call and displays the call output. This tool is currently used for<br />
internal diagnostics and is still being enhanced for eventual use by external developers,<br />
such as the developers of the TG/XD User Portal.<br />
Figure 9. Prototype tool to aid developers in the construction of valid RESTful API calls.<br />
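The reflection technique behind the Call Builder can be illustrated in miniature. A Python sketch with invented controller and action names (the real builder operates on the XDMoD PHP codebase, whose interfaces are not shown here):

```python
# Sketch of the reflection idea: inspect the public interface of a
# controller so the builder can offer the developer only valid choices
# at each level of the URL hierarchy.
import inspect

class ChartController:
    """Illustrative controller with two public actions and one private."""
    def timeseries(self, resource): ...
    def pie(self, dimension): ...
    def _internal(self): ...   # non-public: must not be offered

def public_actions(controller) -> list:
    """Use reflection to list the publicly available API methods."""
    return sorted(name for name, member in
                  inspect.getmembers(controller, inspect.isfunction)
                  if not name.startswith("_"))

def build_url(controller_name: str, action: str, **params) -> str:
    """Assemble a REST-style URL once every required component is chosen."""
    query = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
    return f"/rest/{controller_name}/{action}?{query}"

choices = public_actions(ChartController)   # the valid next steps shown in green
url = build_url("chart", "timeseries", resource="ranger", range="2010")
```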
XDMoD Data Warehouse:<br />
The TAS development and production hardware has been installed and is backed up to an off-site<br />
location. Virtual machines have been implemented for the various server/web components of the<br />
TAS project.<br />
The TGCdb has successfully been mirrored at CCR from PSC, allowing for fast and efficient<br />
ingestion of all data stored within the TGCdb into the XDMoD data warehouse and aggregation<br />
to support the plethora of charting options available. A list of desirable improvements/data<br />
elements for the centralized database is being prepared for submission to the XD transition team.<br />
The XDMoD data warehouse has been redesigned to support new functionality and is defined as<br />
a simple star schema, as shown in Figure 10. As a dimensional approach to data storage, a star<br />
schema is easier to understand than the normalized approach. Upon ingestion,<br />
the transactional data of various data sources (e.g., the TGCdb) are partitioned into “facts” or<br />
“dimensions”. Dimensions are the reference information that gives context to facts. For example,<br />
a TG job transaction can be broken up into facts such as the job_id and total CPU time consumed,<br />
and into dimensions such as resource provider, field of science, funding allocation, and the<br />
principal investigator of the project. The schema makes it easy to understand and query the data<br />
pertaining to jobs logged in the TGCdb. However, we still need to optimize for performance<br />
when it comes to aggregating data over a custom date range. For this purpose, the star schema is<br />
further aggregated into subsequent star schemas where aggregation over time is added as an extra<br />
dimension as shown in Figure 11. In this phase, we aggregate data over its complete duration by<br />
day, week, month and quarter. Keeping multiple resolutions of data allows for faster queries<br />
when aggregating or querying data over long periods of time.<br />
Figure 10: XDMoD data warehouse schema<br />
Figure 11: Additional time based aggregations in the XDMoD data warehouse.<br />
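A toy version of the star schema and its time-based aggregation might look like the following SQLite sketch; the table and column names are illustrative, not the actual warehouse DDL:

```python
# Toy star schema: one fact table of job measures, keyed to dimension
# tables that give them context, plus a pre-aggregated monthly rollup
# of the kind used to keep long-range queries fast.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE dim_resource (resource_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_fos      (fos_id INTEGER PRIMARY KEY, field TEXT);
CREATE TABLE fact_job (
    job_id INTEGER PRIMARY KEY,
    resource_id INTEGER REFERENCES dim_resource,
    fos_id      INTEGER REFERENCES dim_fos,
    cpu_hours   REAL,
    run_date    TEXT
);
""")
db.executemany("INSERT INTO dim_resource VALUES (?, ?)",
               [(1, "Ranger"), (2, "Kraken")])
db.executemany("INSERT INTO dim_fos VALUES (?, ?)",
               [(1, "Chemistry"), (2, "Physics")])
db.executemany("INSERT INTO fact_job VALUES (?, ?, ?, ?, ?)", [
    (101, 1, 1, 120.0, "2010-10-05"),
    (102, 1, 2,  80.0, "2010-10-14"),
    (103, 2, 1, 200.0, "2010-11-02"),
])

# Monthly rollup: aggregation over time added as an extra dimension.
db.execute("""
CREATE TABLE agg_job_month AS
SELECT resource_id, strftime('%Y-%m', run_date) AS month,
       SUM(cpu_hours) AS cpu_hours, COUNT(*) AS jobs
FROM fact_job GROUP BY resource_id, month
""")
rows = db.execute("""
SELECT r.name, a.month, a.cpu_hours
FROM agg_job_month a JOIN dim_resource r USING (resource_id)
ORDER BY r.name, a.month
""").fetchall()
```

A query for CPU hours by resource and month now scans the small rollup table rather than every job row, which is the point of keeping multiple resolutions of the data.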
Application Kernel Development:<br />
Application kernel work for the beta release at SC10 was initially focused on Computational<br />
Chemistry/Biophysics. Application kernels currently under development that were discussed at<br />
SC10 include: Amber, GAMESS, NAMD, NWChem, Intel MPI, NAS parallel benchmarks, and<br />
the HPC Challenge benchmark suite. A “USER LIKE ME” view has been prototyped to allow<br />
users without detailed knowledge of TG/XD resources to quickly see recommendations and<br />
application performance data based on their field and desired application profile. As shown in the<br />
XDMoD window below, thirteen application kernels are presently being run continuously on<br />
local (CCR) HPC clusters. Figure 12 below shows AMBER-based application kernels run on 8,<br />
16 or 32 processors.<br />
Figure 12. AMBER-based application kernels run on 8, 16 or 32 processors.<br />
Work on the application kernels was focused on planning for the implementation of an initial<br />
deployment of kernels on FutureGrid and subsequently one or two sites on TG/XD to be<br />
synchronized with the release of XDMoD 1.0. Present work has been centered on the best<br />
scheme to track different versions of each kernel and to account for changes in how each kernel<br />
runs on the various resources when a variety of changes are made including both hardware and<br />
software modifications.<br />
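One possible scheme for keeping kernel results distinguishable across version and environment changes, sketched in Python (the field names and hashing scheme are assumptions for illustration, not the adopted TAS design):

```python
# Sketch: identify each application-kernel run by kernel name, kernel
# version, resource, and a hash of the hardware/software configuration,
# so that performance series are never silently merged across changes.
import hashlib
import json

def run_key(kernel: str, version: str, resource: str, env: dict) -> str:
    """Stable key combining kernel, version, resource, and environment."""
    env_hash = hashlib.sha256(
        json.dumps(env, sort_keys=True).encode()).hexdigest()[:12]
    return f"{kernel}-{version}@{resource}#{env_hash}"

history = {}

def record(kernel, version, resource, env, walltime):
    """Append a timing to the series identified by its run key."""
    history.setdefault(run_key(kernel, version, resource, env), []).append(walltime)

env_a = {"mpi": "mvapich2-1.4", "compiler": "intel-11.1", "cores": 16}
env_b = {"mpi": "mvapich2-1.6", "compiler": "intel-11.1", "cores": 16}
record("amber", "10", "u2", env_a, 812.0)
record("amber", "10", "u2", env_b, 790.0)  # software change -> separate series
```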
Planning for the implementation and development of an initial set of kernels on FutureGrid was<br />
done in collaboration with Professor von Lazewski. FutureGrid will provide a useful test-bed for<br />
development of the application kernels. While this deployment will not offer a larger scale for<br />
application processing over the current set of local computing resources, it will test our<br />
development and implementation methods. An xdtas account has been obtained for FutureGrid<br />
and a preliminary plan has been determined to run a set of application kernels on the system.<br />
Custom Report Builder/Indiana University Subcontract:<br />
Professor Gregor von Lazewski (Co-PI) and Dr. Lizhe Wang from Indiana University visited<br />
CCR from 12/14/10 to 12/15/10 for a series of meetings and conferences to discuss XDMoD,<br />
Application Kernels, and FutureGrid TAS collaboration.<br />
The initial version of the custom report builder has been completed. A user may use a toggle<br />
button on any chart to select it for inclusion in a custom report. The report generator will utilize<br />
an XML configuration file that describes the format and contents of the report, which will be sent<br />
to the user via email upon report generation. Initially, a PDF version of the report containing the<br />
title, description and image for each selected chart will be available while later versions will<br />
expand to other output formats.<br />
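An XML report configuration of the kind described above might be assembled as follows; the element and attribute names are assumptions, since the actual configuration schema is not given in this report:

```python
# Sketch: serialize a report description (format, delivery schedule, and
# the title/description/image triple for each selected chart) as XML.
import xml.etree.ElementTree as ET

def build_report_config(title: str, schedule: str, charts: list) -> str:
    """Build an XML configuration describing one custom report."""
    report = ET.Element("report", {"format": "pdf", "schedule": schedule})
    ET.SubElement(report, "title").text = title
    for chart in charts:
        entry = ET.SubElement(report, "chart", {"image": chart["image"]})
        ET.SubElement(entry, "description").text = chart["description"]
    return ET.tostring(report, encoding="unicode")

config = build_report_config(
    "Q4 Utilization", "one-time",
    [{"image": "total_su.png", "description": "Total SU by directorate"}],
)
```

The `schedule` attribute distinguishes a one-time report from a recurring one, and a later PDF renderer would consume this configuration when the report is generated and e-mailed.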
A preliminary demonstration version of the custom report builder is shown below in Figure 13.<br />
This demonstration version features two methods for generating a report. The first method allows<br />
a user to select individual charts as viewed in XDMoD and add them to a report queue. From the<br />
report queue the user can then generate a report and have it delivered via email, either as a one-time<br />
report or a regularly scheduled report. Using the second method, the user can select a<br />
combination of resource providers, resources, and metrics to be included in the report. The report<br />
can be downloaded immediately or sent as an e-mail if it is a one-time report, or scheduled to be<br />
sent as an e-mail at user-selected intervals if it is a recurring report.<br />
Figure 13. Demonstration version of the Custom Report Builder.<br />
User Needs Assessment/University of Michigan Subcontract:<br />
Dr. Ann Zimmerman of the University of Michigan traveled to CCR on 10/25/10 for a full day of<br />
meetings and conferences. She was briefed on TAS progress to date and we discussed plans for<br />
soliciting user input for TAS products such as XDMoD and the Application Kernel Toolkit at<br />
SC10 and beyond.<br />
User Needs: Team members from the University of Michigan (UM) submitted their Institutional<br />
Review Board (IRB) application for approval. Upon acceptance of the application, they will be<br />
able to recruit and conduct research with potential users of TAS. They also developed data<br />
collection protocols for user interviews and observations and submitted these with the IRB<br />
paperwork. Studies of user needs will focus initially on end users of TG/XD. The sampling<br />
strategy will include users with different size allocations, from different domains, and with<br />
varying levels of computational expertise and experience.<br />
9.3.2 Meetings and Events<br />
Weekly meetings of the TAS Working Group are held every Wednesday afternoon. UB project<br />
members attend in person with University of Michigan and Indiana University subcontractors<br />
attending by web and audio conference. TAS personnel were very active at SC10. A description<br />
of the presentations made at SC10 is given below.<br />
CCR Presentations at SC10:<br />
11/16/10 Birds-of-a-Feather XDMoD presentation at SC10: Dr. Thomas R. Furlani, Dr.<br />
Matthew D. Jones, and Mr. Steven M. Gallo led a BOF presentation of XDMoD. The<br />
TAS program in general and the prototype XDMoD portal in particular were very well received.<br />
A number of useful comments and suggestions were made by TG users at the BOF and these<br />
suggestions will find their way into future XDMoD releases.<br />
11/17/10 SC10 Dell booth presentation on TAS: Dr. Thomas R. Furlani gave an invited<br />
presentation describing the TAS program, XDMoD and other CCR projects at the Dell booth.<br />
CCR booth at SC10 features XDMoD demonstration: The CCR booth at SC10 was well<br />
attended. The booth featured an information sheet on XDMoD and a working demonstration of<br />
XDMoD on a laptop computer. The TG/XD users who tried the demonstration were generally<br />
pleased with the software and they were able to offer several useful comments that will aid the<br />
development of XDMoD.<br />
TAS Technical Advisory Committee Meeting at SC10:<br />
11/15/10: Meeting of the TAS Technical Advisory Committee at SC10: The TAS Technical<br />
Advisory Committee met for the first time at SC10. Attendees included the committee members:<br />
Professor Abani Patra (Chair) (UB), Dr. P.K. Yeung (CFD, Georgia Tech), Dr. Martin Berzins<br />
(Chair, Computer Science at Utah), Dr. Craig Stewart (Executive Director, Pervasive Technology<br />
Institute, Indiana), Dr. Richard Brower (Lattice QCD, Boston University), Dr. Sara Graves<br />
(Director, Information Technology and Systems Center, Univ of Alabama, Huntsville), CCR<br />
staff: Dr. Thomas Furlani (PI), Dr. Matthew D. Jones (Co-PI), Mr. Steven Gallo, Mr. Ryan<br />
Gentner, Mr. Amin Ghadersohi, Dr. Charng-Da Lu, and Dr. Robert L. DeLeon and<br />
subcontractors: Dr. Ann Zimmerman from the University of Michigan and Professor Gregor von<br />
Lazewski (Co-PI) from Indiana University. Also attending were John Towns (TIS PI), Kevin<br />
Thompson (NSF), and Dr. Richard Moore (SDSC).<br />
After brief introductory comments by Chairman Abani Patra, Dr. Furlani, Dr. Jones, Mr. Gallo<br />
and Dr. Zimmerman provided a briefing on TAS progress to date and a demonstration of the<br />
XDMoD Portal. A very lively and informative discussion followed and a series of comments and<br />
recommendations were noted that will provide helpful guidance for the TAS program. A detailed<br />
summary of the committee’s recommendations was compiled into a report. The first TAS<br />
TAC report from this meeting is attached as Appendix 2.<br />
9.3.3 Coordination with other TG/XD and External Projects<br />
On Tuesday, Nov. 16, at SC10, TAS members met with TACC technical staff to discuss<br />
XDMoD and TG portal development efforts and explore the use of common portal technologies.<br />
During his visit to CCR, on 12/15/10, Professor Gregor von Lazewski from Indiana University<br />
discussed TAS collaboration with FutureGrid.<br />
9.3.4 TAS Appendices<br />
9.3.4.1 Appendix 1: Proposed Role-based Data Access Privileges for TAS<br />
We have identified five specific roles for the XDMoD TAS interface. The purpose of this<br />
document is to formalize the information that will be made available through XDMoD for each of<br />
the roles.<br />
1. Public Role (TG account not necessary but an approved XDMoD account is required)<br />
• Overall TG Utilization broken down by resource provider, field of science, time, etc.<br />
o No user specific data<br />
• Performance data across RP’s – performance of the TAS application kernel suite<br />
• User Like Me (a tool to assist users in identifying the resource most appropriate for their<br />
computational needs)<br />
2. User Role - extends Public role and adds the following data (requires an active TG account)<br />
• Personal utilization information for the authenticated user<br />
• Drill down to user’s individual job details and allocation information<br />
3. PI Role - extends User role and adds the following data (requires an active TG account)<br />
• List all users under each of the PI’s allocations<br />
• Provide detailed utilization information for all users under a particular allocation (only<br />
view activity specific to the PI's allocations)<br />
4. Center Director / Resource Provider / System Admin Role<br />
• Utilization data specific to the user's associated center/resource<br />
• Performance data specific to the user's associated center/resource<br />
• Ability to view detailed job-level information for users of the resource<br />
5. Program Officer Role<br />
• No restrictions on data access<br />
• TG Utilization<br />
• Access to all data for a specific user<br />
• Performance data across all RP’s<br />
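The role hierarchy above, in which each role extends another, can be modeled as inherited permission sets. A Python sketch with shorthand permission names standing in for the bullet items; the extension chains for the Center Director and Program Officer roles are assumptions, since the appendix states them only for User and PI:

```python
# Sketch of role-based access as inherited grant sets: each role's
# effective permissions are its own grants plus everything it extends.

ROLE_GRANTS = {
    "public":  {"tg_utilization", "app_kernel_performance", "user_like_me"},
    "user":    {"own_usage", "own_job_details"},
    "pi":      {"allocation_user_list", "allocation_user_usage"},
    "center":  {"center_utilization", "center_job_details"},
    "program_officer": {"all_data"},
}

# Extension chain: User extends Public, PI extends User (per the
# appendix); the remaining links are illustrative assumptions.
EXTENDS = {"user": "public", "pi": "user",
           "center": "public", "program_officer": "public"}

def permissions(role: str) -> set:
    """Collect a role's own grants plus everything inherited."""
    grants = set(ROLE_GRANTS[role])
    while role in EXTENDS:
        role = EXTENDS[role]
        grants |= ROLE_GRANTS[role]
    return grants
```

Under this model, a PI automatically sees everything a User and the Public can see, while a plain User never gains PI-level visibility into other users of an allocation.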
9.3.4.2 Appendix 2. Report of TAS Technical Advisory Committee<br />
First TAS Technical Advisory Committee (TAC) Meeting Discussion Points and<br />
Recommendations,<br />
New Orleans 11/15/10<br />
Attendees included:<br />
The committee members: Prof. Abani Patra (Chair) (UB), Prof. P.K. Yeung (Georgia Tech), Prof.<br />
Martin Berzins (Univ. of Utah), Dr. Craig Stewart (Indiana University), Dr. Richard Brower<br />
(Boston University), Dr. Sara Graves (Univ of Alabama, Huntsville). Member of the TAS TAC<br />
not attending SC10 meeting: Dr. Jerzy Bernholc (NC State)<br />
TAS staff: Dr. Thomas Furlani (PI), Dr. Matthew D. Jones (Co-PI), Mr. Steven Gallo, Mr. Ryan<br />
Gentner, Mr. Amin Ghadersohi, Dr. Charng-Da Lu, Dr. Robert L. DeLeon, Dr. Ann Zimmerman<br />
from the University of Michigan and Professor Gregor von Lazewski (Co-PI) from Indiana<br />
University.<br />
Also attending were John Towns (TIS PI), Kevin Thompson (NSF), and Dr. Richard Moore (SDSC).<br />
The meeting started with a brief presentation of the Charge to the TAC and a presentation of the TAS<br />
project, including a demo of the XDMoD tool. The discussions were wide-ranging and included<br />
structural, policy, and technical issues. Grouping them into these categories, we have:<br />
Category 1: Structural Issues:<br />
1) TAS and TG/XD Governance: The Technical Advisory Committee deliberated on the<br />
relationship of TAS and XD governance. Suggested mechanisms are:<br />
• The TAC believes that TAS should provide input to XD governance rather than be part of<br />
it.<br />
• The TAC emphasized that it is important that TAS reports both to XD and NSF.<br />
• TAS should be separate from XD and maintain a distinct identity (“arm’s length”).<br />
• XD governance can request that TAS audit various aspects of XD that are of concern<br />
and/or in need of independent oversight.<br />
• TAS should prepare quarterly audit reports of XD activities with specific findings and<br />
XD should be required to respond formally to these in the subsequent quarterly meeting.<br />
2) The TAC also felt that TAS should interact closely with the XD science advisory board and<br />
provide them with data as needed.<br />
Category 2: Policy Issues:<br />
1) There was an active discussion on whether it is the best use of the TG/XD resources to run<br />
very large jobs (ones that cannot be easily run elsewhere) for which they are uniquely suited or to<br />
run all jobs that may have scientific merit. TAS should provide data and an analysis of the job<br />
size distribution and associated scientific productivity over the various facilities to allow NSF to<br />
make such policy decisions.<br />
2) Many of the TG/XD facilities have heavy non-NSF usage. One of the functions of TAS is to<br />
indicate the extent to which the facilities are used on NSF supported programs. While TAS<br />
probably will not be able to directly report the non-NSF usage in detail due to the lack of access<br />
to this data at the resource providers, TAS can and should report the overall NSF level of machine<br />
utilization.<br />
Category 3: Technical Issues:<br />
1) The TGCdb (TG centralized database): In general, the TGCdb records are very good, but there<br />
are some areas where data is apparently missing and other areas where there is room for added or<br />
improved record storage. TAS reports that its initial assessment of the TGCdb finds some<br />
inconsistency in batch job usage reporting. Three notable examples are that job exit status is not<br />
reported, not all computational resources report usage by processor core as well as by node, and<br />
memory per job is not recorded. Indeed, reporting is not always consistent for jobs on an<br />
individual resource nor from one resource provider to another. Accordingly, it is the<br />
recommendation of the TAS Technical Advisory Committee that TAS formulate a uniform<br />
taxonomy for the TGCdb and a more extensive collection of resource utilization statistics.<br />
2) Given the emergence of Green technology for HPC, the TAC suggests that TAS should<br />
recommend that energy usage be reported, thereby providing a framework to measure system<br />
performance in terms of energy consumption. The best reporting units would be energy usage per<br />
job and energy usage per core. While potentially costly to implement in current TG systems,<br />
solicitations for future systems should make this a requirement.<br />
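The proposed units reduce to simple arithmetic once average node power is metered. A hypothetical sketch, assuming metered per-node power draw (the figures and function names below are assumptions, not existing TG reporting):<br />

```python
def energy_per_job(avg_node_power_w, nodes, walltime_h):
    """Estimate total energy (kWh) consumed by a batch job.

    avg_node_power_w -- average power draw per node in watts (assumed to
                        come from facility metering; not currently recorded)
    nodes            -- number of nodes allocated to the job
    walltime_h       -- elapsed walltime in hours
    """
    return avg_node_power_w * nodes * walltime_h / 1000.0  # W*h -> kWh

def energy_per_core_hour(avg_node_power_w, cores_per_node):
    """Energy (kWh) attributable to one core-hour of usage."""
    return avg_node_power_w / cores_per_node / 1000.0

# A 256-node job running 12 hours on 300 W nodes:
job_kwh = energy_per_job(300, 256, 12)   # 921.6 kWh
```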
3) The TAC feels that the POPS database, containing information gathered during allocation<br />
applications, can potentially provide a unique and useful source of data for TAS (e.g., publications<br />
resulting from past allocations). TAS should utilize this data and, if necessary, make<br />
recommendations on the data that is recorded, its format, and data collection procedures. For<br />
example, should the POPS data be incorporated into the TGCDB?<br />
4) It was pointed out that practically no existing TG reporting (annual reports, etc.) can be cited in<br />
the open literature. TAS should consider a version of the auditor’s reports that is open to<br />
the public and/or provide information that is citable within the context of XDMoD.<br />
5) The TAC recommends that TAS be given access to data from the TGCDB and other sources as<br />
needed for reporting on TG/XD.<br />
6) It was pointed out that feedback from experienced users is a very powerful method both for<br />
helping new users learn to use the TG/XD facilities and for monitoring how such facilities are<br />
operating on a daily basis. Thus a social network analogous to Facebook, or at least a user mailing<br />
list, should be developed. TAS should investigate the feasibility of such services and possibly<br />
assist in setting up such mechanisms, but should not take the lead or the principal responsibility<br />
for establishing and maintaining them. As an initial step in providing<br />
users with access to data such as this, TAS should explore the feasibility of mining help desk<br />
tickets and their resolutions from the resource providers to provide users with a query-based<br />
system to quickly find answers to their questions. Mining the tickets could also help monitor<br />
system performance by revealing common problems across a given resource.<br />
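One way such a query-based system might work is a simple TF-IDF similarity search over resolved tickets. The sketch below uses invented ticket text purely for illustration (no real RP data, and the IDs are made up):<br />

```python
import math
import re
from collections import Counter

# Toy corpus of resolved help-desk tickets (contents invented for
# illustration; a real system would pull these from RP ticket archives).
tickets = {
    "T1": "job fails with mpi init error on startup",
    "T2": "quota exceeded in scratch filesystem cannot write output",
    "T3": "ssh login to ranger times out from campus network",
}

def tokens(text):
    return re.findall(r"[a-z]+", text.lower())

def tfidf_vectors(docs):
    """Build simple TF-IDF vectors for a {id: text} corpus."""
    n = len(docs)
    df = Counter(t for text in docs.values() for t in set(tokens(text)))
    vecs = {}
    for doc_id, text in docs.items():
        tf = Counter(tokens(text))
        vecs[doc_id] = {t: c * math.log(n / df[t]) for t, c in tf.items()}
    return vecs

def best_match(query, vecs):
    """Return the ticket with the highest similarity score
    (cosine similarity up to a constant factor in the query norm)."""
    q = Counter(tokens(query))
    def score(v):
        num = sum(q[t] * w for t, w in v.items())
        den = math.sqrt(sum(w * w for w in v.values())) or 1.0
        return num / den
    return max(vecs, key=lambda d: score(vecs[d]))

vecs = tfidf_vectors(tickets)
print(best_match("mpi error when starting job", vecs))  # -> T1
```

A production system would of course index thousands of tickets and their resolutions, but the retrieval idea is the same: a user's question is matched against previously solved problems.<br />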
7) The interaction between computational and experimental work is important. A useful metric<br />
of the importance of computational studies is the value that the work has for experimentalists.<br />
TAS should actively seek out this information.<br />
8) TAS should mine Advanced Support for TeraGrid Applications (ASTA) files for information<br />
and problem-specific expertise, and support ASTA as indicated.<br />
9) In general, only traditional CPU usage is reported. TAS should help develop mechanisms to<br />
monitor and audit memory, storage, and visualization facilities. This could be done by<br />
developing suitable application kernels to assess system performance and by recommending<br />
additional usage reporting for these facilities.<br />
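A storage-oriented application kernel along these lines could be as small as a timed write/read pass. The following is a minimal sketch under assumed parameters (sizes, block layout, and the single-pass timing approach are illustrative, not an existing TAS kernel):<br />

```python
import os
import tempfile
import time

def io_kernel(size_mb=64, block_kb=1024):
    """Minimal storage application kernel: write then re-read a file,
    returning (write_MBps, read_MBps). A real audit kernel would target
    the filesystem under test and average over repeated runs."""
    block = os.urandom(block_kb * 1024)
    nblocks = size_mb * 1024 // block_kb
    with tempfile.NamedTemporaryFile(delete=False) as f:
        path = f.name
        t0 = time.perf_counter()
        for _ in range(nblocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())          # include time to reach stable storage
        write_s = time.perf_counter() - t0
    t0 = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(block_kb * 1024):  # stream the file back
            pass
    read_s = time.perf_counter() - t0
    os.unlink(path)
    return size_mb / write_s, size_mb / read_s
```

Run periodically on each resource and logged centrally, even a kernel this simple would give TAS a longitudinal view of storage health to set beside CPU accounting.<br />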
10 Education, Outreach, and Training; and External Relations<br />
The TeraGrid collective efforts have engaged at least 9,151 people through at least 162 events, as<br />
shown in the table below. The number of participants was not recorded at all events. Details on<br />
the events are included in Appendix B – EOT Event Details.<br />
                          # Participants   # Underrepresented people   # Events<br />
Workshops                        825                 435                  41<br />
Tutorials                        366                  36                  17<br />
Async Tutorials                1,663                                      32<br />
Forum and Tours                1,078                  85                  33<br />
Conference Presentations       1,727                 278                  15<br />
Demos/Fairs/Exhibits           2,945               1,670                   9<br />
Classes                          156                  66                   7<br />
Seminars                         391                  49                   8<br />
Total                          9,151               2,619                 162<br />
10.1 Joint EOT-ER Outreach Highlights<br />
USA Science and Engineering Expo, October 23-24, 2010, Washington, DC<br />
On October 23-24, 2010, over a million people flocked to Washington, D.C. to celebrate science<br />
and engineering innovation and inventions at the inaugural USA Science and Engineering Expo.<br />
Staff from Indiana University, UChicago, Blue Waters, NICS, NCSA, Shodor, and students and<br />
educators from the DC area joined together in reaching out to young people to get them<br />
interested and engaged in science and technology. The effort was supported by a supplemental<br />
award from OCI. The team extends its appreciation to Irene Lombardo and the OCI office for<br />
their support.<br />
Pervasive Technology Institute employees<br />
from IU volunteered in a booth co-sponsored<br />
by the TeraGrid and Blue Waters projects.<br />
Thousands of educators, children and their parents filled the TeraGrid tent non-stop for two<br />
whole days. They took the time to check out a 3D movie as well as the “LittleFe” cluster<br />
computer. iPads loaded with animated scientific visualizations and photographs of<br />
supercomputers were a huge draw for little fingers that wanted to explore the easy-to-use<br />
screens.<br />
“It was great, because when the kids asked ‘What is a supercomputer’ we were able to show<br />
them pictures of the Data Center at Indiana University and the new Blue Waters facility at NCSA<br />
at the University of Illinois and also let them see LittleFe,” said IU-TeraGrid Outreach<br />
Coordinator Robert Ping. “LittleFe is like a baby supercomputer. It’s portable and is able to be<br />
used for training and doing demonstrations to show the computational power of a<br />
supercomputer,” he said.<br />
Educators loaded up their bags with NCSA coloring books and posters about supercomputing as<br />
well as publications like the TeraGrid EOT Highlights and Science Highlights. They were<br />
thrilled to have tangible items they could take back to their classrooms to start discussions about<br />
computer science, supercomputing and the many fields of study it touches.<br />
The EXPO was conceived in response to the Obama administration’s desire to encourage more<br />
interest in Science, Technology, Engineering, and Math (STEM) careers by exposing children<br />
and families to new technologies that are strengthening communities, building careers, and<br />
stimulating economic growth.<br />
The Pervasive Technology Institute’s Advanced Visualization Lab (AVL), part of the Data to<br />
Insight Center, created the stereoscopic video shown in the booth. It is centered on real-world<br />
applications of computational science and features current research that utilizes the TeraGrid.<br />
The stereoscopic video allowed kids and parents alike to don 3D glasses and get a glimmer of<br />
what it might be like to be a scientist working in the area of climate and weather.<br />
Animations of whirling wind turbines, swirling tornadoes and inching inch-worms combined with<br />
real-world research to show the study of climate isn’t just about the weather. It is useful in other<br />
areas of research and study like hydrology and agriculture as well. By using the NSF funded<br />
Linked Environments for Atmospheric Discovery (LEAD) project as the topic the public was also<br />
able to find out how their tax dollars are put to valuable use both as research tools and as ways to<br />
get kids interested in science and technology.<br />
Mike Boyles, a volunteer from IU and manager of the AVL, said, “It doesn’t really matter what the<br />
subject matter of the video is. The way the kids’ eyes lit up, their thoughtful questions, and their<br />
expressions of ‘wow’ and ‘whoa’ were the real value we created by showing the 3D scientific<br />
movie.”<br />
This wasn’t the first time masses of people gathered on the National Mall to express their<br />
concerns and interests, but it was the first time that so many came to share their passion for<br />
science, engineering, and technology.<br />
“This expo is a chance for institutions like D2I and IU to show children that science and math<br />
aren’t the scary subjects they’re sometimes thought to be, and are actually a lot of fun,” said D2I<br />
Director Beth Plale. “Reaching out to young people and getting them intrigued in science and<br />
technology will be critical in the coming years in order for the U.S. to be competitive in the<br />
global scientific and economic landscape.”<br />
EU-US HPC Challenges in Computational Sciences Summer School<br />
The first European-US Summer School on HPC Challenges in Computational<br />
Sciences, organized jointly by Europe’s DEISA project and the US/NSF TeraGrid project, took<br />
place at the Santa Tecla Palace in Acireale, Sicily, October 3-7, 2010.<br />
Among the participants were sixty graduate students and post-docs, selected from more than 100<br />
applications: 25 from US and 35 from EU universities and research institutions. Students came<br />
from a variety of disciplines, among them astronomy, atmospheric sciences, chemistry, computer<br />
science, engineering, mathematics, and physics. Twenty percent of the students were women.<br />
Twenty-five high-level speakers, nine from the US and sixteen from Europe, covered major<br />
fields of computational sciences. Areas covered included Challenges by Scientific<br />
Disciplines, Programming, Performance Analysis & Profiling, Algorithmic Approaches &<br />
Libraries, and Data Intensive Computing and Visualization.<br />
“The summer school HPC Challenges in Computational Sciences was an excellent opportunity<br />
for me to get an overview of the complete spectrum of High Performance Computing and<br />
Computational Science. Attending talks from the large number of distinguished speakers from<br />
almost every domain of computational science and High performance computing has enabled me<br />
to get a very clear idea of the current trends in the area,” one of the students stated afterwards.<br />
The idea of a joint EU-US summer school was suggested by leading computational scientists from<br />
both continents. “Our primary objective for the student experience was to advance computational<br />
sciences by enabling and stimulating future international collaboration, innovation, and discovery<br />
through the most effective use of HPC,” said Hermann Lederer from the DEISA coordination<br />
team at RZG, Garching.<br />
“We hope to continue with such events every year, alternating between EU and US<br />
destinations,” said TeraGrid Forum Chair John Towns, National Center for Supercomputing<br />
Applications. “The overwhelmingly positive feedback from the students (85% rated the event as<br />
very good or excellent) can be taken as a mandate,” he added.<br />
Leake assisted with the promotion and facilitation of the first DEISA/TG Joint Summer School<br />
on HPC Challenges in Computational Sciences held October 4-7, 2010 in Catania, Italy. Ferguson<br />
(NICS) was also in attendance and provided technical support for the participants. A flier and<br />
registration tool were developed by ER team members.<br />
Leake established a Facebook community (with 54 ongoing members) to facilitate collaboration<br />
before, during, and after the event. Social networking proved to be a highly effective way to<br />
acquaint participants (from 20 countries) before the event. It helped to break the ice among the<br />
participants and the presenters.<br />
One of the TeraGrid presenters, Sean Ahern (NICS/ORNL), after meeting participant Jeremy<br />
Logan (University of Maine) in Sicily, encouraged ORNL to hire Logan as a postdoc to assist<br />
with research in parallel I/O. Ahern recognized that Jeremy's work could be applicable to ORNL's<br />
HPC data processing research. Logan started at ORNL a few weeks ago.<br />
The following are comments from a few of the participants.<br />
Benjamin Cruz Perez said, “I used TeraGrid resources to complete my master’s thesis. Without<br />
TeraGrid’s tools and training, I would not have been able to obtain results in time for my<br />
defense. Thanks to these experiences, I have met other researchers with similar interests<br />
who also use supercomputing for their research. I’m now comfortable using the techniques I’ve<br />
learned for the efficient use of high performance computing, and look forward to participating in<br />
multidisciplinary research in the future.”<br />
“Sometimes it doesn’t hurt to push things, the system, the methodology, yourself, to the limit.<br />
Often, it is under such duress that great things happen,” said Terrence Neumann, Marquette<br />
University.<br />
“I have been writing proposals for a while now and will be submitting a new TeraGrid proposal<br />
soon. It was nice learning more about TeraGrid and its EU counterpart, DEISA. I enjoyed the<br />
friendship and conversations, and I am looking forward to implementing some of the new<br />
techniques and libraries I learned about in my own research,” said Cengiz Sen, Turkish national<br />
postdoc fellow, University of Cincinnati.<br />
Giuseppa Muscianisi, PhD student in mathematics and informatics, Department of<br />
Mathematics, University of Messina, Italy, said, “With the exciting backdrop of the Riviera dei<br />
Ciclopi, I increased my knowledge about HPC challenges thanks to the great talks by experts and<br />
discussions with colleagues from around the world. The School was very well organized. All talks<br />
addressed the tools, software applications, and techniques needed to conduct HPC research.”<br />
The presentations given are available from http://www.deisa.eu/Summer-School/talks<br />
A report of the Summer School is included in Appendix D – EU-US HPC Summer School Report.<br />
Outreach to University of Puerto Rico<br />
TeraGrid conducted one-day outreach sessions for over 50 faculty and students at the University<br />
of Puerto Rico at Rio Piedras on December 8 and the University of Puerto Rico at Mayaguez on<br />
December 9, 2010. The participants were provided with hands-on access to TeraGrid systems<br />
(Purdue’s Steele and PSC’s Pople) to learn how to submit and run parallel jobs.<br />
Stefano Leonardi, professor at the University of Puerto Rico at Mayaguez, helped to organize the<br />
visit. Leonardi has worked at the Sapienza University of Rome, the Max Planck Institute in<br />
Saarbrücken, Germany, and at Carnegie Mellon and Berkeley in the US.<br />
Leonardi said, “My wife and I moved to Puerto Rico about two years ago because we feel it is as<br />
close to Paradise as one can get—unless you rely on supercomputing to do your work. If it<br />
weren’t for TeraGrid, there is no way I could conduct my research.” Participants have requested<br />
accounts on TeraGrid subsequent to the visit.<br />
Supercomputing’10—November 12-18, New Orleans, LA<br />
Members of the ER team volunteered to assist with TeraGrid’s presence in the RP booths and in<br />
the TeraGrid booth. About 1,000 Science and EOT Highlights documents and other TeraGrid<br />
communication collateral were disseminated. We also distributed about 1,000 orange “petaflops”<br />
(flip-flops) left over from the TeraGrid’10 Conference.<br />
The TeraGrid booth showcased group photos from TeraGrid outreach events including the EXPO<br />
in DC and the EU-US Summer Schools.<br />
Leake and Hunt developed an engagement activity among the TG booths to showcase the<br />
Campus Champion Program. The theme “Embrace the Future with a Campus Champion”<br />
identified champions as important team members for expanding and supporting the TeraGrid<br />
community. Nineteen “trading” cards featured photographs of the champions and information<br />
about TeraGrid’s transition to XD. These champions were provided with copies of the cards for<br />
their efforts in acquainting local users with TeraGrid.<br />
http://www.teragridforum.org/mediawiki/index.php?title=TeraGrid_and_RP_Booth_Engagement<br />
_Activity<br />
Leake shopped locally for Mardi-Gras themed decorative elements to give the booth a regional<br />
flavor. Feather boas, beads, and fragrant floral arrangements echoed the colors of the scientific<br />
images in the booth, petaflops, and brochures. After the conference, the floral arrangements and<br />
decorative objects were donated to a local nursing home. The arrangements, showcasing huge<br />
stargazer lilies, were still fresh and beautiful in the nursing home for many more days.<br />
TeraGrid’11 Conference, Salt Lake City Utah, July 18-24, 2011<br />
ER and EOT personnel have been instrumental in the initial planning for the 2011 conference. Trish<br />
Barker (NCSA) is the communications committee chair. Bill Bell (NCSA) is involved as deputy chair.<br />
Many other ER team members serve on various committees. Diane Baxter (SDSC) and Pallavi<br />
Ishwad (PSC) are the EOT track co-chairs. Laura McGinnis (PSC) is the lead for the Student Program<br />
and is coordinating with OSG on a Summer School for students. Scott Lathrop (UChicago) is the<br />
chair for BOFs.<br />
Proceedings from TG’10 (leadership provided by Froelich and many others on the ER team) were<br />
published by the ACM on December 4:<br />
http://portal.acm.org/citation.cfm?id=1838574<br />
10.2 EOT Impact Stories<br />
Indiana<br />
IU event to help Indiana Business Leaders get their heads in the clouds (IU News release, CI<br />
Newsletter, multiple local and regional media outlets in Indiana)<br />
Turn on any technology-related news program or radio show these days and you're likely to hear<br />
someone mention cloud computing. "Cloud" has become a hot technology buzzword, in part due<br />
to the substantial promise cloud applications hold for business and industry.<br />
To help leaders from corporations and small businesses begin to realize this potential, the<br />
Pervasive Technology Institute at Indiana University hosted an educational seminar discussing<br />
how cloud technologies are providing access to greater computational power at a lower cost. The<br />
seminar, “Cloud Computing for Business and Industry”<br />
(http://newsinfo.iu.edu/asset/page/normal/8185.html), was held in conjunction with the IEEE 2nd<br />
International Conference on Cloud Computing Technology and Science (CloudCom 2010).<br />
World cloud computing leaders convene at IU-hosted conference (HPC Wire, SC Online, IU<br />
News release, a number of national and regional outlets) (also video from the<br />
conference available online)<br />
The hot technology buzzword, "cloud," describes Internet-accessible infrastructure -- such as data<br />
storage and computing hardware -- that is hidden from users. Top cloud computing researchers<br />
and industry leaders from around the globe are gathering in Indianapolis this week to discuss the<br />
latest research and the future of this powerful emerging discipline.<br />
The IEEE 2nd International Conference on Cloud Computing Technology and Science<br />
(CloudCom 2010) took place from Tuesday, November 30 through Friday, December 3 at<br />
University Place Conference Center at Indiana University-Purdue University Indianapolis<br />
(IUPUI). This year’s hosts were the Pervasive Technology Institute at Indiana University and the IU<br />
School of Informatics and Computing. The conference included a number of talks on TeraGrid<br />
related technologies and services.<br />
SDSC<br />
“Grand Challenges in Data-Intensive Discovery” was the focus of a conference at SDSC on<br />
October 26-28, 2010. An element of SDSC’s TeraGrid outreach program associated with the new<br />
Gordon Track 2D system, the conference provided an opportunity for attendees to share their<br />
expertise while exchanging ideas about the computational challenges and concerns common to<br />
data-intensive problems. Presenters from a broad representation of disciplines allowed<br />
participants to understand some of their shared computational challenges. A small sampling of<br />
attendee responses to follow-up evaluation gives an indication of the conference’s success.<br />
“A number of collaborations resulted from this conference. It was amazing to see how common<br />
the data challenge problems are. It was also interesting to see how different fields have come<br />
up with similar solutions independently. This clearly points to an urgent need for multidisciplinary<br />
efforts.”<br />
“I wanted to learn about the characteristics and the needs of data-intensive applications, how<br />
they are matched by existing systems, and how future systems could be built to meet these<br />
applications’ needs. The conference had a good balance of technical and high-level talks.”<br />
“I wanted to learn from other researchers in different fields - especially Social Science &<br />
Humanities - about their work with large data sets. There are commonalities with data<br />
searching tools in neuroscience and archaeology. This was interesting to see.”<br />
“The mix of breadth and depth, centered around what this machine [Gordon] has to offer a<br />
potentially wide range of areas, was very good.”<br />
Participants also provided valuable feedback on additional topics for conferences, tutorials, and<br />
workshops related to data-intensive computing. Their candid perspectives on where they saw<br />
expertise needed in the next three years provide useful guidance for workforce development programs.<br />
SDSC’s Diane Baxter served as a committee chair for an NSF task force on Cyberlearning and<br />
Workforce Development, working closely with other members of the TeraGrid community and<br />
national STEM education colleagues to examine and analyze risks and opportunities in emerging<br />
national priorities for workforce development associated with cyberinfrastructure and<br />
computational STEM.<br />
SDSC’s Jeff Sale spearheaded design of the TeraGrid Campus Champions portal, an effort<br />
involving a great deal of community input, testing, and revision. In December, the portal was<br />
finally deployed as an integrated component of the TeraGrid User Portal. At the time of this<br />
report, the portal is being used by the community. Jeff’s ongoing online portal<br />
support provides a mechanism for continuing improvements and updates to the portal to best<br />
serve the needs of the Campus Champions.<br />
As Dash came on line and Gordon testing continued, SDSC shared its discoveries along the way<br />
through two presentations at high profile meetings that were also included in published<br />
proceedings of those events:<br />
• Moneta: A High-performance Storage Array Architecture for Next-generation, Non-volatile<br />
Memories, Adrian M. Caulfield, Arup De, Joel Coburn, Todor Mollov, Rajesh<br />
Gupta, and Steven Swanson, Proceedings of The 43rd Annual IEEE/ACM International<br />
Symposium on Microarchitecture, 2010.<br />
• Understanding the Impact of Emerging Non-Volatile Memories on High-Performance,<br />
IO-Intensive Computing, Adrian M. Caulfield, Joel Coburn, Todor Mollov, Arup De,<br />
Ameen Akel, Jiahua He, Arun Jagatheesan, Rajesh K. Gupta, Allan Snavely, and Steven<br />
Swanson, Supercomputing, 2010. (Nominated for best technical paper and best student<br />
paper).<br />
TACC<br />
Landmarks on the Human Genome<br />
Vishy Iyer, The University of Texas at Austin<br />
Genetics and Nucleic Acids<br />
We typically think of heredity as rooted in our genes, and it is. But<br />
scientists are finding that variants in the non-coding regions of the<br />
genome, formerly considered junk, and even proteins that are not<br />
part of the genome at all but that interact with genes, play a role in<br />
our personal traits as well.<br />
Vishy Iyer, an associate professor in the Institute for Cellular and<br />
Molecular Biology at The University of Texas at Austin, used<br />
TeraGrid resources to explore the expression of genes related to<br />
regulatory proteins and to investigate the role of heredity in the<br />
transcription binding process. Based on his findings, Iyer and<br />
colleagues published one of the first studies that used next-gen<br />
sequencing and HPC analysis in Science in April 2010.<br />
[Figure 1: Representation of allele-specific and non-allele-specific SNPs across the CTCF<br />
binding motif (17). The y axis indicates the difference between the two as a percentage of<br />
normalized total SNPs. Higher bars indicate an increased representation of allele-specific SNPs<br />
relative to other positions, which tends to occur at conserved positions.]<br />
The Ranger system at the Texas Advanced Computing Center assisted the process by taking the<br />
short sequences read by the ChIP-Seq gene sequencers and aligning them to the reference genome.<br />
It sounds simple enough, until you imagine sequencing millions of small bits of DNA and<br />
matching them to approximately three billion base pairs. Using several thousand cores in parallel<br />
on Ranger, the alignment required more than 175,000 processor hours, or the equivalent of 20<br />
years on a single processor.<br />
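The single-processor equivalence quoted above is straightforward arithmetic:<br />

```python
# Back-of-the-envelope check of the "20 years" figure.
processor_hours = 175_000
hours_per_year = 24 * 365                          # non-leap year
print(round(processor_hours / hours_per_year, 1))  # -> 20.0
```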
Their proof-of-principle study determined that distinctions in transcription factor binding can be<br />
studied using their method. They were also able to distinguish binding on the gene copies<br />
inherited from the father and the mother, showing that transcription factor binding appears to be a<br />
heritable trait. They are now using this technology to look at those differences and apply them to<br />
cases where a mutation from a parent predisposes one to a disease.<br />
10.3 EOT Highlights<br />
UChicago/ANL<br />
Following the completion of the SC10 Conference in November, Scott Lathrop, TeraGrid Area<br />
Director for EOT, launched his role as the SC11 Conference Chair. SC11 will be held November<br />
12-18, 2011, in Seattle, Washington (http://sc11.supercomputing.org). Numerous staff from<br />
TeraGrid and Blue Waters are involved as committee members for the SC11 Conference.<br />
Indiana<br />
The Pervasive Technology Institute at Indiana University announced that it has joined OpenSFS,<br />
a technical organization that promotes collaboration among institutions using open-source<br />
scalable file systems such as Lustre. Indiana University participated in the TeraGrid booth at the<br />
USA Science & Engineering Expo on the National Mall and provided a six-minute 3D stereo video<br />
about the Linked Environments for Atmospheric Discovery (LEAD) project. Indiana University<br />
along with four research partners announced that they had succeeded in using the Lustre file<br />
system over a wide area network (WAN), saturating the world's first commercial 100 gigabit link<br />
with an aggregate average transfer rate of 21.9 GBps or 87% of theoretical maximum.<br />
ORNL<br />
TeraGrid staff played an important role in the SC10 education program in New Orleans in<br />
November 2010. Laura McGinnis chaired this year’s program, which<br />
was an unqualified success, with over 100 attendees from secondary and undergraduate<br />
institutions. An addition this year was the opening of the sessions to general conference attendees,<br />
many of whom attended one or more sessions. In fact, in a few sessions even the large rooms<br />
available in New Orleans were filled to capacity. Other TeraGrid staff assisted as well, including<br />
Edee Wiziecki, John Cobb, and Scott Lathrop.<br />
Purdue<br />
Purdue hosted a CI Days event on December 8-9, 2010 on Purdue's West Lafayette campus.<br />
Purdue CI Days 2010 aimed to broaden the use of cyberinfrastructure, with expert presentations<br />
and seven panel discussions. Most of the panelists were researchers already taking advantage<br />
of CI resources, ranging from cutting-edge supercomputers to classroom discussion tools. They<br />
shared their experiences and lessons learned with the audience. Approximately 200 researchers<br />
and students attended the event.<br />
TACC<br />
TACC’s education and outreach programs include the scientific computing courses (2 during this<br />
reporting period) and tours of TACC’s facility at the J.J. Pickle Research Campus (about 10 miles<br />
north of UT Austin’s main campus) and the ACES Visualization Laboratory at the UT Austin<br />
main campus. The Austin Forum continues to attract a broad audience from the university, the<br />
city, and the wide range of technology industries around Austin. TACC’s Scientific Computing<br />
courses (3) were all well enrolled this semester, reaching 74 graduate and undergraduate students.<br />
Course descriptions and instructional slides for TACC’s two introductory courses, Introduction to<br />
Scientific Programming, and Scientific/Technical Computing, were made available online to the<br />
public. TACC also partnered with the Division of Statistics and Scientific Computation to build a<br />
new classroom for advanced computational instruction. The room will be completed in January<br />
and dedicated in the first quarter of 2011. TACC has continued to use social media and social<br />
news sites like Facebook, Twitter, Slashdot and Digg, to broaden our reach and participate in<br />
community conversations. TACC continues to experiment with webcasting training courses, and<br />
will expand the course availability in the coming months.<br />
10.4 Training, Education and Outreach<br />
NCSA<br />
New CI-Tutor Course Released: NCSA released a new online tutorial in CI-Tutor, “Introduction<br />
to Performance Tools.” This tutorial provides an introduction to a number of commonly-used<br />
performance tools on the TeraGrid. The tools highlighted include two Linux utilities: strace,<br />
which traces system calls and signals during program execution; and gprof, which can be used to<br />
track how functions are using system resources. More complex toolsets include NCSA’s<br />
PerfSuite, a collection of tools and supporting libraries that provides a higher-level view of<br />
application performance; and TAU (Tuning and Analysis Utilities), a suite of tools for<br />
performance instrumentation, analysis, and visualization of large-scale parallel computer systems<br />
and applications. The course is organized as individual lessons, each of which provides a<br />
general overview of a tool, demonstrates how the tool might be run on a TeraGrid computing<br />
resource, provides examples of subsequent data analysis, and concludes with a self-assessment<br />
and pointers to more comprehensive resources.<br />
CI-Tutor Upgraded: CI-Tutor was upgraded to ATutor version 2.0.2. ATutor is the open-source<br />
learning management software on which CI-Tutor is based. This version provides increased<br />
security and a number of enhancements to the functionality. The domain name for the site was<br />
also changed from ci-tutor.ncsa.illinois.edu to www.citutor.org.<br />
NICS<br />
Jim Ferguson (NICS) joined other TeraGrid External Relations and EOT staff in Washington,<br />
D.C. at the USA Science & Engineering Festival (http://www.usasciencefestival.org/). The<br />
TeraGrid EOT team was responsible for one of fifteen booths staffed by NSF<br />
projects for the October 23-24 capstone event on the National Mall and Pennsylvania Avenue. A<br />
conservative estimate put the number of visitors to the TeraGrid booth at around 2,000 each<br />
day. Somewhere between 200,000 and 400,000 attendees came to D.C. during the week-long<br />
series of events.<br />
Bhanu Rekepalli, Jacek Jackowski, and Jim Ferguson (all of NICS) participated in a Panel<br />
Session at SC10 in New Orleans, November 17. The panel concerned Workforce Development, a<br />
first of its kind at an SC Conference, and the three NICS speakers highlighted some of the work<br />
they had done in this area with the South Carolina/Tennessee EPSCoR project and their<br />
associated TeraGrid work. Approximately 40 people attended the Panel.<br />
Jim Ferguson (NICS) was on the conference staff for the TeraGrid/DEISA Summer School, held<br />
in Acireale, Italy, October 4-6. The workshop was targeted at graduate level students in<br />
computational sciences, and the attendees included 25 students from the U.S. and 35 students<br />
from European institutions. Ferguson helped review and select students from the U.S. and<br />
Europe to attend the conference, and was responsible for engaging the students during the<br />
conference to assess their level of satisfaction with the program and garner any suggestions for<br />
future like workshops.<br />
ORNL<br />
A case study of the NSTG RP at ORNL was presented at the international NOBUGS 2010<br />
workshop in October in Gatlinburg (http://www.nobugsconference.org/Conferences). This three-day<br />
workshop had over 50 attendees from four different continents. The NSTG case study<br />
was the capstone presentation, officially ending the conference presentations.<br />
Purdue<br />
Purdue’s Kay Hunt, the TeraGrid Campus Champion Program lead, gave a number of<br />
presentations, including at the Oklahoma Supercomputing Symposium, the EPSCoR Cyberinfrastructure<br />
meeting in Washington, DC, EDUCAUSE in Anaheim, CA, Internet2 in Atlanta, GA, IEEE<br />
CloudCom in Indianapolis, IN, and CI Days events at the University of Michigan, the University of<br />
Wisconsin-Milwaukee, and Purdue. She planned and led Campus Champion BOFs at SC10 and<br />
the Internet2 meeting.<br />
The Purdue RP provided accounts on Steele and assisted with training in Puerto Rico in<br />
December 2010.<br />
10.5 RP Operations: Training<br />
Jim Marsteller held the annual “Security 4 Lunch” on 08-Nov-2010 for all PSC staff and students.<br />
The one hour presentation covered general information on security best practices including<br />
material for PSC staff traveling to SC10. This year the staff was very interactive, asking a number<br />
of security related questions. In fact, the VPN demo ran over the one hour allotted time.<br />
SDSC<br />
RP-funded training activities included hands-on workshops, training sessions via the Access Grid,<br />
and support of online and in-person tutorials. TeraGrid RP staff participated in both local site-specific<br />
training and TG-wide training.<br />
TACC<br />
TACC and Cornell HPC staff members taught 6 workshops within the reporting period. Staff<br />
members provided 36.5 hours of instruction to a total of 71 students.<br />
VIRTUAL WORKSHOP<br />
Staff members at the Center for Advanced Computing at Cornell University continue to maintain,<br />
support, and develop new modules for the Ranger Virtual Workshop. Available through both the<br />
TeraGrid User Portal and the TACC User Portal, the workshop provides users access to seventeen<br />
training modules with seven new modules currently under development and three modules<br />
scheduled for updates.<br />
Available Modules<br />
An Introduction to Linux (new)<br />
An Introduction to C Programming (new)<br />
Parallel Programming Concepts and High-Performance Computing<br />
Ranger Environment<br />
Message Passing Interface (MPI)<br />
MPI Point-to-Point Communications (updated)<br />
MPI Collective Communications<br />
MPI One-Sided Communication<br />
OpenMP<br />
Hybrid Programming with OpenMP and MPI<br />
Profiling and Debugging<br />
Optimization and Scalability<br />
Computational Steering<br />
Large Data Visualization<br />
ParaView<br />
VisIt<br />
Using Databases<br />
Modules Under Development<br />
Advanced MPI (We are planning to add content to the existing MPI modules instead)<br />
Data Analysis with Python on Ranger<br />
Distributed Debugging Tool (DDT)<br />
Multi-core<br />
Multi-node Map Reduce<br />
Parallel I/O<br />
Profiling with mpiP<br />
Using Scripting Tools on Ranger<br />
Modules Being Updated<br />
Hybrid<br />
MPI Collective Communications<br />
OpenMP<br />
10.6 RP Operations: Education<br />
NCAR<br />
In December, NCAR opened applications to the SIParCS program for 2011. This year, 18<br />
summer project descriptions were solicited from local mentors. NCAR conducted training in the<br />
VAPOR visualization tool at the American Geophysical Union 2010 meeting in San Francisco.<br />
Approximately 25 people attended.<br />
PSC<br />
PSC’s Pallavi Ishwad and Jacob Czech gave a 90-minute presentation on the CMIST module<br />
“Enzyme Structure and Function” at the education program at SC10. The presentation was well-received.<br />
About 25 educators, 15 from under-represented groups, attended the presentation and<br />
provided important feedback for further improvements to the module. CMIST (Computational<br />
Modules in Science Teaching) is a program of the National Resource for Biomedical<br />
Supercomputing (NRBSC) at PSC that aims to bring innovative science tutorials into secondary<br />
school classrooms, focusing on integrative computational biology, physiology, and biophysics.<br />
CMIST modules include high quality, biologically realistic 3-D animations produced with<br />
cutting-edge simulation and visualization software developed at NRBSC.<br />
Ishwad also gave a 90-minute presentation on BEST, Better Educators of Science for Tomorrow.<br />
About 40 educators, 15 from under-represented groups, attended. BEST is a PSC outreach<br />
program that exposes teachers to modern molecular biology concepts and evaluates the inclusion of<br />
computational biology and bioinformatics in high school curricula. In addition, BEST<br />
prepares teachers to refocus their teaching strategies towards exposing and encouraging students<br />
to become aware of emerging and exciting biomedical careers.<br />
Also at SC10 Rebecca Day, a teacher at Frazier High School, presented three posters about<br />
Frazier’s participation in BEST. Frazier is in a rural school district in Western Pennsylvania with<br />
a 75% reduced meal plan population. Two of the posters were prepared by Frazier’s students in<br />
conjunction with PSC staff members and one by Day and other teachers, also in conjunction with<br />
PSC staff members. All four of the teachers and two of the four students are female. At least 50<br />
people visited the posters. Titles of the posters are the following:<br />
1. Bioinformatics in High School<br />
2. High School Research Internship in Bioinformatics<br />
3. Summer Internship at PSC<br />
Purdue<br />
In the fall semester of 2010, the Purdue C4E4 science gateway (http://c4e4.rcac.purdue.edu) was<br />
used in a graduate-level Computational Watershed Hydrology class (CE549) taught by Professor<br />
Venkatesh Merwade of Civil Engineering. Students used the portal to calibrate the SWAT (Soil and<br />
Water Assessment Tool) model for their research. SWAT calibration is computationally intensive,<br />
and in the past students were not able to perform it within a semester-long class. In one study, a<br />
student from the Department of Agronomy investigated the effects of soil parent materials on<br />
streamflow and nutrient processes using a paired catchment study in Indiana. He used the<br />
gateway to conduct long running calibrations on the TeraGrid. Preliminary findings indicate that<br />
nitrate output is more sensitive to soil parent material than either streamflow or sediment load.<br />
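The compute cost comes from the calibration loop itself: each candidate parameter set requires a full watershed simulation, and hundreds or thousands of candidates may be evaluated. A minimal sketch of such a loop follows; the model function, parameter names, and use of the Nash-Sutcliffe efficiency as the objective are illustrative assumptions, not the actual SWAT or C4E4 interface.<br />

```python
import itertools

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency, a common hydrology fit metric: 1.0 is a perfect fit."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

def run_model(cn2, awc):
    """Stand-in for one model run (hypothetical; a real SWAT run takes minutes to hours).
    Returns a simulated streamflow series for the given parameter values."""
    return [cn2 * 0.1 + awc * t for t in range(10)]

observed = [0.5 + 0.25 * t for t in range(10)]  # toy observed streamflow series

# Grid search over candidate parameters: every combination costs one full
# model run, which is why a gateway farms these out to TeraGrid compute nodes.
best = max(
    itertools.product([3, 4, 5, 6], [0.15, 0.2, 0.25, 0.3]),
    key=lambda p: nash_sutcliffe(observed, run_model(*p)),
)
print(best)
```

With the toy model above, the search recovers the parameter pair whose simulated series matches the observed one exactly.<br />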
SDSC<br />
TG RP Education activities included activities to communicate the relationships between science,<br />
technology, and society, and the role that scientific computing plays. TG RP staff reached out to<br />
the K-20 community in an effort to encourage student interest in math and science during the<br />
education process, to members of diverse, underrepresented groups in order to broaden<br />
participation in high performance computing, and to the general public in order to promote<br />
awareness of the role of high performance computing in advancing scientific discovery. Some RP sites<br />
contribute to the scientific computing curriculum at their institution, creating and maintaining<br />
"course packages" that are available to higher education faculty to produce their own courses.<br />
SDSC Education programs in the fourth quarter included ten student-focused workshops and<br />
classes, and fourteen workshops for teachers. SDSC’s TeacherTECH program benefitted from<br />
new and expanded collaborations with other organizations in the region that enhanced programs<br />
in bioinformatics and astronomy. Despite unusually low teacher professional development<br />
participation statistics in the region (due to a depressed state economy characterized by<br />
widespread layoffs and course cancellations at all levels), 221 teachers (153 underrepresented in<br />
STEM) participated in 14 workshops totaling more than 32 contact hours. It was truly<br />
encouraging that teachers participated enthusiastically in a new, multi-session, space science<br />
series without any financial incentives.<br />
Expanded programs for students proved tremendously popular, offering computer science,<br />
computational sciences and computing tool training that are not provided in the region’s schools.<br />
The ten classes – several of them full-day (6 hour) programs – attracted 175 (116<br />
underrepresented in STEM) students from middle school through undergraduates.<br />
TACC<br />
TACC Scientific Computing Curriculum Courses: Since the fall of 2006, over 400 students have<br />
enrolled in our courses. A total of 74 undergraduate and graduate students enrolled in the three<br />
Fall 2010 courses, Introduction to Scientific Programming (focused on FORTRAN and C),<br />
Scientific and Technical Computing, and Visualization and Data Analysis. Six TACC scientists<br />
are teaching these courses. In each course, enrollment is very high. Engineering students continue<br />
to make up the greatest percentage of students, compared to the natural sciences, computer sciences,<br />
geosciences, and liberal arts. The majority of students are involved in research. Introduction to<br />
Scientific Programming continues to attract the greatest percentage of undergraduates per course.<br />
TACC is evaluating the quality and effectiveness of the courses with pre- and post-course surveys<br />
based on the content, learning objectives, and the computational science competencies from the<br />
Ralph Regula School of Computational Science. The Spring 2010 course data indicates<br />
statistically significant gains in students’ knowledge and skills. There are a few interesting<br />
findings: Introduction to Scientific Programming attracts mostly undergraduates, while graduates<br />
are the overwhelming majority in the Parallel Computing course. The majority of students in the<br />
programming class were from engineering and physics, but two liberal arts majors enrolled in the<br />
class as well. Student achievement in the learning objectives for the parallel computing course,<br />
based on the competencies developed by the Ralph Regula School of Computational Science,<br />
increased from little or no experience to proficient/very experienced over the course of the<br />
semester. An encouraging highlight is 75% of students in the scientific programming course<br />
indicated that they are applying their new programming capability in other courses.<br />
TACC Scientific Computing Curriculum Package: Production of the Scientific Computing<br />
Curriculum Package came to a close in the 4th quarter of 2010. Instructors reviewed the revised<br />
course materials and they were made available via the TACC website at SC’10<br />
(http://www.tacc.utexas.edu/education/academic-courses/). The presentation slides contain the<br />
bulk of the courses’ content that students use on a regular basis, thus the content accuracy and<br />
quality of the slides are critical to maximize impact and sustainability. Students have commented<br />
to our instructors about the high quality of the slides. The package content extends beyond the<br />
course slides to ensure that faculty at other institutions may readily adopt or integrate the<br />
materials for creating their own courses or enriching existing courses. Additional content includes<br />
assignments, tests, projects, references, and instruction guides.<br />
10.7 RP Operations: Outreach<br />
Indiana<br />
Indiana University hosted the IU Statewide IT Conference, and the Data to Insight Center of<br />
the Pervasive Technology Institute hosted a booth to highlight the work scientists are able to<br />
accomplish with the technology provided by Indiana University and the TeraGrid. During the<br />
Show and Tell reception, Mike Boyles, Robert Ping, and several students from the Center showed<br />
the stereoscopic video LEADING THE WAY and gave live demonstrations of climate<br />
forecasting enabled by Big Red, one of the TeraGrid supercomputers.<br />
NCAR<br />
For the first time, SIParCS outreach advertisements were sent to 45 universities, including 15<br />
minority-serving institutions.<br />
PSC<br />
During the last quarter, PSC’s Phil Blood was the initial TeraGrid contact for seven new Campus<br />
Champions. He helped two of these new Champions find answers to questions that they had<br />
regarding TeraGrid, either for themselves or users on their campuses. During the same period he<br />
also assisted three established Champions in finding answers to questions related to their service<br />
as Campus Champions or TeraGrid usage scenarios important to users on their campuses. The<br />
bulk of his duties, however, involve his work with the Campus Champion Leadership Team and<br />
Advisory Board. Each month he attends three monthly calls (leadership, advisory, and general<br />
Campus Champion call) to help plan and organize the Champion effort and implement strategies<br />
for improving the program. For example, in this quarter he helped develop a Champion mentor<br />
program in which an established Champion is assigned to work with new Champions. He also<br />
attended Champion meetings at SC’10.<br />
PSC had a booth and made a presentation at the Three Rivers Educational Technology<br />
Conference on 10-Nov-2010. This was a good venue to reach out to the local K-12 community<br />
and we had many visitors to the PSC booth. Contacts made or renewed included the technology<br />
coordinators at Winchester-Thurston and Ellis Schools; many people at the Allegheny<br />
Intermediate Unit, including the new Executive Director Linda Hippert; Grable Foundation<br />
Executive Director Gregg Behr; various Pittsburgh Technology Council people; and a number of<br />
technology coordinators and STEM teachers in the region.<br />
PSC has decided to once again be a bronze-level sponsor for SciTech Days this year. SciTech<br />
Days is one of the activities of SciTech, a year-round initiative of the Carnegie Science<br />
Center in Pittsburgh. Bronze-level sponsorship allows PSC not only to have a booth (March 8 –<br />
11, 2011) at SciTech Days but also to make two awards at the Pittsburgh Regional Science and<br />
Engineering Fair (April 1-2) for projects with the best computational element. This also helps to<br />
further PSC’s relationship with the Carnegie Science Center as a community partner.<br />
On 17-Nov-2010 a group of senior computer science majors from Slippery Rock University (part<br />
of the Pennsylvania State College system) visited PSC for a tour of the machine room that<br />
included a talk of HPC careers. Slippery Rock is also interested in possible internships for their<br />
students in the future.<br />
SDSC<br />
TG RP Outreach activities included activities to communicate the relationships between science,<br />
technology, and society, and the role that scientific computing plays. TG RP staff also reach out<br />
to the scientific and professional communities in an effort to encourage interest in the use of<br />
TeraGrid resources and services. Many RP sites contribute to broadening participation in the<br />
scientific computing community that should be covered in this section.<br />
As mentioned in the Highlights, SDSC activities included several national-scale outreach<br />
activities critical to shaping the future of the TeraGrid, XD, and data-intensive discovery. SDSC’s<br />
“Grand Challenges in Data-Intensive Discovery” conference, held October 26-28, 2010, provided<br />
attendees a venue to share expertise and ideas about the computational challenges and concerns<br />
common to data-intensive problems from many fields of inquiry. Follow-up participant surveys<br />
provided valuable feedback on future data-intensive computing topics for training, outreach, and<br />
workforce development programs.<br />
The TeraGrid Campus Champions portal was deployed as an integrated component of the<br />
TeraGrid User Portal in December, with strong contributions from SDSC. Sale’s dedicated<br />
online support fueled significant site design and function improvements to serve the needs of<br />
future Campus Champions.<br />
SDSC shared its Dash and Gordon (Track 2D) development process discoveries through highly<br />
reviewed presentations and published proceedings of international professional meetings.<br />
TACC<br />
For this quarter, TACC hosted three of its monthly Austin Forum events with invited speakers<br />
from areas of interest focused on science and technology. The goal of “The Austin Forum on<br />
Science, Technology & Society” is to engage and educate the Austin community about the<br />
numerous ways in which science and technology enhance the quality of their everyday life as<br />
well as the health, prosperity, and security of the nation. Two hours are devoted to a networking<br />
reception, presentation, and Q&A discussion between the speaker and guests, which reinforces the<br />
broader impact and intellectual merit of the program. Speakers and topics for this reporting period<br />
focused on:<br />
• Oct. 2010: Dr. Steven Weinberg, Hopes and Fears for Big Science. Dr. Steven Weinberg<br />
holds the Josey Regental Chair in Science at The University of Texas at Austin, where he<br />
is a member of the Physics and Astronomy Departments. His research on elementary<br />
particles and cosmology has been honored with numerous prizes and awards, including in<br />
1979 the Nobel Prize in Physics and in 1991 the National Medal of Science. In 2004, he<br />
received the Benjamin Franklin Medal of the American Philosophical Society with a<br />
citation that said he is "considered by many to be the preeminent theoretical physicist<br />
alive in the world today."<br />
• Nov. 2010: Daniel Lorenzetti and Juan Garcia, "Transmedia Storytelling.” Transmedia<br />
Storytelling is a new way to use multiple media outlets, concurrently, with each element<br />
enhancing and advancing the narrative. It has profound implications for politics,<br />
marketing, and all forms of entertainment.<br />
• Dec. 2010: Dr. Alfred Gilman, CPRIT: Opportunities and Challenges. The Cancer<br />
Prevention and Research Institute of Texas (CPRIT) is a new Texas agency empowered<br />
to invest $3 billion over 10 years to enhance research and prevention activities toward<br />
alleviation of suffering and death from cancer. This vision poses enormous questions<br />
about distributed clinical trials management systems, as well as means for acquisition,<br />
storage, analysis, integration, and dissemination of vast amounts of information.<br />
TACC openly advertises tours of our facilities to the Austin and central Texas area. Tour groups<br />
include visitors in K-12, higher education, industry, government, and the general public. An<br />
overview of TeraGrid and its contributions to STEM research is provided at each event.<br />
10.8 External Relations<br />
A number of people serve on both the EOT and ER teams. The ER team helps develop materials<br />
for outreach, and its members participate as presenters during outreach events. In Q410, Leake assisted with<br />
two CI Days, the US Expo, the DEISA/TG Summer School, and the Puerto Rican outreach<br />
initiative. Hunt (Purdue) serves on both ER and EOT teams and leads the Campus Champion<br />
initiative and assists with planning for the CI Days events. Ping (IU) also serves on both teams<br />
and is responsible for facilitating TeraGrid’s presence at conferences and events. Every ER team<br />
member serves as a TeraGrid ambassador through their involvement with outreach via<br />
professional <strong>org</strong>anizations, campus visits, committees, and attending conferences. Through these<br />
activities, and by sharing our stories and communication collateral with others, the ER team<br />
communicates TeraGrid’s impact. Leake, and ER team members, help to recruit and build<br />
relationships to engage multidisciplinary users from vastly diverse communities, regionally,<br />
nationally, and internationally.<br />
A number of joint ER-EOT outreach activities were highlighted at the start of Section 10.<br />
10.8.1 Benchmarking Effectiveness on the WWW and identifying exploits<br />
To benchmark the effectiveness of TeraGrid’s external relations efforts, Leake reviews Google<br />
alerts as they happen—new web information that appears mentioning TeraGrid in a unique way.<br />
In Q410 there were 114 legitimate alerts. In calendar year 2010 there were 657 in total, just over<br />
twice the 328 recorded in 2009. This is an important benchmark for the<br />
effectiveness of the collective efforts of TeraGrid’s External Relations team.<br />
The process is also a way to determine quickly whether TeraGrid’s resources have been exploited by<br />
organizations that might use our domain for malicious or inappropriate purposes. Suspected exploits<br />
are immediately sent to Jim Marsteller (PSC), leader of TeraGrid’s security working group, for<br />
handling, such as the recent case where a TeraGrid resource had been used to sell<br />
pharmaceuticals.<br />
10.8.2 Involvement in the Allocations Process<br />
Leake released the quarterly allocation request deadline announcement (TRAC) at mid-month and near the<br />
end of the month in anticipation of the January 15 deadline. Fourteen sources carried the story.<br />
Reminders were sent via TeraGrid’s Facebook community and published on the TeraGrid web<br />
site. This year the external relations team took a more active interest in the allocation process.<br />
Each quarter, a letter was provided to each of the TRAC reviewers (about 50), to emphasize the<br />
importance of promoting science through the efforts of the ER team.<br />
10.8.3 2010 and 2011 Science Highlights Selection Process<br />
In Q410, the ER team completed production of the 2010 Science Highlights publication for initial<br />
distribution at SC10 in New Orleans. Warren Froelich (SDSC) coordinated the production of the<br />
publication in collaboration with Carlton Bruett Design, a minority owned company. Copies of<br />
the publication were sent to NSF (OCI & OLPA), to Hunt (Purdue) for distribution to Campus<br />
Champions, to Ping (IU) for use with outreach to professional conferences and meetings, and to<br />
each Resource Provider for their outreach efforts. The Science and EOT Highlights publications<br />
are posted on the TeraGrid web site at: https://www.teragrid.org/web/news/news#2010scihigh<br />
All members of the ER team assist with the identification of prospective highlights for reporting<br />
and publishing purposes. Leake and Katz attend the TRAC meetings on an alternating basis to<br />
help identify exemplary science and highlights that meet the seven selection criteria found on<br />
page 36 of the Science Highlights book:<br />
https://www.teragrid.org/c/document_library/get_file?uuid=e950f0a1-abb6-4de5-a509-<br />
46e535040ecf&groupId=14002<br />
Katz typically highlights about 10-15 applications via TeraGrid’s quarterly reporting process.<br />
Between 16 and 20 are eventually selected by the ER team and advisers (Lathrop, Roskies, Katz,<br />
Towns) for further development and inclusion in the annual science highlights book.<br />
In 2011, in anticipation of the transition to XD, the Science and EOT Highlights books will be<br />
combined. The ER and EOT teams have begun the process of identifying story prospects for this<br />
publication.<br />
10.8.4 RP Operations: External Relations<br />
PSC<br />
The movie “Career Gates: STEM” with appearances by PSC’s Joel Stiles and Jacob Czech won<br />
the National Academy of Television Arts & Sciences Mid-Atlantic Chapter’s 2010 Emmy®<br />
Award for Children/Youth/Teen (19 and under) - Program or Special. The movie is on YouTube<br />
at http://youtube.com/watch?v=1zGHCvkS4qw.<br />
SDSC<br />
TeraGrid RP local external relations staff developed local site-specific content as well as<br />
contributed to TeraGrid-wide activities and publications. External relations staff provide web and<br />
print media outlets detailing the successes of the TeraGrid project and its user community. They<br />
also develop materials for Supercomputing and TeraGrid conferences.<br />
SDSC’s Communications staff continued to provide writing and publications support to TG<br />
External Relations activities. Specifically,<br />
• Warren Froelich, SDSC’s communications director, served as lead editor for 2010 TG<br />
Science Highlights, coordinating, editing and providing graphic oversight for the<br />
publication. Following his stint as Deputy Chair for TG’10, Froelich was tapped to serve<br />
in the same capacity for TG’11, working with event chair John Towns (NCSA) on the<br />
planning and follow-through for the annual conference scheduled next July in Salt Lake<br />
City.<br />
• Jan Zverina, in his role as TG science writer, continued to identify and write stories,<br />
including an article on TG’s science gateways being used as a model for a five-year, $4.4<br />
million NSF award to form a collaborative software framework for analysis of<br />
geographic data. As TG’10 Communications Co-Chair, Jan also worked with Sonya<br />
Nayak and members of the TG’10 communications team to finalize related TG’10<br />
expenditures.<br />
• Ben Tolo, SDSC’s graphics designer/web manager, created the TG’11 Save the Date<br />
card, provided slide graphics for SC’10, and helped proof TG Science Highlights. Ben<br />
also has been asked to serve on the Communications committee for TG’11.<br />
10.8.5 News featured in Q4 on TeraGrid.org<br />
12/1/2010: Eight TeraGrid resources were included among the November 2010 Top500 (Leake)<br />
11/30/2010: SIParCS - Summer Internships in Parallel Computational Science application<br />
deadline is February<br />
11/29/2010: Women in Science Conference 2011 registration is now open<br />
11/22/2010: TeraGrid Partner TACC Receives 2010 HPCwire Readers' Choice Award (Singer-<br />
Villalobos/Dubrow)<br />
11/22/2010: UChicago/Argonne--Globus Cloud Solution Facilitates Large-Scale Data Transfer<br />
(GridFTP-enabled sites) (Leake)<br />
11/10/2010: SDSC-ASU's OpenTopography Facility Part of $4.4 Million NSF Award To<br />
Develop a Geographical Information Science (GIS) Framework (Froelich/Zverina)<br />
10/28/2010: TeraGrid and DEISA host first joint Summer School in Italy (Leake)<br />
10/21/2010: UChicago/ANL-The Beagle Has Landed In Chicago! New Cray XE6 Supercomputer<br />
for Biomedical Simulation and Data Analysis (Leake)<br />
10/18/2010: TeraGrid Partner SDSC Marks its 25th Anniversary (Froelich/Zverina)<br />
10/11/2010: Blacklight, the World’s Largest Coherent Shared-Memory Computing System, is Up<br />
and Running at the Pittsburgh Supercomputing Center<br />
10.9 Enhancement of Diversity<br />
NCAR<br />
SIParCS outreach position advertisements were sent to 15 minority-serving institutions.<br />
SDSC<br />
SDSC participated with TeraGrid colleagues at the 2010 annual conference of the Society for the<br />
Advancement of Chicanos and Native Americans in Science (SACNAS), staffing a booth in the<br />
exhibit hall and encouraging young scientists to participate in TeraGrid computational science<br />
activities at all levels – from student program participants, to users, to research and teaching<br />
faculty, to XD services professionals. The conference drew 3,350 registered participants (1,396<br />
undergraduates and first-time student attendees), among whom were 883 students presenting their<br />
scientific research.<br />
Student and teacher participant statistics from both StudentTECH and TeacherTECH programs at<br />
SDSC showed significant involvement by women and minorities underrepresented in STEM<br />
studies and careers.<br />
TACC<br />
TACC accommodates a wide range of tour requests from school programs where underrepresented<br />
students are in attendance. Highlights include a tour for home-schooled students; a<br />
Computer Security class from Austin Community College; and a tour of the Houston Livestock<br />
Show & Rodeo Committee. Each of these groups was given a presentation about TACC and a<br />
tour of either the Ranger supercomputer or the visualization laboratory. TeraGrid was described<br />
in each presentation.<br />
10.10 International Collaborations<br />
GIG<br />
The GIG worked with Hermann Lederer, Deputy Director of the Garching Computing Center of the<br />
Max Planck Society (RZG), to organize the first EU-US HPC Summer School in Catania, Italy,<br />
in October 2010. A report of that summer school is included in Appendix C, EOT EU-US<br />
Summer School Report.<br />
Indiana<br />
As part of Indiana University's ongoing relationship with the Technische Universitaet Dresden,<br />
IU was invited to use its expertise in wide-area file systems to help exercise the world's first<br />
commercial 100 Gigabit link. Provided by T-Systems and Alcatel-Lucent in June, the link uses a<br />
single wavelength to bridge the 60 kilometers (37 miles) between Dresden and Freiberg,<br />
Germany.<br />
With equipment provided by Data Direct Networks (DDN) and Hewlett Packard, and in<br />
consultation with Whamcloud, Lustre file systems were deployed in both Dresden and Freiberg. The preliminary<br />
goals of the project were to stress test Data Direct Network's S2A9900 and SFA10K disk<br />
appliances and fully exercise Lustre across the 100 gigabit connection.<br />
Using the full duplex capability of the 100 gigabit link, an average transfer rate of 21.904<br />
gigabytes per second was achieved. This rate is over 87 percent of the theoretical maximum<br />
allowed by the link and represents enough data to fill 285 DVDs every minute.<br />
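The reported figures can be checked with a little arithmetic: full duplex means traffic in both directions at once, so the theoretical ceiling is 2 × 100 Gbit/s = 25 gigabytes per second. A quick sketch (the 4.7 GB single-layer DVD capacity is an assumption not stated in the report):<br />

```python
LINK_GBIT = 100          # link speed in gigabits per second, each direction
rate_gbs = 21.904        # reported average transfer rate, gigabytes per second

# Full duplex doubles the usable bandwidth: 2 x 100 Gbit/s = 200 Gbit/s = 25 GB/s.
ceiling_gbs = 2 * LINK_GBIT / 8
utilization = rate_gbs / ceiling_gbs          # fraction of the theoretical max

dvd_gb = 4.7                                  # assumed single-layer DVD capacity
dvds_per_minute = rate_gbs * 60 / dvd_gb

print(f"{utilization:.1%}")                   # ~87.6%, matching "over 87 percent"
print(round(dvds_per_minute))
```

At 4.7 GB per disc this gives roughly 280 DVDs per minute; the reported 285 implies a slightly smaller per-DVD capacity, but the orders of magnitude agree.<br />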
LONI<br />
LONI announced the release of SAGA version 1.5.3 (Simple API for Grid Applications). It is<br />
now available on LRZ at Munich and is being installed in Amsterdam as part of the TG-DEISA<br />
project. SAGA has achieved piece-wise local execution of ensemble-MD runs on two DEISA<br />
machines and will support coupled runs in the near future.<br />
NCAR<br />
The user community of NCAR’s Asteroseismic Modeling Portal (AMP) is increasing in<br />
size with many new international participants. The gateway now has users from France,<br />
Germany, Australia, Italy, Spain, Austria, and the United Kingdom, in addition to its community<br />
of U.S. researchers.<br />
ORNL<br />
The ORNL RP participated in the conference organizing committee for the 8th NOBUGS<br />
conference held in October in Gatlinburg, TN. Attendance was over 50, from four different<br />
continents. NOBUGS is an ongoing collaboration among neutron, light, and muon sources. It is a<br />
forum to discuss and share advances in data acquisition, data management, analysis, and other<br />
user group software.<br />
The ORNL RP continues to collaborate with the Swiss NSF-funded Sinergia project<br />
(http://www.oci.uzh.ch/diversa/SNFProject/index.php), which is focused on<br />
fitting diffuse neutron and x-ray scattering patterns in order to characterize the molecular<br />
structure of materials exhibiting diffuse scattering features such as streaks.<br />
As the initial project approaches its end, the collaboration is concentrating on<br />
finishing publication submissions for completed work and exploring future opportunities in this area.<br />
PSC<br />
From October 4–7, 2010, Phil Blood of PSC gave an invited talk at the first European-<br />
US Summer School on HPC Challenges in Computational Sciences organized in a joint effort by<br />
TeraGrid and the Distributed European Infrastructure for Supercomputing Applications (DEISA)<br />
in Catania, Italy. There were 60 students in attendance at the summer school: 25 from U.S.<br />
institutions and 35 from European institutions, with about 20 nationalities represented. Together<br />
with Bernd Mohr from Jülich Supercomputing Centre, Blood spoke on tools and procedures for<br />
performance analysis of parallel applications, focusing on concrete examples from his experience<br />
at PSC working with research groups to improve their codes. Blood’s presentation focused on the<br />
U.S.-based tools involved in the POINT project (Productivity from Open, INtegrated Tools), an<br />
NSF-funded SDCI project in which PSC is a partner, while Mohr focused on European tools<br />
involved in the POINT project and a similar European collaboration. Comments following the<br />
talk by both instructors and students indicated that the presentation had been helpful and well<br />
received. The talk itself was only a small part of the experience, as Blood spent the week<br />
interacting with an array of students and instructors from different countries and backgrounds.<br />
During this time, Blood helped students directly with some of the problems and challenges that<br />
they are facing in computational science and formed several mutually beneficial associations with<br />
both students and instructors at the summer school.<br />
TACC<br />
TACC continues to support researchers and students in the UT Austin-Portugal Collaboratory.<br />
TACC is also collaborating with the Korea Institute of Science and Technology Information<br />
(KISTI) to create an integrated computational workflow and toolset that will be made widely<br />
available to serve the academic life science research community.<br />
10.11 Broader Impacts<br />
LONI<br />
As part of OGF (Open Grid Forum) activities, LONI developed multiple standards-compliant<br />
distributed backends for the FutureGrid PGI (Production Grid Infrastructure) and GIN (Grid<br />
Interoperability Now) efforts.<br />
NICS<br />
NICS has an ongoing relationship with the University of Tennessee-Knoxville’s chapter of the<br />
Society of Women Engineers. NICS Assistant Director Patricia Kovatch has given talks at<br />
campus meetings of the group, and NICS will be assisting SWE when it hosts its Regional<br />
Conference in the Spring of 2011.<br />
ORNL<br />
As mentioned above in section 7, John Cobb (ORNL), Chris Jordan (TACC) and Dan Katz (U.<br />
Chicago) continue to work to bring the DataONE datanet effort and TeraGrid efforts into closer<br />
collaborative arrangements.<br />
The outreach effort of the ORNL RP to the eBird team at Cornell is already having a broader<br />
impact in several dimensions. First, the direct impact of using TeraGrid resources is leading to<br />
more accurate and more useful analyses as the basis of land-use and conservation policy<br />
decisions. Second, this is a pathfinder effort that is showing the way to utilize HPC in the<br />
observational biological sciences in order to leverage the scientific utility of their collected data.<br />
In particular, the extensive volunteer “citizen science” aspect of eBird means that it can<br />
collect much larger and higher-quality data samples than would ever be possible from a funded<br />
field effort. With larger data sets comes the opportunity for much higher resolution and thus more<br />
useful analyses, but those results must also use larger and larger computational resources (in<br />
terms of computation, memory size, and I/O stress) in order to realize this potential. Third, this<br />
new user community and application approach (large-scale statistical analysis) broadens<br />
TeraGrid’s view of science application areas and brings a real-science, data-intensive use case<br />
that can help TeraGrid better understand and tailor its operations.<br />
SDSC<br />
An NSF task force on Cyberlearning and Workforce Development involved several TeraGrid<br />
EOT community members, including SDSC’s Baxter, who served as a committee chair for<br />
training, K-14, and informal education. The task force’s analysis of risks and opportunities for<br />
workforce development associated with cyberinfrastructure and computational STEM will advise<br />
both NSF program development and XD services well into the future.<br />
TACC<br />
TACC is endeavoring to broaden the scope of the research and education projects that it supports,<br />
including outreach to the Humanities, Arts & Social Sciences. In October, TACC hired Rob<br />
Turknett as the Digital Media, Arts & Humanities Coordinator for the center. In that capacity, he<br />
is developing collaborations with humanities researchers and educating emerging communities in<br />
Texas about the opportunities for computing at TACC.<br />
11 Project Management<br />
11.1 Highlights<br />
The reporting process was modified this quarter to accommodate a change in the reporting<br />
requirements as we enter the final phase of the TeraGrid.<br />
11.2 Project Management Working Group<br />
11.2.1 Tracking<br />
The IPP/WBS tracking process is continuing with project managers reviewing status with area<br />
directors. The area directors are finding the tracking process to be a useful exercise, providing<br />
them with a thorough overview of their area’s status against the plan. The update is included as an<br />
appendix to this report.<br />
11.2.2 <strong>Report</strong>ing<br />
The reporting process was modified this quarter to accommodate a change in the reporting<br />
requirements as we enter the final phase of the TeraGrid. Normally we would generate an annual<br />
report for the calendar year, but later this year we will be generating a final report that includes<br />
calendar year 2010 as well as the activity in 2011.<br />
12 Science Advisory Board Interactions<br />
No Science Advisory Board (SAB) meetings took place during this period. The next SAB<br />
meeting, which may be the final one under the current TeraGrid project, will be held January 20–21,<br />
2011.<br />
13 Collaborations with Non-TeraGrid Institutions<br />
13.1 Data Systems<br />
DataONE: John Cobb at the ORNL-RP continues to work with the DataONE DataNet project,<br />
serving on its leadership team.<br />
REDDnet: The ORNL-RP continues to work with the REDDnet collaboration. Three different<br />
TeraGrid sites now have REDDnet nodes.<br />
13.2 Science Projects<br />
NEES: John Cobb (ORNL) has joined the NEEScomm Governing board. Although not directly a<br />
TeraGrid/NEES partnership, this does create mutual opportunities for TeraGrid and NEES as<br />
NEEScomm (www.nees.org) continues to improve its data management and curation efforts as<br />
well as its HPC strategy. Additionally, John Cobb and Daniel S. Katz (UC/ANL) are on the<br />
steering committee for a NEES-sponsored data workshop in February 2011.<br />
TACC continues to participate in the World Community Grid (WCG) effort. The main TACC<br />
resource contributing to the solution of problems addressed by the WCG is the visualization<br />
system Colt (40 compute cores). The Stampede cluster that previously contributed cycles was<br />
decommissioned on October 1. TACC staff desktops and laptops also provide cycles to the project.<br />
Over 9 years of run time have been accumulated (19,377 results) on TACC resources in the<br />
reporting period.<br />
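As a rough illustration (assuming "over 9 years" means at least nine calendar years of accumulated run time, which is our reading rather than a figure from the report), the contribution works out to roughly four hours of computation per returned result:<br />

```python
# Rough check of the World Community Grid contribution figures reported above.
# Assumption: "over 9 years" of run time is taken as exactly nine calendar years.

HOURS_PER_YEAR = 365.25 * 24          # ~8766 hours in a calendar year
total_hours = 9 * HOURS_PER_YEAR      # ~78,894 hours of accumulated run time
results = 19377                       # results reported for the period

avg_hours_per_result = total_hours / results   # ~4.1 hours per work unit
```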
14 TeraGrid/Open Science Grid (OSG) Joint Activity<br />
14.1 ExTENCI<br />
The Extending Science Through Enhanced National Cyberinfrastructure (ExTENCI) Project is a<br />
joint Open Science Grid (OSG) and TeraGrid project, funded by OCI. The PIs are Paul Avery<br />
(U. Florida), Ralph Roskies (PSC), and Daniel S. Katz (U. Chicago).<br />
The goal of ExTENCI is to develop and provide production-quality enhancements to the national<br />
cyberinfrastructure that enable specific science applications to use both OSG and TeraGrid more<br />
easily, or that broaden access to a capability for both TeraGrid and OSG users.<br />
ExTENCI has four primary areas of work:<br />
14.1.1 ExTENCI – Distributed File System (Lustre-WAN)<br />
Hardware and software tasks during this period:<br />
• The Distributed File System became operational in the EXTENCI.ORG Kerberos realm.<br />
• Three client clusters at FNAL are mounting the distributed file system for application<br />
testing.<br />
• A fix for a Lustre bug that was causing server instability has been installed for testing at<br />
PSC. Packaging for installing this fix in the production system has been delivered to UF.<br />
• The project is now supporting two new Linux OS distributions: CentOS and Open<br />
Scientific Linux. These Linux variants are used at client sites. Having these OSes<br />
supported will broaden the reach of the distributed file system.<br />
• Documentation for client installation has been developed and posted on the project<br />
website. The documentation was used by FNAL for their installation.<br />
• Network testing was performed between client sites and UF.<br />
• System monitoring tools have been installed at UF.<br />
• The distributed file system was demonstrated at SC10. Clients in the Caltech and PSC<br />
booths on the show floor mounted the file system at UF. Performance tests were run<br />
from the floor and graphs of the performance have been posted to the project archive. A<br />
presentation on the status of distributed Lustre efforts was delivered at SC10 to the<br />
HPCFS working group. This presentation included a slide on the ExTENCI distributed<br />
file system.<br />
• The project has begun to explore packaged virtual machines (VMs) that incorporate the<br />
CMS and Lustre software. Work will be coordinated with the Virtual Machines project<br />
within ExTENCI.<br />
14.1.2 ExTENCI – Virtual Machines<br />
Hardware and software tasks during this period:<br />
• Purdue obtained a TeraGrid allocation to track usage related to the virtualization<br />
efforts. The allocation includes Wispy, Purdue Condor, and ASQL. Users<br />
have been added to this allocation, and the glideinWMS team can now submit to Purdue<br />
Condor and Wispy.<br />
• On the CMS-related work, Purdue’s proof-of-concept VM failed to run at<br />
Clemson due to network issues. We are recreating the VM with a different virtual network<br />
scheme.<br />
• Purdue is working with the CMS GlideinWMS team:<br />
• The project discussed SOAP on the Nimbus cloud "Wispy".<br />
• A Wispy upgrade began (to the latest release of the Nimbus software, to get better SOAP<br />
support for the CMS GlideinWMS). We are still working to resolve issues encountered<br />
during the upgrade.<br />
• Purdue is providing starter VMs to the GlideinWMS team to test Condor-C submission to a Windows<br />
lab cloud.<br />
• Iteration and testing with the GlideinWMS team is planned.<br />
• Purdue has successfully started VMs from RENCI to Clemson using Engage job<br />
submission mechanisms.<br />
• Purdue participated in the discussion of planned VM extension work with the ExTENCI<br />
team, including the initial discussion of integrating Lustre-WAN enabled CMS VMs and<br />
testing/running them on Purdue resources.<br />
• Alex Younts has joined the Purdue team as a full-time staff member and began working on the<br />
ExTENCI project. Alex had worked with Wispy and HPC systems as an<br />
undergraduate student over the past two years.<br />
14.1.3 ExTENCI – Workflow & Client Tools<br />
Hardware and software tasks during this period:<br />
SCEC:<br />
• Task I1.1 (Split SCEC processing into TG and OSG parts, run complete processing end-to-end<br />
on both infrastructures), started 8/1/2010, to finish 10/31/10, 100% complete.<br />
We have demonstrated this functionality, and it works as a demonstration, but it is not<br />
practical for this science application, because available OSG slots are generally spread across<br />
multiple sites, and large data sets cannot be stored long-term on most OSG sites due to<br />
limited disk storage.<br />
• Task I1.2 (Examine and report on alternatives for data movement in the contexts of<br />
performance and ease of use (Performance improvement by using OSG in addition to<br />
TeraGrid will depend on the ability to lease OSG resources, both data and<br />
compute)), started 8/1/2010, to finish 1/31/11, about 50% complete.<br />
We initially explored using Globus Online for the data transfers, which has been<br />
functional. We are also exploring the data transfer mechanism within Swift. We are<br />
working to understand performance and usability, and to improve both. This task<br />
has become tied to the next, and will be delayed due to that tie.<br />
• Task I1.3 (Examine and report on alternatives for task mapping, etc., in the contexts of<br />
performance and ease of use (Performance improvement by using OSG in addition to<br />
TeraGrid will depend on the ability to lease OSG resources, both data and compute)), to<br />
start 11/1/2010, to finish 6/30/11, about 40% complete.<br />
We can use Swift to launch tasks on OSG, and can get up to 2k cores. We have a new<br />
model for using them, but currently have data flow problems; we are working with the<br />
OSG Engage VO and other parts of OSG to overcome this. We are now looking into<br />
potential software bottlenecks (tied to task I1.2 above).<br />
Protein Structure:<br />
• Task I2.1 (Run RAPTOR-OOPS workflow on TeraGrid.), started 8/1/2010, to<br />
finish 10/31/10, 20% complete.<br />
We have run limited testing of OOPS on Ranger, and plan to add Abe and QueenBee.<br />
• Task I2.2 (Run RAPTOR-OOPS workflow on TeraGrid and OSG. Determine effective<br />
site configuration and measure workflow behavior on the combined resources.),<br />
started 8/1/2010, to finish 1/31/11, 10% complete.<br />
Ran over 100,000 "Glass" jobs (which use the OOPS framework) on OSG/Engage;<br />
installed on ~14 sites; running well on 5. Max concurrency to date is ~750 job slots.<br />
Scripts are now ready for combined OSG/TG execution. Need to get/renew TG<br />
allocation.<br />
• Task I2.3 (Run one RAPTOR-OOPS workflow segment under MPI on TeraGrid and<br />
other segments on OSG. Measure workflow behavior and efficiency.), to be started<br />
2/1/2011, to finish 3/31/2011, 0% completed<br />
Task in future.<br />
• Task I2.4 (Determine effective data management strategies for RAPTOR-OOPS and<br />
measure workflow behavior and efficiency.), to be started 3/1/2011, to finish 7/31/2011,<br />
0% complete.<br />
Task in future.<br />
• Task I2.5 (Enhance workflow packaging and usability to enable additional users.<br />
Measure and report on workflow usage and site utilization by the OOPS user<br />
community.), started 12/1/2010, to finish 1/31/2012, 10% complete.<br />
Work is proceeding on tools and documentation needed to engage new users. We are testing<br />
these techniques with new user M. Parisien for protein-RNA docking at UChicago.<br />
Parisien is using simple Condor-G scripts for application installation. Parisien has his app<br />
installed on about 10 OSG sites now. We have discovered issues with the Swift resource<br />
selector in these tests, and are now re-testing.<br />
• Task I2.6 (Adjust site selection based on application characteristics.), to be<br />
started 5/1/2011, to finish 7/31/2012, 0% complete.<br />
Task in future.<br />
• Task I2.7 (Engage one new user community in combined OSG-TeraGrid usage using<br />
ExTENCI techniques. Measure and report on workflow usage and site utilization by the<br />
new user community.), started 10/1/2010, to finish 4/28/2012, 30% complete.<br />
We have engaged new user G. Hocky of the Reichman Group at Columbia University. Informal<br />
measurements indicate 100K+ jobs averaging 1.5 hours each completed in 4Q2010 using Swift on<br />
the OSG Engage VO. The runs used 18 sites, about 6 of them primary, with peaks of over 1K cores.<br />
14.1.4 ExTENCI – Job Submission Paradigms<br />
These tasks have not started yet.<br />
A Publications Listing<br />
A.1. TeraGrid Staff Publications<br />
A.1.1 TeraGrid Staff Publications (for 2010 Q3)<br />
For Q3 2010, the TeraGrid sites reported the following publications authored in whole or in part<br />
by GIG- or RP-funded staff members.<br />
1. Wenjun Wu, Thomas D. Uram, Michael Wilde, Mark Hereld, and Michael E. Papka, A Web 2.0-Based<br />
Scientific Application Framework, 8 th International Conference on Web Services, July 5-10, 2010,<br />
Miami, Florida.<br />
2. Wenjun Wu, Thomas Uram, Michael Wilde, Mark Hereld and Michael E. Papka, Accelerating Science<br />
Gateway Development with Web 2.0 and Swift, TeraGrid 2010 Conference, Pittsburgh, Pennsylvania,<br />
August 2-5, 2010.<br />
3. Dooley R., Dahan M. 2010 “TeraGrid ’10”, Action Folders<br />
4. Dooley R. 2010 “Proceedings of The International Workshop on Science Gateways 2010”, Recipes for<br />
Success in New Science Gateway Development<br />
5. Dooley R., Dahan M. 2010 “TeraGrid ’10”, The TeraGrid Mobile User Portal: Ticketing, Data, and<br />
Complete Job Management On The Go<br />
6. Dooley R., Dahan M. 2010 “TeraGrid ’10”, Comprehensive File Management in the TeraGrid<br />
7. Daruru S., Dhandapani S., Gupta G., Iliev I., Xu W., Navratil P., Marin N., Ghosh J. 2010 “Proceedings<br />
of International Workshop on Knowledge Discovery Using Cloud and Distributed Computing Platforms<br />
(KDCloud, 2010) in Cooperation with IEEE ICDM 2010”, Distributed, Scalable Clustering for Detecting<br />
Halos in Terascale Astronomy<br />
8. Jordan C., 2010 “26 th IEEE (MSST2010) Symposium on Massive Storage Systems and Technologies”,<br />
Data Infrastructure in the TeraGrid<br />
9. Willmore F., Rosales C., Cazes J. 2010 “TeraGrid ‘10”, Experiences with the Distributed Debugging<br />
Tool<br />
10. Navratil, P. 2010 “Dissertation, The University of Texas at Austin”, Memory Efficient, Scalable Ray<br />
Tracing<br />
11. Jeong B., Abram G., Johnson G., Navratil P. 2010 “NVIDIA GTC Conference”, Large-Scale<br />
Visualization Using a GPU Cluster<br />
12. Wilkins-Diehr, N., Lawrence, K., "Opening Science Gateways to Future Success: The Challenges of<br />
Gateway Sustainability", accepted for Gateway Computing Environments workshop, November, 2010.<br />
13. Wilkins-Diehr, N., Baru, C., Gannon, D., Keahey, K., McGee, J., Pierce, M., Wolski, R., Wu, W.<br />
"Science Gateways: Harnessing Clouds and Software Services for Science" in Cloud Computing and<br />
Software Services: Theory and Techniques, edited by Syed A. Ahson and Mohammad Ilyas. [Boca<br />
Raton: CRC Press, 2011].<br />
14. H. Karimabadi, H. X. Vu, B. Loring, Y. Omelchenko, M. Tatineni, A. Majumdar, and J. Dorelli,<br />
"Enabling Breakthrough Kinetic Simulations of the Magnetosphere Using Petascale<br />
Computing," Asia Oceania Geosciences Society (AOGS), Hyderabad, India, July 5-9, 2010<br />
15. H. Karimabadi, H. X. Vu, B. Loring, Y. Omelchenko, M. Tatineni, A. Majumdar, and J. Dorelli,<br />
"3D Global Hybrid Simulations of the Magnetosphere and I/O Strategies for Massively<br />
Parallel Kinetic Simulations,” TG10, Pittsburgh, August 2-5, 2010<br />
18. Xu, D., Williamson, M.J., Walker, R.C., "Advancements in Molecular Dynamics Simulations of<br />
Biomolecules on Graphical Processing Units.", Ann. Rep. Comp. Chem., vol 6. 2010, 2-19 (Chapter 1).<br />
19. Goetz, A.W., Woelfe, T., Walker, R.C., "Quantum Chemistry on Graphics Processing Units", Ann. Rep.<br />
Comp. Chem., vol 6. 2010, 21-35 (Chapter 2).<br />
20. "Creating the CIPRES Science Gateway for Inference of Large Phylogenetic Trees" by M.A. Miller, W. Pfeiffer,<br />
and T. Schwartz was submitted and accepted for presentation at the GCE10 workshop in conjunction with SC10.<br />
21. Y. Cui, K. Olsen, T. Jordan, K. Lee, J. Zhou, P. Small, G. Ely, D. Roten, D. K. Panda, A. Chourasia, J.<br />
Levesque, S. Day and P. Maechling (2010). “Scalable Earthquake Simulation on Petascale<br />
Supercomputers”, to be presented at Supercomputing 2010, New Orleans, Nov 2010
22. Chourasia, A. 2010. GlyphSea. In ACM SIGGRAPH 2010 Computer Animation Festival (Los Angeles,<br />
California, July 26 - 30, 2010). SIGGRAPH '10. ACM, New York, NY, 111-111.<br />
23. L. Zhao, W. Lee, C. X. Song, M. Huber and A. Goldner. "Bringing High Performance Climate Modeling<br />
into the Classroom", TeraGrid 2010 conference, Pittsburgh, PA, August 2010.<br />
24. Adam Carlyle, Stephen Harrell and Preston Smith. "Cost-effective HPC: The Community or the Cloud",<br />
accepted to the 2nd IEEE International Conference on Cloud Computing Technology and Science,<br />
Indianapolis, IN, November 30 - December 3, 2010.<br />
25. Palencia, J., Budden, R., and Sullivan, K. 2010. Kerberized Lustre 2.0 over the WAN. In Proceedings of<br />
the 2010 TeraGrid Conference (Pittsburgh, Pennsylvania, August 02 - 05, 2010). TG’10. ACM, New<br />
York, NY, 1-5.<br />
26. Wei He, Matthew J. Crawford, Srinivas Rapirredy, Marcela Madrid, Roberto R. Gil, Danith H. Ly,<br />
Catalina Achim, “The Structure of a γ-Modified Peptide Nucleic Acid Duplex”, Mol. BioSyst. 6, 1619,<br />
1629, 2010.<br />
27. D. Katz, Scott Callaghan, Robert Harkness, Shantenu Jha, et al., “Science on the TeraGrid”, accepted<br />
for Computational Methods in Science and Technology (2011)<br />
28. S Sehgal, A Merzky, M Erdelyim, S Jha, “Application-level Cloud Interoperability”, Springer book ”Grids,<br />
Clouds and Virtualization”, Edited by Massimo Cafaro (Fall 2010)<br />
29. A Luckow and S. Jha, “Abstractions for Loosely-Coupled and Ensemble-based Simulations on Microsoft<br />
Azure”, Short Paper accepted for IEEE CloudCom (Dec 2010)<br />
30. Y el-Khamra, H Kim, S Jha, and M Parashar, “Exploring the Performance Fluctuations of HPC<br />
Workloads on Clouds”, Short Paper accepted for IEEE CloudCom (Dec 2010)<br />
31. D. Wright, O. Kenway, P Coveney and S. Jha, “Computational Estimation of Binding Affinities for<br />
Patient Derived HIV-1 Protease Sequences Bound to Lopinavir”, accepted First International<br />
Conference on Virtual Physiological Human (VPH) 2010,<br />
32. “Modelling Data-driven CO2 Sequestration Using Distributed HPC CyberInfrastructure”, Y Khamra, S<br />
Jha and C White, accepted for TeraGrid 2010 (published by ACM)<br />
33. “Exploring Adaptation to Support Dynamic Applications on Hybrid Grids-Clouds Infrastructure”, H. Kim,<br />
Y. el-Khamra, S. Jha and M. Parashar, Accepted, ACM ScienceClouds Workshop as part of HPDC<br />
2010<br />
34. “Exploring the RNA Folding Energy Landscape Using Scalable Distributed Cyberinfrastructure”, J. Kim,<br />
W Huang, S Maddineni, F Aboul-ela, S. Jha, Accepted, Emerging Computational Methods for Life<br />
Sciences, in conjunction with HPDC 2010.<br />
35. Liu, Y. and Wang, S. 2010. “Asynchronous Implementation of A Parallel Genetic Algorithm for the<br />
Generalized Assignment Problem”. In: Proceedings of TeraGrid 2010 conference. August 1-5, 2010.<br />
Pittsburgh, PA<br />
36. Padmanabhan, A., Tang, W., and Wang, S. 2010. “Agent-based Modeling of Agricultural Land Use on<br />
TeraGrid”. In: Proceedings of TeraGrid 2010 conference. August 1-5, 2010. Pittsburgh, PA<br />
37. Wang, S. and Liu, Y. 2010. “GISolve 2.0: Geospatial Problem Solving Environment Based on<br />
Synthesizing Cyberinfrastructure and Web 2.0”. In: Proceedings of TeraGrid 2010 conference. August<br />
1-5, 2010. Pittsburgh, PA<br />
38. Bennett, D.A., Tang, W., and Wang, S. 2010. “Toward an Understanding of Provenance in Complex<br />
Land Use Dynamics.” Land Use Science, forthcoming<br />
39. Tang, W., Bennett D. A., and Wang, S. 2010. “A Parallel Agent-based Model of Land Use Opinions.”<br />
Journal of Land Use Science, forthcoming<br />
A.1.2 TeraGrid Staff Publications (for 2010 Q4)<br />
For Q4 2010, the TeraGrid sites reported the following publications authored in whole or in part<br />
by GIG- or RP-funded staff members.<br />
1. Wang, S., Wilkins-Diehr, N., Shi, X., Vatsavai, R. R., Zhang, J., and Padmanabhan, A. 2010.<br />
"Proceedings of the ACM SIGSPATIAL International Workshop on High Performance and Distributed<br />
Geographic Information Systems (ACM SIGSPATIAL HPDGIS 2010)." ACM Digital Library<br />
2. Liu, Y., Wu, K., Wang, S., Zhao, Y., and Huang, Q. "A MapReduce Approach to Gi*(d) Spatial Statistic".<br />
In: Proceedings of the ACM SIGSPATIAL International Workshop on High Performance and Distributed
Geographic Information Systems (ACM HPDGIS 2010), November 2, 2010, San Jose, CA, USA<br />
3. Padmanabhan, A., and Wang, S. "A Distributed Resource Broker for Spatial Middleware Using Adaptive<br />
Space-Filling Curve". In: Proceedings of the ACM SIGSPATIAL International Workshop on High<br />
Performance and Distributed Geographic Information Systems (ACM HPDGIS 2010), November 2,<br />
2010, San Jose, CA, USA<br />
4. D. S. Katz, S. Callaghan, R. Harkness, S. Jha, K. Kurowski, S. Manos, S. Pamidighantam, M. Pierce, B.<br />
Plale, C. Song, J. Towns, "Science on the TeraGrid," Computational Methods in Science and<br />
Technology (CMST), v.Special Issue 2010, pp. 81-97, 2010.<br />
5. Cobb, J. W. “SNS and TeraGrid's NSTG: A best practice for integration of national cyberinfrastructure<br />
with an experimental user facility” NOBUGS 2010 Oct. 10-13, 2010 Gatlinburg, TN <<br />
http://neutrons.ornl.gov/conf/nobugs2010/><br />
6. Javier V. Goicochea, Marcela Madrid, and Cristina Amon, “Hierarchical Modeling of Heat Transfer in<br />
Silicon-Based Electronic Devices”, ASME Journal of Heat Transfer 132, 102401 (2010).<br />
7. Kh. Odbadrakh, A. Rusanu, G. M. Stocks, and G. D. Samolyuk, M. Eisenbach, Yang Wang, D.M.<br />
Nicholson, “Calculated Electronic and Magnetic Structure of Screw Dislocations in Alpha Iron”, Journal<br />
of Applied Physics (accepted for publication).<br />
8. Alexander J. Ropelewski, Hugh B. Nicholas, Jr. and Ricardo R. Gonzalez Mendez, “MPI-PHYLIP:<br />
Parallelizing Computationally Intensive Phylogenetic Analysis Routines for the Analysis of Large Protein<br />
Families”, PLoS ONE.<br />
9. Yang Wang, D.M.C. Nicholson, G.M. Stocks, Aurelian Rusanu, Markus Eisenbach, and R. E. Stoller, “A<br />
study of radiation damage effects on the magnetic structure of bulk Iron”, Journal of Applied Physics,<br />
(accepted for publication).<br />
10. H. Zhang, J. Woo, L. Zhao, D. Braun, X.C. Song, M. Lakshminarayanan. "Domain-specific web services<br />
for scientific application developers," Proceedings of the 6th Gateway Computing Environments<br />
Workshop (GCE10) at SC10, New Orleans, LA, November 2010.<br />
11. Understanding the Impact of Emerging Non-Volatile Memories on High-Performance, IO-Intensive<br />
Computing, Adrian M. Caulfield, Joel Coburn, Todor Mollov, Arup De, Ameen Akel, Jiahua He, Arun<br />
Jagatheesan, Rajesh K. Gupta, Allan Snavely, and Steven Swanson, Supercomputing, 2010.<br />
(Nominated for best technical paper and best student paper).<br />
12. "Creating the CIPRES Science Gateway for Inference of Large Phylogenetic Trees" by M.A. Miller, W.<br />
Pfeiffer, and T. Schwartz was presented at the GCE10 workshop in conjunction with SC10.<br />
13. E. McQuinn, A. Chourasia, J. H. Minster, J. Schulze (2010), GlyphSea: Interactive Exploration of<br />
Seismic Wave Fields Using Shaded Glyphs, Abstract IN23A-1351 presented at 2010 Fall Meeting,<br />
AGU, San Francisco, Calif., 13-17 Dec.<br />
14. Yifeng Cui, Kim B. Olsen, Thomas H. Jordan, Kwangyoon Lee, Jun Zhou, Patrick Small, Daniel Roten,<br />
Geoffrey Ely, Dhabaleswar K. Panda, Amit Chourasia, John Levesque, Steven M. Day, and Philip<br />
Maechling. 2010. Scalable Earthquake Simulation on Petascale Supercomputers. In Proceedings of the<br />
2010 ACM/IEEE International Conference for High Performance Computing, Networking, Storage and<br />
Analysis (SC '10). IEEE Computer Society, Washington, DC, USA, 1-20.<br />
15. Burtscher M., Kim B., 2010 “Proceedings of the ACM/IEEE Supercomputing Conference 2010”<br />
PerfExpert: An Easy-to-Use Performance Diagnosis Tool for HPC Applications<br />
16. Navratil P. 2010 “IBERGRID 2010” Introduction to Longhorn and GPU-Based Computing<br />
17. Navratil P., 2010 “Proceedings of the Sixth High-End Visualization Workshop” Increasing Hardware<br />
Utilization for Peta-Scale Visualization<br />
18. Jeong B., Abram G., Johnson G., Navratil P. 2010 “NVIDIA GPU Technology Conference” Large-Scale<br />
Visualization Using a GPU Cluster<br />
19. Hanlon M., Mock S., Nuthulapati P., Gonzales M., Soltis P., Soltis D., Majure L., Payton A., Mishler B.,<br />
Tremblay T., Madsen T., Olmstead R., McCourt R., Wojciechowski M., Merchant N. 2010 “IEEE<br />
Proceedings of Gateway Computing Environments 2010” My-Plant.org: A Phylogenetically Structured<br />
Social Network
A.2. Publications from TeraGrid Users<br />
A.2.1 Publications from TeraGrid Users (for 2010 Q3)<br />
The following 675 publications were gathered primarily from renewal Research submissions to<br />
the September 2010 TRAC meeting. The extraction process was conducted by a UCSD<br />
undergraduate student at SDSC. The publications are organized by field of science and by the<br />
proposal with which they were associated.<br />
Advanced Scientific Computing<br />
MCA06N043<br />
1. A. Uzun and M. Y. Hussaini, “Simulation of Noise Generation in Near-Nozzle Region of a Chevron<br />
Nozzle Jet,” AIAA Journal, Vol. 47, No. 8, August 2009, pp. 1793 – 1810.<br />
2. H. Zhao, A. Uzun, M. Y. Hussaini and S. L. Woodruff, “Computations of Flow Past a Circular<br />
Cylinder Using a Continuous-Turbulence Model,” AIAA Journal, Vol. 47, No. 10, October 2009, pp. 2460 –<br />
2466.<br />
3. A. Uzun, and M. Y. Hussaini, “Simulations of Vortex Formation Around a Blunt Wing Tip,”AIAA<br />
Journal, Vol. 48, No. 6, June 2010, pp. 1221 – 1234.<br />
4. J. Bin, A. Uzun and M. Y. Hussaini, “Adaptive Mesh Refinement for Chevron Nozzle Jet Flows,”<br />
Computers and Fluids Journal, Vol. 39, No. 6, June 2010, pp. 979 – 993.<br />
5. A. Uzun and M. Y. Hussaini, “High-Fidelity Numerical Simulation of a Chevron Nozzle Jet Flow,”<br />
Under review for publication in International Journal of Aeroacoustics, 2010.<br />
6. J. Bin, A. Uzun and M. Y. Hussaini, “Adaptive Grid Redistribution Method for a Domain with<br />
Complex Boundaries,” Under review for publication in Journal of Computational Physics, 2010.<br />
7. A. Uzun and M. Y. Hussaini, “High-Fidelity Numerical Simulations of a Round Nozzle Jet Flow,” In<br />
preparation, 2010.<br />
8. A. Uzun and M. Y. Hussaini, “Prediction of Tandem Cylinder Noise using Delayed Detached Eddy<br />
Simulation,” In preparation, 2010.<br />
Astronomical Sciences<br />
AST020001<br />
9. “Selection Rules for the Nonlinear Interaction of Internal Gravity Waves,” Jiang, C.-H. and Marcus,<br />
P. S., March 2009, Phys. Rev. Letters, 102, 124502-1 to 124502-4.<br />
10. “Jupiter’s Shrinking Great Red Spot and Steady Oval BA: Velocity Measurements with the ACCIV<br />
Automated Cloud Tracking Method,” Asay-Davis, X., Marcus, P., Wong, M. and de Pater, I., May 2009,<br />
Icarus, 203, 164–188.<br />
11. “Changes in Jupiter’s Zonal Velocity Profile between 1979 and 2008,” Asay-Davis, X., Wong, M.,<br />
Marcus, P., and de Pater, I., under review (minor revision required), Icarus, 2010.<br />
12. “Jupiter’s Zonal Winds: Are They Bands of Homogenized Potential Vorticity and Do They Form a<br />
Monotonic Staircase?” Marcus, P. and Shetty S., submitted to Philosophical Transactions of the Royal Society A,<br />
2010.<br />
13. “KECK AO and HST Images of Vortices on Jupiter,” de Pater, I., Wong, M., Luszcz-Cook, S.,<br />
Adamkovics, M., Conrad, A., Marcus, P., Asay-Davis, X., and Go, C., under review (minor revision required),<br />
Icarus, 2010.<br />
14. “Three-Dimensional Simulations of Kelvin-Helmholtz Instability in Settled Dust Layers in<br />
Protoplanetary Disks,” Barranco, J.A., February 2009, The Astrophysical Journal, 691, 907–921.<br />
“Forming Planetesimals by Gravitational Instability: I. The Role of the Richardson Number in Triggering the Kelvin-<br />
Helmholtz Instability,” Lee, A., Chiang, E., Asay-Davis, X., Barranco, J., August 2010, The Astrophysical<br />
Journal, 718.
AST070032<br />
15. P. Sharma, I. J. Parrish, E. Quataert, “Thermal Instability with Anisotropic Thermal Conduction and<br />
Adiabatic Cosmic Rays: Implications for Cold Filaments in Galaxy Clusters,” accepted, Astrophysical<br />
Journal (2010), arXiv:1003.5546<br />
Non-Refereed Publications<br />
16. P. Sharma, B. D. G. Chandran, E. Quataert, I. J. Parrish, “Turbulence and Mixing in the Intracluster<br />
Medium,” in the Proceedings of The Monster’s Fiery Breath: Feedback in Galaxies, Groups, and Clusters,<br />
Madison, Wisconsin, ed. S. Heinz, E. Wilcots (2009), arXiv:0909.0270<br />
Submitted/In Progress<br />
17. P. Sharma, P. Colella, D. F. Martin, “Numerical Implementation of Streaming Down the Gradient:<br />
Application to Fluid Modeling of Cosmic Rays,” under review, SIAM Journal on Scientific Computing (2009),<br />
arXiv:0909.5426<br />
18. P. Sharma, G. W. Hammett, “A Fast Semi-implicit Method for Anisotropic Diffusion,” to be<br />
submitted to J. Comp. Phys.<br />
AST080005<br />
19. “Regarding the Line-of-Sight Baryonic Acoustic Feature in the Sloan Digital Sky Survey and Baryon<br />
Oscillation Spectroscopic Survey Luminous Red Galaxy Samples” Kazin, E. A., Blanton, M. R., Scoccimarro,<br />
R., McBride, C. K., & Berlind, A. A., 2010, The Astrophysical Journal, in press<br />
20. “The Baryonic Acoustic Feature and Large-Scale Clustering in the Sloan Digital Sky Survey<br />
Luminous Red Galaxy Sample” Kazin, E. A., Blanton, M. R., Scoccimarro, R., McBride, C. K., Berlind, A. A.,<br />
Bahcall, N. A., Brinkmann, J., Czarapata, P., Frieman, J. A., Kent, S. M., Schneider, D. P., & Szalay, A. S.,<br />
2009, The Astrophysical Journal, 710, 1444<br />
21. “From Finance to Cosmology: The Copula of Large-Scale Structure” Scherrer, R. J., Berlind, A. A.,<br />
Mao, Qingqing, & McBride, C. K., 2009, The Astrophysical Journal (Letters), 708, 9<br />
Submitted or In Preparation:<br />
22. “The Large Suite of Dark Matter Simulations (LasDamas) Project: Mock Galaxy Catalogs for the<br />
Sloan Digital Sky Survey” McBride, C. K., Berlind, A. A., Scoccimarro, R., Piscionere, J. A., Busha, M.,<br />
Gardner, J., Manera, M., Wechsler, R. H., & Van den Bosch, F., 2010, The Astrophysical Journal, in<br />
preparation<br />
23. “Accurate Determination of the Abundance of Massive Clusters From the LasDamas Project”<br />
McBride, C. K., Berlind, A. A., Manera, M., Scoccimarro, R., Busha, M., Wechsler, R. H., & Van<br />
den Bosch, F., 2010, The Astrophysical Journal, in preparation<br />
24. “Density Profiles of Dark Matter Halos From the LasDamas Project” Musher, D. E., Robbins, K.,<br />
Berlind, A. A., McBride, C. K., Busha, M., Wechsler, R. H., 2010, The Astrophysical Journal, in preparation<br />
25. “Density Profiles of Galaxy Groups and Clusters in the Sloan Digital Sky Survey” Norris, J. M.,<br />
Berlind, A. A., & McBride, C. K., 2010, The Astrophysical Journal, in preparation<br />
26. “The Luminosity and Color Dependence of the Sloan Digital Sky Survey Galaxy 3-Point Correlation<br />
Function” Marin, F, Nichol, R., Frieman, J., & McBride, C. K., 2010, The Astrophysical Journal, in<br />
preparation<br />
AST080029<br />
27. Penna, R., McKinney, J. C. et al. 2010, MNRAS, Simulations of Magnetized Disks Around Black<br />
Holes: Effects of Black Hole Spin, Disk Thickness, and Magnetic Field Geometry, in press,<br />
http://adsabs.harvard.edu/abs/2010arXiv1003.0966P<br />
28. Broderick, A. & McKinney, J. C. 2010, ApJ, Parsec-scale Faraday Rotation Measures from General<br />
Relativistic MHD Simulations of Active Galactic Nuclei Jets, submitted,<br />
http://adsabs.harvard.edu/abs/2010arXiv1006.5015B<br />
29. Uzdensky, D. & McKinney, J. C. 2010, Physics of Plasmas, Magnetic Reconnection with Radiative<br />
Cooling. I. Optically-Thin Regime, submitted, http://adsabs.harvard.edu/abs/2010arXiv1007.0774U
30. Tchekhovskoy, A., Narayan, R., McKinney, J. C. 2010, New Astronomy, Magnetohydrodynamic<br />
Simulations of Gamma-Ray Burst Jets: Beyond the Progenitor Star, in press,<br />
http://adsabs.harvard.edu/abs/2009arXiv0909.0011T<br />
31. McKinney, J. C., Blandford, R., Uzdensky, D. 2010, MNRAS, General Relativistic<br />
Magnetohydrodynamical Simulations of Gamma-Ray Bursts, in progress<br />
32. McKinney, J. C. & Kohri, K. 2010, MNRAS, Code and Tables for Nuclear Equations of State and<br />
Neutrino Opacities for General Relativistic Magnetohydrodynamical Simulations of Gamma-Ray Bursts,<br />
submitted<br />
33. McKinney, J. C. 2010, Physics Reports Invited Review, Simulating Black Hole Accretion Flows and<br />
Jets, in progress<br />
34. McKinney, J. C. & Blandford, R.D. 2010, MNRAS, Observed GRB Relativistic Jet Variability as a<br />
Pattern Effect Induced by Unstable Jet Modes, in progress<br />
35. McKinney, J. C. & Tchekhovskoy, A. D. 2010, MNRAS, CPC-HARM: Conservative-Primitive-<br />
Consistent high-accuracy General Relativistic Magnetohydrodynamics code, in progress<br />
AST080048<br />
36. J. Debuhr, E. Quataert, & C. P. Ma, 2010, “The Growth of Massive Black Holes in Galaxy Merger<br />
Simulations with Feedback by Radiation Pressure," MNRAS, submitted<br />
37. J. Debuhr, E. Quataert, C. P. Ma, & P. F. Hopkins, 2010, “Self-Regulated Black Hole Growth via<br />
Momentum Deposition in Galaxy Merger Simulations," MNRAS Letters, 406, L55<br />
AST080049<br />
38. Lehe, R., Parrish, I. J., Quataert, E. 2009, ApJ 707, 404, “The Heating of Test Particles in<br />
Numerical Simulations of Alfvénic Turbulence”<br />
39. McCourt, M., Parrish, I. J., Quataert, E., & Sharma, P., in preparation, “Nonlinear Saturation of<br />
Buoyancy Instabilities in Dilute Magnetized Plasmas”<br />
40. Parrish, I. J., Quataert, E., Sharma, P. 2009, ApJ, 703, 96, “Anisotropic Thermal Conduction and the<br />
Cooling Flow Problem in Galaxy Clusters”<br />
41. Parrish, I. J., Quataert, E., Sharma, P. 2010, ApJ, 712, 194, “Turbulence in Galaxy Cluster Cores:<br />
A Key to Cluster Bimodality”<br />
42. Sharma, P., Parrish, I. J., & Quataert, E. 2010, Accepted in ApJ, arXiv:1003.5546, “Thermal Instability with<br />
Anisotropic Thermal Conduction and Adiabatic Cosmic Rays: Implications for Cold Filaments in Galaxy<br />
Clusters”<br />
AST090074<br />
43. Hoeflich, P., Krisciunas, K., Khokhlov, et al., Secondary Parameters of Type Ia Supernova Light<br />
Curves, ApJ, 710 (2010) 444.<br />
44. Fully-threaded-tree AMR simulations of astrophysical and terrestrial combustion, to be submitted<br />
to the proceedings of the ASTRONUM2010 conference, San Diego, CA, June 2010.<br />
45. Two upcoming presentations at the conference Frontiers in Computational Astrophysics,<br />
October 2010, Lyon, France.<br />
46. Two papers on large-scale modeling of SNIa in preparation.<br />
47. Paper describing 3D modeling of detonation waves with detailed nuclear kinetics, to be submitted<br />
to ApJ.<br />
AST090086
48. “Turbulence Stagnation and the Ignition of the Deflagration in Type Ia Supernovae”<br />
Townsley, D.M., Jackson, A.P., Calder, A.C. in preparation<br />
49. “Evaluating the Impact of Uncertainties in Turbulence-Flame Interactions on Simulations of Type Ia<br />
Supernovae” Jackson, A.P., Townsley, D.M., Calder, A.C. in preparation<br />
AST090087<br />
50. “Erratum: Effects of metal enrichment and metal cooling in galaxy growth and cosmic star formation<br />
history” J.-H. Choi & K. Nagamine 2009, MNRAS, 395, 1776<br />
51. “Effects of metal enrichment and metal cooling in galaxy growth and cosmic star formation history”<br />
J.-H. Choi & K. Nagamine 2009, MNRAS, 393, 1595<br />
52. “Effects of UV background and local stellar radiation on the H I column density distribution”<br />
K. Nagamine, J.-H. Choi, & H. Yajima 2010, submitted to ApJL (arXiv:1006.5345)<br />
54. “Luminosity Distribution of Gamma-Ray Burst Host Galaxies at redshift z=1 in Cosmological<br />
Smoothed Particle Hydrodynamic Simulations: Implications for the Metallicity Dependence of GRBs”<br />
Y. Niino, J.-H. Choi, M. A. R. Kobayashi, K. Nagamine, T. Totani & B. Zhang 2010,<br />
submitted to ApJ (arXiv:1006.5033)<br />
56. “Escape fraction of ionizing photons from high-redshift galaxies in cosmological SPH simulations”<br />
H. Yajima, J.-H. Choi & K. Nagamine 2010, submitted to MNRAS (arXiv:1002.3346)<br />
58. “Multiphase and Variable Velocity Galactic Outflow in Cosmological SPH Simulations” J.-H. Choi &<br />
K. Nagamine 2010, submitted to MNRAS (arXiv:1001.3525)<br />
59. “Effects of cosmological parameters and star formation models on the cosmic star formation<br />
history” J.-H. Choi & K. Nagamine 2009, accepted for publication in MNRAS (arXiv:0909.5425)<br />
60. American Astronomical Society, AAS Meeting 215, 376.02 “Effects of Star Formation Models on<br />
The Cosmic Star Formation History”<br />
61. SFR@50: Filling the Cosmos with Stars, at Abbazia di Spineto, Sarteano, Italy, “Cosmic<br />
star formation rate in LCDM cosmological simulation”<br />
62. Lorentz Center workshop in Leiden, Netherlands: The Chemical Enrichment of the Intergalactic<br />
Medium, “The effect of multiphase galactic winds on galaxy formation”<br />
63. Sejong University (Korea), April 2009 “Cosmic star formation history in LCDM cosmological<br />
simulation”<br />
64. American Astronomical Society, AAS Meeting 213, 344.02<br />
“Effects of metal enrichment and metal cooling in galaxy growth and cosmic star formation history”<br />
AST090104<br />
65. J. M. Call, J. E. Tohline & L. Lehner 2010 “A generalized advection formalism for relativistic fluid<br />
simulations”, CQGrav, 27, in press<br />
66. W. P. Even & J. E. Tohline 2009, “Constructing Synchronously Rotating Double White<br />
Dwarf Binaries”, ApJS, 184, 248, arXiv:0908.2116<br />
67. W. P. Even 2010, PhD Dissertation “Mass Transfer and Mergers in Double White Dwarf<br />
Binaries”, http://etd.lsu.edu/docs/available/etd-04152010-230541/<br />
68. D. Marcello, P. M. Motl & J. E. Tohline 2010, “Radiation-Hydrodynamic Treatment of Binary<br />
Accretion Flows”, ApJ, in preparation<br />
69. P. M. Motl et al. 2010, “Double Degenerate Merger Code Comparison Project: The Importance of<br />
Initial Conditions and the Equation of State”, ApJ, in preparation
70. P. M. Motl, J. E. Tohline & J. Frank 2010 “Convergence to stability in a simulated double white<br />
dwarf binary”, ApJ, in preparation<br />
AST090106<br />
71. K. Beckwith and P. Armitage. Global, High Resolution Simulations of Thin Disks with a Higher<br />
Order Godunov Scheme. ApJ, 2010, in prep.<br />
72. J. B. Simon, J. F. Hawley and K. Beckwith. Accretion States in Vertically Stratified MRI-driven<br />
Turbulence. ApJ, 2010, in prep.<br />
73. K. Beckwith and J. M. Stone. A Multidimensional, Higher Order Godunov Scheme for Relativistic<br />
Magnetohydrodynamics. ApJSS, 2010, in prep.<br />
AST100040<br />
74. Tchekhovskoy, A., McKinney, J. C., and Narayan, R. (2009). Efficiency of Magnetic to Kinetic<br />
Energy Conversion in a Monopole Magnetosphere. ApJ , 699, 1789–1808.<br />
75. Tchekhovskoy, A., Narayan, R., and McKinney, J. C. (2010a). Black Hole Spin and The Radio<br />
Loud/Quiet Dichotomy of Active Galactic Nuclei. ApJ , 711, 50–63.<br />
76. Tchekhovskoy, A., Narayan, R., and McKinney, J. C. (2010b). Magnetohydrodynamic Simulations<br />
of Gamma-Ray Burst Jets: Beyond the Progenitor Star. New Astronomy, 15, 749.<br />
77. Penna, R. F., McKinney, J. C., Narayan, R., Tchekhovskoy, A., et al. (2010). Simulations of<br />
Magnetized Disks Around Black Holes: Effects of Black Hole Spin, Disk Thickness, and Magnetic Field<br />
Geometry. MNRAS, in press (ArXiv:1003.0966).<br />
MCA00N020<br />
78. Cunningham, A. J., Klein, R. I., McKee, C. F. & Krumholz, M. R. 2010, ApJ, in preparation<br />
79. Kratter, K. M., Matzner, C. D., Krumholz, M. R., & Klein, R. I., 2010, ApJ, 708, 1585<br />
80. Krumholz, M. R., Cunningham, A. J., Klein, R. I., & McKee, C. F. 2010, ApJ, 713, 1120<br />
81. McKee, C. F., Li, P. S. & Klein, R. I. 2010, ApJ, accepted.<br />
82. Li, P. S., McKee, C. F. & Klein, R. I. 2010, in preparation.<br />
83. Offner, S. S. R. & Klein, R. I. & McKee, C. F. & Krumholz, M. R. 2009, ApJ, 703, 1310<br />
84. Offner, S. S. R., Kratter, K. M., Matzner, C. D., Krumholz, M. R., and Klein, R. I., ApJ, 2010, submitted<br />
85. Turk, M. J., Abel, T., & O’Shea, B. 2009, Science, 325, 601<br />
MCA04N012<br />
86. R. Cen, The State of the Universe at z ∼ 6, ApJL, submitted, 2010<br />
87. A. Mesinger, S. Furlanetto, R. Cen, 21cmFAST: A Fast, Semi-Numerical Simulation of the High-<br />
Redshift 21-cm Signal, ApJ, submitted, 2010<br />
88. O. Zahn, A. Mesinger, M. McQuinn, H. Trac, R. Cen, L. Hernquist, Comparison of Reionization<br />
Models: Radiative Transfer Simulations And Approximate, Semi-Numeric Models, ApJ, submitted, 2010<br />
89. Z. Zheng, R. Cen, H. Trac, J. Miralda-Escude, Radiative Modeling of Lyman Alpha Emitters. II. New<br />
Effects in Galaxy Clustering, ApJ, submitted, 2010<br />
90. Z. Zheng, R. Cen, H. Trac, J. Miralda-Escude, Radiative Transfer Modeling of Lyman Alpha<br />
Emitters: I. Statistics of Spectra and Luminosity, ApJ, in press, 2010<br />
91. R. Cen, N.E. Chisari, Star Formation Feedback and Metal Enrichment History of the Intergalactic<br />
Medium, ApJ, submitted, 2010
92. N.A. Bond, M.A. Strauss, R. Cen, Crawling the Cosmic Network: Exploring the Morphology of<br />
Structure in the Galaxy Distribution, MNRAS, in press, 2010<br />
93. N.A. Bond, M.A. Strauss, R. Cen, Crawling the Cosmic Network: Identifying and Quantifying<br />
Filamentary Structure, MNRAS, in press, 2010<br />
94. R. Cen, P. McDonald, H. Trac, A. Loeb, Probing the Epoch of Reionization with the Lyman Alpha<br />
Forest at z ∼ 4 − 5, ApJL, 706, L164, 2009<br />
95. J.H. Wise, R. Cen, Ionizing Photon Escape Fractions from High Redshift Dwarf Galaxies, ApJ, 693,<br />
984, 2009<br />
96. M.K.R. Joung, R. Cen, G. Bryan, Galaxy Size Problem at z=3: Simulated Galaxies Are Too<br />
Small, ApJL, 692, L1, 2009<br />
MCA95C003<br />
97. “Simulations of Magnetorotational Turbulence with a Higher-Order Godunov Scheme,” J. B.<br />
Simon, J. F. Hawley, K. Beckwith, ApJ, 690, 974 (2009).<br />
98. “Direct Calculation of the Radiative Efficiency of an Accretion Disk Around a Black Hole,” S. C.<br />
Noble, J. H. Krolik, & J. F. Hawley, ApJ, 692, 411 (2009).<br />
99. “MHD Simulations of Accretion Disks and Jets: Strengths and Limitations,” J. F. Hawley, Astrophys.<br />
and Space Sci., 320, 107 (2009).<br />
100. “GRMHD Prediction of Coronal Variability in Accreting Black Holes,” S. C. Noble & J. H. Krolik,<br />
ApJ, 703, 964–975 (2009).<br />
101. “Transport of Large Scale Poloidal Flux in Black Hole Accretion,” K. Beckwith, J. F. Hawley, & J. H.<br />
Krolik, Astrophysical Journal, 707, 428–445 (2009).<br />
102. “Viscous and Resistive Effects on the Magnetorotational Instability with a Net Toroidal Field,” Simon,<br />
J. B. & Hawley, J. F., ApJ, 707, 833–843 (2009).<br />
103. “What is the Numerically Converged Amplitude of MHD Turbulence in Stratified Shearing Boxes?”<br />
J. Shi & J. H. Krolik, ApJ, 708, 1716–1727 (2010).<br />
104. “Dependence of Inner Accretion Disk Stress on Parameters: The Schwarzschild Case,” S. C.<br />
Noble, J. H. Krolik, J. F. Hawley, ApJ, 711, 959–973 (2010).<br />
105. “General Relativistic MHD Jets,” J. H. Krolik & J. F. Hawley, in The Jet Paradigm—From Microquasars<br />
to Quasars, Lect. Notes Phys, 794, ed. T. Belloni (Springer-Verlag: Berlin), pp. 265–287 (2010).<br />
106. “Numerical Simulations of MHD Accretion Disks,” J. F. Hawley, in IAU Highlights of Astronomy, vol.<br />
15, in press (2010).<br />
107. “Local Simulations of Magnetized Accretion Disks,” J. B. Simon, PhD thesis, Department of<br />
Astronomy, University of Virginia, July, 2010.<br />
108. Krolik: Invited talk “Dynamics of Black Hole Accretion”, Joint Astronomy Colloquium at the Max-<br />
Planck-Institut für Astrophysik and the European Southern Observatory, June, 2009, Garching, Germany.<br />
109. Hawley: Invited talk “Numerical Simulations of MHD Accretion Disks,” for Joint Discussion 7,<br />
Astrophysical Outflows and Associated Accretion Phenomena, IAU General Assembly, Rio de Janeiro,<br />
Brazil, Aug. 5–7, 2009.<br />
110. Hawley: Invited talk “Global Simulations of Black Hole Accretion,” at the Workshop on Matter and<br />
Electromagnetic Fields in Strong Gravity, Aug. 24–28, College Park, MD.<br />
111. Krolik: Invited talk “Radiation from Black Hole Accretion”, at Matter and Electromagnetic Fields in<br />
Strong Gravity, August, 2009, U. of Maryland, College Park MD<br />
112. “Angular Momentum Transport in Magnetized Accretion Disks via the Magnetorotational Instability,” AAS<br />
Meeting, January, 2010, Washington, DC.<br />
113. Hawley: Invited talk, “3D GRMHD Simulations: Stress at the ISCO,” at the workshop “Computational<br />
Relativistic Astrophysics: Frontiers of MHD,” Princeton Center for Theoretical Science, January 13–16, 2010,<br />
Princeton, NJ.
114. Krolik: Invited talk “Connecting Radiation to Accretion Dynamics through Simulations”, at Princeton<br />
Workshop on Computational Relativistic Astrophysics, Princeton Center for Theoretical Science, January,<br />
2010, Princeton NJ<br />
115. Hawley: Invited talk, “GRMHD Simulations of Black Hole Accretion Disks,” 5th International<br />
Conference on Numerical Modeling of Space Plasma Flows, June 13–18, 2010, San Diego, CA.<br />
116. Public Lecture at the Science Museum in La Laguna, Tenerife, Spain on Friday, November 6th at<br />
19:00. The title of the lecture was “Black Holes—The Powerhouse of the Universe.” Featured black hole<br />
simulation movies and images of TACC Ranger.<br />
PHY070001<br />
117. Results obtained from previous TeraGrid allocations have been reported in the following preprint:<br />
Marcelo Gleiser, Noah Graham, and Nikitas Stamatopoulos, “Long-Lived Time Dependent Remnants During<br />
Cosmological Symmetry Breaking: From Inflation to the Electroweak Scale,” arXiv:1004.4658, an expanded<br />
version of which is currently under consideration by Physical Review D.<br />
Atmospheric Sciences<br />
ATM050014<br />
118. Lin, J.-T., D. J. Wuebbles, H.-C. Huang, Z. Tao, M. Caughey, X.-Z. Liang, J.-H. Zhu and T.<br />
Holloway, 2010: Potential effects of climate and emissions changes on surface ozone in the Chicago area.<br />
J. Great Lakes Res., 36, 59-64.<br />
119. Patten, K., and D. Wuebbles, 2010: Atmospheric lifetimes and ozone depletion potentials of trans-<br />
1-chloro-3,3,3-trifluoropropylene and trans-1,2-dichloroethylene in a three-dimensional model. Atmos.<br />
Chem. Phys., under review.<br />
120. Stoner, A. M., K. Hayhoe, and D. Wuebbles, 2009: Assessing General Circulation Model<br />
Simulations of Atmospheric Teleconnection Patterns. J. Climate, 22, 4348-4372, DOI:<br />
10.1175/2009JCLI2577.1.<br />
121. Wuebbles, D. J., and K. O. Patten, 2009: HCFC-123 in the Atmosphere: Revisiting Its Potential<br />
Environmental Impacts and Rationale for Continued Use, Environ. Sci. Technol., 43, 3208–3213,<br />
doi:10.1021/es802308m.<br />
122. Wuebbles, D. J., K. O. Patten, D. Wang, D. Youn, M. Martínez-Avilés, and J. S. Francisco, 2010:<br />
Three-dimensional model evaluation of the ozone depletion potentials for n-propyl bromide, trichloroethylene<br />
and perchloroethylene. Atmos. Chem. Phys., under review.<br />
123. Youn, D., K. O. Patten, J.-T. Lin, and D. J. Wuebbles, 2009: Explicit calculation of Indirect Global<br />
Warming Potentials for Halons using atmospheric models. Atmos. Chem. Phys., 9, 8719-8733.<br />
124. Youn, D., K. O. Patten, D. J. Wuebbles, H. Lee, and C.-W. So, 2010: Potential impact of iodinated<br />
replacement compounds CF3I and CH3I on atmospheric ozone: A three-dimensional modeling study.<br />
Atmos. Chem. Phys., under review.<br />
ATM090032<br />
125. Yoshimura, K., and M. Kanamitsu (2009), Specification of External Forcing for Regional Model<br />
Integrations, Mon. Wea. Rev., 137, 1409-1421.<br />
ATM100048<br />
126. Becker, M., M. S. Gilmore, J. Naylor, J. Weber, G. P. Compo, J. Whitaker, T. M. Hamill, and R.<br />
Maddox, 2010: Simulations of the supercell outbreak of 18 March 1925. 25th Conf. Severe Local Storms,<br />
Denver, CO, October 2010.<br />
127. Burkett, L., M. S. Gilmore, R. Thompson, R. Edwards, J. M. Straka, and R. B. Wilhelmson, 2010:<br />
Characteristics of supercells simulated with tornadic and non-tornadic RUC-2 proximity soundings: Part I:<br />
Sensitivity to convective initiation mechanisms. 25th Conf. on Severe Local Storms, Denver, CO, Amer.<br />
Meteor. Soc. To be presented October 2010.<br />
128. ——, ——, ——, and ——, 2011: Embryo differences between simulated High and Low Plains<br />
hailstorms. Atmos. Res. In Preparation.<br />
129. ——, L. J. Wicker, G. S. Romine, and R. B. Wilhelmson, 2011: Initial look at dynamic processes<br />
maintaining long-track tornadoes. Geo. Res. Lett. In preparation.<br />
130. ——, L. Burkett, R. Thompson, and R. Edwards, 2010: Characteristics of supercells simulated with<br />
tornadic and non-tornadic RUC-2 proximity soundings: Part II: Relationship between sounding predictors<br />
and low-level rotation. 25th Conf. on Severe Local Storms, Denver, CO, Amer. Meteor. Soc. To be presented<br />
October 2010.<br />
131. Naylor, J., M. Askelson, and M. S. Gilmore, 2010: Influences of low-level thermodynamic structure<br />
on convective downdraft properties. Part II: Three-dimensional simulations and trajectory analysis. Mon.<br />
Wea. Rev., To be submitted Sept. 2010.<br />
132. Naylor, J., M. S. Gilmore, R. Thompson, and R. Edwards, 2010: Characteristics of supercells<br />
simulated with tornadic and non-tornadic RUC-2 proximity soundings: Part III: Comparisons at tornado-<br />
resolving grid spacing. 25th Conf. on Severe Local Storms, Denver, CO, Amer. Meteor. Soc. To be presented<br />
October 2010.<br />
133. ——, M. S. Gilmore, J. M. Straka and R. B. Wilhelmson, 2010: Forward operators for the<br />
assimilation of polarimetric radar observations. J. Atmos. Ocean. Tech., in preparation.<br />
134. —— and ——, 2010: A novel, multiple liquid and ice hydrometeor species, hybrid bulk/bin, three-<br />
moment microphysics parameterization scheme. 13th Conf. on Cloud Physics, Portland, OR, Amer. Meteor.<br />
Soc., Presented 28 June 2010. Also in prep. for J. Appl. Meteor. Clim.<br />
ATM100050<br />
135. Tan Bo and Michael L. Kaplan, 2010: Subtropical Jetstream Circulations during the 2004 North<br />
American Monsoon Part I – Synoptic Analysis on the Subtropical Jet Stream. J. Climate. (not published)<br />
MCA94P023<br />
136. Burkett, L., M. S. Gilmore, R. Thompson, R. Edwards, J. M. Straka, and R. B. Wilhelmson, 2010:<br />
Characteristics of supercells simulated with tornadic and non-tornadic RUC-2 proximity soundings: Part I:<br />
Sensitivity to convective initiation mechanisms. 25th Conf. on Severe Local Storms, Denver, CO, Amer.<br />
Meteor. Soc. To be presented October 2010.<br />
137. ——, ——, ——, and ——, 2011: Embryo differences between simulated High and Low Plains<br />
hailstorms. Atmos. Res. In Preparation.<br />
138. ——, L. J. Wicker, G. S. Romine, and R. B. Wilhelmson, 2011: Initial look at dynamic processes<br />
maintaining long-track tornadoes. Geo. Res. Lett. In preparation.<br />
139. ——, L. Burkett, R. Thompson, and R. Edwards, 2010: Characteristics of supercells simulated with<br />
tornadic and non-tornadic RUC-2 proximity soundings: Part II: Relationship between sounding predictors<br />
and low-level rotation. 25th Conf. on Severe Local Storms, Denver, CO, Amer. Meteor. Soc. To be<br />
presented October 2010.<br />
140. Grim, J. A., R. M. Rauber, G. M. McFarquhar, B. F. Jewett, and D. P. Jorgensen, 2009:<br />
Development and forcing of the rear inflow jet in a rapidly developing and decaying squall line during<br />
BAMEX. Mon. Wea. Rev., 137, 1206-1229.<br />
141. Houston, A. L. and R. B. Wilhelmson, 2010a: The impact of airmass boundaries on deep<br />
convection: A numerical case study in a high-CAPE, low-shear environment. Part I: Storm organization and<br />
propagation. Mon. Wea. Rev., in preparation.
142. Houston, A. L. and R. B. Wilhelmson, 2010b: The impact of airmass boundaries on deep<br />
convection: A numerical case study in a high-CAPE, low-shear environment. Part II: Storm longevity and<br />
the mechanism of initiation. Mon. Wea. Rev., in preparation.<br />
143. Houston, A. L. and R. B. Wilhelmson, 2010c: The impact of airmass boundaries on deep<br />
convection: A numerical case study in a high-CAPE, low-shear environment. Part III: Mesocyclones. Mon.<br />
Wea. Rev., in preparation.<br />
144. Houston, A. L. and R. B. Wilhelmson, 2010d: The impact of airmass boundaries on deep<br />
convection: A numerical case study in a high-CAPE, low-shear environment. Part IV: Near surface vortex<br />
sheets. Mon. Wea. Rev., in preparation.<br />
145. Jewett, B. F., R. M. Rauber, G. McFarquhar, J. R. French, and K. R. Knupp, 2010: Profiling of<br />
winter storms (PLOWS): What we are learning about winter precipitation bands. Preprints, 13th Conf. on<br />
Cloud Physics, Amer. Meteor. Soc., Portland, OR, P1.74 (28 June 2010).<br />
146. Maliawco, R. J., G. M. McFarquhar, and B. F. Jewett, 2010: Sensitivity of the evolution of an<br />
idealized tropical cyclone to small perturbations in initial input fields. Preprints, 29th Conf. on Hurricanes<br />
and Tropical Meteorology, Amer. Meteor. Soc., Tucson, P2.120.<br />
147. ——, M. S. Gilmore, J. M. Straka and R. B. Wilhelmson, 2010: Forward operators for the<br />
assimilation of polarimetric radar observations. J. Atmos. Ocean. Tech., in preparation<br />
148. Romine, G. S., M. S. Gilmore, J. M. Straka and R. B. Wilhelmson, 2009: Forward operators for the<br />
assimilation of polarimetric radar observations. J. Atmos. Ocean. Tech.<br />
149. —— and ——, 2010: A novel, multiple liquid and ice hydrometeor species, hybrid bulk/bin, three-<br />
moment microphysics parameterization scheme. 13th Conf. on Cloud Physics, Portland, OR, Amer. Meteor.<br />
Soc., Presented 28 June 2010. Also in prep. for J. Appl. Meteor. Clim.<br />
150. Wilhelmson, R., E. Wiziechi, A. Knecht, T. Daley, W. Davis, S. Yalda, R. D. Clark, S. States, R.<br />
Junod, S. Cecelski, L. David, 2009: An Interactive, Integrated, Instructional Pathway to the LEAD Science<br />
Gateway, TeraGrid Science Presentation, June 22 - 25, 2009, Arlington, VA.<br />
http://www.teragrid.org/tg09/<br />
Chemical, Thermal Systems<br />
ASC040046<br />
151. Situ, Y., Liu, L., Martha, C. S., Louis, M. E., Li, Z., Sameh, A. H., Blaisdell, G. A., and Lyrintzis, A.<br />
S., “Reducing Communication Overhead in Large Eddy Simulation of Jet Engine Noise,” accepted for the<br />
IEEE Cluster 2010 Conference to be held September 20-24, 2010 in Heraklion, Crete.<br />
152. Lyrintzis, A. S., Blaisdell, G. A., Lo, S.-C., “Large Eddy Simulations of Hot Supersonic Jets for<br />
Aeroacoustics,” Final <strong>Report</strong> on subcontract to Craft Tech. on STTR Topic No. N09-T008 Phase I (US<br />
Navy), March 2, 2010.<br />
ASC090076<br />
153. L. Duan, X. Wang, and X. Zhong, “A high-order cut-cell method for numerical simulation of<br />
hypersonic boundary-layer instability with surface roughness,” Journal of Computational Physics, In Press,<br />
Corrected Proof, Available online 19 June 2010.<br />
154. P. S. Rawat and X. Zhong, “On high-order shock-fitting and front-tracking schemes for numerical<br />
simulation of shock–disturbance interactions,” Journal of Computational Physics, In Press, Corrected Proof,<br />
Available online 31 May 2010.<br />
155. E. Johnsen et al. (including X. Zhong), “Assessment of high-resolution methods for numerical<br />
simulations of compressible turbulence with shock waves,” Journal of Computational Physics, 229 (2010)<br />
1213–1237.<br />
156. Y. Huang and X. Zhong, “Numerical Study of Laser-Spot Effects on Boundary-Layer Receptivity for<br />
Blunt Compression-Cones in Mach-6 Freestream,” AIAA paper 2010-4447, June 2010.<br />
157. L. Duan and X. Zhong, “A High Order Cut Cell Method for Numerical Simulation of Three<br />
Dimensional Hypersonic Boundary-Layer Transition with Finite Surface Roughness,” AIAA paper 2010-<br />
1450, January 2010.<br />
158. J. Lei and X. Zhong, “Linear Stability Analysis of Nose Bluntness Effects on Hypersonic Boundary<br />
Layer Transition,” AIAA paper 2010-898, January 2010.<br />
CTS070070<br />
159. D. H. Lee, R. Akhavan (2009) Scaling of polymer drag reduction with polymer and flow parameters in<br />
turbulent channel flow. Advances in Turbulence XII, 359-362, Springer.<br />
160. R. Akhavan, J. Liu, D. H. Lee (2009) A mixed Eulerian-Lagrangian scheme for direct numerical simulation<br />
of viscoelastic turbulent flows. J. Non-Newtonian Fluid Mech. (submitted).<br />
161. D. H. Lee, R. Akhavan (2010) Skin-friction drag reduction by dilute polymer solutions in turbulent channel<br />
flow. Part 1. Scaling of drag reduction with polymer and flow parameters. J. Fluid Mech. (submitted).<br />
162. D. H. Lee, R. Akhavan (2010) Skin-friction drag reduction by dilute polymer solutions in turbulent channel<br />
flow. Part 2. Mechanism of drag reduction. J. Fluid Mech. (submitted).<br />
163. R. Akhavan (2010) Stochastic simulations of drag reduction by dilute polymer solutions. J. Fluid Mech.<br />
(in preparation).<br />
CTS080033<br />
164. J. B. Perot, Determination of the Decay Exponent in Mechanically Stirred Isotropic Turbulence,<br />
Fluid Dynamics Research, 2010.<br />
165. A. Khajeh-Saeed and J. B. Perot, Using GPU Supercomputers to Massively Accelerate Pattern<br />
Matching, GPU GEMS 4, 2010.<br />
166. T. McGuiness, A. Khajeh-Saeed, and J. B. Perot, High Performance Computing on GPU Clusters,<br />
Journal of Computational Science, 2010.<br />
167. J. B. Perot, Discrete Conservation Properties of Unstructured Mesh Schemes, Annual Review of<br />
Fluid Mechanics, 2010.<br />
168. A. Khajeh-Saeed, S. Poole and J. B. Perot, Acceleration of the Smith-Waterman Algorithm using<br />
Single and Multiple Graphics Processors, Journal of Computational Physics, 229 (11) 4247-4258, 2010.<br />
CTS090100<br />
169. S. C. Ammal and A. Heyden, “Modeling the noble metal/TiO2(110) interface with hybrid DFT<br />
functionals: A Periodic Electrostatic Embedded Cluster Model Study”, J. Chem. Phys. (manuscript under<br />
revision)<br />
170. S. C. Ammal and A. Heyden, “Theoretical investigation of the water-gas shift reaction at the three-<br />
phase boundary of the Pt/TiO2 interface”, to be submitted to J. Phys. Chem. C.<br />
171. J. S. Ratliff, S. A. Tenney, S. C. Ammal, A. Heyden, and D. A. Chen, “CO adsorption and oxidation<br />
on Pt, Au and Pt-Au clusters on TiO2(110)”, to be submitted to J. Phys. Chem. C.<br />
CTS090103<br />
172. VAN POPPEL, B., DESJARDINS, O., DAILY, J.W. (2010) A Ghost Fluid, Level Set Methodology<br />
for Simulating Electrohydrodynamic Atomization of Liquid Fuels, manuscript submitted to J. of Comp. Phys.<br />
173. DESJARDINS, O. and PITSCH, H. (2010) Detailed numerical investigation of turbulent atomization<br />
of liquid jets, accepted for publication in Atomization and Sprays<br />
174. DESJARDINS, O. and PITSCH, H. (2009) A spectrally refined interface approach for simulating<br />
multiphase flows, J. Comp. Physics, 228 (5), 1658-1677
175. VAN POPPEL, B., DESJARDINS, O., DAILY, J.W. (2010) Simulating Electrohydrodynamic<br />
Atomization for Fuel Injection. ILASS-Americas 22nd Annual Conference on Liquid Atomization and Spray<br />
Systems, Cincinnati, OH.<br />
176. CZAJKOWSKI, M., DESJARDINS, O. (2010) Quadrature-free Discontinuous Galerkin Level Set<br />
Scheme. ILASS-Americas 22nd Annual Conference on Liquid Atomization and Spray Systems, Cincinnati,<br />
OH.<br />
177. VAN POPPEL, B., DESJARDINS, O., DAILY, J.W. (2010) Modeling two-phase<br />
electrohydrodynamic flows, 48th Annual AIAA Aerospace Sciences Meeting, Orlando, FL.<br />
178. VAN POPPEL, B., DESJARDINS, O., DAILY, J.W. (2009) Modeling electrohydrodynamic<br />
atomization, 62nd Annual Meeting of the APS Division of Fluid Dynamics, Minneapolis, MN.<br />
179. DESJARDINS, O. (2009) Adaptive spectral refinement for accurate simulations of turbulent multiphase<br />
flows, International Conference on Liquid Atomization and Sprays, Vail, CO.<br />
180. DESJARDINS, O. (2009) Detailed numerical investigation of turbulent atomization of liquid jets,<br />
International Conference on Liquid Atomization and Sprays, Vail, CO.<br />
181. PAI, M., DESJARDINS, O., PITSCH, H. (2009) Detailed numerical simulations of liquid jets in<br />
cross-flow, Proceedings of the 47th AIAA Aerospace Meeting, Orlando, FL.<br />
MCA06N038<br />
182. Reddy, H. and Abraham, J. (2010) Ignition kernel development studies relevant to lean-burn natural<br />
gas engines. Accepted for publication in Fuel.<br />
183. Venugopal, R. and Abraham, J. (2010) Numerical studies of the response of flamelets to<br />
unsteadiness in the near-field of jets under diesel conditions. Combustion Science and Technology,<br />
182(7):717-738.<br />
184. Owston, R. and Abraham, J. (2010) Structure of hydrogen triple flames and premixed flames<br />
compared. Combustion and Flame, 157:1552-1565.<br />
185. Owston, R. and Abraham, J. (2010) Numerical study of hydrogen triple flame response to mixture<br />
stratification, ambient temperature, pressure, and water vapor concentration. International Journal of<br />
Hydrogen Energy, 35(10):4723-4735.<br />
186. Owston, R. and Abraham, J. (2009) Flame propagation in stratified hydrogen-air mixtures: spark<br />
placement effects. International Journal of Hydrogen Energy, 34(15):6532-6544.<br />
187. Venugopal, R. and Abraham, J. (2009) Unsteady flamelet response in the near field of high-<br />
Reynolds-number jets. AIAA Journal, 47(6):1491-1506.<br />
188. Venugopal, R. and Abraham, J. (2009) Numerical studies of vortex-induced extinction/reignition<br />
relevant to the near-field of high-Reynolds-number jets. Physics of Fluids, 21, 055106.<br />
189. Owston, R., Magi, V. and Abraham, J. (2010) Some numerical considerations in the simulation of<br />
low-Ma number hydrogen/air mixing layers. Central States Section Meeting of the Combustion Institute,<br />
Urbana-Champaign, March.<br />
190. Mukhopadhyay, S. and Abraham, J. (2010) Ignition behavior in autoigniting stratified mixtures<br />
under IC engine conditions. Central States Section Meeting of the Combustion Institute, Urbana-Champaign,<br />
March.<br />
191. Reddy, H. and Abraham, J. (2010) Influence of vortex parameters on the outcome of kernel-vortex<br />
interactions in lean methane-air mixtures. Central States Section Meeting of the Combustion Institute,<br />
Urbana-Champaign, March.<br />
192. Owston, R. and Abraham, J. (2010) Numerical modeling of triple flames: exploratory studies.<br />
Central States Section Meeting of the Combustion Institute, Urbana-Champaign, March.
Chemistry<br />
CHE060025<br />
193. S. R. Stoyanov, A. V. Titov and P. Král, Transition metal and nitrogen doped carbon<br />
nanostructures, Coord. Chem. Rev., 253, 2852 (2009).<br />
194. L. Vuković and P. Král, Coulombically driven rolling of nanorods on water, Phys. Rev. Lett. 103,<br />
246103 (2009).<br />
195. A. V. Titov, B. Wang, K. Sint and P. Král, Controllable synthetic molecular channels: Biomimetic<br />
ammonia switch, J. Phys. Chem. B 114, 1174 (2010).<br />
CHE080076<br />
196. Azaria S. Eisenberg, Vasiliy Znamenskiy, Iya Likhtina, and Ronald L. Birke, “Electronic<br />
Spectroscopy and Computational Studies of Glutathionylco(III)balamin”, in preparation (2010).<br />
197. R. L. Birke, V. Znamenskiy, and J. R. Lombardi, “A charge-transfer surface enhanced Raman<br />
scatter model from time-dependent density functional theory calculations on a Ag10-pyridine complex,” J.<br />
Chem. Phys. 132, 214707(1-15) (2010).<br />
198. J. R. Lombardi and R. L. Birke, “A Unified View of Surface-Enhanced Raman Spectroscopy”,<br />
Acc. Chem. Res., 42, 734-742 (2009)<br />
CHE080083<br />
199. E. Chagarov, A.C. Kummel, “Density Functional Theory Simulations of High-k Oxides on III-V<br />
Semiconductors”, book chapter in “Fundamentals of III-V Semiconductor MOSFETs”, eds. S.<br />
Oktyabrsky, P. Ye, Springer Science+Business Media LLC (2010).<br />
200. M. Houssa, E. Chagarov, A. Kummel, “Surface Defects and Passivation of Ge and III-V Interfaces”,<br />
MRS Bulletin 34, 504 (2009). (Research results of Dr. Chagarov and Prof. Kummel were presented on<br />
the journal cover page).<br />
201. E.A. Chagarov, A.C. Kummel, “Molecular dynamics simulation comparison of atomic scale<br />
intermixing at the amorphous Al2O3/semiconductor interface for a-Al2O3/Ge, a-Al2O3/InGaAs, and a-<br />
Al2O3/InAlAs/InGaAs”, Surface Science 603, 3191 (2009).<br />
202. E.J. Kim, E. Chagarov, J. Cagnon, Y. Yuan, A. C. Kummel, P. M. Asbeck, S. Stemmer, K. C.<br />
Saraswat, and P. C. McIntyre, “Atomically abrupt and unpinned Al2O3 / In0.53Ga0.47As interfaces:<br />
Experiment and simulation”, Journal of Applied Physics 106 (12), 124508 (2009).<br />
203. E.A. Chagarov, A.C. Kummel, “Ab initio molecular dynamics simulations of properties of a-<br />
Al2O3/vacuum and a-ZrO2/vacuum vs. a-Al2O3/Ge(100)(2x1) and a-ZrO2/Ge(100)(2x1) interfaces”, Journal<br />
of Chemical Physics 130, 124717 (2009).<br />
204. M. Rodwell, W. Frensley, S. Steiger, E. Chagarov, S. Lee, H. Ryu, Y. Tan, G. Hegde, L. Wang, J.<br />
Law, T. Boykin, G. Klimeck, P. Asbeck, A. Kummel, “III-V FET Channel Designs for High Current Densities<br />
and Thin Inversion Layers” IEEE Proceedings of the Device Research Conference 2010<br />
205. Jian Shen, Jonathon B. Clemens, Darby L. Feldwinn, Wilhelm Melitz, Tao Song, Evgueni A.<br />
Chagarov, Ravi Droopad and Andrew C. Kummel, “Structural and Electronic Properties of Group III Rich<br />
In0.53Ga0.47As(001)”, Surface Science, (In press) (2010)<br />
206. E. Chagarov, A. Kummel, “Density Functional Theory Simulations of Amorphous High-k Oxides on<br />
A Compound Semiconductor Alloy: a-Al2O3/InGaAs(4x2)(100), a-HfO2/InGaAs(4x2)(100), a-<br />
HfO2/OH/InGaAs(4x2)(100) and a-ZrO2/InGaAs(4x2)(100).” (in preparation).<br />
207. E.A. Chagarov, A.C. Kummel, “Density-Functional Theory Molecular Dynamics Simulations of<br />
Oxide, Nitride, and OxyNitride High-k/Ge(100)(2x1) interface passivation”, (research paper) (in preparation).
208. J. B. Clemens, E. A. Chagarov, M. Holland, R. Droopad, J. Shen, and A. C. Kummel, “Atomic<br />
imaging of the monolayer nucleation and unpinning of a compound semiconductor surface during atomic<br />
layer deposition” J. Chem. Phys. (submitted) (2010)<br />
209. S. R. Bishop, J. B. Clemens, E. A. Chagarov, and A. C. Kummel, “Theoretical analysis of initial<br />
adsorption of high-κ dielectric oxides on group III rich InxGa1-xAs(001)-(4×2) surfaces”, J. Chem. Phys.<br />
(submitted) (2010)<br />
210. Jian Shen, Evgueni A. Chagarov, Darby L. Feldwinn, Wilhelm Melitz, Nancy M. Santagata and<br />
Andrew C. Kummel, Ravi Droopad, Matthias Passlack, “Scanning Tunneling Microscopy/Spectroscopy<br />
Study of Atomic and Electronic Structures of In2O on InAs and In0.53Ga0.47As(001)-(4×2) Surfaces”, J.<br />
Chem. Phys. (submitted) (2010).<br />
CHE090062<br />
211. Deborah L. Heyl, Joshua M. Osborne, Sarika Pamarthy, Shyamprasad Samisetti, Andrew W. Gray,<br />
Anitha Jayaprakash, Srikanth Konda, Dorothy J. Brown, Samuel R. Miller, Reza Eizadkhah, Maria C. Milletti<br />
‘Liposome Damage and Modeling of Fragments of Human Islet Amyloid Polypeptide (IAPP) Support a Two-<br />
Step Model of Membrane Destruction’ International Journal of Peptide Research and Therapeutics, 2010,<br />
16, 43.<br />
212. Mark A. Lukowski, Dae Jin Choi and M.C. Milletti ‘Substrate Binding and Kinetic Aspects of the<br />
Peroxidation Reaction of Four Polyunsaturated Fatty Acids in the COX Active Site of PGHS-1’ Letters in<br />
Drug Design and Discovery, 2010, 7, 88-97.<br />
213. Mark Lukowski, Kathryn Jacobs, Powen Hsueh, Harriet A. Lindsay and M.C. Milletti<br />
‘Thermodynamic and kinetic factors in the aza-Cope rearrangement of a series of iminium cations’<br />
Tetrahedron 2009, 65, 10311-10316.<br />
CHE090066<br />
214. Yang, Q.; Huang, Z.; Khvostichenko, D.; Kucharski, T.; Chen, J.; Boulatov, R. A molecular force<br />
probe. Nature Nanotech. 2009, 4, 302-306.<br />
215. Huang, Z.; Boulatov, R. Exploring chemimechanics with molecular force probes (invited review).<br />
Pure Appl. Chem. 2010, 86, 931-952.<br />
216. Kucharski, T.; Yang, Q.; Huang, Z.; Tian, Y.; Rubin, N. J.; Concepcion, C. J.; Boulatov, R. Kinetics<br />
of thiol/disulfide exchange correlates weakly with the restoring force of the disulfide moiety. Angew. Chem.<br />
Int. Ed., 2009, 48, 7040 - 7043.<br />
217. Huang, Z.; Yang, Q.; Kucharski, T.; Khvostichenko, D.; Wakeman, S. M.; Boulatov, R. Macrocyclic<br />
disulfides for studies of sensitized photolysis of the S-S bond. Chem. Eur. J., 2009, 15, 5212-5214.<br />
218. Huang, Z.; Yang, Q.; Khvostichenko, D.; Kucharski, T.J.; Chen, J.; Boulatov, R. Method to derive<br />
restoring forces of strained molecules from kinetic measurements. J. Am. Chem. Soc. 2009, 131, 1407-1409<br />
CHE090070<br />
219. Xu, L.; Doubleday, C. E.; Houk, K. N. “Dynamics of 1,3-Dipolar Cycloaddition Reactions of<br />
Diazonium Betaines to Acetylene and Ethylene: Bending Vibrations Facilitate Reaction” Angew. Chem. Int.<br />
Ed. 2009, 48, 2746-2748.<br />
220. Xu, L.; Doubleday, C. E.; Houk, K. N. “Dynamics of 1,3-Dipolar Cycloadditions: Energy Partitioning<br />
of Reactants and Quantitation of Synchronicity” J. Am. Chem. Soc. 2010, 132, 3029–3037.<br />
221. Doubleday, C.; Bohnen, M. “Statistical and Nonstatistical Dynamics in a Thermal Rearrangement<br />
with Competing 1,3 and 3,3 Shifts”.
222. Doubleday, C.; Hakim, N. “Trajectory Dynamics of O(3P) Reactions with CH2F and CH2Cl<br />
Radicals”.<br />
223. Doubleday, C.; Goldenberg, S. “Trajectory Study of a Gas Phase Nucleophilic Addition to a<br />
Carbonyl: H – + HCOF”.<br />
CHE090124<br />
224. Identification of histidine biosynthesis inhibitors using docking, ensemble rescoring and whole cell<br />
assays, S. T. Henriksen, J. Liu, G. Estiu, Z. N. Oltvai, O. Wiest, Bioorg. Med. Chem. 2010, in press.<br />
225. Thermal behavior, powder X-ray diffraction, DFT calculations and vibrational spectra of barium<br />
bis(pentacyanonitrosylchromate) octahydrate, Ba3[Cr(CN)5NO]2·8H2O (D2O), D. B. Soria, G.L. Estiu,<br />
R. Carbonio, P.J. Aymonino, Spectrochim. Acta A Mol. Biomol. Spectrosc. 2010. PubMed ID 20457005.<br />
226. Affinity of sulfamates and sulfamides to Carbonic Anhydrase II isoform: experimental and molecular<br />
modeling approaches, L. Gavernet, J. Gonzalez Funes, L. Bruno-Blanch, G. Estiu, A. Maresca, C. Supuran, J.<br />
Chem. Inf. Model., in press.<br />
227. On the Inhibition of Histone Deacetylase 8, G. Estiu, N. West, R. Mazitschek, E. Greenberg, J.<br />
Bradner, O. Wiest, Bioorg. Med. Chem. 2010, 18(11), 4103-4110.<br />
228. Mechanism of unfolding of the helicate [Cu2(mphenpr)2]2+ into the [Cu(mphenpr)]+ monomer<br />
(mphenpr = 1,3-bis(9-methyl-1,10-phenanthrolin-2-yl)propane) and its coupling to the chlorocarbon<br />
dehalogenation. On the participation of metastable copper-alkyl species. L. Lemus, J. Guerrero,<br />
J. Costamagna, G. Estiu, G. Ferraudi, A. G. Lappin, A. Oliver and B. C. Noll, Inorg. Chem. 2010, 49, 4023.<br />
229. Selective Inhibition of BET Bromodomains. P. Filippakopoulos, J. Qi, S. Picaud, Y. Shen, W. B.<br />
Smith, O. Fedorov, E. M. Morse, T. Keates, T. T. Hickman, I. Felletar, M. Philpott, Sh. Munro, N. West, M. J.<br />
Cameron, T. D. Heightman, N. La Thangue, A. Kung, Ch. A. French, O. Wiest, S. Knapp, J. E. Bradner,<br />
Nature, accepted for publication.<br />
230. On the Structure and Dynamics of GNA. A. T. Johnson, M. Schlegel, E. Meggers, L. O. Essen, O.<br />
Wiest, J. Am. Chem. Soc., accepted for publication.<br />
231. Application of Q2MM for Stereoselective Reactions. S. N. Lill, P. Donoghue, A. Forbes, V.<br />
Verdolino, O. Wiest, P. Rydberg, P. O. Norrby, Curr. Org. Chem. 2010, accepted.<br />
232. On the Mechanism of the Rhodium Catalyzed Acrylamide Hydrogenation. V. Verdolino, A. Forbes,<br />
P. Helquist, P. O. Norrby, O. Wiest J. Mol. Cat A 2010, 324, 9-14<br />
233. Quantum Mechanical Studies of Transition Metal Catalyzed Hydrogenations V. Verdolino, A.<br />
Forbes, P. Helquist, O. Wiest. Submitted.<br />
CHE100115<br />
234. Giagou, T.; Meyer, M. P. “Mechanism of the Swern Oxidation: Significant Deviations from<br />
Transition State Theory” J. Org. Chem. (submitted).<br />
235. Zhu, H.; Meyer, M. P. “Cationic Intermediates in Friedel-Crafts Acylation: Structural Information<br />
from Theory and Experiment” Chem. Comm. (Invited article for ‘Emerging Investigators’ issue: submitted).<br />
236. Giagou, T.; Meyer, M. P. “Kinetic Isotope Effects in Asymmetric Reactions” Chem. Eur. J. (Invited<br />
Concept article: provisionally accepted).<br />
MCA08X021<br />
237. J.L. Belof, C.R. Cioce, X. Xu, X.P. Zhang, B. Space, H.L. Woodcock, “Characterization of Tunable<br />
Radical Metal-Carbenes: Key Intermediates in Catalytic Cyclopropanation”, J. Am. Chem. Soc., (submitted)
238. M.H. Alkordi, A.C. Stern, J.L. Belof, B. Space, L. Wojtas, M.J. Zaworotko, M. Eddaoudi,<br />
“Molecular Squares: Confined Space with Specific Geometry for Hydrogen Uptake”,<br />
J. Am. Chem. Soc., (submitted)<br />
240. N.R. McIntyre, E.W. Lowe, Jr., J.L. Belof, M. Ivkovic, J. Shafer, B. Space, D.J. Merkler,<br />
“Evidence for Substrate Pre-organization in the Peptidylglycine alpha-Amidating<br />
Monooxygenase (PAM) Reaction Describing the Contribution of Ground State Structure to Hydrogen<br />
Tunneling”, J. Am. Chem. Soc., (submitted)<br />
242. M.H. Alkordi, J.L. Belof, E. Rivera, L. Wojtas, J.J. Zhang, A. Kassas, M. Eddaoudi, “Insight into the<br />
Assembly Mechanism of Metal-Organic Materials”, J. Am. Chem. Soc., (in press)<br />
243. J.L. Belof, B. Space, “Rapidly Convergent Iterative Techniques Toward the Solution of Many-body<br />
Molecular Polarization Field Equations”, J. Chem. Theory Comput., (in press)<br />
244. J.L. Belof, A.C. Stern, B. Space, “A Predictive Model of Hydrogen Sorption in Metal-Organic<br />
Materials”, J. Phys. Chem. C, 113, 9316 (2009)<br />
Computer and Computation Research<br />
CCR100034<br />
245. K. Kandalla, H. Subramoni and D. K. Panda. Designing Topology-Aware Collective Communication<br />
Algorithms for Large Scale InfiniBand Clusters: Case Studies with Scatter and Gather. In Proceedings of the<br />
2010 IEEE International Symposium on Parallel and Distributed Processing, 2010.<br />
246. S. Potluri, P. Lai, K. Tomko, S. Sur, Y. Cui, M. Tatineni, K. Schulz, W. Barth, A. Majumdar and<br />
D. K. Panda. Quantifying Performance Benefits of Overlap using MPI-2 in a Seismic Modeling<br />
Application. Technical report, 2010.<br />
248. S. Potluri, S. Sur, P. Lai, K. Tomko, Y. Cui and D. K. Panda. Performance Analysis and<br />
Improved Communication Overlap for a Seismic Modeling Application on Large InfiniBand Clusters.<br />
Technical Report OSU-CISRC-4/10-TR09, 2010.<br />
249. Y. Cui, K. B. Olsen, T. H. Jordan, K. Lee, J. Zhou, P. Small, D. Roten, G. Ely, D. K. Panda,<br />
A. Chourasia, J. Levesque, S. M. Day, P. M. Maechling. Scalable Earthquake Simulation on Petascale<br />
Supercomputers, Finalist: ACM Gordon Bell Award, SC’10. Nov 2010.<br />
Cross-Disciplinary Activities<br />
CCR100034<br />
250. R. Stephens, Ph.D. Thesis, University of Illinois at Urbana-Champaign (2008); R. Stephens and<br />
R. Alkire, “Island Dynamics Algorithm for Kinetically-Limited Electrochemical Nucleation with<br />
Additives,” J. Electrochem. Soc., 156:1, D28-D35 (2009).<br />
251. R.N. Methekar, K. Chen, R.D. Braatz, and V.R. Subramanian, “Kinetic Monte Carlo Simulation of<br />
Surface Heterogeneity in Graphite Electrodes for Lithium-Ion Batteries: Passive Layer Formation and<br />
Simulation of Capacity Fade,” journal manuscript submitted (2010).<br />
252. D. Lubomirsky, Ph.D. Thesis in progress, University of Illinois, Urbana (2011).<br />
253. M. Karulkar, F. Xue, R.D. Braatz, and R.C. Alkire, “Effect of Additives on Kinetically-Limited<br />
Electrodeposition, Part I: Simulation of Nucleation and Substrate Overgrowth,” manuscript in preparation.<br />
254. A. Bezzola, Ph.D. Thesis in progress, University of California at Santa Barbara (2012).<br />
255. R. Stephens, M. Willis and R. Alkire, “Additive-Assisted Nucleation and Growth by<br />
Electrodeposition: Part II: Mathematical Model and Comparison with Experimental Data,” J. Electrochem.<br />
Soc., 156:10, D385-D394 (2009).
256. M. Willis, R. Stephens, and R. Alkire, “Additive-Assisted Nucleation and Growth by<br />
Electrodeposition: Part I: Experimental Studies with Copper Seed Arrays on Gold Films,” J. Electrochem.<br />
Soc., 156:10, D377-D384 (2009).<br />
257. R. Stephens, M. Buoni, L. Petzold, and R. Alkire, “The Effect of Solution-Phase Mass Transport on<br />
Metal Nucleation and Growth during Electrochemical Deposition with Additives,” manuscript in preparation<br />
(2010).<br />
258. M. Kishida and R.D. Braatz, “Worst-case Analysis of Distributed Parameter Systems with<br />
Application to the 2D Reaction-Diffusion Equation,” Optimal Control Applications & Methods, in press (2010).<br />
259. L. Tierney, “Markov Chains for Exploring Posterior Distributions,” Annals of Statistics, 22, 1701-<br />
1728 (1994); V. Ramadesigan, V. Boovaragavan, M. Arabandi, K. Chen, H. Tuskamoto, R.D. Braatz, and<br />
V. Subramanian, “Parameter Estimation and Capacity Fade Analysis of Lithium-Ion Batteries Using First-<br />
Principles-Based Efficient Reformulated Models,” ECS Transactions, 19:16, 11-19 (2009).<br />
260. V. Ramadesigan, V. Boovaragavan, M. Arabandi, K. Chen, H. Tuskamoto, R.D. Braatz, and V.<br />
Subramanian, “Parameter Estimation and Capacity Fade Analysis of Lithium-Ion Batteries Using<br />
Reformulated Models,” journal manuscript nearly completed for submission (2010).<br />
261. R.N. Methekar, V. Boovaragavan, M. Arabandi, V. Ramadesigan, V.R. Subramanian, F. Latinwo,<br />
and R.D. Braatz, “Optimal Spatial Distribution of Microstructure in Porous Electrodes for Li-Ion Batteries,”<br />
Proceedings of the American Control Conference, Baltimore, Maryland, 6600-6605 (2010).<br />
262. V.R. Subramanian, V. Boovaragavan, V. Ramadesigan, K. Chen, and R.D. Braatz, “A<br />
Reformulated Model and Its Applications to Bayesian Parameter Estimation for Lithium-ion Batteries,”<br />
Preprints of the Foundations of Computer-Aided Process Design, Breckenridge, CO, June 7-12 (2009).<br />
263. V.R. Subramanian, V. Boovaragavan, V. Ramadesigan, K. Chen, and R.D. Braatz, “Model<br />
Reformulation and Design of Lithium-Ion Batteries,” in Design for Energy and the Environment, edited by<br />
M.M. El-Halwagi and A.A. Linninger, CRC Press, Boca Raton, FL, 987-1006 (2009).<br />
264. V. Ramadesigan, R.N. Methekar, V.R. Subramanian, F. Latinwo, and R.D. Braatz, “Optimal<br />
Porosity Distribution for Minimized Ohmic Drop Across a Porous Electrode,” journal manuscript submitted<br />
(2010).<br />
265. R.N. Methekar, V. Ramadesigan, V.R. Subramanian, and R.D. Braatz, “Optimum Charging Profile<br />
for Lithium-Ion Batteries to Maximize Energy Storage and Utilization,” ECS Transactions, 25:35, 139-146<br />
(2010).<br />
266. R.N. Methekar, V. Ramadesigan, V.R. Subramanian, and R.D. Braatz, “Optimal Charging Profile<br />
for Lithium-Ion Batteries to Maximize Energy Storage,” journal manuscript submitted (2010).<br />
Materials Research<br />
ASC090058<br />
267. Akihiro Kushima, Sidney Yip, and Bilge Yildiz, “Competing Strain Effects in Reactivity of LaCoO3<br />
with Oxygen”, submitted to Phys. Rev. B (2010).<br />
268. Akihiro Kushima and Bilge Yildiz, “Oxygen ion diffusivity in strained yttria stabilized zirconia: where<br />
is the fastest strain”, J. Mater. Chem., (2010), DOI: 10.1039/c000259c.<br />
269. Akihiro Kushima and Bilge Yildiz, “Role of Lattice Strain and Defect Chemistry on the Oxygen<br />
Vacancy Migration at the (8.3%Y2O3-ZrO2)/SrTiO3 Hetero-Interface: A First Principles Study”, ECS Trans.<br />
25, 1599, (2009).<br />
270. European Materials Research Society 2010 Spring Meeting, Strasbourg, France, June 7-11, 2010,<br />
Akihiro Kushima, Jeong Woo Han, Sidney Yip, and Bilge Yildiz, “Ionic transport and surface reactions on<br />
LaCoO3 under strain: Insights from ab initio calculations”.<br />
271. 216th Electrochemical Society Meeting, Vienna, Austria, October 4-9, 2009, Akihiro Kushima and<br />
Bilge Yildiz, “Role of Lattice Strain and Defect Chemistry on the Oxygen Vacancy Migration at the<br />
(8.3%Y2O3-ZrO2)/SrTiO3 Hetero-Interface: A First Principles Study”.
DMR050002<br />
272. Unusual dielectric response in B-site size-disordered hexagonal transition metal oxides,<br />
D. Choudhury, A. Venimadhav, C. Kakarla, K. T. Delaney, P. S. Devi, P. Mondal,<br />
R. Nirmala, J. Gopalakrishnan, N. A. Spaldin, U. V. Waghmare, and D. D. Sarma, Appl. Phys. Lett. 96,<br />
162903 (2010)<br />
273. Theoretical study of Schottky-barrier formation at epitaxial rare-earth metal/semiconductor<br />
interfaces, K. Delaney, N. A. Spaldin and C. G. Van de Walle, Phys. Rev. B 81, 165312 (2010).<br />
274. Electron-lattice instabilities suppress cuprate-like electronic structures in SrFeO3/SrTiO3<br />
superlattices, J. M. Rondinelli and N. A. Spaldin, Phys. Rev. B 81, 085109(2010).<br />
275. Strain-induced isosymmetric phase transition in BiFeO3, A. J. Hatt, N. A. Spaldin and<br />
C. Ederer, Phys. Rev. B 81, 054109 (2010).<br />
276. Strain-Induced Ferroelectricity in Simple Rocksalt Binary Oxides, E. Bousquet, N. A. Spaldin and<br />
Ph. Ghosez, Phys. Rev. Lett. 104, 037601(2010).<br />
277. Mn3+ in Trigonal Bipyramidal Coordination: A New Blue Chromophore, A. E. Smith,<br />
H. Mizoguchi, K. Delaney, N. A. Spaldin, A. W. Sleight and M. A. Subramanian,<br />
J. Am. Chem. Soc. 131, 17084 (2009).<br />
278. First-principles modeling of ferroelectric capacitors via constrained displacement field calculations,<br />
M. Stengel, D. Vanderbilt and N. A. Spaldin, Phys. Rev. B 80, 224110(2009). Editor’s Choice<br />
279. A strain-driven morphotropic phase boundary in BiFeO3, R. J. Zeches, M. D. Rossell,<br />
J. X. Zhang, A. J. Hatt, Q. He, C.-H. Yang, A. Kumar, C. H. Wang, A. Melville,<br />
C. Adamo, G. Sheng, Y.-H. Chu, J. F. Ihlefeld, R. Erni, C. Ederer, V. Gopalan,<br />
L. Q. Chen, D. G. Schlom, N. A. Spaldin, L. W. Martin and R. Ramesh, Science 326, 977-980 (2009).<br />
280. Current trends of the magnetoelectric effect, M. Fiebig and N. A. Spaldin, Eur. Phys.<br />
J. B 71, 293(2009).<br />
281. Strain effects on the electric polarization of BiMnO3, A. J. Hatt and N. A. Spaldin, Eur. Phys. J. B<br />
71, 435(2009).<br />
282. First-principles study of ferroelectric domain walls in multiferroic bismuth ferrite, A. Lubk, S.<br />
Gemming and N. A. Spaldin, Phys. Rev. B 80, 104110 (2009). Editor’s Choice<br />
283. Non-d0 Mn-driven ferroelectricity in antiferromagnetic BaMnO3, J. M. Rondinelli, A.<br />
S. Eidelson and N. A. Spaldin, Phys. Rev. B 79, 205119 (2009). Editor’s Choice<br />
284. Enhancement of ferroelectricity at metal/oxide interfaces, M. Stengel, D. Vanderbilt and N. A.<br />
Spaldin, Nature Materials, 8, 392(2009).<br />
285. Superexchange-driven magnetoelectricity in magnetic vortices, K. T. Delaney, M. Mostovoy and N.<br />
A. Spaldin, Phys. Rev. Lett. 102, 157203(2009).<br />
286. Role of Atomic Multiplets in the Electronic Structure of Rare-Earth Semiconductors and<br />
Semimetals, L. V. Pourovskii, K. T. Delaney, C. G. Van de Walle, N. A. Spaldin, and A. Georges, Phys. Rev.<br />
Lett. 102, 096401 (2009).<br />
287. Electric displacement as the fundamental variable in electronic-structure calculations,<br />
M. Stengel, N. A. Spaldin and D. Vanderbilt, Nature Physics 5, 304-308 (2009).<br />
288. Conduction at domain walls in oxide multiferroics, J. Seidel, L. W. Martin, Q. He,<br />
Q. Zhan, Y.-H. Chu, A. Rother, M. E. Hawkridge, P. Maksymovych, P. Yu, M. Gajek, N. Balke, S. V. Kalinin,<br />
S. Gemming, F. Wang, G. Catalan, J. F. Scott,<br />
N. A. Spaldin, J. Orenstein and R. Ramesh, Nature Materials 8, 229 (2009).<br />
289. Structural effects on the spin-state transition in epitaxially strained LaCoO3 films, J.<br />
M. Rondinelli and N. A. Spaldin, Phys. Rev. B 79, 054409(2009).<br />
DMR070008<br />
290. Ajmi BH. Hamouda, T. J. Stasevich, Alberto Pimpinelli, and T. L. Einstein, Effects of Impurities on Surface<br />
Morphology: Some Examples, J. Phys.: Condens. Matter 21, 084215 (2009)<br />
291. Ajmi BH. Hamouda, Rajesh Sathiyanarayanan, A. Pimpinelli, and T. L. Einstein,<br />
Role of Codeposited Impurities During Growth: I. Explaining Distinctive Experimental Morphology on<br />
Cu(001), preprint being submitted to Phys. Rev. B<br />
292. Rajesh Sathiyanarayanan, Ajmi BH. Hamouda, A. Pimpinelli, and T. L. Einstein,<br />
Role of Codeposited Impurities During Growth: II. Dependence of Morphology on Binding and Barrier<br />
Energies, preprint being submitted to Phys. Rev. B<br />
293. Rajesh Sathiyanarayanan and T. L. Einstein, Ab-initio Calculations of Interactions between Cu<br />
Adatoms on Cu(110): Sensitivity of Strong Multi-site Interactions to Adatom Relaxations, Surface Sci. 603,<br />
2387 (2009)<br />
294. T. L. Einstein, Alberto Pimpinelli, et al., “Distinctive Features in Growth on Vicinal Cu(100):<br />
Understanding the Role of Impurities by Calculating Key Energies and Simulating Morphology,” 27th Max<br />
Born Symposium on Multiscale Modeling of Real Materials, Wroclaw, Poland, Sept. 2010 [invited].<br />
295. T. L. Einstein, Alberto Pimpinelli, et al., “Modeling the Role of Co-deposited Impurities in Growth:<br />
What Causes the Distinctive Step Meandering and Pyramidal Mounds on Cu(001),” German Physical<br />
Society (DPG), Symposium on Crystal Growth Kinetics, Regensburg, Germany, March 2010 [invited].<br />
296. Ajmi BH. Hamouda, Rajesh Sathiyanarayanan, A. Pimpinelli, and T. L. Einstein, “Role of<br />
Codeposited Impurities in Growth: Explaining Cu(0 0 1),” American Physical Society, Portland (OR), Mar.<br />
2010<br />
297. Rajesh Sathiyanarayanan, Ajmi BH. Hamouda, A. Pimpinelli, and T. L. Einstein, “Role of<br />
Codeposited Impurities in Growth: Dependence of Morphology on Binding and Barrier Energies,” American<br />
Physical Society, Portland (OR), Mar. 2010<br />
298. Rajesh Sathiyanarayanan and T. L. Einstein, “Role of Adatom Relaxations in Computing Lattice-gas<br />
Energies: Multisite Interactions,” American Physical Society, Pittsburgh, Mar. 2009<br />
299. Alberto Pimpinelli, Ajmi BH. Hamouda, and T. L. Einstein, “Impurities in Vacuum Deposition: Effect<br />
on Island Nucleation and Surface Morphologies,” American Physical Society, Pittsburgh, Mar. 2009<br />
DMR080070<br />
300. Simulation studies of length dependence scaling exponent of inverted frustum spring constant, K.K.<br />
Mon.<br />
301. Sound velocity of mass modulated nanostructured composite, K.K. Mon.<br />
302. Numerical solution of the anisotropic diffusion equation for one hard disk in a channel with<br />
obstruction, K.K. Mon.<br />
303. Monte Carlo simulations of two-dimensional model for hopping times, K.K. Mon.<br />
DMR080072<br />
304. S.R. Wilson, C.M. Hefferan, S.F. Li, J. Lind, R.M. Suter and A.D. Rollett, “Microstructural<br />
Characterization and Evolution in 3D,” Risoe 2010 Symposium proceedings, accepted.<br />
305. U. Lienert, M.C. Brandes, J.V. Bernier, M.J. Mills, M.P. Miller, S.F. Li, C.M. Hefferan,<br />
J. Lind, R.M. Suter, “3DXRD at the Advanced Photon Source: Orientation Mapping and<br />
Deformation Studies,” Risoe 2010 Symposium proceedings, accepted.<br />
307. C.M. Hefferan, S.F. Li, J. Lind, U. Lienert, A.D. Rollett, P. Wynblatt, R.M. Suter, “Statistics of<br />
High Purity Nickel Microstructure From High Energy X-ray Diffraction Microscopy,” Computers, Materials<br />
and Continua, 14, 209-219 (2009).<br />
308. C.M. Hefferan, S.F. Li, J. Lind, and R.M. Suter, “Tests of Microstructure Reconstruction by Forward<br />
Modeling of HEDM Data,” Journal of Powder Diffraction, 25, 132-137 (2010).<br />
309. C. M. Hefferan, S. F. Li, J. Lind, U. Lienert, A. D. Rollett, R. M. Suter, “In-situ Observation of<br />
Recovery and Grain Growth in High Purity Aluminum,” Proceedings of the 2010 International Conference on<br />
Recrystallization and Grain Growth, Sheffield, England, July 2010.<br />
310. U. Lienert, S.F. Li, C.M. Hefferan, J. Lind, R.M. Suter, C. Brandes, J.V. Bernier,<br />
M.J. Mills, M.P. Miller, C. Wejdemann, and W. Pantleon, “High Energy Diffraction Microscopy at the<br />
Advanced Photon Source,” Journal of Materials, in preparation (invited paper).<br />
312. S.F. Li, C.M. Hefferan, J. Lind, U. Lienert, A.D. Rollett, R.M. Suter, “Three dimensional grain<br />
boundary networks in defected materials,” in preparation.<br />
313. S.F. Li, J. Lind, C.M. Hefferan, U. Lienert, A.D. Rollett, R.M. Suter, “Three dimensional<br />
microstructure statistics from high energy x-ray diffraction microscopy data”, in preparation.<br />
DMR090073<br />
314. Adapted Su-Schrieffer-Heeger Hamiltonian for conducting polymers, Li MH and Lin X<br />
315. How does self-trapped soliton escape, Shin YW, Li MH, Botelho AL, and Lin X<br />
316. Computing the viscosity of supercooled liquids. III. Markov network model, Li J, Kushima A, Eapen<br />
J, Lin X, Qian XF, Mauro JC, Diep P, and Yip S<br />
317. Twin distortions of the Peierls instability, Li MH and Lin X, Physical Review B 81, Art. No. 153102,<br />
Apr 8 2010<br />
318. Commentary on the temperature-dependent viscosity of supercooled liquids: A unified activation<br />
scenario, Kushima A, Lin X, and Yip S, Journal of Physics: Condensed Matter 21, Art. No. 504104,<br />
Nov 11 2009<br />
320. Computing the viscosity of supercooled liquids. II. Silica and strong-fragile crossover behavior,<br />
Kushima A, Lin X, Li J, Qian XF, Eapen J, Mauro JC, Diep P, and Yip S, Journal of Chemical Physics 131,<br />
Art. No. 164505, Oct 27 2009<br />
321. Computing the viscosity of supercooled liquids Kushima A, Lin X, Li J, Eapen J, Mauro JC, Qian<br />
XF, Diep P, and Yip S Journal of Chemical Physics 130: Act. No. 224504 June 11 2009<br />
MCA08X007<br />
322. J. W. Reiner, A. M. Kolpak, Y. Segal, K. F. Garrity, S. Ismail-Beigi, C. H. Ahn, and F. J. Walker,<br />
“Crystalline oxides on silicon,” Adv. Mater. (Early View) (2010).<br />
323. A. M. Kolpak, F. J. Walker, J. W. Reiner, Y. Segal, D. Su, M. S. Sawicki, C. C. Broadbridge, Z.<br />
Zhang, Y. Zhu, C. H. Ahn, and S. Ismail-Beigi, “Inhibition of ferroelectricity in Si/SrTiO3 due to intrinsic<br />
interface effects,” under review, Phys. Rev. Lett. (2010).<br />
325. A. M. Kolpak and S. Ismail-Beigi, “First-principles studies of epitaxial SrTiO3 on silicon, Part I:<br />
Interface structure and film polarization,” in preparation (2010).<br />
326. A. M. Kolpak, F. J. Walker, J. W. Reiner, C. H. Ahn, and S. Ismail-Beigi, “Ferroelectricity in atomic<br />
layer thick MX2 films on silicon,” in preparation (2010).<br />
327. A. M. Kolpak and S. Ismail-Beigi, “First-principles studies of epitaxial SrTiO3 on silicon, Part II:<br />
Thermodynamic stability and growth kinetics,” in preparation (2010).
328. H. Chen, A. M. Kolpak, and S. Ismail-Beigi, “Electronic and Magnetic Properties of SrTiO3/LaAlO3<br />
Interfaces from First Principles,” Adv. Mater. (Early View) (2010).<br />
329. H. Chen, A. M. Kolpak, and S. Ismail-Beigi, “A first-principles study of LaAlO3/SrTiO3<br />
heterointerfaces and their variants,” under review, Phys. Rev. B (2010).<br />
330. K. F. Garrity, A. M. Kolpak, S. Ismail-Beigi, and E. A. Altman, “Chemistry of ferroelectric surfaces,”<br />
Adv. Mater. (Early View) (2010).<br />
331. K. F. Garrity, A. M. Kolpak, and S. Ismail-Beigi, “Ferroelectric surface chemistry: a first-principles<br />
study of CO2 on PbTiO3,” in preparation (2010).<br />
332. H. Tang and S. Ismail-Beigi, “Phase diagram of two-dimensional Mg boride nanostructures -A<br />
genetic algorithm study,” in preparation (2010).<br />
MCA08X029<br />
333. T.-L. Chan, A.T. Zayak, G.M. Dalpian, and J.R. Chelikowsky, “Size effects on diffusion barriers in<br />
semiconductor nanocrystals," Phys. Rev. Lett., 102: 025901 (2009).<br />
334. J.R. Chelikowsky, A.T. Zayak, T.-L. Chan, M.L. Tiago, Y. Zhou, and Y. Saad, “Algorithms for the<br />
Electronic and Vibrational Properties of Nanocrystals," J. Phys. Cond. Matter, 21: 064207 (2009).<br />
335. J.R. Chelikowsky, Y. Saad, T.-L Chan, M.L. Tiago, A.T. Zayak, and Y. Zhou “Pseudopotentials on<br />
Grids: Application to the Electronic, Optical, and Vibrational Properties of Silicon Nanocrystals,” Journal of<br />
Computational and Theoretical Nanoscience, 6: 1247 (2009).<br />
336. T.-L. Chan, C.Z. Wang, K.M. Ho, and J.R. Chelikowsky, “Efficient first-principles simulation of noncontact<br />
atomic force microscopy for structural analysis," Phys. Rev. Lett., 102: 176101 (2009).<br />
337. M.L. Tiago, J.C. Idrobo, S. Ogut, J. Jellinek, and J.R. Chelikowsky, “Electronic and optical<br />
excitations in Agn clusters (n=1-8): Comparison of density-functional and many-body theories,” Phys. Rev.<br />
B, 79: 155419 (2009).<br />
338. K.H. Khoo and J.R. Chelikowsky, “Electron transport across carbon nanotube junctions decorated<br />
with Au nanoparticles: Density functional calculations,” Phys. Rev. B, 79: 205422 (2009).<br />
339. H. Kwak and J.R. Chelikowsky, “Size-dependent Spin-polarization of Carbon-doped ZnO<br />
Nanocrystals,” Appl. Phys. Lett. 95: 263108 (2009).<br />
340. I. Vasiliev, M. Lopez del Puerto, M. Jain, A. Lugo-Solis, and J.R. Chelikowsky, “Application of<br />
time-dependent density-functional theory to molecules and nanostructures,” Theochem (special issue entitled<br />
“TD-DFT for molecules and molecular solids”) 914: 115 (2009).<br />
341. J.R. Chelikowsky, Y. Saad, T.-L. Chan, M.L. Tiago, A.T. Zayak, and Y. Zhou, “Pseudopotentials on<br />
Grids: Application to the Electronic, Optical, and Vibrational Properties of Silicon Nanocrystals,” Journal of<br />
Computational and Theoretical Nanoscience 6: 1247 (2009). K. Sohlburg, editor (special issue on first<br />
principles calculations for nanostructures).<br />
342. J.R. Chelikowsky, “Using Silicon to Understand Silicon,” in Into the Nano Era – Moore's Law<br />
Beyond Planar Silicon CMOS, H. Huff, editor, Springer Series in Materials Science (Springer-Verlag, 2009),<br />
p.41.<br />
343. Y. Saad, J.R. Chelikowsky, and S. Shontz, “Numerical Methods for Electronic Structure<br />
Calculations of Materials,” SIAM Review, 52: 3 (2010).<br />
344. T.-L. Chan and J.R. Chelikowsky, “Controlling diffusion in semiconductor nanostructures by size<br />
and dimensionality,” Nano Lett. 10: 821 (2010).<br />
345. J.-H. Eom, T.-L. Chan, and J.R. Chelikowsky, “The role of vacancies on B doping in Si<br />
nanocrystals,” Solid State Commun. 150: 130 (2010).<br />
346. M. Lopez del Puerto, M. Jain, and J.R. Chelikowsky, “Time-dependent density functional theory<br />
calculation of the Stokes shift in hydrogenated silicon clusters,” Phys. Rev. B 81: 035309 (2010).<br />
347. L.V. Besteiro, L. Tortajada, M.L. Tiago, L.J. Gallego, J.R. Chelikowsky, and M.M.G. Alemany:<br />
“Efficient n-type doping of zinc-blende III-V semiconductor quantum dots,” Phys. Rev. B. 81: 121307R<br />
(2010).
348. T.-L. Chan, J.R. Chelikowsky, K.-M. Ho, C.-Z. Wang, S. Zhang: “The effect of interface on quantum<br />
confinement and electronic structure across the Pb/Si(111) junction,” Nature Physics (submitted).<br />
349. T.-L. Chan, H. Kwak, J.-H. Eom, S. Zhang, and J.R. Chelikowsky: “A theory of self-purification in Si<br />
nanocrystals,” Phys. Rev. B (submitted).<br />
350. K.H. Khoo, M. Kim, G. Schofield and J.R. Chelikowsky: “Real space pseudopotential methods for<br />
ab initio molecular dynamics simulations,” Phys. Rev. B (under review).<br />
351. K.H. Khoo, A.T. Zayak, H. Kwak and J.R. Chelikowsky: “First-principles study of confinement<br />
effects on the Raman spectra of Si Nanocrystals,” Phys. Rev. Lett. (under review).<br />
352. I. Vasiliev and J.R. Chelikowsky: “Real-space calculations of atomic and molecular polarizabilities<br />
using asymptotically correct exchange-correlation potentials,” Phys. Rev. A (in press).<br />
353. G. Schofield, J.R. Chelikowsky and Y. Saad: “Using Chebyshev-Filtered Subspace Iteration and<br />
Windowing Methods to Solve the Kohn-Sham Problem,” Comp. Phys. Comm. (submitted).<br />
354. T.-L. Chan, J.R. Chelikowsky and S.B. Zhang: “Universality of formation energies in doping Si<br />
nanostructures,” Phys. Rev. Lett. (submitted).<br />
355. J. Han, T.-L. Chan and J.R. Chelikowsky: “Quantum confinement, core level shifts, and dopant<br />
segregation in P-doped Si⟨110⟩ nanowires,” Phys. Rev. B (submitted).<br />
356. J.R. Chelikowsky, “Tool for predicting nanomaterial properties,” Handbook of Nanophysics, K.<br />
Sattler, editor in chief. (Taylor and Francis, NY, 2009), (in press).<br />
357. J.R. Chelikowsky, “Electrons in semiconductors: Empirical and ab initio pseudopotential theories,”<br />
Comprehensive Semiconductor Science and Technology, Hiroshi Kamimura, editor (Elsevier), (in press).<br />
358. J.R. Chelikowsky, “Algorithms for Predicting Electronic, Magnetic and Vibrational Properties of<br />
Nanocrystals,” Computational Nanoscience, RSC Theoretical and Computational Chemistry Series (in<br />
press).<br />
359. J.R. Chelikowsky, T.-L. Chan, M.M.G Alemany and G. Dalpian, “Computational Studies of<br />
Functionalized Nanostructures,” Reports on Progress in Physics, (submitted).<br />
360. J.R. Chelikowsky, “Advances in Real Space Algorithms for the Kohn-Sham Problem,” Theoretical<br />
Chemistry Accounts [feature article], (in preparation).<br />
MCA09X001<br />
361. Y. L. Lee, J. Kleis, J. Rossmeisl, and D. Morgan, Ab initio energetics of LaBO3(001) (B=Mn, Fe,<br />
Co, and Ni) for solid oxide fuel cell cathodes, Physical Review B 80 (2009).<br />
362. M. de Dompablo, Y. L. Lee, and D. Morgan, First Principles Investigation of Oxygen Vacancies in<br />
Columbite MNb2O6 (M = Mn, Fe, Co, Ni, Cu), Chemistry of Materials 22, 906.<br />
363. J. D. Tucker, R. Najafabadi, T. R. Allen, and D. Morgan, Ab initio-based diffusion theory and tracer<br />
diffusion in Ni-Cr and Ni-Fe alloys, Submitted to Journal of Nuclear Materials (2010).<br />
364. S. Choudhury, L. Barnard, J. D. Tucker, T. R. Allen, B. D. Wirth, M. Asta, and D. Morgan, Ab-initio<br />
based modeling of diffusion in bcc Fe-Ni-Cr alloys and implications for radiation induced segregation,<br />
Submitted to Journal of Nuclear Materials (2010).<br />
365. David Shrader, Sarah M. Khalil, Tyler Gerczak, Todd R. Allen, Andrew J. Heim, Izabela Szlufarska,<br />
and D. Morgan, Ag Diffusion in Cubic Silicon Carbide, Submitted to Journal of Nuclear Materials (2010).<br />
366. W. Donner, A. J. Jacobson, C. L. Chen, Y.-L. Lee, M. Gadre, and D. Morgan, Epitaxial Strain-<br />
Induced Chemical Ordering in La0.5Sr0.5CoO3−δ Films on SrTiO3, Submitted (2010).<br />
MCA93S030<br />
367. Miguel A. Morales, Carlo Pierleoni, Eric Schwegler, and D. M. Ceperley, Evidence for a first-order<br />
liquid-liquid transition in high-pressure hydrogen from ab initio simulations, Proceedings of the National<br />
Academy of Sciences (2010).
368. M. A. Morales, C. Pierleoni, and D. M. Ceperley, “Equation of State of liquid hydrogen at high<br />
pressure from quantum Monte Carlo Calculations”, Phys. Rev. E 81, 021202 (2010).<br />
369. F. Lin, M. A. Morales, K. T. Delaney, C. Pierleoni, R. M. Martin, and D. M. Ceperley,<br />
“Quantum Monte Carlo Calculation of Liquid Hydrogen Electrical Conductivity”, Phys. Rev. Lett. 103, 256401<br />
(2009).<br />
370. B. Clark, M. Casula, D. M. Ceperley, “Phase Transitions in the 2D one component plasma”, Phys.<br />
Rev. Lett. 103, 055701 (2009).<br />
371. M. White, M. Pasienski, D. McKay, S. Zhou, D. Ceperley, B. DeMarco, Strongly interacting<br />
bosons in a disordered optical lattice, Phys. Rev. Letts., 102, 055301 (2009); arXiv:0807.0446v2<br />
[physics.atom-ph]<br />
372. Simo Huotari, J. Aleksi Soininen, Tuomas Pylkkänen, Keijo Hämäläinen, Arezki Issolah, Andrey<br />
Titov, Jeremy McMinis, Jeongnim Kim, Ken Esler, David M. Ceperley, Markus Holzmann, Valerio Olevano,<br />
Momentum Distribution and Renormalization Factor in Sodium and the Electron Gas, submitted Phys. Rev.<br />
Letts. June 2010.<br />
373. K. P. Esler, R. E. Cohen, B. Militzer, Jeongnim Kim, R. J. Needs, and M. D. Towler,<br />
Fundamental High-Pressure Calibration from All-Electron Quantum Monte Carlo Calculations, Phys. Rev.<br />
Lett. 104, 185702 (2010)<br />
Mathematical Sciences<br />
DMS080028<br />
374. B. Kouchmeshky and N. Zabaras, "The effect of multiple sources of uncertainty on convex hull of<br />
material properties", Computational Materials Science, Vol. 47, pp. 342-352, 2009.<br />
375. X. Ma and N. Zabaras, "An adaptive high-dimensional stochastic model representation technique<br />
for the solution of stochastic partial differential equations", Journal of Computational Physics, Vol. 229, pp.<br />
3884-3915, 2010.<br />
376. B. Kouchmeshky and N. Zabaras, "Microstructure model reduction and uncertainty quantification in<br />
multiscale deformation processes", Computational Materials Science, Vol. 48, pp. 213-227, 2010.<br />
377. Z. Li, B. Wen and N. Zabaras, "Computing mechanical response variability of<br />
polycrystalline microstructures through dimensionality reduction techniques",<br />
Computational Materials Science, in press, 2010.<br />
B. Wen, Z. Li and N. Zabaras, "Thermal response variability of random polycrystalline microstructures",<br />
Communications in Computational Physics, submitted.<br />
378. X. Ma and N. Zabaras, "A stochastic mixed finite element heterogeneous multiscale method for<br />
flow in porous media", Journal of Computational Physics, submitted.<br />
Molecular Biosciences<br />
CHE090097<br />
379. D. Svozil, P. Hobza, and J. Šponer. Comparison of Intrinsic Stacking Energies of Ten Unique<br />
Dinucleotide Steps in A-RNA and B-DNA Duplexes. Can We Determine Correct Order of Stability by<br />
Quantum-Chemical Calculations? J. Phys. Chem. B, 114:1191–1203, 2010.<br />
380. B. Heddi, C. Oguey, C. Lavelle, N. Foloppe, and B. Hartmann. Intrinsic flexibility of B-DNA: the<br />
experimental TRX scale. Nucl. Acid. Res., 38:1034–1047, 2010.<br />
381. A. Gil, V. Branchadell, J. Bertran, and A. Oliva. An Analysis of the Different Behavior of DNA and<br />
RNA through the Study of the Mutual Relationship between Stacking and Hydrogen Bonding. J. Phys.<br />
Chem. B, 113:4907–4914, 2009.<br />
382. R. H. D. Lyngdoh and H. F. Schaefer III. Elementary Lesions in DNA Subunits: Electron, Hydrogen<br />
Atom, Proton, and Hydride Transfers. Acc. Chem. Res., 42:563–572, 2009.
383. A. Szyperska, J. Rak, J. Leszczynski, X. Li, Y. J. Ko, H. Wang, and K. H. Bowen. Valence Anions<br />
of 9-Methylguanine–1-Methylcytosine Complexes. Computational and Photoelectron Spectroscopy Studies.<br />
J. Am. Chem. Soc., 131:2663–2669, 2009.<br />
384. K. E. Riley, M. Pitoňák, J. Černý, and P. Hobza. On the Structure and Geometry of Biomolecular<br />
Binding Motifs (Hydrogen-Bonding, Stacking, X–H···π): WFT and DFT Calculations. J. Chem. Theory<br />
Comput., 6:66–80, 2010.<br />
385. R. M. Balabin. Communications: Is quantum chemical treatment of biopolymers accurate?<br />
Intramolecular basis set superposition error (BSSE). J. Chem. Phys., 132:231101–1–4, 2010.<br />
MCA03S027<br />
386. A. Black Pyrkosz, J. Eargle, A. Sethi, and Z. Luthey-Schulten. Exit strategies for charged tRNA from<br />
GluRS. J. Mol. Biol., 397:1350-1371, Apr 2010.<br />
387. Rebecca W. Alexander, John Eargle, and Zaida Luthey-Schulten. Experimental and computational<br />
determination of tRNA dynamics. FEBS letters, 584:376-386, Jan 2010.<br />
388. D. C. Mathew and Z. Luthey-Schulten. Influence of montmorillonite on nucleotide oligomerization<br />
reactions: a molecular dynamics study. Orig. Life Evol. Biosph., 40:303-317, Jun 2010.<br />
389. Elijah Roberts, John E Stone, Leonardo Sepulveda, Wen-Mei W Hwu, and Zaida Luthey-Schulten.<br />
Long time-scale simulations of in vivo diffusion using GPU hardware. In Proceedings of the 2009 IEEE<br />
International Symposium on Parallel & Distributed Processing, May 2009.<br />
390. A. Sethi, J. Eargle, A. A. Black, and Z. Luthey-Schulten. Dynamical networks in tRNA:protein<br />
complexes. Proc. Natl. Acad. Sci. U.S.A., 106:6620-6625, Apr 2009.<br />
391. K. Chen, J. Eargle, K. Sarkar, M. Gruebele, and Z. A. Luthey-Schulten. The functional role of<br />
ribosomal signatures. Submitted, 2010.<br />
MCA04N033<br />
392. Bolintineanu, D.S., et al., Poisson-Nernst-Planck Models of Nonequilibrium Ion Electrodiffusion<br />
through a Protegrin Transmembrane Pore. PLoS Comput Biol, 2009. 5(1): p. e1000277.<br />
393. Sotiropoulos, V., et al., Model Reduction of Multiscale Chemical Langevin Equations: A Numerical<br />
Case Study. IEEE/ACM Transactions on Computational Biology and Bioinformatics, 2009. 99(2): p. 5555.<br />
394. Sotiropoulos, V. and Y.N. Kaznessis, Analytical Derivation of Moment Equations in Stochastic<br />
Chemical Kinetics. J. Chem. Phys., 2009. submitted.<br />
395. Sayyed-Ahmad, A. and Y.N. Kaznessis, Determining the orientation of protegrin-1 in DLPC bilayers<br />
using an implicit solvent-membrane model. PLoS One, 2009. 4(3): p. e4799.<br />
396. Ramalingam, K., et al., Forward engineering of synthetic bio-logical AND gates Biochemical<br />
Engineering Journal, 2009. 47(1-3): p. 38-47.<br />
397. Bolintineanu, D., et al., Antimicrobial mechanism of pore-forming protegrin peptides: 100 pores to<br />
kill E. coli. Peptides, 2010. 31(1): p. 1-8.<br />
398. Vivcharuk, V. and Y. Kaznessis, Free energy profile of the interaction between a monomer or a<br />
dimer of protegrin-1 in a specific binding orientation and a model lipid bilayer. J Phys Chem B, 2010. 114(8):<br />
p. 2790-7.<br />
399. Vivcharuk, V. and Y.N. Kaznessis, Dimerization of protegrin-1 peptides in different environments.<br />
Int. J. Mol. Sci., 2010. Submitted June 2010.<br />
400. Biliouris, K., P. Daoutidis, and Y.N. Kaznessis, Stochastic simulations of the tetracycline operon.<br />
BMC Syst Biol, 2010. Submitted.
MCA05S010<br />
401. “Exchange often and properly in replica exchange molecular dynamics.” Daniel J. Sindhikara,<br />
Daniel J. Emerson, Adrian E. Roitberg. Journal of Chemical Theory and Computation. Submitted and under<br />
review (2010)<br />
402. “Electronic Spectra of the Nanostar Dendrimer: Theory and Experiment.” Julio L. Palma, Evrim<br />
Atas, Lindsay Hardison, Todd B. Marder, Jonathan C. Collings, Andrew Beeby, Joseph S. Melinger, Jeffrey<br />
L. Krause, Valeria D. Kleiman, and Adrian E. Roitberg. Journal of Physical Chemistry. Submitted and under<br />
review (2010)<br />
403. “Comparative validation of computational methods for conformational analysis. The non-aromatic<br />
cases of alanine3 and glycine3”. L. Fusti-Molnar, G. Seabra, Robert Abel, A. E. Roitberg and Kenneth M.<br />
Merz, Jr. Journal of Chemical Theory and Computation. Submitted and under review (2010)<br />
404. “p38γ activation triggers dynamical changes in allosteric docking sites”. Ramiro G. Rodriguez<br />
Limardo, Dardo N. Ferreiro, Adrian E. Roitberg, Marcelo A. Marti and Adrian G. Turjanski. Biochemistry.<br />
Submitted and under review (2010)<br />
405. “Unidirectional energy transfer in conjugated molecules: the crucial role of high frequency C≡C<br />
bonds.” S. Fernandez-Alberti, Valeria D. Kleiman, S. Tretiak, and Adrian E. Roitberg. Journal of Physical<br />
Chemistry Letters. Submitted and under review (2010)<br />
“Substrate Stereo-specificity in tryptophan dioxygenase and indoleamine 2,3-dioxygenase” L. Capece,<br />
M. Arrar, A. E. Roitberg, Syun-Ru Yeh, M. A. Marti, D. A. Estrin. Proteins: Structure, Function, and<br />
Bioinformatics. doi:10.1002/prot.22819 (2010)<br />
406. “Impact of calcium on N1 influenza neuraminidase dynamics and binding free energy.” Morgan<br />
Lawrenz, Jeff Wereszczynski, Rommie Amaro, Ross Walker, Adrian Roitberg, J. Andrew McCammon.<br />
Proteins: Structure, Function, and Bioinformatics 78:2523–2532 (2010)<br />
407. "Constant pH Replica Exchange Molecular Dynamics in Biomolecules Using a Discrete Protonation<br />
Model". Meng Y, Roitberg AE, Journal of Chemical Theory and Computation, 6:1401–1412 (2010).<br />
408. “Computational Studies of Ammonia Channel Function in Glutamine 5'-<br />
Phosphoribosylpyrophosphate Amidotransferase”, Wang, XS, Roitberg, AE, Richards, NGJ, Biochemistry,<br />
48:12272-12282 (2009).<br />
409. “Apo and Nickel-Bound Forms of the Pyrococcus horikoshii Species of the Metalloregulatory<br />
Protein: NikR Characterized by Molecular Dynamics Simulations”, Sindhikara, DJ, Roitberg, AE, Merz, KM,<br />
Biochemistry, 48:12024-12033 (2009).<br />
410. “AM1 Parameters for the Prediction of H-1 and C-13 NMR Chemical Shifts in Proteins”, Williams,<br />
DE, Peters, MB, Wang, B, Roitberg, AE, Merz, KM, Journal of Physical Chemistry A, 113:11550-11559<br />
(2009).<br />
411. “Are Current Semiempirical Methods Better Than Force Fields? A Study from the Thermodynamics<br />
Perspective”, Seabra, GD, Walker, RC, Roitberg, AE, Journal of Physical Chemistry A, 113:11938-11948<br />
(2009).<br />
MCA05S027<br />
412. Bellesia, G.; Shea, J.-E. What determines the structure and stability of KFFE monomers, dimers,<br />
and protofibrils? Biophys. J. 2009, 96, 875-886.<br />
413. Wu, C.; Murray, M. M.; Bernstein, S. L.; Condron, M. M.; Bitan, G.; Shea, J.-E.; Bowers, M. T. The<br />
structure of Aβ42 C-terminal fragments probed by a combined experimental and theoretical study. J. Mol.<br />
Biol. 2009, 387, 492-501.<br />
414. Murray, M. M.; Krone, M. G.; Bernstein, S. L.; Baumketner, A.; Condron, M. M.; Lazo, N. D.;<br />
Teplow, D. B.; Wyttenbach, T.; Shea, J.-E.; Bowers, M. T. Amyloid β-protein: Experiment and theory on the<br />
21-30 fragment. J. Phys. Chem. B 2009, 113, 6041-6046.<br />
415. Bernstein, S. L.; Dupuis, N. F.; Lazo, N. D.; Wyttenbach, T.; Condron, M. M.; Bitan, G.; Teplow, D.<br />
B.; Shea, J.-E.; Ruotolo, B. T.; Robinson, C. V.; Bowers, M. T. Amyloid-β protein oligomerization and the<br />
importance of tetramers and dodecamers in the aetiology of Alzheimer's disease. Nat. Chem. 2009, 1, 326-<br />
331.
416. Armstrong, B. D.; Soto, P.; Shea, J.-E.; Han, S. Overhauser dynamic nuclear polarization and<br />
molecular dynamics simulations using pyrroline and piperidine ring nitroxide radicals. J. Magn. Reson. 2009,<br />
200, 137-141.<br />
417. Penev, E. S.; Lampoudi, S.; Shea, J.-E. TiReX: Replica-exchange molecular dynamics using<br />
TINKER. Comput. Phys. Commun. 2009, 180, 2013-2019.<br />
418. Wu, C.; Biancalana, M.; Koide, S.; Shea, J.-E. Binding modes of thioflavin-T to the single-layer<br />
β-sheet of the peptide self-assembly mimics. J. Mol. Biol. 2009, 394, 627-633.<br />
419. Dupuis, N. F.; Wu, C.; Shea, J.-E.; Bowers, M. T. Human islet amyloid polypeptide monomers form<br />
ordered β-hairpins: A possible direct amyloidogenic precursor. J. Am. Chem. Soc. 2009, 131, 18283–18292.<br />
420. Grabenauer, M.; Wu, C.; Soto, P.; Shea, J.-E.; Bowers, M. T. Oligomers of the prion protein<br />
fragment 106-126 are likely assembled from β-hairpins in solution and methionine oxidation inhibits<br />
assembly without altering the peptide's monomeric conformation. J. Am. Chem. Soc. 2010, 132, 532-539.<br />
421. Wei, G. H.; Jewett, A. I.; Shea, J.-E. Structural diversity of the dimer of the Alzheimer amyloid-β<br />
(25-35) peptide and polymorphism of the resulting fibrils. Phys. Chem. Chem. Phys. 2010, 12, 3622-3629.<br />
422. Wu, C.; Bowers, M. T.; Shea, J.-E. Molecular structures of quiescently-grown and brain-derived<br />
polymorphic fibrils of the Alzheimer amyloid Aβ9-40 peptide: A comparison to agitated fibrils. PLoS Comp.<br />
Biol. 2010, 6, e1000693.<br />
423. Wu, C.; Bowers, M. T.; Shea, J.-E. The binding of thioflavin T and of PIB to the cross-β subunit of<br />
the Alzheimer Aβ9-40 protofibril. 2010, submitted.<br />
424. Wu, C.; Shea, J.-E. On the origins of the weak folding cooperativity of a designed ultrafast<br />
protein. 2010, submitted.<br />
MCA06N060<br />
425. G. Enkavi and E. Tajkhorshid. Dynamics of Spontaneous Substrate Binding and Identification of the<br />
Binding Site in GlpT. Biochemistry., 49:1105–1114, 2010.<br />
426. J. Diao, A. J. Maniotis, R. Folberg and E. Tajkhorshid. Interplay of mechanical and binding<br />
properties of type I fibronectin. Theoret. Chem. Acc., 125:397–405, 2010.<br />
427. Y. Wang, S. A. Shaikh and E. Tajkhorshid. Exploring transmembrane diffusion pathways with<br />
molecular dynamics. Physiology, 25: 142-145, 2010.<br />
428. Y. Z. Ohkubo, J. H. Morrissey, and E. Tajkhorshid. Dynamical view of membrane binding and<br />
complex formation of human factor VIIa and tissue factor. J. Thromb. Haemost., 8:1044, 2010<br />
429. Y. Wang and E. Tajkhorshid. Exploring the Permeability of Aquaporin AQP4 to Signaling<br />
Gas Molecule NO with Molecular Dynamics. PROTEINS, 78(3): 661-670, 2010.<br />
J. H. Morrissey, R. L. Davis-Harrison, N. Tavoosi, K. Ke, V. Pureza, J. M. Boettcher, M. C. Clay, C. M.<br />
Rienstra, Y. Z. Ohkubo, T. V. Pogorelov, and Emad Tajkhorshid. Protein-Phospholipid interactions in blood<br />
clotting. Thromb. Research, 125 (Suppl. 1):S23, 2010.<br />
Saher Shaikh, Po-Chao Wen, Giray Enkavi, Zhijian Huang, and Emad Tajkhorshid. Capturing Functional<br />
Motions of Membrane Channels and Transporters with Molecular Dynamics Simulation. J. Comput. Theor.<br />
Nanosci., 7:1-10, 2010.<br />
J. Feng, E. Lucchinetti, G. Enkavi, Y. Wang, P. Gehrig, B. Roschitzki, M. Schaub, E. Tajkhorshid, K. Zaugg<br />
and M. Zaugg. Phosphorylation of the conserved aromatic ladder (Y194/Y190) in adenine nucleotide<br />
translocase 1 (ANT1, AAC1) is Src-family kinase-mediated and regulates ADP/ATP exchange in<br />
mitochondria: a downstream mechanism of cardioprotection. Am. J. Physiol. Cell Physiol., 298:C740–C748, 2010.<br />
430. F. Khalili-Araghi, V. Jogini, V. Yarov-Yarovoy, E. Tajkhorshid, B. Roux, and K. Schulten.<br />
Calculation of the gating charge for the Kv1.2 voltageactivated potassium channel. Biophys. J., 98:2189–<br />
2198, 2010.<br />
431. X.-D. Yang, E. Tajkhorshid, and L.-F. Chen Functional Interplay between Acetylation and<br />
Methylation of the RelA Subunit of NF-kappaB. Mol. Cell. Biol., 30:2170–2180, 2010.<br />
432. Z. Huang and E. Tajkhorshid. Identification of the Third Na+ Site and the Sequence of Extracellular<br />
Binding Events in the Glutamate Transporter. Biophys. J., In press.
433. H.-C. Siebert, M. Burg-Roderfeld, Th. Eckert, S. Stötzel, U. Kirch, T. Diercks, M. J.<br />
Humphries, M. Frank, R. Wechselberger, E. Tajkhorshid, and S. Oesser. Interaction of the alpha2A domain<br />
of integrin with small collagen fragments. Protein and Cell, In press.<br />
434. P. C. Wen, Z. Huang, G. Enkavi, Y. Wang, J. C. Gumbart and E. Tajkhorshid. Molecular<br />
Mechanisms of active transport across the cellular membrane. In, Editors: Mark S.P. Sansom and Philip C.<br />
Biggin, “Molecular Simulations and Biomembranes: From Biophysics to Function”, Royal Society of<br />
Chemistry. ISBN:978-0-85404-189-3. In Press.<br />
435. S. A. Shaikh and E. Tajkhorshid. Modeling and dynamics of the inward-facing state of a Na+/Cl−<br />
dependent neurotransmitter transporter homologue. PLoS Comput. Biol., In Press.<br />
436. J. C. Gumbart, M. C. Wiener and E. Tajkhorshid. Coupling of calcium and substrate binding<br />
through<br />
loop alignment in the outer membrane transporter BtuB. J. Mol. Biol., 393:1129–1142, 2009.<br />
437. C. J. Law, G. Enkavi, D. N. Wang and E. Tajkhorshid. Structural basis of substrate selectivity in the<br />
glycerol-3-phosphate:phosphate antiporter GlpT. Biophys. J.,97:1346–1353, 2009.<br />
438. J. Li and E. Tajkhorshid. Ion-releasing State of A Secondary Membrane Transporter. Biophys. J.,<br />
97:L29–L31, 2009.<br />
J. Dittmer, L. Thøgersen, J. Underhaug, K. Bertelsen, T. Vosegaard, J. M. Pedersen, B. Schiøtt,<br />
E. Tajkhorshid, T. Skrydstrup and N. C. Nielsen. Incorporation of Antimicrobial Peptides into<br />
Membranes: A Combined Liquid-State NMR and Molecular Dynamics Study of Alamethicin in DMPC/DHPC<br />
Bicelles. J. Phys. Chem. B, 113:6928–6937, 2009.<br />
439. K. Bertelsen, B. Paaske, L. Thogersen, E. Tajkhorshid, B. Schiott, T. Skrydstrup, N. Chr. Nielsen,<br />
and Th. Vosegaard Residue-Specific Information about the Dynamics of Antimicrobial Peptides from<br />
1H-15N and 2H Solid-State NMR Spectroscopy. J. Am. Chem. Soc., 131:18335–18342, 2009.<br />
440. J. H. Morrissey, V. Pureza, R. L. Davis-Harrison, S. G. Sligar, C. M. Rienstra, A. Z. Kijac, Y. Z.<br />
Ohkubo, and Emad Tajkhorshid. Protein-membrane interactions: blood clotting on nanoscale bilayer. J.<br />
Thromb. Haemost., 7 (Suppl. 1):169, 2009<br />
441. P. C. Wen and E. Tajkhorshid. Capturing Large-Scale Conformational Dynamics of P-glycoprotein<br />
by<br />
MD Simulations. Proc. Natl. Acad. Sci. U.S.A., Submitted.<br />
442. J. Li and E. Tajkhorshid. Spontaneous Unbinding of the Substrate in Sodium-Glucose<br />
Transporter.<br />
Biophys. J., Submitted.<br />
443. P. C. Wen and E. Tajkhorshid. Capturing Large-Scale Conformational Dynamics of P-glycoprotein<br />
by<br />
MD Simulations. (Oral presentation) FASEB Summer Research Conference 2010. Snowmass Village,<br />
CO.<br />
444. P. C. Wen and E. Tajkhorshid. The Origin of Nucleotide Dependence of Conformational<br />
Changes<br />
in ABC Transporters. (Selected oral presentation at Permeation & Transport Subgroup meeting)<br />
Biophysical Society 54th Annual Meeting. San Francisco, CA. Biophys. J., 98:627a–628a, 2010.<br />
445. Z. Huang and E. Tajkhorshid. Sequence of Events in the Extracellular Half of the Transport Cycle in<br />
Glutamate Transporter. Biophysical Society 54th Annual Meeting. San Francisco, CA. Biophys. J.,<br />
98:686a, 2010.
446. S. A. Shaikh and E. Tajkhorshid. Modeling of the Inward-Facing State of LeuT and Dynamics of the<br />
Outward-To-Inward Transition. Biophys. J., 98:687a, 2010.<br />
447. G. Enkavi and E. Tajkhorshid. Molecular determinants of the Stoichiometry of Transport in<br />
GlpT.<br />
Biophys. J., 98:687a, 2010.<br />
448. J. Li and E. Tajkhorshid. Structural Transition Between the Ion-Releasing and Ion-Binding States of<br />
a Secondary Membrane Transporter. Biophys. J., 98:686a–687a, 2010.<br />
449. Y. Z. Ohkubo and E. Tajkhorshid. Dynamical Basis of the enhancement of the enzymatic activity of<br />
factor FVIIa by tissue factor. Biophys. J., 98:689a, 2010<br />
450. Y. Z. Ohkubo and E. Tajkhorshid. Dynamical view of membrane binding and complex formation of<br />
human tissue factor and factor VIIa. The 5th Symposium on Hemostasis, 2010<br />
MCA08X025<br />
451. Claxton DP, Quick M, Shi L, de Carvalho FD, Weinstein H, Javitch JA, Mchaourab HS.<br />
Ion/substrate-dependent conformational dynamics of a bacterial transporter homolog of<br />
neurotransmitter:sodium symporters. Nat Struct Mol Biol. 2010 June 20.<br />
452. Zhao Y, Terry D, Shi L, Weinstein H, Blanchard SC, Javitch JA. Single-molecule dynamics of<br />
gating in a neurotransmitter transporter homolog. Nature. 2010 May 13;465(7295):188-93<br />
MCA08X026<br />
453. Krepkiy, D., Mihailescu, M., Freites, J. A., Schow, E. V., Worcester, D. L., Gawrisch, K., Tobias, D.<br />
J., White, S. H., Swartz, K. J. (2009). Structure and hydration of membranes embedded with voltage-sensing<br />
domains. Nature 462:473-479<br />
454. Schow, E. V., Freites, J. A., Gogna, K. , White, S. H. and Tobias, D. J. (2010). Down-state model of<br />
the voltage-sensing domain of a potassium channel. Biophys. J. 98:2857-2866.<br />
455. Jardon-Valadez, E., Bondar, A.-N., and Tobias, D.J. Coupling of retinal, water, and protein in squid<br />
rhodopsin. (2010) Biophys. J., In Press.<br />
456. Bondar, A.-N., Del Val, C., Freites, J.A., Tobias, D.J., and White, S.H. (2010). Dynamics of SecY<br />
translocons with translocation-defective mutations. Structure, 18:847-857.<br />
457. Klauda, J. B., Venable, R. M., Freites, J. A., O'Connor, J. W., Tobias, D. J., Mondragon-Ramirez,<br />
C., Vorobyov, I., MacKerell, Jr., A. D., Pastor, R. W. (2010). Update of the CHARMM all-atom additive force<br />
field for lipids: validation on six lipid types. J. Phys. Chem. B. 114:7830-7843.<br />
458. Bondar, A.-N., Tobias, D.J., and White, S.H. (2010). How binding of the signal peptide unlocks the<br />
translocon. Biophys. J. 98, Supplement 1, 435a.<br />
459. Bondar, A.-N., and White, S.H. (2010). Lipid membrane composition has a dramatic effect on the<br />
dynamics of the GlpG rhomboid protease from Escherichia coli. Biophys. J. 98, Supplement 1, 224a.<br />
460. Schow, E. V., Nizkorodov, A., Freites, J. A. White, S. H. and Tobias, D. J. (2010) Down-State<br />
Model of the KvAP Full Channel. Biophys. J. 98, Supplement 1, 315a.<br />
461. Keer, H. S., Freites, J. A., Mihailescu, E., White, S. H., and Tobias, D. J. (2010) Structure of a<br />
DOTAP Lipid Bilayer: A Concerted Neutron Scattering and Molecular Dynamics Study. Biophys. J. 98,<br />
Supplement 1, 492a.<br />
462. Jardon-Veladez, E., Freites, J. A., Mihailescu, E., Worcester, D., White, S. H.,and Tobias, D.J.<br />
(2010) Neutron Scattering and MD Simulation Study of DOPC and DOPC/cholesterol Bilayers. Biophys. J.<br />
98, Supplement 1, 493a.
463. Clemens, D.M., Freites, J.A., Kalman, K., Neméth-Cahalan, K., Tobias, D.J., and Hall, J.E. A<br />
Molecular Dynamics Study of the AQP0-CaM Interaction (talk). ARVO Meeting, May 2010, Fort Lauderdale,<br />
FL.<br />
464. Clemens, D.M., Freites, J.A., Kalman, K., Neméth-Cahalan, K., Tobias, D.J., and Hall, J.E. A<br />
Molecular Dynamics Study of the AQP0-CaM Interaction (poster). NLM Informatics Training Meeting, June<br />
2010, Denver, CO. (Best Poster Award).<br />
465. Schow, E. V., A. Nizkorodov, J. A. Freites, S. H. White, and D. J. Tobias. Atomistic Model of a<br />
Closed Voltage-gated Potassium Channel Derived from Experimental Data.<br />
466. Andersson, M., Freites, J.A., Tobias, D.J., and White, S.H. Accommodating charge in the lipid<br />
bilayer.<br />
467. Freites, J.A., Kasho, V., Tobias, D.J., Kaback, H.R., and White, S.H. Identifying transmembrane<br />
segments in a membrane protein of complex fold.<br />
468. Freites, J.A., Andersson, M., Kasho, V., Smirnova, I., Tobias, D.J., Kaback, H.R., and White, S.H.<br />
Structural insights into the inward-facing conformation of lactose permease from atomistic simulations.<br />
469. Jardon-Valadez, E., and Tobias, D.J. Protein and lipid dynamical coupling in bacteriorhodopsin and<br />
squid rhodopsin at room temperature.<br />
470. Bondar, A.-N., Tobias, D.J., and White, S.H. How binding of the signal peptide opens the protein<br />
translocon.<br />
471. Clemens, D.M., Freites, J.A., Kalman, K., Neméth-Cahalan, K., Tobias, D.J., and Hall, J.E.<br />
Mechanism of AQP0 Permeability Blockade by CaM.<br />
472. Mihailescu, M., Vaswani, R.G., Castro-Román, F., Jardón-Valadez, E., Freites, J.A., Worcester, D.L.,<br />
Chamberlin, A.R., Tobias, D.J., White, S.H. Acyl-chain methyl distributions of liquid-disordered and<br />
liquid-ordered membranes.<br />
MCA93S020<br />
473. R. Acharya, V. Carnevale, G. Fiorin, B. G. Levine, A. L. Polishchuk, V. Balannik, I. Samish, R. A.<br />
Lamb, L. H. Pinto, W. F. DeGrado, and M. L. Klein. Structure and mechanism of proton transport<br />
through the transmembrane tetrameric M2 protein bundle of the influenza A virus. Proc. Natl. Acad. Sci.<br />
U.S.A., 2010. (in press).<br />
474. Victoria Balannik, Vincenzo Carnevale, Giacomo Fiorin, Benjamin G Levine, Robert A Lamb,<br />
Michael L Klein, William F DeGrado, and Lawrence H Pinto. Functional studies and modeling of pore-lining<br />
residue mutants of the influenza A virus M2 ion channel. Biochemistry, 49(4):696–708, 2010.<br />
475. B. L. Bhargava and Michael L Klein. Aqueous solutions of imidazolium ionic liquids: molecular<br />
dynamics studies. Soft Matter, 5:3475–3480, 2009.<br />
476. B. L. Bhargava and Michael L Klein. Formation of micelles in aqueous solutions of a room<br />
temperature ionic liquid: a study using coarse grained molecular dynamics. Mol Phys, 107:393–401, 2009.<br />
477. B. L. Bhargava and Michael L Klein. Initial stages of aggregation in aqueous solutions of ionic<br />
liquids: molecular dynamics studies. J Phys Chem B, 113(28):9499–9505, 2009.<br />
478. G. Brannigan, D. N. LeBard, J. Hénin, R. Eckenhoff, and M. L. Klein. Multiple binding sites for the<br />
general anesthetic isoflurane identified in the nicotinic acetylcholine receptor transmembrane domain. Proc.<br />
Natl. Acad. Sci. USA, 2010. (in press).<br />
479. V. Carnevale, G. Fiorin, B. G. Levine, W. F. DeGrado, and M. L. Klein. Charge localization in the<br />
pore of a biological proton channel. 2010. (in preparation).<br />
480. V. Carnevale, G. Fiorin, B. G. Levine, W. F. DeGrado, and M. L. Klein. Polarization of water<br />
molecules in the M2 channel of the influenza A virus. 2010. (in preparation).<br />
481. R. H. DeVane, A. Jusufi, P. Moore, and M. L. Klein. Parametrization and application of a<br />
coarse-grained force field for benzene/fullerene interactions with lipids. 2010. (in preparation).<br />
482. J. E. Donald, Y. Zhang, G. Fiorin, V. Carnevale, D. Slochower, F. Gai, M. L. Klein, and W. F.<br />
DeGrado. Association of transmembrane helices in viral fusion peptides suggests a protein-centric<br />
mechanism of membrane fusion. 2010. (in preparation).
483. Jérôme Hénin, Grace Brannigan, William P Dailey, Roderic Eckenhoff, and Michael L Klein. An<br />
atomistic model for simulations of the general anesthetic isoflurane. J. Phys. Chem. B, 114(1):604–612, Jan<br />
2010.<br />
484. Ming-Hsun Ho, Marco De Vivo, Matteo Dal Peraro, and Michael L. Klein. Unraveling the catalytic<br />
pathway of metalloenzyme farnesyltransferase through QM/MM computation. J. Chem. Theory Comput.,<br />
5:1657–1666, 2009.<br />
485. Robert R. Johnson, Axel Kohlmeyer, A.T. Charlie Johnson, and Michael L. Klein. Free energy<br />
landscape of a DNA-carbon nanotube hybrid using replica exchange molecular dynamics. Nano Letters,<br />
9:537–541, 2009.<br />
486. A. Jusufi, R. H. DeVane, W. Shinoda, and M. L. Klein. Fullerene size effects on their interaction<br />
with lipid membranes. 2010. (in preparation).<br />
487. S. M. Loverde, V. Ortiz, R. D. Kamien, M. L. Klein, and D. E. Discher. Curvature-driven molecular<br />
demixing in the budding and breakup of mixed component worm-like micelles. Soft Matter, 6:1419–1425,<br />
2010. (cover article).<br />
488. S. M. Loverde, K. Rajagopal, M. L. Klein, and D. E. Discher. Molecular simulation studies of<br />
polyethylene oxide polycaprolactone diblock copolymer micelles. 2010. (in preparation).<br />
489. Virgil Percec, Daniela A Wilson, Pawaret Leowanawat, Christopher J Wilson, Andrew D Hughes,<br />
Mark S Kaucher, Daniel A Hammer, Dalia H Levine, Anthony J Kim, Frank S Bates, Kevin P Davis, Timothy<br />
P Lodge, Michael L Klein, Russell H DeVane, Emad Aqad, Brad M Rosen, Andreea O Argintaru, Monika J<br />
Sienkowska, Kari Rissanen, Sami Nummelin, and Jarmo Ropponen. Self-assembly of Janus dendrimers into<br />
uniform dendrimersomes and other complex architectures. Science, 328(5981):1009–1014, May 2010.<br />
490. Werner Treptow and Michael L Klein. The membrane-bound state of K2P potassium channels. J<br />
Am Chem Soc, 132(23):8145–8151, Jun 2010.<br />
491. L. Sangeetha Vedula, Grace Brannigan, Nicoleta J Economou, Jin Xi, Michael A Hall, Renyu Liu,<br />
Matthew J Rossi, William P Dailey, Kimberly C Grasty, Michael L Klein, Roderic G Eckenhoff, and Patrick J<br />
Loll. A unitary anesthetic binding site at high resolution. J. Biol. Chem., 284(36):24176–24184, Sep 2009.<br />
492. M. Wanunu, D. Cohen-Karni, R. R. Johnson, Y. Zheng, M. L. Klein, and M. Drndic. Discrimination<br />
of modified cytosines in DNA. 2010. (in preparation).<br />
MCB050002<br />
493. S. Dorairaj & T. W. Allen. 2007. On the thermodynamic stability of a charged arginine side chain in<br />
a transmembrane helix. Proc. Natl. Acad. Sci. 104:4943-4948.<br />
494. I. Vorobyov and T. W. Allen. 2009. “Molecular Dynamics Computations for Proteins: a case study in<br />
membrane ion permeation.” Handbook of Molecular Biophysics: methods and applications. Chapter 8. ed.<br />
H.G.Bohr. Wiley, Weinheim.<br />
495. A.N. Thompson, I. Kim, T. D. Panosian, T. Iverson, T. W. Allen and C. M. Nimigean. 2009.<br />
“Mechanism of potassium channel selectivity revealed by Na+ and Li+ binding sites inside the KcsA pore.”<br />
Nat. Struct. Mol. Biol. 16: 1317-1324.<br />
496. I. Vorobyov, B. Bekker and T.W. Allen. 2010. “The electrostatics of deformable lipid membranes.”<br />
Biophysical Journal. 98: 2904-2913.<br />
497. I. Vorobyov and T. W. Allen. 2010. “Electrostatics of solvent and membrane interfaces and the role<br />
of electronic polarizability.” J. Chem. Phys. 132:185101.<br />
498. A.H. de Jesus and T. W. Allen. “On the structural and energetic consequences of membrane<br />
hydrophobic mismatch and protein anchoring side chains”, in preparation for Biophys. J.<br />
499. B. Bekker and T. W. Allen. “The membrane permeation mechanism of the ionophore valinomycin”,<br />
in preparation for Biophys. J.<br />
500. T. W. Allen, I. Vorobyov, R. Koeppe III and Olaf S. Andersen. “On the permeation of guanidinium<br />
and other ions through lipid bilayers”, in preparation for Biophys. J.<br />
501. D. Bennet, I. Vorobyov, D. P. Tieleman, T. W. Allen and S. Noskov. “Chloroform partitioning into<br />
membranes and the role of electronic polarizability”, in preparation for J. Am. Chem. Soc.
502. L. Li and T.W. Allen. “Membrane deformations under the actions of lysine and arginine containing<br />
peptides”, in preparation for J. Phys. Chem. B.<br />
503. Vorobyov, I. and T. W. Allen. “The role of anionic lipids in membrane association and translocation<br />
of charged peptides”, in preparation for J. Am. Chem. Soc.<br />
MCB060006<br />
504. Calculation of free energies in fluid membranes subject to heterogeneous curvature fields, N. J.<br />
Agrawal, R. Radhakrishnan, Physical Review E, 80: 011925, 2009. Pubmed ID: 19658747.<br />
505. Computational Delineation of the Catalytic Step of a High Fidelity DNA Polymerase, R.<br />
Venkatramani, R. Radhakrishnan, Protein Science (A Protein Society Journal), 19(4), 815-825, 2010.<br />
Pubmed ID: 20162624.<br />
506. The ErbB3/HER3 Intracellular Domain is Competent to Bind ATP and Catalyze<br />
Autophosphorylation, F. Shi, S. E. Telesco, Y. Liu, R. Radhakrishnan*, M. A. Lemmon*, Proceedings of the<br />
National Academy of Sciences, 2010, in press. Pubmed ID: 20351256; *Co-corresponding authors.<br />
507. Linking oncogenic signaling to molecular structure, Multiscale Cancer Modeling, J. Purvis, A. Shih,<br />
Y. Liu, R. Radhakrishnan, Eds. T.S. Deisboeck, G. Stamatakos, Chapman & Hall-CRC Mathematical and<br />
Computational Biology Series, 2010, in press.<br />
508. Computational methods related to reaction chemistry, A. J. Shih*, S. T. Telesco*, Y. Liu, R.<br />
Venkatramani, R. Radhakrishnan, Comprehensive Biomaterials, eds. P. Ducheyne, Elsevier London, 2010,<br />
in press. *These authors contributed equally.<br />
509. Structural analysis of the activation loop conformation of Epidermal Growth Factor Receptor<br />
Tyrosine Kinase through molecular dynamics simulations, A. Shih, S. H. Choi, M. A. Lemmon, R.<br />
Radhakrishnan, 2009, submitted to Biophysical J.<br />
510. Protein-Mediated Orchestration of Vesiculation Prior to Vesicle-Scission in Clathrin-Dependent<br />
Endocytosis, N. Agrawal, J. Nukpezah, R. Radhakrishnan, 2009, submitted to PLoS Computational Biology.<br />
511. A finite element method for simulating thermally fluctuating Brownian particles, B. Uma, T. N.<br />
Swaminathan, R. Radhakrishnan, D. M. Eckmann, P. S. Ayyaswamy, 2010, submitted to Phys. Rev. E.<br />
512. Design of functionalized nanocarriers for endothelial targeting based on Monte Carlo calculations of<br />
absolute free energy of binding, J. Liu, D. M. Eckmann, P. S. Ayyaswamy, V. Muzykantov, R.<br />
Radhakrishnan, 2010, submitted to PNAS.<br />
513. A molecular docking study of the binding of ruthenium complex compounds to PIM1, GSK3, and<br />
CDK2 protein kinases, Y. Liu, N. J. Agrawal, R. Radhakrishnan, 2009, submitted.<br />
514. Mechanisms of phosphoryl transfer in ErbB3 kinase, F. Jia, S. Telesco, Y. Liu, R. Radhakrishnan,<br />
2010, to be submitted.<br />
515. Energy landscapes in tyrosine phosphorylation and mechanisms of phosphoryl transfer in ErbB1<br />
kinase, Y. Liu, R. Radhakrishnan, 2010, to be submitted.<br />
516. Binding of Erlotinib to Inactive ErbB1 tyrosine kinase domain, Y. Liu, J. Park, M. A. Lemmon, R.<br />
Radhakrishnan, 2010, to be submitted.<br />
517. Mechanism of clathrin-mediated vesicle budding in receptor endocytosis, N. J. Agrawal, S. Engles,<br />
R. Toy, R. Radhakrishnan, 2010, to be submitted.<br />
MCB060069<br />
518. Gedeon, P.C.; Indarte, M.; Surratt, C.K.; Madura, J.D. Molecular dynamics of leucine and<br />
dopamine transporter proteins in a model cell membrane lipid bilayer, Proteins, 2010. 78, 797.<br />
519. Asciutto, E.K.; General, I.J.; Xiong, K.; Asher, S.A.; and Madura, J.D. Sodium Perchlorate effects on<br />
the helical stability of a mainly alanine peptide. Biophys. J., 2010, 98, 186.<br />
520. Indarte, M.; Liu, Y.; Madura, J.D. and Surratt, C.K. Receptor-based discovery of plasmalemmal<br />
monoamine transporter inhibitor via high-throughput docking and pharmacophore modeling. ACS Chemical<br />
Neuroscience, 2010. 1, 223.
MCB070009<br />
521. J. Lee, S. Ham, and W. Im (2009) Beta-Hairpin Restraint Potentials for Calculations of Potentials of<br />
Mean Force as a Function of Beta-Hairpin Tilt, Rotation, and Distance. J. Comput. Chem. 30:1334-1343.<br />
522. W. Im, J. Lee, T. Kim, and H. Rui (2009) Novel Free Energy Calculations to Explore Mechanisms<br />
and Energetics of Membrane Protein Structure and Function, J. Comput. Chem. 30:1622-1633.<br />
523. S. Jo, J.B. Lim, J.B. Klauda, and W. Im (2009) CHARMM-GUI Membrane Builder for Mixed Bilayers<br />
and Its Application to Yeast Membranes, Biophys. J. 97:50-58.<br />
524. H. Rui, J. Lee, and W. Im (2009) Comparative Molecular Dynamics Simulation Studies of Protegrin-<br />
1 Monomer and Dimer in Different Lipid Bilayers, Biophys. J. 97:787-795.<br />
525. T. Rathinavelan, L. Zhang, W.L. Picking, D.D. Weis, R.N. De Guzman, and W. Im (2010) A<br />
Repulsive Electrostatic Mechanism for Protein Export through the Type III Secretion Apparatus, Biophys. J.<br />
98:452-461 [cover].<br />
526. T. Kim and W. Im (2010) Revisiting Hydrophobic Mismatch with Free Energy Calculations<br />
of Transmembrane Helix Tilt and Rotation. Biophys. J. (in press).<br />
527. H. Rui and W. Im (2010) Protegrin-1 Orientation and Physicochemical Properties in Membrane<br />
Bilayers Studied by Potential of Mean Force Calculations. J. Comput. Chem. (in press).<br />
528. K.C. Song, P.W. Livanec, J.B. Klauda, K. Kuczera, R.C. Dunn, and W. Im (2010) Orientation of<br />
Fluorescent Lipid Analog BODIPY-PC to Probe Lipid Membrane Properties: A Comparison of Simulation<br />
with Experiment. submitted.<br />
529. S. Jo, H. Rui, J.B. Lim, J.B. Klauda, and W. Im (2010) Cholesterol Flip-Flop: Insights from Free<br />
Energy Simulation Studies, submitted.<br />
MCB070118<br />
530. Ruibo Wu, Shenglong Wang, Nengjie Zhou, Zexing Cao and Yingkai Zhang, A Proton-Shuttle<br />
Reaction Mechanism for Histone Deacetylase 8 and the Catalytic Role of Metal Ions. J. Am. Chem. Soc., in<br />
press, 2010.<br />
531. Yanzi Zhou, Shenglong Wang and Yingkai Zhang, Catalytic Reaction Mechanism of<br />
Acetylcholinesterase Determined by Born-Oppenheimer ab initio QM/MM Molecular Dynamics Simulations,<br />
J. Phys. Chem. B, 114, 8817 - 8825, 2010.<br />
532. Ruibo Wu, Po Hu, Shenglong Wang, Zexing Cao and Yingkai Zhang, Flexibility of Catalytic Zinc<br />
Coordination in Thermolysin and HDAC8: A Born-Oppenheimer ab initio QM/MM Molecular Dynamics<br />
Study, J. Chem. Theory Comput., 6, 337-343, 2010.<br />
533. Kamau Fahie, Po Hu, Steven Swatkoski, Robert J. Cotter, Yingkai Zhang, and Cynthia Wolberger.<br />
Side chain specificity of ADP-ribosylation by a sirtuin. FEBS J., 276, 7159-7176, 2009.<br />
534. Zhenyu Lu, Jonathan Lai, Yingkai Zhang. Importance of Charge Independent Effects in Readout<br />
the Trimethyllysine Mark by HP1 Chromodomain. J. Am. Chem. Soc., 131, 14928-14931, 2009.<br />
MCB080015<br />
535. “Structural Determinants of Cadherin-23 Function in Hearing and Deafness”. M. Sotomayor, W. A.<br />
Weihofen, R. Gaudet, D. P. Corey. Neuron, 66: 85-100, 2010.<br />
536. “Calmodulin-interaction sites are critical for the thermosensitivity of a heat-activated ion channel”.<br />
E. Procko, S.-Y. Lau, M. Sotomayor, A. Beristain-Barajas, A. McFedries, R. Gaudet. In preparation.<br />
537. “Characterization and Structural Studies of the Plasmodium falciparum Ubiquitin and Nedd8<br />
Hydrolase UCHL3”. K. Artavanis-Tsakonas, W. A. Weihofen, J. M. Antos, B. I. Coleman, C. Comeaux, M. T.<br />
Duraisingh, R. Gaudet, H. L. Ploegh. Journal of Biological Chemistry, 285: 6857-6866, 2010.
MCB080093<br />
538. J. Spiriti and A. van der Vaart, Mechanism of the calcium-induced trans-cis isomerization of a<br />
non-prolyl peptide bond in Clostridium histolyticum collagenase, Biochemistry, 49, 5314-5320 (2010).<br />
539. C. Wagner, P. Jurutka, P. Marshall, T. Groy, A. van der Vaart, J. Ziller, J. Furmick, M. Graeber, E.<br />
Matro, B. Miguel, I. Tran, J. Kwon, J. Tedeschi, S. Moosavi, A. Danishyar, J. Philp, R. Khamees, J. Jackson,<br />
D. Grupe, S. Badshah, and J. Hart, Modeling, synthesis and biological evaluation of potential retinoid-X-receptor<br />
(RXR) selective agonists: Novel analogs of 4-[1-(3,5,5,8,8-pentamethyl-5,6,7,8-tetrahydro-2-naphthyl)ethynyl]benzoic<br />
acid (bexarotene), Journal of Medicinal Chemistry, 52, 5950-5966 (2009).<br />
540. H. Kamberaj and A. van der Vaart, Extracting the causality of correlated motions from molecular<br />
dynamics simulations, Biophysical Journal, 97, 1747-1755 (2009).<br />
541. P. Maragakis, A. van der Vaart, and M. Karplus, Gaussian-mixture umbrella sampling, Journal of<br />
Physical Chemistry B, 113, 4664-4673 (2009).<br />
542. H. Kamberaj and A. van der Vaart, An optimized replica exchange molecular dynamics method,<br />
Journal of Chemical Physics, 130, 074906 (2009).<br />
543. H. Kamberaj and A. van der Vaart, Correlated motions and interactions at the onset of the<br />
DNA-induced partial unfolding of Ets-1, Biophysical Journal, 96, 1307-1317 (2009).<br />
MCB090134<br />
544. J. Wereszczynski and J. Andrew McCammon “Simulations of p97 Reveal a Structure Consistent<br />
With Crystallographic and Small-Angle X-Ray Scattering Experiments” (In Preparation)<br />
545. J. Wereszczynski and I. Andricioaei, “Free Energy Calculations Reveal Rotating-Ratchet<br />
Mechanism for DNA Supercoil Relaxation by Topoisomerase IB and its Inhibition” Biophys J. (In Press)<br />
546. M. Lawrenz, J. Wereszczynski, R. Amaro, R. Walker, A. Roitberg, and J. Andrew McCammon,<br />
“Impact of Calcium on N1 Influenza Neuraminidase Dynamics and Binding Free Energy” Proteins (In Press)<br />
547. J. Wereszczynski and I. Andricioaei, “Conformational and Solvent Entropy Contributions to the<br />
Thermal Response of Nucleic Acid-Based Nanothermometers” J. Phys. Chem. B 114, 2076-2082 (2010)<br />
548. J. Wereszczynski and I. Andricioaei, “On structural transitions, thermodynamic equilibrium and the<br />
phase diagram of DNA and RNA duplexes under external tension and torque” Proc. Natl. Acad. Sci.<br />
MCB090159<br />
549. G.T. Beckham, J.F. Matthews, Y.J. Bomble, Lintao Bu, W.S. Adney, M.E. Himmel, M.R. Nimlos,<br />
M.F. Crowley, “Identification of amino acids responsible for processivity for a Family 1 CBM from<br />
a fungal cellulase”, J. Phys. Chem. B, 114, 2010.<br />
550. M. Alahuhta, Q. Xu, Y.J. Bomble, W.S. Adney, S.Y. Ding, M. E. Himmel, and V. V. Lunin, “The<br />
unique binding mode of the Cellulosomal CBM4 from Clostridium thermocellum Cellobiohydrolase A”, in<br />
press at J. Mol. Biol.<br />
551. G.T. Beckham, Y.J. Bomble, J.F. Matthews, M.G. Resch, J.S. Yarbrough, S.R. Decker, L. Bu, X.<br />
Zhao, C. McCabe, J. Wohlert, M. Bergenstråhle, J.W. Brady, M.E. Himmel, M.F. Crowley, in review, “The<br />
linker peptide from the Trichoderma reesei Family 7 cellulase is a flexible, disordered tether between the<br />
catalytic domain and carbohydrate-binding module further extended by O-linked glycosylation”<br />
552. Y. J. Bomble, M. R. Nimlos, G. T. Beckham, J. F. Matthews, M. E. Himmel, and M. F. Crowley<br />
“Understanding the function of the Immunoglobulin-like domain in Family 9 enzymes for C. thermocellum”, in<br />
preparation.<br />
553. Y. J. Bomble, G. T. Beckham, J. F. Matthews, M. R. Nimlos , M. E. Himmel, and M. F. Crowley, “A<br />
study of cellulosomal enzyme complex self-assembly”, to be submitted to J. Biol. Chem.<br />
554. G.T. Beckham, B. Peters, J.F. Matthews, W.S. Adney, M.E. Himmel, M.F. Crowley, “The<br />
molecular-level basis for biomass recalcitrance”, in preparation.
555. Lintao Bu, G.T. Beckham, M.R. Shirts, M.R. Nimlos, W.S. Adney, M.E. Himmel, and M.F. Crowley.<br />
Potential of mean force for cellobiose expulsion from the catalytic tunnel of Cel7A, in preparation.<br />
556. Lintao Bu, G.T. Beckham, M.R. Shirts, M.R. Nimlos, W.S. Adney, M.E. Himmel, and M.F. Crowley.<br />
Calculating absolute binding free energy of cellobiose to the catalytic tunnel of Cel7A, in preparation.<br />
557. Lintao Bu, M.R. Nimlos, and M.E. Himmel. Meso-scale modeling of polysaccharides in plant cell<br />
walls: an application to carbohydrate-binding modules translation on cellulose surface. In Computational<br />
Modeling in Lignocellulosic Biofuel Production, in press.<br />
558. Y. J. Bomble, Q. Xu, M. E. Himmel, and M. F. Crowley “Molecular modeling of cellulosomal<br />
systems”. In Computational Modeling in Lignocellulosic Biofuel Production, in press.<br />
559. A.P. Hynninen, J.F. Matthews, G.T. Beckham, M.F. Crowley, M.R. Nimlos, “Coarse-grained model<br />
for glucose, cellobiose, and cellotetraose in water”, in preparation.<br />
560. J.F. Matthews, G.T. Beckham, M.R. Nimlos, M.E. Himmel, M.F. Crowley, “A new high temperature<br />
polymorph of cellulose predicted from molecular simulation”, in preparation.<br />
561. J.F. Matthews, M.E. Himmel, M.F. Crowley, “A comprehensive comparison of cellulose models for<br />
cellulose Iβ”, in preparation.<br />
MCB090161<br />
562. Spahn, C.M., and Penczek, P.A. (2009). Exploring conformational modes of macromolecular<br />
assemblies by multiparticle cryo-EM. Curr Opin Struct Biol 19, 623-631.<br />
MCB090163<br />
563. P.I.: Michael F. Hagan<br />
564. Yu, N., Pontiggia, F., Hagan M. F.: Mechanism of HIV CA-C Dimerization, manuscript in<br />
preparation.<br />
MCB090168<br />
565. A.K. Ghosh, J. Takayama, K.V. Rao, K. Ratia, R. Chaudhuri, D.C. Mulhearn, H. Lee, D.B. Nichols,<br />
S. Baliji, S.C. Baker, M.E. Johnson, and A.D. Mesecar. 2010. “Severe Acute Respiratory Syndrome-<br />
Coronavirus Papain-Like Novel Protease Inhibitors: Design, Synthesis, Protein-Ligand X-ray Structure and<br />
Biological Evaluation”, Journal of Medicinal Chemistry, 53(13), 4968–4979. (TeraGrid support for very late<br />
stages of modeling.)<br />
566. R. Chaudhuri, S. Tang, G. Zhao, H. Lu, D.A. Case and M.E. Johnson. Comparative Analysis of the<br />
papain-like protease active sites of SARS and NL63 coronaviruses: Implications for drug design. Draft<br />
completed, in preparation for submission.<br />
567. R. Chaudhuri, H. Lee, J. Torres, and M.E. Johnson. Identification of novel inhibitor scaffolds for the<br />
inhibition of the papain-like protease of SARS Coronavirus using computational models. Draft completed, in<br />
preparation for submission.<br />
MCB090172<br />
568. Salis, H., Tamsir, A., and Voigt, C. (2009a). Engineering bacterial signals and sensors. Contrib<br />
Microbiol 16, 194-225.<br />
569. Salis, H.M., Mirsky, E.A., and Voigt, C.A. (2009b). Automated design of synthetic ribosome binding<br />
sites to control protein expression. Nat Biotechnol 27, 946-950.<br />
570. Tabor, J.J., Salis, H.M., Simpson, Z.B., Chevalier, A.A., Levskaya, A., Marcotte, E.M., Voigt, C.A.,<br />
and Ellington, A.D. (2009). A synthetic genetic edge detection program. Cell 137, 1272-1281.
MCB090174<br />
571. W. Huang, J. Kim, S. Jha, and F. Aboul-ela. A mechanism for s-adenosyl methionine assisted<br />
formation of a riboswitch conformation: A small molecule with a strong arm. Nucleic Acids Res., 37(19):6528–<br />
6539, 2009.<br />
572. J. Kim, W. Huang, S. Maddineni, F. Aboul-ela, and S. Jha. Exploring the RNA folding energy<br />
landscape using scalable distributed cyberinfrastructure. ACM HPDC 2010, Emerging Computational<br />
Methods for the Life Sciences, 2010.<br />
573. Wei Huang, Vamsi Boyapati, Sunayana Mitra, Jooyun Kim, Shantenu Jha, Fareed Aboul-ela,<br />
"Bridging the Gap between “ON” State and “OFF” State of the SAM-I Riboswitch", Zing Conference on<br />
Nucleic Acids, Nov 2010.<br />
574. S.H. Ko, N. Kim, J. Kim, A. Thota and S. Jha, "Efficient Runtime Environment for Coupled Multi-<br />
Physics Simulations: Dynamic Resource Allocation and Load-Balancing", CCGrid 2010.<br />
575. S. Jha, H. Kim, Y. Khamra and M. Parashar. "Exploring application and infrastructure adaptation on<br />
hybrid grid-cloud infrastructure". ScienceCloud: Workshop on Scientific Cloud Computing, in conjunction<br />
with ACM HPDC-2010.<br />
576. S. Jha, Y. Khamra and C. White. "Modelling data-driven CO2 sequestration using distributed HPC<br />
cyberinfrastructure: A case study". TeraGrid 2010 Conference, 2010.<br />
577. S. Jha, H. Kim, Y. Khamra and M. Parashar. "Autonomic approach to integrated HPC grid and cloud<br />
usage". IEEE Conference on eScience 2009, Oxford, 2009.<br />
578. S. Jha and Y. Khamra. "Modelling data-driven CO2 sequestration using distributed HPC<br />
cyberinfrastructure". Microsoft E-Science Conference, Pittsburgh PA, 2009.<br />
579. Yaakoub Youssef El-Khamra. "Real-time reservoir characterization and beyond:<br />
Cyberinfrastructure tools and technologies". Master’s thesis, Louisiana State University, Baton Rouge,<br />
Louisiana, 2009<br />
MCB090176<br />
580. J. R. Tusell and P. R. Callis, The role of tryptophan quenching in reporting folding of the villin<br />
headpiece., manuscript in preparation for submission to Physical Chemistry Letters.<br />
MCB990010<br />
581. Y. Luo, C.E. Ergenekan, J.T. Fischer, M.-L. Tan, & T. Ichiye, The molecular determinants of the<br />
increased reduction potential of the rubredoxin domain of rubrerythrin relative to rubredoxin. Biophys. J., 2010.<br />
98, 560-568.<br />
582. Y. Luo, M.-L. Tan, & T. Ichiye, Slow time scale dynamics and the intramolecular electron transfer in<br />
ferredoxin: A molecular dynamics study. Proteins Struc. Func. Bioinf., in preparation.<br />
583. M.-L. Tan, Y. Luo, & T. Ichiye, Tuning the intramolecular electron transfer by mutations in<br />
ferredoxin: A molecular dynamics study. Proteins Struc. Func. Bioinf., in preparation.<br />
584. Y. Luo & T. Ichiye, Dynamical relaxation of Clostridium acidurici ferredoxin: A molecular dynamics<br />
study. Proteins Struc. Func. Bioinf., in preparation.<br />
Ocean Sciences<br />
OCE100015<br />
585. Ozdemir, C. E., T.-J. Hsu, S. Balachandar, 2010a, Simulation of fine sediment transport in<br />
oscillatory boundary layer, Journal of Hydro-environment Research, 3, 247-259.<br />
586. Ozdemir, C. E., T.-J. Hsu, S. Balachandar, 2010b, A numerical investigation of fine particle laden<br />
flow in oscillatory channel: The role of particle induced density stratification, Journal of Fluid Mechanics,<br />
accepted.
Physics<br />
MCA06N025<br />
587. “Nuclear Physics from Lattice QCD” S. R. Beane, W. Detmold, K. Orginos and M. J. Savage<br />
arXiv:1004.2935 [hep-lat] submitted to Progress in Particle and Nuclear Physics.<br />
588. “A method to study complex systems of mesons in Lattice QCD” W. Detmold and M. J. Savage<br />
arXiv:1001.2768 [hep-lat], accepted for publication in Phys. Rev. D.<br />
589. “High Statistics Analysis using Anisotropic Clover Lattices: (III) Baryon- Baryon Interactions” S. R.<br />
Beane et al. [NPLQCD Collaboration] Phys. Rev. D 81, 054505 (2010)<br />
590. “Meson-Baryon Scattering Lengths from Mixed-Action Lattice QCD” A. Torok et al. [NPLQCD<br />
Collaboration] Phys. Rev. D 81, 074506 (2010)<br />
591. “High Statistics Analysis using Anisotropic Clover Lattices: (II) Three-Baryon Systems” S. R. Beane<br />
et al. [NPLQCD Collaboration] Phys. Rev. D 80, 074501 (2009)<br />
MCA08X037<br />
592. “Area Invariance of Apparent Horizons under Arbitrary Boosts”, Sarp Akcay, Richard A. Matzner<br />
and Vishnu Natchu, Journal of General Relativity and Gravitation 41, 387-402 (2010).<br />
593. “The Volume Inside a Black Hole”, Brandon S. DiNunno, Richard A. Matzner, Journal of General<br />
Relativity and Gravitation 42, 63 (2010), DOI: 10.1007/s10714-009-0814-x.<br />
594. “Superkicks in Hyperbolic Encounters of Binary Black Holes” (James Healy, Frank Herrmann, Ian<br />
Hinder, Deirdre M. Shoemaker, Pablo Laguna, Richard A. Matzner), Phys. Rev. Lett. 102, 041101 (2009).<br />
595. “Status of NINJA: the Numerical INJection Analysis project" (Cadonati, Laura; Aylott, Benjamin;<br />
Baker, John G.; Boggs, William D.; Boyle, Michael; Brady, Patrick R.; Brown, Duncan A.; Brügmann, Bernd;<br />
Buchman, Luisa T.; Buonanno, Alessandra; Camp, Jordan; Campanelli, Manuela; Centrella, Joan; Chatterji,<br />
Shourov; Christensen, Nelson; Chu, Tony; Diener, Peter; Dorband, Nils; Etienne, Zachariah B.; Faber,<br />
Joshua; Fairhurst, Stephen; Farr, Benjamin; Fischetti, Sebastian; Guidi, Gianluca; Goggin, Lisa M.; Hannam,<br />
Mark; Herrmann, Frank; Hinder, Ian; Husa, Sascha; Kalogera, Vicky; Keppel, Drew; Kidder, Lawrence E.;<br />
Kelly, Bernard J.; Krishnan, Badri; Laguna, Pablo; Lousto, Carlos O.; Mandel, Ilya; Marronetti, Pedro;<br />
Matzner, Richard; McWilliams, Sean T.; Matthews, Keith D.; Mercer, R. Adam; Mohapatra, Satyanarayan R.<br />
P.; Mroué, Abdul H.; Nakano, Hiroyuki; Ochsner, Evan; Pan, Yi; Pekowsky, Larne; Pfeiffer, Harald P.; Pollney,<br />
Denis; Pretorius, Frans; Raymond, Vivien; Reisswig, Christian; Rezzolla, Luciano; Rinne, Oliver; Robinson,<br />
Craig; Röver, Christian; Santamaría, Lucía; Sathyaprakash, Bangalore; Scheel, Mark A.; Schnetter, Erik;<br />
Seiler, Jennifer; Shapiro, Stuart L.; Shoemaker, Deirdre; Sperhake, Ulrich; Stroeer, Alexander; Sturani,<br />
Riccardo; Tichy, Wolfgang; Liu, Yuk Tung; van der Sluys, Marc; van Meter, James R.; Vaulin, Ruslan;<br />
Vecchio, Alberto; Veitch, John; Viceré, Andrea; Whelan, John T.; Zlochower, Yosef) Classical and Quantum<br />
Gravity 26, Issue 11, 114008 (June 2009), IOP DOI: 10.1088/0264-9381/26/11/114008 [arXiv:0905.4227]<br />
(2009).<br />
596. “Final Mass and Spin of Merged Black Holes and the Golden Black Hole" James Healy, Pablo<br />
Laguna, Richard A. Matzner, Deirdre M. Shoemaker, Phys. Rev. D81 (Rapid Communication) (No. 8)<br />
(electronic publication 8 April, 2010) DOI: 10.1103/PhysRevD.81.081501,<br />
http://link.aps.org/doi/10.1103/PhysRevD.81.081501<br />
MCA09X003<br />
597. M. Anderson, L. Lehner, M. Megevand, D. Neilsen, "Post-merger electromagnetic emissions from<br />
disks perturbed by binary black holes," Physical Review D81, 044004 (2010).<br />
598. C. Palenzuela, M. Anderson, L. Lehner, S.L. Liebling, D. Neilsen, "Stirring, not shaking: binary<br />
black holes' effects on electromagnetic fields," Physical Review Letters 103, 081101 (2009).<br />
599. S.L. Liebling, L. Lehner, D. Neilsen, C. Palenzuela, "Evolutions of Magnetized and Rotating<br />
Neutron Stars," Physical Review D 81, 124023 (2010). gr-qc/1001.0575
600. C. Palenzuela, L. Lehner, S.L. Liebling, "Dual Jets From Binary Black Holes," Accepted by<br />
Science.<br />
601. C. Palenzuela, T. Garrett, L. Lehner, S.L. Liebling, "Magnetospheres of Black Hole Systems in<br />
Force-Free Plasma," gr-qc/1007.1198<br />
602. S. Chawla, M. Anderson, M. Besselman, L. Lehner, S.L. Liebling, P.M. Motl, D. Neilsen, "Mergers<br />
of Magnetized Neutron Stars with Spinning Black Holes: Disruption, Accretion and Fallback," Submitted to<br />
PRL. gr-qc/1006.2839<br />
603. S.L. Liebling, "Dynamics of Rotating, Magnetized Neutron Stars," Conference Proceedings of the<br />
Twelfth Marcel Grossman Meeting on General Relativity, 2009. gr-qc/1002.2217<br />
MCA93S002<br />
604. The B → D∗lν form factor at zero recoil from three-flavor lattice QCD: A model-independent<br />
determination of |Vcb|, The Fermilab Lattice and MILC Collaborations: C. Bernard, C. DeTar, M. DiPierro,<br />
A.X. El-Khadra, R.T. Evans, E.D. Freeland, E. Gamiz, Steven Gottlieb, U.M. Heller, J.E. Hetrick, A.S.<br />
Kronfeld, J. Laiho, L. Levkova, P.B. Mackenzie, M. Okamoto, J. Simone, R. Sugar, D. Toussaint, R.S. Van<br />
de Water, Phys. Rev. D79, 014506 (2009) [arXiv:0808.2519 [hep-lat]].<br />
605. The B→πlν semileptonic form factor from three-flavor lattice QCD: A model-independent<br />
determination of |Vub|, The Fermilab Lattice and MILC Collaborations: Jon A. Bailey, C. Bernard, C. DeTar,<br />
M. Di Pierro, A. X. El-Khadra, R. T. Evans, E. D. Freeland, E. Gamiz, Steven Gottlieb, U. M. Heller, J. E.<br />
Hetrick, A. S. Kronfeld, J. Laiho, L. Levkova, P. B. Mackenzie, M. Okamoto, J. N. Simone, R. Sugar, D.<br />
Toussaint, R. S. Van de Water, Phys. Rev. D79, 054507 (2009) [arXiv:0811.3640 [hep-lat]].<br />
606. Full nonperturbative QCD simulations with 2+1 flavors of improved staggered quarks, The MILC<br />
Collaboration: A. Bazavov, C. Bernard, C. DeTar, Steven Gottlieb, U.M. Heller, J.E. Hetrick, J. Laiho, L.<br />
Levkova, P.B. Mackenzie, M.B. Oktay, R. Sugar, D. Toussaint, and R.S. Van de Water, Rev. Mod. Phys. 82,<br />
1349-1417 (2010) [arXiv:0903.3598 [hep-lat]].<br />
607. Equation of state and QCD transition at finite temperature, The HotQCD Collaboration: A. Bazavov,<br />
T. Bhattacharya, M. Cheng, N.H. Christ, C. DeTar, S. Ejiri, Steven Gottlieb, R. Gupta, U.M. Heller, K.<br />
Huebner, C. Jung, F. Karsch, E. Laermann, L. Levkova, C. Miao, R.D. Mawhinney, P. Petreczky, C.<br />
Schmidt, R.A. Soltz, W. Soeldner, R. Sugar, D. Toussaint, and P. Vranas, Phys. Rev. D80, 014504 (2009)<br />
[arXiv:0903.4379 [hep-lat]].<br />
608. The strange quark condensate in the nucleon in 2+1 flavor QCD, D. Toussaint and W. Freeman for<br />
the MILC Collaboration, Phys. Rev. Lett. 103, 122002 (2009) [arXiv:0905.2432 [hep-lat]].<br />
609. Visualization of semileptonic form factors from lattice QCD, The Fermilab Lattice and MILC<br />
Collaborations: C. Bernard, C. DeTar, M. Di Pierro, A.X. El-Khadra, R.T. Evans, E.D. Freeland, E. Gamiz,<br />
Steven Gottlieb, U.M. Heller, J.E. Hetrick, A.S. Kronfeld, J. Laiho, L. Levkova, P.B. Mackenzie, M. Okamoto,<br />
M.B. Oktay, J.N. Simone, R. Sugar, D. Toussaint, R.S. Van de Water, Phys. Rev. D80, 034026, (2009)<br />
[arXiv:0906.2498 [hep-lat]].<br />
610. QCD Thermodynamics from the Lattice, C. DeTar and U. Heller, European Physical Journal A 41,<br />
404-437 (2009) [arXiv:0905.2949 [hep-lat]].<br />
Quarkonium mass splittings in three-flavor lattice QCD, T. Burch, C. DeTar, M. Di Pierro, A.X. El-Khadra,<br />
E.D. Freeland, Steven Gottlieb, A.S. Kronfeld, L. Levkova, P.B. Mackenzie, J.N. Simone, Phys. Rev. D81,<br />
034508 (2010) [arXiv:0912.2701 [hep-lat]].<br />
611. Tuning Fermilab Heavy Quarks in 2+1 Flavor Lattice QCD with Application to Hyperfine Splittings,<br />
The Fermilab Lattice and MILC Collaborations: C. Bernard, C. DeTar, M. Di Pierro, A.X. El-Khadra, R.T.<br />
Evans, E.D. Freeland, E. Gamiz, Steven Gottlieb, U.M. Heller, J.E. Hetrick, A.S. Kronfeld, J. Laiho, L.<br />
Levkova, P.B. Mackenzie, J.N. Simone, R. Sugar, D. Toussaint, R.S. Van de Water, arXiv:1003.1937 [hep-lat],<br />
submitted to Phys. Rev. D.<br />
612. QCD thermodynamics with nonzero chemical potential at Nt = 6 and effects from heavy quarks,<br />
The MILC Collaboration: C. DeTar, L. Levkova, Steven Gottlieb, U.M. Heller, J.E. Hetrick, R. Sugar, D.<br />
Toussaint, Phys. Rev. D81, 114504 (2010) [arXiv:1003.5682 [hep-lat]].<br />
613. Topological susceptibility with the asqtad action, MILC collaboration: A. Bazavov, C. Bernard, B.<br />
Billeter, C. DeTar, Steven Gottlieb, U. M. Heller, J. E. Hetrick, J. Laiho, L. Levkova, M.B. Oktay, J. Osborn,<br />
R. L. Sugar, D. Toussaint, R. S. Van de Water, Phys. Rev. D81 114501 (2010) [arXiv:1003.5695 [hep-lat]].
614. Scaling studies of QCD with the dynamical HISQ action, MILC collaboration: A. Bazavov, C.<br />
Bernard, C. DeTar, W. Freeman, Steven Gottlieb, U. M. Heller, J. E. Hetrick, J. Laiho, L. Levkova, M. Oktay,<br />
J. Osborn, R.L. Sugar, D. Toussaint, R.S. Van de Water, arXiv:1004.0342 [hep-lat], submitted to Phys. Rev.<br />
D.<br />
615. Status and prospect for determining fB, fBs, fBs / fB on the lattice, C. Bernard, T. Blum, T.A. DeGrand,<br />
C. DeTar, S. Gottlieb, U.M. Heller, N. Ishizuka, L. Kärkkäinen, J. Labrenz, K. Rummukainen, A.<br />
Soni, R.L. Sugar and D. Toussaint, Proceedings of the LISHEP ’95 Session C: Heavy Flavor Physics, pages<br />
399-407, Edited by F. Caruso, M.E. Pol, A. Santoro and R. Shellard, Editions Frontieres.<br />
616. Light Hadron Spectrum and Heavy-light Decay Constants from the MILC Collaboration, C. Bernard,<br />
T.A. DeGrand, C. DeTar, Steven Gottlieb, Urs M. Heller, J. E. Hetrick, N. Ishizuka, C. McNeile, K.<br />
Rummukainen, R. Sugar, D. Toussaint, and M. Wingate, Proceedings of DPF99, Katsushi Arisaka and Zvi<br />
Bern, eds.<br />
617. Sample size effects in multivariate fitting of correlated data, D. Toussaint and W. Freeman,<br />
[arXiv:0808.2211[hep-lat]].<br />
618. Contributions of charm annihilation to the hyperfine splitting in charmonium, L. Levkova and C.<br />
DeTar, Proceedings of Science (Lattice 2008), 133 (2009) [arXiv:0809.5086 [hep-lat]].<br />
619. Recent Progress in Lattice QCD Thermodynamics, C. DeTar Proceedings of Science (Lattice<br />
2008), 001 (2009) [arXiv:0811.2429 [hep-lat]].<br />
620. Heavy baryon mass spectrum from lattice QCD with 2+1 dynamical sea quark flavors, H. Na<br />
and S. Gottlieb, Proceedings of Science (Lattice 2008), 119 (2009) [arXiv:0812.1235 [hep-lat]].<br />
621. Electromagnetic splittings of hadrons from improved staggered quarks in full QCD, The MILC<br />
Collaboration: S. Basak, A.Bazavov, C. Bernard, C. DeTar, W. Freeman, Steven Gottlieb, U.M. Heller, J.E.<br />
Hetrick, J. Laiho, L. Levkova, J. Osborn, R. Sugar, D. Toussaint Proceedings of Science (Lattice 2008), 127<br />
(2009) [arXiv:0812.4486 [hep-lat]].<br />
622. HISQ action in dynamical simulations, The MILC Collaboration: A. Bazavov, C. Bernard, C. DeTar,<br />
W. Freeman, Steven Gottlieb, U.M. Heller, J.E. Hetrick, J. Laiho, L. Levkova, J. Osborn, R. Sugar, D.<br />
Toussaint, Proceedings of Science (Lattice 2008), 033 (2009) [arXiv:0903.0874 [hep-lat]].<br />
623. B and D Meson Decay Constants, The Fermilab Lattice and MILC Collaborations: C. Bernard, C.<br />
DeTar, M. Di Pierro, A. X. El-Khadra, R. T. Evans, E. D. Freeland, E. Gamiz, Steven Gottlieb, U. M. Heller,<br />
J. E. Hetrick, A. S. Kronfeld, J. Laiho, L. Levkova, P. B. Mackenzie, J. Simone, R. Sugar, D. Toussaint, R. S.<br />
Van de Water, Proceedings of Science (Lattice 2008), 278 (2009) [arXiv:0904.1895 [hep-lat]].<br />
624. QCD equation of state at non-zero chemical potential, The MILC Collaboration: S. Basak, A. Bazavov,<br />
C. Bernard, C. DeTar, W. Freeman, U.M. Heller, J.E. Hetrick, J. Laiho, L. Levkova, J. Osborn, R. Sugar,<br />
D. Toussaint, Proceedings of Science (Lattice 2008), 171 (2009) [arXiv:0910.0276 [hep-lat]].<br />
625. Cell processor implementation of a MILC lattice QCD application, Guochun Shi, Volodymyr Kindratenko,<br />
Steven Gottlieb, Proceedings of Science (Lattice 2008), 026 (2009) [arXiv:0910.0262 [hep-lat]].<br />
MILC results for light pseudoscalars, The MILC Collaboration: A. Bazavov, C. Bernard, C. DeTar, X.<br />
Du, W. Freeman, Steven Gottlieb, Urs M. Heller, J.E. Hetrick, J. Laiho, L. Levkova, M.B. Oktay, J. Osborn, R.<br />
Sugar, D. Toussaint, R.S. Van de Water, Proceedings of Science (Chiral Dynamics) CD09 (2009)<br />
[arXiv:0910.2966 [hep-lat]].<br />
626. Results from the MILC collaboration’s SU(3) chiral perturbation theory analysis, The MILC<br />
Collaboration: A. Bazavov, C. Bernard, C. DeTar, X. Du, W. Freeman, Steven Gottlieb, Urs M. Heller, J.E.<br />
Hetrick, J. Laiho, L. Levkova, M.B. Oktay, J. Osborn, R. Sugar, D. Toussaint, R.S. Van de Water,<br />
Proceedings of Science (Lattice 2009) 079 (2009) [arXiv:0910.3618 [hep-lat]].<br />
627. Quarkonium mass splittings with Fermilab heavy quarks and 2+1 flavors of improved staggered sea<br />
quarks, T. Burch, C.E. DeTar, M. Di Pierro, A.X. El-Khadra, Steven Gottlieb, A.S. Kronfeld, L. Levkova, P.B.<br />
Mackenzie, J. Simone (Fermilab Lattice and MILC Collaborations), Proceedings of Science (Lattice 2009) 115<br />
(2009) [arXiv:0911.0361].<br />
628. SU(2) chiral fits to light pseudoscalar masses and decay constants, The MILC Collaboration: A.<br />
Bazavov, C. Bernard, C. DeTar, X. Du, W. Freeman, Steven Gottlieb, Urs M. Heller, J.E. Hetrick, J. Laiho, L.<br />
Levkova, M.B. Oktay, J. Osborn, R. Sugar, D. Toussaint, R.S. Van de Water, Proceedings of Science<br />
(Lattice 2009) 077 (2009) [arXiv:0911.0472].
629. Progress on four flavor QCD with the HISQ action, MILC Collaboration: A. Bazavov, C. Bernard, C.<br />
DeTar, W. Freeman, Steven Gottlieb, U.M. Heller, J.E. Hetrick, J. Laiho, L. Levkova, J. Osborn, R. Sugar, D.<br />
Toussaint, R.S. Van de Water, Proceedings of Science (Lattice 2009) 123 (2009) [arXiv:0911.0869].<br />
630. Progress on charm semileptonic form factors from 2+1 flavor lattice QCD, The Fermilab Lattice and<br />
MILC Collaborations: Jon A. Bailey, A. Bazavov, C. Bernard, C. Bouchard, C. DeTar, A.X. El-Khadra, E.D.<br />
Freeland, W. Freeman, E. Gamiz, Steven Gottlieb, U.M. Heller, J.E. Hetrick, A.S. Kronfeld, J. Laiho, L.<br />
Levkova, P.B. Mackenzie, M.B. Oktay, M. Di Pierro, J.N. Simone, R. Sugar, D. Toussaint, R.S. Van de<br />
Water, Proceedings of Science (Lattice 2009) 250 (2009) [arXiv:0912.0214].<br />
631. The strange quark content of the nucleon in 2+1 flavor lattice QCD, Walter Freeman, Doug<br />
Toussaint, Proceedings of Science (Lattice 2009) 137 (2009) [arXiv:0912.1144].<br />
632. The Ds and D+ Leptonic Decay Constants from Lattice QCD, The Fermilab Lattice and MILC<br />
Collaborations: A. Bazavov, C. Bernard, C. DeTar, E.D. Freeland, E. Gamiz, Steven Gottlieb, U.M. Heller,<br />
J.E. Hetrick, A.X. El-Khadra, A.S. Kronfeld, J. Laiho, L. Levkova, P.B. Mackenzie, M.B. Oktay, M. Di Pierro,<br />
J.N. Simone, R. Sugar, D. Toussaint, R.S. Van de Water, Proceedings of Science (Lattice 2009) 249 (2009)<br />
[arXiv:0912.5221].<br />
633. First results on QCD thermodynamics with HISQ action, A. Bazavov and P. Petreczky, Proceedings<br />
of Science (Lattice 2009) 163 (2009) [arXiv:0912.5421 [hep-lat]].<br />
634. Deconfinement and chiral transition with the highly improved staggered quark (HISQ) action, A.<br />
Bazavov and P. Petreczky, arXiv:10004.0342 [hep-lat], to be published in Journal of Physics: Conference<br />
Series.<br />
635. Accelerating Quantum Chromodynamics Calculations with GPUs, Guochun Shi, Steven Gottlieb,<br />
Aaron Torok, and Volodymyr Kindratenko, Accepted by 2010 Symposium on Application Accelerators in<br />
High Performance Computing (SAAHPC), July 13–15, 2010, Knoxville TN.<br />
636. Parallel Zero-Copy Algorithms for Fast Fourier Transform and Conjugate Gradient using MPI<br />
Datatypes, Torsten Hoefler and Steven Gottlieb, Accepted by EuroMPI 2010, September 12–15, 2010,<br />
Stuttgart, Germany.<br />
MCA99S008<br />
637. B. D. Farris, Y. T. Liu, and S. L. Shapiro. Binary black hole mergers in gaseous environments:<br />
“Binary Bondi” and “binary Bondi-Hoyle-Lyttleton” accretion. Phys. Rev. D, 81(8):084008, April 2010.<br />
638. Y. T. Liu, Z. B. Etienne, and S. L. Shapiro. Evolution of near-extremal-spin black holes using the<br />
moving puncture technique. Phys. Rev. D, 80(12):121503, December 2009.<br />
MCA99S015<br />
639. Donald Sinclair, “New results with colour-sextet quarks”,<br />
http://agenda.infn.it/sessionDisplay.py?sessionId=19&slotId=0&confId=2128#2010-06-15<br />
640. W. Armour, J. B. Kogut and C. Strouthos, “Chiral symmetry breaking and monopole dynamics in<br />
non-compact QED3 coupled to a four-fermi interaction,”arXiv:1004.3053 [hep-lat].<br />
641. D. K. Sinclair and J. B. Kogut, “Lattice Gauge Theory and (Quasi)-Conformal Technicolor,”<br />
arXiv:1003.0439 [hep-lat].<br />
642. J. B. Kogut and D. K. Sinclair, “Thermodynamics of lattice QCD with 2 flavours of colour-sextet<br />
quarks: A model of walking/conformal Technicolor,” Phys. Rev. D (to be published) arXiv:1002.2988 [hep-lat].<br />
643. D. K. Sinclair and J. B. Kogut, “QCD thermodynamics with colour-sextet quarks,” arXiv:0909.2019<br />
[hep-lat].<br />
644. C. Strouthos and J. B. Kogut, “Chiral Symmetry breaking in Three Dimensional QED,” J. Phys.<br />
Conf. Ser. 150, 052247 (2009) [arXiv:0808.2714 [cond-mat.supr-con]].
PHY060028<br />
645. M. Campanelli, C. O. Lousto, B. C. Mundim, H. Nakano, Y. Zlochower and H. P. Bischof, “Advances<br />
in Simulations of Generic Black-Hole Binaries,” Class. Quant. Grav. 27, 084034 (2010) [arXiv:1001.3834 [gr-qc]].<br />
646. C. O. Lousto, H. Nakano, Y. Zlochower and M. Campanelli, “Intermediate Mass Ratio Black Hole<br />
Binaries: Numerical Relativity meets Perturbation Theory,” Phys. Rev. Lett. 104, 211101 (2010)<br />
[arXiv:1001.2316 [gr-qc]].<br />
647. C. O. Lousto, H. Nakano, Y. Zlochower and M. Campanelli, “Statistical studies of Spinning Black-<br />
Hole Binaries,” Phys. Rev. D 81, 084023 (2010) [arXiv:0910.3197 [gr-qc]].<br />
648. C. O. Lousto, M. Campanelli and Y. Zlochower, “Remnant Masses, Spins and Recoils from the<br />
Merger of Generic Black-Hole Binaries,” Class. Quant. Grav. 27, 114006 (2010) [arXiv:0904.3541 [gr-qc]].<br />
649. B. J. Kelly, W. Tichy, Y. Zlochower, M. Campanelli and B. F. Whiting, “Post-Newtonian Initial Data<br />
with Waves: Progress in Evolution,” Class. Quant. Grav. 27, 114005 (2010) [arXiv:0912.5311 [gr-qc]].<br />
650. S. C. Noble, J. H. Krolik, and J. F. Hawley, “Dependence of inner accretion disk stress on<br />
parameters: the Schwarzschild case,” Astrophys. J, 711, 959-973 (2010) [arXiv:1001.4809 [astro-ph]].<br />
651. Antonini, Faber, Gualandris, Merritt, “Tidal Disruption of Binaries by a Supermassive Black Hole:<br />
Progeny of the S-stars”, submitted to Astrophys. J. (2009).<br />
652. Cadonati et al, “Status of NINJA: the Numerical INJection Analysis project”, Class. Quant. Grav. 26<br />
114008 (2009).<br />
653. Faber, “Status of neutron star-black hole and binary neutron star simulations”, Class. Quant. Grav.<br />
26 114004 (2009).<br />
654. Aylott et al, “Testing gravitational-wave searches with numerical relativity waveforms: Results from<br />
the first Numerical INJection Analysis (NINJA) project.”, Accepted to Class. Quant. Grav. (2009).<br />
655. Campanelli, Lousto, Zlochower, “Algebraic Classification of Numerical Spacetimes and Black-Hole-<br />
Binary Remnants”, Phys. Rev. D 79 084012 (2009).<br />
656. Noble, Krolik, “GRMHD prediction of coronal variability in accreting black holes”, Astrophys. J 703<br />
964-975 (2009).<br />
657. Campanelli, Lousto, Nakano, Zlochower, “Comparison of Numerical and Post-Newtonian<br />
Waveforms for Generic Precessing Black-Hole Binaries”, Phys. Rev. D 79 084010 (2009).<br />
658. Lousto and Zlochower, “Modeling gravitational recoil from precessing highly-spinning unequal-mass<br />
black-hole binaries”, Phys. Rev. D 79 064018 (2009).<br />
659. Noble, Krolik, Hawley, “Direct calculation of the radiative efficiency of an accretion disk around a<br />
black hole”, Astrophys. J 692 411-421 (2009).<br />
PHY090002<br />
660. A.H. Castro Neto et al., Rev. Mod. Phys., 109 (2009); A.K. Geim, Science, 1530 (2009).<br />
661. R. Golizadeh-Mojarad and S. Datta, Phys. Rev. B, 085410 (2009).<br />
662. Q. Ran et al., Appl. Phys. Lett., 103511 (2009).<br />
663. P. Blake et al., Solid State Commun., 1068 (2009).<br />
664. D. B. Farmer et al., Nano Lett., 388 (2009).<br />
665. D. B. Farmer et al., Appl. Phys. Lett., 213106 (2009).<br />
SES070004<br />
666. Bennett, D. A., W. Tang & S. Wang (Accepted) Toward an understanding of provenance in<br />
complex land use dynamics. Journal of Land Use Science.
667. Liu, Y., S. Wang & A. M. Segre (In preparation) A Scalable Parallel Genetic Algorithm for the<br />
Generalized Assignment Problem. Journal of Supercomputing.<br />
668. Shook, E., S. Wang & W. Tang (In preparation) A parallel communication framework for spatially<br />
explicit agent-based models. International Journal of Geographical Information Science.<br />
669. Tang, W. & D. A. Bennett (2010) Agent-based modeling of animal movement: A review. Geography<br />
Compass, 4(7), 682-700.<br />
670. Tang, W. & D. A. Bennett (forthcoming) The Explicit representation of context in agent-based<br />
modeling of complex adaptive spatial systems. The Annals of the Association of American Geographers.<br />
671. Tang, W., D. A. Bennett & S. Wang (In revision) A parallel agent-based model of land use opinions.<br />
Journal of Land Use Science.<br />
672. Wang, S., Y. Liu, N. Wilkins-Diehr & S. Martin (2009) SimpleGrid toolkit: Enabling geosciences<br />
gateways to cyberinfrastructure. Comput. Geosci., 35, 2283-2294.<br />
673. Wang, S. & Y. Liu (2009) TeraGrid GIScience Gateway: Bridging cyberinfrastructure and<br />
GIScience. International Journal of Geographical Information Science, 23(5), 631-656.<br />
674. Wang, S. (2010) A CyberGIS Framework for the Synthesis of Cyberinfrastructure, GIS, and Spatial<br />
Analysis. Annals of the Association of American Geographers, 100(3), 1-23.<br />
675. Zhu, X., Wang, S., Shook, E. & Long, S. (In revision) The grain drain. Ozone, a hidden current and<br />
mounting cause of worldwide grain shortage. Nature.<br />
A.2.2 Publications from TeraGrid Users (for 2010 Q4)<br />
The following 616 publications were gathered primarily from renewal Research submissions to<br />
the December 2010 TRAC meeting. The extraction process was conducted by a UCSD<br />
undergraduate student at SDSC. The publications are organized by field of science and by the<br />
proposal with which they were associated.<br />
Advanced Scientific Computing<br />
ASC100002<br />
1. A. Tumin, X. Wang, and X. Zhong, “Direct numerical simulation and theoretical analysis of<br />
perturbations in hypersonic boundary layers,” Proceedings of the seventh IUTAM symposium on<br />
laminar-turbulent transition, Stockholm, Sweden, 2009, edited by P. Schlatter and D. S. Henningson,<br />
2010.<br />
2. P. S. Rawat and X. Zhong, “On high-order shock-fitting and front-tracking schemes for numerical<br />
simulation of shock–disturbance interactions,” Journal of Computational Physics, 229 (2010) 6744-<br />
6780.<br />
3. E. Johnsen et al. (including P. S. Rawat), “Assessment of high-resolution methods for numerical<br />
simulations of compressible turbulence with shock waves,” Journal of Computational Physics, 229<br />
(2010) 1213–1237.<br />
Astronomical Sciences<br />
AST060031<br />
7. Gravitational Collapse and Filament Formation: Comparison with the Pipe Nebula, Heitsch, F.,<br />
Ballesteros-Paredes, J., Hartmann, L. 2009, ApJ, 704, 1735<br />
8. The Fate of High Velocity Clouds: Warm or Cold Cosmic Rain, Heitsch, F., Putman, M.E. 2009, ApJ,<br />
698, 1485<br />
9. Effects of Magnetic Field Strength and Orientation on Molecular Cloud Formation, Heitsch, F., Stone,<br />
J.M., Hartmann, L.W. 2009, ApJ, 695, 248<br />
10. The Fate of Taurus: Simmering Star Formation or Dramatic Exit, Heitsch, F., Hartmann, L., ApJ<br />
11. Flow-Driven Cloud Formation: The Role of Magnetic Fields and Self-Gravity, Heitsch, F., Stone, J.,<br />
Hartmann, L., ApJ<br />
AST070022<br />
12. Choi, J. & Nagamine, K. 2009, MNRAS, 393, 1595, Effects of metal enrichment and metal cooling in<br />
galaxy growth and cosmic star formation history<br />
a. 2010a, MNRAS, 407, 1464, Effects of cosmological parameters and star formation models on<br />
the cosmic star formation history in ΛCDM cosmological simulations<br />
b. 2010b, ArXiv e-prints, Multicomponent and Variable Velocity Galactic Outflow in<br />
Cosmological Hydrodynamic Simulations<br />
13. Nagamine, K., Choi, J., & Yajima, H. 2010, ArXiv e-prints, Effects of UV background and local stellar<br />
radiation on the H I column density distribution<br />
14. Niino, Y., Choi, J., Kobayashi, M. A. R., Nagamine, K., Totani, T., & Zhang, B. 2010, ArXiv e-prints,<br />
Luminosity Distribution of Gamma-Ray Burst Host Galaxies at redshift z=1 in Cosmological Smoothed<br />
Particle Hydrodynamic Simulations: Implications for the Metallicity Dependence of GRBs<br />
15. Yajima, H., Choi, J., & Nagamine, K. 2010, ArXiv e-prints, Escape fraction of ionizing photons from<br />
high-redshift galaxies in cosmological SPH simulations<br />
AST080040<br />
16. E. Macaulay, Hume A. Feldman, P.G. Ferreira, M. Hudson & R. Watkins, A Slight Excess of Large<br />
Scale Power from Moments of the Peculiar Velocity Field, MNRAS submitted (ArXiv: 1010.2651).<br />
17. Shankar Agarwal & Hume A. Feldman, The Effect of Massive Neutrinos on Matter Power Spectrum,<br />
(2010) MNRAS in Press (ArXiv:1006.0689)<br />
18. Hume A. Feldman, Richard Watkins & Michael J. Hudson, Cosmic Flows on 100 Mpc/h Scales:<br />
Standardized Minimum Variance Bulk Flow, Shear and Octupole Moments, MNRAS, 407, 2328-2338<br />
(2010) (ArXiv:0911.5516)<br />
19. Roman Juszkiewicz, Hume A. Feldman, J. N. Fry & Andrew H. Jaffe, Weakly nonlinear dynamics and<br />
the σ8 parameter, JCAP 02 021 (2010) (ArXiv:0901.0697)<br />
20. Richard Watkins, Hume A. Feldman & Michael J. Hudson, Consistently Large Cosmic Flows on Scales<br />
of 100 h^-1 Mpc: a Challenge for the Standard ΛCDM Cosmology, MNRAS, 392, 743 (2009)<br />
(ArXiv:0809.4041)<br />
AST090007<br />
21. Moscibrodzka, M., Gammie, C. F., Dolence, J., Shiokawa, H., Leung, P.-K. 2010, “Numerical Models of<br />
Sgr A*,” to appear in ”The Galactic Center: A Window on the Nuclear Environment of Disk Galaxies”,<br />
ed. Mark Morris, Daniel Q. Wang and Feng Yuan, arXiv:1002.1261<br />
22. Guan, X., & Gammie, C. F. 2010, “Stratified Local Models of Isothermal Disks of Large Radial Extent,”<br />
in preparation.<br />
23. Moscibrodzka, M., Gammie, C. F., Dolence, J., Shiokawa, H., & Leung, P.-K. 2010, “Pair Production in<br />
Low Luminosity AGN”, in preparation.<br />
24. Dolence, J., Gammie, C. F., Moscibrodzka, M., Shiokawa, H., & Leung, P.-K. “Self-Consistent Light<br />
Curves from 3D GR-MHD Models of Sgr A*,” in preparation.<br />
25. Shiokawa, H., Gammie, C. F., and Dolence, J., 2011 “Convergence of Global General Relativistic<br />
Accretion Disk Models,” in preparation.
AST090108<br />
26. Woodward, P. R., J. Jayaraj, P.-H. Lin, P.-C. Yew, M. Knox, J. Greensky, A. Nowatzki, and K. Stoffels,<br />
“Boosting the performance of computational fluid dynamics codes for interactive supercomputing,” Proc.<br />
Intntl. Conf. on Comput. Sci., ICCS 2010, Amsterdam, Netherlands, May, 2010. Preprint available at<br />
www.lcse.umn.edu/ICCS2010.<br />
27. Herwig, F., M. Pignatari, P. R. Woodward, D. H. Porter, G. Rockefeller, C. L. Fryer, M. Bennett, and R.<br />
Hirschi, “Convective-reactive proton-12C combustion in Sakurai’s object (V4334 Sagittarii) and<br />
implications for the evolution and yields from the first generations of stars,” preprint, LA-UR 10-00630,<br />
Feb. 10, 2010, available at http://arxiv4.library.cornell.edu/pdf/1002.2241.<br />
28. Woodward, P. R., D. H. Porter, W. Dai, T. Fuchs, A. Nowatzki, M. Knox, G. Dimonte, F. Herwig, and C.<br />
Fryer, “The Piecewise-Parabolic Boltzmann Advection Scheme (PPB) Applied to Multifluid<br />
Hydrodynamics,” preprint LA-UR 10-01823, March, 2010, available at www.lcse.umn.edu/PPMplusPPB.<br />
AST100001<br />
29. Numerical Support for the Hydrodynamic Mechanism of Pulsar Kicks<br />
Nordhaus, J., Brandt, T. D., Burrows, A., Livne, E., Ott, C. D. 2010 – Submitted to Physical Review D --<br />
available from arXiv:1010.0674<br />
30. Dimension as a Key to the Neutrino Mechanism of Core-Collapse Supernova Explosions<br />
Nordhaus, J., Burrows, A., Almgren, A., Bell, J. B. 2010 Astrophysical Journal, 720, 694<br />
MCA08X016<br />
31. Vasil, G. & Brummell, N.H., “Constraints on magnetic buoyancy instabilities of a shear-generated<br />
magnetic layer”, Astrophys. J., 690, 783-794 (2009)<br />
32. Silvers, L.J., Vasil, G.M., Brummell, N.H., & Proctor, M.R.E., “Double-diffusive instabilities of a shear-generated<br />
magnetic layer”, Astrophys. J., 702, L14-L18 (2009).<br />
33. Brummell, N.H., Tobias, S.M. & Cattaneo, F., “Dynamo efficiency in compressible dynamos with and<br />
without penetration”, Geophys. Astrophys. Fluid Dyn., accepted, to appear (2010).<br />
34. Byington, B., Brummell, N.H. & Tobias, S.M., “The effect of small-scale motion on an essentially<br />
nonlinear dynamo”, in Proceedings of IAU Symposium 271 “Astrophysical Dynamics: From Stars to<br />
Galaxies”, eds. N.H. Brummell, A.S. Brun, M. Miesch, accepted, to appear (2011)<br />
35. Silvers, L.J., Vasil, G.M., Brummell, N.H. & Proctor, M.R.E., “The evolution of a double diffusive<br />
magnetic buoyancy instability”, in Proceedings of IAU Symposium 271 “Astrophysical Dynamics: From<br />
Stars to Galaxies”, eds. N.H. Brummell, A.S. Brun, M. Miesch, accepted, to appear (2011)<br />
36. Tobias, S.M., Cattaneo, F. & Brummell, N.H., “On the generation of organized magnetic fields”,<br />
Astrophys. J., submitted (2010).<br />
37. Traxler, A., Stellmach, S., Garaud, P., Radko, T., & Brummell, N., “Dynamics of fingering convection I:<br />
small-scale fluxes and large-scale instabilities”, J. Fluid Mech., submitted (2010).<br />
38. Stellmach, S., Traxler, A., Garaud, P., Brummell, N. & Radko, T., “Dynamics of fingering convection II:<br />
The formation of thermohaline staircases”, J. Fluid Mech., submitted (2010).<br />
39. Stone, J., Byington, B., Brummell, N.H. & Gough, D.O., “Stoked dynamos” in “Proceedings of ISIMA<br />
2010”, accepted 2010.<br />
40. Kosuga, Y. & Brummell, N.H., “Suppression of jets in Howard-Krishnamurti convection” in “Proceedings<br />
of ISIMA 2010”, accepted 2010.<br />
MCA98N020<br />
41. Arieli, Y., Rephaeli, Y., Norman, M. L. 2010. “Hydrodynamical Simulations of Galaxy Clusters with<br />
Galcons”, ApJ, 716, 918<br />
42. Norman, M. L. 2010. “Simulating Galaxy Clusters”, arXiv:1005.1100<br />
43. Norman, M. L.; Reynolds, D. R.; So, G. C. 2009. “Cosmological Radiation Hydrodynamics with Enzo”,<br />
AIP Conference Proceedings, Volume 1171, pp. 260-272<br />
44. Norman, M. L., Paschos, P. & Harkness, R. H. 2009. “Baryon acoustic oscillations in the Lyman alpha<br />
forest”, J. Phys. Conf. Ser., 180, 2021<br />
45. Norman, M. L., So, G. C., Reynolds, D., Paschos, P., & Harkness, R. H. 2010. “Radiation<br />
Hydrodynamics of Cosmic Reionization. I. The Lower Cutoff on the Dwarf Galaxy Luminosity Function”,<br />
ApJ, submitted<br />
46. Paschos, P., Jena, T., Tytler, D., Kirkman, D., Norman, M. L., 2009. “The Lyα forest at redshifts 0.1-1.6:<br />
good agreement between a large hydrodynamic simulation and HST spectra”, MNRAS, 399, 1934
47. Reynolds, D. R., Hayes, J. C., Paschos, P. & Norman, M. L. 2009, “Self-consistent solution of<br />
cosmological radiation-hydrodynamics and chemical ionization”, J. Comp. Phys., 228, 6833<br />
48. Skory, Stephen; Turk, Matthew J.; Norman, Michael L.; Coil, Alison L., 2010. “Parallel HOP: A Scalable<br />
Halo Finder for Massive Cosmological Data Sets”, ApJ Supp., in press, arXiv:1001.3411<br />
49. Turk, M. J., Norman, M. L., & Abel, T. 2010. “High Entropy Polar Regions Around the First Protostars”,<br />
ApJLett, submitted<br />
50. Turk, M. J., Smith, B. D., Oishi, J. S., Skory, S., Skillman, S. W., Abel, T. and Norman, M. L., 2010. “yt:<br />
A Multi-Code Analysis Toolkit for Astrophysical Simulation Data”, ApJSupp, submitted<br />
51. Tytler, D., Paschos, P., Kirkman, D., Norman, M. L., & Jena, T. 2009. “The effect of large-scale power<br />
on simulated spectra of the Lyα forest”, MNRAS, 393, 723<br />
52. Wise, J. H., Norman, M. L., Abel, T., & Turk, M. J. 2010. “The Birth of a Galaxy: Population III Metal<br />
Enrichment and Population II Stellar Populations”, ApJ, submitted<br />
Atmospheric Sciences<br />
EAR010001<br />
53. Liu, S., W. Gao, M. Xu, X. Wang, and X.-Z. Liang, 2009: Regional climate model simulation of China<br />
summer precipitation using an optimal ensemble of cumulus parameterization schemes. Frontiers of<br />
Earth Science in China, 3(2), 248–257, DOI 10.1007/s11707-009-0022-8.<br />
54. Beth Plale, LEAD II hybrid workflows for timely weather products, Keynote talk, 19th Pacific Rim Applications<br />
and Grid Middleware Assembly (PRAGMA), Changchun, China, Sept 2010.<br />
55. Beth Plale, Metadata and Provenance Collection and Representation: Antecedent to Scientific Data<br />
Preservation, Open Data Seminar, University of Michigan, November 2010.<br />
56. Beth Plale, LEAD II/Trident workflows for timely weather products: challenge of Vortex2, American<br />
Chinese Cyberinfrastructure and E-science Workshop (ACCESS) on Data Intensive Sciences and<br />
Computing (DISC), Urbana, Illinois, August 2010.<br />
57. Beth Plale and Ashish Bhangale, LEADII: hybrid workflows in atmospheric science, DemoFest,<br />
Microsoft Faculty Research Summit, Seattle, Washington July 2010.<br />
58. Beth Plale, Provenance and Workflows, Computer Network Information Center of Chinese Academy of<br />
Sciences, Beijing, China, October 2010.<br />
59. Beth Plale, Craig Mattocks, Keith Brewster, Eran Chinthaka, Jeff Cox, Chathura Herath, Scott Jensen,<br />
Yuan Lao, Yiming Sun, Felix Terkhorn, Ashish Bhangale, Kavitha Chandrasekar, Prashant Sabhnani,<br />
and Robert Ping. "LEAD II : Hybrid Workflows in Atmospheric Science." Poster presented by Felix<br />
Terkhorn at TeraGrid 2010, August 2010.<br />
60. Eran Chinthaka Withana and Beth Plale, "Usage Patterns to Provision for Scientific Experimentation in<br />
Clouds". 2nd IEEE International Conference on Cloud Computing and Science 2010, Indianapolis,<br />
Indiana, Nov 30, 2010. 25% acceptance.<br />
61. Eran Chinthaka Withana and Chathura Herath, “Towards Hybrid Workflow Execution in Scientific<br />
Workflows”, Presentation at the Supercomputing 2010 Conference, New Orleans, Louisiana, Nov 16,<br />
2010.<br />
62. Anderson, B.T., K. Hayhoe, and X.-Z. Liang, 2009: Anthropogenic-induced changes in the 21st Century<br />
summertime hydroclimatology of the Northeastern US. Climatic Change, DOI 10.1007/s10584-009-<br />
9674-3.<br />
63. Lin, J.-T., D.J. Wuebbles, H.-C. Huang, Z.-N. Tao, M. Caughey, X.-Z. Liang, J. Zhu, and T. Holloway,<br />
2010: Potential effects of climate and emissions changes on surface ozone in the Chicago area. J.<br />
Great Lakes Research (in press).<br />
64. Weaver, C.P., X.-Z. Liang, J. Zhu, P.J. Adams, P. Amar, J. Avise, M. Caughey, J. Chen, R.C. Cohen, E.<br />
Cooter, J.P. Dawson, R. Gilliam, A. Gilliland, A.H. Goldstein, A. Grambsch, A. Guenther, W.I.<br />
Gustafson, R.A. Harley, S. He, B. Hemming, C. Hogrefe, H.-C. Huang, S.W. Hunt, D.J. Jacob, P.L.<br />
Kinney, K. Kunkel, J.-F. Lamarque, B. Lamb, N.K. Larkin, L.R. Leung, K.-J. Liao, J.-T. Lin, B.H. Lynn, K.<br />
Manomaiphiboon, C. Mass, D. McKenzie, L.J. Mickley, S.M. O’Neill, C. Nolte, S.N. Pandis, P.N.<br />
Racherla, C. Rosenzweig, A.G. Russell, E. Salathe, A.L. Steiner, E. Tagaris, Z. Tao, S. Tonse, C.<br />
Wiedinmyer, Williams, D.A. Winner, J.-H. Woo, S. Wu, and D.J. Wuebbles, 2009: A preliminary<br />
synthesis of modeled climate change impacts on U.S. regional ozone concentrations. Bull. Amer.<br />
Meteorol. Soc., 90, 1843–1863.<br />
65. Kunkel, K.E., X.-Z. Liang, and J. Zhu, 2010: Regional climate model projections of United States heat<br />
waves in the 21st Century. J. Climate (accepted).<br />
66. Wang, X., X.-Z. Liang, W. Jiang, Z. Tao, J.X.L. Wang, H. Liu, Z. Han, S. Liu, Y. Zhang, G.A. Grell, and
S.E. Peckham, 2010: WRF-Chem simulation of East Asian air quality: Sensitivity to temporal and<br />
vertical emissions distributions. Atmos. Environ., 44, 660-669, doi:10.1016/j.atmosenv.2009.11.011.<br />
67. Choi, H.I., and X.-Z. Liang, 2009: Improved terrestrial hydrologic representation in mesoscale land surface<br />
models. J. Hydrometeorology (accepted).<br />
68. Xu, M., X.-Z. Liang, W. Gao, and N. Krotkov, 2010: Comparison of TOMS retrievals and UVMRP<br />
measurements of surface spectral UV radiation in the United States. Atmos. Chem. Phys. (submitted).<br />
69. Liu, S., and X.-Z. Liang, 2010: Observed diurnal cycle climatology of planetary boundary layer height. J.<br />
Climate (accepted).<br />
70. Wang, C., X.-Z. Liang, and A.N. Samel, 2010: AMIP GCM Simulation of Precipitation Variability over the<br />
Yangtze River Valley. J. Climate (submitted).<br />
71. Drewry, D.T., P. Kumar, S. Long, C. Bernacchi, X.-Z. Liang, and M. Sivapalan, 2010: Ecohydrological<br />
responses of dense canopies to environmental variability, Part 2: Role of acclimation under elevated<br />
CO2. J. Geophys. Res. Biogeosciences (submitted).<br />
72. Drewry, D.T., P. Kumar, S. Long, C. Bernacchi, X.-Z. Liang, and M. Sivapalan, 2010: Ecohydrological<br />
responses of dense canopies to environmental variability, Part 1: Interplay between vertical structure<br />
and photosynthetic pathway. J. Geophys. Res. Biogeosciences (submitted).<br />
73. Yuan, X., and X.-Z. Liang, 2010: Evaluation of a Conjunctive Surface-Subsurface Process model<br />
(CCSP) over the United States. J. Hydrometeorology (submitted).<br />
74. Liang, X.-Z., M. Xu, W. Gao, K.R. Reddy, K.E. Kunkel, and D.L. Schmoldt, 2010: Physical modeling of<br />
U.S. cotton yields and climate stresses during 1979-2005. Agronomy Journal (in revision).<br />
75. Ling, T.-J., X.-Z. Liang, M. Xu, Z. Wang, and B. Wang, 2010: A multilevel ocean mixed-layer model<br />
for 2-dimension applications. Acta Oceanologica Sinica (submitted).<br />
76. Hayhoe, K., C. Wake, B. Anderson, J. Bradbury, A. DeGaetano, X.-Z. Liang, J. Zhu, E. Maurer, D.<br />
Wuebbles. 2009: Translating global change into regional trends: Climate drivers of past and future<br />
trends in the U.S. Northeast. J. Climate (submitted).<br />
77. Markus, M., D.J. Wuebbles, X.-Z. Liang, K. Hayhoe, and A.R. Kristovich, 2010: Diagnostic<br />
analysis of future climate scenarios applied to urban flooding in the Chicago metropolitan area. Climatic<br />
Change (submitted).<br />
78. Chen, L., and X.-Z. Liang, 2010: CWRF optimized physics ensemble improving U.S. 1993 flood<br />
prediction. In Proceedings of the 11th Annual WRF User’s Workshop, Boulder, CO, June 21-25, 4 pp.<br />
79. Qiao, F., and X.-Z. Liang, 2010: A comparative CWRF study of the 1993 and 2008 summer U.S.<br />
Midwest floods. In Proceedings of the 11th Annual WRF User’s Workshop, Boulder, CO, June 21-25, 4<br />
pp.<br />
80. Yuan, X., and X.-Z. Liang, 2010: CWRF incorporation of a conjunctive surface-subsurface process<br />
model to improve seasonal-interannual hydroclimate forecasts. In Proceedings of the 11th Annual WRF<br />
User’s Workshop, Boulder, CO, June 21-25, 4 pp.<br />
81. Liang, X.-Z., 2009: Integrated Earth System Modeling: Development and Applications. Invited seminar<br />
at Institute of Natural Resource Sustainability, University of Illinois, February 2.<br />
82. Liu, F., X.-Z. Liang, Z. Tao, X. Wang, J. Zhu, M. Caughey, S.M. Schindler, and S. Liu, 2009: Impact of<br />
Low-level Jet on Regional Ozone Distributions. Invited seminar at Illinois State Water Survey, University<br />
of Illinois, March 11.<br />
83. Liang, X.-Z., 2009: Develop and Apply the Digital Earth System Model to Address Societal and<br />
Environmental Issues. Invited seminar at Tongji University, Hangzhou, March 24.<br />
84. Liang, X.-Z., F. Zhang, E. Joseph and V. R. Morris, and J.X.L. Wang, 2009: CWRF Cloud-Aerosol-<br />
Radiation Ensemble Modeling System: Validation. Oral presentation at the 10th WRF Users’ Workshop,<br />
National Center for Atmospheric Research, Boulder, Colorado, June 23-26.<br />
85. Liang, X.-Z., 2009: Regional Climate Model Downscaling Projection of Future Climate Changes. Invited<br />
seminar at the Tropical Marine Science Institute, National University of Singapore, Singapore,<br />
September 30.<br />
86. Liang, X.-Z., X. Yuan, T. Ling, L. Chen, and J.X.L. Wang, 2009: Improving Precipitation Prediction by<br />
the CWRF Ensemble Downscaling Approach. Oral presentation at the AGU (American Geophysical<br />
Union) Fall Meeting, San Francisco, December 14-18.<br />
87. Liang, X.-Z., and Feng Zhang, 2009: Development of the Cloud-Aerosol-Radiation Ensemble Modeling<br />
System. Invited talk at the Conference for the Dynamic Earth System Model Research, Chinese<br />
Academy of Sciences, Beijing, December 21.<br />
88. Liang, X.-Z., 2009: Optimizing Ensemble Physics Representation to Enhance Weather Forecast Skill.<br />
Invited talk at the Foreign Expert Forum for the GRAPES Model System Development, China<br />
Meteorological Administration, Beijing, December 21.<br />
89. Liang, X.-Z., 2010: Integrated Regional Earth System Modeling: Development and Application. Invited<br />
seminar at the Desert Research Institute, Reno, January 29.<br />
90. Liang, X.-Z., 2010: Integrated Regional Earth System Modeling: Development and Application. Invited<br />
seminar at the Geophysical Fluid Dynamics Laboratory/NOAA, Princeton University Forrestal Campus,
February 17.<br />
91. Liang, X.-Z., 2010: Integrated Regional Earth System Modeling: Development and Application. Invited<br />
seminar at the Earth System Science Interdisciplinary Center, a joint center between the University of<br />
Maryland and the NASA/Goddard Space Flight Center, College Park, Maryland, March 25.<br />
92. Liang, X.-Z., 2010: Development of Improved Physics Representation for Weather Forecast and<br />
Climate Prediction. Invited seminar at the China Meteorological Administration, Beijing, May 12.<br />
93. Liang, X.-Z., 2010: Integrated Regional Earth System Modeling: Development and Application. Invited<br />
seminar at the School of Resources and Environmental Sciences, East China Normal University,<br />
Shanghai, China, May 14.<br />
94. Liang, X.-Z., M. Xu, X. Yuan, T. Ling, H.I. Choi, F. Zhang, L. Chen, S. Liu,<br />
S. Su, F. Qiao, J.X.L. Wang, K.E. Kunkel, W. Gao, E. Joseph, V. Morris, T.-W. Yu, J. Dudhia, and J.<br />
Michalakes, 2010: Development of CWRF for regional weather and climate prediction: General model<br />
description and basic skill evaluation—first release. Oral presentation at a plenary session of the<br />
11th Annual WRF User’s Workshop, Boulder, CO, June 21-25.<br />
MCA07S033<br />
96. Pogorelov, N.V., Borovikov, S.N., Zank, G.P., Ogino, T., Three-dimensional features of the outer<br />
heliosphere due to coupling between the interstellar and interplanetary magnetic fields. III. The effects<br />
of solar rotation and activity cycle, Astrophys. J., 696, 1478 (2009).<br />
97. Florinski, V.; Balogh, A.; Jokipii, J. R.; McComas, D. J.; Opher, M.; Pogorelov, N. V.; Richardson, J. D.;<br />
Stone, E. C.; Wood, B. E., The Dynamic Heliosphere: Outstanding Issues. Report of Working Groups 4<br />
and 6, Space Sci. Rev., 143, 57 (2009).<br />
98. Zank, G.P., Pogorelov, N.V., Heerikhuisen, J., et al., Physics of the Solar wind – local<br />
interstellar medium interaction: Role of magnetic fields, Space Sci. Rev., DOI 10.1007/s11214-009-9497-6 (2009).<br />
99. Florinski, V., Pogorelov, N.V., Four-dimensional transport of galactic cosmic rays in the outer<br />
heliosphere and heliosheath, Astrophys. J., 701, 642–651 (2009).<br />
100. Pogorelov, N.V., Heerikhuisen, J., Zank, G.P., Mitchell, J.J., Cairns, I.H., Heliospheric asymmetries due<br />
to the action of the interstellar magnetic field, Adv. Space Res., 44, 1337–1344 (2009).<br />
101. McComas, D. J., Allegrini, F., Bochsler, P.; Bzowski, M.; Christian, E. R.; Crew, G. B.; DeMajistre, R.;<br />
Fahr, H.; Fichtner, H.; Frisch, P. C.; Funsten, H. O.; Fuselier, S. A.; Gloeckler, G.; Gruntman, M.;<br />
Heerikhuisen, J.; Izmodenov, V.; Janzen, P.; Knappenberger, P.; Krimigis, S.; Kucharek, H.; Lee, M.;<br />
Livadiotis, G.; Livi, S.; MacDowall, R. J.; Mitchell, D.; Möbius, E.; Moore, T.; Pogorelov, N. V.;<br />
Reisenfeld, D.; Roelof, E.; Saul, L.; Schwadron, N. A.; Valek, P. W.; Vanderspek, R.; Wurz, P.; Zank, G.<br />
P., Global Observations of the Interstellar Interaction from the Interstellar Boundary Explorer (IBEX),<br />
Science, 326, Issue 5955, pp. 959–962 (2009).<br />
102. Schwadron, N. A.; Bzowski, M.; Crew, G. B.; Gruntman, M.; Fahr, H.; Fichtner, H.; Frisch, P. C.;<br />
Funsten, H. O.; Fuselier, S.; Heerikhuisen, J.; Izmodenov, V.; Kucharek, H.; Lee, M.; Livadiotis, G.;<br />
McComas, D. J.; Moebius, E.; Moore, T.; Mukherjee, J.; Pogorelov, N. V.; Prested, C.; Reisenfeld, D.;<br />
Roelof, E.; Zank, G. P., Comparison of Interstellar Boundary Explorer Observations with 3D Global<br />
Heliospheric Models, Science, 326, Issue 5955, 966–968 (2009).<br />
103. Heerikhuisen, J.; Pogorelov, N. V.; Zank, G. P.; Crew, G. B.; Frisch, P. C.; Funsten, H. O.; Janzen, P.<br />
H.; McComas, D. J.; Reisenfeld, D. B.; Schwadron, N. A., Pick-Up Ions in the Outer Heliosheath: A<br />
Possible Mechanism for the Interstellar Boundary EXplorer Ribbon, Astrophys. J., 708, L126-L130<br />
(2010).<br />
104. Zank, G. P.; Heerikhuisen, J.; Pogorelov, N. V.; Burrows, R.; McComas, D., Microstructure of the<br />
Heliospheric Termination Shock: Implications for Energetic Neutral Atom Observations, Astrophys. J.,<br />
708, 1092–1106 (2010).<br />
105. Frisch, Priscilla C.; McComas, D. J.; Allegrini, F.; Bochsler, P.; Bzowski, M.; Christian, E. R.; Crew, G.<br />
B.; DeMajistre, B.; Fahr, H.; Fichtner, H.; Funsten, H.; Fuselier, S. A.; Gloeckler, G.; Gruntman, M.;<br />
Heerikhuisen, J.; Izmodenov, V.; Janzen, P.; Knappenberger, P.; Krimigis, S.; Kucharek, H.; Lee, M.;<br />
Livadiotis, G.; Livi, S.; MacDowall, R. J.; Mitchell, D.; Moebius, E.; Moore, T.; Pogorelov, N. V.;<br />
Reisenfeld, D.; Roelof, E.; Saul, L.; Schwadron, N. A.; Valek, P. W.; Vanderspek, R.; Wurz, P.; Zank, G.<br />
P., First Global Observations Of The Interstellar Interaction From The Interstellar Boundary Explorer<br />
(IBEX), American Astronomical Society, AAS Meeting #215, #415.20 (2009).<br />
106. Borovikov, S.; Pogorelov, N. V.; Hu, Q.; Burlaga, L. F.; Zank, G. P., Numerical modeling of transient<br />
phenomena in the distant solar wind, Eos Trans. AGU, Fall Meet. Suppl., SH21A-1499 (2009).<br />
107. Kucharek, H.; Pogorelov, N. V.; Lee, M. A.; Moebius, E.; Funsten, H. O.; Reisenfeld, D. B.; Janzen, P.,<br />
Pickup Ion Distributions at the Termination Shock: A Numerical Study for IBEX, Eos Trans. AGU, Fall<br />
Meet. Suppl., SH21B-1514 (2009).
108. Frisch, P. C.; Heerikhuisen, J.; Pogorelov, N. V.; Zank, G. P.; Mueller, H.; Crew, G. B.; Demajestre, B.;<br />
Funsten, H. O.; McComas, D. J.; Moebius, E.; Reisenfeld, D. B.; Roelof, E. C.; Schwadron, N. A.;<br />
Slavin, J. D., Studies in the Sensitivity of IBEX to Variable Heliosphere Boundary Conditions, Eos<br />
Trans. AGU, Fall Meet. Suppl., SH21B-1515 (2009).<br />
109. Pogorelov, N. V.; Frisch, P. C.; Heerikhuisen, J.; Zank, G. P.; McComas, D. J.; Schwadron, N. A.;<br />
Christian, E. R.; Crew, G. B.; Demajistre, R.; Gloeckler, G.; Gruntman, M.; Fahr, H.; Fichtner, H.;<br />
Funsten, H. O.; Fuselier, S. A.; Krimigis, S. M.; Kucharek, H.; Lee, M. A.; Moebius, E.; Prested, C. L.;<br />
Reisenfeld, D. B.; Roelof, E. C.; Stone, E. C.; Witte, M., Global Structure of the Heliosphere in the<br />
Interstellar Magnetic Field, Eos Trans. AGU, Fall Meet. Suppl., SH21B-1517 (2009).<br />
110. Pogorelov, N. V.; Suess, S. T.; Ebert, R. W.; Borovikov, S.; McComas, D. J.; Zank, G. P., Solar Cycle<br />
Model Based on Ulysses Measurements (Invited), Eos Trans. AGU, Fall Meet. Suppl., SH24A-02<br />
(2009).<br />
111. Schwadron, N. A.; Bzowski, M.; Crew, G. B.; Gruntman, M.; Fahr, H.; Fichtner, H.; Frisch, P. C.;<br />
Funsten, H. O.; Fuselier, S. A.; Heerikhuisen, J.; Izmodenov, V.; Kucharek, H.; Lee, M. A.; Livadiotis,<br />
G.; McComas, D. J.; Moebius, E.; Moore, T. E.; Mukherjee, J.; Pogorelov, N. V.; Prested, C. L.;<br />
Reisenfeld, D. B.; Roelof, E. C.; Zank, G. P., Comparison of Interstellar Boundary Explorer<br />
Observations with 3-D Global Heliospheric Models, Eos Trans. AGU, Fall Meet. Suppl., SH32A-01<br />
(2009).<br />
112. Mueller, H.; Moebius, E.; Bzowski, M.; Pogorelov, N. V.; Heerikhuisen, J., Toward calculating<br />
heliospheric filtration of interstellar oxygen on its way to IBEX, Eos Trans. AGU, Fall Meet. Suppl.,<br />
SH32A-03 (2009).<br />
113. Heerikhuisen, J.; Pogorelov, N. V.; Zank, G. P.; McComas, D. J.; Funsten, H. O.; Moebius, E.; Fuselier,<br />
S. A.; Schwadron, N. A.; Frisch, P. C., IBEX observations in the context of a global heliospheric model,<br />
Eos Trans. AGU, Fall Meet. Suppl., SH32A-06 (2009).<br />
114. Zank, G. P.; Heerikhuisen, J.; Pogorelov, N. V.; Burrows, R.; McComas, D. J.; Schwadron, N. A.; Frisch,<br />
P. C.; Funsten, H. O.; Moebius, E.; Fuselier, S. A.; Gruntman, M., Pick-up ions, the termination shock,<br />
and energetic neutral atoms (Invited), Eos Trans. AGU, Fall Meet. Suppl., SH32A-07 (2009).<br />
115. Borovikov, S. N., Kryukov, I. A., Pogorelov, N. V., Adaptive mesh refinement on curvilinear grids, in<br />
Astronomical Society of the Pacific Conf. Ser. 406, Numerical Modeling of Space Plasma Flows:<br />
ASTRONUM-2008, ed. N.V. Pogorelov, E. Audit, P. Colella, & G.P. Zank, 127 – 134, ASP, San<br />
Francisco (2009).<br />
116. Pogorelov, N. V., Borovikov, S. N., Florinski, V., Heerikhuisen, J., Kryukov, I. A., Zank, G. P., Multi-scale<br />
fluid-kinetic simulation suite: A tool for efficient modeling of space plasma flows, in Astronomical Society<br />
of the Pacific Conf. Ser. 406, Numerical Modeling of Space Plasma Flows: ASTRONUM-2008, ed. N.V.<br />
Pogorelov, E. Audit, P. Colella, & G.P. Zank, 149 – 159, ASP, San Francisco (2009).<br />
117. Heerikhuisen, J., Pogorelov, N. V., Florinski, V., Zank, G. P., Kharchenko, V., Kinetic modeling of<br />
neutral atom transport in the heliosphere, in Astronomical Society of the Pacific Conf. Ser. 406,<br />
Numerical Modeling of Space Plasma Flows: ASTRONUM-2008, ed. N.V. Pogorelov, E. Audit, P.<br />
Colella, & G.P. Zank, 189 – 198, ASP, San Francisco (2009).<br />
118. Slavin, J. D.; Frisch, P. C.; Heerikhuisen, J.; Pogorelov, N. V.; Mueller, H. -R.; Reach, W. T.; Zank, G.<br />
P.; Dasgupta, B.; Avinash, K., Exclusion of Tiny Interstellar Dust Grains from the Heliosphere,<br />
TWELFTH INTERNATIONAL SOLAR WIND CONFERENCE. AIP Conference Proceedings, Volume<br />
1216, pp. 497-501 (2010).<br />
119. Pogorelov, N.V., Borovikov, S.N., Burlaga, L.F., Ebert, R.W., Heerikhuisen, J., Hu, Q., McComas, D.J.,<br />
Suess, S.T., Zank, G.P., Transient Phenomena in the Distant Solar Wind and in the Heliosheath,<br />
TWELFTH INTERNATIONAL SOLAR WIND CONFERENCE. AIP Conference Proceedings, Volume<br />
1216, pp. 559-562 (2010).<br />
Biological and Critical Systems<br />
MCA00N019<br />
120. S. K. Easley, M. G. Jekir, A. J. Burghardt, M. Li, and T. M. Keaveny. Contribution of the intra-specimen<br />
variations in tissue mineralization to PTH-and raloxifene-induced changes in stiffness of rat vertebrae.<br />
Bone, 46:1162-1169, 2010.<br />
121. A. J. Fields, G. L. Lee, X.S. Liu, M.G. Jekir, X.E. Guo, and T. M. Keaveny. Influence of vertical<br />
trabeculae on the compressive strength of the human vertebra. Journal of Bone and Mineral Research,<br />
2010 Aug 16. [Epub ahead of print].<br />
122. A. J. Fields, G. L. Lee, and T. M. Keaveny. Mechanisms of initial endplate failure in the human vertebral<br />
body. Journal of Biomechanics, 2010 Sep 2. [Epub ahead of print].
123. J.C. Fox, A. Gupta, H.H. Bayraktar, A.L. Romens, D. Newitt, S. Majumdar and T.M. Keaveny. Role of<br />
trabecular bone elastic anisotropy in the structural behaviour of the proximal femur. Computer Methods<br />
in Biomechanics and Biomedical Engineering (2010) In Review.<br />
124. S.K. Easley, M.T. Chang, D. Shindich, C.J. Hernandez and T.M. Keaveny. Biomechanical Effects of<br />
Simulated Resorption Cavities in Trabecular Bone Across a Wide Range of Bone Volume Fraction.<br />
Journal of Bone and Mineral Research (2010) In Review<br />
125. A.J. Fields, G.L. Lee, X.S. Liu, M.G. Jekir, X.E. Guo and T.M. Keaveny. Vertebral compressive strength is<br />
explained by the apparent density of the trabeculae that are vertically oriented. 56th Annual Meeting of<br />
the Orthopaedic Research Society, New Orleans, LA, 2010.<br />
126. X.S.Liu, A.J.Fields, T.M.Keaveny, E.Shane and X.E.Guo. µCT/HR-pQCT Image based plate-rod<br />
microstructural finite element model efficiently predicts the elastic moduli and yield strength of human<br />
trabecular bone. Proceedings of the American Society of Mechanical Engineers, Naples FL, 2010.<br />
127. S. Nawathe, A.L. Romens, A.J. Fields, B. Roberts, M.L. Bouxsein and T.M. Keaveny. Cortical-Trabecular<br />
load sharing and high risk tissue distribution in the human proximal femur. 57th Annual Meeting of the<br />
Orthopaedic Research Society, Long Beach, CA, 2011.<br />
Chemical, Thermal Systems<br />
ASC040030<br />
128. M. Zhou, O. Sahni, M. S. Shephard, C.D. Carothers and K.E. Jansen, “Adjacency-based Data<br />
Reordering Algorithm for Acceleration of Finite Element Computations”, Scientific Programming,<br />
18(2):107-123, 2010.<br />
129. J.C. Vaccaro, Y. Elimelechi, Y. Chen, O. Sahni, M. Amitay and K.E. Jansen, “On the Interaction of<br />
Control Jets in a Short Duct”, Proceedings of the 5th AIAA Flow Control Conference, Chicago, IL,<br />
June/July 2010.<br />
130. K.E. Jansen, O. Sahni, A. Ovcharenko, M.S. Shephard and M. Zhou, “Adaptive Computational Fluid<br />
Dynamics: Petascale and Beyond”, Journal of Physics: Conference Series, (submitted July 2010).<br />
131. M. Zhou, O. Sahni, T. Xie, M.S. Shephard and K.E. Jansen, “Unstructured Mesh Partition Improvement<br />
for Implicit Finite Element at Extreme Scale”, Journal of Supercomputing, (submitted Feb. 2010).<br />
132. M. Zhou, T. Xie, S. Seol, M.S. Shephard, O. Sahni and K.E. Jansen, “Tools to Support Mesh Adaptation<br />
on Massively Parallel Computers”, Engineering with Computers, (submitted Jan. 2010).<br />
133. O. Sahni, M. Zhou, M.S. Shephard and K.E. Jansen, “Scalable Implicit Finite Element Solver for<br />
Massively Parallel Processing with Demonstration to 160K cores”, Proceedings of the 2009 ACM/IEEE<br />
Conference on Supercomputing, 2009 ACM Gordon Bell Finalist, Portland, OR, Nov. 2009.<br />
134. J.C. Vaccaro, Y. Elimelechi, Y. Chen, O. Sahni, M. Amitay and K.E. Jansen, “Experimental and<br />
Numerical Investigation on the Flow Control of a Low Aspect Ratio Short Duct”, Journal of Fluid<br />
Mechanics, 2010 (under preparation)<br />
135. J.C. Vaccaro, O. Sahni, J. Olles, K.E. Jansen, and M. Amitay, “Experimental and Numerical<br />
Investigation of Active Control of Inlet Ducts”, International Journal of Flow Control, 1(2):133-154, 2009.<br />
136. J. Wood, O. Sahni, K.E. Jansen and M. Amitay, “Experimental and Numerical Investigation of Active<br />
Control of 3-D Flows”, Proceedings of the 39th AIAA Fluid Dynamics Conference, San Antonio, TX,<br />
June 2009.<br />
137. O. Sahni, J. Olles and K.E. Jansen, “Simulation of Flow Control in a Serpentine Duct”, Proceedings of<br />
the 47th AIAA Aerospace Sciences Meeting and Exhibit, Orlando, FL, Jan. 2009.<br />
138. O. Sahni, J. Wood, K.E. Jansen and M. Amitay, “3-D Interactions between Finite-span Synthetic Jet and<br />
Cross Flow at a Low Reynolds Number and Angle of Attack”, (accepted), Journal of Fluid Mechanics,<br />
2010.<br />
139. O. Sahni, C.D. Carothers, M.S. Shephard, and K.E. Jansen, “Strong Scaling Analysis of a Parallel,<br />
Unstructured, Implicit Solver and the Influence of the Operating System Interference”, Scientific<br />
Programming, 17(3):261-274, 2009.<br />
140. M. Zhou, O. Sahni, H.J. Kim, C.A. Figueroa, C.A. Taylor, M.S. Shephard and K.E. Jansen,<br />
“Cardiovascular Flow Simulations at Extreme Scale”, Computational Mechanics, 46(1):71-82, 2010.<br />
141. M. Zhou, O. Sahni, K.D. Devine, M.S. Shephard and K.E. Jansen, “Controlling Unstructured Mesh<br />
Partitions for Massively Parallel Simulations”, SIAM Journal on Scientific Computing, (accepted, under<br />
print), 2010.<br />
142. M. Zhou, C.A. Figueroa, H.J. Kim, C.A. Taylor, O. Sahni and K.E. Jansen, “Parallel Adaptive<br />
Computation of Blood Flow in a 3D ‘Whole’ Body Model”, Proceedings of the 2009 ASME Summer<br />
Bioengineering Conference, Lake Tahoe, CA, June 2009.<br />
143. M. Rasquin, N. Mati, O. Sahni and K.E. Jansen, “Numerical Investigation of the Interaction between a
Finite-span Synthetic Jet and a Cross Flow over a Swept Wing”, 63rd Annual Meeting of the APS<br />
Division of Fluid Dynamics, Long Beach, CA, Nov. 2010 (to be held).<br />
144. K.E. Jansen, O. Sahni, A. Ovcharenko, M.S. Shephard and M. Zhou, “Adaptive Computational Fluid<br />
Dynamics: Petascale and Beyond”, The SciDAC 2010 Conference, Chattanooga, TN, July 2010.<br />
145. Onkar Sahni, Kenneth Jansen, Min Zhou and Mark S. Shephard, “Scalable Massively Parallel Implicit<br />
Simulations of Fluid Flows to over 250,000 Processor-cores”, SIAM Conference on Parallel Processing,<br />
Seattle, WA, Feb. 2010.<br />
146. Min Zhou, Onkar Sahni, Karen D. Devine, Mark S. Shephard and Kenneth Jansen, “Improved<br />
Unstructured Mesh Partitions for Parallel Simulations at Extreme Scale”, SIAM Conference on Parallel<br />
Processing, Seattle, WA, Feb. 2010.<br />
147. Mark S. Shephard, Aleksandr Ovcharenko, Ting Xie, Seegyoung Seol, Min Zhou, Onkar Sahni, and<br />
Kenneth Jansen, “Unstructured Mesh Adaptation on Massively Parallel Computers”, SIAM Conference<br />
on Parallel Processing, Seattle, WA, Feb. 2010.<br />
148. J.C. Vaccaro, Y. Elimelechi, Y. Chen, O. Sahni, M. Amitay, and K.E. Jansen, “On the Interaction of<br />
Control Jets in a Short Duct”, 5th AIAA Flow Control Conference, Chicago, IL, June/July 2010.<br />
149. O. Sahni, M. Zhou, M.S. Shephard, and K.E. Jansen, “Scalable Implicit Finite Element Solver for<br />
Massively Parallel Processing with Demonstration to 160K cores”, 2009 ACM/IEEE Conference on<br />
Supercomputing, Gordon Bell Finalist, Portland, OR, Nov. 2009.<br />
150. O. Sahni, K.E. Jansen and M. Amitay, “3-D Interactions of Synthetic Jets and Cross-Flows CFD”, 62nd<br />
Annual Meeting of the APS Division of Fluid Dynamics, Minneapolis, MN, Nov. 2009.<br />
151. K.E. Jansen, O. Sahni and M. Amitay, “Numerical Investigation of Actuators for Flow Control in Inlet<br />
Ducts”, 62nd Annual Meeting of the APS Division of Fluid Dynamics, Minneapolis, MN, Nov. 2009.<br />
152. O. Sahni, M. Amitay, M.S. Shephard and K.E. Jansen, “Investigation of Active Control of 3D Flows<br />
using Adaptive Simulations”, 10th US National Congress on Computational Mechanics, Columbus, OH,<br />
July 2009.<br />
153. M. Zhou, O. Sahni, C.A. Figueroa, H.J. Kim, C.A. Taylor and K.E. Jansen “Parallel Adaptive<br />
Computation of Blood Flow in a 3D ‘Whole’ Body Model”, 10th US National Congress on Computational<br />
Mechanics, Columbus, OH, July 2009.<br />
154. K.E. Jansen, M.S. Shephard, O. Sahni and M. Zhou, “Petascale Adaptive Implicit Computational Fluid<br />
Dynamics”, 10th US National Congress on Computational Mechanics, Columbus, OH, July 2009.<br />
155. M.S. Shephard, K.E. Jansen, O. Sahni, X.J. Luo, M. Zhou, A. Ovcharenko and T. Xie, “Recent<br />
Advances and Experiences in Adaptive Mesh Generation and Control for Large-Scale Simulations”,<br />
Keynote Talk, 10th US National Congress on Computational Mechanics, Columbus, OH, July 2009.<br />
156. J. Wood, O. Sahni, K.E. Jansen and M. Amitay, “Experimental and Numerical Investigation of Active<br />
Control of 3-D Flows, 39th AIAA Fluid Dynamics Conference, San Antonio, TX, June 2009.<br />
157. K.E. Jansen, O. Sahni, M.S. Shephard and M. Zhou, “Anisotropic, Adaptive, Implicit CFD”, Science<br />
Track Talk, TeraGrid’09 Meeting, Arlington, VA, June 2009.<br />
158. M. Zhou, C.A. Figueroa, H.J. Kim, C.A. Taylor, O. Sahni and K.E. Jansen, “Parallel Adaptive<br />
Computation of Blood Flow in a 3D ‘Whole’ Body Model”, 2009 ASME Summer Bioengineering<br />
Conference, Lake Tahoe, CA, June 2009.<br />
159. M. Zhou, O. Sahni, M.S. Shephard and K.E. Jansen, “Local Partition Modification for Improved Parallel<br />
Finite Element Computations”, Finalist in 2nd BGCE Student Paper Prize, 2009 SIAM Conference on<br />
Computational Science and Engineering, Miami, FL, March 2009.<br />
160. K.E. Jansen, O. Sahni, C.D. Carothers and M.S. Shephard, “Performance Analysis of an Adaptive<br />
Computational Fluid Dynamics Solver on Near Petascale Supercomputer Systems”, 2009 SIAM<br />
Conference on Computational Science and Engineering, Miami, FL, March 2009.<br />
CHE100005<br />
161. S. Aboud, J. Wilcox, A density functional theory study of the charge state of hydrogen in metal-hydrides,<br />
J. Phys. Chem. C, 2010.<br />
162. B. Padak and J. Wilcox, Understanding Mercury Binding on Activated Carbon, Carbon, 2009, 47(12),<br />
2855-2864.<br />
163. E. Sasmaz, S. Aboud, J. Wilcox, Hg Binding on Pd Binary Alloys and Overlays, J. Phys. Chem. C,<br />
2009, 113(18), 7813-7820.<br />
164. J. Wilcox, A Kinetic Investigation of High-Temperature Mercury Oxidation by Chlorine, J. Phys. Chem.<br />
A, 2009, 113, 6633-6639.<br />
CTS030005<br />
165. Bernard, P.S. (2009) “Vortex filament simulation of the turbulent coflowing jet,” Phys. Fluids 21, 025107.
166. Bernard, P.S., Collins, J.P. and Potts, M. (2009) “Vortex filament simulation of the turbulent boundary layer,”<br />
AIAA Paper 2009-3547.<br />
167. Bernard, P.S., Collins, J.P. and Potts, M. (2010) “Vortex filament simulation of the turbulent boundary layer,”<br />
AIAA Journal, 48, 1757-1771.<br />
168. Bernard, P.S. (2010) “Vortex furrows in boundary layer transition and turbulence,” Physics of Fluids, under<br />
review.<br />
CTS080002<br />
169. A.Ö. Yazaydin, R.Q. Snurr, T.-H. Park, K. Koh, J. Liu, M.D. LeVan, A.I. Benin, P. Jakubczak, M.<br />
Lanuza, D.B. Galloway, J.J. Low, R.R. Willis, “Screening of metal-organic frameworks for carbon<br />
dioxide capture from flue gas using a combined experimental and modeling approach,” J. Am. Chem.<br />
Soc. 131, 18198-18199 (2009).<br />
170. X.Y. Bao, L.J. Broadbelt, R.Q. Snurr, “Elucidation of consistent enantioselectivity for a homologous<br />
series of chiral compounds in homochiral metal-organic frameworks,” Phys. Chem. Chem. Phys. 12,<br />
6466-6473 (2010). (Special issue on Characterization of Adsorbed Species)<br />
171. C.E. Wilmer, R.Q. Snurr, “Towards rapid computational screening of metal-organic frameworks for<br />
carbon dioxide capture: calculation of framework charges via charge equilibration,” Chem. Eng. J., in<br />
press. (Special issue on CO2 Capture)<br />
172. G.A.E. Oxford, R.Q. Snurr, L.J. Broadbelt, “Hybrid quantum mechanics/molecular mechanics<br />
investigation of (salen)Mn for use in metal-organic frameworks,” Ind. Eng. Chem. Res., in press.<br />
173. G.A.E. Oxford, D. Dubbeldam, L.J. Broadbelt, R.Q. Snurr, “Elucidating steric effects on<br />
enantioselective epoxidation catalyzed by (salen)Mn in metal-organic frameworks,” J. Molec. Catal. A,<br />
submitted.<br />
174. X.Y. Bao, R.Q. Snurr, L.J. Broadbelt, “Insights into the complexity of chiral recognition by a three-point<br />
model,” Langmuir, in preparation.<br />
175. A.N. Dickey, A.Ö. Yazaydin, R.R. Willis, R.Q. Snurr, “Screening CO2/N2 selectivity in metal-organic<br />
frameworks using Monte Carlo simulations,” Energy & Environmental Science, in preparation. (Invited<br />
article)<br />
CTS080046<br />
176. H. Wang, S. B. Pope, D. A. Caughey. Central-difference schemes on non-uniform grids and their<br />
applications in large-eddy simulations of turbulent jets and jet flames. Journal of Computational Physics,<br />
(to be submitted), 2010.<br />
177. K. A. Kemenov, H. Wang, S. B. Pope. Modeling effects of subgrid-scale mixture fraction variance in<br />
LES of a piloted diffusion flame. Combustion Theory and Modelling, (submitted), 2010.<br />
178. D. H. Rowinski, S. B. Pope. PDF calculations of piloted premixed flames. Combustion Theory and<br />
Modelling, (submitted), 2010.<br />
179. H. Wang, S. B. Pope. Large eddy simulation/probability density function modeling of a turbulent<br />
CH4/H2/N2 jet flame. Proceedings of the Combustion Institute (accepted), 2010.<br />
180. K. A. Kemenov, S. B. Pope. Molecular diffusion effects in LES of a piloted methane-air flame.<br />
Combustion and Flame (in press), doi:10.1016/j.combustflame.2010.08.014, 2010.<br />
CTS100002<br />
181. Zheng, X., Xue, Q., Mittal, R., and Bielamowicz, S., "A Coupled Sharp-Interface Immersed-<br />
Boundary-Finite-Element Method for Flow-Structure Interaction with Application to Human<br />
Phonation," J. Biomechanical Engineering, in press.<br />
182. Zheng, X., Mittal, R. and Bielamowicz, S. "A Computational Study of Asymmetric Glottal Jet<br />
Deflection During Phonation," J. Acoustical Society of America, in review.<br />
183. Zheng, X., Mittal, R., Xue, Q. and Bielamowicz, S., "Direct-Numerical Simulation of the Glottal Jet<br />
and Vocal-Fold Dynamics in a Three-Dimensional Laryngeal Model," J. Acoustical Society of<br />
America, in review.<br />
184. Xue, Q., Mittal, R. and Zheng, X., "A Computational Study of the Effect of Vocal-fold<br />
Asymmetry on Phonation," J. Acoustical Society of America, Vol. 128, No. 2, 2010, pp. 818-827.<br />
185. Seo, J. H. and Mittal, R., "A New Immersed Boundary Method for Aeroacoustic Sound
Prediction around Complex Geometries," 40th AIAA Fluid Dynamics Conference & Exhibit, AIAA<br />
paper 2010-4434.<br />
186. Seo, J. H. and Mittal, R., "A High-order Immersed Boundary Method for Acoustic Wave<br />
Scattering and Low Mach number Flow Induced Sound in Complex Geometries," J. of Computational<br />
Physics, in press.<br />
187. Zheng, L., Mittal, R., and Hedrick, T., "A Search for Optimal Wing Strokes in Flapping<br />
Flight: Can Engineers Improve upon Nature?" 28th AIAA Applied Aerodynamics Conference, AIAA<br />
paper 2010-4944.<br />
CTS100003<br />
188. Journal of Computational Physics: A 3D Parallel Adaptive Mesh Refinement Method for Fluid Structure<br />
Interaction (in preparation)<br />
189. Journal of Experimental Biology: Numerical investigation of the hydrodynamics of oscillatory batoid<br />
swimming (in preparation)<br />
190. Journal of Experimental Biology: Numerical investigation of the hydrodynamics of undulatory batoid<br />
swimming (in preparation)<br />
191. 63rd Annual Meeting of the APS Division of Fluid Dynamics: Numerical investigation of the 3D flow field<br />
generated by a self-propelling manta ray<br />
192. SICB Annual Meeting 2011: A numerical investigation of the hydrodynamic signature of batoid<br />
swimming<br />
CTS100012<br />
193. Clausen, J. R., Aidun, C. K., “Capsule dynamics and rheology in shear flow: Particle pressure and<br />
normal stress,” accepted by Physics of Fluids, Aug. 2010.<br />
194. Aidun, C. K., Clausen, J. R., Reasor, D. A., Wu, J., “Visualizing Deformable Suspension Simulations in<br />
ParaView,” Kitware Inc., The Source online magazine, www.kitware.com/products/thesource.html,<br />
October 2010.<br />
195. Reasor, D. A., Clausen, J. R., Aidun, C. K., “Coupling the lattice-Boltzmann and spectrin-link methods<br />
for the direct numerical simulation of cellular blood flow,” under consideration for International Journal<br />
for Numerical Methods in Fluids, June 2010.<br />
196. Wu J., Yun B.M., Fallon A.M., Hanson S.R., Aidun C.K., Yoganathan A.P., “Numerical investigation of<br />
the effects of channel geometry on platelet activation and blood damage,” accepted by Annals of<br />
Biomedical Engineering, October 2010.<br />
197. Khan, I., Aidun, C. K., “Direct numerical simulation of flow through deformable porous media using<br />
parallel hybrid Lattice Boltzmann and Finite Element method,” under review for International Journal of<br />
Numerical Methods in Engineering, 2010.<br />
198. Khan, I., Aidun, C. K., “Modeling the macroscopic behavior of saturated deformable porous media using<br />
direct numerical simulations,” under review for International Journal of Multiphase Flow, 2010.<br />
199. Clausen, J. R., Reasor, D. A., Aidun, C. K., “The rheology and microstructure of concentrated<br />
noncolloidal suspensions of deformable capsules,” in preparation for Journal of Fluid Mechanics, Fall<br />
2010.<br />
200. Wu J., Yun B.M., Simon H.A., Sotiropoulos F., Aidun C.K., Yoganathan A.P., “A numerical investigation<br />
of blood damage in the hinge area of BMHV during mid-diastole,” in preparation for Annals of<br />
Biomedical Engineering, 2010.<br />
201. Yun B. M., Aidun C. K., “Lateral migration of a deformable spherical particle in Poiseuille flow,” in<br />
preparation, 2010.<br />
202. Reasor D. A., Clausen J. R., Aidun, C. K., “Rheological Characterization of Cellular Blood in Shear,” in<br />
preparation, 2010.<br />
203. Aidun, C. K., Clausen, J. R., Reasor, D. A., “Direct Numerical Simulation of Red Blood Cells in Shear<br />
Flow,” 14th SIAM Conference on Parallel Processing for Scientific Computing, Seattle, WA; Feb. 2010.<br />
204. Clausen, J. R., Reasor, D. A., Wu, J., Aidun, C. K., “DNS of deformable capsules, fibers, and particles<br />
suspended in flow,” Winner of the Outstanding Video Presentation, 7th International Conference for<br />
Multiphase Flow, Tampa, FL, 2010.<br />
205. Clausen, J. R., Reasor, D. A., Aidun, C. K., “The Effect of Particle Deformation on the Rheology of<br />
Noncolloidal Suspensions,” 7th International Conference for Multiphase Flow, Tampa, FL; May 2010.<br />
206. Reasor, D. A., Clausen, J. R., Aidun, C. K., “Red Blood Cell Concentration Distributions in Rigid<br />
Arteries,” 7th International Conference for Multiphase Flow, Tampa, FL; May 2010.<br />
207. Khan, I., Aidun, C. K., “Direct Numerical Simulation of Flow in Deformable Porous Media using a Hybrid<br />
Lattice Boltzmann and Finite Element Method,” 7th International Conference for Multiphase Flow,<br />
Tampa, FL; May 2010.<br />
208. Wu, J., Aidun, C. K., “A method for direct simulation of deformable particle suspensions using lattice–<br />
Boltzmann equation with external boundary force,” 7th International Conference for Multiphase Flow,<br />
Tampa, FL; May 2010.<br />
209. Khan, I., Aidun, C. K., “Direct Numerical Simulation of Flow in Deformable Porous Media,” 7th<br />
International Conference for Multiphase Flow, Tampa, FL; May 2010.<br />
210. Wu J., Yun B. M., Fallon A. M., Simon H. A., Aidun C. K., Yoganathan A. P.: Numerical investigation of<br />
blood damage in the hinge area of bileaflet mechanical heart valves. American Society of Mechanical<br />
Engineers: Summer Bioengineering Conference, Naples, FL. June 2010.<br />
211. Clausen, J. R., Reasor, D. A., Aidun, C. K., “The direct numerical simulation of dense suspensions of<br />
deformable particles,” in preparation for the 82nd Annual Meeting of the Society of Rheology, Santa Fe,<br />
NM; Oct. 2010.<br />
212. Clausen, J. R., Reasor, D. A., Aidun, C. K., “The Rheology and Microstructure of Dense Suspensions of<br />
Elastic Capsules,” in preparation for the 63rd Annual APS Division of Fluid Dynamics Meeting, Long<br />
Beach, CA; Nov. 2010.<br />
213. Reasor, D. A., Clausen, J. R., Aidun, C. K., “Rheological characterization of cellular blood via a hybrid<br />
lattice-Boltzmann / coarse-grained spectrin-link method,” in preparation for the 63rd Annual APS Division<br />
of Fluid Dynamics Meeting, Long Beach, CA; Nov. 2010.<br />
214. Yun B. M., Wu J., Simon H. A., Sotiropoulos F., Aidun C. K., Yoganathan A. P. “A numerical<br />
investigation of blood damage in the hinge area of bileaflet mechanical heart valves,” in preparation for<br />
the 63rd Annual APS Division of Fluid Dynamics Meeting, Long Beach, CA. Nov. 2010.<br />
Chemistry<br />
CHE030089<br />
215. Meng, L.; Wang, S. C.; Fettinger, J. C.; Kurth, M. J.; Tantillo, D. J. Eur. J. Org. Chem. 2009, 1578-1584:<br />
"Controlling Selectivity for Cycloadditions of Nitrones and Alkenes Tethered by Benzimidazoles:<br />
Combining Experiment and Theory"<br />
216. Ho, G. A.; Nouri, D. H.; Tantillo, D. J. Tetrahedron Lett. 2009, 50, 1578-1581: "Carbocation<br />
Rearrangements in Aspernomine Biosynthesis"<br />
217. Hong, Y. J.; Tantillo, D. J. J. Am. Chem. Soc. 2009, 131, 7999-8015: "Consequences of Conformational<br />
Preorganization in Sesquiterpene Biosynthesis. Theoretical Studies on the Formation of the Bisabolene,<br />
Curcumene, Acoradiene, Zizaene, Cedrene, Duprezianene, and Sesquithuriferol Sesquiterpenes"<br />
218. Lodewyk, M. W.; Kurth, M. J.; Tantillo, D. J. J. Org. Chem. 2009, 74, 4804-4811: "Mechanisms for<br />
Formation of Diazocinones, Pyridazines, and Pyrazolines from Tetrazines - Oxyanion-Accelerated<br />
Pericyclic Cascades"<br />
219. Hong, Y. J.; Tantillo, D. J. Nature Chem. 2009, 1, 384-389: "A Potential Energy Surface Bifurcation in<br />
Terpene Biosynthesis." This article was highlighted in Nature Chemical Biology (2009, 5, 614).<br />
220. Wang, S. C.; Troast, D.; Sheridan-Conda, M.; Zuo, G.; LaGarde, D.; Louie, J.; Tantillo,<br />
D. J. J. Org. Chem. 2009, 74, 7822-7833: "Mechanism of the Ni(0)-Catalyzed<br />
Vinylcyclopropane - Cyclopentene Rearrangement"<br />
221. Hong, Y. J.; Tantillo, D. J. Org. Biomol. Chem. 2009, 7, 4101-4109: "Modes of Inactivation of<br />
Trichodiene Synthase by a Cyclopropane-Containing Farnesyldiphosphate Analog"<br />
222. Wang, S. C.; Beal, P. A.; Tantillo, D. J. J. Comput. Chem. 2010, 31, 721-725: "Covalent Hydration<br />
Energies for Purine Analogs by Quantum Chemical Methods"<br />
223. Gribanova, T. N.; Starikov, A. G.; Minyaev, R. M.; Minkin, V. I.; Siebert, M. R.; Tantillo,<br />
D. J. Chem. Eur. J. 2010, 16, 2272-2281: "Sandwich Compounds of Transition Metals with<br />
Cyclopolyenes and Isolobal Boron Analogues"<br />
224. Davis, R. L.; Tantillo, D. J. J. Org. Chem. 2010, 75, 1693-1700: "Dissecting a Dyotropic<br />
Rearrangement"<br />
225. Tantillo, D. J. Org. Lett. 2010, 12, 1164-1167: "How an Enzyme Might Accelerate an Intramolecular<br />
Diels-Alder Reaction - Theozymes for the Formation of Salvileucalin B"
226. Hong, Y. J.; Tantillo, D. J. J. Am. Chem. Soc. 2010, 132, 5375-5386: "Formation of Beyerene, Kaurene,<br />
Trachylobane and Atiserene Diterpenes by Rearrangements that Avoid Secondary Carbocations"<br />
227. Tantillo, D. J. Chem. Soc. Rev. 2010, 39, 2847-2854: "The Carbocation Continuum in Terpene<br />
Biosynthesis - Where are the Secondary Cations?" invited tutorial review.<br />
228. Siebert, M. R.; Tantillo, D. J. J. Phys. Org. Chem. 2010, in press: "Fundamental Properties of N-<br />
Alkenylaziridines - Implications for the Design of New Reactions and Organocatalysts"<br />
229. Hong, Y. J.; Tantillo, D. J. Chem. Sci. 2010, in press: "A Tangled Web - Interconnecting Pathways to<br />
Amorphadiene and the Amorphene Sesquiterpenes"<br />
230. Gutierrez, O.; Tantillo, D. J. Organometallics 2010, 29, 3541-3545: "Transition Metal Intervention for a<br />
Classic Reaction - Assessing the Feasibility of Ni(0)-promoted [1,3] Sigmatropic Shifts of<br />
Bicyclo[3.2.0]hept-2-enes"<br />
231. Hong, Y. J.; Tantillo, D. J. Org. Biomol. Chem. 2010, 8, 4589-4600: "Quantum Chemical Dissection of<br />
The Classic Terpinyl/Pinyl/Bornyl/Camphyl Cation Conundrum - The Role of Pyrophosphate in<br />
Manipulating Pathways to Monoterpenes"<br />
232. Siebert, M. R.; Osbourn, J. M.; Brummond, K. M.; Tantillo, D. J. J. Am. Chem. Soc. 2010, 132, 11952-<br />
11966: "Differentiating Mechanistic Possibilities for the Thermal, Intramolecular [2 + 2] Cycloaddition of<br />
Allene-ynes"<br />
233. Siebert, M. R.; Yudin, A. K.; Tantillo, D. J. Eur. J. Org. Chem. 2010, in press: "On the Mechanism of the<br />
Rh(I)-Catalyzed Rearrangement of N-Allylaziridines to (Z)-N-Alkenylaziridines"<br />
234. Davis, R. L.; Tantillo, D. J. Curr. Org. Chem. 2010, 14, 1561-1577: "Theoretical Studies on<br />
Pentadienyl Cation Electrocyclizations," invited review for special issue on Molecular Simulations in<br />
Organic Chemistry.<br />
CHE070035<br />
235. Olga Dmitrenko* and Colin Thorpe, "A Computational Analysis of the Interaction between Flavin and<br />
Thiol(ate) Groups: Implications for Flavoenzyme Catalysis," Submitted to Journal of Sulfur Chemistry<br />
236. C.W. Muller, J.J. Newby, C.P. Liu, C.P. Rodrigo and T.S. Zwier Duschinsky mixing between four<br />
non-totally symmetric normal coordinates in the S1-S0 vibronic structure of (E)-phenylvinylacetylene: a<br />
quantitative analysis Phys. Chem. Chem. Phys., 2010, DOI: 10.1039/b919912h<br />
238. W.H. James III, E.E. Baquero, S.H. Choi, S.H. Gellman and T.S. Zwier Laser spectroscopy of<br />
conformationally constrained alpha/beta-peptides: Ac-ACPC-Phe-NHMe and Ac-Phe-ACPC-NHMe J.<br />
Phys. Chem. A., 2010, 114, 1581-1591<br />
239. Chirantha P. Rodrigo, Christian Muller, William H. James III, Nathan R. Pillsbury and Timothy S.<br />
Zwier Conformational Isomerization of bis-(4-hydroxyphenyl)methane in a Supersonic Jet<br />
Expansion, Part I: Low Barrier Potential Energy Surface in the S0 State OSU International Symposium<br />
on Molecular Spectroscopy, 2009<br />
240. J.J. Newby, C.W. Muller, C.P. Liu and T. S. Zwier Jet-cooled vibronic spectroscopy and asymmetric<br />
torsional potentials of phenylcyclopentene Phys. Chem. Chem. Phys., 2009, 11, 8330-8341<br />
241. J.J. Newby, C.P. Liu, C.W. Muller and T. S. Zwier Jet-cooled vibronic spectroscopy of potential<br />
intermediates along the pathway to PAH: phenylcyclopenta-1,3-diene Phys. Chem. Chem. Phys., 2009,<br />
11, 8316-8329<br />
242. John M. Bailey, Catherine E. Check and Thomas M. Gilbert Computational Studies of Pericyclic<br />
Reactions between Aminoalanes and Ethyne, Ethene, and Dienes. A Reactive Aminoalane That Should<br />
Prefer [2 + 2] and [4 + 2] Cyclizations to Dimerization Organometallics, 2009, 28 (3), pp 787-794<br />
243. J. Suarez, S.C. Farantos, S. Stamatiadis and L. Lathouwers A Method for Solving the Molecular<br />
Schrödinger Equation in Cartesian Coordinates via Angular Momentum Projection Operators Computer<br />
Physics Communications Volume 180, Issue 11, November 2009, Pages 2025-2033<br />
244. Nathan R. Pillsbury, Christian W. Muller and Timothy S. Zwier Conformational Isomerization and<br />
Collisional Cooling Dynamics of Bis(2-hydroxyphenyl)methane J. Phys. Chem. A, 2009, 113 (17), pp<br />
5013-5021<br />
245. Nathan R. Pillsbury and Timothy S. Zwier Conformational Isomerization of 5-Phenyl-1-pentene Probed<br />
by SEP-Population Transfer Spectroscopy J. Phys. Chem. A, 2009, 113 (1), pp 126-134<br />
246. R. Bach Ring Strain Energy in the Cyclooctyl System. The Effect of Strain Energy on [3 + 2]<br />
Cycloaddition Reactions with Azides J. Am. Chem. Soc., 2009, 131 (14), pp 5233-5243<br />
247. Sylvian Cadars, Darren H. Brouwer and Bradley F. Chmelka Probing Local Structures of Siliceous<br />
Zeolite Frameworks by Solid-state NMR and First-principles Calculations of 29Si-O-29Si Scalar<br />
Couplings Phys. Chem. Chem. Phys., 2009, 11, 1825-1837<br />
248. T.A. LeGreve, W.H. James III and T.S. Zwier Solvent effects on the conformational preferences of<br />
serotonin: serotonin-(H2O)n, n=1,2 J. Phys. Chem. A., 2009, 113, 399-410
249. V.A. Shubert, C.W. Muller and T.S. Zwier Water's Role in Reshaping a Macrocycle's Binding Pocket:<br />
Infrared and ultraviolet spectroscopy of benzo-15-crown-5-(H2O)n and 4-aminobenzo-15-crown-5-<br />
(H2O)n, n=1, 2 J. Phys. Chem. A, 2009, 113, 8067-8079<br />
250. V. Alvin Shubert, William H. James III and Timothy S. Zwier Jet-Cooled Electronic and Vibrational<br />
Spectroscopy of Crown Ethers: Benzo-15-Crown-5 Ether and 4'-Amino-Benzo-15-Crown-5 Ether J.<br />
Phys. Chem. A, 2009, 113 (28), pp 8055-8066<br />
251. Wei Huang, Joohyun Kim, Shantenu Jha and Fareed Aboul-ela A Mechanism for S-adenosyl<br />
Methionine Assisted Formation of a Riboswitch Conformation: a Small Molecule with a Strong<br />
Arm Nucleic Acids Res. 2009 Oct; 37(19):6528-39<br />
252. William H. James III, Esteban E. Baquero, V. Alvin Shubert, Soo Hyuk Choi, Samuel H. Gellman and<br />
Timothy S. Zwier Single-Conformation and Diastereomer Specific Ultraviolet and Infrared Spectroscopy<br />
of Model Synthetic Foldamers: α/β-Peptides J. Am. Chem. Soc., 2009, 131 (18), pp 6574-6590<br />
253. Yong-Hui Tian and Miklos Kertesz Ladder-Type Polyenazine Based on Intramolecular S⋅⋅⋅N<br />
Interactions: A Theoretical Study of a Small-Bandgap Polymer Macromolecules, 2009, 42 (16), pp<br />
6123-6127<br />
254. Yong-Hui Tian and Miklos Kertesz Molecular Actuators Designed with S∴N(sp2) Hemibonds Attached to<br />
a Conformationally Flexible Pivot Chem. Mater., 2009, 21 (10), pp 2149-2157<br />
255. Yong-Hui Tian and Miklos Kertesz Low-Bandgap Pyrazine Polymers: Ladder-Type Connectivity by<br />
Intramolecular S⋅⋅⋅N(sp2) Interactions and Hydrogen Bonds Macromolecules, 2009, 42 (7), pp 2309-<br />
2312<br />
256. R. Dooley and S. Pamidighantam After the Dance: Hope for CI Software<br />
Sustainability Cyberinfrastructure Software Sustainability and Reusability Workshop, Mar 2009<br />
257. HP-CAST meeting, Krakow, Poland<br />
CHE070060<br />
258. “The Extraordinary Stability Imparted to Silver Monolayers by Chloride: Novel Atomically-flat Surfaces”<br />
Erin V. Iski, Mahnaz El-Kouedi, Camilo Calderon, Feng Wang, Darin O. Bellisario, Tao Ye, E. Charles<br />
H. Sykes, Electrochimica Acta, submitted<br />
259. “Correcting for dispersion interaction and beyond in density functional theory through force matching”,<br />
Yang Song, Omololu Akin-ojo and Feng Wang, J. Chem. Phys., in press<br />
260. “Mimicking Coarse-grained Simulation without Coarse-graining: Enhanced Sampling through Short-range<br />
Damped Potential”, Dongshan Wei and Feng Wang, J. Chem. Phys., 133, 084101 (2010). [Cover<br />
Article], [Selected for Virtual Journal of Biological Physics Research]<br />
261. “The Quest for the Best Non-polarizable Force Field for Water from the Adaptive Force Matching<br />
Method”, Omololu Akin-ojo, and Feng Wang, J. Comput. Chem., in press, 10.1002/jcc.21634<br />
262. “Computational Investigation of 1-Palmitoyl-2-Oleoyl-sn-Glycero-3-Phosphocholine Lipid at Three<br />
Hydration Levels”, Eric Pinnick, Shyamsunder Erramilli, and Feng Wang, Molecular Physics, 108, 2027<br />
(2010)<br />
263. “The potential of mean force of nitrous oxide in 1,2-dimyristoylphosphatidylcholine lipid bilayer” Eric<br />
Pinnick, Shyamsunder Erramilli and Feng Wang, Chem. Phys. Lett., 489, 96 (2010)<br />
264. “Understanding the rotational mechanism of a single molecule: STM and DFT investigations of dimethyl<br />
sulfide molecular rotors on Au(111)”, Heather L. Tierney, Camilo E. Calderon, Ashleigh E. Baber, E.<br />
Charles H. Sykes, and Feng Wang, J. Phys. Chem. C, 114, 3152 (2010)<br />
CHE090047<br />
265. “O + C2H4 Potential Energy Surface: Initial Excited States, Biradicals, and Lowest-lying Singlet at the<br />
Multireference Level”, A.C. West, J.D. Lynch, B. Sellner, H. Lischka, W.L. Hase, and T.L. Windus, J.<br />
Phys. Chem. A, submitted<br />
266. "Fragmentation and Reactivity in Collisions of Protonated Diglycine with Chemically Modified F-SAM<br />
Surfaces", G. L. Barnes, K. Young, and W. L. Hase, Journal of Chemical Physics, submitted.<br />
267. "Potential Energy Surfaces of the First Three Singlet States of CH3Cl", G. Granucci, G. Medders, and A.<br />
M. Velasco, Chemical Physics Letters, accepted.<br />
268. “Singlet and Triplet Potential Surfaces for the O2 + C2H4 Reaction”, K. Park, A.C. West, E. Raheja, B.<br />
Sellner, H. Lischka, T.L. Windus, and W.L. Hase, J. Phys. Chem. A, accepted<br />
269. “O(3P) + C2H4 Potential Energy Surface: Study at the Multireference Level”, A.C. West, J.S. Kretchmer,<br />
B. Sellner, K. Park, W.L. Hase, H. Lischka, and T.L. Windus, J. Phys. Chem. A, 2009, 113, 12663-<br />
12674. Invited.<br />
270. “Photodynamics simulations of thymine: relaxation into the first excited singlet state”, J.J. Szymczak, M.<br />
Barbatti, J. Soo Hoo, J. Adkins, T.L. Windus, D. Nachtigallova, and H. Lischka, J. Phys. Chem. A, 2009,<br />
113, 12686-12693. Invited.
CHE090095<br />
271. Shields,* RM; Temelso, B; Archer,* KA; Morrell,* TE; Shields, GC. “Accurate Predictions of Water<br />
Cluster Formation, (H2O)n=2-10.” J. Phys. Chem. A. (2010) In Press.<br />
272. “Atmospheric Implications for Formation of Clusters of Ammonium and One to Ten Water Molecules”<br />
Thomas E. Morrell* and George C. Shields, The Journal of Physical Chemistry A, 114(12), 4266–4271.<br />
273. Temelso, B; Archer,* KA; Shields, GC. “Benchmark Quality Structures and Binding Energies of Small<br />
Water Clusters” In preparation.<br />
274. Morrell,* TE; Shields,* RM; Allodi,* MA; Wood,* EK; Kirschner, KN; Castonguay, TC; Temelso, B;<br />
Archer,* KA; Shields, GC. “The Growth of Sulfuric Acid/Water Clusters in the Atmosphere”. In<br />
preparation.<br />
275. Temelso, B; Gauthier,* A; Barnes,* AB; Shields, GC. “Multi-level Computational Study of AFP-like<br />
Peptides and Their Anti-breast Cancer Properties.” In preparation.<br />
CHE100010<br />
276. “Bonding and Isomerism in SFn-1Cl (n = 1-6): A quantum chemical study,” J. Leiding, D.E. Woon and T.<br />
H. Dunning, Jr., J. Phys. Chem. A., submitted.<br />
CHE110018<br />
277. North, M. A., Bhattacharyya, S., and Truhlar, D. G. (2010) Improved density functional description of<br />
the electrochemistry and structure-property descriptors of substituted flavins J. Phys. Chem. B (in<br />
press).<br />
CTS080032<br />
278. Hansgen, D. A.; Vlachos, D. G.; Chen, J. G. Using first principles to predict bimetallic catalysts for the<br />
ammonia decomposition reaction. Nature Chem. 2010, 2, 484-489.<br />
279. Hansgen, D. A.; Vlachos, D. G.; Chen, J. G. Correlating ammonia decomposition activity with nitrogen<br />
binding energy on Co-Pt, Ni-Pt, Fe-Pt and Cu-Pt bimetallic surfaces. Surf. Sci. In preparation.<br />
280. Chen, Y.; Vlachos, D. G. Hydrogenation of Ethylene and Dehydrogenation and Hydrogenolysis of<br />
Ethane on Pt(111) and Pt(211): A Density Functional Theory Study. J. Phys. Chem. C 2010, 114, 4973-<br />
4982.<br />
281. Salciccioli, M; Chen, Y.; Vlachos, D.G. DFT derived group additivity and linear scaling methods for<br />
prediction of oxygenate stability on metal catalysts: Adsorption of open-ring alcohol and polyol<br />
dehydrogenation intermediates on Pt based metals. J. Phys. Chem. C Accepted for publication.<br />
282. Salciccioli, M.; Chen, Y.; Vlachos, D. G. Microkinetic Modeling and Reduced Rate Expressions of<br />
Ethylene Hydrogenation and Ethane Hydrogenolysis on Platinum. Ind. Eng. Chem. Res. In Press.<br />
283. Salciccioli, M.; Chen, Y.; Vlachos, D. G. DFT studies of transition states of ethylene glycol thermal<br />
decomposition on platinum based catalysts. J. Phys. Chem. C In preparation.<br />
Earth Sciences<br />
EAR090003<br />
284. Hartzell, S., L.Ramirez-Guzman, L. Carver, and P-C Liu (2010). Short Baseline Variations in Site<br />
Response and Wave Propagation Effects and Their Structural Causes: Four Examples in and<br />
around the Santa Clara Valley, California, Bull. Seismol. Soc. Am., Vol. 100, No. 5A.<br />
285. Ramirez-Guzman, L., O. Boyd, S. Hartzell, and R. Williams (2010). Large Scale Earthquake<br />
Simulations in the Central United States, SCEC Annual Meeting, Palm Springs, CA, USA,<br />
September 11–15.<br />
286. Ramirez-Guzman, L., O. Boyd, S. Hartzell, and R. Williams (2010). Large Scale Earthquake Ground
Motion Simulations in the Central United States, Geological Society of America Annual Meeting,<br />
Denver, CO, USA, 31 October – 3 November.<br />
287. Ramirez-Guzman, L., O. Boyd, S. Hartzell, and R. Williams. Central United States Velocity Model<br />
Version 1: Description and Validation Tests, in preparation.<br />
Materials Research<br />
DMR010001<br />
288. E. Cockayne and A. van de Walle. Building effective models from scarce but accurate data: Application<br />
to an alloy cluster expansion model. Phys. Rev. B, 81:012104, 2010.<br />
289. P. Dalach, D. E. Ellis, and A. van de Walle. First principles thermodynamic modeling of atomic ordering<br />
in yttria-stabilized zirconia. Phys. Rev. B, 2010. Accepted.<br />
290. B. Meredig, A. Thompson, H.A. Hansen, C. Wolverton, and A. van de Walle. A method for locating<br />
low-energy solutions within DFT+U. 2010. submitted.<br />
291. G.S. Pomrehn, E.S. Toberer, G.J. Snyder, and A. van de Walle. Entropic stabilization and<br />
retrograde solubility in Zn4Sb3. 2010. submitted.<br />
292. C. Ravi, A. van de Walle, B.K. Panigrahi, H.K. Sahu, and M.C. Valsakumar. Cluster expansion Monte Carlo<br />
study of phase stability of vanadium nitrides. Phys. Rev. B, 81:104111, 2010.<br />
293. P. Tiwary, A. van de Walle, and N. Gronbech-Jensen. Ab initio construction of interatomic potentials for<br />
uranium dioxide across all interatomic distances. Phys. Rev. B, 80:174302, 2009.<br />
294. P. Tiwary, A. van de Walle, B. Jeon, and N. Gronbech-Jensen. Interatomic potentials for mixed<br />
oxide (MOX) nuclear fuels. 2010. submitted.<br />
DMR050036<br />
295. R. G. Hennig, A. Wadehra, K. P. Driver, W. D. Parker, C. J. Umrigar, and J. W. Wilkins. Phase<br />
transformation in Si from semiconducting diamond to metallic β-Sn phase in QMC and DFT under<br />
hydrostatic and anisotropic stress. Phys. Rev. B, 82(1):014101, Jul 2010.<br />
296. S. Alireza Ghasemi, Maximilian Amsler, Richard G. Hennig, Shantanu Roy, Stefan Goedecker,<br />
Thomas J. Lenosky, C. J. Umrigar, Luigi Genovese, Tetsuya Morishita, and Kengo Nishio. Energy<br />
landscape of silicon systems and its description by force fields, tight binding schemes, density functional<br />
methods, and quantum Monte Carlo methods. Phys. Rev. B, 81(21):214107, Jun 2010.<br />
297. W. D. Parker, J. W. Wilkins, and R. G. Hennig. Accuracy of quantum Monte Carlo methods for point<br />
defects in solids. Phys. Stat. Sol., 2010. (in print,<br />
http://onlinelibrary.wiley.com/doi/10.1002/pssb.201046149/abstract).<br />
298. Justin L. Luria, Kathleen A. Schwarz, Michael J. Jaquith, Richard G. Hennig, and John A.<br />
Marohn. Spectroscopic characterization of charged defects in polycrystalline pentacene by time-and<br />
wavelength-resolved electric force microscopy. Advanced Materials, 2010. (in print).<br />
299. D. R. Trinkle, K. Thornton, and R. G. Hennig. Applying for computational time on NSF’s TeraGrid -<br />
the world’s largest cyberinfrastructure supporting open research. JOM, 62(3):17-18, 2010.<br />
DMR070003<br />
300. G. Ghosh, “First-principles Calculation of Phase Stability and Cohesive Properties of Ni-Sn<br />
Intermetallics”, Metall Mater Trans A, 40A, p 4 (2009).<br />
DMR080044<br />
301. T. Sadowski and R. Ramprasad, "First Principles Computational Study of Wurtzite CdTe<br />
Nanowires", J. Mater. Sci. 45, 5463 (2010).<br />
302. S. K. Yadav, T. Sadowski and R. Ramprasad, "Density functional theory study of ZnX (X=O, S, Se,Te)<br />
under uniaxial strain", Phys. Rev. B 81, 144120 (2010).<br />
303. C. Tang and R. Ramprasad, "Point defect chemistry in amorphous HfO2: Density functional theory<br />
calculations", Phys. Rev. B 81, 161201(R) (2010).<br />
304. T. Sadowski and R. Ramprasad, "Core/Shell CdSe/CdTe Heterostructure Nanowires Under<br />
Axial Strain", J. Phys. Chem. C 114, 1173 (2010).<br />
305. M. E. Stournara and R. Ramprasad, "A first principles investigation of isotactic polypropylene", J.
Mater. Sci. 45, 443 (2010).<br />
306. H. Zhu, M. Aindow, and R. Ramprasad, “Stability and work function of TiCxN1-x alloy surfaces:<br />
Density functional theory calculations”, Phys. Rev. B 80, 201406(R) (2009).<br />
307. G. Pilania, D. Q. Tan, Y. Cao, V. S. Venkataramani, Q. Chen and R. Ramprasad, "Ab initio study of<br />
antiferroelectric PbZrO3 (001) surfaces", J. Mater. Sci. 44, 5249 (2009).<br />
308. G. Pilania, S. P. Alpay and R. Ramprasad, “Ab initio study of ferroelectricity in BaTiO3 nanowires”,<br />
Phys. Rev. B 80, 014113 (2009).<br />
309. G. Pilania, T. Sadowski and R. Ramprasad, “Oxygen adsorption on CdSe surfaces: Case study of<br />
asymmetric anisotropic growth through ab initio computations”, J. Phys. Chem. C. 113(5), 1863 (2009).<br />
310. G. Pilania and R. Ramprasad, "Adsorption of atomic oxygen on cubic PbTiO3 and LaMnO3 (001)<br />
surfaces: A density functional theory study", Surf. Sci. (in print).<br />
311. C. C. Wang and R. Ramprasad, “Dielectric properties of organosilicons from first principles”, J. Mater.<br />
Sci. (in print).<br />
312. G. Pilania and R. Ramprasad, "Complex Polarization Ordering in PbTiO3 Nanowires: A First Principles<br />
Computational Study ", submitted to Phys. Rev. B.<br />
313. H. Zhu, C. Tang and R. Ramprasad, “Phase equilibria at Si-HfO2 and Pt-HfO2 interfaces from first<br />
principles thermodynamics”, submitted to Phys. Rev. B.<br />
314. H. Zhu and R. Ramprasad, “Determination of the effective work function of metals interfaced with<br />
dielectrics using first principles thermodynamics: A case study of the Pt-HfO2 interface”, in preparation.<br />
315. H. Zhu and R. Ramprasad, “Work function of TaCxN1-x alloys”, in preparation.<br />
316. S. Yadav and R. Ramprasad, “ZnX nanowire (X=O, S, Se, Te) under uniaxial strain”, in<br />
preparation.<br />
317. Y. Cardona-Quintero, H. Zhu and R. Ramprasad. "Effect of SCH3 and SCF3 molecules in the work<br />
function of Pt metal", in preparation.<br />
318. S. K. Yadav, G. Pilania, and R. Ramprasad, “GGA, LDA, and LDA+U phase diagram of the ZnO(10-10)<br />
surface in equilibrium with oxygen and hydrogen.” in preparation.<br />
319. P. Shimpi, S. K. Yadav, R. Ramprasad, P. Gao, “C Assisted ZnO Nanowire Growth From<br />
Nanofilm Along Non-polar Direction.” in preparation.<br />
DMR080045<br />
320. A. Y. Liu, “Electron-phonon coupling in compressed 1T-TaS2: Stability and superconductivity from first<br />
principles,” Phys. Rev. B 79, 220515(R) (2009).<br />
321. J. K. Freericks, H. R. Krishnamurthy, Y. Ge, A. Y. Liu, and Th. Pruschke, “Theoretical description of<br />
time-resolved pump/probe photoemission in TaS2: a single-band DFT+DMFT(NRG) study within the<br />
quasiequilibrium approximation,” Phys. Status Solidi B 246, 948 (2009).<br />
322. Y. Ge and A. Y. Liu, “First-principles investigation of the charge-density-wave instability in 1T-TaSe2,”<br />
Phys. Rev. B, to appear.<br />
DMR090006<br />
323. J. A. Yasi, T. Nogaret, D. R. Trinkle, Y. Qi, L. G. Hector Jr., and W. A. Curtin. “Basal and prism<br />
dislocation cores in magnesium: comparison of first-principles and embedded-atom-potential methods<br />
predictions.” Modelling Simul. Mater. Sci. Eng. 17, 055012 (2009).<br />
324. T. Nogaret, W. A. Curtin, J. A. Yasi, L. G. Hector Jr., and D. R. Trinkle. “Atomistic study of edge and<br />
screw ⟨c+a⟩ dislocations in magnesium.” Acta Mater. 58, 4332-4343 (2010); J. A. Yasi, L. G. Hector<br />
Jr., and D. R. Trinkle. “First-principles data for solid-solution strengthening of magnesium: From<br />
geometry and chemistry to properties.” Acta Mater. 58, 5704–5713 (2010).<br />
325. H.M. Lawler and D. R. Trinkle, “First-principles calculation of H vibrational excitation at a dislocation<br />
core of Pd.” (in review), arXiv:1008.1256 (2010).<br />
326. M. Yu and D. R. Trinkle, “Geometry and energetics of Au/TiO2 interface from first principles.” (in<br />
preparation).<br />
327. H. H. Wu and D. R. Trinkle, “Mechanism of oxygen diffusion in alpha-titanium.” (in preparation).<br />
DMR100005<br />
328. Yufeng Liang, Li Yang, “Anisotropic Adsorption of Hydrogen Atoms on Strained Graphene”, in<br />
preparation.<br />
329. Li Yang, “First-principles study of the optical absorption spectra of electrically gated bilayer graphene”,<br />
Phys. Rev. B 81, 155445 (2010).<br />
330. Li Yang, “Novel Excitons in Graphene and Bilayer Graphene”, submitted to Phys. Rev. Lett.<br />
MCA08X038
331. L.K. Wagner and J.C. Grossman. Phys. Rev. Lett. 104, 210201 (2010).<br />
332. J.-H. Lee, J. Wu and J. C. Grossman, Phys. Rev. Lett., 104, 16602 (2010)<br />
333. Z. Wu, M.D. Allendorf and J.C. Grossman, J. Am. Chem. Soc., 131, 13198 (2009).<br />
MCA08X040<br />
334. B.J. Demaske, V.V. Zhakhovsky, N.A. Inogamov, and I.I. Oleynik, “Ablation and spallation of gold films<br />
irradiated by ultrashort laser pulse”, Physical Review B 82 064113 (2010).<br />
335. M.M. Budzevich, A. Landerville, M.W. Conroy, I.I. Oleynik, and C.T. White, “Anisotropic constitutive<br />
relationships in 1,3,5-triamino-2,4,6-trinitrobenzene (TATB)”, Journal of Applied Physics 107, 096205<br />
(2010).<br />
336. J. Lahiri, Y. Lin, P. Bozkurt, I.I. Oleynik, and M. Batzill, “An extended defect in graphene as a metallic<br />
wire”, Nature Nanotechnology, 5, 326 (2010).<br />
337. A.C. Landerville, M.W. Conroy, I.I. Oleynik, and C.T. White, “Prediction of isothermal equation of state<br />
of an explosive nitrate ester by van der Waals density functional theory” J. Physical Chemistry Letters 1,<br />
346, (2010).<br />
338. L. Adamska, M.A. Kozhushner, and I.I. Oleynik, “Electron-plasmon interactions in resonant molecular<br />
tunnel junctions”, Physical Review B 81, 035404 (2010).<br />
339. I. Díez-Perez, J. Hihath, Y. Lee, L. Yu, L. Adamska, M.A. Kozhushner, I.I. Oleynik, and N. Tao,<br />
“Rectification and stability of a single molecular diode with controlled orientation”, Nature Chemistry, 1,<br />
635 (2009).<br />
340. M.W. Conroy, I.I. Oleynik, S.V. Zybin, and C.T. White, “Density functional theory calculations of solid<br />
nitromethane under hydrostatic and uniaxial compressions with empirical van der Waals corrections”,<br />
Journal of Physical Chemistry A 113, 3610 (2009).<br />
341. A.C. Landerville, I.I. Oleynik, and C.T. White, “Reactive molecular dynamics in detonating PETN”,<br />
Journal of Physical Chemistry A 113, 12094 (2009).<br />
342. R. Perriot, X. Gu, and I.I. Oleynik, Computational Nanomechanics of Graphene Membranes, Mater.<br />
Res. Soc. Symp. Proc. 1185, II05-04 (2009).<br />
343. A. Landerville, I.I. Oleynik, and C.T. White, “First-principles investigation of reactive molecular dynamics<br />
in detonating RDX and TATB”, AIP Conference proceedings “Shock Compression of Condensed Matter<br />
-2009” 1195, 817 (2009).<br />
344. M. Budzevich, M. Conroy, A. Landerville, Y. Lin, I.I. Oleynik, and C.T. White, “Hydrostatic equation of<br />
state and anisotropic constitutive relationships in 1,3,5-triamino-2,4,6-trinitrobenzene (TATB)”, AIP<br />
Conference proceedings “Shock Compression of Condensed Matter -2009” 1195, 545 (2009).<br />
345. M. Conroy, M. Budzevich, A. Landerville, Y. Lin , I.I. Oleynik, and C.T. White, “Application of van der<br />
Waals density functional theory to study physical properties of energetic materials”, AIP Conference<br />
proceedings “Shock Compression of Condensed Matter -2009” 1195, 805 (2009).<br />
346. A. Landerville, I.I. Oleynik, and C.T. White, “Reactive molecular dynamics of detonating PETN”, AIP<br />
Conference proceedings “Shock Compression of Condensed Matter -2009” 1195, 813 (2009).<br />
347. M. Conroy, A. Landerville, I.I. Oleynik, and C.T. White, “First-principles studies of hydrostatic and<br />
uniaxial compression of a new energetic material – an energetic nitrate ester”, AIP Conference<br />
proceedings “Shock Compression of Condensed Matter -2009” 1195, 482 (2009).<br />
348. X. Gu, Y. Lin, I.I. Oleynik, and C.T. White, “Molecular dynamics simulations of shock-induced defect<br />
healing in silicon”, AIP Conference proceedings “Shock Compression of Condensed Matter -2009”<br />
1195, 793 (2009).<br />
349. Y. Lin, M. Budzevich, A. Landerville, I.I. Oleynik, and C.T. White, “Physical and chemical properties of a<br />
new energetic material Si-PETN”, AIP Conference proceedings “Shock Compression of Condensed<br />
Matter -2009” 1195, 482 (2009).<br />
Mathematical Sciences<br />
CHE090098<br />
350. Wang, Junmei; Cieplak, P.; Jie, L.; Wang, J.; Luo, R.; Duan, D. Development and testing of a set of dipole<br />
interaction models, in preparation.<br />
351. Wang, Junmei, Hou T.; Molecular properties prediction using molecular dynamics simulations 1 –<br />
density and heat of vaporization, in preparation.<br />
352. Wang, Junmei, Hou T.; Molecular properties prediction using molecular dynamics simulations 2 –<br />
diffusion coefficient, in preparation.<br />
CTS070004<br />
353. S. Dong & X. Zheng. Direct numerical simulation of spiral turbulence. Journal of Fluid Mechanics, in<br />
press, 2010.
354. S. Dong & J. Shen. An unconditionally stable rotational velocity-correction scheme for incompressible<br />
flows. Journal of Computational Physics, 229, 7013–7029, 2010.<br />
355. S. Dong. BDF-like methods for nonlinear dynamic analysis. Journal of Computational Physics, 229,<br />
3019–3045, 2010.<br />
356. K. Yeo, S. Dong, E. Climent & M.R. Maxey. Modulation of homogeneous turbulence seeded with finite<br />
size bubbles or particles. International Journal of Multiphase Flow, 36, 221–233, 2010.<br />
357. S. Dong & X. Zheng. Spiral turbulence in a Taylor-Couette geometry. AIAA Paper 2010-1257. 48th<br />
AIAA Aerospace Sciences Meeting, Orlando, FL, Jan. 2010.<br />
358. S. Dong. Blended time integration algorithms for nonlinear dynamics. AIAA Paper 2010-2890. 51st<br />
AIAA/ASME/ASCE/AHS/ASC Structures, Structural Dynamics, and Materials Conference, Orlando, FL,<br />
Apr. 2010.<br />
359. S. Dong. A time-stepping scheme for flow simulations that allows the use of large time step sizes. 63rd<br />
Annual Meeting of APS Division of Fluid Dynamics, Long Beach, CA, Nov. 2010.<br />
360. X. Zheng & S. Dong. An eigen-based high order expansion basis for spectral elements. SIAM<br />
Conference on Computational Science and Engineering, Reno, NV, March 2011.<br />
DMS090001<br />
361. Sengupta, K., Direct and Large-eddy Simulation of Compressible Flows with Spectral/hp Element<br />
Methods, Ph.D. Thesis, University of Illinois at Chicago, Chicago, IL, 2009.<br />
362. Kanchi, H., Sengupta, K., Jacobs, G. B., and Mashayek, F., Large-eddy Simulation of Compressible<br />
Flow over Backward-facing Step Using Chebyshev Multidomain Method, AIAA 2010-922, 2010.<br />
363. Kanchi, H., RANS and LES Study of Fluidic Control of Shear Layer in Dump Combustors using<br />
Microjets, Ph.D. Thesis, University of Illinois at Chicago, Chicago, IL (in preparation)<br />
DMS100004<br />
364. Masud A, Calderer R (2010) A variational multiscale method for incompressible turbulent flows: residual<br />
free bubbles and the fine scale fields. Submitted for publication to Computer Methods in Applied<br />
Mechanics and Engineering.<br />
365. The flow around a cylinder problem will be featured on the back cover of the Fall 2010 issue of Access,<br />
the NCSA’s quarterly magazine.<br />
366. Ferguson Lecture 2010: "Stabilized Methods and Turbulence: Application to Fluid-Structure Interaction."<br />
University of Texas at Austin. Department of Civil and Environmental Engineering. May 5, 2010<br />
367. "Stabilized Finite Element Methods and Turbulence." University of Tokyo, Department of Mechanical<br />
Engineering, Tokyo, Japan. June 2, 2010.<br />
368. "Stabilized Finite Element Methods and Turbulence." Purdue University. Department of Computer<br />
Science. March 11, 2010.<br />
Mechanical and Structural Systems<br />
MSS110004<br />
369. Song J, Medhekar N. Novel Scaffolding Structure for Li-ion Battery anodes. Submitted<br />
370. Song J, Medhekar N. Thermal Properties of Impurities and Defects Decorated Graphene Nanoribbons.<br />
In preparation.<br />
371. Song J, Curtin WA. A Nanoscale Mechanism of Hydrogen Embrittlement in Metals. Submitted. (partially<br />
supported by TG-MSS100005)<br />
Molecular Biosciences<br />
CHE060063<br />
372. Durrant, J., Hall, L., Swift, R., Landon, M., Schnaufer, A., and Amaro, R. (2010) “Novel Naphthalene-<br />
Based Inhibitors of Trypanosoma brucei RNA Editing Ligase 1,” PLoS Neglected Tropical Diseases,<br />
4(8): e803. (doi:10.1371/journal.pntd.0000803).<br />
373. Durrant, J., Amaro, R. E., Xie, L., Urbaniak, M. D., Ferguson, M. A. J, Haapalainen, A., Chen, Z., Di<br />
Guilmi, A. M., Wunder, F., Bourne, P. E., McCammon, J. A. (2010). A Multidimensional Strategy to<br />
Detect Polypharmacological Targets in the Absence of Structural and Sequence Homology. PLoS<br />
Computational Biology, 6(1), e1000648.<br />
374. Amaro, R. E., Li, W. W. (2010). Emerging Ensemble-based Methods in Virtual Screening. Current<br />
Topics in Medicinal Chemistry, 10(1), 3-13.<br />
375. Swift, R. V., Amaro, R. E. (2009). Discovery and design of DNA and RNA ligase inhibitors in infectious<br />
microorganisms. Expert Opinion in Drug Discovery, 4(12), 1281-1294.<br />
376. Wong, S., Amaro, R. E., McCammon, J. A. (2009). MM-PBSA captures key role of intercalating water<br />
molecules at a protein-protein interface. Journal of Chemical Theory and Computation, 5(2), 422-429.
377. Lawrenz, M., Wereszczynski, J., Amaro, R., Walker, R., Roitberg, A., and McCammon, J.A., “Impact of<br />
calcium on N1 influenza neuraminidase dynamics and binding free energy,” Proteins, 78: 2523-2532<br />
(2010).<br />
378. Sung, J.C., Van Wynsberghe, A.W., Amaro, R. E., Li, W. W., and McCammon, J. A., “The role of<br />
secondary sialic acid binding sites in influenza N1 neuraminidase,” Journal of the American Chemical<br />
Society Communication, 132(9): 2883 - 2885 (2010).<br />
379. Newhouse, E.I., Xu, D., Markwick, P.R., Amaro, R.E., Pao, H.C., Wu, K.J., Alam, M., McCammon, J.A.,<br />
Arzberger, P.W., and Li, W.W., “Mechanism of Glycan Receptor Recognition and Specificity Switch for<br />
Avian, Swine, and Human Adapted Influenza Virus Hemagglutinins: A Molecular Dynamics<br />
Perspective,” Journal of the American Chemical Society, 131(47): 17430-42 (2009).<br />
MCA01S027<br />
380. In Suk Joung, Ö Persil Çetinkol, NV Hud, and TE Cheatham, III. “Molecular dynamics simulations and<br />
coupled nucleotide substitution experiments indicate the nature of A•A base pairing and a putative<br />
structure of the coralyne-induced homo-adenine duplex.” Nuc. Acids Res. 37, 7715-7727 (2009).<br />
PMCID: 2794157<br />
381. R Lavery, K Zakrzewska, DL Beveridge, TC Bishop, DA Case, TE Cheatham, III, S Dixit, B Jayaram, F<br />
Lankas, C Laughton, JH Maddocks, A Michon, R Osman, M Orozco, A Perez, T Singh, N Spackova,<br />
and J Sponer. “A systematic molecular dynamics study of nearest-neighbor effects on base pair and<br />
base pair step conformations and fluctuations in B-DNA”. Nuc. Acids Res. 38, 299-313 (2010). PMCID:<br />
2800215<br />
382. T Truong, H Freedman, L Le, TE Cheatham, III, J Tuszynski, and L Huynh. “Explicitly-solvated ligand<br />
contribution to continuum solvation models for binding free energies: Selectivity of theophylline binding<br />
to an RNA aptamer.” J. Phys. Chem. B 114, 2227-2237 (2010).<br />
383. RB Paulsen, PP Seth, EE Swayze, RH Griffey, JJ Skalicky, TE Cheatham, III, and DR Davis. “Inhibitor<br />
induced structure change in the HCV IRES domain IIa RNA.” Proc. Natl. Acad. Sci. 107, 7263-7268<br />
(2010).<br />
384. CD Moore, K Shahrokh, SF Sontum, TE Cheatham, III, and GS Yost. “Improved Cyp3A4 molecular<br />
models accurately predict Phe215 requirement for raloxifene dehydrogenation selectivity.” Biochemistry<br />
ASAP, DOI: 10.1021/bi101139q (2010).<br />
385. R DeMille, TE Cheatham, III, and V Molinero. “A coarse-grained model of DNA with explicit solvation by<br />
water and ions.” J. Phys. Chem. B [in press]<br />
386. P Banas, D Hollas, M Zgarbova, P Jurecka, M Orozco, TE Cheatham, III, J Sponer, and M Otyepka.<br />
“Performance of molecular mechanics force fields for RNA simulations. Stability of UUCG and GNRA<br />
hairpins.” J. Chem. Theory Comp. [in press]<br />
387. SS Pendley, and TE Cheatham, III. “Assessing model structures of small, parallel IAAL-E3/K3 dimeric<br />
coiled-coils with the AMBER force fields.” J. Chem. Theory Comp. [in revision]<br />
388. SS Pendley and TE Cheatham, III. “Explicit solvent effects and parameterization corrections in MM-<br />
PBSA calculations.” J. Chem. Theory. Comp. [in revision]<br />
389. X Cang, J Sponer, and TE Cheatham, III. “Explaining the varied glycosidic conformational, G-tract<br />
length, and sequence preferences for anti-parallel G-quadruplexes.” Nuc. Acids Res. [in minor revision]<br />
390. AS Dixon, SS Pendley, BJ Bruno, DW Woessner, AA Shimpi, TE Cheatham, III and CS Lim. “Disruption<br />
of Bcr-Abl coiled-coil oligomerization by design.” J. Biol. Chem. [in revision]<br />
391. K Shahrokh, A Orendt, G Yost, and TE Cheatham, III. “Quantum mechanically derived AMBER-compatible<br />
heme parameters for the states on the cytochrome P450 catalytic cycle.” J. Phys. Chem. B<br />
[submitted]<br />
392. X Cang, J Sponer, and TE Cheatham, III. “Insight into telomere structural polymorphism and G-DNA<br />
folding from sequence and loop connectivity through free energy analysis.” J. Amer. Chem. Soc.<br />
[submitted]<br />
393. M Zgarbova, M Otyepka, J Sponer, A Mladak, P Banas, TE Cheatham, III, M Orozco, F Javier Luque,<br />
and P Jurecka. “Refinement of the AMBER nucleic acid force field based on quantum chemical<br />
calculations of glycosidic torsion profiles.” [near final revision to be submitted soon, likely to JCTC]<br />
MCA02N028<br />
394. A partial nudged elastic band implementation for use with large or explicitly solvated systems, Campbell,<br />
A.J., Walker, R.C., Simmerling, C., Int. J. of Quantum Chemistry, 2009, 109, 3781-3790.<br />
395. An Improved Reaction Coordinate for Nucleic Acid Base Flipping Studies, Song, K., Campbell, A.J.,<br />
Bergonzo, C., de los Santos, C., Grollman, A.P., Simmerling, C. J. Chem. Theory & Computation, 2009.<br />
396. Recent Advances in the Study of the Bioactive Conformation of Taxol, Sun, L., Simmerling, C. and<br />
Ojima, I., ChemMedChem, 4, 719 (2009)
397. Evaluating the Performance of the FF99SB Force Field Based on NMR Scalar Coupling Data,<br />
Wickstrom, L., Okur, A. and Simmerling, C., Biophys. J., 97, 853 (2009)<br />
398. Dynamic Behavior of DNA Duplexes Suggests a Mechanism for Selective Destabilization of Lesions by<br />
a DNA Glycosylase, Bergonzo, C., Song, K., de los Santos, C., Grollman, A., Simmerling, C. (in<br />
revision: Nucleic Acids Research, 2010)<br />
399. Slow onset inhibition of bacterial β-ketoacyl-ACP synthases by thiolactomycin, Machutta, C., Gopal,<br />
B., Luckner, S., Kapilashrami, K., Ruzsicska, B., Simmerling, C., Kisker, C., and Tonge, P., J. Biol.<br />
Chem. 285, 6161 (2010)<br />
400. Three-dimensional molecular theory of solvation coupled with molecular dynamics in Amber Luchko, T.,<br />
Gusarov, S., Roe, D., Simmerling, C., Case, D., Tuszynski, J. and Kovalenko, A., J. Chem. Theory &<br />
Comput, 6, 607 (2010).<br />
401. Synthesis and Molecular Modeling of a Nitrogen Mustard DNA Interstrand Crosslink Guainazzi, A.,<br />
Campbell, A., Angelov, T., Simmerling, C. and Schärer, O., Chemistry: A European Journal, in press.<br />
402. Mechanism of Glycosylase Discrimination of Oxidatively Damaged DNA, Arthur J. Campbell, Christina<br />
Bergonzo, Lin Fu, Haoquan Li, Dmitry O. Zharkov, Carlos de los Santos, Arthur Grollman and Carlos<br />
Simmerling, submitted<br />
403. Improving the Description of Salt Bridge Strength and Geometry in a Generalized Born Model, Y.<br />
Shang, H. Nguyen, L. Wickstrom, A. Okur and C. Simmerling, J. Mol. Graphics & Modelling, submitted<br />
404. Role of the Glycosylase Wedge in Base Excision Repair of Oxidative DNA Damage, Campbell, A.,<br />
Bergonzo, C., Fu, L., de los Santos, C., Grollman, A., Zharkov, D. and Simmerling, C., submitted<br />
405. ff10SB: Optimization of protein backbone and sidechain dihedral parameters from the ff99SB force<br />
field. Martinez, C., Maier, J., Wickstrom, L. and Simmerling, C., in preparation<br />
406. Enhanced Sampling In Multidimensional Replica Exchange Umbrella Sampling, Lin, F., Bergonzo, C.,<br />
Grollman, A. and Simmerling, C., in preparation<br />
407. Translocation of Repair Proteins along DNA during Lesion Search, Carvalho, A., Bergonzo, C. and<br />
Simmerling, C., in preparation<br />
408. Mapping Conformational Changes During Base Excision Repair, Bergonzo, C., Campbell, A.J., Fu, L.,<br />
Song, K., de los Santos, C., Grollman, A., Simmerling, C. in preparation<br />
409. Role of the Glycosylase Active Site Loop in Early Recognition of Oxidative DNA Damage. Lin, F.,<br />
Campbell, A., Bergonzo, C., de los Santos, C., Grollman, A. and Simmerling, C., in preparation<br />
410. Changes in Dimer Interface Dynamics Result in Modulation of Drug Sensitivity between HIV-1 Protease<br />
Subtypes, Shang, M., Roitberg, A., Fanucci, G. and Simmerling, C., in preparation<br />
411. Structural Basis of Slow Onset Inhibition Mechanism in Mycobacterium tuberculosis InhA. Lai, C., Li, H.,<br />
Liu, N., Garcia-Diaz, M., Tonge, P. and Simmerling, C., in preparation<br />
412. The Transition between the Closed and Semi-open Form of HIV-1 Protease through Swapping of the<br />
Flap Tip Hydrophobic Cluster, F. Ding and C. Simmerling, in preparation.<br />
413. Flap opening mechanism of HIV-PR explored by microsecond-scale MD simulations: Role of the Dimer<br />
Interface, F. Ding and C. Simmerling, in preparation.<br />
414. Improved Parameters for the Generalized Born Neck Model, Nguyen, H., Roe, D., Shang, Y. and<br />
Simmerling, C. in preparation<br />
MCA05S028<br />
415. B. Luan, R. Carr, M. Caffrey and A. Aksimentiev. The effect of calcium on the conformation of<br />
cobalamin transporter BtuB. Proteins: Structure, Function, and Bioinformatics 78:1153–1162 (2010).<br />
416. B. Luan and A. Aksimentiev. Electric and electrophoretic inversion of the DNA charge in multivalent<br />
electrolytes. Soft Matter 6, 243-246 (2010).<br />
417. A. Aksimentiev. Deciphering ionic current signatures of DNA transport through a nanopore. Nanoscale<br />
2, 468–483 (2010). Cover.<br />
418. W. Timp, U. M. Mirsaidov, D. Wang, J. Comer, A. Aksimentiev, and G. Timp. Nanopore sequencing:<br />
Electrical measurements of the code of life. IEEE Transactions on Nanotechnology 9:281-294 (2010).<br />
Cover.<br />
419. D. Wells, A. Aksimentiev. Mechanical properties of a complete microtubule revealed through molecular<br />
dynamics simulation. Biophysical Journal 99:629-637 (2010).<br />
420. B. Luan and A. Aksimentiev. Control and reversal of the electrophoretic force on DNA in a charged<br />
nanopore. Journal of Physics: Condensed Matter, in press (2010).<br />
421. R. Carr, J. Comer, M. Ginsberg and A. Aksimentiev. Modeling pressure-driven transport of proteins<br />
through a nanochannel. IEEE Transactions on Nanotechnology 99, doi:10.1109/TNANO.2010.2062530<br />
(2010).<br />
422. U. Mirsaidov, J. Comer, V. Dimitrov, A. Aksimentiev and G. Timp. Slowing the translocation of double-stranded<br />
DNA using a nanopore smaller than the double helix. Nanotechnology 21:395501-10 (2010).
423. C. Maffeo, R. Schopflin, H. Brutzer, R. Stehr, A. Aksimentiev, G. Wedemann and R. Seidel. A small<br />
effective charge describes DNA-DNA interactions in tight supercoils. Physical Review Letters<br />
105:158101 (2010).<br />
424. C. Stavis, T. L. Clare, J. E. Butler, A. D. Radadia, R Carr, H. Zeng, W. King, J. A. Carlisle, A.<br />
Aksimentiev, R. Bashir and R.J. Hamers. Surface functionalization of thin-film diamond for highly stable<br />
and selective biological interfaces. Proceedings of the National Academy of Sciences USA,<br />
doi:10.1073/pnas.1006660107 (2010).<br />
425. A. D. Radadia, C. J. Stavis, R. Carr, H. Zeng, W. King, J. A. Carlisle, A. Aksimentiev, R. J. Hamers, R.<br />
Bashir. Ultrananocrystalline diamond thin films as stable antibody tethering surfaces for bacterial<br />
capture. Submitted.<br />
426. S. Bhattacharya, J. Muzard, L. Payet, J. Mathe, U. Bockelmann, A. Aksimentiev, V. Viasnoff. Molecular<br />
origin of cation dependent current rectification in large proteinaceous channels: the case of alpha-hemolysin.<br />
Submitted.<br />
427. M. Wanunu, S. Bhattacharya, Y. Xie, Y. Tor, A. Aksimentiev, and M. Drndic Nanopore analysis of<br />
individual RNA/antibiotic complexes. Submitted.<br />
428. M. Venkatesan, J. Comer, A. Aksimentiev and R. Bashir. Lipid bilayer coated Al2O3 nanopore sensors:<br />
Towards a hybrid biological solid-state nanopore. Submitted.<br />
429. S. W. Kowalczyk, D. B. Wells, A. Aksimentiev, and C. Dekker. Charge reduction of DNA by transient<br />
binding of counter ions. Submitted.<br />
430. C Maffeo, B Luan and A. Aksimentiev. End-to-end attraction of duplex DNA. Submitted.<br />
431. R. Carr, J. Comer, M Ginzberg, and A. Aksimentiev. Transport of small solutes through a sticky<br />
nanochannel. In preparation.<br />
432. S. Bhattacharya, A. Ho and A. Aksimentiev. Engineering membrane channel MspA for sequencing<br />
DNA. In preparation.<br />
433. J. Comer and A. Aksimentiev. Sequencing double stranded DNA by measuring ionic current through<br />
solid-state nanopores. In preparation.<br />
434. D. Wang, J. W. Shim, J. Comer, A. Ho, W. Timp, A. Aksimentiev, G. Timp. Using Measurements of the<br />
Ion Current through a Synthetic Nanopore to Discriminate Nucleotides in a single DNA molecule. In<br />
preparation.<br />
MCA07S015<br />
435. Lambert, R.A., P. O’Shaughnessy, M. H. Tawhai, E. A. Hoffman, and C.-L. Lin, “Regional deposition of<br />
particles in an image-based airway model: large-eddy simulation and left-right lung ventilation<br />
asymmetry,” Aerosol Science & Technology, 45:1-15, 2011.<br />
436. Tawhai, M. H. and C.-L. Lin, “Airway Gas Flow,” Comprehensive Physiology, Wiley-Blackwell, in press,<br />
2010.<br />
437. Tawhai, M. H. and C.-L. Lin, “Image-based modeling of lung structure and function,” Journal of<br />
Magnetic Resonance Imaging, in press, 2010.<br />
438. Choi, J., G. Xia, M.H. Tawhai, E.A. Hoffman, and C.-L. Lin, “Numerical study of high frequency<br />
oscillatory air flow and convective mixing in a CT-based human airway model,” DOI: 10.1007/s10439-<br />
010-0110-7, Annals of Biomedical Engineering, 2010.<br />
439. Yin, Y., J. Choi, E.A. Hoffman, M.H. Tawhai, and C.-L. Lin, “Simulation of pulmonary air flow with a<br />
subject-specific boundary condition,” Journal of Biomechanics, 43(11):2159-2163, 2010.<br />
440. Xia, G., M. H. Tawhai, E. A. Hoffman, and C.-L. Lin, “Airway Wall Stiffness and Peak Wall Shear Stress:<br />
A Fluid-Structure Interaction Study in Rigid and Compliant Airways,” Annals of Biomedical Engineering,<br />
38(5), 1836-1853, 2010.<br />
441. Lin, C.-L., M. H. Tawhai, G. McLennan, and E.A. Hoffman, “Multiscale Simulation of Gas Flow in<br />
Subject-Specific Models of the Human Lung,” IEEE Eng. in Medicine and Biology, 28(3): 25-33, 2009.<br />
442. Choi, J., M.H. Tawhai, E.A. Hoffman, and C.-L. Lin, “On intra- and inter-subject variabilities of airflow in<br />
the human lungs,” Phys. Fluids, 21, 101901, 2009.<br />
MCA99S007<br />
443. Vila, JA; Serrano, P; Wuthrich, K; Scheraga, HA. Sequential nearest-neighbor effects on computed C-<br />
13(alpha) chemical shifts. JOURNAL OF BIOMOLECULAR NMR, 48 23-30, 2010 DOI:<br />
10.1007/s10858-010-9435-7<br />
444. Maisuradze, GG; Liwo, A; Oldziej, S; Scheraga, HA. Evidence, from Simulations, of a Single State with<br />
Residual Native Structure at the Thermal Denaturation Midpoint of a Small Globular Protein. JOURNAL<br />
OF THE AMERICAN CHEMICAL SOCIETY, 132 9444-9452 2010 DOI: 10.1021/ja1031503<br />
445. Kozlowska, U; Liwo, A; Scheraga, HA. Determination of Side-Chain-Rotamer and Side-Chain and<br />
Backbone Virtual-Bond-Stretching Potentials of Mean Force from AM1 Energy Surfaces of Terminally-
Blocked Amino-Acid Residues, for Coarse-Grained Simulations of Protein Structure and Folding I: The<br />
Method. JOURNAL OF COMPUTATIONAL CHEMISTRY, 31 1143-1153 2010 DOI: 10.1002/jcc.21399<br />
446. Kozlowska, U; Maisuradze, GG; Liwo, A; Scheraga, HA. Determination of Side-Chain-Rotamer and<br />
Side-Chain and Backbone Virtual-Bond-Stretching Potentials of Mean Force from AM1 Energy Surfaces<br />
of Terminally-Blocked Amino-Acid Residues, for Coarse-Grained Simulations of Protein Structure and<br />
Folding II: Results, Comparison with Statistical Potentials, and Implementation in the UNRES Force<br />
Field. JOURNAL OF COMPUTATIONAL CHEMISTRY, 31 1154-1167 2010 DOI: 10.1002/jcc.21402<br />
447. Martin, OA; Villegas, ME; Vila, JA; Scheraga, HA. Analysis of C-13(alpha) and C-13(beta) chemical<br />
shifts of cysteine and cystine residues in proteins: a quantum chemical approach. JOURNAL OF<br />
BIOMOLECULAR NMR, 46 217-225 2010 DOI: 10.1007/s10858-010-9396-x<br />
448. Liwo, A; Oldziej, S; Czaplewski, C; Kleinerman, DS; Blood, P; Scheraga, HA. Implementation of<br />
Molecular Dynamics and Its Extensions with the Coarse-Grained UNRES Force Field on Massively<br />
Parallel Systems: Toward Millisecond-Scale Simulations of Protein Structure, Dynamics, and<br />
Thermodynamics. JOURNAL OF CHEMICAL THEORY AND COMPUTATION, 6 890-909 2010 DOI:<br />
10.1021/ct9004068<br />
449. Maisuradze, GG; Liwo, A; Scheraga, HA. Relation between Free Energy Landscapes of Proteins and<br />
Dynamics. JOURNAL OF CHEMICAL THEORY AND COMPUTATION, 6 583-595 2010 DOI:<br />
10.1021/ct9005745<br />
450. Makowski, M; Czaplewski, C; Liwo, A; Scheraga, HA. Potential of Mean Force of Association of Large<br />
Hydrophobic Particles: Toward the Nanoscale Limit. JOURNAL OF PHYSICAL CHEMISTRY B, 114<br />
993-1003 2010 DOI: 10.1021/jp907794h<br />
451. Vila, JA; Scheraga, HA. Assessing the Accuracy of Protein Structures by Quantum Mechanical<br />
Computations of C-13(alpha) Chemical Shifts. ACCOUNTS OF CHEMICAL RESEARCH, 42 1545-1553<br />
2009 DOI: 10.1021/ar900068s<br />
452. Vila, JA; Arnautova, YA; Martin, OA; Scheraga, HA. Quantum-mechanics-derived C-13(alpha) chemical<br />
shift server (CheShift) for protein structure validation. PROCEEDINGS OF THE NATIONAL ACADEMY<br />
OF SCIENCES OF THE UNITED STATES OF AMERICA, 106 16972-16977 2009 DOI:<br />
10.1073/pnas.0908833106<br />
453. Sobolewski, E; Makowski, M; Oldziej, S; Czaplewski, C; Liwo, A; Scheraga, HA. Towards temperature-dependent<br />
coarse-grained potentials of side-chain interactions for protein folding simulations. I:<br />
Molecular dynamics study of a pair of methane molecules in water at various temperatures. PROTEIN<br />
ENGINEERING DESIGN & SELECTION, 22 547-552 2009 DOI: 10.1093/protein/gzp028<br />
454. He, Y; Xiao, Y; Liwo, A; Scheraga, HA. Exploring the Parameter Space of the Coarse-Grained UNRES<br />
Force Field by Random Search: Selecting a Transferable Medium-Resolution Force Field. JOURNAL<br />
OF COMPUTATIONAL CHEMISTRY, 30 2127-2135 2009 DOI: 10.1002/jcc.21215<br />
MCB080011<br />
455. Sodium release triggers galactose exit from the sodium-galactose transporter, vSGLT. Watanabe, A., S.<br />
Choe, V. Chaptal, J.M. Rosenberg, E.M. Wright, M. Grabe, and J. Abramson, Nature (in press)<br />
456. Water permeation through the sodium-dependent galactose cotransporter vSGLT. Choe, S., J.M.<br />
Rosenberg, J. Abramson, E.M. Wright, and M. Grabe Biophys. J. 99: L56-L58.<br />
MCB080014<br />
457. G. Reddy, J.E. Straub and D. Thirumalai, “Dry amyloid fibril assembly in a yeast prion peptide is<br />
mediated by long-lived structures containing water wires”, Proc. Natl. Acad. Sci. (in review).<br />
458. M.S. Li, N.T. Co, G. Reddy, C.K. Hu and D. Thirumalai, “Determination of factors governing<br />
fibrillogenesis using lattice models”, Phys. Rev. Lett. (in review).<br />
MCB090005<br />
459. Genchev, G. Z., M. Kallberg, G. Gursoy, A. Mittal, L. Dubey, O. Perisic, G. Feng, R. Langlois, and H.<br />
Lu. 2009. Mechanical signaling on the single protein level studied using steered molecular dynamics.<br />
Cell Biochem Biophys 55:141-152<br />
MCB090110<br />
460. Goetz, A.W., Woelfle, T., Walker, R.C., "Quantum Chemistry on Graphics Processing Units", Ann. Rev.<br />
Comp. Chem., 2010, Accepted.<br />
461. Xu, D., Williamson, M.J., Walker, R.C., "Advancements in Molecular Dynamics Simulations of<br />
Biomolecules on Graphical Processing Units.", Ann. Rev. Comp. Chem., 2010, Accepted<br />
462. Lawrenz, M., Wereszczynski, J., Amaro, R., Walker, R.C., Roitberg, A.E., McCammon, J.A., "Impact of<br />
calcium on N1 influenza neuraminidase dynamics and binding free energy", Proteins, 2010, Accepted.
463. Zhong, L., Walker, R.C., Brady, J.W. et al., “Computational Simulations of the Trichoderma reesei<br />
Cellobiohydrolase I Acting on Microcrystalline Cellulose Iβ: The Enzyme-Substrate Complex”, Carb.<br />
Res., 2009, 344, 1984-1992.<br />
464. Seabra, G.M., Walker, R.C., Roitberg, A.E., “Are Current Semi-Empirical Methods Better than Force<br />
Fields? A Study from the Thermodynamics Perspective”, J. Phys. Chem. A., 2009, 113, 11938-11948.<br />
465. Bergonzo, C., Campbell, A.J., Walker, R.C., Simmerling, C., “A Partial Nudged Elastic Band<br />
Implementation for Use with Large or Explicitly Solvated Large Systems”, Int. J. Quant. Chem., 2009,<br />
109, 15, 3781-3790.<br />
466. Crowley, M.F., Williamson, M.J., Walker, R.C., “CHAMBER: Comprehensive Support for CHARMM<br />
Force Fields Within the AMBER Software”, Int. J. Quant. Chem., 2009, 109, 15, 3767-3772<br />
467. C. David Sherrill, Bobby G. Sumpter, Mutasem O. Sinnokrot, Michael S. Marshall, Edward G.<br />
Hohenstein, Ross C. Walker, Ian R. Gould, “Assessment of standard force field models against high-quality<br />
ab initio potential curves for prototypes of π–π, CH/π, and SH/π interactions”, J. Comput. Chem., 2009,<br />
30, 2187-2193.<br />
468. Parastou Sadatmousavi, Mark J. Williamson, Dong Xu, Mark R. Nimlos, Michael F. Crowley, Michael E.<br />
Himmel & Ross C. Walker, "Recent Advances in Understanding the Mechanism of Action of the<br />
Catalytic Binding Module of CBH I Cellulase through Advanced MD Simulations", 239th American<br />
Chemical Society National Meeting, San Francisco, CA, Mar 2010.<br />
469. Michael F. Crowley, Mark J. Williamson, Ross C. Walker et al. – “Cellobiohydrolase Processivity:<br />
Improved enzymes for bioethanol”, SciDAC 2009, San Diego, CA, Jun 2009.<br />
470. Michael F. Crowley, Mark J. Williamson, Ross C. Walker – “CHAMBER: A CHARMM Format Convertor<br />
to Enable the use of CHARMM Force Fields in AMBER”, 49th Sanibel Symposium, St. Simon’s Island,<br />
GA, Feb 2009.<br />
471. Warshel, A., Levitt, M., “Theoretical Studies of Enzymic Reactions - Dielectric, Electrostatic and Steric<br />
Stabilization of Carbonium-Ion in Reaction of Lysozyme.” J. Molec. Biol., 1976, 103, 227-249.<br />
472. Senn, H. M., Thiel, W., “QM/MM Methods for Biomolecular Systems.” Angew. Chem. Int. Ed., 2009, 48,<br />
1198-1229.<br />
473. Bulo, R. E., Ensing, B., Sikkema, J., Visscher, L., “Toward a Practical Method for Adaptive QM/MM<br />
Simulations.” J. Chem. Theory Comput., 2009, 5, 2212-2221.<br />
474. de Seabra, G. M., Walker, R. C., Roitberg, A., “Are Current Semiempirical Methods Better Than Force<br />
Fields? A Study from the Thermodynamics Perspective.” J. Phys. Chem. A, 2009, 113, 11938-11948.<br />
475. Torrie, G. M., Valleau, J. P., “Non-Physical Sampling Distributions in Monte-Carlo Free-Energy<br />
Estimation - Umbrella Sampling.”, J. Comp. Phys., 1977, 23, 187-199.<br />
476. Mills, G., Jonsson, H., Schenter, G. K., “Reversible Work Transition-State Theory - Application to<br />
Dissociative Adsorption of Hydrogen.”, Surface Science, 1995, 324, 305-337.<br />
477. Hansmann, U. H. E., “Parallel Tempering Algorithm for Conformational Studies of Biological<br />
Molecules.”, Chem. Phys. Lett., 1997, 281, 140-150.<br />
MCB090160<br />
478. Pfaendtner, J., E. Lyman, T. D. Pollard, and G. A. Voth. 2010. Structure and dynamics of the actin<br />
filament. J. Mol. Biol. 396:252-263.<br />
479. Pfaendtner, J., E. De La Cruz, and G. A. Voth. 2010. Actin filament remodeling by cofilin. Proc. Natl.<br />
Acad. Sci. 107:7299-7304.<br />
480. Pfaendtner, J., D. Branduardi, M. Parrinello, T. D. Pollard, and G. A. Voth. 2009. Nucleotide-dependent<br />
conformational states of actin. Proc. Natl. Acad. Sci. 106:12723-12728.<br />
MCB110020<br />
481. J. K. Shen, “Uncovering specific electrostatic interactions in the denatured states of proteins”, Biophys.<br />
J., 99, 924-932 (2010).<br />
482. J. A. Wallace and J. K. Shen, “Probing strand orientation and registry alignment in the propagation of<br />
amyloid fibrils”, Biochemistry, 49, 5290-5298 (2010).<br />
483. Y. Wang, J. A. Wallace and J. K. Shen, “Titration simulations of lauric acid in anionic, cationic and<br />
neutral micelles”, in preparation.<br />
484. J. A. Wallace and J. K. Shen, “Continuous constant pH molecular dynamics simulations in explicit<br />
solvent”, in preparation.<br />
485. C. Shi, J. A. Wallace and J. K. Shen, “Improving pKa predictions for internal residues”, in preparation.<br />
MCB110024<br />
486. Mamonova T., Wang B., Kurnikova M. and Friedman PA. Binding Capacity of the PDZ2 Domain of<br />
NHERF1 for the Target Ligands (manuscript in preparation).
487. Mamonova T., Wang B., Kurnikova M. and Friedman P.A. Modeling and Mutation of NHERF1<br />
Dimerization Domains, Biophys.J.98 (3), Suppl. 58a (2010).<br />
MCB110028<br />
488. N. Haspel, D. Zanuy, R. Nussinov, T. Teesalu, E. Ruoslahti and C. Aleman, Binding of a C-end rule<br />
peptide to neuropilin-1 receptor: A molecular modeling approach. To be submitted, 2010.<br />
Ocean Sciences<br />
OCE030007<br />
489. Boe, J., A. Hall, F. Colas, J. C. McWilliams, X. Qu, and J. Kurian, What shapes mesoscale wind<br />
anomalies in coastal upwelling zones, Clim. Dyn., submitted, 2010.<br />
490. Buijsman, M. C., Y. Kanarska, and J. C. McWilliams, On the generation and evolution of nonlinear<br />
internal waves in the South China Sea, J. Geophys. Res., 115, C02012, 2010a.<br />
491. Buijsman, M. C., J. C. McWilliams, and C. R. Jackson, East-west asymmetry in nonlinear internal waves<br />
from Luzon Strait, J. Geophys. Res., in press, 2010b.<br />
492. Buijsman, M. C., Y. Uchiyama, J. C. McWilliams, and C. Hill-Lindsay, Internal tides in the Southern<br />
California Bight, J. Geophys. Res., submitted, 2010c.<br />
493. Colas, F., X. Capet, and J. C. McWilliams, Eddy fluxes and induced circulation in Eastern Boundary<br />
Systems, in preparation, 2010.<br />
494. Colas, F., J. C. McWilliams, X. Capet, and J. Kurian, Heat balance and eddies in the Peru-Chile Current<br />
System, submitted, 2010.<br />
495. Colas, F., M. J. Molemaker, J. C. McWilliams, and X. Capet, Regimes of near-surface submesoscale<br />
dynamics in Eastern Boundary Systems, in preparation, 2010.<br />
496. Dong, C., E. Y. Idica, and J. C. McWilliams, Circulation and multiple-scale variability in the Southern<br />
California Bight, Prog. Oceanogr., 82, doi:10.1016/j.pocean.2009.07.005, 2009.<br />
497. Dong, C., and J. C. McWilliams, Island wakes in shallow water, J. Phys. Oceanogr., in preparation,<br />
2010.<br />
498. Dong, C., J. C. McWilliams, A. Hall, and M. Hughes, Numerical simulation of a synoptic event in the<br />
Southern California Bight, J. Geophys. Res., submitted, 2010a.<br />
499. Dong, C., P. Sangra, Y. Liu, J. C. McWilliams, and M. Hughes, Island effects on the Canary Current, J.<br />
Geophys. Res., in preparation, 2010b.<br />
500. Gruber, N., Z. Lachkar, H. Frenzel, P. Marchesiello, M. Münnich, and J. C. McWilliams, Mesoscale<br />
eddy-induced reduction of biological production in eastern boundary upwelling systems, Nature, 2010.<br />
501. Idica, E., 2010, Contaminant Transport in the Southern California Bight, Ph.D. thesis, Dept. of Civil and<br />
Environmental Engineering, University of California, Los Angeles.<br />
502. Jin, X., C. Dong, J. Kurian, J. C. McWilliams, D. B. Chelton, and Z. Li, SST-wind interaction in coastal<br />
upwelling: Oceanic simulation with empirical coupling, J. Phys. Oceanogr., 39, 2957–2970, 2009.<br />
503. Jin, X., N. Gruber, C. Deutsch, H. Frenzel, Z. Lachkar, D. Loher, J. C. McWilliams, and T. Nagai, The<br />
impact of the coastal ocean on global biogeochemical cycles from both global and regional model<br />
simulations, Global Biogeochem. Cycles, in preparation, 2010.<br />
504. Kurian, J., F. Colas, X. Capet, J. C. McWilliams, and D. B. Chelton, Eddy properties in the California<br />
Current System, in preparation, 2010.<br />
505. LeMarié, F., J. C. McWilliams, A. F. Shchepetkin, L. Debreu, and M. J. Molemaker, Minimizing spurious<br />
diapycnal mixing associated with tracer advection, Ocean Modell., in preparation, 2010.<br />
506. Li, Z., K. Ide, Y. Chao, and J. C. McWilliams, A multi-scale three-dimensional variational data<br />
assimilation system for coastal ocean forecasting systems, in preparation, 2010.<br />
507. Mason, E., 2009, High-resolution modelling of the Canary Basin oceanic circulation, Ph.D. thesis,<br />
Universidad de Las Palmas de Gran Canaria.<br />
508. Mason, E., F. Colas, J. Molemaker, A. F. Shchepetkin, C. Troupin, J. C. McWilliams, and P. Sangra,<br />
Seasonal variability in the Canary Basin: a numerical study, J. Geophys. Res., submitted, 2010a.<br />
509. Mason, E., M. J. Molemaker, A. Shchepetkin, F. Colas, J. C. McWilliams, and P. Sangra, Procedures<br />
for offline grid nesting in regional ocean models, Ocean Modell., 35, 1–15, 2010b.<br />
510. McWilliams, J. C., A perspective on submesoscale geophysical turbulence, in Proceedings of Newton<br />
Institute Conference on The Nature of High Reynolds Number Turbulence, Oct. 30, 2008 to Dec. 12,<br />
2008, in press, 2010.<br />
511. McWilliams, J. C., and F. Colas, Climate heat balance off western South America: regional oceanic<br />
circulation and eddies, CLIVAR Exchanges Newsletter, 53, 14–16, 2010.<br />
512. McWilliams, J. C., F. Colas, and M. J. Molemaker, Cold filamentary intensification and oceanic surface<br />
convergence lines, Geophys. Res. Lett., 36, L18602, doi:10.1029/2009GL039402, 2009.
513. McWilliams, J. C., and M. J. Molemaker, Baroclinic frontal arrest: a sequel to unstable frontogenesis, J.<br />
Phys. Oceanogr., submitted, 2010.<br />
514. McWilliams, J. C., M. J. Molemaker, and E. I. Olafsdottir, Linear fluctuation growth during frontogenesis,<br />
J. Phys. Oceanogr., 39, 3111–3129, 2009.<br />
515. Mitarai, S., D. A. Siegel, J. R. Watson, C. Dong, and J. C. McWilliams, Quantifying connectivity in the<br />
coastal ocean with application to the Southern California Bight, J. Geophys. Res., 114, C10026, 2009.<br />
516. Molemaker, M. J., J. C. McWilliams, and W. K. Dewar, Submesoscale generation of mesoscale<br />
anticyclones in the California Undercurrent, in preparation, 2010.<br />
517. Nencioli, F., C. Dong, T. Dickey, L. Washburn, and J. C. McWilliams, A vector geometry based eddy<br />
detection algorithm and its application to a high-resolution numerical model product and high-frequency<br />
radar surface velocities in the Southern California Bight, J. Atmos. Oceanic Technology, 27, 564–579,<br />
2010.<br />
518. Roullet, G., J. C. McWilliams, X. Capet, and M. J. Molemaker, Impact of surface PV on the energetics of<br />
the forced-dissipated baroclinic instability, J. Phys. Oceanogr., submitted, 2010.<br />
519. Sangra, P., A. Pascual, A. Rodriguez-Santana, F. Machin, E. Mason, J. C. McWilliams, J.-L. Pelegri, C.<br />
Dong, A. Rubio, J. Aristegui, A. Marrero-Diaz, A. Hernandez-Guerra, A. Martinez-Marrero, and M.<br />
Auladell, The Canary eddies corridor: A major pathway for long-lived eddies in the subtropical North<br />
Atlantic, Deep-Sea Res. I, 56, 2100–2114, 2009.<br />
520. Shchepetkin, A. F., and J. C. McWilliams, An accurate Boussinesq oceanic model with a practical,<br />
“stiffened” equation of state, Ocean Modell., submitted, 2010.<br />
521. Uchiyama, Y., and J. C. McWilliams, Three-dimensional unstable rip currents, J. Geophys. Res., in<br />
preparation, 2010.<br />
522. Uchiyama, Y., J. C. McWilliams, and J. M. Restrepo, Wave-current interaction in nearshore shear<br />
instability analyzed with a vortex-force formalism, J. Geophys. Res., 114, C06021,<br />
doi:10.1029/2008JC005135, 2009.<br />
523. Uchiyama, Y., J. C. McWilliams, and A. F. Shchepetkin, Wave-current interaction in a three-dimensional<br />
circulation model with a vortex force formalism: Application to the surf zone, Ocean Modell., 34, 16–35,<br />
2010.<br />
524. Wang, X., Y. Chao, C. Dong, J. Farrara, Z. Li, J. McWilliams, J. D. Paduan, and L. K. Rosenfeld,<br />
Modeling tides in Monterey Bay, California, Deep-Sea Res. II, 56, 219–231, 2009.<br />
525. Watson, J. R., C. G. Hays, P. T. Raimondi, S. Mitarai, D. A. Siegel, C. Dong, J. C. McWilliams, and C.<br />
A. Blanchette, Currents connecting communities: The decay of nearshore community similarity with<br />
ocean circulation, Ecology, submitted, 2010.<br />
526. Watson, J. R., S. Mitarai, D. A. Siegel, J. Caselle, C. Dong, and J. C. McWilliams, Simulating larval<br />
connectivity in the Southern California Bight, Mar. Ecol. Progr. Ser., submitted, 2009.<br />
527. Weir, B., Y. Uchiyama, E. M. Lane, J. M. Restrepo, and J. C. McWilliams, A vortex force analysis of the<br />
interaction of rip currents and gravity waves, J. Geophys. Res., submitted, 2010.<br />
OCE100001<br />
528. A. Griesel, S.T. Gille, J. Sprintall, J.L. McClean, M. Maltrud : Eulerian and Lagrangian diffusivities in the<br />
Southern Ocean inferred from an eddying model, in preparation. 2011.<br />
529. A. Griesel, S.T. Gille, J. Sprintall, J.L. McClean, J.H. LaCasce, M. Maltrud: Isopycnal diffusivities in the<br />
Antarctic Circumpolar Current inferred from Lagrangian floats in an eddying model, J. Geophys. Res.,<br />
115, doi:10.1029/2009JC005821. 2010.<br />
Physics<br />
ASC090004<br />
530. Y. Nakatsukasa, Z. Bai and F. Gygi, “Optimizing Halley’s Iteration for Computing the Matrix Polar<br />
Decomposition”, SIAM J. Matrix Anal. & Appl., 31, 2700 (2010).<br />
531. D. Donadio, L. Spanu, I. Duchemin, F. Gygi, G. Galli, “Ab initio investigation of the melting line of<br />
nitrogen at high pressure”, Phys. Rev. B82, 020102(R) (2010).<br />
532. I. Duchemin and F. Gygi, “A scalable and accurate algorithm for the computation of Hartree–Fock<br />
exchange”, Comput. Phys. Comm. 181, 855 (2010).<br />
533. T.A. Pham, T.S. Li, S. Shankar, F. Gygi, and G. Galli, First-principles investigations of the dielectric<br />
properties of crystalline and amorphous Si3N4 thin films, Applied Physics Letters 96 (2010).<br />
534. C. Zhang, D. Donadio and G. Galli, “First Principle Analysis of the IR Stretching Band of Liquid Water”, J.<br />
Phys. Chem. Lett. 1, 1398 (2010).<br />
DMR080007
535. Combining optical transparency with electrical conductivity: challenges and prospects, J.E. Medvedeva,<br />
in “Transparent Electronics: From Synthesis to Applications”, Editors: A. Facchetti and T. Marks,<br />
Publisher: John Wiley & Sons. April, 2010<br />
536. Complex transparent conducting oxides with tunable properties: role of crystal symmetry, chemical<br />
composition and carrier generation, J.E. Medvedeva and C.L. Hettiarachchi, Physical Review B, 81,<br />
125116 (2010)<br />
MCA08X034<br />
537. An Arnoldi-Lanczos Program to Propagate the Time-Dependent Schrödinger equation X. Guan, C.J.<br />
Noble, O. Zatsarinny, K. Bartschat, B. I. Schneider, Comp. Phys. Commun. 180, 2401 (2009).<br />
538. Probing Electron Correlation via Attosecond XUV Pulses in the Two-Photon Double Ionization of Helium<br />
J. Feist, S. Nagele, R. Pazourek, E. Persson, B. I. Schneider, L. A. Collins, and J. Burgdörfer, Phys.<br />
Rev. Lett. 103, 063002 (2009).<br />
539. Electron Correlation in Two Photon Double Ionization of Helium from Attosecond to XFEL Pulses J.<br />
Feist, R. Pazourek, S. Nagele, E. Persson, B. I. Schneider, L. A. Collins, and J. Burgdörfer, J. Phys. B.<br />
42, 134014 (2009).<br />
540. Spin-asymmetry function for elastic electron scattering from lead atoms in the energy range 11−14 eV<br />
V. Hamelbeck, G.F. Hanne, O. Zatsarinny, K. Bartschat, R.K. Gangwar, and R. Srivastava, Phys. Rev.<br />
A 80, 062711 (2009).<br />
541. Complete Breakup of the Helium Atom by Proton and Antiproton Impact X. Guan and K. Bartschat,<br />
Phys. Rev. Lett. 103, 213201 (2009).<br />
542. Time-dependent B-spline R-matrix Approach to Double Ionization of Atoms by XUV Laser Pulses X.<br />
Guan, O. Zatsarinny, C.J. Noble, K. Bartschat, and B.I. Schneider, J. Phys. B 42, 134015 (2009).<br />
543. Absolute angle-differential cross sections for electron-impact excitation of neon within the first 3.5 eV<br />
above threshold M. Allan, K. Franz, H. Hotop, O. Zatsarinny, and K. Bartschat, J. Phys. B 42, 044009<br />
(2009).<br />
544. Ab Initio Calculation of Two-Electron Emission by Attosecond Pulses J. Feist, R. Pazourek, S. Nagele,<br />
E. Persson, B. I. Schneider, L. A. Collins, and J. Burgdörfer, J. Phys.: Conf. Ser. 194, 012010 (2009).<br />
545. Spin-resolved electron-impact excitation of the 6s6p (J = 1) states in mercury F. Jüttemann, G.F.<br />
Hanne, O. Zatsarinny, and K. Bartschat, Phys. Rev. A 79, 042712 (2009).<br />
546. Fully Relativistic R-Matrix Calculations for Electron Collisions with Mercury O. Zatsarinny and K.<br />
Bartschat, Phys. Rev. A 79, 042713 (2009).<br />
547. Benchmark Calculations for Near-Threshold Electron-Impact Excitation of Krypton and Xenon Atoms O.<br />
Zatsarinny and K. Bartschat, J. Phys. B 43, 074031 (2010).<br />
548. Few-cycle Intense Laser Interactions with Complex Atoms X. Guan and K. Bartschat, J. Phys.: Conf.<br />
Ser. 212, 012023 (2010).<br />
549. Ionization of Atomic Hydrogen in Strong Infrared Laser Fields A. N. Grum-Grzhimailo, B. Abeln, K.<br />
Bartschat, D. Weflen, and T. Urness, Phys. Rev. A 81, 043408 (2010).<br />
550. Differential Cross Sections for Non-Sequential Double Ionization of He by 52 eV Photons from FLASH<br />
M. Kurka, J. Feist, D. A. Horner, A. Rudenko, Y. H. Jiang, K. U. Kühnel, L. Foucar, T. N. Rescigno, C.<br />
W. McCurdy, R. Pazourek, S. Nagele, M. Schulz, O. Herrwerth, M. Lezius, M. F. Kling, M. Schöffler, A.<br />
Belkacem, S. Düsterer, R. Treusch, B. I. Schneider, L. A. Collins, J. Burgdörfer, C. D. Schröter, R.<br />
Moshammer, and J. Ullrich, New J. Phys. 12, 073035 (2010).<br />
551. Delay in Photoemission M. Schultze, M. Fiess, N. Karpowicz, J. Gagnon, M. Korbman, M. Hofstetter, S.<br />
Neppl, A. L. Cavalieri, Y. Komninos, Th. Mercouris, C. A. Nicolaides, R. Pazourek, S. Nagele, J. Feist,<br />
J. Burgdörfer, A. M. Azzeer, R. Ernstorfer, R. Kienberger, U. Kleineberg, E. Goulielmakis, F.<br />
Krausz, and V. S. Yakovlev, Science 328, 1658 (2010).<br />
552. Angle-differential Stokes parameters for spin-polarized electron-impact excitation of the Hg (6s6p) ³P₁<br />
state at 25 eV scattering energy F. Jüttemann, G.F. Hanne, O. Zatsarinny, K. Bartschat, R. Srivastava,<br />
R.K. Gangwar, and A.D. Stauffer, Phys. Rev. A 81, 012705 (2010).<br />
553. New light on the Kr (4p⁵5s²) Feshbach resonances: high-resolution electron scattering experiments<br />
and B-spline R-matrix calculations T.H. Hoffmann, M.-W. Ruf, H. Hotop, O. Zatsarinny, K. Bartschat,<br />
and M. Allan, J. Phys. B 43, 085206 (2010).<br />
554. Benchmark calculations for near-threshold electron-impact excitation of krypton and xenon atoms O.<br />
Zatsarinny and K. Bartschat, J. Phys. B 43, 074031 (2010).<br />
555. Electron impact excitation of the (3d¹⁰4s) ²S₁/₂ → (3d⁹4s²) ²D₅/₂,₃/₂ transitions in copper atoms O.<br />
Zatsarinny, K. Bartschat, V. Suvorov, P.J.O. Teubner, and M.J. Brunger, Phys. Rev. A 81, 062705<br />
(2010).<br />
556. Two-Photon Double Ionization of H2 in Intense Femtosecond Laser Pulses X. Guan, K. Bartschat, and<br />
B.I. Schneider, accepted as Rapid Communication in Phys. Rev. A (2010); preprint available from<br />
http://arxiv.org/abs/1009.4866
557. Strong-Field Ionization of Lithium in a MOTREMI M. Schuricke, J. Steinmann, G. Zhu, I. Ivanov, A.S.<br />
Kheifets, A.N. Grum-Grzhimailo, K. Bartschat, A. Dorn, and J. Ullrich, submitted to Phys. Rev. A (2010).<br />
558. Near-threshold electron impact excitation of the argon 3p⁵4s configuration – new and revised<br />
normalized differential cross sections using recent time-of-flight measurements for normalization. M.A.<br />
Khakoo, O. Zatsarinny, and K. Bartschat, submitted to J. Phys. B (2010).<br />
559. Unexpected effects in spin-polarized electron impact excitation of the (3d¹⁰4s5s) ³S₁ state in zinc L.<br />
Pravica, J.F. Williams, D. Cvejanović, S. Samarin, K. Bartschat, O. Zatsarinny, A.D. Stauffer, and R.<br />
Srivastava, submitted to Phys. Rev. Lett. (2010).<br />
560. Benchmark Calculations of Atomic Data for Plasma and Lighting Applications K. Bartschat and O.<br />
Zatsarinny, submitted to Plasma Sources Sci. and Technol. (2010).<br />
561. Universal Features in Sequential and Nonsequential Two-Photon Double Ionization of Helium R.<br />
Pazourek, J. Feist, S. Nagele, E. Persson, B. I. Schneider, L. A. Collins, and J. Burgdörfer<br />
562. Using Neutrons in Atomic Physics: Neutron Impact Ionization of Helium M. Liertzer, J. Feist, S. Nagele,<br />
and J. Burgdörfer<br />
563. Apparent Time Delays in Classical and Quantum Mechanical Atomic Attosecond Streaking S. Nagele,<br />
R. Pazourek, J. Feist, K. Tokesi, C. Lemell, and J. Burgdörfer<br />
564. Two-Photon Double Ionization of Helium by Chirped Attosecond XUV Pulses R. Pazourek, G.<br />
Schoissengeier, S. Nagele, J. Feist, B. I. Schneider, L. A. Collins, and J. Burgdörfer<br />
565. Time-Dependent Perturbation Theory for Two-Photon Processes: Sequential vs. Nonsequential<br />
Ionization J. Feist<br />
566. High-Precision Study of Elastic Electron Scattering from Krypton Atoms M. Allan, O. Zatsarinny, and K.<br />
Bartschat<br />
567. Electron Impact Excitation of the 4p55s States in Krypton M. Allan, O. Zatsarinny, and K. Bartschat<br />
568. Fully Relativistic Dirac B-Spline R-Matrix Calculations for Electron Scattering from Xenon M. Allan, O.<br />
Zatsarinny, and K. Bartschat<br />
569. Elastic electron scattering from atomic and molecular iodine O. Zatsarinny, K. Bartschat, F. Blanco, and<br />
G. Garcia<br />
570. Electron scattering from copper O. Zatsarinny and K. Bartschat<br />
571. Alignment effects in two-photon double ionization of H2 X. Guan, K. Bartschat, and B.I. Schneider<br />
572. Time-dependent treatment of one-photon double ionization of H2 X. Guan, K. Bartschat, and B.I.<br />
Schneider<br />
PHY080042<br />
573. T. DeGrand, Y. Shamir and B. Svetitsky, “Running coupling and mass anomalous dimension of SU(3)<br />
gauge theory with two flavors of symmetric-representation fermions,” Phys. Rev. D 82, 054503 (2010)<br />
[arXiv:1006.0707 [hep-lat]].<br />
PHY090003<br />
574. Sperhake, U., E. Berti, V. Cardoso, F. Pretorius & N. Yunes (2010) Cross sections and maximum<br />
radiated energy in grazing collisions of spinning and nonspinning black hole binaries in preparation<br />
575. Berti, E., V. Cardoso & B. Kipapa (2010) Radiation from particles with arbitrary energy falling into<br />
higher-dimensional black holes in preparation<br />
576. Witek, H., D. Hilditch & U. Sperhake (2010) Stability of the puncture method with a generalized BSSN<br />
formulation in preparation<br />
577. Sperhake, U., E. Berti, V. Cardoso, F. Pretorius & N. Yunes (2010) Superkicks in Ultrarelativistic Grazing<br />
Collisions of Spinning Black Holes in preparation<br />
578. Brizuela, D., J. M. Martín-García, U. Sperhake & K. Kokkotas (2010) High-order perturbations of a<br />
spherical collapsing star accepted for publication in Phys. Rev. D, arXiv:1009.5605 [gr-qc]<br />
579. Barausse, E., V. Cardoso & G. Khanna (2010) Test bodies and naked singularities: is the self-force the<br />
cosmic censor? submitted to Phys. Rev. D, arXiv:1008.5159 [gr-qc]<br />
580. Witek, H., M. Zilhão, L. Gualtieri, V. Cardoso, C. Herdeiro, A. Nerozzi & U. Sperhake (2010) Numerical<br />
relativity for D dimensional spacetimes: head-on collisions of black holes and gravitational wave<br />
extraction accepted for publication in Phys. Rev. D, arXiv:1006.3081 [hep-th]<br />
581. Witek, H., V. Cardoso, C. Herdeiro, A. Nerozzi, U. Sperhake & M. Zilhão (2010) Black holes in a box:<br />
towards the numerical simulation of black holes in AdS accepted for publication in Phys. Rev. D,<br />
arXiv:1004.4633 [hep-th]<br />
582. Molina, C., P. Pani, V. Cardoso & L. Gualtieri (2010) Gravitational signature of Schwarzschild black<br />
holes in dynamical Chern-Simons gravity Phys. Rev. D 81, 124021<br />
583. Kesden, M., U. Sperhake & E. Berti (2010) Relativistic suppression of black hole recoils Astrophys. J.<br />
715, 1006-1011
584. Berti, E., V. Cardoso, T. Hinderer, M. Lemos, F. Pretorius, U. Sperhake & N. Yunes (2010)<br />
Semianalytical estimates of scattering thresholds and gravitational radiation in ultrarelativistic black hole<br />
encounters Phys. Rev. D 81, 104048<br />
585. Kesden, M., U. Sperhake & E. Berti (2010) Final spins from the merger of precessing binary black holes<br />
Phys. Rev. D 81, 084054<br />
586. Zilhão, M., H. Witek, U. Sperhake, V. Cardoso, L. Gualtieri, C. Herdeiro & A. Nerozzi (2010) Numerical<br />
relativity for D dimensional axially symmetric space-times: formalism and code tests Phys. Rev. D 81,<br />
084052<br />
587. Berti, E., V. Cardoso, L. Gualtieri, F. Pretorius & U. Sperhake (2009) Comment on “Kerr Black Holes as<br />
Particle Accelerators to Arbitrarily High Energy” Phys. Rev. Lett. 103, 239001<br />
588. Sperhake, U., V. Cardoso, F. Pretorius, E. Berti, T. Hinderer & N. Yunes (2009) Cross section, final spin<br />
and zoom-whirl behavior in high-energy black-hole collisions Phys. Rev. Lett. 103, 131102<br />
589. Lovelace, G., Y. Chen, M. Cohen, J. D. Kaplan, D. Keppel, K. D. Matthews, D. A. Nichols, M. A. Scheel<br />
& U. Sperhake (2010) Momentum flow in black-hole binaries: II. Numerical simulations of equal-mass,<br />
head-on mergers with antiparallel spins Phys. Rev. D 82, 064031<br />
590. Gabler, M., U. Sperhake & N. Andersson (2009) Non-linear radial oscillations of neutron stars Phys.<br />
Rev. D 80, 064012<br />
591. Sperhake, U., V. Cardoso, F. Pretorius, E. Berti, T. Hinderer & N. Yunes (2010) Ultra-relativistic grazing<br />
collisions of black holes Proceedings for the 12th Marcel Grossmann Meeting, arXiv:1003.0882 [gr-qc]<br />
592. Witek, H., V. Cardoso, C. Herdeiro, A. Nerozzi, U. Sperhake & M. Zilhão (2010) Black holes in a box<br />
Proceedings for the 12th Marcel Grossmann Meeting<br />
593. Witek, H., V. Cardoso, C. Herdeiro, A. Nerozzi, U. Sperhake & M. Zilhão (2010) Black holes in a box<br />
J. Phys. Conf. Ser. 229, 012072 Proceedings of the Spanish Relativity Meeting – ERE 2009<br />
595. Zilhão, M., H. Witek, V. Cardoso, C. Herdeiro, A. Nerozzi & U. Sperhake Numerical relativity in higher<br />
dimensions J. Phys. Conf. Ser. 229, 012074 Proceedings of the Spanish Relativity Meeting – ERE 2009<br />
596. Sperhake, U., Black hole collisions in higher-dimensional spacetimes, invited Seminar at the<br />
Department of Mathematics, Dec 2010, University of Southampton, UK<br />
597. Sperhake, U., Black-hole collisions and gravitational waves, Plenary Talk, Iberian Meeting Physics<br />
Education, Sep 2010, Vila Real, Portugal<br />
598. Cardoso, V., Quasinormal modes of black holes: recent (and past) advances, Invited Talk at the 19th<br />
Conference on General Relativity, Jul 2010, Mexico City, Mexico<br />
599. Sperhake, U., Relativistic suppression of black-hole superkicks, 19th Conference on General Relativity,<br />
Jul 2010, Mexico City, Mexico<br />
600. Berti, E., The interface between numerical relativity and data analysis: open problems, 8th International<br />
LISA Symposium, Jul 2010, Stanford University, CA<br />
601. Sperhake, U., 11-orbit inspiral of unequal mass black-hole binaries, Theory Meets Data Analysis at<br />
Comparable and Extreme Mass Ratios Workshop, Jun 2010, Perimeter Institute, Waterloo, Canada<br />
602. Pretorius, F., Black Hole Collisions, invited Talk, Taming Complexity Workshop, Apr 2010, CUNY, New<br />
York, NY<br />
603. Pretorius, F., Black Hole Collisions, invited Talk, APS Meeting, Feb 2010, Washington D.C.<br />
604. Pretorius, F., Black Hole Mergers, Feb 2010, Mathematics Department Colloquium, Harvard, MA<br />
605. Cardoso, V., High energy collisions of black holes, Invited seminar at the Departament de Física<br />
Fonamental, Nov 2009, Universitat de Barcelona, Spain<br />
PHY110009<br />
606. S.X. Hu, B. Militzer, V. N. Goncharov, S. Skupsky, “Strong Coupling and Degeneracy Effects in Inertial<br />
Confinement Fusion Implosions”, Phys. Rev. Lett. 104, 235003 (2010).<br />
607. S. X. Hu, “Optimizing the FEDVR-TDCC code for exploring the quantum dynamics of two-electron<br />
systems in intense laser pulses”, Phys. Rev. E 81, 056705 (2010).<br />
608. S. X. Hu, “Attosecond timing the ultrafast charge-transfer process in atomic collisions”, Phys. Rev. Lett.<br />
105, xxxxxx (2010) (in press).<br />
609. S. X. Hu and A. F. Starace, “Attosecond phase control of core-excited ionization of He atoms”, Phys.<br />
Rev. Lett. (to be submitted).<br />
610. S. X. Hu, L. A. Collins, and B. I. Schneider, “Attosecond soft-X-ray scanning of H2+ with circularly<br />
polarized pulses”, Phys. Rev. A (in preparation).<br />
PHY990002<br />
611. T. Chu, H. P. Pfeiffer, and M. A. Scheel. High accuracy simulations of black hole binaries: spins<br />
anti-aligned with the orbital angular momentum. Phys. Rev., D80:124051, 2009, arXiv:0909.1313 [gr-qc].<br />
612. M. D. Duez, F. Foucart, L. E. Kidder, C. D. Ott, and S. A. Teukolsky. Equation of state effects in black<br />
hole-neutron star mergers. Class. Quant. Grav., 27:114106, 2010, arXiv:0912.3528 [astro-ph.HE].
613. F. Foucart, M. D. Duez, L. E. Kidder, and S. A. Teukolsky. Black hole-neutron star mergers: effects of<br />
the orientation of the black hole spin. 2010, arXiv:1007.4203 [astro-ph.HE].<br />
614. G. Lovelace et al. Momentum flow in black-hole binaries: II. Numerical simulations of equal-mass,<br />
head-on mergers with antiparallel spins. Phys. Rev., D82:064031, 2010, arXiv:0907.0869 [gr-qc].<br />
615. G. Lovelace, M. A. Scheel, and B. Szilagyi. Simulating merging binary black holes with nearly extremal<br />
spins. 2010, arXiv:1010.2777.<br />
616. B. Szilagyi, L. Lindblom, and M. A. Scheel. Simulations of Binary Black Hole Mergers Using Spectral<br />
Methods. Phys. Rev., D80:124010, 2009, arXiv:0909.3557 [gr-qc].
EOT Event Details<br />
UIndiana<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method<br />
Workshop | MEAP – Networking | IUPUI, Indianapolis, IN | 14 JUN 2010 | 3 | 29 | 29 | S<br />
Workshop | MEAP – LEGO Mindstorms | IUPUI, Indianapolis, IN | 28 JUN–2 JUL 2010 | 10 | 23 | 23 | S<br />
Workshop | MEAP – No Guts, No Glory | IUPUI, Indianapolis, IN | 19 JUL 2010 | 3 | 28 | 28 | S<br />
Workshop | Exxon/Mobil Summer Camp | Indiana University, Bloomington, IN | 13–14 JUL 2010 | 12 | 22 | 12 | S<br />
Workshop | Wonderlab Summer Camp | Indiana University, Bloomington, IN | 23 JUL 2010 | 4 | 18 | 6 | S<br />
Demo | IU Statewide IT Conference Show and Tell Reception | Indiana University, Bloomington, IN | 27 SEP 2010 | 3 | 42 | 6 | S<br />
Tour | Garland with Engineering Dean and Faculty | IUPUI, Indianapolis, IN | OCT 2010 | 1 | 7 | 1 | S<br />
Tour | Joan Savage & Guest | IUPUI, Indianapolis, IN | OCT 2010 | 1 | 2 | 0 | S<br />
Tour | Networking Group | IUPUI, Indianapolis, IN | OCT 2010 | 1 | 17 | 2 | S<br />
Tour | Lighting & Field Production Class | IUPUI, Indianapolis, IN | OCT 2010 | 1 | 14 | 2 | S
Seminar | What is Cyberinfrastructure | SIGUCCS 2010, Norfolk, VA | 25 OCT 2010 | 1 | 18 | 3 | S<br />
Fair | Research Technologies Fair | Simon Hall, Indiana University, Bloomington, IN | 25 OCT 2010 | 3 | 14 | Unknown | S<br />
Tour | Northern Kentucky University | IUPUI, Indianapolis, IN | NOV 2010 | 1 | 4 | 0 | S<br />
Tour/Demo | Palakal and India Guests Demo/Tour | IUPUI, Indianapolis, IN | NOV 2010 | 1 | 9 | 1 | S<br />
Tour | Dentistry | IUPUI, Indianapolis, IN | NOV 2010 | 1 | 3 | 0 | S<br />
Tour | Tech Center | IUPUI, Indianapolis, IN | NOV 2010 | 1 | 12 | 1 | 0<br />
Conference | LEADING THE WAY – Stereoscopic Presentation at SACNAS | SACNAS 2010, Anaheim, CA | 2–4 OCT 2010 | 12 | 650 | 620 | S<br />
Expo | LEADING THE WAY – Stereoscopic Presentation at USA Science & Engineering Expo | Washington, DC | 24–25 OCT 2010 | 14 | 5500 | 2400 | S<br />
Seminar | FutureGrid – A Testbed for Cloud and Grid Computing | Indiana University, Bloomington, IN | 27 OCT 2010 | 1 | 32 | 3 | S<br />
Presentation | No Guts, No Glory | SC10, New Orleans, LA | 13 NOV 2010 | 1.5 | 25 | 3 | S<br />
Workshop | Gateway Computing Environments 2010 | SC10, New Orleans, LA | 14 NOV 2010 | 8.5 | 50 | Unknown | S<br />
Demo | Exposing Real-World Applications of Computational Science using Stereoscopic | SC10, New Orleans, LA | 15–18 NOV 2010 | 21 | 75 | 18 | S
Demo | Supporting Real Time Weather Predictions with LEAD II – Experiences with Vortex2 Experiment | SC10, New Orleans, LA | 15–18 NOV 2010 | 21 | 45 | 12 | S<br />
Demo | GlobalNOC Worldview Interactive Networking Visualization Tool | SC10, New Orleans, LA | 15–18 NOV 2010 | 21 | 60 | 10 | S<br />
Demo | Cloud Computing and FutureGrid | SC10, New Orleans, LA | 15–18 NOV 2010 | 21 | 38 | 2 | S<br />
Demo | Hybrid Workflows with LEAD II and Trident | SC10, New Orleans, LA | 15–18 NOV 2010 | 21 | 35 | 2 | S<br />
Seminar Presentation | Pushing Back on the Data Deluge: Advancements in Metadata, Archiving and Workflows | SC10, New Orleans, LA | 16 NOV 2010 | 4 | 8 | 1 | S<br />
Symposium | Open Gateway Computing Environments | SC10, New Orleans, LA | 16 NOV 2010 | 2 | 10 | Unknown | S<br />
Seminar | New and Emerging Technologies and Applications | SC10, New Orleans, LA | 17 NOV 2010 | 1 | 18 | 1 | S<br />
Seminar | FutureGrid and Cloud Computing | SC10, New Orleans, LA | 17 NOV 2010 | 1 | 5 | 1 | S<br />
Panel | Future of Supercomputing Centers | SC10, New Orleans, LA | 19 NOV 2010 | .25 | 150 | 40 | S<br />
Tour | Learning Community Class | IUPUI, Indianapolis, IN | 23 NOV 2010 | 1 | 38 | 2 | S<br />
Classroom | OSG Grid School | Sao Paulo, Brazil | 6–10 DEC 2010 | 8 | 50 | 45 | S<br />
Tour/Demo | CEMT | IUPUI, Indianapolis, IN | DEC 2010 | 1 | 3 | 0 | S<br />
Tour | Tech 104 | IUPUI, Indianapolis, IN | DEC 2010 | 1 | 57 | 2 | S<br />
Tour | Tech 104 | IUPUI, Indianapolis, IN | DEC 2010 | 1 | 20 | 2 | S<br />
ULONI<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method<br />
W | LONI HPC Workshop at Univ. Louisiana at Lafayette | Lafayette, LA | 11/1–2/10 | 16 | 12 | 4 | S<br />
T | Intro to HPC: Account Allocation & Management | LSU | 9/8/10 | 2 | 6 | 0 | S<br />
T | Job Management w/ PBS & Loadleveler | LSU | 9/15/10 | 2 | 5 | | S<br />
T | Introduction to MPI | LSU | 9/22/10 | 2 | 7 | | S<br />
T | Advanced MPI | LSU | 9/29/10 | 2 | 4 | | S<br />
T | Introduction to Cactus | LSU | 10/4/10 | 2 | 15 | 3 | S<br />
T | Introduction to Gaussian | LSU | 10/13/10 | 2 | 4 | | S<br />
T | Introduction to OpenMP | LSU | 10/20/10 | 2 | 5 | | S<br />
T | OpenMP part 2 | LSU | 10/27/10 | 2 | 4 | | S<br />
T | Globus | LSU | 11/10/10 | 2 | 8 | | S<br />
T | Introduction to HPC Visualization | LSU | 11/3/10 | 2 | 7 | | S<br />
T | Introduction SAGA | LSU | 11/29/10 | 2 | 12 | | S<br />
UNCAR<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method<br />
NCAR: Seminar | Recent Advancements in High Resolution Climate Modeling | 14th Workshop on the use of HPC in Meteorology | 1 Nov. 2010 | 1.0 | 150 | unknown | S<br />
NCAR: Tutorial | Building Cloud Clusters with Amazon EC2 | SC10, New Orleans | 15 Nov. 2010 | 1.0 | 100 | unknown | S<br />
NCAR: Tutorial | VAPOR: A Tool for Interactive Visualization of Massive Earth-Science Data Sets | AGU 2010, San Francisco | 13 Dec. 2010 | 1.0 | 28 | unknown | S<br />
UNCSA<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method<br />
Online Tutorial | Access Grid Tutorials | CI-Tutor | Ongoing | N/A | 17 | Unknown | A<br />
Online Tutorial | BigSim: Simulating PetaFLOPS Supercomputers | CI-Tutor | Ongoing | N/A | 7 | Unknown | A<br />
Online Tutorial | Debugging Serial and Parallel Codes | CI-Tutor | Ongoing | N/A | 31 | Unknown | A<br />
Online Tutorial | Getting Started on the TeraGrid | CI-Tutor | Ongoing | N/A | 13 | Unknown | A<br />
Online Tutorial | Intermediate MPI | CI-Tutor | Ongoing | N/A | 84 | Unknown | A<br />
Online Tutorial | Introduction to MPI | CI-Tutor | Ongoing | N/A | 354 | Unknown | A<br />
Online Tutorial | Introduction to Multicore Performance | CI-Tutor | Ongoing | N/A | 32 | Unknown | A<br />
Online Tutorial | Introduction to OpenMP | CI-Tutor | Ongoing | N/A | 102 | Unknown | A<br />
Online Tutorial | Introduction to Performance Tools | CI-Tutor | Ongoing | N/A | 12 | Unknown | A<br />
Online Tutorial | Introduction to Visualization | CI-Tutor | Ongoing | N/A | 18 | Unknown | A<br />
Online Tutorial | Multilevel Parallel Programming | CI-Tutor | Ongoing | N/A | 34 | Unknown | A<br />
Online Tutorial | Parallel Computing Explained | CI-Tutor | Ongoing | N/A | 45 | Unknown | A<br />
Online Tutorial | Parallel Numerical Libraries | CI-Tutor | Ongoing | N/A | 20 | Unknown | A<br />
Online Tutorial | Performance Tuning for Clusters | CI-Tutor | Ongoing | N/A | 24 | Unknown | A<br />
Online Tutorial | Tuning Applications for High Performance Networks | CI-Tutor | Ongoing | N/A | 8 | Unknown | A<br />
Workshop | 2010 NWChem Workshop | NCSA | Dec 1–2, 2010 | 16 | 14 | Unknown | S<br />
NICS<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method<br />
Workshop | NICS/OLCF XT Hex-Core Workshop | Oak Ridge National Lab, TN | May 10-12, 2010 | 24 hrs | 65 | 20 | Live<br />
Workshop | Computational Thinking for Educators | ORAU, Oak Ridge, TN | Jun 12-14, 2010 | 21 hrs | 20 | 17 | Live<br />
College Class | U. Tennessee-Knoxville: Data Structures | Knoxville, TN | 2010 Spring semester | — | 45 | 16 | Live<br />
College Class | U. Tennessee-Knoxville: graduate HPC class | Knoxville, TN | 2010 Spring semester | — | 18 | 3 | Live<br />
College Class | Wofford College | Spartanburg, SC | 2010 Spring semester | — | 6 | 2 | Live<br />
College Class | Tufts University | Boston, MA | 2010 Spring semester | — | 12 | Unknown | Live<br />
College Class | Brown University | Providence, RI | 2010 Spring semester | — | 11 | Unknown | Live<br />
College Class | University of Colorado | Boulder, CO | 2010 Spring semester | — | 14 | Unknown | Live<br />
ORNL<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method<br />
Talk | SNS and TeraGrid's NSTG: A best practice for integration of national cyberinfrastructure with an experimental user facility | Gatlinburg, TN | Oct. 10-13, 2010 | 1 | 56 | 13 | S<br />
Workshop | NOBUGS 2010 | Gatlinburg, TN | Oct. 10-13, 2010 | 24 | 56 | 13 | S<br />
Exhibit | SC'10 booth and SC education program | — | — | — | — | — | —<br />
PSC<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method¹<br />
Focus Group | Role of SafeNet in K-12 Cyber Safety Training | PSC, Pittsburgh, PA | 12/15 | 4 | 9 | 4 | S<br />
Presentation | Computational Approach to Teaching Secondary Math and Science | TRETC, Pittsburgh, PA | 11/10 | 2 | 6 | 3 | S<br />
Conference | Exhibit booth at Three Rivers Ed Tech Conference | TRETC, Pittsburgh, PA | 11/10 | 8 | 30 | 18 | S<br />
PSC Visit | Careers in HPC/PSC Overview | PSC, Pittsburgh, PA | 11/17 | 2 | 13 | 2 | S<br />
¹ Methodology: Synchronous (e.g., face-to-face in classroom or live instruction over the Internet) or asynchronous (e.g., via the Internet using WebCT or other authoring software). S = synchronous; A = asynchronous.<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method<br />
Poster Session | CMIST: Enzyme – Structure & Function Module | SC10, New Orleans, LA | 11/13 | 1.5 | 25 | 15 | S<br />
Poster Session | BEST: Better Educators of Science for Tomorrow | SC10, New Orleans, LA | 11/13 | 1.5 | 40 | 15 | S<br />
Poster Session | Bioinformatics in High School (presented by Pittsburgh area high school teachers and PSC staff) | SC10, New Orleans, LA | 11/13 | 1.5 | 50 | 15 | S<br />
Poster Session | High School Research Internship in Bioinformatics (presented by PSC summer interns) | SC10, New Orleans, LA | 11/13 | 1.5 | 50 | 15 | S<br />
Poster Session | Summer Internship at PSC (presented by PSC summer interns) | SC10, New Orleans, LA | 11/13 | 1.5 | 50 | 15 | S<br />
Training | TeraGrid New User Training Session | PSC/Online | 10/29 | 2 | 30 | Unknown (6 survey responses) | S<br />
Conference | SC10 Conference Exhibitor Booth | SC10, New Orleans, LA | 11/12-11/19 | 24 | app. 750 | app. 100 | S<br />
Conference | SC10 Conference, Education Program | SC10, New Orleans, LA | 11/12-11/19 | 45 | 135 supported, 54 self-paid | 75 | S<br />
Conference | SC10 Conference, Student Field Trips | SC10, New Orleans, LA | 11/12-11/19 | 5 | 210 high school students + chaperones | will know later | S<br />
Training | Security for Lunch | Pittsburgh, PA | 11/8 | 1.5 | 65 | 15 | S<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method<br />
Training | Performance Engineering of Parallel Applications | European-US Summer HPC School, Catania, Italy | 10/6 | 1.5 | 60 | 12 | S<br />
Purdue<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method<br />
Presentation | Han Zhang, "Domain-specific web services for scientific application developers," GCE10 at SC10 | SC10, New Orleans, LA | Nov. 14, 2010 | 30 min | 20 | — | —<br />
Panel | Carol Song, Preston Smith, "Grid Computing Technologies" panel | Purdue CI Days, West Lafayette, IN | Dec. 9, 2010 | 1 hr | 40 | — | —<br />
Presentation | Kay Hunt, Cyberinfrastructure Assessment Workshop | Arlington, VA | Oct. 2010 | — | — | — | —<br />
Presentation | Kay Hunt, Fall 2010 Internet2 member meeting | Atlanta, GA | Nov. 2010 | — | — | — | —<br />
SDSC<br />
Type | Title | Location | Date(s) | Hours | # of Partic. | # of Underrep. | Method<br />
Tutorial (Teachers) | Webinar: Using Graphical Organizers to Engage Students in the Classroom | Web-based Tutorial | 12-9-2010 | 1.5 | 6 | 2 | S<br />
Conf. Present'n | From Data to Discovery: Lessons on Using Data to Connect Events in Our Environment | SC-10, New Orleans, LA | Nov. 2010 | 1.5 | 20 | 14 | S<br />
Conf. Present'n | Understanding the Impact of Emerging Non-Volatile Memories on High-Performance, IO-Intensive Computing | SC-10, New Orleans, LA | Nov. 2010 | 1 | 200 | 20 | S<br />
Conf. Present'n | Moneta: A High-performance Storage Array Architecture for Next-generation, Non-volatile Memories | IEEE/ACM MICRO; Atlanta, GA | Dec. 2010 | 1 | 200 | 20 | S<br />
Conf. Booth | SACNAS Annual Conference | Anaheim, CA | October 2010 | 10 | 3,350 | 3,350 | S<br />
Workshop (Teachers) | The Basics: DNA Extraction, Size Exclusion, Chromatography | SDSC, La Jolla, CA | 9-23-2010 | 2 | 22 | 17 | S<br />
Workshop (Teachers) | SMART Team Teacher Meeting | SDSC, La Jolla, CA | 9-24-2010 | 2 | 6 | 4 | S<br />
Workshop (Teachers) | Graphing Calculators for Educators | SDSC, La Jolla, CA | 10-6-2010 | 3 | 9 | 6 | S<br />
Workshop (Students) | Fall Fest Digital Costume Illustration Contest | SDSC, La Jolla, CA | 10-9-2010 | 6 | 17 | 17 | S<br />
Workshop (Students) | Introduction to Graphing Calculators for Middle and High School Students | SDSC, La Jolla, CA | 10-16-2010 | 4 | 19 | 10 | S<br />
Workshop (Students) | SMART Team: Introduction to Biochemistry | SDSC, La Jolla, CA | 10-16-2010 | 4 | 26 | 15 | S<br />
Workshop (Students) | Introduction to Adobe Illustrator – Part 1 | SDSC, La Jolla, CA | 10-16-2010 | 6 | 7 | 5 | S<br />
Workshop (Teachers) | Proteins in Action 1: ELISA | SDSC, La Jolla, CA | 10-19-2010 | 2 | 14 | 10 | S<br />
Workshop (Student) | Introduction to Adobe Illustrator – Part 2 | SDSC, La Jolla, CA | 10-23-2010 | 6 | 7 | 5 | S<br />
Workshop (Student) | SMART Team Hands-on Biotech Activities | TSRI | 10-30-2010 | 4 | 18 | 11 | S<br />
Workshop (Teachers) | Analysis of Proteins: Protein Quantification | SDSC, La Jolla, CA | 11-2-2010 | 2 | 12 | 8 | S<br />
Workshop (Teachers) | Finding Great Web 2.0 Resources for the Classroom | SDSC, La Jolla, CA | 11-4-2010 | 2 | 11 | 7 | S<br />
Workshop (Teachers) | SpaceTECH: Hands-on, Standards-based Astronomy Activities | SDSC, La Jolla, CA | 11-6-2010 | 3 | 19 | 14 | S<br />
Workshop (Student) | SMART Team RasMol Training | SDSC, La Jolla, CA | 11-6-2010 | 3 | 21 | 13 | S<br />
Workshop (Student) | Introduction to Digital Comic Book Creation – Part 1 | SDSC, La Jolla, CA | 11-13-2010 | 6 | 15 | 11 | S<br />
Workshop (Teachers) | DNA Analysis: DNA Electrophoresis | SDSC, La Jolla, CA | 11-16-2010 | 2 | 18 | 13 | S<br />
Workshop (Teachers) | Gaseous Plasma: The 4th State of Matter, or 1st Step Toward Fusion | General Atomics Corp., San Diego, CA | 11-17-2010 | 2 | 18 | 12 | S<br />
Workshop (Teachers) | Compute This: Online Space Science Resources for the Classroom | SDSC, La Jolla, CA | 11-20-2010 | 2 | 22 | 15 | S<br />
Workshop (Teachers) | Proteins in Action: Biofuels and Enzymes | SDSC, La Jolla, CA | 11-30-2010 | 2 | 16 | 12 | S<br />
Workshop (Teachers) | Star Parties and More: Astronomy Tools and Techniques for the Classroom | SDSC, La Jolla, CA | 12-2-2010 | 2.5 | 15 | 10 | S<br />
Workshop (Teachers) | Understanding Mars | Space Travelers Emporium; San Diego, CA | 12-4-2010 | 3 | 21 | 14 | S<br />
Workshop (Teachers) | All About Space Flight: Curriculum and Materials for Grades 4-9 | San Diego Air and Space Museum, San Diego, CA | 12-11-2010 | 3 | 18 | 11 | S<br />
Workshop (Student) | SMART Team Mentor Match | The Scripps Research Institute, La Jolla, CA | 12-11-2010 | 3 | 30 | 18 | S<br />
Workshop (Student) | Intro to Digital Comic Book Creation – Part 2 | SDSC, La Jolla, CA | 12-11-2010 | 6 | 15 | 11 | S<br />
TACC<br />
Type | Title | Location | Date(s) | Hours | Number of Participants | Number of Underrepresented People | Method<br />
Workshop | Defensive Programming Part 1 | TACC | 10/7/2010 | 3 | 12 | 2 | S<br />
Workshop | Defensive Programming Part 2: System Testing | TACC | 10/28/2010 | 3 | 9 | 2 | S<br />
Workshop | Parallel Optimization for Ranger | TACC | 10/29/2010 | 7.5 | 8 | 3 | S<br />
Workshop | C++ Programming Basics | TACC | 12/7/2010 | 3 | 18 | 4 | S<br />
Workshop | Fortran 90/95/2003 Programming for HPC | TACC | 12/7/2010 | 4 | 12 | 3 | S<br />
Workshop | Data Analysis on Ranger | Cornell | 12/8-9/2010 | 16 | 12 | — | S<br />
Table: Ranger Virtual Workshop Usage<br />
Quarter | Total Logins | Unique Logins<br />
Q1 '08 | 74 | 20<br />
Q2 '08 | 129 | 26<br />
Q3 '08 | 198 | 31<br />
Q4 '08 | 135 | 31<br />
Q1 '09 | 184 | 29<br />
Q2 '09 | 638 | 216<br />
Q3 '09 | 504 | 403<br />
Q4 '09 | 438 | 289<br />
Q1 '10 | 425 | 349<br />
Q2 '10 | 657 | 377<br />
Total | 4220 | 1639<br />
Starting with Q3 '10, the Usage Table is being derived from a web tracker service. The<br />
combination of allowing guest visitors and the rapid increase in web crawlers made the data<br />
increasingly difficult to interpret.<br />
Quarter | Page Loads | Unique Visitors | First Time Visitors | Returning Visitors<br />
Q3 '10 | 2697 | 1026 | 970 | 56<br />
Q4 '10 | 2648 | 862 | 658 | 204<br />
TACC openly advertises tours of its facilities to the Austin and central Texas area. Tour groups<br />
include visitors from K-12, higher education, industry, government, and the general public. An<br />
overview of TeraGrid and its contributions to STEM research is provided at each event.<br />
Type | Title | Location | Date(s) | Number of Participants | Number of Underrepresented People<br />
Vislab Tour | SURGE: Roads to Research / UT Austin students | Vislab | Oct. 1 | 60 | 18<br />
Vislab Tour | Home School Group | Vislab | Oct. 5 | 9 | 5<br />
The Austin Forum on Science, Technology & Society | "Hopes and Fears for Big Science" with Nobel Laureate Dr. Steven Weinberg | AT&T Executive Education and Conference Center | Oct. 5 | 361 | Not tracked<br />
Machine Room Tour | St. Edward's University / Data Storage Class | Ranger Machine Room | Oct. 6 | 20 | 6<br />
Ranger Tour and Cluster Management Discussion | Chevron | Ranger Machine Room and ROC | Oct. 11 | 7 | N/A<br />
Ranger and Vislab Tours | Shell | Ranger Machine Room & Vislab | Oct. 12 | 3 | N/A<br />
Ranger and CMS Machine Room Tours | UT Austin undergraduate class in Parallel Computer Architecture | Ranger & CMS Machine Rooms | Oct. 18 | 15 | Not tracked<br />
Ranger and Vislab Tours | Dell and Bell Helicopter | Ranger Machine Room & Vislab | Oct. 19 | 7 | N/A<br />
Ranger Tour | US Senate Commerce Committee & Kay Bailey Hutchison's office | Ranger Machine Room | Oct. 20 | 3 | N/A<br />
Vislab Tour | Physics & CS UT Austin Freshman Interest Group | Vislab | Oct. 20 | 18 | 6<br />
Vislab Tour | UT Austin freshman public affairs class | Vislab | Oct. 21 | 18 | 11<br />
Vislab Tour | Electrical Engineering UT Austin freshman interest group | Vislab | Oct. 27 | 12 | Not tracked<br />
Vislab Tour | Chemistry & Biochemistry UT Austin Freshman | Vislab | Nov. 2 | 18 | 6<br />
Vislab Tour | Electrical & Computer Engineering UT freshman students | Vislab | Nov. 3 | 20 | 5<br />
Ranger Machine Room Tour | Austin Community College class / Computer Security | Ranger Machine Room | Nov. 3 | 17 | 8<br />
Vislab Tour | Houston Livestock Show & Rodeo Committee | Vislab | Nov. 4 | 20 | Not tracked<br />
The Austin Forum on Science, Technology & Society | "Transmedia Storytelling" with Daniel Lorenzetti & Juan Garcia | AT&T Executive Education and Conference Center | Nov. 9 | 158 | Not tracked<br />
Vislab Tour | Originality in the Arts and Sciences | Vislab | Nov. 9 | 16 | Not tracked<br />
Vislab Tour | Aerospace Engineering UT Austin freshman year interest group | Vislab | Dec. 1 | 19 | 7<br />
The Austin Forum on Science, Technology & Society | "CPRIT: Opportunities and Challenges" with Nobel Laureate Dr. Al Gilman (Cancer Prevention and Research Institute of Texas) | AT&T Executive Education and Conference Center | Dec. 7 | 86 | Not tracked<br />
Ranger Tour | National Instruments | Ranger Machine Room | Dec. 15 | 5 | N/A<br />
Totals | | | | 892 | 72<br />
TACC at SC10<br />
TACC staff members participated in the SC10 conference: 36 staff attended and contributed to<br />
tutorials, provided demonstrations and talks in the TACC booth, assisted in the TeraGrid booth,<br />
and presented at special sessions, BoFs, panels, and vendor booths. TACC presented 32<br />
posters in its booth; 12 described research projects accomplished by TeraGrid principal<br />
investigators and another 6 described TeraGrid HPC, visualization, and data systems (i.e., Ranger,<br />
Longhorn, Ranch) or other NSF-funded projects (FutureGrid, STCI awards). TACC booth<br />
activities and participation in other SC10 events are summarized in the following tables.<br />
SC10 TACC Booth Activities<br />
Type | Title | Presenter<br />
Presentation | Data-centric Cyberinfrastructure and Applications at the Texas Advanced Computing Center | Chris Jordan, TACC<br />
Demo | The Longhorn Visualization Portal: A Web-Based Interface to the Longhorn Visualization and Data Analysis Cluster | Greg Johnson, TACC<br />
Presentation | Parallel Optimization of Large Constrained Least Squares Problems in a Production Environment | Emre Brooks, UT Health Science Center at San Antonio<br />
Presentation | Lagrangian Simulation of the Deepwater Horizon Oil Spill: Application of the Advanced CIRCulation Model for Oceanic, Coastal, and Estuarine Waters (ADCIRC) | Corey Trahan, The University of Texas at Austin<br />
Demo | Visualizing the BP Oil Spill | Karla Vega, TACC<br />
Presentation | Visualizing Science at Scale with Longhorn | Kelly Gaither, TACC<br />
Presentation | Large Scale Distributed GPU Isosurfacing | Paul Navratil, TACC<br />
Presentation | NEMO and OMEN: A Journey Through Massively Parallel Nanoelectronics Modeling Tools | Sebastien Steiger, Purdue<br />
Presentation | XTED: Identifying and Evaluating the Next Generation of Cyberinfrastructure Software for Science | Maytal Dahan, Jay Boisseau (TACC)<br />
Presentation | Supercomputing to Feed the World | Dan Stanzione, TACC<br />
Presentation | Extreme-Scale AMR | Omar Ghattas, The University of Texas at Austin<br />
Presentation | TACC's Big Projects for Big Science in 2011 and Beyond | Jay Boisseau, TACC<br />
Presentation | Advanced Portals and Services at TACC | Steve Mock, TACC<br />
Demo | Stereoscopic Molecular Visualization | Brandt Westing, TACC<br />
TACC Participation in SC10 Events<br />
Type | Location | Title | Presenter<br />
Special Topic | — | Grand Challenges in Humanities, Arts, and Social Sciences Computing | Maria Esteva<br />
Special Topic | — | Recruiting and Training HPC Systems Administrators and Engineers | Dan Stanzione<br />
Presentation | NVIDIA Booth | Large Scale Distributed GPU Isosurfacing | Paul Navratil<br />
Special Topic | — | TACC's Scientific Computing Curriculum: Supporting Academia and Industry | Melyssa Fratkin, Bill Barth, Dan Stanzione<br />
Presentation | Intel Theater | TACC Aids Study of Gulf Oil Spill Impact | Karla Vega<br />
Paper | — | PerfExpert: An Easy-to-Use Performance Diagnosis Tool for HPC Applications | B.D. Kim, John McCalpin, Lars Koesterke<br />
Tutorial | — | MO2: Hybrid MPI and OpenMP Parallel Programming | Gabriele Jost<br />
Student Cluster Competition | — | — | B.D. Kim, Carlos Rosales (supervised student team from UT Austin)<br />
C<br />
Performance on CY2010 Milestones<br />
Columns: Project-ID | WBS | O/P/M | Task Description | Cross-Area Dependencies | Start Date | End Date | Planned Resources (Name [%FTE]), followed by the AD Status and AD Notes values reported in each quarterly report: QSR 2010Q4 (as of 12/31/10), QSR 2010Q3 (09/30/10), QSR 2010Q2 (06/30/10), QSR 2010Q1 (03/31/10), and QSR 2009Q4 (12/31/09).<br />
SI 2 Area: Software Integration<br />
SI 2.0 Objective 2.0 Operations: maintain/sustain current capabilities<br />
SI.Coord 2.0.1 O SI Area Coordination<br />
SI.Coord 2.0.1.3 O SI Area Coordination PY5 Aug-09 Jul-10 | UC Ian Foster[3%], Lee Liming[35%]; ANL JP Navarro [35%]<br />
SI.Coord 2.0.1.3.1 M PY5 Quarterly Reporting (9/30, 12/31, 3/31, 6/30) Aug-09 Jul-10 | 100% 100% 75% 50% 25%<br />
SI.Coord 2.0.1.4 O SI Area Coordination Extension Aug-10 Jul-11 | UC Ian Foster[3%], Lee Liming[35%]; ANL JP Navarro [35%]<br />
SI.Coord 2.0.1.4.1 M Extension Quarterly <strong>Report</strong>ing (9/30, 12/31, 3/31, 6/30) Aug-10 Jul-11 50% 0%<br />
SI.Inf 2.0.2 O Operate Infrastructure Services<br />
SI.Inf 2.0.2.3 O Operate Infrastructure Services PY5 Aug-09 Jul-10 | ANL JP Navarro [2%], Eric Blau[2%]; NCSA Jason Brechin[5%]; TACC Warren Smith [10%]<br />
SI.Inf 2.0.2.3.1 M PY5 Quarterly Reporting (9/30, 12/31, 3/31, 6/30) Aug-09 Jul-10 | 100% 100% 75% 50% 25%<br />
SI.Inf 2.0.2.4 O Operate Infrastructure Services Extension Aug-10 Jul-11 | ANL JP Navarro [2%], Eric Blau[2%]; NCSA Jason Brechin[5%]; TACC Warren Smith [10%]<br />
SI.Inf 2.0.2.4.1 M Extension Quarterly <strong>Report</strong>ing (9/30, 12/31, 3/31, 6/30) Aug-10 Jul-11 50% 0%<br />
SI.CTSS 2.0.3 O Maintain Current CTSS Kits<br />
SI.CTSS 2.0.3.3 O Maintain Selected Current CTSS Kits PY5 Aug-09 Jul-10 | UC Eric Blau[45%], Lee Liming[15%], TBD[20%]; NCSA Jason Brechin[14%]; TACC Warren Smith[10%]; ANL JP Navarro [10%]; UW TBD[7%], Jaime Frey[7%], Greg Thaine[6%]<br />
SI.CTSS 2.0.3.3.1 M PY5 Quarterly Reporting (9/30, 12/31, 3/31, 6/30) Aug-09 Jul-10 | 100% 100% 75% 50% 25%<br />
SI.CTSS 2.0.3.4 P Globus Toolkit 5 - GRAM, GridFTP, and Client Oct-09 Jul-10 | 100% More RPs are in the process of installing; closed and tracked under extension line | 100% More RPs are in the process of installing; closed and tracked under extension line | 70% In beta, new release being prepared for RPs | 40% Still on original target | 20% New Effort Added 2009Q4<br />
SI.CTSS 2.0.3.5 O Maintain Selected Current CTSS Kits Extension Aug-10 Jul-11 | UC Eric Blau[45%], Lee Liming[15%], TBD[20%]; NCSA Jason Brechin[14%]; TACC Warren Smith[10%]; ANL JP Navarro [10%]; UW TBD[7%], Jaime Frey[7%], Greg Thaine[6%]<br />
SI.CTSS 2.0.3.5.1 M Extension Quarterly <strong>Report</strong>ing (9/30, 12/31, 3/31, 6/30) Aug-10 Jul-11 50%<br />
SI.CTSS 2.0.3.4 P Globus Toolkit 5 - GRAM, GridFTP, and Client Aug-10 Jul-11 | 50% Taking GRAM to production; GridFTP is in production and partially deployed; client close to production | 10% More RPs are in the process of installing<br />
SI.Pkg 2.0.4 O Package Software<br />
SI.Pkg 2.0.4.3 O Package Software PY5 Aug-09 Jul-10 | ANL JP Navarro [15%]; NCSA Jason Brechin[15%]; UC TBD [10%], Eric Blau[44%], Joseph Bester[35%]; UW TBD [10%], Jaime Frey [10%], Greg Thaine [10%]<br />
SI.Pkg 2.0.4.3.1 M PY5 Quarterly Reporting (9/30, 12/31, 3/31, 6/30) Aug-09 Jul-10 | 100% 100% 75% 50% 25%<br />
SI.Pkg 2.0.4.4 P Local Compute Oct-09 Mar-10 | Completed | Completed | Completed | Completed | 50% New Effort Added 2009Q4<br />
SI.Pkg 2.0.4.5 P Distributed Programming Jul-09 Jun-10 | Completed, On target | Completed, On target | Completed, On target | 75% On target | 50% New Effort Added 2009Q4<br />
SI.Pkg 2.0.4.6 O Package Software Extension Aug-10 Jul-11 | ANL JP Navarro [15%]; NCSA Jason Brechin[15%]; UC TBD [10%], Eric Blau[44%], Joseph Bester[35%]; UW TBD [10%], Jaime Frey [10%], Greg Thaine [10%]<br />
SI.Pkg 2.0.4.6.1 M Extension Quarterly Reporting (9/30, 12/31, 3/31, 6/30) Aug-10 Jul-11 | 50%<br />
SI.Pkg 2.0.4.7 O Continue to package software Aug-10 Jul-11 | 50% | 15%<br />
SI 2.1 Objective 2.1 Projects: Information Services Enhancements<br />
SI.IS 2.1.4 P Information Services Enhancements PY5 Aug-09 Jul-10 | ANL JP Navarro [30%]; NCSA Jason Brechin[10%]; SDSC Ken Yoshimoto[15%]; UC Eric Blau[10%], TBD[20%]<br />
SI.IS 2.1.4.1 P Core2 — demonstrate ability for resource operators to register their own locally supported capabilities Aug-09 Jul-10<br />
SI.IS 2.1.4.1.2 P Capability documented, and registration designed Nov-09 Jan-10 | 55% Low priority because RPs are not requesting this capability | 55% Low priority because RPs are not requesting this capability | 50% Low priority because RPs are not requesting this capability | 50% | 50% Draft doc to be circulated to RPs<br />
SI.IS 2.1.4.1.3 P Prototype registration(s) deployed Feb-10 Apr-10 | 100% Cornell MATLAB registration | 25% Cornell MATLAB registration<br />
SI.IS 2.1.4.1.4 P Production registration(s) deployed May-10 Jul-10 | 100%<br />
SI.IS 2.1.4.2 P SGW — evaluate the feasibility of gateways publishing about themselves to information services Aug-09 Jul-10<br />
SI.IS 2.1.4.2.4 P Publishing evaluation and next step decision May-10 Jul-10 | 85% Nancy working with Doc and info services teams to have a way to let gateways publish themselves | 80% Dependent on gateway team to complete their part before this can be completed. | 80% Dependent on gateway team to complete their part before this can be completed. | On-going<br />
SI.IS 2.1.4.3 P Respond to data, gateways, visualization, and scheduling needs for expanded information service content | From SGW, DV | Mar-09 Jul-10 | 100% 100% 75% 75% 50%<br />
SI.IS 2.1.5 Information Services Enhancements Extension Apr-10 Jul-11 | ANL JP Navarro [30%]; NCSA Jason Brechin[10%]; SDSC Ken Yoshimoto[15%]; UC Eric Blau[10%], TBD[20%]<br />
SI.IS 2.1.5.1 O Track2D, Viz, and storage capability deployment registration Apr-10 Jul-11 | 25% Working with Viz and data working groups | 15%<br />
SI.IS 2.1.5.2 O New TG capabilities are registered Apr-10 Jul-11 | 50% | 15%<br />
SI.IS 2.1.5.3 O Community capabilities are registered Apr-10 Jul-11 | 50% | 15%<br />
SI.IS 2.1.5.4 P IIS transition to XD Apr-10 Jul-11 | 10% | 0%<br />
SI 2.2 Objective 2.2 Projects: Application Service Hosting Partnership<br />
SI.Host 2.2.2 P Application Hosting Services Partnership V1 PY4 Aug-08 Jul-09 | Kate Keahey[20%], Lee Liming[5%]<br />
SI.Host 2.2.2.1 M Application Hosting Service Description & Implementation Documents | Docs Team, SGW | Aug-08 Jul-09 | 75% Lee to talk to Kate | 75% Lee to talk to Kate | 75% Lee to talk to Kate | 75% Lee to talk to Kate | 75% 8/3 Update: Percentage 75%. The implementation docs are no longer required; the description document has mostly been completed but still needs a small amount of work to complete.<br />
SI.Host 2.2.2.2 M Coordinate Inca, IS Schema for Hosting Service | NOS | Jul-09 Oct-09 | Lee Liming[6%] | 40% Lee to talk to Kate | 40% Lee to talk to Kate | 40% Lee to talk to Kate | 40% Lee to talk to Kate | 40% IS Schema prototype available, no Inca test yet<br />
SI.Host 2.2.3 P Application Hosting Services Partnership V2 PY5 Aug-09 Jul-10 | UC Kate Keahey[25%]<br />
SI.Host 2.2.3.1 M Documentation V2 and transition plan Aug-09 Jul-10<br />
SI.Host 2.2.3.1.1 M Analyze V1 experiences and produce V2 goals, plan Aug-09 Oct-09 | 0% Low priority | 0% Low priority | 0% Low priority | 0% Low priority | 0% Low priority<br />
SI.Host 2.2.3.1.2 M Develop V1->V2 transition plan Feb-10 Apr-10 | 0% Low priority | 0% Low priority | 0% Low priority | 0% Low priority<br />
SI.Host 2.2.3.1.3 M Document V2 May-10 Jul-10 | 0% 0% 0%<br />
SI.Host 2.2.3.2 M Coordinate changes and deployment | To RPs | Aug-09 Jul-10<br />
SI.Host 2.2.3.2.1 M Collect V1 RP experiences Aug-09 Oct-09 | 0% Low priority | 0% Low priority | 0% Low priority | 0% Low priority | 0% Low priority<br />
SI.Host 2.2.3.2.2 M Coordinate V2 testing deployment Feb-10 Apr-10 | 0% 0% 0%<br />
SI.Host 2.2.3.2.3 M Coordinate V2 production deployment May-10 Jul-10 | 0% 0% 0%<br />
SI 2.3 Objective 2.3 Projects: Public Build and Test Service<br />
SI.STCI 2.3.2 P TeraGrid STCI User Support PY5 Aug-09 Jul-10 | ANL JP Navarro[8%]; UC Eric Blau[9%]<br />
SI.STCI 2.3.2.1 M Feasibility Evaluation for Community Usage Aug-09 Jan-10<br />
SI.STCI 2.3.2.1.1 M Analyze early user experiences and summarize, assess community interest Aug-09 Oct-09 | 10% Due to staff losses and lag in hiring replacements, these tasks are behind schedule. | 10% Due to staff losses and lag in hiring replacements, these tasks are behind schedule. | 10% Due to staff losses and lag in hiring replacements, these tasks are behind schedule. | 10% Initial discussions have taken place | 10% Initial discussions have taken place<br />
SI.STCI 2.3.2.1.2 M Produce evaluation report and conclusion Nov-09 Jan-10 | 0% 0% 0% 0%<br />
SI.STCI 2.3.2.2 M Standalone Build & Test Capability Kit | depends on 2.3.2.1 | Feb-10 Jul-10<br />
SI.STCI 2.3.2.2.1 M Capability kit beta testing, produce documentation Feb-10 Apr-10<br />
SI.STCI 2.3.2.2.2 M Capability kit production rollout May-10 Jul-10<br />
SI 2.5 Objective 2.5 Projects: Scheduling<br />
SI.SPRC 2.5.2.17 P SPRUCE PY4 To RPs Aug-08 Jul-09 Suman Nadella[100%], Nicholas Trebon[100%], Pete Beckman [5%]<br />
SI.SPRC 2.5.2.17.2 M On-demand compute capability documented and integrated into production operations on RP systems 67% 67% 67% 67% 67%<br />
SI.SPRC 2.5.2.18 P SPRUCE PY5 To RPs Aug-09 Jul-10 UC Suman Nadella[40%], Nicholas Trebon[35%], Pete Beckman [5%]<br />
Deploy a production service for guaranteed network bandwidth for urgent computations.<br />
Deploy Spruce for virtualized cloud infrastructure (collaboration with Eucalyptus).<br />
SI.SPRC 2.5.2.18.3 P Work with users from VT to utilize Spruce for H1N1 influenza simulations on cloud infrastructures Feb-10 Apr-10<br />
Work with Purdue and other cloud providers to support Spruce. 100% Contingent on Pete's report; Lee to check with Pete 100% 100%<br />
Maintain Spruce infrastructure (i.e., web service server, database, user/admin portals).<br />
Update Spruce plug-ins at resource-end as new schedulers are introduced.<br />
SI.SPRC 2.5.2.18.4 P Work to get Spruce supported at new TeraGrid resources that have replaced retired machines that supported Spruce (e.g., Frost). May-10 Jul-10 67% Lee to talk to Pete B. 67% Lee to talk to Pete B. 67%<br />
SI.Skd 2.5.3 O Scheduling Working Group<br />
SI.Skd 2.5.3.3 O Scheduling Working Group Coordination PY5 Aug-09 Jul-10 TACC Warren Smith [25%] 100% 100% 75% 50%<br />
SI.Skd 2.5.3.4 O Scheduling Working Group Coordination Extension Aug-10 Jul-11 TACC Warren Smith [25%]<br />
SI.QBETS 2.5.9 P QBETS PY4 moved to PY5 Aug-09 Feb-10 Warren<br />
SI.QBETS 2.5.9.2 P Queue prediction capability documented and integrated into TeraGrid's production operations<br />
SI.QBETS 2.5.9.2.1 P Queue Prediction Capability Documented Aug-09 Nov-09 60% Per Warren: TG Portal doc complete; in process of adding doc to deployment 60% Lee to talk to Warren about Karnak deployment 60% Lee to talk to Warren about Karnak deployment 60% Lee to talk to Warren about Karnak deployment<br />
SI.QBETS 2.5.9.2.2 P Queue Prediction Capability integrated into TG operations Nov-09 Feb-10 25% Per Warren: Karnak is now beta with an uptime of over 90% 25% Not formally complete but available for use 25% Not formally complete but available for use 0% Not formally complete but available for use<br />
SI.MTS 2.5.10 P Metascheduling PY5 Aug-09 Jul-10 NICS Troy Baer[6%]; SDSC Kenneth Yoshimoto[35%]; TACC Warren Smith[55%]<br />
SI.MTS 2.5.10.1 P Detailed PY5 metascheduling plan Aug-09 Oct-09 Completed Completed Completed Completed 8/3 Update: On-demand capability has already been implemented by a number of RPs; documentation and awareness are still needed for this implementation. This will be completed by Mar-10.<br />
SI.MTS 2.5.10.2 P Metascheduling capabilities developed Nov-09 Jan-10 Completed Condor-G/ClassAd metascheduling Completed Condor-G/ClassAd metascheduling Completed Condor-G/ClassAd metascheduling 80% Condor-G/ClassAd metascheduling<br />
SI.MTS 2.5.10.3 P Testing automated and dynamic (information services based) metascheduling Feb-10 Apr-10 Completed Completed Completed Completed<br />
SI.MTS 2.5.10.4 P Production automated and dynamic (information services based) metascheduling May-10 Jul-10 Completed Lee to talk to Warren Lee to talk to Warren Lee to talk to Warren<br />
SI.MTS 2.5.11 P Metascheduling Extension Apr-10 Jul-11 NICS Troy Baer[6%]; SDSC Kenneth Yoshimoto[35%]; TACC Warren Smith[55%]<br />
SI.MTS 2.5.11.1 O Advanced scheduling tools ported to new TeraGrid systems as appropriate per RP wishes 63% 38% 13%<br />
SI.MTS 2.5.11.2 P Best practice documentation for using advanced scheduling capabilities in conjunction with non-TeraGrid systems 0% Not started 0% Not started 0% Not started<br />
SI.ICUD 2.7 P Improved Coordinated Capability User Documentation PY5 Aug-09 Jul-10 SDSC Fariba Fana[15%]<br />
SI.ICUD 2.7.1 P High-level documentation layout designed, detailed project tasks identified, scheduled, and assigned Aug-09 Oct-09 5% JP has talked to the doc team 0% JP needs to talk to doc team 0% JP needs to talk to doc team 0% JP needs to talk to doc team<br />
SI.ICUD 2.7.2 P Capability kit introduction/use case documentation improvements Nov-09 Jan-10 10% We are using this line item as a method to move doc off of the wiki and to doc for XD 0% 0%<br />
SI.ICUD 2.7.3 P Half of capability kit detailed documentation improvements including IS integration Feb-10 Apr-10 0% 0% 0%<br />
SI.ICUD 2.7.4 P Half of capability kit detailed documentation improvements including IS integration May-10 Jul-10 0% 0% 0% 0% Contingent on the Liferay implementation<br />
SI<br />
END SI
Work Package | Dependencies | Planned | QSR <strong>2010Q4</strong> <strong>Report</strong> Values | QSR 2010Q3 <strong>Report</strong> Values<br />
Project-ID | WBS | OPMD | Description | Cross-Area Task | Start Date | End Date | Resources: Name [%FTE] | 12/31/10 AD Status | 12/31/10 AD Notes | 9/30/10 AD Status | 9/30/10 AD Notes<br />
SGW 1 Area: Science Gateways<br />
SGW 1.0 Objective 1.0 Operations: maintain/sustain current capabilities<br />
SGW.Coord 1.0.1.2 O SGW Area Coordination PY4 Aug-08 Jul-09 Nancy Wilkins-Diehr[50%] Ongoing Ongoing<br />
SGW.Coord D Standardize security approach to gateways. Among gateways with roaming allocations, usage of multiple TeraGrid resources will increase. Mar-11 In Progress Drafted revisions to TeraGrid's community account policy document, TG-10. Reviewing with co-authors Victor Hazlewood and Jim Marsteller. Expected completion March 2011. In progress<br />
SGW.Coord D Gateway Web services registry. It will be possible to discover services available through Gateways programmatically. Jul-11 RENCI Delayed Work delayed due to SDSC staff medical leave.<br />
SGW.Coord 1.0.1.3 O SGW Area Coordination PY5 Aug-09 Jul-10<br />
SGW.Coord 1.0.1.3.1 M Quarterly <strong>Report</strong>ing (9/30, 12/31, 3/31, 6/30) Aug-09 Jul-10 Nancy Wilkins-Diehr[50%], Stuart Martin[50%] Ongoing Ongoing<br />
SGW.GDoc 1.0.2 O Gateway Documentation May-07 Jul-10<br />
SGW.GDoc 1.0.2.1.10 P Use of GRAM audit Dec-07 Mar-08 In Progress Stuart Martin checking TeraGrid's GRAM documentation to see what changes are needed for GRAM5 transition announcement In progress<br />
SGW.GDoc 1.0.2.2 O Gateway Documentation PY4 Aug-08 Jun-09 Diana Diehl[50%]<br />
SGW.GDoc 1.0.2.2.2 M Provide content for documentation, formation of ideas and solutions to common gateway problems (RENCI) Aug-08 Sep-08 Effort redirected Attention turned to Web services registry where good progress was made; funding for RENCI has ended in the Extension year Effort redirected<br />
SGW.GDoc 1.0.2.3 O Gateway Documentation PY5 Aug-09 Jul-10 SDSC Diana Diehl[20%]; UNC Jason Reilly [40%], John McGee [5%]<br />
SGW.GDoc 1.0.2.2.1 M Up to date documentation on science gateways. Aug-09 Jul-11 Ongoing Activity will continue in the extension Ongoing<br />
SGW.GDoc 1.0.2.2.2 M Anatomy of Gateway tutorial Aug-09 Jul-10 Completed Completed<br />
SGW 1.1 Objective 1.1 Projects: Gateways Targeted Support Program<br />
SGW.HDN 1.1.2.4 O Helpdesk PY5 Aug-09 Jul-10 NCSA Yan Liu[20%]<br />
SGW.HDN 1.1.2.5 O Provide helpdesk support for production science gateways by answering user questions, routing user requests to appropriate gateway contacts, and tracking user responses Aug-09 Jul-11 Ongoing Activity will continue in the extension Ongoing<br />
SGW.HDN 1.1.2.6 O Provide the helpdesk support of the SimpleGrid online training and prototyping services to gateway developers and communities Aug-09 Jul-11 Ongoing Activity will continue in the extension Ongoing<br />
SGW.HDN 1.1.2.7 P Gather requirements for a knowledge base to improve gateway helpdesk services. Feb-10 Jul-11 Completed Activity will continue in the extension Completed<br />
SGW.CIG 1.1.6.3 P CIG support PY5 Aug-09 Jul-10 CalTech Julian Bunn [10%], John McCorquodale [25%], Roy Williams [5%]<br />
SGW.CIG 1.1.6.4 P Work with NOAO to develop a TG portal that allows users to run customized galaxy collision simulations using the power of the TeraGrid Aug-09 Jul-10 Arroyo work completed, collaboration with Caltech has ended in the Extension year Effort redirected to Arroyo gateway<br />
SGW.CIG 1.1.6.5 P A scriptable interface to portal services will be created. Aug-09 Oct-09 Arroyo work completed, collaboration with Caltech has ended in the Extension year Effort redirected to Arroyo gateway, tasks forthcoming<br />
SGW.CIG 1.1.6.6 P Extend the CIG portal with an easy-to-use, language-agnostic RPC-style interface that parallels the functionality of the web-based interactive portal (and extends it where appropriate) Nov-09 Jan-10 Arroyo work completed, collaboration with Caltech has ended in the Extension year Effort redirected to Arroyo gateway, tasks forthcoming<br />
SGW.CIG 1.1.6.7 P Provide bindings from this interface into popular languages (driven by user demand) such as C, Python and Perl. Feb-10 Apr-10 Arroyo work completed, collaboration with Caltech has ended in the Extension year Effort redirected to Arroyo gateway, tasks forthcoming<br />
SGW.Bio 1.1.10.3 P Bio web services, registry PY4 Aug-08 Jul-09 Jason Reilly[50%], Mats Rynge[25%], John McGee[5%] In progress<br />
SGW.Bio 1.1.10.3.4 M Work with the various TeraGrid teams to evolve the registry to meet the broader TeraGrid requirements (RENCI) To IS, NOS Aug-08 Jul-09 Completed Completed<br />
SGW.Bio 1.1.10.3.5 M Work with other gateways to incorporate their service offerings into the registry (RENCI) Aug-08 Jul-09 Completed Completed<br />
SGW.GC 1.1.12 P GridChem or Other New PY5 Aug-09 Jul-10 IU Suresh Marru [50%]<br />
SGW.GC 1.1.12.1 P Complete Gateway clients as determined by the Targeted Support Program Jan-10 Jul-10 Completed Completed<br />
SGW.GC 1.1.12.2 P Ongoing support as required for existing Gateway projects Jan-10 Jul-10 Completed Completed<br />
SGW.GC 1.1.12.3 P Assist GridChem gateway in the areas of software access, data management, improved workflows, visualization and scheduling Aug-09 Dec-09 Completed Completed<br />
SGW.PG 1.1.13 P PolarGrid or Other New PY5 Aug-09 Jul-10 IU SGW TBD1 [100%]<br />
SGW.PG 1.1.13.1 P Complete Gateway clients as determined by the Targeted Support Program Jan-10 Jul-10 Completed End date extended to Jul-10 Completed End date extended to Jul-10<br />
GRAM5 (an attribute dependency for many) being installed for testing across TG<br />
SGW.PG 1.1.13.2 P Ongoing support as required for existing Gateway projects Jan-10 Jul-10 Completed Completed<br />
SGW.PG 1.1.13.3 P Assist PolarGrid team with TeraGrid integration. May include realtime processing of sensor data, support for parallel simulations, GIS integration and EOT components. Aug-09 Jul-10 Completed PolarGrid's Matlab-based code was rewritten to remove dependencies on the Matlab distributed toolkit. It now runs on the Quarry cluster in a much more scalable and less expensive fashion. Completed PolarGrid's Matlab-based code was rewritten to remove dependencies on the Matlab distributed toolkit. It now runs on the Quarry cluster in a much more scalable and less expensive fashion.<br />
SGW.SID 1.1.14 P SIDGrid or Other New PY5 Aug-09 Jul-10 UC Wenjun Wu [100%]<br />
SGW.SID 1.1.14.1 P SIDGrid scheduling will be enhanced by taking advantage of work underway in the Scheduling Working Group to enable monitoring of and submission to multiple TeraGrid resources through a single interface Aug-09 Oct-09 Completed Using Swift Completed Using Swift<br />
SGW.SID 1.1.14.2 P Improved security model for community accounts utilized by the gateway Nov-09 Jan-10 Completed Completed<br />
SGW.SID 1.1.14.3 P Increased number of analysis codes available through the gateway Feb-10 Apr-10 Completed Completed<br />
SGW.SID 1.1.14.4 P Isolation of execution within virtual machines or other constrained environments, and data sharing with collaborators, while simultaneously protecting the users' data and execution environment from accidental or malicious compromise. Feb-10 Apr-10 Completed Completed<br />
SGW.SID 1.1.14.5 P VDS, the current workflow solution, will be replaced by its successor, SWIFT. May-10 Jul-10 Completed Completed<br />
SGW.SID 1.1.14.6 P Documentation and code that is generally applicable to gateway efforts will be made available to TeraGrid gateway partners May-10 Jul-10 Completed Completed<br />
SGW.OSG 1.1.15 P RENCI-OSG PY5 Aug-09 Jul-10 UNC Jason Reilly [15%], John McGee [5%]<br />
SGW.OSG 1.1.15.1 P Prototype of OSG jobs running on TeraGrid resources via NIMBUS Aug-09 Oct-09 Effort redirected This work will not be completed, effort redirected to Web services registry Effort redirected This work will not be completed, effort redirected to Web services registry<br />
SGW.OSG 1.1.15.2 P Successful interactions with and assistance for gateway developers Nov-09 Apr-10 Completed Completed<br />
SGW.ES 1.1.16 P Environmental Science Gateway PY5 Aug-09 Jul-10 NCAR NCAR SGW TBD [50%]; Purdue PU SGW TBD1[50%]<br />
SGW.ES 1.1.16.1 P Release of a demonstration version of the ESGC Portal capable of invoking CCSM runs on the TeraGrid Aug-09 Oct-09 Effort redirected The team has reevaluated plans for an ESGC portal and is instead working on a demonstration version of the CCSM portal capable of invoking CCSM4 runs on the TeraGrid. We have implemented support for basic operations such as case creation and configuration. A prototype version is expected to complete in November 2010. Effort redirected The team has reevaluated plans for an ESGC portal and is instead working on a demonstration version of the CCSM portal capable of invoking CCSM4 runs on the TeraGrid. We have implemented support for basic operations such as case creation and configuration. A prototype version is expected to complete in November 2010.<br />
SGW.ES 1.1.16.2 P Ability for runs invoked on the Purdue Climate Portal or the ESGC Gateway to be published back to the ESG data holdings Nov-09 Jul-11 Delayed See 9/30/10 notes, end date changed to July, 2011. Delayed (1) Model output can be published to ESG either from local file system or via iRODS fuse interface. There is still some work going on to automate the process and make the system more robust/scalable. (2) Publishing of model metadata is delayed because of schedule change on the CESM and ESG sides. We are testing a beta version of the CESM model that has the capability to collect model run attributes. A system design on how to publish the metadata back to ESG has been defined. Still need to implement and test it. Publishing runs into ESG has been demonstrated.<br />
SGW.ES 1.1.16.3 P Investigate federation of Purdue data holdings with ESG data archives Feb-10 Jul-10 Completed Completed<br />
SGW 1.2 Objective 1.2 Projects: Gateways User Count<br />
SGW.UCnt 1.2.2 P Gateway User Count PY4 To NOS Aug-08 Jul-09 Tom Scavo[75%], Jim Basney[10%], Nancy Wilkins-Diehr[5%]<br />
SGW.UCnt D End gateway users can be counted programmatically, as opposed to current manual aggregation method. By the end of PY5, all jobs submitted via community accounts will include attributes identifying the end user of that gateway. Complete GRAM5 now being installed across TG Complete<br />
SGW.UCnt 1.2.3 P Gateway User Count PY5 NOS Aug-09 Jul-10 NCSA Tom Scavo[50%], Jim Basney[10%], Terry Fleury [25%]; SDSC Nancy Wilkins-Diehr[5%], Michael Dwyer [10%]<br />
SGW.UCnt 1.2.3.1 M Develop internal web pages for the security working group and the science gateway administration page in support of attribute-based authentication Aug-09 Dec-09 In progress See 9/30/10 notes. End Date changed to July, 2011. In progress<br />
SGW.UCnt 1.2.3.2 M Ubiquitous science gateway adoption. Aug-09 Dec-09 Delayed See 9/30/10 notes Delayed Waiting for GRAM5 install<br />
SGW.UCnt 1.2.3.3 M Ubiquitous RP adoption. Feb-09 Dec-09 Delayed See 9/30/10 notes Delayed Waiting for GRAM5 install<br />
SGW.UCnt 1.2.3.4 M User count INCA tests Jan-10 Feb-10 Complete Complete<br />
1.2.3.5 M Post-TG architecture documentation. Feb-10 Jul-10 Deferred Waiting for XD architecture decisions<br />
SGW 1.4 Objective 1.4 Projects: Gateways General Services Discovery<br />
SGW.SG 1.4.2 P SimpleGrid, helpdesk, accounting, new communities PY4 To NOS Aug-08 Jul-09 Shaowen Wang[8%], Yan Liu[80%]<br />
SGW.SG D Reusable SimpleGrid modules. TeraGrid science gateway program will open an online learning service for gateway development by integrating SimpleGrid documentation, progressive online courses, and common components contributed from other existing gateways. All the training, coding, and prototyping exercises will be conducted through a web browser. Jul-11 NCSA In progress Using CI Tutor, an extension year activity, end date changed to Jul-11<br />
SGW.SG 1.4.1.1 P SimpleGrid, helpdesk, accounting, new communities PY5 Aug-09 Jul-10 NCSA Shaowen Wang[10%], Yan Liu[80%]<br />
SGW.SG 1.4.1.1.1 M Continue to develop and support the SimpleGrid online training service for building new science gateways Aug-09 Jul-11 Ongoing Activity will continue in the extension Ongoing<br />
SGW.SG 1.4.1.1.2 M Develop a SimpleGrid prototyping service to support virtualized access to TeraGrid by efficiently converting a new community application to a science gateway application that provides web access for sharing within the community. Aug-09 Jul-11 In progress Activity will continue in the extension In progress VM deployment in the extension year<br />
SGW.SG 1.4.1.1.3 M Develop a streamlined packaging service for new communities to create their science gateway software based on the use of the SimpleGrid prototyping service Feb-10 Jul-11 Delayed Will be using CI Tutor for this in the extension year, CI Tutor work in progress In progress Work will complete in the extension year<br />
SGW.SG 1.4.1.1.4 M Develop a user-level TeraGrid usage service within SimpleGrid based on the community account model and attributes-based security services. Feb-10 Jul-11 Delayed Waiting for GRAM5 install and RP accounting scripts to send individual user attributes to the TGCDB, changed end date to Jul-11. Deferred Waiting for GRAM5 install<br />
SGW.SG 1.4.1.1.5 M Work with potential new communities to improve the usability and documentation of the proposed gateway support services Aug-09 Jul-10 Complete Complete<br />
SGW.SG 1.4.1.1.6 M Conduct education and outreach work using the SimpleGrid online training service and related science gateway technologies in the contexts of undergraduate, graduate, and K-12 education and training and document experiences Aug-09 Jul-10 Complete SciDAC, TG10 Complete SciDAC, TG10<br />
SGW.CA 1.4.5 P Community Accounts PY5 Aug-09 Jul-10 NICS Victor Hazlewood[30%]; SDSC Nancy Wilkins-Diehr[15%]<br />
SGW.CA 1.4.5.1 M Review and documentation of requirements Aug-09 Oct-09 Complete Complete<br />
SGW.CA 1.4.5.2 M Communication of requirements Nov-09 Jan-10 Complete Complete<br />
SGW.CA 1.4.5.3 M Specification of a standard deployment model(s) Feb-10 Apr-10 Complete Complete<br />
SGW.CA 1.4.5.4 M Deployment of standard model on one gateway May-10 Jul-10 Complete Complete<br />
SGW.GCD 1.4.6 P Gateways Code Discovery PY5 Aug-09 Jul-10 SDSC Nancy Wilkins-Diehr[5%]; UNC Josh Coyle [30%], John McGee [5%]<br />
SGW.GCD 1.4.6.1 P RENCI Gateway Services advertised appropriately in TG Info Services Aug-09 Aug-09 Complete Complete<br />
SGW.GCD 1.4.6.2 P Schema for Gateway services and capabilities Aug-09 Sep-09 Complete Complete<br />
SGW END SGW
TeraGrid US Extension Year Project Plan<br />
Project-ID | WBS | OPM | Description | Work Package Dependencies | Cross-Area Task | Planned Start Date | Planned End Date | Resources: Name [%FTE] | <strong>2010Q4</strong> Status Update<br />
US 6.0 Operations: Frontline User Support<br />
US.Coord 6.0.1 O US Area Coordination PSC Sergiu Sanielevici (30%)<br />
US.Coord 6.0.1.3 O US Area Coordination user interface council AUS, NOS, UFP Aug-10 Jul-11 ongoing<br />
US.Eng 6.0.2 O User Engagement Aug-09 Jul-11 PSC David O'Neal(25%), Richard Raymond(20%), Raghurama Reddy(45%), Sergiu Sanielevici (30%)<br />
US.Eng 6.0.2.1 O User Champion Coordination Aug-09 Jul-11 ongoing<br />
US.Eng 6.0.2.2 O Campus Champion Coordination Aug-09 Jul-11 ongoing<br />
US.Eng 6.0.2.3 O Startup and education grant support coordination Aug-09 Jul-11 ongoing<br />
US.Eng 6.0.2.4 O Analyze and report usage patterns Aug-09 Jul-11 ongoing<br />
US.Eng 6.0.2.5 P TG User Survey PY5 question input All Aug-09 Jul-10 external contractor DONE<br />
US.Tick 6.0.3 O Share and maintain best practices for ticket resolution across all RPs PY5 Aug-09 Jul-11 PSC David O'Neal(10%), Richard Raymond(30%), Raghurama Reddy(10%), Sergiu Sanielevici (10%) ongoing<br />
US.Tick 6.0.3.1 O Focus on providing users with a substantive clarification of the nature of the problem and the way forward Aug-09 Jul-11 ongoing<br />
US.Tick 6.0.3.2 O Focus on the coordinated resolution of problems spanning RPs and systems Aug-09 Jul-11 ongoing<br />
US.Tick 6.0.3.3 O Stale ticket count reaches zero Aug-09 Jul-11 ongoing<br />
US.Tick 6.0.3.4 O At least 85% survey ratings for promptness and quality Aug-09 Jul-11 ongoing<br />
US.Cnslt.RP.NCSA 6.0.5.5 O NCSA RP User Services - Consulting Operations Apr-10 Jul-11 Estabrook [100%], Jackson[100%], John [100%] ongoing<br />
US.Cnslt.RP.IU 6.0.5.6 O IU RP User Services - Consulting Operations Apr-10 Jul-11 ongoing<br />
US.Cnslt.RP.LONI 6.0.5.7 O LONI RP User Services - Consulting Operations Apr-10 Jul-11 Jundt [50%], Xu [100%] ongoing<br />
US.Cnslt.RP.NICS 6.0.5.8 O NICS RP User Services - Consulting Operations Apr-10 Jul-11 Loftis [50%], Lucio [100%], Sharkey [100%], Wong [50%], Halloy [50%], Crosby [50%], TBD [250%] ongoing<br />
US.Cnslt.RP.PSC 6.0.5.10 O PSC RP User Services - Consulting Operations Apr-10 Jul-11 Blood[25%], Costa[90%], Gomez[45%], Madrid[35%], Maiden[20%], Nigra[70%], Raymond[3%], Sanielevici[2%], Wang[5%] ongoing<br />
US.Cnslt.RP.PU 6.0.5.11 O PU RP User Services - Consulting Operations Apr-10 Jul-11 P. Smith [20%], New person [100%] ongoing<br />
US.Cnslt.RP.SDSC 6.0.5.12 O SDSC RP User Services - Consulting Operations Apr-10 Jul-11 Choi [5%], Greenberg [15%], Tatineni [55%], Wolter [100%] ongoing<br />
US.Cnslt.RP.TACC 6.0.5.13 O TACC RP User Services - Consulting Operations Apr-10 Jul-11 Turner [40%], Wilson [40%], Sadahiro [50%] ongoing
TeraGrid AUS PY4-PY5 Project Plan<br />
Project-ID | WBS | OPMD | Description | Work Package Dependencies | Cross-Area Task | Planned Start Date | Planned End Date | Resources: Name [%FTE] | QSR <strong>2010Q4</strong> <strong>Report</strong> Values: 12/31/10 AD Status, 12/31/10 AD Notes | QSR 2010Q3 <strong>Report</strong> Values: 9/30/10 AD Status, 9/30/10 AD Notes<br />
AUS 10 Area: Advanced User Support<br />
10.0 AUS Area Coordination<br />
AUS.Coord 10.0.2 O AUS Area Coordination PY5 Aug-09 Jul-10 Majumdar [75%], Alameda[0%], Sanielevici [0%], Barth[0%], Cheesman[0%], Jundt[0%], Crosby[0%] (Total 0.75 FTE)<br />
10.1 Advanced Support Projects<br />
AUS.LIB 10.1.1 P Library Database PY5 Aug-09 Jul-10 NICS Mark Fahey[10%], Intern[50%], HPC Ops Staff[10%]<br />
AUS.LIB 10.1.1.1 P implement wrappers for ld and aprun Aug-09 Oct-09 100% done 100% done<br />
AUS.LIB 10.1.1.2 P test wrappers; write mining scripts Nov-09 Jan-10 100% done 100% done<br />
AUS.LIB 10.1.1.3 P wrappers in production mode; implement internal web interface; share beta release with other interested sites Feb-10 Apr-10 100% done The tracking database is in production; the internal web interface was dropped as a deliverable earlier; rather, a suite of scripts provides the same functionality. 100% done<br />
AUS.LIB 10.1.1.4 O everything in production; begin using as a tool to help manage software; prepare full release; share with other interested sites May-10 Jul-10 100% done This is being treated as an operational activity since the project part of this is completed. 95% done<br />
AUS.LIB 10.1.1.5 O Library Database PY6 Aug-10 Jun-11<br />
AUS.MPI 10.1.2 P MPIg Support PY5 Aug-09 Jul-10 NIU Undergrad[29%], Grad[75%], Nick Karonis[25%]<br />
AUS.MPI 10.1.2.1 P Work with ASTA apps groups to port to MPIg on TG sites. Continue investigation of running apps at full scale on Ranger. Aug-09 Oct-09 100% done (since we postponed TG site runs) Because we are unable to use MPIg on TG sites, we are postponing the task of running at full scale. 100% done (since we postponed TG site runs)<br />
AUS.MPI 10.1.2.2 P Attempt cross-site runs with MPIg on TG sites. Nov-09 Jan-10 100% done (since we postponed TG site runs) Because we are unable to use MPIg on TG sites, cross-site runs on TG sites are postponed for now. 100% done (since we postponed TG site runs)<br />
AUS.MPI 10.1.2.3 P Work with largest TG RP providers to enable cross-site runs (this is being postponed because we are unable to use MPIg on TG sites). Instead adding a new task of integrating UPC and MPI for user apps. Feb-10 Apr-10 20% done<br />
AUS.MPI 10.1.2.4 O Investigate integrating clouds, possibly with Globus, to run MPIg jobs. May-10 Jul-10<br />
AUS.MPI 10.1.2.5 O MPIg Support PY6 Aug-10 Jun-11<br />
10.2 Advanced Support Operations<br />
AUS.ASTA 10.2.1 O Advanced Support TG Applications<br />
AUS.ASTA 10.2.1.2 O Advanced Support TG Applications PY5 Aug-09 Jul-10<br />
NCSA 0.6 FTE<br />
Jay Alameda [20%]<br />
Dodi Heryadi [0%]<br />
Seid Koric [0%]<br />
Rick Kufrin[25%]<br />
Sudhakar Pamidighantam [0%]<br />
Mark Straka[15%]<br />
Ahmed Taha[0%]<br />
Michelle Gower[0%]<br />
David Bock [0%]<br />
Mark Vanmoer [0%]<br />
PSC 3.15 FTE<br />
Mahin Mahmoodi [10%]<br />
Raghu Reddy [30%]<br />
John Urbanic [25%]<br />
Joel Welling [50%]<br />
Philip Blood [35%]<br />
Roberto Gomez [40%]<br />
Marcela Madrid [20%]<br />
David O'Neal[50%]<br />
Yang Wang [30%]<br />
Anirban Jana [25%]<br />
This is being treated as an<br />
operational activity since the project<br />
part of this is completed.<br />
80% done<br />
The tracking database is in<br />
production; the internal web interface<br />
was dropped as a deliverable earlier,<br />
rather a suite of scripts to provide the<br />
same functionality.<br />
The tracking database is in<br />
production and we are using it as a<br />
tool to manage software<br />
maintenance decisions. The license<br />
is still not complete (working with<br />
lawyers), but the code will be shared<br />
with any interested centers.<br />
Because we are unable to use MPIg on<br />
TG sites, we are postponing the task of<br />
running at full scale.<br />
Because we are unable to use MPIg on<br />
TG sites, cross-site runs are<br />
postponed for now.<br />
Continued to learn about OpenMP.<br />
Worked with the user to identify code<br />
where OpenMP could be applied.<br />
RPs have yet to embrace cloud<br />
computing. Learning more about<br />
OpenMP. Scheduled to meet with the<br />
user in November to start modifying<br />
their code.
Project-ID<br />
WBS<br />
OPMD<br />
Description<br />
TeraGrid AUS PY4-PY5 Project Plan<br />
Work Package Dependencies Planned<br />
Task<br />
Cross<br />
Area<br />
Start<br />
Date<br />
End<br />
Date<br />
Resources<br />
Name [%FTE]<br />
12/31/10 AD<br />
Status<br />
QSR 2010Q4 Report Values<br />
12/31/10 AD Notes<br />
9/30/10 AD<br />
Status<br />
QSR 2010Q3 Report Values<br />
9/30/10 AD Notes<br />
SDSC 1.75 FTE<br />
Natasha Balac[15%]<br />
Dong Ju Choi[15%]<br />
Amit Chourasia[10%]<br />
Yifeng Cui[40%]<br />
Dmitry Pekurovsky [25%]<br />
Wayne Pfeiffer[10%]<br />
Mahidhar Tatineni[20%]<br />
Ross Walker[20%]<br />
TBD/Doc etc.[20%]<br />
Purdue 0.45 FTE<br />
Phil Cheeseman[45%]<br />
LSU 0.1 FTE<br />
Adam Jundt [10%]<br />
Honggao Liu /POC[0%]<br />
ORNL 0.2 FTE<br />
John Cobb, backup POC ORNL[0%]<br />
Mei Li Chen, backup POC[10%]<br />
Vickie Lynch, POC ORNL[10%]<br />
NICS 0.8 FTE<br />
Lonnie Crosby, POC NICS[40%]<br />
Christian Halloy[20%]<br />
Kwai Wong[20%]<br />
Bruce Loftis, backup POC[0%]<br />
TACC 3.85 FTE<br />
Bill Barth, POC TACC<br />
John Cazes<br />
Lars Koesterke<br />
B. D. Kim<br />
Hang Liu<br />
Robert McLay<br />
Kent Milfeld<br />
John Peterson<br />
Karl Schulz (backup POC)<br />
TACC total 3.85<br />
IU 0.0 FTE<br />
Don Berry[0%]<br />
Ray Shepperd POC[0%]<br />
ANL 0.31 FTE<br />
Joe Insley[31%]<br />
AUS.ASP 10.2.2 O Advanced Support Projects<br />
AUS.ASP 10.2.2.2 O Advanced Support Projects PY5 Aug-09 Jul-10<br />
Total 11.06<br />
NCSA 0.4 FTE<br />
Jay Alameda [20%]<br />
Dodi Heryadi [0%]<br />
Seid Koric [0%]<br />
Ahmed Taha[0%]<br />
Michelle Gower[0%]<br />
David Bock [0%]<br />
Mark Vanmoer [0%]<br />
Mark Straka[20%]<br />
PSC 1.65 FTE<br />
Nick Nystrom [15%]<br />
Mahin Mahmoodi [40%]<br />
Raghu Reddy[5%]<br />
Joel Welling [30%]<br />
David O'Neal[15%]<br />
Anirban Jana[10%]<br />
Roberto Gomez[20%]<br />
Yang Wang [15%]<br />
Phil Blood [15%]<br />
SDSC 1.05 FTE<br />
Chourasia [15%]<br />
Pfeiffer[40%]
Dmitry Pekurovsky[10%]<br />
Tatineni[5%]<br />
Balac[10%]<br />
Ross Walker [25%]<br />
Purdue 0.3 FTE<br />
TBD 30%<br />
LSU 0.1 FTE<br />
Adam Jundt [10%]<br />
TACC 1.9 FTE<br />
Bill Barth [X%]<br />
others/TACC<br />
TACC total 1.9<br />
IU 0.0 FTE<br />
Don Berry [0%]<br />
NICS 0.4 FTE<br />
Christian Halloy[20%]<br />
Kwai Wong[20%]<br />
AUS.ASEOT 10.2.3 O Advanced Support EOT<br />
AUS.ASEOT 10.2.3.2 O Advanced Support EOT PY5 Aug-09 Jul-10<br />
Total 5.85<br />
NCSA 0.25 FTE<br />
Jay Alameda [10%]<br />
Sudhakar Pamidighantam [0%]<br />
David Bock [0%]<br />
Mark Vanmoer [0%]<br />
Mark Straka[15%]<br />
Purdue 0.2 FTE<br />
TBD[20%]<br />
PSC 0.7 FTE<br />
Nick Nystrom [10%]<br />
Philip Blood [10%]<br />
Marcela Madrid[30%]<br />
Shawn Brown[10%]<br />
Mahin Mahmoodi [10%]<br />
NICS 0.38 FTE<br />
Lonnie Crosby[38%]<br />
SDSC 0.4 FTE<br />
Yifeng Cui[10%]<br />
Dmitry Pekurovsky[15%]<br />
TBD/doc[15%]<br />
TACC 0.85 FTE<br />
Lars Koesterke<br />
Kent Milfeld [X%]<br />
Bill Barth [X%]<br />
TACC total 0.85<br />
LSU 0.05 FTE<br />
Adam Jundt [5%]<br />
IU 0.0 FTE<br />
Don Berry [0%]<br />
Total 2.88
Work Package Dependencies Planned<br />
Resources<br />
(per GIG budget 7/7/08)<br />
12/31/10<br />
AD Status<br />
QSR 2010Q4 Report Values<br />
12/31/10 AD Notes<br />
9/30/10 AD<br />
Status<br />
QSR 2010Q3 Report Values<br />
9/30/10 AD Notes<br />
Project-ID<br />
WBS<br />
O P M<br />
Description<br />
Task<br />
Cross<br />
Area<br />
Start<br />
Date<br />
End<br />
Date<br />
Name [%FTE]<br />
UFP 9 Area: User-facing and Core Services Projects<br />
UFP 9<br />
Objective: User-facing Projects:<br />
Maintenance and Development<br />
May-07 Jul-10<br />
UFP.Coord 9.0 O UFP Area Coordination Aug-07 Jul-10<br />
UFP.Coord 9.0.3 O UFC Area Coordination PY5 Aug-09 Jul-10 UC, Jeff Koerner Ongoing Ongoing<br />
UFP.Coord 9.0.3.2 M Quarterly Reporting (9/30, 12/31, 3/31, 6/30)<br />
UFP.Coord 9.0.3.3 O Lead UFC-RP working group for coordination of RPs with UFC<br />
UFP.Acct 9.6 O Accounting Operations PY5 Aug-09 Jul-10<br />
NCSA Steve Quinn[20%], Michael Shapiro[20%]; SDSC Leo Carson[44%], Tiffany Duffield[17%], Henry Jaime[7%]<br />
Ongoing Ongoing<br />
UFP.Acct 9.6.1 O TGCDB and AMIE PY5 Aug-09 Jul-10<br />
UFP.Acct 9.6.1.1 O Ongoing operations and maintenance of the central TGCDB-AMIE environment and servers Aug-09 Jul-10<br />
Leo Carson, H. Jaime (SDSC) (0.36 FTE)<br />
Ongoing Ongoing<br />
UFP.Acct 9.6.1.2 O Update TGCDB tables and code to support evolving TG requirements, including addition of new RP resources into TGCDB; changes to existing resources. Aug-09 Jul-10<br />
Michael Shapiro (NCSA) (0.10 FTE), Steve Quinn (NCSA) (0.10 FTE)<br />
Ongoing Ongoing<br />
UFP.Acct 9.6.1.3 O Update AMIE tables and code to support evolving TG requirements Aug-09 Jul-10<br />
Michael Shapiro (NCSA) (0.10 FTE), Steve Quinn (NCSA) (0.10 FTE)<br />
Ongoing Ongoing<br />
UFP.Acct 9.6.2 O tgusage and gx-map PY5<br />
UFP.Acct 9.6.2.1 O Maintain and perform critical fixes to tgusage utility as directed by trouble tickets and in support of evolving UFC needs. Aug-09 Jul-10<br />
Tiffany Duffield (SDSC) [replacing former staff] (0.17 FTE)<br />
Ongoing Ongoing<br />
UFP.Acct 9.6.2.2 O Maintain and perform critical fixes to gx-map utility as directed by trouble tickets and in support of evolving UFC needs Aug-09 Jul-10<br />
Leo Carson (SDSC) (0.15 FTE)<br />
Ongoing Ongoing<br />
UFP.Acct 9.6.3 O Metrics PY5<br />
UFP.Acct 9.6.3.1 O Maintain web-based TGU query and reporting system, in current form and later as integrated part of TGUP. Aug-09 Jul-10<br />
Ongoing Ongoing<br />
UFP.Acct 9.6.3.2 O Metrics reporting for QSRs, including updated metrics as capabilities supported in TGCDB and POPS (9/30, 12/31, 3/31, 6/30) Aug-09 Jul-10<br />
Ongoing Ongoing<br />
UFP.UDoc 9.1 R TG Documentation<br />
UFP.UDoc 9.1.1 R TG Docs PY4 Aug-08 Jun-09<br />
Fariba Fana[100%],<br />
Diana Diehl[25%]<br />
UFP.UDoc 9.1.1.3 M TG Resource Catalog integrated with MDS + enhancements Core2 Aug-08 Jul-09 In progress In progress<br />
UFP.UDoc 9.1.1.4 M PI User Services Log integrated with TGCDB Core2 Aug-08 Jun-09 In progress In progress<br />
UFP.UP 9.3 User Portal<br />
UFP.UP 9.3.2 R TG User Portal PY4 Aug-08 Jun-09<br />
UFP.UP 9.3.2.4 M<br />
Expanded personalization and collaboration features across the user portal; this includes allocation management, customized resource listings, collaboration on proposals, file space, etc.<br />
John Boisseau[8%],<br />
Maytal Dahan[18%],<br />
Patrick Hurley[35%],<br />
Praveen Nuthalapati[45%],<br />
Steve Mock[25%],<br />
TACC NOS[15%]<br />
UFP.UR 9.7 O User Requests<br />
UFP.UR 9.7.1 O User Requests PY5 Aug-09 Jul-10 NCSA V. Halberstadt[100%]<br />
UFP.UR 9.7.1.1 O Create/establish user accounts in TGCDB, mail new user packets, collect RP site logins. Aug-09 Jul-10<br />
UFP.UR 9.7.1.2 O Support PI requests for removal of users from allocations Aug-09 Jul-10<br />
UFP.UR 9.7.1.3 O Vet/review user information for new PIs and co-PIs Aug-09 Jul-10<br />
UFP.UR 9.7.1.4 O Serve as first point of contact for general user inquiries re: UFC procedures Aug-09 Jul-10<br />
UFP.AP 9.8 O Allocations Process<br />
Aug-08 Jun-09 Completed Completed<br />
** This process may change in<br />
late PY4, early PY5<br />
This process will continue in<br />
PY5<br />
Ongoing<br />
Done<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
Done<br />
Ongoing<br />
Ongoing
UFP.AP O Allocations Process PY4<br />
UFP.AP D POPS interaction with RDR<br />
UFP.AP 9.8.1 O Allocations Process PY5<br />
TACC Kent Milfeld[20%], Marg Murray[20%], Valerie Alvarez[25%]<br />
UFP.AP 9.8.1.1 M Quarterly TRAC Meetings (9/09, 12/09, 3/10, 6/10) Aug-09 Jul-10<br />
Ongoing Ongoing<br />
UFP.AP 9.8.1.2 O Processing of intermittent TRAC submissions (supplements, justifications, etc.) Aug-09 Jul-10<br />
Ongoing Ongoing<br />
UFP.AP 9.8.1.3 O Ongoing review and processing of Startup and Education requests Aug-09 Jul-10<br />
Ongoing Ongoing<br />
UFP.AP 9.8.1.4 O Maintenance and updates to Allocations policy and procedures documentation Aug-09 Jul-10<br />
Ongoing Ongoing<br />
UFP.AP 9.8.1.5 O Recruiting and supporting TRAC members Ongoing Ongoing<br />
UFP.POP 9.9 O Resource Requests (POPS)<br />
UFP.POP 9.9.1 O Resource Requests (POPS) PY5<br />
NCSA Steve Quinn[10%], Ester Soriano[20%]<br />
UFP.POP 9.9.1.1 M Support for Quarterly TRAC meetings (9/09, 12/09, 3/10, 6/10) Aug-09 Jul-10<br />
Ongoing Ongoing<br />
UFP.POP 9.9.1.2 O Operate, maintain and update POPS system in support of allocations process Aug-09 Jul-10<br />
Ongoing<br />
UFP.IP 9.10 O Information Presentation/Web Presence<br />
UFP.IP 9.10.1 O Information Presentation/Web Presence PY5 Aug-09 Jul-10<br />
SDSC Michael Dwyer[10%]; TACC Patrick Hurley[25%]; UC Tim Dudek[50%]<br />
UFP.IP 9.10.1.1 O Maintenance and updates for deployed portal capabilities (SSO, etc.) Aug-09 Jul-10 P. Hurley (TACC)<br />
Ongoing<br />
UFP.IP 9.10.1.2 O Support for RPs entering information into catalogs, monitors, and news services. Aug-09 Jul-10 M. Dwyer (SDSC) (0.05 FTE)<br />
Ongoing<br />
UFP.IP 9.10.1.3 O Ongoing maintenance, updates, and operations for existing catalogs: Compute and data resources, CTSS, Gateways, etc. Aug-09 Jul-10 M. Dwyer (SDSC) (0.05 FTE)<br />
Ongoing<br />
UFP.IP 9.10.1.4 O Maintenance and updates to TeraGrid's Batch Queue Prediction Service, including addition of new resources, user support and bug fixes, by QBETS team at UCSB. Aug-09 Jul-10 G. Obertelli (UCSB) (0.50 FTE)<br />
UFP.IP 9.10.1.5 O Provide statistics and metrics for QSR to assist in measuring success for TG efforts that appear in web/portal/KB behavior (9/30, 12/31, 3/31, 6/30) Aug-09 Jul-10 T. Dudek (ANL) (0.25 FTE)<br />
UFP.IP 9.10.1.6 O Administration of WebSphere servers and infrastructure Aug-09 Jul-10 Subcontract<br />
UFP.IP 9.10.1.7 O Administration of TeraGrid wiki for problems/spam and fix creation of secure areas/user access etc. Aug-09 Jul-10 T. Dudek (ANL) (0.25 FTE)<br />
UFP.QA 9.11 O User Information Quality Assurance<br />
UFP.QA 9.11.1 O User Information Quality Assurance PY5 Aug-09 Jul-10<br />
IU Andy Orahood[10%], Julie Thatcher[25%], Paul Brown[50%], Mary Hrovat[20%], Jonathon Bolte[25%]; SDSC Diana Diehl[55%]; UC Tim Dudek[50%]<br />
UFP.QA 9.11.1.1 O Run Web Link Validator weekly and correct broken links Aug-09 Jul-10 D. Diehl (SDSC) (0.55 across all items here)<br />
UFP.QA 9.11.1.2 O Respond to user documentation correction requests and initiate corrections to major errors within 1 business day. Aug-09 Jul-10 D. Diehl (SDSC)<br />
UFP.QA 9.11.1.3 O Monitor and compile user feedback to UFC web presence and procedures Aug-09 Jul-10 D. Diehl (SDSC); IU KB team; T. Dudek (ANL); M. Dahan (TACC)<br />
UFP.QA 9.11.1.4 O Respond to content update requests for www.teragrid.org within 1 business day. Aug-09 Jul-10 T. Dudek (ANL) (0.25 FTE)<br />
Ongoing<br />
Ongoing<br />
Cancelled/Moved<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
User News application ported from<br />
SDSC's Oracle to TeraGrid postgreSQL.<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
Cancelled/Moved<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
Ongoing<br />
User News application ported from<br />
SDSC's Oracle to TeraGrid postgreSQL.
UFP.QA 9.11.1.5 O Provide central coordination of TG-wide documentation contributions and updates submitted from across the GIG and RP sites, using capabilities within the Liferay environment. Aug-09 Jul-10 D. Diehl (SDSC)<br />
Ongoing Ongoing<br />
UFP.QA 9.11.1.6 O Conduct ongoing series of coordinated reviews to evaluate, synchronize and update documentation, KB and other web info related to TG-wide user info areas. Aug-09 Jul-10 D. Diehl (SDSC); IU KB team<br />
Ongoing Review and release of AUS performance pages into Liferay.<br />
Ongoing Review and release of AUS performance pages into Liferay.<br />
UFP.QA 9.11.1.7 O Completion of 250 new KB documents, some in response to US and user inputs through the ticket system. Aug-09 Jul-10 IU KB team<br />
Ongoing Ongoing<br />
UFP.QA 9.11.1.8 O Ongoing updates and special requests for web content including posting information for EOT and ER areas, and conference/workshop sites. Aug-09 Jul-10 T. Dudek (ANL) (0.25 FTE)<br />
Ongoing Ongoing<br />
UFP.EUA 9.12 P Enhanced TeraGrid User Access<br />
UFP.EUA Enhanced TeraGrid User Access PY4<br />
UFP.EUA D Linking Shibboleth login to TGUP login in TGUP test instance completed completed<br />
UFP.EUA D Complete user account management, including end-to-end Shibboleth authentication in production<br />
completed In progress (Nearing completion)<br />
UFP.EUA D Interactive capabilities for job submission in TGUP Delayed Delayed<br />
UFP.EUA D Job reservations form in TGUP Delayed Delayed<br />
UFP.EUA D TGUP features combined into a workflow including file management, job submission and remote visualization Delayed Delayed<br />
UFP.EUA<br />
D Co-scheduling proof of concept portlet<br />
UFP.EUA 9.12.1 P Enhanced TeraGrid User Access PY5 Aug-09 Jul-10<br />
NCSA Ester Soriano[10%], Steve Quinn[10%]; TACC Praveen Nuthulapati[100%], Steve Mock [50%]<br />
UFP.EUA 9.12.1.1 P Users able to create TG logins at the TGUP Aug-09 Oct-09 Ongoing In Progress Ongoing Implementation plan approved<br />
UFP.EUA 9.12.1.2 P Job submission interface in TGUP Nov-09 Jan-10<br />
UFP.EUA 9.12.1.3 P Users can use campus credentials to authenticate to TGUP Feb-10 Apr-10<br />
UFP.EUA 9.12.1.4 P Lustre-WAN access via TGUP with cross-TG data movement Mar-10 Jul-10<br />
Cancelled New file sharing feature in TGUP file manager portlet provides partial capability, including cross-TG data movement<br />
Cancelled New file sharing feature in TGUP file manager portlet provides partial capability, including cross-TG data movement<br />
UFP.RAM 9.13 P Resource Authorization and Management<br />
UFP.RAM 9.13.1 P Resource Authorization and Management PY5 Aug-09 Jul-10<br />
NCSA Ester Soriano[60%], Steve Quinn[25%], Mike Shapiro[10%]; TACC Maytal Dahan[50%]; PSC Edward Hanna[10%], Rob Light[10%]<br />
UFP.RAM 9.13.1.1 P Improved user authorization process Aug-09 Oct-09 Ongoing Ongoing<br />
UFP.RAM 9.13.1.2 P Expanding user capabilities for managing allocations and usage Nov-09 Jan-10<br />
Ongoing New resource request interface deployed in POPS. Work beginning on revised POPS entry screens.<br />
Ongoing<br />
UFP.RAM 9.13.1.3 P POPS integrated with TGUP Feb-10 Apr-10 Cancelled Cancelled<br />
UFP.RAM 9.13.1.4 P Resource Advisor made available to users within UFP systems Mar-10 Jul-10 On hold<br />
UFP.IIS 9.14 P RP Integration and Information Sharing<br />
UFP.IIS P RP Integration and Information Sharing PY4<br />
UFP.IIS D RDR system moved into production environment<br />
Delayed Delayed<br />
UFP.IIS D Multi-audience TGCDB and POPS reporting capabilities implemented within TGUP as UFC development permits<br />
Complete In progress (Nearing completion)<br />
UFP.IIS 9.14.1 P RP Integration and Information Sharing PY5 Aug-09 Jul-10<br />
NCSA Ester Soriano[10%], Mike Shapiro[10%]; PSC Edward Hanna[25%], Rob Light[25%]<br />
New resource request interface deployed in POPS. Work beginning on revised POPS entry screens.<br />
UFP.IIS 9.14.1.1 P RDR-based descriptions in POPS Aug-09 Oct-09 In Progress Delayed<br />
UFP.IIS 9.14.1.2 P RDR-based descriptions in TGCDB Nov-09 Jan-10 In Progress Delayed<br />
UFP.IIS 9.14.1.3 P Hardened, updated RDR Feb-10 Apr-10 In Progress Delayed<br />
UFP.IIS 9.14.1.4 P RDR includes non-compute resources Mar-10 Jul-10 Completed Completed<br />
UFP.UIP 9.15 P User Information Presentation<br />
UFP.UIP P User Information Presentation PY4<br />
UFP.UIP D Post-migration alignment of portal and Web site look, feel, and organization.<br />
Ongoing Will be an ongoing effort for duration of TeraGrid.<br />
Ongoing<br />
UFP.UIP D With WebSphere environment, developing scalable documentation model for TeraGrid<br />
Cancelled In progress<br />
UFP.UIP D Migration to RDR/MDS as definitive source of Resource Catalog info<br />
In Progress In progress<br />
UFP.UIP D Enhanced TGUP system monitor with customizable list of resources and expanded resource information<br />
UFP.UIP 9.15.1 P User Information Presentation PY5 Aug-09 Jul-10<br />
PSC Edward Hanna[20%], Rob Light[20%]; SDSC Michael Dwyer[65%]; TACC Rion Dooley[75%]<br />
UFP.UIP 9.15.1.1 P Integration of TeraGrid user portal info services with RDR Aug-09 Oct-09 Delayed Delayed<br />
UFP.UIP 9.15.1.2 P Integration of TG catalogs and User News with RDR Nov-09 Jan-10 Delayed Delayed<br />
UFP.UIP 9.15.1.3 P Enhanced system monitors for HPC, storage, and vis resources Feb-10 Apr-10 Delayed Delayed<br />
UFP.UIP 9.15.1.4 P Enhanced integration of user news in TGUP and user notification Mar-10 Jul-10 Delayed Delayed<br />
END<br />
Delayed<br />
Delayed<br />
Will be an ongoing effort for duration of TeraGrid.
TeraGrid DV Extension Year Project Plan<br />
Work Package Dependencies Planned<br />
Resources<br />
Project-ID<br />
WBS<br />
O P M X<br />
Description<br />
Task<br />
Cross<br />
Area<br />
Start<br />
Date<br />
End<br />
Date<br />
Name [%FTE]<br />
DV 3.0<br />
Objective 3.0 Operations:<br />
maintain/sustain current capabilities<br />
DV.Coord 3.0.1 O DAV Area Coordination<br />
DV.Coord 3.0.1.3 O DV Area Coordination Aug-09 Jul-11 TACC Kelly Gaither [50%], Chris Jordan[25%] ongoing<br />
DV.Coord 3.0.1.3.1 M Quarterly Reporting Aug-09 Jul-11 ongoing<br />
DV.DDM.RP O Aggregate RP Distributed Data Management Ops Apr-10 Jul-11<br />
DV.DDM.RP.NCSA 3.0.1.4 O NCSA RP Distributed Data Management Operations Apr-10 Jul-11 Alt [40%], B.Butler [75%], M.Butler [65%], Cai [40%], Chen [100%], Cribbs [50%], Glasgow [90%], Kerner [90%], Loftus [45%], Long [20%] ongoing<br />
DV.DDM.RP.IU 3.0.1.5 O<br />
DV.DDM.RP.LONI 3.0.1.6 O LONI RP Distributed Data Management Operations Apr-10 Jul-11 Martinez [100%] ongoing<br />
DV.DDM.RP.NICS 3.0.1.7 O NICS RP Distributed Data Management Operations Apr-10 Jul-11 Kovatch [50%], Hazlewood [50%], TBD [100%], TBD [100%], TBD [50%] ongoing<br />
DV.DDM.RP.ORNL 3.0.1.8 O<br />
DV.DDM.RP.PSC 3.0.1.9 O PSC RP Distributed Data Management Operations Apr-10 Jul-11 Kar[40%], Litzinger[50%], Nowoczynski[60%], Sorge[15%], Stone[80%], Yanovich[80%] ongoing<br />
DV.DDM.RP.PU 3.0.1.10 O<br />
DV.DDM.RP.SDSC 3.0.1.11 O SDSC RP Distributed Data Management Operations Apr-10 Jul-11 Bennett [75%], Cai [95%], Chen, S. [40%], Chen, L. [80%], Dinh [50%], Hom [75%], Rivers [65%] ongoing<br />
DV.DDM.RP.TACC 3.0.1.12 O TACC RP Distributed Data Management Operations Apr-10 Jul-11 Jones[25%] ongoing<br />
DV.DDM.RP.UCANL 3.0.1.13 O<br />
DV.DC.RP O Aggregate RP Data Collections Ops Apr-10 Jul-11<br />
DV.DC.RP.NCSA 3.0.1.14 O<br />
DV.DC.RP.IU 3.0.1.15 O<br />
DV.DC.RP.LONI 3.0.1.16 O<br />
DV.DC.RP.NICS 3.0.1.17 O NICS RP Data Collections Operations Apr-10 Jul-11 Hazlewood [50%], TBD [70%] ongoing<br />
DV.DC.RP.ORNL 3.0.1.18 O<br />
DV.DC.RP.PSC 3.0.1.19 O<br />
DV.DC.RP.PU 3.0.1.20 O PU RP Data Collections Operations Apr-10 Jul-11 Zhao[50%], 2 half-time grad students [100%] ongoing<br />
DV.DC.RP.SDSC 3.0.1.21 O SDSC RP Data Collections Operations Apr-10 Jul-11 Nunes [10%], Wong [25%] ongoing<br />
2010Q4 Status Update
DV.DC.RP.TACC 3.0.1.22 O TACC RP Data Collections Operations Apr-10 Jul-11 Urban[50%] ongoing<br />
DV.DC.RP.UCANL 3.0.1.23 O<br />
DV.Viz.RP O Aggregate Visualization Ops Apr-10 Jul-11<br />
DV.Viz.RP.NCSA 3.0.1.24 O<br />
DV.Viz.RP.IU 3.0.1.25 O<br />
DV.Viz.RP.LONI 3.0.1.26 O<br />
DV.Viz.RP.NICS 3.0.1.27 O NICS RP Visualization Operations Apr-10 Jul-11 TBD [60%], TBD [60%] ongoing<br />
DV.Viz.RP.ORNL 3.0.1.28 O<br />
DV.Viz.RP.PSC 3.0.1.29 O PSC RP Visualization Operations Apr-10 Jul-11 Foss[90%], new[10%] ongoing<br />
DV.Viz.RP.PU 3.0.1.30 O PU RP Visualization Operations Apr-10 Jul-11 New Person [100%], 2 half-time grad students [100%] ongoing<br />
DV.Viz.RP.SDSC 3.0.1.31 O SDSC RP Visualization Operations Apr-10 Jul-11 Chourasia [25%], TBD [150%] ongoing<br />
DV.Viz.RP.TACC 3.0.1.32 O TACC RP Visualization Operations Apr-10 Jul-11 Gaither[10%], Burns[50%], GregSJohnson[30%], Schneider[50%], GregPJohnson[50%] ongoing<br />
DV.Viz.RP.UCANL 3.0.1.33 O UCANL RP Visualization Operations Apr-10 Jul-11 Insley[11%] ongoing<br />
DV.DPI 3.2<br />
Objective 3.2 Projects: Data Movement<br />
Performance<br />
DV.DPI 3.2.5 P Data Movement Performance PY5 Aug-09 Jul-11<br />
PSC Kathy Benninger[20%],<br />
Bob Budden[50%], Derek<br />
Simmel[60%]<br />
DV.DPI 3.2.5.2 P Production tools for scheduled data movement Apr-10 Jul-11 TBD ONGOING<br />
DV 3.3<br />
Objective 3.3 Projects: Global Wide<br />
Area File Systems<br />
DV.GFS 3.3.3 P Global Wide Area File Systems PY5 Aug-09 Jul-11<br />
NICS Phil Andrews[5%], Patricia Kovatch[15%], Nathaniel Mendoza[15%], Victor Hazlewood[15%]; PSC Josephine Palencia[50%]; SDSC Jeff Bennett[25%], Thomas Guptill[25%]; TACC GFS Engineer[20%]<br />
DV.GFS 3.3.3.1 P Lustre-WAN in Production Apr-10 Jul-11 ONGOING<br />
DV.GFS 3.3.3.6 P Evaluation of pNFS readiness for production Apr-10 Jul-11 DELAYED (SOFTWARE AVAILABILITY)<br />
DV 3.4<br />
Objective 3.4 Projects: TeraGrid Wide<br />
Data Architecture Design<br />
DV.DWG 3.4.4 P Data Architecture PY5 Aug-09 Jul-10<br />
NICS Phil Andrews[10%], Bruce Loftis[20%]; PSC J Ray Scott[25%]; TACC Chris Jordan[50%]<br />
complete<br />
DV.DWG 3.4.4.3 P Physical instantiation of the design 3.4.3.3 TBD Apr-10 Mar-11 ONGOING<br />
DV 3.5 Objective 3.5 Projects: Visualization<br />
DV<br />
END
TeraGrid NOS Project Plan<br />
Work Package Dependencies Planned<br />
Resources<br />
Project-ID<br />
WBS<br />
O P M X<br />
Description<br />
Task<br />
Cross<br />
Area<br />
Start<br />
Date<br />
End<br />
Date<br />
Name [%FTE]<br />
Status<br />
NOS 4.0<br />
Objective 4.0 Operations:<br />
maintain/sustain current capabilities<br />
NOS.Coord 4.0.1 O NOS Area Coordination<br />
NOS.Coord 4.0.1.3 O NOS Area Coordination PY5 Aug-09 Jul-10 UCANL Jeff Koerner [25%] ongoing<br />
NOS.Coord 4.0.1.3.1 M Quarterly Reporting (9/30, 12/31, 3/31, 6/30) Aug-09 Jul-10 ongoing<br />
NOS.Net 4.0.2 Networking<br />
NOS.Net 4.0.2.3 O Networking Lead PY5 Apr-10 Mar-11<br />
SDSC Tom Hutton [10%];<br />
ANL Linda Winkler [50%]<br />
ongoing<br />
NOS.Net.RP O Aggregate RP Networking Ops Apr-10 Mar-11<br />
NOS.NET.RP.NCSA 4.0.2.4 O NCSA RP Networking Operations Apr-10 Mar-11 Wefel [10%], Shoop [10%] ongoing<br />
NOS.NET.RP.IU 4.0.2.5 O IU RP Networking Operations Apr-10 Mar-11 ongoing<br />
NOS.NET.RP.LONI 4.0.2.6 O LONI RP Networking Operations Apr-10 Mar-11 ongoing<br />
NOS.NET.RP.NICS 4.0.2.7 O NICS RP Networking Operations Apr-10 Mar-11 Mendoza [50%], TBD [100%], Baer [50%] ongoing<br />
NOS.NET.RP.ORNL 4.0.2.8 ongoing<br />
NOS.NET.RP.PSC 4.0.2.9 O PSC RP Networking Operations Apr-10 Mar-11 Adams[25%], Benninger[40%], Huntoon[40%], Lambert[50%], Lappa[25%], Rapier[25%] ongoing<br />
NOS.NET.RP.PU 4.0.2.10 O Purdue RP Networking Operations Apr-10 Mar-11 Lewis [10%] ongoing<br />
NOS.NET.RP.SDSC 4.0.2.11 O SDSC RP Networking Operations Apr-10 Mar-11 Carlson [45%], Dombrowski [65%], Hutton [20%], Valente [60%] ongoing<br />
NOS.NET.RP.TACC 4.0.2.12 O TACC RP Networking Operations Apr-10 Mar-11 Jones [25%] ongoing<br />
NOS.NET.RP.UCANL 4.0.2.13 O UCANL RP Networking Operations Apr-10 Mar-11 Hedden [13%] ongoing<br />
NOS.Sec 4.0.4 O Security Services Apr-10 Mar-11<br />
NOS.Sec 4.0.4.1 O Operational Security Team PY5 Aug-09 Jul-10 PSC James Marsteller [50%] ongoing<br />
NOS.Sec 4.0.4.1.1 O Incident Response activities Aug-09 Jul-10 ongoing<br />
NOS.Sec 4.0.4.1.2 O Coordinating operational security issues across the project Aug-09 Jul-10 ongoing<br />
NOS.Sec 4.0.4.1.3 O Participate in and help lead TAGPMA Aug-09 Jul-10 ongoing<br />
NOS.Sec.RP O Aggregate RP Security Ops Apr-10 Mar-11 ongoing<br />
NOS.Sec.RP.NCSA 4.0.4.2 O NCSA RP Security Operations Apr-10 Mar-11 Barlow [30%], Brooks [15%], Sharma [60%] ongoing<br />
NOS.Sec.RP.IU 4.0.4.3 O IU RP Security Operations Apr-10 Mar-11 Cornet [75%] ongoing<br />
NOS.Sec.RP.LONI 4.0.4.4 ongoing<br />
NOS.Sec.RP.NICS 4.0.4.5 ongoing<br />
NOS.Sec.RP.ORNL 4.0.4.6 ongoing
NOS.Sec.RP.PSC 4.0.4.7 O PSC RP Security Operations Apr-10 Mar-11 Bennett[30%], Marsteller[40%], Shelmire[60%], Simmel[10%], new[10%] ongoing<br />
NOS.Sec.RP.PU 4.0.4.8 ongoing<br />
NOS.Sec.RP.SDSC 4.0.4.9 O SDSC RP Security Operations Apr-10 Mar-11 Sakai [30%], Bennett [50%] ongoing<br />
NOS.Sec.RP.TACC 4.0.4.10 O TACC RP Security Operations Apr-10 Mar-11 Murray [25%] ongoing<br />
NOS.Sec.RP.UCANL 4.0.4.11 O UCANL RP Security Operations Apr-10 Mar-11 Leggett [13%] ongoing<br />
NOS.Sec 4.0.4.3 O Security Services (kerb, myproxy, CA) PY5 Aug-09 Jul-10 John Quinn [50%] ongoing<br />
NOS.Sec 4.0.4.3.1 M Quarterly Reporting (9/30, 12/31, 3/31, 6/30) Aug-09 Jul-10 ongoing<br />
NOS.TOC 4.0.5 O TOC Services Aug-09 Jul-10 ongoing<br />
NOS.TOC 4.0.5.3 O TOC Services PY5 Aug-09 Jul-10 NCSA Ops Staff [200%] ongoing<br />
NOS.TOC 4.0.5.3.1 M Quarterly Reporting (9/30, 12/31, 3/31, 6/30) Aug-09 Jul-10 ongoing<br />
NOS.TOC.RP O Aggregate RP Helpdesk Ops Apr-10 Mar-11 ongoing<br />
NOS.TOC.RP.NCSA 4.0.5.4 O NCSA RP Helpdesk Operations Apr-10 Mar-11 Cagle [50%], Nickens [20%], Pingleton [45%], Roney [75%], Sellers [75%], Ten Have [45%], Wells [60%], Wilson [55%] ongoing<br />
NOS.TOC.RP.IU 4.0.5.5 ongoing<br />
NOS.TOC.RP.LONI 4.0.5.6 ongoing<br />
NOS.TOC.RP.NICS 4.0.5.7 ongoing<br />
NOS.TOC.RP.ORNL 4.0.5.8 ongoing<br />
NOS.TOC.RP.PSC 4.0.5.9 O PSC RP Helpdesk Operations Apr-10 Mar-11 Hackworth[10%], Maiden[5%], Raymond[2%], Sanielevici[3%] ongoing<br />
NOS.TOC.RP.PU 4.0.5.10 O Purdue RP Helpdesk Operations Apr-10 Mar-11 New person [50%] ongoing<br />
NOS.TOC.RP.SDSC 4.0.5.11 ongoing<br />
NOS.TOC.RP.TACC 4.0.5.12 ongoing<br />
NOS.TOC.RP.UCANL 4.0.5.13 ongoing<br />
NOS.HPCOps 4.0.8 O HPC Operations ongoing<br />
NOS.HPCOps.RP O Aggregate RP HPC Operations Apr-10 Mar-11 ongoing<br />
NOS.HPCOps.RP.NCSA 4.0.8.1 O NCSA RP HPC Operations Apr-10 Mar-11 Bouvet [85%], Fernsler [50%], Hoyenga [80%], Khin [80%], Lapine [75%], Marcusiu [65%], Parga [75%], Pflugmacher [10%], J.Quinn [65%], Scharf [50%] ongoing<br />
NOS.HPCOps.RP.IU 4.0.8.2 O IU RP HPC Operations Apr-10 Mar-11 Lowe [100%], Moore[100%], Miller[100%] ongoing<br />
NOS.HPCOps.RP.LONI 4.0.8.3 O LONI RP HPC Operations Apr-10 Mar-11 Martinez [100%], Leche [100%], Giaime [100%], Scheinine [50%] ongoing<br />
NOS.HPCOps.RP.NICS 4.0.8.4 O NICS RP HPC Operations Apr-10 Mar-11 Walsh [100%], Baer [50%], Kovatch [50%], Jones [50%], Ezell [50%], TBD [300%] ongoing<br />
NOS.HPCOps.RP.ORNL 4.0.8.5 O ongoing
Albert[85%], Bennett[45%], ongoing<br />
Budden[90%], Flaus[20%],<br />
NOS.HPCOps.RP.PSC 4.0.8.6 O PSC RP HPC Operations Apr-10 Mar-11<br />
Gill[90%], Graham[25%],<br />
Johanson[80%], Kar[40%],<br />
Kochmar[90%],<br />
Palencia[20%] Perrone[50%]<br />
Scott[50%],<br />
ongoing<br />
Sommerfield[80%],<br />
Sorge[75%], Sullivan[50%],<br />
NOS.HPCOps.RP.PU<br />
4.0.8.6<br />
4.0.8.7<br />
O PSC RP HPC Operations Continued<br />
O Purdue RP HPC Operations<br />
Apr-10<br />
Apr-10<br />
Mar-11<br />
Mar-11<br />
Vargo[90%], Vizino[90%],<br />
Webb[50%], Wozniak[80%],<br />
Cubbison[30%], Lappa[25%],<br />
new[55%]<br />
P. Smith [10%], Braun [50%] ongoing<br />
Conway [75%], Diegel [50%], ongoing<br />
NOS.HPCOps.RP.SDSC 4.0.8.8 O SDSC RP HPC Operations Apr-10 Mar-11<br />
Fillez [30%], Furman [60%],<br />
Guptill [20%], Hocks [50%],<br />
Kamrath [65%], Khem [60%],<br />
McNew[25%], Silva [25%],<br />
Smallen [5%], Yoshimoto<br />
[50%]<br />
Timm [50%], Carver [50%],<br />
NOS.HPCOps.RP.TACC 4.0.8.9 O TACC RP HPC Operations Apr-10 Mar-11 Anderson [25%], Walling<br />
ongoing<br />
[25%]<br />
NOS.HPCOps.RP.UCANL 4.0.8.10 O UCANL RP HPC Operations Apr-10 Mar-11<br />
Leggett [16%], Insley [18%],<br />
Hedden [45%], Olson [58%]<br />
ongoing<br />
Objective 4.2 Projects: Expanding<br />
NOS 4.2 P<br />
Secure TG Access<br />
ongoing<br />
NOS.Acc 4.2.2 P Expanding Secure TG Access PY4<br />
To WGs,<br />
Jim Basney [20%],<br />
Aug-08 Jul-09<br />
SGW<br />
Terry Fleury [50%]<br />
ongoing<br />
NOS.Acc 4.2.2.2 M Automated DN distribution compliance Aug-08 Dec-09 Delayed<br />
NOS.Acc 4.2.2.4<br />
Requirements/Design for cross-grid<br />
M<br />
Collab.<br />
Aug-08 Jul-10 ongoing<br />
NOS.Acc 4.2.3 P Expanding Secure TG Access PY5 Aug-09 Jul-10<br />
NOS.Acc 4.2.3.1 P<br />
Ubiquitous Adoption of Gateway<br />
accounting<br />
Aug-09<br />
Jul-10<br />
NCSA Jim Basney [25%], Jon<br />
Siwek [50%], Terry Fleury<br />
[25%]<br />
NOS.Acc 4.2.3.2 P INCA tests for Gateway accounting Aug-09 Jul-10 done<br />
NOS.Acc 4.2.3.3 P<br />
TG/OSG authorization interoperability<br />
functional<br />
NOS.Acc 4.2.3.4 P Shibboleth access for campus to TGUP Aug-09 Jul-10 done<br />
NOS 4.3<br />
Objective 4.3 Projects: Operational<br />
Instrumentation (device tracking)<br />
NOS.Inst 4.3.3 P Operational Instrumentation PY5 Aug-09 Jul-10 NCSA Neil Gorsuch [45%]<br />
NOS.Inst 4.3.3.1<br />
Ongoing Monthly, Quarterly and Annual<br />
O<br />
reporting<br />
Aug-09 Jul-10 ongoing<br />
Upgrade Globus Listener to support<br />
NOS.Inst 4.3.3.3 P<br />
GRAM5<br />
Feb-10 Jul-10 done<br />
NOS.Inst 4.3.3.4 P Integration plan for TAIS May-10 Jul-10 ongoing<br />
NOS.INCA 4.4.3 P INCA Improvements PY5 Aug-09 Jul-10<br />
SDSC Kate Ericson [50%],<br />
Shava Smallen[50%]<br />
ongoing<br />
NOS.INCA 4.4.3.1 P QA/CUE views Aug-09 Jul-10 ongoing<br />
NOS.INCA 4.4.3.2 P Link Tickets to Inca results Nov-09 Jan-10 ongoing<br />
Aug-09<br />
Jul-10<br />
pending gram5 deployment at<br />
all sites<br />
not started - depends on 4.2.2.4<br />
above
NOS.INCA 4.4.3.3 P knowledgebase linkage Jan-10 Jun-10 ongoing<br />
NOS.INCA 4.4.3.4 P integrate inca into TGUP Mar-10 Sep-10 ongoing<br />
NOS 4.6<br />
Objective 4.6 Projects: Annual Risk<br />
Assessment<br />
NOS.Risk 4.6.1 P Risk Assessment Aug-08 Jul-09 Jim Rome [10%]<br />
NOS.Risk 4.6.1.1 M ID assessment area and write plan Aug-08 Sep-09 Complete complete<br />
NOS.Risk 4.6.1.2 M Begin assessment of area Aug-08 Dec-09 complete<br />
NOS.Risk 4.6.1.3 M Draft report of findings Aug-08 Mar-10 complete<br />
NOS.Risk 4.6.1.4 M Final assessment and recommendations Aug-08 Jun-10 complete<br />
NOS.QACUE<br />
NOS.QA<br />
Objective X.X Projects: QA/CUE<br />
PY4 Availability Improvement Aug-08 Jul-09<br />
ANL Joe Insley[17%]; IU Mike<br />
Lowe[50%]; LONI Archit<br />
Kulshrestha[25%]; NCSA Dan<br />
Lapine[25%], Luke<br />
Scharf[25%]; NICS Victor<br />
Hazlewood[50%]; PSC<br />
TBD[100%]; Purdue<br />
TBD[50%]; SDSC Thomas<br />
Guptill[50%], Jerry<br />
Greenberg[50%] TACC David<br />
NOS.QA<br />
Organize and identify priorities and<br />
SDSC Kate Ericson[12.5%];<br />
Apr-10 Mar-11<br />
timelines for software testing cycle<br />
Shava Smallen[12.5%]<br />
Ongoing<br />
NOS.QA Identify new and existing tests Apr-10 Mar-11<br />
SDSC Kate Ericson[12.5%];<br />
Shava Smallen[12.5%]<br />
Ongoing<br />
NOS.QA Determine Test baselines Dec-08 Mar-09<br />
SDSC Kate Ericson[12.5%];<br />
Shava Smallen[12.5%]<br />
Complete<br />
NOS.QA TG grid services administration guide Aug-09 Jan-10<br />
SDSC Kate Ericson[12.5%];<br />
Shava Smallen[12.5%]<br />
Complete<br />
NOS.QA PY4 Reliability Improvement Aug-08 Dec-09<br />
NOS.QA Identify common INCA errors and classify Aug-08 Mar-10<br />
SDSC Kate Ericson[12.5%];<br />
Shava Smallen[12.5%]<br />
Complete<br />
NOS.QACUE O PY5 QA/CUE Leadership Aug-09 Jul-10<br />
SDSC Kate Ericson[12.5%];<br />
Shava Smallen[12.5%]; PSC<br />
Shawn Brown[20%]<br />
NOS.QACUE P Common User interface roll out Aug-10 Jul-10 Shawn Brown[20%] Ongoing<br />
NOS.QA O PY5 Availability Improvement Aug-09 Jul-10<br />
NOS.Email P Maintain TeraGrid.org email services Apr-11 Mar-11 Neil Gorsuch (5%) Complete<br />
NOS<br />
END
QSR 2010Q4 Report Values<br />
QSR 2010Q3 Report Values<br />
QSR 2010Q2 Report Values<br />
QSR 2010Q1 Report Values<br />
QSR 2009Q4 Report Values<br />
Column headers: Project-ID | WBS | O/P/M | Work Package Description | Dependencies/Cross Task Area | Planned Start Date | Planned End Date | Resources Name [%FTE] | 12/31/10 AD Status | 12/31/10 AD Notes | 09/30/10 AD Status | 09/30/10 AD Notes | 6/30/10 AD Status | 6/30/10 AD Notes | 3/31/10 AD Status | 3/31/10 AD Notes | 12/31/09 AD Status | 12/31/09 AD Notes<br />
EOT 5 Area: Education, Outreach and Training<br />
EOT 5.0<br />
Objective 5.0 Operations:<br />
maintain/sustain current capabilities<br />
EOT.Coord 5.0.1 O EOT Area Coordination<br />
EOT.Coord 5.0.1.7 O EOT Area Coordination PY5 Aug-09 Jul-10 Scott Lathrop[50%]<br />
working group meetings; info sharing among<br />
100% 100% 100% 75%<br />
EOT.Coord 5.0.1.7.1 M<br />
Aug-09 Jul-10<br />
50%<br />
RP EOT staff<br />
PY5 Quarterly Reporting (9/30, 12/31, 3/31,<br />
100% 100% 100% 50%<br />
EOT.Coord 5.0.1.7.2 M<br />
Aug-09 Jul-10<br />
25%<br />
6/30)<br />
EOT.COOR 5.0.1.8 O Support HPCU portal infrastructure Aug-09 Jul-10 Shodor contract 100% 100% 100% 75% 50%<br />
EOT.Coord 5.0.1.9 O EOT Area Coordination Extension Aug-10 Jul-11 1.5 FTE for Extension<br />
AD and project coordinator activities for Scott<br />
50%<br />
EOT.Coord 5.0.1.9.1 M<br />
Aug-10 Jul-11<br />
and Elizabeth<br />
EOT 5.1<br />
Objective 5.1 Projects: Training<br />
(HPC University Workforce Development)<br />
IU Robert Ping [20%]<br />
Scott Tiege[20%];<br />
NCSA Sandie<br />
Kappes[50%]; NICS<br />
James Ferguson[30%],<br />
Bruce Loftis[20%]; PSC<br />
US, AUSS,<br />
Laura McGinnis[30%],<br />
EOT.TS 5.1.2 P Objective 5.1 Projects: Training<br />
Aug-09 Jul-10<br />
SG, DIV<br />
Robin Flaus[5%], Ryan<br />
Omecene[10%], Tom<br />
Maiden[3%], John<br />
Urbanic[3%], Phil<br />
Blood[3%]; SDSC<br />
Edward Jeff Sale[33%];<br />
TACC Chris<br />
EOT.TS 5.1.2.1<br />
Training & Seminars PY5 (training offered<br />
To RPs Aug-09 Jul-10<br />
at RP sites; on-line materials developed)<br />
Development of one new training content for<br />
US, AUSS,<br />
Training team 100% One is complete and the other was to 95% One is complete and the other was to 90% Developing two new online courses. 75%<br />
EOT.TS 5.1.2.5 P live, synchronous, and/or on-line delivery each<br />
Jul-09 Jul-10<br />
SG, DIV<br />
be complete by the end of Sept. s/b<br />
be complete by the end of Sept. s/b<br />
quarter<br />
completed mid Oct<br />
completed mid Oct<br />
65% Two more in development<br />
EOT.TS.RP 5.1.2.7 O Aggregate RP Training Operations Aug-09 Mar-10<br />
EOT.TS.RP.<br />
Arnold [50%], File 100% 100% 100% 75%<br />
5.1.2.7.1 O NCSA RP Training Operations Aug-09 Mar-10<br />
NCSA<br />
[80%], Glick [25%],<br />
Kappes [20%]<br />
50%<br />
EOT.TS.RP.I<br />
100% 100% 100% 75%<br />
5.1.2.7.2 O IU RP Training Operations Aug-09 Mar-10<br />
U<br />
50%<br />
EOT.TS.RP.<br />
100% 100% 100% 75%<br />
5.1.2.7.3 O LONI RP Training Operations Aug-09 Mar-10<br />
LONI<br />
50%<br />
EOT.TS.RP.<br />
Ferguson [50%], TBD 100% 100% 100% 75%<br />
5.1.2.7.4 O NICS RP Training Operations Aug-09 Mar-10<br />
NICS<br />
[50%]<br />
50%<br />
EOT.TS.RP.<br />
5.1.2.7.5 O ORNL RP Training Operations Aug-09 Mar-10<br />
ORNL<br />
McGinnis[4%], 100% 100% 100% 75%<br />
EOT.TS.RP.<br />
Mahmoodi[8%],<br />
5.1.2.7.6 O PSC RP Training Operations Aug-09 Mar-10<br />
PSC<br />
Urbanic[15%], Brown,<br />
S[3%], Madrid[5%],<br />
50%<br />
Maiden[35%] Jana[5%]<br />
EOT.TS.RP.<br />
100% 100% 100% 75%<br />
5.1.7.7 O PU RP Training Operations Aug-09 Mar-10<br />
PU<br />
50%<br />
EOT.TS.RP.<br />
Sale [50%] 100% 100% 100% 75%<br />
5.1.7.8 O SDSC RP Training Operations Aug-09 Mar-10<br />
SDSC<br />
50%<br />
EOT.TS.RP.<br />
Turner [10%], Wilson 100% 100% 100% 75%<br />
5.1.7.9 O TACC RP Training Operations Apr-09 Mar-10<br />
TACC<br />
[10%]<br />
50%<br />
EOT.TS.RP.<br />
5.1.7.10 O<br />
75%<br />
UC/ANL RP Training Operations Aug-09 Mar-10<br />
UCANL<br />
50%<br />
Training & Seminars Extension (training<br />
EOT.TS 5.1.8.1<br />
offered at RP sites; on-line materials<br />
To RPs Aug-10 Jul-11<br />
developed)<br />
Training team 25% On track, one from PSC and one for 5% early in process but on track, one from<br />
US, AUSS,<br />
EOT.TS 5.1.8.1.1 P Develop 3 new classes<br />
Aug-10 Jul-11<br />
NICS, and one from TACC<br />
PSC and one for NICS, and one from<br />
SG, DIV<br />
TACC<br />
Webcast Parallel Computing and<br />
US, AUSS,<br />
Training team 0% 0%<br />
EOT.TS 5.1.8.1.2 P<br />
Aug-10 Jul-11<br />
Programming Class 2 times<br />
SG DIV<br />
US, AUSS,<br />
Training team 50% 2 complete, 2 planned 0% Classes are scheduled. The next<br />
EOT.TS 5.1.8.1.2 P Produce 4 webcasts<br />
Aug-10 Jul-11<br />
SG DIV<br />
class is scheduled in Oct<br />
US, AUSS,<br />
Training team 0% Scheduled to start in January or when 0% Scheduled to start in January or when<br />
EOT.TS 5.1.8.1.2 P XD Transition place holder<br />
SG DIV<br />
the XD winner is identified<br />
the XD winner is identified<br />
EOT.RPTRN 5.1.9<br />
O<br />
Extension - Aggregate RP Training<br />
Aug-10 Jul-11<br />
Operations<br />
EOT.RPTRN<br />
Arnold [50%], File 50% 25%<br />
5.1.9.1 O Extension - NCSA RP Training Operations Aug-10 Jul-11<br />
.NCSA<br />
[80%], Glick [25%],<br />
Kappes [20%]<br />
EOT.RPTRN<br />
50% 25%<br />
5.1.9.2 O Extension - IU RP Training Operations Aug-10 Jul-11<br />
IU<br />
EOT.RPTRN<br />
50% 25%<br />
5.1.9.3 O Extension - LONI RP Training Operations Aug-10 Jul-11<br />
LONI<br />
EOT.RPTRN<br />
Ferguson [50%], TBD 50% 25%<br />
5.1.9.4 O Extension - NICS RP Training Operations Aug-10 Jul-11<br />
NICS<br />
[50%]<br />
McGinnis[4%], 50% 25%<br />
EOT.RPTRN<br />
Mahmoodi[8%],<br />
5.1.9.5 O Extension - PSC RP Training Operations Aug-10 Jul-11<br />
.PSC<br />
Urbanic[15%], Brown,<br />
S[3%], Madrid[5%],<br />
Maiden[35%] Jana[5%]<br />
EOT.RPTRN<br />
50% 25%<br />
5.1.9.6 O Extension - PU RP Training Operations Aug-10 Jul-11<br />
PU<br />
EOT.RPTRN<br />
Sale [50%] 50% 25%<br />
5.1.9.7 O Extension - SDSC RP Training Operations Aug-10 Jul-11<br />
SDSC<br />
EOT.RPTRN<br />
Turner [10%], Wilson 50% 25%<br />
5.1.9.8 O Extension - TACC RP Training Operations Aug-10 Jul-11<br />
TACC<br />
[10%]<br />
EOT.RPTRN<br />
5.1.9.9 O<br />
50% 25%<br />
Extension - UC/ANL RP Training Operations Aug-10 Jul-11<br />
UCANL<br />
EOT.OTS 5.1.10 Extension - Online Training Aug-10 Jul-11 1.04 FTE<br />
33% 1 completed, 2 started 10% 1 started, 1 going to start next quarter,<br />
EOT.OTS 5.1.10.1 P Develop 3-4 on-line training courses Aug-10 Jul-11<br />
the third will follow later in the year<br />
0% Scheduled to start in January or when 0% Scheduled to start in January or when<br />
EOT.OTS 5.1.10.2 P XD Transition place holder Aug-10 Jul-11<br />
the XD winner is identified<br />
the XD winner is identified<br />
EOT.TQA 5.1.11 Training and HPC University Quality Aug-10 Jul-11 0.48 FTE<br />
EOT.TQA 5.1.11.1 P QA of training materials Aug-10 Jul-11 33% reviewed the one that is completed 0%<br />
Scheduled to start in January or when<br />
Scheduled to start in January or when<br />
EOT.TQA 5.1.11.2 P XD Transition place holder Aug-10 Jul-11<br />
the XD winner is identified<br />
the XD winner is identified<br />
EOT.COMM 5.1.12 TG Community Aug-10 Jul-11 1.25 FTE<br />
Create a TeraGrid social collaboration<br />
25% plan is to use LifeRay environment. 25% Prototyped<br />
EOT.COMM 5.1.12.1 P environment for learning and community<br />
Aug-10 Jul-11<br />
Planning to work with Tim D.<br />
building<br />
0% Scheduled to start in January or when 0% Scheduled to start in January or when<br />
EOT.COMM 5.1.12.2 P XD Transition place holder Aug-10 Jul-11<br />
the XD winner is identified<br />
the XD winner is identified<br />
EOT 5.2<br />
Objective 5.2 Projects: External Relations<br />
(Science Highlights)<br />
EOT.SH 5.2.3 Science Highlights 2010 Jan-10 Nov-10<br />
UC Elizabeth<br />
Spans fiscal years<br />
Leake[10%]<br />
100% Distributed at SC10, to all RPs, and 75% 50%<br />
EOT.SH 5.2.3.1 P 2010 Science Highlights Production Apr-10 Nov-10<br />
NSF<br />
Froelich,ER team and 100% 100% 100%<br />
EOT.SH 5.2.3.2 P Review and select science stories to highlight<br />
Bruett Consulting<br />
Initiate identification of 2011 Science<br />
ER team 100% PY5 portion of SH ended, will be 100% PY5 portion of SH ended, will be 25%<br />
EOT.SH 5.2.3.3 P<br />
Apr-10 Jul-10<br />
Highlights<br />
covered by line 59<br />
covered by line 59<br />
EOT.SH 5.2.4 Science Highlights and EOT 2011 Jul-10 Nov-11 Spans fiscal years<br />
ER team 10% This has been changed to combine 5%<br />
Initiate collection of stories for the winning XD<br />
EOT.SH 5.2.4.1 P<br />
Jul-10 Nov-11<br />
the EOT and SH docs into a single<br />
team<br />
pub XD Transition<br />
EOT 5.3 Objective 5.3 Projects: Outreach
EOT.OE 5.3.2.2.1 M Close out books on TG'09 Conference Jul-09 Dec-09 100% 100% On-going Mike - Need to check with John T. On-going Need to check with Matt. On-going Matt is talking with Towns to get the data.<br />
IU Robert Ping[20%];<br />
PSC Laura<br />
EOT.OE 5.3.5 Outreach Events PY5 ALL TG Aug-09 Jul-10<br />
McGinnis[10%]; SDSC<br />
Ange Mason[20%]<br />
presentations, exhibits at conferences and<br />
100% 100% 100% 75%<br />
EOT.OE 5.3.5.1 M<br />
Aug-09 Jul-10<br />
50%<br />
outreach events<br />
90% 25%<br />
Event is in August so effort increases as<br />
EOT.OE 5.3.5.3 M TG10 Aug-09 Jul-10<br />
20%<br />
the event approaches<br />
EOT.OE 5.3.5.3.1 M Close out books on TG'10 Conference Jul-10 Dec-10 100% Closed 90% Mike - Ask Richard M.<br />
EOT.OE 5.3.5.5 M Campus Champions meeting at TG10 Jun-10 Jul-10 Hunt 100% 100%<br />
EOT.OE 5.3.5.6 M Campus Champions meeting Spring 2010 Jan-10 Apr-10 Hunt 100% 100% 100%<br />
Campus Champion Visits - 2 per quarter<br />
100% 100% 100% 75%<br />
EOT.OE 5.3.5.7 M<br />
Aug-09 Jul-10<br />
50%<br />
throughout the year<br />
EOT.OE 5.3.5.8 O Campus Champions Consulting Aug-09 Jul-10 100% 100% 100% 75% 50%<br />
Campus Champions monthly training sessions<br />
100% 100% 100% 75%<br />
EOT.OE 5.3.5.9 M<br />
Aug-09 Jul-10<br />
50%<br />
- conference calls and/or sync delivery tools<br />
EOT.OE 5.3.5.10 P Professional Society Outreach Aug-09 Jul-10 Ping [20%] 100% 100% 100% 75% 50%<br />
Collect information on professional society<br />
Ping 100% 100% 100% 75%<br />
EOT.OE 5.3.5.10.1 P<br />
Aug-09 Jul-10<br />
50%<br />
meetings throughout the year<br />
Review and select future outreach events<br />
Ping 100% 100% 100% 75%<br />
EOT.OE 5.3.5.10.2 P quarterly: e.g. AAAS, APS, AGU, ACS, NSTA,<br />
Aug-09 Jul-10<br />
50%<br />
Tapia etc<br />
PY5 - presentations, exhibits at conferences<br />
100% 100% 100% 75%<br />
EOT.OE 5.3.5.10.3 P<br />
Aug-09 Jul-10<br />
50%<br />
and outreach events throughout the year<br />
EOT.OE 5.3.5.11 O TeraGrid Pathways Aug-09 Jul-10 100% 100% 100% 75% 50%<br />
EOT.OE 5.3.5.11.1 O Consulting of under-represented populations AUSS Aug-09 Jul-10 100% 100% 100% 75% 50%<br />
EOT.OE 5.3.5.11.2 O Fellowships for under-represented populations AUSS Aug-09 Jul-10 100% 100% 100% 75% 50%<br />
EOT.OE 5.3.5.11.3 O Mentoring of under-served populations AUSS Aug-09 Jul-10 100% 100% 100% 75% 50%<br />
Develop student competitions and awards in<br />
Wiziecki [20%] 100% 100% 100% 75%<br />
EOT.OE 5.3.5.12 P<br />
Aug-09 Jul-10<br />
50%<br />
conjunction with National Science Olympiad<br />
Work with Olympiad to establish computational<br />
Wiziecki 100% 100% 100% 75%<br />
EOT.OE 5.3.5.12.1 P<br />
Aug-09 Dec-09<br />
50%<br />
science component<br />
Wiziecki 100% 100% 100% 1 competition completed, one 50% 1 competition completed, one<br />
EOT.OE 5.3.5.12.2 P Develop competitions Jan-10 May-10<br />
remaining in May<br />
remaining in May<br />
Wiziecki 100% 100% 100% 1 competition completed, one 50% 1 competition completed, one<br />
EOT.OE 5.3.5.12.3 P Deliver competitions and make awards Apr-10 Jul-10<br />
remaining in May<br />
remaining in May<br />
IU Robert Ping[20%];<br />
PSC Laura<br />
EOT.OE 5.3.6 Outreach Events Extension ALL TG Aug-10 Jul-11<br />
McGinnis[10%]; SDSC<br />
Ange Mason[20%] 0.5<br />
FTE for Extension<br />
Campus Champions monthly training sessions<br />
50% 25%<br />
EOT.OE 5.3.6.1 M<br />
Aug-10 Jul-11<br />
- conference calls and/or sync delivery tools<br />
EOT.OE 5.3.6.2 P Professional Society Outreach Aug-10 Jul-11 Ping [20%] 50% 25%<br />
Collect information on professional society<br />
Ping 50% 25%<br />
EOT.OE 5.3.6.3 P<br />
Aug-10 Jul-11<br />
meetings throughout the year<br />
Review and select future outreach events<br />
Ping 50% 25%<br />
EOT.OE 5.3.6.4 P quarterly: e.g. AAAS, APS, AGU, ACS, NSTA,<br />
Aug-10 Jul-11<br />
Tapia etc<br />
Extension - presentations, exhibits at<br />
50% 25%<br />
EOT.OE 5.3.6.5 P conferences and outreach events throughout<br />
Aug-10 Jul-11<br />
the year<br />
Develop student competitions and awards in<br />
Wiziecki [20%] 50% 0%<br />
EOT.OE 5.3.6.10 P<br />
Aug-10 Jul-11<br />
conjunction with National Science Olympiad<br />
Work with Olympiad to establish computational<br />
Wiziecki 50% 0%<br />
EOT.OE 5.3.6.11 P<br />
Aug-10 Dec-10<br />
science component<br />
EOT.OE 5.3.6.12 P Develop competitions Aug-10 May-11 Laura McGinnis 50% 25%<br />
Deliver competitions and make awards for<br />
Laura McGinnis 100% 0% SC10 event and TG11<br />
EOT.OE 5.3.6.13 P<br />
Nov-10 Jul-11<br />
SC10<br />
Deliver competitions and make awards for<br />
Laura McGinnis 0% 0% SC10 event and TG11<br />
EOT.OE 5.3.6.13 P<br />
Nov-10 Jul-11<br />
TG11<br />
NCSA Edee Wiziecki<br />
[20%]; PSC Laura<br />
McGinnis [10%], Robin<br />
Flaus [10%], Phil Blood<br />
EOT.Edu 5.4.3 Education PY5 ALL TG Aug-09 Jul-10 [5%], Shawn Brown<br />
[10%]; Purdue KayHunt<br />
[50%], TBD [25%],<br />
SDSC Ange Mason<br />
[20%], Diane Baxter<br />
EOT.EDU 5.4.3.2 M Education workshops May-10 Aug-10 100% 100% 90%<br />
EOT.EDU 5.4.3.3 M monthly newsletter of EOT resources Aug-09 Mar-10 100% 100% 100% 75% 50%<br />
EOT.EDU 5.4.3 P EOT Highlights 2010 Apr-10 Dec-10<br />
EOT.EDU 5.4.3.1 P Request stories from EOT team Apr-10 Jul-10 Mason [20%] 100% 100% 100%<br />
EOT.EDU 5.4.3.2 P Select and write stories for inclusion Jul-10 Oct-10 Mason, EOT team 100% 100% 50%<br />
Mason, EOT team, 100% 75% 25%<br />
EOT.EDU 5.4.3.3 P Prepare layout and proof Sep-10 Dec-10<br />
Bruett Consulting<br />
Bruett Consulting 100% Distributed at SC10, to RPs, and to 0% will be printed in Oct 2010<br />
EOT.EDU 5.4.3.4 P Print Highlights Sep-10 Oct-10<br />
NSF<br />
EOT.EDU 5.4.3.5 M PY5 - SC10 Education Program Jan-10 Nov-10 100% 75% 75%<br />
NCSA Edee Wiziecki<br />
[20%]; PSC Laura<br />
McGinnis [10%], Robin<br />
Flaus [10%], Phil Blood<br />
[5%], Shawn Brown<br />
EOT.Edu 5.4.4 Education Extension ALL TG Aug-10 Jul-11<br />
[10%]; Purdue KayHunt<br />
[50%], TBD [25%],<br />
SDSC Ange Mason<br />
[20%], Diane Baxter<br />
[20%] Extension Year<br />
EOT.EDU 5.4.4.1 M Education workshops May-11 Jul-11<br />
EOT.EDU 5.4.4.2 M monthly newsletter of EOT resources Aug-10 Mar-11 50% 25%<br />
n/a<br />
This has been changed to combine<br />
EOT.EDU 5.4.4.3 P EOT Highlights 2011 Jan-11 Mar-11<br />
the EOT and SH docs into a single<br />
pub. XD Transition. This will now be<br />
tracked under 5.2.4.1<br />
EOT.EDU 5.4.4.3.1 P Request stories from EOT team Jan-11 Mar-11 Mason [20%] n/a<br />
EOT.EDU 5.4.4.8 M Extension - Education Program Jan-11 Jul-11 5% Planning has started<br />
EOT.EDU.R<br />
Aggregate RP Education & Outreach<br />
5.4.7 O<br />
Aug-09 Mar-10<br />
P<br />
Operations<br />
EOT.EDU.R<br />
100% 100% 100% 75%<br />
5.4.7.1 O NCSA RP Education & Outreach Operations Aug-09 Mar-10<br />
50%<br />
P NCSA<br />
EOT.EDU.R<br />
100% 100% 100% 75%<br />
5.4.7.2 O IU RP Education & Outreach Operations Aug-09 Mar-10<br />
50%<br />
P IU<br />
EOT.EDU.R<br />
100% 100% 100% 75%<br />
5.4.7.3 O LONI RP Education & Outreach Operations Aug-09 Mar-10<br />
50%<br />
P LONI<br />
EOT.EDU.R<br />
Ferguson [50%], TBD 100% 100% 100% 75%<br />
5.4.7.4 O NICS RP Education & Outreach Operations Aug-09 Mar-10<br />
50%<br />
P NICS<br />
[100%] TBD [50%]<br />
EOT.EDU.R<br />
5.4.7.5 O ORNL RP Education & Outreach Operations Aug-09 Mar-10<br />
P ORNL<br />
Budden[5%],<br />
100% 100% 100% 75%<br />
Flaus[15%],<br />
Graham[15%],<br />
Hanna[5%], Light[5%],<br />
McGinnis[25%],<br />
EOT.EDU.R<br />
Nystrom[10%], Brown,<br />
5.4.7.6 O PSC RP Education & Outreach Operations Aug-09 Mar-10<br />
50%<br />
P.PSC<br />
S[2%], Cubbison[10%],<br />
Hackworth[15%],<br />
Maiden[10%],<br />
Domaracki[50%],<br />
Czech[50%],<br />
Ishwad[25%],
EOT.EDU.R<br />
100% 100% 100% 75%<br />
5.4.7.7 O PU RP Education & Outreach Operations Aug-09 Mar-10<br />
P PU<br />
EOT.EDU.R<br />
Baxter [37%], Mason 100% 100% 100% 75%<br />
5.4.7.8 O SDSC RP Education & Outreach Operations Aug-09 Mar-10<br />
P SDSC<br />
[20%]<br />
EOT.EDU.R<br />
Armosky [25%] 100% 100% 100% 75%<br />
5.4.7.9 O TACC RP Education & Outreach Apr-09 Mar-10<br />
P TACC<br />
EOT.EDU.R<br />
100% 100% 100% 75%<br />
5.4.7.10 O UCANL RP Education & Outreach Operations Apr-09 Mar-10<br />
P UCANL<br />
EOT.RPOE 5.4.8 O Education &Outreach Aug-10 Mar-11 0.78 FTE for Extension<br />
Year<br />
EOT.RPOE.<br />
50% 25%<br />
5.4.8.1 O NCSA RP Education & Outreach Operations Aug-10 Mar-11<br />
NCSA<br />
EOT.RPOE.I<br />
50% 25%<br />
5.4.8.2 O IU RP Education & Outreach Operations Aug-10 Mar-11<br />
U<br />
EOT.RPOE.<br />
50% 25%<br />
5.4.8.3 O LONI RP Education & Outreach Operations Aug-10 Mar-11<br />
LONI<br />
EOT.RPOE.<br />
Ferguson [50%], TBD 50% 25%<br />
5.4.8.4 O NICS RP Education & Outreach Operations Aug-10 Mar-11<br />
NICS<br />
[100%] TBD [50%]<br />
Budden[5%],<br />
50% 25%<br />
Flaus[15%],<br />
Graham[15%],<br />
Hanna[5%], Light[5%],<br />
McGinnis[25%],<br />
EOT.RPOE.<br />
Nystrom[10%], Brown,<br />
5.4.8.6 O PSC RP Education & Outreach Operations Aug-10 Mar-11<br />
PSC<br />
S[2%], Cubbison[10%],<br />
Hackworth[15%],<br />
Maiden[10%],<br />
Domaracki[50%],<br />
Czech[50%],<br />
Ishwad[25%],<br />
EOT.RPOE.<br />
50% 25%<br />
5.4.8.7 O PU RP Education & Outreach Operations Aug-10 Mar-11<br />
PU<br />
EOT.RPOE.<br />
Baxter [37%], Mason 50% 25%<br />
5.4.8.8 O SDSC RP Education & Outreach Operations Aug-10 Mar-11<br />
SDSC<br />
[20%]<br />
EOT.RPOE.<br />
Armosky [25%] 50% 25%<br />
5.4.8.9 O TACC RP Education & Outreach Aug-10 Mar-11<br />
TACC<br />
EOT.RPOE.<br />
50% 25%<br />
5.4.8.10 O UCANL RP Education & Outreach Operations Aug-10 Mar-11<br />
UCANL<br />
EOT.OCCP 5.4.9 O Outreach: Campus Champions Program Aug-10 Mar-11 1.95 FTE<br />
EOT 5.5 Objective 5.5 Projects: Evaluation<br />
Abbott - external<br />
To SI, RPs,<br />
EOT.EVAL 5.5.2 PY5 Evaluation<br />
Aug-09 Jul-10 evaluator on consulting<br />
NSF<br />
agreement<br />
All RPs collect data at all events throughout<br />
all RPs 100% 100% 100% 75%<br />
EOT.EVAL 5.5.2.1 O<br />
Aug-09 Jul-10<br />
year and share with external evaluator<br />
Distribute surveys to people attending events<br />
Abbott 100% 100% 100% 10%<br />
EOT.EVAL 5.5.2.2 M<br />
Feb-10 Jul-10<br />
held 6 months ago<br />
EOT.EVAL 5.5.2.2 M Quarterly summary of data Dec-09 Jul-10 Abbott 100% 100% 75% 50%<br />
EOT.EVAL 5.5.2.3 M Mid-year analysis of data Jan-10 Feb-10 Abbott Completed Completed Completed Completed<br />
EOT.EVAL 5.5.2.4 M Full analysis report annually Jul-10 Jul-10 Abbott 100% 100%<br />
EOT.EVAL.<br />
5.5.3 PY5 - Evaluations offered by RPs Aug-09 Jul-10<br />
RP<br />
EOT.EVAL.<br />
100% 100% 100% 75%<br />
5.5.3.1 O Collect data throughout year Aug-09 Jul-10<br />
RP IU<br />
UC Elizabeth<br />
EOT.ER 5.6 P Objective 5.6 External Relations<br />
Leake[100%]<br />
PY5 - working group meetings; info sharing<br />
100% 100% 100% 75%<br />
EOT.ER 5.6.2 O<br />
Aug-09 Jul-10<br />
among RP ER staff<br />
PY5 - Quarterly <strong>Report</strong>ing (9/30, 12/31, 3/31,<br />
100% 100% 100% 50%<br />
EOT.ER 5.6.4 O<br />
Aug-09 Jul-10<br />
6/30)<br />
EOT.ER.RP 5.6.5 O Aggregate RP External Relations Operations Aug-09 Mar-10<br />
EOT.ER.RP.<br />
Barker [40%], Bell 100% 100% 100% 75%<br />
5.6.5.1 O NCSA RP External Relations Operations Aug-09 Mar-10<br />
NCSA<br />
[25%]<br />
EOT.ER.RP.<br />
100% 100% 100% 75%<br />
5.6.5.2 O IU RP External Relations Operations Apr-09 Mar-10<br />
IU<br />
EOT.ER.RP.<br />
100% 100% 100% 75%<br />
5.6.5.3 O LONI RP External Relations Operations Apr-09 Mar-10<br />
LONI<br />
EOT.ER.RP.<br />
100% 100% 100% 75%<br />
5.6.5.4 O NICS RP External Relations Operations Apr-09 Mar-10<br />
NICS<br />
EOT.ER.RP.<br />
5.6.5.5 O ORNL RP External Relations Operations Apr-09 Mar-10<br />
ORNL<br />
EOT.ER.RP.<br />
Schneider[50%], 100% 100% 100% 75%<br />
5.6.5.6 O PSC RP External Relations Operations Aug-09 Mar-10<br />
PSC<br />
Williams[50%]<br />
EOT.ER.RP.<br />
100% 100% 100% 75%<br />
5.6.5.7 O PU RP External Relations Operations Apr-09 Mar-10<br />
PU<br />
EOT.ER.RP.<br />
Zverina [50%] 100% 100% 100% 75%<br />
5.6.5.8 O SDSC RP External Relations Operations Aug-09 Mar-10<br />
SDSC<br />
EOT.ER.RP.<br />
Singer [25%] 100% 100% 100% 75%<br />
5.6.5.9 O TACC RP External Relations Apr-09 Mar-10<br />
TACC<br />
EOT.ER.RP.<br />
100% 100% 100% 75%<br />
5.6.5.10 O UC/ANL RP External Relations Operations Apr-09 Mar-10<br />
UCANL<br />
Communica<br />
Leake [10%], ER team 100% 100% 90% 25%<br />
EOT.ER 5.6.7 M TG10<br />
tions and all RPs Aug-09 Jul-10 members<br />
PR<br />
Booth<br />
EOT.ER 5.6.9 M SC10<br />
ADs<br />
Coordinated by Leake 100% 75% 50%<br />
May-10 Dec-10<br />
planning<br />
[10%] ER team<br />
Identify new venues for disseminating<br />
Leake [10%] 100% 100% 100% 75%<br />
EOT.ER 5.6.10 P<br />
information about TeraGrid among science<br />
Aug-09 Jul-10<br />
and engineering communities to attract new<br />
users to TeraGrid resources and services<br />
Work with ER team, AUS, ADs, and TG<br />
Leake 100% 100% 100% 75%<br />
EOT.ER 5.6.10.1 P<br />
Aug-09 Jul-10<br />
Forum to identify venues<br />
EOT.RPERC 5.6.5 O Aggregate RP External Relations Operations Aug-10 Mar-11 0.75 FTE for Extension<br />
EOT.RPERC<br />
Barker [40%], Bell 50% 25%<br />
5.6.5.1 O NCSA RP External Relations Operations Aug-10 Mar-11<br />
NCSA<br />
[25%]<br />
EOT.RPERC<br />
50% 25%<br />
5.6.5.2 O IU RP External Relations Operations Aug-10 Mar-11<br />
IU<br />
EOT.RPERC<br />
50% 25%<br />
5.6.5.3 O LONI RP External Relations Operations Aug-10 Mar-11<br />
LONI<br />
EOT.RPERC<br />
50% 25%<br />
5.6.5.4 O NICS RP External Relations Operations Aug-10 Mar-11<br />
NICS<br />
EOT.RPERC<br />
50% 25%<br />
5.6.5.5 O ORNL RP External Relations Operations Aug-10 Mar-11<br />
ORNL<br />
EOT.RPERC<br />
Schneider[50%], 50% 25%<br />
5.6.5.6 O PSC RP External Relations Operations Aug-10 Mar-11<br />
PSC<br />
Williams[50%]<br />
EOT.RPERC<br />
50% 25%<br />
5.6.5.7 O PU RP External Relations Operations Aug-10 Mar-11<br />
PU<br />
EOT.RPERC<br />
Zverina [50%] 50% 25%<br />
5.6.5.8 O SDSC RP External Relations Operations Aug-10 Mar-11<br />
SDSC<br />
EOT.RPERC<br />
Singer [25%] 50% 25%<br />
5.6.5.9 O TACC RP External Relations Aug-10 Mar-11<br />
TACC<br />
EOT.RPERC<br />
50% 25%<br />
5.6.5.10 O UC/ANL RP External Relations Operations Aug-10 Mar-11<br />
UCANL<br />
Write stories about TeraGrid (non RP-specific)<br />
Leake [10%], ER team 50% 25%<br />
EOT.ERCom<br />
5.6.7 M<br />
resources and services. Customize existing<br />
Aug-10 Mar-11<br />
members, Grad<br />
m<br />
stories for NSF-OCI, OLPA, and other<br />
Student, 0.87 FTE for<br />
purposes<br />
Extension<br />
Graphic Design assistance with the facilitation<br />
50% 25%<br />
EOT.ERCom<br />
5.6.9 M and updating of TeraGrid’s web<br />
Aug-10 Mar-11<br />
m<br />
content—specifically images and graphic files<br />
Graduate assistant to help with research, list<br />
0% 0%<br />
EOT.ERCom<br />
5.6.10 P maintenance, and ER XD transition<br />
Aug-10 Mar-11<br />
m<br />
documentation<br />
Internal relations activities. International<br />
EOT.ERCom<br />
50% 25%<br />
5.6.10.1 P collaboration. Training in support of<br />
Aug-10 Mar-11<br />
m<br />
international communication<br />
EOT.SW 5.7 | Science Writing (stories written and distributed) | Aug-09 | Jul-10<br />
EOT.SW 5.7.3 O | Science Writing PY5 | Aug-09 | Jul-10 | PSC Michael Schneider [20%], Shandra Williams [20%] | 100% | 100% | 100% | 75% | 50%<br />
EOT.SW 5.7.3.1 O | Review and select science stories to highlight, write and disseminate throughout year | RPs, GIG | Aug-09 | Jul-10 | ER team; coordinated by Leake | 100% | 100% | 100% | 75% | 50%<br />
EOT.SW 5.7.3.2 O | Select and write stories each quarter | RPs, GIG | Aug-09 | Jul-10 | ER team | 100% | 100% | 100% | 75% | 50%<br />
EOT.SW 5.7.3.3 O | Publish stories quarterly on web; disseminate through media (iSGTW, HPCWire, etc.) | GIG, UFP | Aug-09 | Jul-10 | ER team | 100% | 100% | 100% | 75% | 50%<br />
EOT.SW 5.7.4 O | Science Writing Extension | Aug-10 | Jul-11 | PSC Michael Schneider [20%], Shandra Williams [20%]<br />
EOT.SW 5.7.4.1 O | Review and select science stories to highlight, write and disseminate throughout year | RPs, GIG | Aug-10 | Jul-11 | ER team; coordinated by Leake | 50% | 25%<br />
EOT.SW 5.7.4.2 O | Select and write stories each quarter | RPs, GIG | Aug-10 | Jul-11 | ER team | 50% | 25%<br />
EOT.SW 5.7.4.3 O | Publish stories quarterly on web; disseminate through media (iSGTW, HPCWire, etc.) | GIG, UFP | Aug-10 | Jul-11 | ER team | 50% | 25%<br />
EOT.SW 5.7.4.4 O | XD Transition placeholder | GIG, UFP | ER team | 0% | 0% | Scheduled to start in January or when the XD winner is identified<br />
EOT.CSPW 5.8 | Computational Science Problem of the Week (EOT CSPW) | Aug-10 | Jul-11<br />
EOT.CSPW 5.8.1 O | Computational Science Problem of the Week (EOT.CSPW) | Aug-10 | Jul-11 | 0.68 FTE | 50% | 25%<br />
EOT END
EU-US HPC Summer School <strong>Report</strong><br />
December 24, 2010<br />
The first European-US Summer School on HPC Challenges in Computational Sciences, organized jointly by Europe’s DEISA project and the US/NSF TeraGrid project, was held at the Santa Tecla Palace in Acireale, Sicily, from October 3-7, 2010.<br />
The attendees were asked about their goals for attending and the extent to which the summer school met those goals. Overwhelmingly, the attendees said that the summer school met their goals; quite a few indicated that it fully met their goals, and a couple said it exceeded their expectations. Two people identified topics that could have been covered in more depth, including fault tolerance and numerical algorithms.<br />
When asked for their overall assessment of how well the summer school met their goals,<br />
the attendees responded as follows:<br />
• Excellent – 16 - 33.3%<br />
• Very good – 25 - 52.1%<br />
• Good – 6 - 12.5%<br />
• Fair – 1 - 2.1%<br />
• Poor – 0 - 0%
Among the participants were sixty graduate students or post-docs, selected from more<br />
than 100 applications: 24 from US and 35 from EU universities and research institutions.<br />
Students came from a variety of disciplines, among them astronomy, atmospheric<br />
sciences, chemistry, computer science, engineering, mathematics and physics.<br />
Approximately 20% of the attendees were female students. The list of attendees is<br />
included in Appendix B.<br />
Twenty-five high-level speakers addressed major fields of computational science, with<br />
nine speakers from the US and sixteen from Europe. Areas covered included Challenges<br />
by Scientific Disciplines, Programming, Performance Analysis & Profiling, Algorithmic<br />
Approaches & Libraries, and Data Intensive Computing and Visualization.<br />
“The summer school HPC Challenges in Computational Sciences was an excellent<br />
opportunity for me to get an overview of the complete spectrum of High Performance<br />
Computing and Computational Science. Attending talks from the large number of<br />
distinguished speakers from almost every domain of computational science and High<br />
performance computing has enabled me to get a very clear idea of the current trends in<br />
the area”, one of the students stated afterwards.<br />
The summer school was planned from the outset as a joint DEISA-TeraGrid effort. “Our primary<br />
objective for the student experience was to advance computational sciences by enabling<br />
and stimulating future international collaboration, innovation, and discovery through the<br />
most effective use of HPC,” said Hermann Lederer from the DEISA coordination team at<br />
RZG, Garching. “We hope to continue with such events every year— alternating<br />
between EU and US destinations,” said TeraGrid Forum Chair John Towns, National<br />
Center for Supercomputing Applications. “The overwhelmingly positive feedback of the<br />
students, in which 85% rated the event as very good or excellent, can be taken as a<br />
mandate”, he added.<br />
The agenda for the event is included as Appendix A. The presentation slides are<br />
available at: http://www.deisa.eu/Summer-School/talks<br />
A survey was conducted among the attendees and presenters to capture feedback, assess the impact of the event, and help plan future events. A total of 50 responses were provided from among the 60 participants, and 13 responses were collected from the 35 presenters.<br />
The make-up of the attendees responding to the survey included:<br />
• Postdoc – 6 - 12.2%<br />
• Graduate – 36 - 73.5%<br />
• HPC Center staff – 4 - 8.4%<br />
• Other – 3 - 6.1%<br />
The make-up of the presenters (and staff) responding to the survey included:
• Faculty – 3 - 37.5%<br />
• Researcher – 2 - 25%<br />
• HPC Center staff – 2 - 25%<br />
• Other – 1 – 12.5%<br />
The attendees were from among the following fields:<br />
• Physics – 14<br />
• Computer science – 10<br />
• Chemistry – 7<br />
• Engineering – 6<br />
• Astronomy – 2<br />
• Other – atmospheric science, biology, mathematics<br />
The survey respondents were asked to indicate what they found to be most useful from<br />
the summer school experience. Many of the responses indicated that they felt very<br />
positive about the broad range of topics covered, including scientific and technological areas related to HPC. The attendees felt they learned a great deal about<br />
HPC tools, resources, and the directions that others in the community were taking that<br />
would help to guide their own endeavors. Many of the attendees enjoyed learning about<br />
the challenges addressed in a diverse array of disciplines, which in turn helped them to<br />
understand that they faced similar challenges and similar opportunities for advancing their<br />
own research. A number of attendees also mentioned that it was beneficial to learn about<br />
career opportunities.<br />
The survey respondents were asked to indicate what they found to be least useful from<br />
the summer school experience. A number of attendees mentioned that they would like to<br />
have heard less about the detailed scientific results and the theory behind the science,<br />
with a preference for more information about the computational details and<br />
computational methodologies that were applied to advance the research. A number of<br />
attendees would have preferred more hands-on aspects during the summer school,<br />
although some said that hands-on work would only have reduced the time available to cover the breadth of topics, which the vast majority of survey respondents considered very good. Some respondents mentioned that they<br />
would like to have heard more about algorithms. Many attendees responded very negatively to the Internet access situation: on-site staff had to work with the hotel to rectify the lack of access and the charges for using it. There were a few negative comments<br />
about offering parallel sessions from attendees who wanted to attend the competing sessions; many others, however, appreciated the parallel tracks because they were not interested in both topics, and eliminating them would have made the summer school much longer.<br />
A couple people suggested that the presentations could be shortened.<br />
When asked to provide advice for future events, many respondents said that all of the talks should address HPC challenges and general computational problems and solutions, and focus less on specific research work and specific formulas. Some<br />
respondents would like to hear more about GPU programming and to see more coverage of other fields of science, including chemistry and biology. It was suggested that young<br />
researchers should talk about their research and seek advice from the group, including the<br />
more senior presenters, to guide them in their research. It was mentioned that slides from<br />
all presenters should be made available as soon after the presentations as possible. The<br />
attendees suggested that the use of Google Docs would help to address this. A number<br />
of people again repeated the request for hands-on sessions. A few people suggested that<br />
the summer school could be between half a day and a full day longer to cover more topics.<br />
The survey respondents recommended scheduling the summer school so as not to compete with classes, and holding it closer to a major airport to eliminate the travel hassles of getting to the meeting site. The attendees liked the BOF sessions during lunch, and<br />
suggested that they start earlier in the week to allow people to meet and share information<br />
on topics of common interest. The attendees would like to have received a list of all<br />
participants and presenters, and their fields of research to help people make connections<br />
during the summer school.<br />
When asked how the summer school would impact their own work, most of the<br />
respondents said the summer school would not change their research plans, but many of<br />
them planned to apply the tools they learned about to enhance their research. Many of<br />
the attendees plan to apply the code profiling, tuning, and optimization techniques on<br />
their own codes, plan to explore the use of numerical libraries, and utilize the<br />
visualization tools they learned about. Some of the attendees plan to explore some of the<br />
new frameworks they learned about (e.g. UPC, StarSs), and to share what they learned<br />
with their colleagues.<br />
Overall, nearly all of the respondents felt that the event was well <strong>org</strong>anized, that the<br />
location was wonderful (except for transportation issues to and from the airport), and that<br />
the accommodations were quite good (except for early Internet problems). Some people<br />
felt that being near a major airport and near a city to offer more off-hours opportunities<br />
would be useful. One person mentioned that the hotel did not provide ample options for<br />
vegetarians.<br />
A major emphasis for the event was to facilitate interactions among the attendees and the<br />
presenters, and to allow ample time for people to share ideas, explore challenges and<br />
opportunities, and to establish new contacts and collaborations. The vast majority of attendees felt that the summer school did allow plenty of time for people to interact and talk. A number of people mentioned that they now have new colleagues with whom to share ideas. A few<br />
indicated that new collaborations may result from the event. A few people indicated that<br />
there were not enough others in the same field of study to build new connections, but that<br />
the event overall was still very useful to their learning. A few people said more could be<br />
done to encourage presenters to spread out among the attendees, and not sit together<br />
during meals.
In summary, the attendees and presenters strongly recommended that further summer<br />
schools be conducted in the US and Europe.
Appendix A - Summer School Agenda<br />
Mon Oct 4, 2010:<br />
9:00 – 12:00 - HPC Challenges and Technology<br />
• HPC Challenges, Barry Schneider, NSF, US<br />
• DEISA overview, Hermann Lederer, RZG, Germany<br />
• TeraGrid / XD overview, John Towns, NCSA, US<br />
• PRACE overview, Giovanni Erbacci, CINECA, Italy<br />
13:30 – 15:30 - Challenges by Scientific Disciplines I<br />
• Parallel track 1: Materials & Life Sciences<br />
Nanotechnology: Thomas Schulthess, ETHZ, Zurich, & CSCS, Switzerland<br />
• Parallel track 2: QCD<br />
Quantum Chromodynamics: Richard Brower, Boston University, US<br />
15:50 – 16:50 - Challenges by Scientific Disciplines II<br />
• Parallel track 1: Materials & Life Sciences<br />
Molecular Dynamics simulation packages I<br />
Gromacs: David van der Spoel, Univ. Uppsala, Sweden<br />
• Parallel track 2: Plasma Physics<br />
Fusion energy research: Frank Jenko, MPI for Plasma Physics, Garching,<br />
Germany<br />
Tue Oct 5, 2010:<br />
9:00 – 12:00 - Programming<br />
• Overview on Mixed MPI/OpenMP Programming, UPC, CAF, StarSs Model<br />
Giovanni Erbacci, CINECA, Bologna, Italy<br />
David Henty (Lecture 1, Lecture 2, Lecture 3,Lecture 4), EPCC, University of<br />
Edinburgh, UK<br />
Josep M. Perez, Barcelona Supercomputer Center, Spain<br />
13:30 – 15:20 - Challenges by Scientific Disciplines III<br />
• Parallel track 1: Astro Sciences<br />
Astro sciences, Cosmology: Mike Norman, UCSD, San Diego, US<br />
• Parallel track 2: Materials & Life Sciences<br />
Molecular Dynamics simulation packages II<br />
Amber: Thomas Cheatham, Univ. Utah, US<br />
CP2K: Marcella Iannuzzi, Univ. Zurich, Switzerland<br />
15:40 – 16:35 - Challenges by Scientific Disciplines IV<br />
• Parallel track 1: Climate Research<br />
Climate Research: Patrick Joeckel, DLR, Germany<br />
• Parallel track 2: Materials & Life Sciences<br />
Molecular Dynamics simulation packages III
QuantumEspresso: Paolo Giannozzi, Univ. Udine, Italy<br />
16:35 – 17:30 - Algorithmic Approaches & Libraries I<br />
Advanced algorithms for long-range interactions: Axel Arnold, University of<br />
Stuttgart, Germany<br />
Wed Oct 6, 2010:<br />
9:00 – 12:00 - Performance Analysis & Profiling<br />
• Points package, Scalasca, etc.<br />
Philip Blood, PSC, Pittsburgh, US<br />
Bernd Mohr, FZJ, Juelich, Germany<br />
13:30 – 17:00 - Algorithmic Approaches & Libraries II<br />
• Particle-in-Cell methods in plasma physics: Roman Hatzky, IPP, Garching,<br />
Germany<br />
• Software environment for efficient flow simulations: Hans-Joachim Bungartz, TUM,<br />
Munich, Germany<br />
• Numerical Libraries: Tony Drummond, LBNL, Berkeley, US<br />
Thu Oct 7, 2010:<br />
9:00 – 12:00 - Data Intensive Computing and Visualization<br />
• Data intensive computing: John R Johnson, PNL, US<br />
• Visualization<br />
Sean Ahern, ORNL, Knoxville, US<br />
Uwe Woessner, HLRS, Stuttgart, Germany
Appendix B – Attendees, Staff and Presenters<br />
Attendees<br />
First Name Last Name Institute Country<br />
Xavier Abellan Ecija Barcelona Supercomputing Center Spain<br />
Shannen Adcock University of Arkansas USA<br />
Fabio Affinito CINECA Italy<br />
Sean Ahern University of Tennessee USA<br />
Axel Arnold University of Stuttgart, ICP Germany<br />
Thomas Auckenthaler TU München Germany<br />
Mirco Bazzani Physics Department, Parma University Italy<br />
Philip Blood Pittsburgh Supercomputing Center, Carnegie Mellon University USA<br />
Steven Böing Delft University of Technology Netherlands<br />
Jonathan<br />
Richard Brower Boston University USA<br />
Hans-Joachim Bungartz TUM, Department of Informatics Germany<br />
Benjamin Byington University of California, Santa Cruz USA<br />
Ramon Calderer University of Illinois at Urbana-Champaign USA<br />
Thomas Cheatham University of Utah USA<br />
Galen Collier Clemson University USA<br />
Benjamin Cruz Perez University of Puerto Rico at Mayaguez USA, Puerto Rico<br />
Tony Drummond Lawrence Berkeley National Laboratory USA<br />
Philipp Edelmann Max-Planck-Institute for Astrophysics Germany<br />
Giovanni Erbacci CINECA Italy<br />
Johannes Feist ITAMP, Harvard-Smithsonian Center for Astrophysics USA<br />
James Ferguson National Institute for Computational Sciences USA<br />
Daniel Fletcher University of Warwick (EPSRC) UK<br />
Wolfgang Gentzsch Max-Planck-Institute Germany<br />
Paolo Giannozzi University of Udine and IOM-Democritos Italy<br />
Andreas Goetz San Diego Supercomputer Center USA<br />
Luca Graziani Max-Planck-Institute for Astrophysics, Garching Germany<br />
Thomas Guillet CEA/DSM France<br />
Roman Hatzky IPP Germany<br />
Andreas Hauser Genzentrum, LMU Germany
Tobias Heidig MPI Magdeburg Germany<br />
David Henty EPCC UK<br />
Matthew Hogan University of Massachusetts USA<br />
Lorenz Hüdepohl Max-Planck-Institut für Astrophysik Germany<br />
Marcella Iannuzzi University of Zurich Switzerland<br />
Sebastian Illi University of Stuttgart, IAG Germany<br />
Sam Jacobs University of Antwerp Belgium<br />
Frank Jenko IPP Germany<br />
Patrick Jöckel DLR Institute for Atmospheric Physics Germany<br />
John Johnson Pacific Northwest National Laboratory USA<br />
Alan Kelly University of Glasgow UK<br />
Hans-Joachim Klingshirn MPI für Plasmaphysik Germany<br />
Elizabeth Leake University of Chicago USA<br />
Hermann Lederer Garching Computer Centre, Max-Planck-Society Germany<br />
Angelo Limone Max-Planck-Institut für Plasmaphysik Germany<br />
Jeremy Logan University of Maine USA<br />
Katie Maerzke Vanderbilt University USA<br />
Chris Malone Stony Brook University USA<br />
Bernd Mohr Jülich Supercomputing Centre Germany<br />
Giuseppa Muscianisi University of Messina Italy<br />
Sreeja Nag Massachusetts Institute of Technology USA<br />
Stefan Nagele Vienna University of Technology Austria<br />
Terrence Neumann Marquette University USA<br />
Michael Norman SDSC, UCSD USA<br />
Szilárd Páll Center For Biomembrane Research, Stockholm Sweden<br />
Benjamin Payne Missouri Science and Technology USA<br />
Renate Pazourek Vienna University of Technology Austria<br />
Christian Pelties Department of Earth & Environmental Sciences, LMU Munich Germany<br />
Josep M. Perez Barcelona Supercomputing Center Spain<br />
Marcelo Ponce Rochester Institute of Technology USA<br />
Johannes Reetz Garching Computer Centre, Max-Planck-Society Germany<br />
Martin Roderus Technische Universität München Germany<br />
Dominic Roehm Institute for Computational Physics Germany<br />
Anastasia Romanova Max-Planck-Institute for Mathematics in the Science Germany<br />
Romelia Salomon-Ferrer San Diego Supercomputing Center USA<br />
Annika Schiller Jülich Supercomputing Centre Germany<br />
Barry Schneider National Science Foundation USA<br />
Mandes Schönherr Institute of Physical Chemistry Switzerland<br />
Thomas Schulthess ETHZ Switzerland<br />
Jacob Searcy University of Oregon USA<br />
Daniele Selli Technische Universität Dresden Germany<br />
Cengiz Sen University of Tennessee USA<br />
Andrey Sharapov University of Pittsburgh USA<br />
Pablo Souza University of Michigan USA<br />
Filippo Spiga IBM and University of Milano-Bicocca US and Italy<br />
Julien Thibault University of Utah USA<br />
Daniel Told MPI für Plasmaphysik Germany<br />
John Towns NCSA/University of Illinois USA<br />
Volodymyr Turchenko University of Calabria Italy<br />
Ikram Ullah Royal Institute of Technology (KTH) Sweden<br />
David van der Spoel Department of Cell & Molecular Biology, Uppsala University Sweden<br />
Xingyu Wang New York University USA<br />
Susann Wangnett Garching Computer Centre, Max-Planck-Society Germany<br />
Stefan Wenk Munich University, Geophysics Germany<br />
Katelyn White University of California, Santa Cruz USA<br />
Daniel Wilde EPCC Scotland<br />
Uwe Woessner HLRS Germany<br />
Guoxing Xia Max-Planck-Institute for Physics Germany<br />
Anastasia Yanchilina Lamont Doherty Earth Observatory, Columbia University USA<br />
Haseeb Zia Technische Universität München Germany<br />
Staff and Presenters<br />
First Name Last Name Institute Country<br />
Sean Ahern University of Tennessee USA<br />
Axel Arnold University of Stuttgart, ICP Germany<br />
Philip Blood Pittsburgh Supercomputing Center, Carnegie Mellon University USA<br />
Richard Brower Boston University USA<br />
Hans-Joachim Bungartz TUM, Department of Informatics Germany<br />
Thomas Cheatham University of Utah USA<br />
Tony Drummond Lawrence Berkeley National Laboratory USA<br />
Giovanni Erbacci CINECA Italy<br />
James Ferguson National Institute for Computational Sciences USA<br />
Paolo Giannozzi University of Udine and IOM-Democritos Italy<br />
Roman Hatzky IPP Germany<br />
David Henty EPCC UK<br />
Marcella Iannuzzi University of Zurich Switzerland<br />
Frank Jenko IPP Germany<br />
Patrick Jöckel DLR Institute for Atmospheric Physics Germany<br />
John Johnson Pacific Northwest National Laboratory USA<br />
Elizabeth Leake University of Chicago USA<br />
Hermann Lederer Garching Computer Centre, Max-Planck-Society Germany<br />
Bernd Mohr Jülich Supercomputing Centre Germany<br />
Michael Norman SDSC, UCSD USA<br />
Josep M. Perez Barcelona Supercomputing Center Spain<br />
Barry Schneider National Science Foundation USA<br />
Thomas Schulthess ETHZ Switzerland<br />
John Towns NCSA/University of Illinois USA<br />
David van der Spoel Department of Cell & Molecular Biology, Uppsala University Sweden<br />
Uwe Woessner HLRS Germany