
Poster Abstracts | Group 1 – SHRA

Purdue University, SCEC researchers and computer scientists were able to create probabilistic seismic hazard maps at very high resolution using Open Seismic Hazard Analysis (OpenSHA). OpenSHA is an open-source seismic hazard framework developed as a joint project between the Southern California Earthquake Center and the United States Geological Survey. While OpenSHA currently implements Earthquake Rupture Forecasts and attenuation relationships only for California, we were able to demonstrate that OpenSHA can scale up computationally to the level required to produce global hazard maps. We demonstrated this by discretizing California at a very fine level, resulting in over 1.6 million hazard curves, and calculating a very high-resolution PSHA hazard map for California. The number of curves in our very high-resolution California PSHA map is comparable to the number of curves required to produce a global hazard map at 10 km spacing. Demonstrating this type of scalability is an essential step toward the Global Earthquake Model (GEM) program's goal of generating uniform, worldwide seismic hazard maps.
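As a rough plausibility check on that comparison (not part of the abstract), a 10 km land-only global grid implies a curve count of the same order as the 1.6 million California curves; the Earth surface area and land fraction used below are approximate reference values assumed for illustration.

    # Back-of-the-envelope count of hazard-curve sites for a global land grid
    # at 10 km spacing (illustrative values, not taken from the abstract).
    EARTH_SURFACE_KM2 = 5.1e8    # roughly 510 million km^2
    LAND_FRACTION = 0.29         # roughly 29% of the surface is land
    GRID_SPACING_KM = 10.0

    cell_area_km2 = GRID_SPACING_KM ** 2
    land_sites = EARTH_SURFACE_KM2 * LAND_FRACTION / cell_area_km2
    print(f"~{land_sites / 1e6:.1f} million land sites at 10 km spacing")
    # prints ~1.5 million, the same order as the 1.6 million California curves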

Generating all 1.6 million curves took almost 18,000 CPU hours, which we completed in 51 wall-clock hours on 400 CPU cores (including overhead). The curves were calculated using the Working Group on California Earthquake Probabilities' 2007 Uniform California Earthquake Rupture Forecast (UCERF), an extremely computationally complex forecast that requires much more computation time than forecasts for other parts of the world. This suggests that a global run could be computed in less CPU time using simpler models. One major achievement was making the hazard map creation workflow as portable and automatic as possible: it is just as easy to run the workflow on the University of Southern California's HPC cluster as it is on many of the clusters that are part of the National Science Foundation's TeraGrid infrastructure. The 1.6-million-curve workflow was run on the National Center for Supercomputing Applications (NCSA) Abe cluster.
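The run statistics quoted above imply roughly the following scaling behavior; the parallel-efficiency figure is derived here for illustration and is not reported in the abstract.

    # Back-of-the-envelope scaling check from the quoted run statistics.
    cpu_hours = 18_000        # "almost 18,000 CPU hours" for all 1.6 million curves
    cores = 400               # CPU cores used for the run
    wall_clock_hours = 51     # observed wall-clock time, including overhead

    ideal_hours = cpu_hours / cores               # 45 h with perfect scaling
    efficiency = ideal_hours / wall_clock_hours   # about 0.88
    print(f"ideal {ideal_hours:.0f} h, observed {wall_clock_hours} h, "
          f"parallel efficiency ~{efficiency:.0%}")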

1-036<br />

SEISMIC LOSS ESTIMATION BASED ON END-TO-END SIMULATION
Muto M, Krishnan S, Beck JL, and Mitrani-Reiser J

Recently, there has been increasing interest in simulating all aspects of the seismic risk problem, from the source mechanism to the propagation of seismic waves to nonlinear time-history analysis of structural response and, finally, to building damage and repair costs. This study presents a framework for performing truly “end-to-end” simulation. A recent region-wide study of tall steel-frame building response to an M7.9 scenario earthquake on the southern portion of the San Andreas Fault is extended to consider economic losses. In that study, a source mechanism model and a velocity model, in conjunction with a finite-element model of Southern California, were used to calculate ground motions at 636 sites throughout the San Fernando and Los Angeles basins. At each site, time-history analyses of a nonlinear deteriorating structural model of an 18-story steel moment-resisting frame building were performed, using both a pre-Northridge earthquake design (with welds at the moment-resisting connections that are susceptible to fracture) and a modern code (UBC 1997) design.
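The numbers above imply the following count of nonlinear building analyses; the total is derived here and is not stated explicitly in the abstract.

    # Scale of the structural-analysis stage implied by the study description.
    sites = 636                  # ground-motion sites in the two basins
    designs = 2                  # pre-Northridge design and modern UBC 1997 design
    analyses = sites * designs   # nonlinear time-history runs of the 18-story model
    print(analyses)              # 1272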

This work uses the simulation results to estimate losses by applying the MDLA (Matlab Damage and Loss Analysis) toolbox, developed to implement the PEER loss-estimation methodology. The toolbox includes damage prediction and repair cost estimation for structural and non-structural components and allows for the computation of the mean and variance of building repair costs conditional on engineering demand parameters (i.e., inter-story drift ratios and peak floor accelerations).
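For readers unfamiliar with the PEER methodology, the sketch below illustrates the conditional-loss step in schematic form for a single component. The lognormal fragility parameters and repair costs are invented placeholders, not values taken from the MDLA toolbox or from this study.

    # Minimal sketch of a PEER-style expected repair cost conditional on an
    # engineering demand parameter (here, peak inter-story drift ratio).
    # All numbers below are hypothetical and for illustration only.
    from math import log, erf, sqrt

    def lognormal_cdf(x, median, beta):
        # Probability that a lognormal damage-state capacity is exceeded at demand x.
        return 0.5 * (1.0 + erf(log(x / median) / (beta * sqrt(2.0))))

    # Placeholder fragilities and mean repair costs for three damage states.
    damage_states = [
        {"median_drift": 0.01, "beta": 0.4, "repair_cost": 5_000},
        {"median_drift": 0.03, "beta": 0.4, "repair_cost": 25_000},
        {"median_drift": 0.06, "beta": 0.4, "repair_cost": 90_000},
    ]

    def expected_repair_cost(drift):
        # E[cost | EDP]: weight each damage state by its probability of occurrence,
        # P(DS = i | EDP) = P(exceed state i) - P(exceed state i + 1).
        p_exceed = [lognormal_cdf(drift, ds["median_drift"], ds["beta"])
                    for ds in damage_states]
        p_exceed.append(0.0)
        return sum((p_exceed[i] - p_exceed[i + 1]) * ds["repair_cost"]
                   for i, ds in enumerate(damage_states))

    print(expected_repair_cost(0.02))  # expected cost at 2% inter-story drift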

Here, the toolbox is modified to treat steel-frame high-rises, including aspects such as mechanical, electrical, and plumbing systems, traction elevators, and the possibility of irreparable structural damage. Contour plots of conditional mean losses are generated for the San Fernando and the Los Angeles basins for the pre-Northridge and modern

