
Cosmological Simulations: Modeling the Universe

Salman Habib
Los Alamos National Laboratory
SF09, July 10, 2009

Visualization: Pat McCormick, CCS-1, LANL
[Title images: SDSS; First Light in 1998; Deep Lens Survey/LSST; survey timeline 1998-2015]


Standard Model of Cosmology and Beyond

• Good idea of the history of the Universe
• Good idea of the composition:
  ‣ ~73% a mysterious dark energy
  ‣ ~23% an unknown dark matter component
  ‣ ~4% baryons
• Constraints on ~20 cosmological parameters, including optical depth, spectral index, Hubble constant, ...
• Values are known to ~10%
• For comparison: the parameters of the Standard Model of Particle Physics are known to 0.1% accuracy!
• Where do we go from here?
  ‣ Precision for the sake of precision is not useful; we want more theory and less phenomenology
  ‣ Observations and theory must work together to understand the unknowns of the current Standard Model
  ‣ What is the dark energy? What is the dark matter? Does GR have to be modified? Where do the primordial fluctuations come from? How can inflation be tested? (More physics, fewer scenarios)

[Figure: structure formation snapshots at z~1000, z~30, z~2]


Progress in Cosmology I: CMB

• Cosmic microwave background measurements started the era of "precision cosmology"
• What made it "precision"?
  ‣ Physics "easy" to understand
  ‣ In its wavelength band, the CMB dominates the sky
• After CMB, what?
  ‣ More CMB (polarization), but much messier!
  ‣ Large scale structure probes -- but now must go beyond linear theory

[Figure: CMB missions through 2009+ (Planck)]


Progress in Cosmology II: LSS

• 1978: Discovery of voids and superclusters; theory of hierarchical structure formation via gravitational instability emerges
• 2006: SDSS has measured more than 1,000,000 galaxies, key discoveries such as BAO (Eisenstein et al.) cementing the current picture of structure formation

[Figures: void map, Gregory & Thompson 1978; CfA survey, de Lapparent, Geller & Huchra 1986, 1,100 galaxies; SDSS, ~1,000,000 galaxies, M. Blanton]


How is Higher Accuracy Useful? Example: Dark Energy

• What is the nature of dark energy?
  ‣ Cosmological constant (consistent with data, but terrible from a theoretical perspective)
  ‣ Scalar field (foundational motivation is fuzzy, but good toy models for a dynamical explanation)
  ‣ Phase transition of some sort?
  ‣ Or perhaps general relativity needs modification at large length scales?
• All alternatives less than compelling (1000+ papers! No one believes anyone else, including themselves. Anyway, how can any sane person test so many models?)
• Settle, therefore, for a more model-free approach: try to determine the dark energy equation of state w and its time variation (the cosmological constant has w = -1, dw/dt = 0); see the relation after this list
• Ask questions such as: can we exclude classes of dark energy models? Can we carry out consistency tests to show that GR works as well in cosmology as on sub-galactic scales? Rule out more wacky ideas --
• CMB provides very useful information, with good control of systematic errors. But to move forward we need structure formation probes: baryon acoustic oscillations (geometry), cluster counts (mass function), supernovae (geometry), weak lensing (mass distribution). These are promising, but intrinsically complex!
• Can even the minimal program be carried out? Observers are confident --
• Can theorists afford to be complacent? There's more to life than forecasts! (See Decadal Survey White Papers.) Plenty of examples where theory is stuck (foundations of turbulence, nonperturbative dynamics of field theories, --)
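For reference, the standard way w(z) enters the background expansion that the geometric probes above measure (flat universe, matter plus dark energy; a textbook relation, not from the slide):

$$\frac{H^2(z)}{H_0^2} = \Omega_m (1+z)^3 + (1-\Omega_m)\,\exp\!\left[3\int_0^z \frac{1+w(z')}{1+z'}\,dz'\right],$$

which reduces to $(1-\Omega_m)(1+z)^{3(1+w)}$ for constant w, and to a constant dark energy term for w = -1 (the cosmological constant).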



Future of Precision Cosmology

• The current cross-validated Standard Model works at better than 10%, so --
  ‣ It's unlikely that whatever comes next will totally overthrow what is currently known, for two reasons: (i) what we know is pretty solid, and (ii) mostly we are pushing observational boundaries, not so much creating new windows on the Universe (e.g., gravitational waves, neutrino telescopes)
  ‣ Signatures of new physics will be subtle, otherwise they would have been picked up already
  ‣ Can theoretical predictions be made accurately enough? Can systematic errors be controlled to the desired levels?
• How good is analytic theory?
  ‣ No natural expansion parameter; breaks down catastrophically in the nonlinear regime
  ‣ Cannot deal with the complexity of the problem
• Are simulations up to the challenge?
  ‣ Extreme dynamic range
  ‣ All relevant physics hard to include

[Figure: P(k) for linear theory, one-loop and two-loop perturbation theory vs. simulations; Carlson, White & Padmanabhan 2009; Heitmann et al. 2009]



What do Simulations do?

• At large scales, gravity dominates, so we need to solve the Einstein-Poisson equations in the non-relativistic limit -- the cosmological Vlasov-Poisson equation (a standard form is given below)
  ‣ Note the equation is (i) 6-D, and (ii) intrinsically nonlinear; cannot solve it directly as a PDE
  ‣ Use N-body methods -- sample the phase-space PDF with particles and evolve the system of interacting particles
  ‣ Can we understand and predict errors well enough?
• At small scales, need to add gas physics (hydrodynamics) and astrophysical feedback
  ‣ Simulations get harder --
  ‣ "Gastrophysics" too complicated (subgrid modeling needed)
  ‣ Need to calibrate simulations against observations. Is this doable?

[Figures: simulated sky vs. real sky (SDSS)]
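A standard form of the cosmological Vlasov-Poisson system referenced above (one common convention, with comoving position x, comoving momentum p = a² dx/dt per unit mass, and density contrast δ; added here for reference):

$$\frac{\partial f}{\partial t} + \frac{\mathbf{p}}{a^2}\cdot\nabla_x f - \nabla_x\phi\cdot\nabla_p f = 0, \qquad \nabla_x^2\phi = \frac{3}{2}\,\Omega_m H_0^2\,\frac{\delta(\mathbf{x},t)}{a}.$$

The 6-D phase-space distribution f(x, p, t) is what the N-body particles sample.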



Life with a Gravity-only Cosmology Code (Cartoon)

• Initial Conditions
  ‣ Simulation begins with an initial density field well in the linear regime. (i) Pick a primordial power spectrum; (ii) multiply by the transfer function for the chosen cosmology; (iii) normalize as desired (CMB or LSS); (iv) generate a single realization of the density field in k-space, use the Poisson equation to solve for gradients of the potential field; (v) use the Zel'dovich (or some other) approximation to move particles off a grid (quiet start) or off a "glass" initial configuration
• Time-Stepping
  ‣ Use a symplectic/leap-frog integrator; first "stream" particles, then compute inter-particle forces, update velocities, do the next "stream", and repeat ad nauseam (a minimal sketch follows this list) --
  ‣ More sophistication -- adaptive/individual particle time-steps, particle splitting, etc.
• Force Computation
  ‣ Direct particle-particle force evaluations are hopeless; use approximate tricks (tree, PM, AMR, --) to reduce the order of the algorithm to N log N
• Tests
  ‣ Many sources of error in places you don't expect; develop a suite of tests for robustness of simulation results, check everything multiple times --
• Analysis
  ‣ Compute P(k), etc. Make halo catalogs, merger trees -- write papers
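A minimal kick-drift-kick leapfrog sketch of the loop described above (illustrative Python; `compute_forces` is a stand-in for the tree/PM force solver, and comoving/expansion factors are omitted):

```python
import numpy as np

def leapfrog(pos, vel, compute_forces, dt, nsteps):
    """Schematic kick-drift-kick integrator for an N-body system.

    pos, vel : (N, 3) arrays of particle positions and velocities
    compute_forces : callable returning (N, 3) accelerations;
        stands in for the tree/PM/P^3M force solver
    """
    acc = compute_forces(pos)
    for _ in range(nsteps):
        vel += 0.5 * dt * acc          # half kick
        pos += dt * vel                # drift ("stream" the particles)
        acc = compute_forces(pos)      # recompute inter-particle forces
        vel += 0.5 * dt * acc          # half kick
    return pos, vel
```

The kick-drift-kick form is symplectic, which is why long integrations do not secularly drift in energy.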


Large Scale Structure Probes: Some Numbers

[Figures: P(k) with baryon wiggles vs. k [h/Mpc] from a 6 Gpc box; cluster data from Vikhlinin et al. 2008, ApJ in press]

• Baryon Acoustic Oscillations (BOSS, HETDEX, JDEM, LSST, WiggleZ, --)
  ‣ Precision requirement: 0.1% measurement of the distance scale
  ‣ Simulations: very large box sizes (~3 Gpc) to reduce sampling variance and systematics from nonlinear mode coupling (see the mode-counting estimate after this list)
  ‣ Gravity-only simulations largely adequate
• Weak Lensing (Euclid, JDEM, LSST, --)
  ‣ Precision requirement: 1% accuracy at k ~ 1-10 h/Mpc
  ‣ Large box sizes (~1 Gpc) to reduce nonlinear mode coupling
  ‣ At scales k > 1 h/Mpc, baryonic physics starts to become important
• Clusters (eROSITA, LSST, SPT, --)
  ‣ Large box sizes (~1 Gpc) for good statistics (~40,000 clusters)
  ‣ Gas physics and feedback effects important
  ‣ Well-calibrated mass-observable relations required
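Why bigger boxes beat sampling variance (a standard mode-counting estimate, up to order-unity factors from the reality condition): the number of independent Fourier modes in a shell of width Δk is

$$N_{\rm modes} \simeq \frac{4\pi k^2\,\Delta k\,V}{(2\pi)^3}, \qquad \frac{\sigma_P}{P} \simeq \sqrt{\frac{2}{N_{\rm modes}}} \propto \frac{1}{\sqrt{V}},$$

so a ~3 Gpc box buys roughly 5x smaller band-power errors than a 1 Gpc box at fixed k.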



"Great Survey" Size Simulations

• Workhorse "billion/billion" simulation:
  ‣ Gigaparsec box, billion particles
  ‣ Smallest halos: ~10¹³ M☉ (100 particles); see the arithmetic sketch after this list
  ‣ 10 time snapshots: ~250 GB of data
  ‣ ~30,000 CPU hours with e.g. Gadget-2, ~5 days on 256 processors (no waiting time in the queue included...)
  ‣ Accuracy at k ~ 1 h/Mpc: ~1%
• "Supersimulations": 1-6 Gpc, 300 billion to one trillion particles
  ‣ Smallest dark matter halos: ~10¹² M☉ (and 10-100 times smaller --)
  ‣ 10 time snapshots: ~75 TB (for 300 billion particles)
• Effectively one wants to model the entire observable Universe down to galactic scales!
  ‣ Basic gravity runs must satisfy stringent error requirements
  ‣ Addition of new physics -- non-Gaussianities, modified gravity, self-consistent dark energy, dark energy fluctuations, etc. -- must also satisfy the same requirements
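A back-of-the-envelope check of the billion/billion numbers (a sketch assuming Ω_m ≈ 0.25 and the standard critical density ρ_crit ≈ 2.775×10¹¹ h² M☉/Mpc³):

```python
# Rough particle-mass estimate for a Gpc-box, billion-particle run.
rho_crit = 2.775e11   # critical density in h^2 Msun / Mpc^3 (standard value)
omega_m = 0.25        # assumed matter density parameter
L = 1000.0            # box side in Mpc/h
N = 1024**3           # number of particles (~1 billion)

m_p = omega_m * rho_crit * L**3 / N
print(f"particle mass      ~ {m_p:.1e} Msun/h")        # ~6e10 Msun/h
print(f"100-particle halo  ~ {100 * m_p:.1e} Msun/h")  # ~6e12, i.e. ~10^13 Msun
```

This reproduces the "smallest halos ~10¹³ M☉ at 100 particles" figure quoted above.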



Cosmological Simulations: More on Requirements

I. Techniques: N-body treatment of gravity in an expanding universe, including gas dynamics and subgrid physics (star formation, supernova feedback, etc.)

II. Large vs. Small: Large simulations cover entire survey volumes; smaller simulations target physics issues at enhanced resolution

III. Requirements: (1) memory, (2) dynamic range in space, time, and mass, (3) accuracy and robustness, and (4) speed

[Figure: current "workhorse" simulations (billion particles)]

Example: SDSS-III, gravity-only (need many of these)
  Volume = 3 (Gpc)³, memory ~ 40 TB (particles + grid)
  Particle mass = 10¹⁰ solar masses, # of particles N > 3×10¹¹
  Force resolution Δ = 30 kpc (or less), spatial dynamic range = 10⁵-10⁶
  Mass dynamic range = 10⁴-10⁵, time-steps ~ 10⁴-10⁵
  Example requirement -- galaxy clustering statistics to an accuracy of better than 1%


The Coyote Universe: Walking before Running

Coyote-I: arXiv:0812.1052; Coyote-II: arXiv:0902.0429 (ApJ, to appear); Coyote-III, IV: in preparation, and even more to come --

• Large simulation suite run on the LANL Coyote compute cluster
  ‣ 38 cosmological models with different dark energy equations of state
  ‣ 1.3 Gpc cubed comoving volume, 1 billion particles each
  ‣ 16 medium resolution, 4 higher resolution, and 1 very high resolution simulation for each model = 798 simulations, ~60 TB of data
• Aim: precision predictions at the 1% accuracy level (really) for different cosmological statistics
  ‣ Dark matter power spectrum out to k ~ 1 h/Mpc; on smaller scales, hydrodynamic effects become important! (White 2004, Zhang & Knox 2004, Jing et al. 2006, Rudd et al. 2008)
  ‣ Weak lensing shear power spectrum
  ‣ Halo mass function
• Three parts to the project:
  ‣ Demonstrate 1% accuracy of the dark matter simulations out to k = 1 h/Mpc ✓ (arXiv:0812.1052)
  ‣ Develop a framework which can predict these statistics from a minimal number of simulations ✓ (arXiv:0902.0429)
  ‣ Build prediction tools from the simulation suite (Coyote III, IV, in progress)



Code Comparison: Are the Basic Techniques Robust?

Heitmann et al., ApJS (2005); Heitmann et al., Comp. Science and Discovery (2008)

[Figure: P(k) [(Mpc/h)³] and residuals vs. k [h/Mpc] out to the particle Nyquist wavenumber for HOT, PKDGRAV, Hydra, Gadget-2, TPM, and TreePM; 64 Mpc/h box, 256³ particles]

• Comparison of ten major codes (a subset is shown)
• Each code starts from the same initial conditions
• Each simulation is analysed in exactly the same way (a minimal P(k) estimator sketch follows this list)
• Overall, good agreement between codes for different statistics at the 5-10% level
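For context, a minimal gridded P(k) estimator of the kind such comparisons rely on (schematic Python; deposition-window corrections and shot-noise subtraction omitted):

```python
import numpy as np

def power_spectrum(delta, boxsize, nbins=32):
    """Schematic P(k) estimator from a 3-D overdensity grid.

    delta   : (n, n, n) array of overdensity values delta(x)
    boxsize : box side length (e.g. in Mpc/h)
    Returns bin centers and the binned, volume-normalized P(k).
    """
    n = delta.shape[0]
    dk = np.fft.fftn(delta) * (boxsize / n) ** 3       # approximate continuum FT
    pk3d = (np.abs(dk) ** 2 / boxsize**3).ravel()      # |delta_k|^2 / V
    kfreq = 2 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
    kx, ky, kz = np.meshgrid(kfreq, kfreq, kfreq, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2).ravel()
    edges = np.linspace(kfreq[1], np.abs(kfreq).max(), nbins + 1)
    idx = np.digitize(kmag, edges)                     # 0, nbins+1 = out of range
    counts = np.bincount(idx, minlength=nbins + 2)[1:nbins + 1]
    sums = np.bincount(idx, weights=pk3d, minlength=nbins + 2)[1:nbins + 1]
    return 0.5 * (edges[1:] + edges[:-1]), sums / np.maximum(counts, 1)
```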

[Inset: Flash vs. PKDGRAV comparison, ~2%?]


Mass Resolution: Technical Issues Matter!

• How many particles are needed? Test with different particle loading in a 1 Gpc box:
  ‣ Run 1024³ particles as the reference
  ‣ Downsample to 512³ and 256³ particles and run forward
  ‣ In addition: downsample the z = 0, 1 results of the 1024³ run to characterize the shot noise problem
• For precision answers, the interparticle spacing has to be small! Previous hand-waving arguments and folk theorems were incorrect
• Requirement: k < k_Ny/2 (no surprise to Shannon); see the estimate after this list
• A gigaparsec box requires a billion particles minimum; thus a factor of 10 increase in resolution will require a trillion particles!
• Force resolution is not the limiting factor, but mass resolution is
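A quick check of that scaling, using the standard particle Nyquist wavenumber for N^{1/3} particles per dimension in a box of side L:

$$k_{\rm Ny} = \frac{\pi N^{1/3}}{L}, \qquad N^{1/3} = 1024,\ L = 1000\ {\rm Mpc}/h \;\Rightarrow\; k_{\rm Ny} \approx 3.2\ h/{\rm Mpc},$$

so only k < k_Ny/2 ≈ 1.6 h/Mpc is trusted; reaching 10x higher k at fixed box size needs 10x finer interparticle spacing, i.e. 1000x more particles (10⁹ → 10¹²).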

[Figure: P(k) for the 256³ and 512³ particle runs vs. the 1024³ reference downsampled to 512³ and 256³, at z = 0 and z = 1, with k_Ny/2 marked]


How Many Simulations? Cosmic Calibration

Heitmann et al., ApJL (2006); Habib et al., PRD (2007); Schneider et al., PRD (2008); Heitmann et al., arXiv:0902.0429

• Good news: we have simulation accuracy under control at the 1% level out to k ~ 1 h/Mpc
  ‣ Mass resolution, box size, initial start, force resolution, and time step criteria exist!
• Bad news: for cosmological constraints from e.g. SDSS:
  ‣ Run your favorite Markov chain Monte Carlo code, e.g. CosmoMC
    - MCMC: directed random walk in parameter space
  ‣ Need to calculate P(k) ~ 10,000-100,000 times for different models
  ‣ 30 years of Coyote time (2048-processor Beowulf cluster) -- impossible!
• What we need: a framework that allows us to provide, e.g., P(k) for a range of cosmological parameters via some form of "interpolation"
• The Cosmic Calibration Framework (based on several cutting-edge applications of modern statistics) provides (a toy sketch follows this list):
  ‣ Simulation design: an optimal strategy to choose parameter settings
  ‣ Emulation: a smart interpolation scheme that will replace the simulator and generate power spectra, mass functions, ... with controlled errors
  ‣ Uncertainty and sensitivity analysis
  ‣ Calibration -- combining simulations with observations to determine the best-fit cosmology
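A toy illustration of the design-and-emulate idea, not the CCF itself: sample parameters with a Latin hypercube, run the expensive model there, and predict elsewhere with a Gaussian process (the `simulator` below is a hypothetical stand-in for an expensive code):

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_dim):
    """Simple Latin hypercube design on [0, 1]^n_dim (one sample per stratum)."""
    strata = np.tile(np.arange(n_samples), (n_dim, 1))
    u = (rng.permuted(strata, axis=1).T + rng.random((n_samples, n_dim)))
    return u / n_samples

def gp_emulate(X, y, Xstar, length=0.3, noise=1e-8):
    """Gaussian-process mean prediction with a squared-exponential kernel."""
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / length**2)
    K = kernel(X, X) + noise * np.eye(len(X))
    return kernel(Xstar, X) @ np.linalg.solve(K, y)

# Stand-in "simulator": pretend each evaluation is an expensive run.
simulator = lambda x: np.sin(4 * x[:, 0]) * np.cos(3 * x[:, 1])

X = latin_hypercube(40, 2)        # simulation design: 40 runs in 2-D
y = simulator(X)                  # "run" the simulations at the design points
Xstar = rng.random((5, 2))        # non-simulated parameter settings
print(gp_emulate(X, y, Xstar))    # emulated predictions
print(simulator(Xstar))           # truth, for comparison
```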



P(k) Emulator Performance: Proof of Success

• Emulator: an interpolation scheme which allows us to predict the power spectrum at non-simulated settings in the parameter space under consideration
• Build the emulator from 37 HaloFit runs according to our design
• Generate 10 additional power spectra within the priors with HaloFit and the emulator
• Emulator predictions are accurate at the sub-percent level!
• Removes a fundamental barrier to the precision simulation approach

[Figure: emulator vs. HaloFit residuals, within ±1%]


The Next Step: The Roadrunner Universe

[Image: the Roadrunner supercomputer -- "hard to refuse, eh?"]

The nuts and bolts, a year and a half later --


What is Roadrunner?

• 100 TB total RAM
• Opterons have little compute, but half of the memory and balanced communication
• Cells dominate the compute, but communication is poor -- 50-100 times out of balance
• Multi-layer programming model: C/C++ and MPI on the Opteron layer, DaCS between layers, C/SPU with intrinsics on the Cells

[Diagram: notional Cell architecture -- PPU, 2 x 4 SPEs, bus, RAM, I/O; 6 GB memory labels]


How to make Roadrunner a Cosmic Predictor?

• Direct N-body, O(N²)
  ‣ Accurate, but hopeless -- 100 years on Roadrunner! (too long by a factor of 10,000)
• Grid-based methods (PIC/PM), O(N log N)
  ‣ Fast, good error controls, but memory-limited due to the grid; would require 10-100 PB to reach the dynamic range target (too big by about a factor of 1000)
• PIC/PM + Adaptive Mesh Refinement (AMR)
  ‣ Too slow, complex; memory problems persist, error controls a worry
• Tree methods, O(N log N)
  ‣ Efficient, but boundary conditions and error controls troublesome; performance controlled by the tree-walk
• Hybrid I: Tree/PM
  ‣ Long/medium range force handled by PM, short range forces via tree; involves dealing with a complicated data structure; error control still an issue?
• Hybrid II: P³M (see the force decomposition after this list)
  ‣ Long/medium range force handled by PM, short range forces via direct N². Easy to code, error control is simple, but can bog down when the particle distribution is clustered
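The P³M idea in one line (the standard force splitting, stated here for reference): each particle feels a grid-computed long-range force plus a direct-summation correction inside a matching radius that restores the exact short-range behavior,

$$\mathbf{F}_i = \mathbf{F}_i^{\rm PM} + \sum_{r_{ij} < r_{\rm m}}\left[\mathbf{F}^{1/r^2}(r_{ij}) - \mathbf{F}^{\rm PM}(r_{ij})\right],$$

which is also why clustered regions (many neighbors within r_m) can bog the direct sum down.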



Back of the Envelope MC3 Design I

• A hybrid grid/particle design is a good match to the RR architecture:
  ‣ Grid on the Opteron layer: "standard" operations (FFTs, shifts, etc.), handles medium-resolution tasks (~40 s per 10,000³ FFT)
  ‣ Particles on the Cell layer, which handles compute-intensive high resolution tasks (similar memory requirements as the grid)
  ‣ Limited inter-layer bandwidth is not a problem, as only grid info goes back and forth between layers (requires ~1 s) and only intermittently (need ~100-1000 long time-steps)
• Which hybrid design?
  ‣ Opted for P³M -- the clustering problem is overcome by brute force Cell performance (100 GF/s/CBE is OK for 10⁵ particles/CBE, worst case, O(secs)) -- applies to large volume cosmology simulations (single precision, local coordinates -- whew!), no need to load balance (large physical volume/node)
• Conclusion:
  ‣ Simple performance estimates argue that the base code can run the reference simulation problem in a matter of days, provided that scaling can be achieved (is this a big if? -- Cell comm BW is poor)


MC3 Design II

• Use P³M
  ‣ Grid lives on the Opteron layer, with FFT Poisson-solves of up to 10,000³; uses digital filtering and super-Lanczos differentiation to reduce particle-grid interaction noise
  ‣ Particles live on the Cell BEs and interact on "subgrid scales" via fast hand-coded routines
  ‣ Only simple grid info flows between Cells and Opterons, so the thin pipe problem is solved
• Avoid particle communication
  ‣ Particle communication between Cells at every short-range step would be too slow; avoid this using particle caching
  ‣ Intermittent nearest neighbor refresh (fast)
• On-the-fly analysis
  ‣ Avoid I/O as much as possible; analysis routines must run as a part of the code

Overload zone (particle "cache"): a nodal domain contains all particles that will ever cross into its reference volume (the 30% overhead can be reduced to a few %). Particles inside the reference volume are "active", i.e., used to compute the self-consistent force; others are "passive", i.e., used only as tracers (their "active" self belongs to a different compute node). As they move, particles change their state based on local information. For the PM piece, overloading is "exact"; for the N² part, the error at the domain edge propagates inwards. But it is (i) small, and (ii) can be controlled by increasing the boundary layer thickness or via periodic refreshes.


Not BOE: MC3 Design III

• P³M requires a "quiet" force at hand-over:
  ‣ Spatial filtering (TSC, etc.) is too complex for the Cell
  ‣ Spectral filtering in the Fourier domain reduces code "bookkeeping":
    - Use basic CIC deposition and interpolation at the Cell level (a schematic CIC sketch follows this slide)
    - Apply spectral filtering at the Opteron (grid) level -- as effective as spatial filtering
  ‣ Avoid real-space differentiation by using the super-Lanczos gradient (see Hamming)

[Figure: noisy CIC PM force and 6th-order sinc-Gaussian filtered CIC PM force ("quiet" PM), plotted as the ratio to 1/r²]
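A schematic of the CIC deposition named above (1-D Python sketch for clarity; the production version is 3-D and runs on the Cells):

```python
import numpy as np

def cic_deposit_1d(positions, ngrid, boxsize):
    """Schematic 1-D cloud-in-cell mass deposition onto a periodic grid.

    Each unit-mass particle is shared linearly between its two
    nearest grid points, weighted by proximity.
    """
    rho = np.zeros(ngrid)
    x = positions / boxsize * ngrid      # positions in grid units
    i = np.floor(x).astype(int)          # left grid point
    frac = x - i                         # fractional distance past it
    np.add.at(rho, i % ngrid, 1.0 - frac)     # weight to left point
    np.add.at(rho, (i + 1) % ngrid, frac)     # weight to right point
    return rho
```

The linear sharing is what makes the raw CIC force "noisy" near the grid scale; per the slide, the smoothing is then done spectrally on the Opteron side rather than with higher-order spatial windows on the Cells.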


The Roadrunner Universe Project: Status

Collaboration: S. Habib (PI), J. Ahrens, L. Ankeny, S. Bhattacharya, J. Cohn, C.-H. Hsu, D. Daniel, N. Desai, P. Fasel, K. Heitmann, Z. Lukic, G. Mark, A. Pope, M. White

• First full test code (medium resolution) to run anytime now; will already be the largest cosmology run ever (~100+ billion particles)
• First high resolution runs in a month (the particle-particle force is >50 times faster on the Cells than on the Opterons)
• Hydro module in the fall
• Roadrunner Universe Blue Book soon(ish)
• Data will be made public (as much as size limitations allow)


The Roadrunner Universe Project: Science

• SDSS-III/BOSS baryon acoustic oscillation simulations and mock catalogs
• SDSS-III Lyman-alpha runs: signature of the BAO scale in the Lyman-alpha forest
• JDEM weak lensing
• High-resolution simulations for dark matter searches
• LSST simulations for a variety of projects (weak lensing, clusters, BAO)
• Large-volume cluster simulations and large-scale flows
• Addition of new physics:
  ‣ Self-consistent dark energy models and inclusion of dark energy fluctuations (LSST ISW)
  ‣ Nonparametric w(z) simulations
  ‣ Calibration of baryonic effects in weak lensing
  ‣ New code for modified gravity (cosmological PPN)
  ‣ More general primordial fluctuations (running of the spectral index, non-Gaussianity, warm/nonthermal dark matter)
  ‣ Nonlinear clustering of neutrinos
  ‣ Your favorite idea here --
• Extension of the Calibration Framework to address an expanded "physics space" and smaller errors


Conclusions

• The nonlinear regime of structure formation requires simulations
  ‣ No error-controlled analytic theory exists or is ever likely to exist
  ‣ Simulated skies/mock catalogs are essential for survey analysis and accurate predictions
• Simulation requirements are demanding, but (at least some) can be met
  ‣ Only a finite number of simulations need be performed (will this always be true?)
  ‣ Hydro/feedback calibration against data is key; we have a method to do this within the CCF
• Cosmic Calibration Framework
  ‣ Accurate emulation of several statistics, matching within code errors
  ‣ Allows (very) fast calibration of models vs. data
• Future simulations
  ‣ Very large data sets
  ‣ Emphasis on analysis: what should be done?
  ‣ How should data be made available to the community? (RRU will have ~a PB of data)
