
comparison tools. MG-RAST is unique in its high-throughput analysis strategies, which have improved its computational capacity by a factor of 200 in the last two years. It uses a distributed load-balancing scheme to spread the computationally expensive sequence similarity comparisons across a large number of distributed computation resources, including the Magellan systems. MG-RAST has been used to analyze 3 terabases of metagenomic sequence data in the last six months.
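The report describes this load-balancing scheme only at a high level. The sketch below illustrates one common pull-based pattern that matches the description: idle workers draw fixed-size chunks of sequences from a shared queue, so faster resources naturally take on more work. All names here (similarity_search, worker, the chunk size) are hypothetical stand-ins, not MG-RAST's actual code.

```python
# Pull-based load balancing sketch: idle workers draw chunks from a shared
# queue, so faster resources process more chunks. All names are hypothetical.
import queue
import threading

def similarity_search(chunk):
    """Stand-in for the expensive sequence similarity comparison
    (e.g., searching each sequence against a reference database)."""
    return [(seq_id, 0.0) for seq_id in chunk]  # dummy scores

def worker(tasks, results):
    while True:
        try:
            chunk = tasks.get_nowait()  # pull the next unit of work
        except queue.Empty:
            return                      # queue drained; worker exits
        results.put(similarity_search(chunk))

tasks, results = queue.Queue(), queue.Queue()
for i in range(0, 1000, 100):           # split the input into fixed-size chunks
    tasks.put([f"seq_{j}" for j in range(i, i + 100)])

threads = [threading.Thread(target=worker, args=(tasks, results)) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(f"processed {results.qsize()} chunks")
```

Because workers pull work rather than having it pushed to them, a heterogeneous pool of resources stays evenly loaded without central scheduling decisions.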

The goal of the demonstration was to utilize the testbeds at both sites, and to explore potential failover techniques within the cloud. The demonstration used 150 nodes from ALCF's Magellan to perform the primary computations over the course of a week, with NERSC's Magellan acting as a failover cloud. Half a dozen machines on ALCF's Magellan were intentionally failed. Upon detecting the failures, the software automatically started replacement machines on the NERSC Magellan, allowing the computation to continue with only a slight interruption.
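The report does not include the failover logic itself; the following is a minimal sketch of one plausible approach, heartbeat-based failure detection that launches replacements at the secondary site. The functions is_alive and launch_instance, the timeout value, and the image name are all invented for illustration.

```python
# Heartbeat-based failover sketch: nodes that miss heartbeats are declared
# dead and replaced at the secondary site. All names and values are invented.
import time

HEARTBEAT_TIMEOUT = 120  # seconds of silence before a node is presumed failed

def is_alive(node, now):
    return now - node["last_heartbeat"] < HEARTBEAT_TIMEOUT

def launch_instance(site, image_id):
    print(f"launching replacement from image {image_id} at {site}")
    return {"site": site, "last_heartbeat": time.time()}

def monitor(nodes, image_id, failover_site="nersc"):
    """Replace every dead node with a fresh instance at the failover site."""
    now = time.time()
    for i, node in enumerate(nodes):
        if not is_alive(node, now):
            # The replacement rejoins the same work queue, so a pull-based
            # scheme (as sketched above) resumes with little disruption.
            nodes[i] = launch_instance(failover_site, image_id)
    return nodes

nodes = [{"site": "alcf", "last_heartbeat": time.time() - 300}]  # one stale node
monitor(nodes, image_id="emi-deepsoil")  # hypothetical image name
```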

The same virtual machine image was used on both the ALCF and NERSC Magellan clouds. However, this was not a simple port and required some changes to the image. The instance uses all eight cores on each cloud node and about 40% of the available memory. The demonstration was a single run within the Deep Soil project and represents only ~1/30th of the work to be performed.
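Once the image works at both sites, the same client code can launch it against either site's cloud endpoint. The snippet below is a hypothetical illustration of that pattern; the CloudClient wrapper, the endpoint URLs, the image ID, and the assumed 24 GB node size are all invented for the example, not the project's actual tooling.

```python
# Hypothetical illustration of launching one image at two cloud endpoints;
# CloudClient, the URLs, the image ID, and node sizing are invented here.

class CloudClient:
    def __init__(self, endpoint):
        self.endpoint = endpoint

    def run_instance(self, image_id, vcpus, memory_mb):
        # A real client would issue the site's run-instances API call here.
        print(f"{self.endpoint}: starting {image_id} "
              f"({vcpus} cores, {memory_mb} MB)")

SITES = {
    "alcf":  CloudClient("https://alcf.example/cloud"),
    "nersc": CloudClient("https://nersc.example/cloud"),
}

# One image for both clouds, sized to use all 8 cores and ~40% of node
# memory (assuming a 24 GB node purely for illustration).
for name, client in SITES.items():
    client.run_instance("emi-deepsoil", vcpus=8, memory_mb=9830)
```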

Use of the Magellan cloud allowed the project to dynamically utilize resources as appropriate, while the project continued to work on algorithmic improvements. This demonstration showed the feasibility of running a workflow across both cloud sites, using one site as a failover resource. The project faced many of the same challenges as STAR, discussed above, in terms of image management and application design and development.

11.2.3 LIGO<br />

Figure 11.5: A screenshot of the Einstein@Home screensaver, showing the celestial sphere (sky) with the constellations. Superimposed on this are the gravity wave observatories (the three colored “angle images”) and the search marker (a bulls-eye). Recent binary radio pulsar search work units also have a gray disc, representing the radio receiver dish upon which the search is focused for that unit. (Image courtesy of Einstein@Home.)

The Laser Interferometer Gravitational Wave Observatory (LIGO) collaboration is a dynamic group of more than 800 scientists worldwide focused on the search for gravitational waves from the most violent events in the universe. The collaboration developed the Einstein@Home program that uses donated idle time on private personal computers to search for gravitational waves from spinning neutron stars (also called pulsars) using data from the LIGO gravitational wave detector. Later, the LIGO researchers ported this
