D2.3 Computing e-Infrastructure cost calculations and business models
e-FISCAL: www.efiscal.eu
EC Contract Number: 283449
In an effort to get more inputs from more countries and centres during the second period of the project, two new questionnaires were prepared. The first was a user-friendly spreadsheet version based on the initial one, and the second was a high-level one based on ranges (more suitable for centres with confidentiality issues). The questionnaires were further disseminated and the project received four new questionnaires, two of which were new (from new sites and new countries) and two of which were updates of previous ones.

Finally, the Excel version of the questionnaire acted as the basis for the on-line web-based tool.
3.6 Benchmarking exercise
As shown in the last step of our methodology, we provided for the execution of a benchmarking exercise. The objective of the benchmarking effort (also described in D2.2) has been to carry out a performance comparison of different systems with identical specifications, focusing on the HPC, HTC and Cloud infrastructures. The output of the benchmarking exercise would be used in the comparison between the prices of cloud providers and the costs derived from the e-FISCAL costing model, in an effort to compare "like with like". However, with the advancements in computer architecture and infrastructures, it has become increasingly difficult to compare the performance of computer systems by merely looking at their specifications. Therefore, domain-specific benchmarking tests (such as the NAS Parallel Benchmarks and HEP-SPEC06) are required, and were adopted to compare the systems across HPC, HTC and Cloud infrastructures. In terms of scope, the benchmarking effort focused on the deployment and testing of equivalent instances, comparing the HPC and HTC infrastructures against commercial Cloud offerings (e.g. Amazon EC2). Amazon EC2 was chosen for the performance comparisons because it is one of the market leaders and public benchmarking figures are available for EC2 to compare with. Moreover, it is important to note that additional performance tests (e.g. MPPTest, STREAM) can be carried out to measure individual performance metrics (e.g. network performance, memory bandwidth, MPI communication). In the context of the e-FISCAL methodology, however, the aggregated performance factor is sufficient to yield the performance-adjusted price figures for the various infrastructures. The analysis is presented in Appendix 7.7.
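The "like with like" adjustment described above can be sketched as follows. This is a minimal illustration only: the benchmark scores and the price per core-hour are hypothetical placeholders, not figures measured or reported by the study, and the two helper functions are our own naming, not e-FISCAL's.

```python
def performance_factor(score_cloud: float, score_inhouse: float) -> float:
    """Aggregated performance factor of the cloud instance relative
    to the in-house reference system (ratio of benchmark scores)."""
    return score_cloud / score_inhouse

def adjusted_price(price_per_core_hour: float, factor: float) -> float:
    """Cloud price normalised to the in-house system's performance,
    so that prices and in-house costs are compared like with like."""
    return price_per_core_hour / factor

# Hypothetical figures: an in-house HTC node scoring 10.0 on a
# HEP-SPEC06-style benchmark vs. a cloud instance scoring 8.0,
# with an assumed cloud price of 0.10 EUR per core-hour.
factor = performance_factor(8.0, 10.0)
like_for_like_price = adjusted_price(0.10, factor)
print(factor, like_for_like_price)  # 0.8 0.125
```

With these placeholder numbers, the cloud instance delivers 80% of the in-house performance, so its effective price per unit of delivered performance is 0.125 rather than 0.10 EUR per core-hour.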
The benchmarking exercise also had important secondary goals. First of all, setting up the benchmarking environments within both the in-house and Cloud infrastructures gave important tacit information about the practical migration effort that would be necessary to move applications from the in-house infrastructures to the Cloud and vice versa. Despite the fact that portability and rapid deployment are key design goals of benchmarking suites, the effort needed for setting up and maintaining the environments was considerable. This points to a need for further studies with real applications when estimating the feasibility of various hybrid e-Infrastructure scenarios. Furthermore, the benchmarking study served an important role in building a common understanding of the e-Infrastructure. Discussing the benchmarking methodologies and their limitations with financial experts played an important role in enabling efficient trans-disciplinary collaboration; in fact, this was as important as discussing the strengths and limitations of the different cost assessment methodologies with e-Infrastructure experts.
e-FISCAL: Financial Study for Sustainable Computing e-Infrastructures
Deliverable D2.3 – Computing e-Infrastructures cost estimation and analysis – Pricing and Business models