
EUCLIPSE First Period Report


WP4: Sensitivity Experiments and Hypothesis Testing

In this WP we will integrate results from the other work packages to develop numerical experiments designed both to test our developing understanding and to identify observables that can help further constrain cloud feedbacks. The work proposed in this package is broken into three tasks:

1. Evaluating unusual behaviour
2. Developing and testing parameterization improvements
3. Establishing observational metrics

This WP officially started only in Month 13, and since many of its tasks depend on results from the other WPs, it is still in a preparatory stage. A meeting in the autumn of 2011 is planned to discuss the practical implementation of the WP4 tasks in relation to the other three WPs. In preparation for using numerical weather prediction techniques to evaluate fast processes in climate models, ECMWF has started setting up a testing framework for evaluating climate models in weather prediction mode. The report can be found below.

A study identifying the utility of NWP-based methods for identifying and narrowing sources of divergent behaviour in cloud feedbacks in models (Month 36)

Deliverable D4.3
Daniel Klocke, Mark Rodwell
ECMWF

The work for the EUCLIPSE project at ECMWF started on 1 June 2011. The aim is to utilize and compare different techniques from numerical weather prediction for the evaluation of fast processes in climate models. Such processes include, for example, clouds and convection.

Two methods have been proposed in the literature to evaluate climate models in weather prediction mode. The diagnosis of initial physical tendencies (Rodwell and Palmer, 2007) allows evaluation closest to the process level. However, this technique requires data assimilation capabilities for the model under evaluation, which are not always available for climate models, as these are built to solve boundary-condition problems. Alternatively, transpose-AMIP experiments were proposed by Phillips et al. (2004), in which a short forecast with a climate model is initialized from an analysis produced by an 'alien' model. The advantage is that no data assimilation capabilities are required. However, the analysis from the alien model is not entirely consistent with the physics of the evaluated climate model, which can lead to spurious tendencies at the beginning of each forecast and hence, potentially, to wrongly diagnosed model error. Errors in the transpose-AMIP framework are therefore diagnosed after a few forecast days, in the hope that the initial shock has decayed sufficiently. At these longer lead times, there is more scope for processes to interact and for local errors to be overwhelmed by
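As a rough illustration of the transpose-AMIP idea, the following sketch estimates a model's systematic error from a set of short hindcasts verified against analyses, discarding the early lead times where the initialization shock from the inconsistent 'alien' analysis has not yet decayed. This is not the project's actual diagnostic code; the function name, the 48-hour spin-up threshold, and the synthetic arrays are all illustrative assumptions.

```python
import numpy as np

def transpose_amip_error(forecasts, analyses, lead_hours, spinup_hours=48):
    """Estimate systematic model error from transpose-AMIP-style hindcasts.

    forecasts   : (n_starts, n_leads) array, a scalar field from each hindcast
    analyses    : (n_starts, n_leads) array, the verifying analyses
    lead_hours  : (n_leads,) lead times in hours
    spinup_hours: lead times earlier than this are discarded, on the
                  assumption (hedged!) that the initialization shock from
                  the alien analysis has decayed by then.
    """
    lead_hours = np.asarray(lead_hours)
    settled = lead_hours >= spinup_hours          # mask out the shock period
    err = forecasts[:, settled] - analyses[:, settled]
    # Average over start dates and retained lead times -> mean (systematic) error
    return err.mean()

# Synthetic usage: a constant 0.5 forecast bias plus a large shock at t=0.
lead = np.array([0, 24, 48, 72, 96])
analyses = np.zeros((3, len(lead)))
forecasts = analyses + 0.5
forecasts[:, 0] += 10.0                            # initialization shock
print(transpose_amip_error(forecasts, analyses, lead))  # recovers ~0.5
```

The design choice mirrors the text: by averaging only lead times past the spin-up window, the large spurious tendency at initialization does not contaminate the diagnosed error, at the cost of evaluating at lead times where processes have already begun to interact.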
