
Climatologies Based on the Weather Research and Forecast (WRF) Model

Francois Vandenberghe, Mike Barlage, and Scott Swerdlin
National Center for Atmospheric Research (NCAR) and the STAR Institute, Boulder, CO
{vandenb, barlage, swerdlin}@starinst.org

Judith Gardiner, Ashok Krishnamurthy, and Alan Chalker
Ohio Supercomputer Center, Columbus, OH
{judithg, ashok, alanc}@osc.edu

Because tests at Army Test and Evaluation Command (ATEC) ranges can often only be conducted under specific atmospheric conditions, climatologies of each range could be a very useful tool for long-range test planning. Such datasets can provide the probability density function of near-surface wind speed, percent cloud cover, temperature, precipitation, turbulence intensity, upper-atmospheric wind speed, etc., as a function of the month of the year, time of day, and location on the range. The STAR Institute, with guidance from the Ohio Supercomputer Center, is working on porting the Climate Four-Dimensional Data Assimilation (CFDDA) technology developed by the National Center for Atmospheric Research to version 3 of the Weather Research and Forecasting (WRF) model for use on systems within the DoD High Performance Computing Modernization Program. The fully parallelized WRF-CFDDA system uses data assimilation by Newtonian relaxation to generate a regional reanalysis that is consistent with both observations and model dynamics for every hour of the past 20 years, at fine scales (3 km grid) over the Dugway Proving Ground range.


1. Introduction

Over the past years, the National Center for Atmospheric Research (NCAR) has developed the Climatology Four-Dimensional Data Assimilation (CFDDA) system. CFDDA can generate hourly atmospheric analyses at horizontal resolutions down to one kilometer for any meteorological situation or time period in the past 30 years. The previously developed system consists of the following model and observation input data:

1. The 5th generation of the NCAR/Penn State Mesoscale Model (MM5) and its Four-Dimensional Data Assimilation (FDDA) functions
2. 40 years of global analyses generated at NCAR with the National Centers for Environmental Prediction (NCEP) model (NNRP)
3. 30 years of meteorological observations from the Automatic Data Processing (ADP) historical repository maintained at NCAR
4. 28 years of high-resolution Sea Surface Temperature reanalyses from NCEP
5. 20 years of land and soil data from the output of NASA's Global Land Data Assimilation surface model

NCAR MM5 CFDDA runs on Linux clusters and has already been applied in various Defense applications for several sponsors (NGIC, DTRA, and AMREOC). The system could also prove to be a very valuable mission-planning tool for the Army Test and Evaluation Command (ATEC) ranges. For example, the 20–40 year climatologies created by CFDDA can provide the detailed probability density function (probability of different values) of near-surface wind speed, percent cloud cover, temperature, precipitation, turbulence intensity, upper-atmospheric wind speed, etc., as a function of the month of the year and time of day, at any location on the range. An analysis of the climatology could therefore help the ranges define the season, time, and location on the range where required test conditions are most likely to be met, and long-range scheduling and planning for tests can be undertaken accordingly. The ATEC ranges are distributed over the CONUS and Alaska.
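The FDDA component relaxes the model state toward observations by Newtonian relaxation (nudging). As a minimal illustration of the idea, a scalar model variable can be relaxed toward an observed value by adding a tendency proportional to their difference; the coefficient, time step, and values below are purely hypothetical and are not the WRF-CFDDA settings:

```python
# Newtonian relaxation (nudging): a tendency G * (obs - model) is added
# to the model dynamics, pulling the state toward the observation.
# All numbers here are illustrative only.
def nudge(t_model, t_obs, g=1.0e-4, dt=60.0, steps=240):
    """Integrate dT/dt = G * (T_obs - T_model) with forward Euler."""
    for _ in range(steps):
        t_model += dt * g * (t_obs - t_model)
    return t_model

# Starting 5 K away from the observation, the state decays toward it
# over 240 one-minute steps without ever overshooting.
t = nudge(t_model=290.0, t_obs=295.0)
```

In the full system the same relaxation acts on the gridded prognostic fields, so the resulting reanalysis stays consistent with both the observations and the model dynamics.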
Generating 20-year gridded model climatologies for each of these ranges is an exceptional computational challenge that requires resources not normally available to academic institutions.

The STAR Institute is working with NCAR, under the guidance of the Ohio Supercomputer Center, on developing a new CFDDA capability based on the third version of the Weather Research and Forecasting (WRF) model. This new version of the WRF model includes several important features, such as Rayleigh damping near the top of the model, which are highly relevant for range meteorology. The code is fully parallelized, making WRF very suitable for implementation in the High Performance Computing Modernization Program (HPCMP) distributed-memory computing environment.

The WRF code, with its preprocessor, postprocessor, and peripherals, has been ported to the Air Force Research Laboratory Department of Defense (DoD) Supercomputing Resource Center (AFRL DSRC) Linux cluster Falcon. The input data, about 240 GB compressed, have been uploaded to the archive. A verification and validation of the porting of the code has been conducted, and a statistical post-processing of the validation results has been applied. The WRF model and the configuration adopted for the climatological study are described in Section 2. The verification results are given in Section 3, and some preliminary statistical products are presented in Section 4. Section 5 concludes the report with some considerations on the statistical and numerical aspects of the climatologies that are created.

2. WRF Configuration

The month of September over the Dugway Proving Ground (DPG) range was selected to conduct the verification. This choice was primarily motivated by the high-quality meteorological observations available at this location at this time of the year, collected during the FFT07 field campaign.
During the FFT07 campaign, the local surface measurements routinely collected at the ranges through the SAMS mesonet network were transmitted in real time to the MADIS database and can therefore be found in the NCAR historical observational database (ADP). The month of September 2006 was chosen to ensure that range surface reports are ingested into the WRF model only when the SAMS database is used, which gives us more control over the data-withholding experiments. The Dugway WRF Real-Time FDDA set-up (Liu et al., 2008) was used, except that the smallest, fourth domain at 1.1 km grid increment is not activated in the CFDDA system. This leaves three embedded domains with resolutions of 30, 10, and 3.3 km, respectively. The domain configuration is illustrated in Figure 1. In such a configuration, each domain provides boundary forcing to the smaller domain it encompasses. There are 37 vertical levels, and the first vertical level at the center of the domain, where most of the local surface stations are located, is approximately 2 meters above the ground. In the verification, however, the model diagnostic 2-meter temperature and humidity and the 10-meter winds, and not the lowest vertical level, were used.


Figure 1. Terrain height and WRF domain configuration over Dugway Proving Ground. Grid increments for Domain 1 (full window), 2 (largest rectangle), 3 (medium rectangle), and 4 (smallest rectangle) are, respectively, 30 km, 10 km, 3.3 km, and 1.1 km. Domain 4 is not activated for climatological runs.

Initial conditions and lateral boundary conditions for the largest domain are taken from the 6-hourly 2.5-degree analyses of the NCEP/NCAR Reanalysis project data files archived at NCAR (http://dss.ucar.edu/datasets/ds091.0/). The model is reinitialized every 8.5 days (i.e., cold-started at 12z on the last day of the preceding month and then on the 8th, 16th, and 24th of the month), and output from the first 12 hours after each cold start is discarded. Observations are ingested at the time they were recorded. The model integration takes 45 minutes of wall-clock time per day of simulation on one of the 1,024 nodes of the AFRL DSRC machine "Falcon". The WRF physical parametrizations adopted for the runs are listed in Table 1.

Table 1. WRF parametrizations used for the simulations

Physical Parameter        Scheme
Cumulus Parametrization   Kain-Fritsch
Microphysics              Lin et al.
Long-Wave Radiation       RRTM
Shortwave Radiation       Dudhia
Boundary Layer            YSU
Surface Layer             Monin-Obukhov
Land Surface              Noah

3. Verification and Validation

The verification and validation of a reanalysis mesoscale system poses an interesting question regarding the treatment of independent observations. Ideally, one would like to evaluate the analysis system at its best performance, that is, when the system uses all the available information, which would leave no independent observational data set. Withholding some of the observations for verification will lead to the validation of a sub-optimal system. Still, keeping this in mind, verification against independent data appeared to us to be an important step of the validation process.
Since the project is focusing on the ATEC ranges, the local surface observations collected every day at the ranges offer an ideal dataset for this type of verification. Those observations are issued from automatic stations, referred to as SAMS, distributed over the ranges. SAMS report 2-meter temperature and humidity and 10-meter winds with high temporal frequency, every 5 minutes for certain ranges. For a given range, the number of SAMS stations and their geographical locations have changed over the years, but NCAR has consolidated a database of more than 10 years of SAMS records that can provide useful information on the surface weather at the ranges.

SAMS reports have been transmitted to the National Oceanic & Atmospheric Administration (NOAA) data collection network through the MADIS database since 2007, but for all prior years, SAMS data are not present in the NCAR ADP database used by CFDDA. They are available, however, from NCAR and are assimilated during generation of the range climatologies. The verification statistics of WRF CFDDA output against SAMS surface reports for the month of September 2006 over Dugway Proving Ground are presented below.

There were 19 SAMS stations reporting during the month of September 2006. On average, one report every 5 minutes was collected in the NCAR SAMS database. After application of the RT-FDDA quality-control procedure described by Liu et al. (2004), about 3,800 reports were available each day for verification over the full month. For each report, the 2-meter temperature and humidity and the 10-meter wind component gridded fields from the hourly WRF output file nearest to the reporting time were horizontally interpolated to the SAMS locations. The bias and root-mean-square difference between the observed and the model-interpolated values were calculated for each report and aggregated by hour of the day for the full month.
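The bias/RMS aggregation just described can be sketched as follows; synthetic observation-model pairs stand in here for the actual SAMS reports and interpolated WRF values:

```python
import numpy as np

def aggregate_by_hour(obs, model, hours):
    """Bias and RMS of the (model - observation) differences, per hour of day."""
    stats = {}
    for h in sorted(set(hours.tolist())):
        d = model[hours == h] - obs[hours == h]
        stats[h] = {"bias": d.mean(), "rms": np.sqrt((d ** 2).mean())}
    return stats

# Synthetic stand-in data: observed 2 m temperatures, and model values
# carrying a 0.5 K bias plus 2 K of random scatter.
rng = np.random.default_rng(0)
obs = rng.normal(290.0, 5.0, size=1000)
model = obs + rng.normal(0.5, 2.0, size=1000)
hours = rng.integers(0, 24, size=1000)

stats = aggregate_by_hour(obs, model, hours)
```

The per-hour biases recover the imposed 0.5 K offset to within sampling noise, which is the same aggregation used to build the tables below.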
Tables 2, 3, 4, and 5 present those statistics for wind components, temperature, and relative humidity at 00z (18 MDT) over domain 3 for three different runs:

a) NOBS: no observations assimilated
b) ADP: ADP observations only assimilated
c) ADP+SAMS: ADP and SAMS observations assimilated

Table 2. Wind U component aggregated verification statistics at 00z against SAMS data of WRF domain 3 for the month of September 2006


U-wind       NOBS   ADP   ADP+SAMS
Bias in m/s   0.5   0.3    0.5
Rms in m/s    3.0   2.9    3.3

Table 3. Wind V component aggregated verification statistics at 00z against SAMS data of WRF domain 3 for the month of September 2006

V-wind       NOBS   ADP   ADP+SAMS
Bias in m/s  -1.2  -1.0   -0.6
Rms in m/s    2.8   2.9    3.0

Table 4. Temperature aggregated verification statistics at 00z against SAMS data of WRF domain 3 for the month of September 2006

Temperature  NOBS   ADP   ADP+SAMS
Bias in K    -1.2  -0.3   -0.1
Rms in K      3.2   2.2    2.9

Table 5. Relative humidity aggregated verification statistics at 00z against SAMS data of WRF domain 3 for the month of September 2006

Relative
Humidity     NOBS   ADP   ADP+SAMS
Bias in %     7.1   5.0    2.1
Rms in %     12.8  10.0    8.1

The statistics averaged across the domains are in agreement with the performance of current mesoscale models. It is striking, but not unexpected, that the assimilation of observations tends to slightly degrade the accuracy of the wind analysis. As noted in Rife and Davis (2005), the DPG range has complex terrain with sub-kilometer features that cannot be completely represented by a model with a 3.3 km grid increment. As a result, the model has difficulties reproducing all the local effects (e.g., canyon, canopy, etc.) immediately around the recording station. These sub-grid properties particularly affect winds. For these stations, it is known (Davis et al., 2005) that a model with coarser resolution, which produces smooth fields, could, on average, produce better statistics than a higher-resolution model. Figure 2 depicts graphically the impact of the assimilation.

Figure 2. Distribution of the SAMS temperature pairs (observation on the x-axis, model-interpolated value on the y-axis) for domain 3 for the NOBS (left), ADP (center), and ADP+SAMS (right) WRF simulations.

Figure 3. 10 m climatic mean wind (vectors) and standard deviation (color) for September at 00z over the DPG test range. Years 2000–2006 were used.
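The climatic mean, standard deviation, and extreme fields shown in Figures 3 and 4 are simple per-grid-point aggregates over the collection of hourly outputs; a minimal sketch, with synthetic data standing in for the 30 days × 7 years = 210 WRF fields for a given hour:

```python
import numpy as np

# Synthetic stand-in for 210 hourly 2 m temperature fields on a 60x60 grid
# (one field per day for September 2000-2006 at a fixed hour of the day).
fields = np.random.default_rng(1).normal(285.0, 4.0, size=(210, 60, 60))

clim_mean = fields.mean(axis=0)  # climatic mean at each grid point
clim_std = fields.std(axis=0)    # climatic variability at each grid point
clim_min = fields.min(axis=0)    # hourly climatic extremes (minima)
clim_max = fields.max(axis=0)    # hourly climatic extremes (maxima)
```

Repeating this for each hour of the day yields the full set of hourly climatic statistics described in Section 4.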


Figure 4. 2 m temperature minima (contour) and maxima (color) at 00z over the DPG test range. Years 2000–2006 were used.

4. Preliminary Statistical Results

A statistical post-processing was applied to the hourly WRF output files for the month of September, 2000–2006. For each hour of the day, the 30 days × 7 years = 210 files were averaged, creating a mean value for each meteorological field at each point of the model grid. The standard deviation was then calculated. Figure 3 shows the mean wind vectors at 10 m with the standard deviation overlaid in color at 00z. The hourly climatic extremes, whose values are used in the testing decision-making process, are also calculated from the collection of files. The locations of high variability coincide with the locations of elevated terrain features, indicating that, although not perfectly accurate, the 3.3 km domain is able to capture small-scale processes such as canyon channeling and upslope-downslope winds. Figure 4 shows the minima in contour lines and the maxima in color of the 2 m temperature.

The HPCMP AFRL DSRC Linux cluster "Falcon" was used in this study. The month of simulation was decomposed into four 8.5-day-long independent jobs, which were conducted in parallel in about 21 hours of wall-clock time using a total of 8 of the machine's 2,048 processors. Increased speed-up can be achieved with more computational resources, i.e., more than four nodes per month of simulation, but this acceleration will have to be achieved through WRF (geographical) domain decomposition.
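Given the figures quoted above (four independent 8.5-day jobs per month, about 21 hours each on 2 processors), the wall-clock time to produce a multi-year climatology can be estimated. The simple back-to-back "wave" scheduling model below is an illustrative assumption, not the behavior of the actual batch scheduler:

```python
import math

def climatology_days(years, procs_available, jobs_per_month=4,
                     procs_per_job=2, hours_per_job=21.0):
    """Estimate wall-clock days to run all independent 8.5-day jobs,
    packing as many concurrent 2-processor jobs as the allocation allows."""
    jobs = years * 12 * jobs_per_month            # total elemental jobs
    concurrent = procs_available // procs_per_job # jobs running at once
    waves = math.ceil(jobs / concurrent)          # batches run back to back
    return waves * hours_per_job / 24.0

# With 200 processors, a 30-year climatology takes roughly 13 days.
print(climatology_days(30, 200))
```

This reproduces the roughly 13-day estimate for a 30-year climatology on 200 processors given in the text.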
Figure 5 shows the shape of the WRF speed-up for an elemental job (8.5 days) as a function of the number of processors used for domain decomposition, based on a series of benchmarks conducted on "Falcon". The four-job, 2-processor configuration was used in this study.

Because the speed-up does not increase linearly with the number of processors, for a given number of processors it is more efficient to run as many 2-processor, 8.5-day elemental jobs as possible. For example, if 200 processors are made available, the 30-year climatology can potentially be generated in 13 days under this configuration, while it could take up to 32 days if 32 processors are allocated to each elemental 8.5-day job. The final duration to produce the climatology will depend on the number of processors available at a given time on the machine.

5. Conclusions

A capability to efficiently generate years of high-resolution climatologies at the ATEC ranges with the WRF model has been developed and ported onto HPCMP supercomputers. This capability is based on NCAR FDDA technology and uses various public global databases (NCAR/NCEP Reanalysis, NCAR ADP, NCEP OI SST, and NASA GLDAS) as input data. A month of data for September 2006 has been generated over the DPG range, and verification statistics against independent local and global data have been presented. The results indicate that the accuracy obtained with this climatological version of the WRF FDDA system is similar to the accuracy obtained at the ranges by its real-time counterpart, WRF RT-FDDA, and by an earlier version of the CFDDA system based on MM5.

Figure 5. WRF speed-up as a function of the number of processors for the WRF configuration of Figure 1.

Acknowledgement

This publication was made possible through support provided by DoD HPCMP PET activities through a contract with Mississippi State University. The opinions expressed herein are those of the author(s) and do not


necessarily reflect the views of the DoD or Mississippi State University.

References

Davis, C.A., B.G. Brown, D.L. Rife, and R. Bullock, "Object-based verification approaches for NWP." 21st Conf. on Weather Analysis and Forecasting / 17th Conf. on Numerical Weather Prediction, Washington, DC, 31 Jul–5 Aug 2005.

Laprise, R. et al., "Challenging some tenets of Regional Climate Modelling." Meteorol. Atmos. Phys., Vol. 100, pp. 3–22, 2007.

Liu, Y. et al., "The Operational Mesogamma-Scale Analysis and Forecast System of the U.S. Army Test and Evaluation Command. Part I: Overview of the Modeling System, the Forecast Products, and How the Products Are Used." J. Appl. Meteorol. Climatol., Vol. 47, no. 4, pp. 1077–1092, 2008.

Liu, Y., F. Vandenberghe, S. Low-Nam, T. Warner, and S. Swerdlin, "Observation-quality estimation and its application in the NCAR/ATEC real-time FDDA and forecast (RTFDDA) system." AMS Symp. on Forecasting the Weather and Climate of the Atmosphere and Ocean, Seattle, WA, 10–16 Jan 2004.

Rife, D.L. and C.A. Davis, "Verification of Temporal Variations in Mesoscale Numerical Wind Forecasts." Mon. Weather Rev., Vol. 133, no. 11, pp. 3368–3381, 2005.
