[Wrfems] AutoPost Processing with V3

Case, Jonathan (MSFC-VP61)[Other] jonathan.case-1 at nasa.gov
Fri Sep 18 13:52:01 MDT 2009


Pablo,

Glad to hear that you got the autopost working.

Regarding the post-processing of the WRF-NMM output, I have experienced a similar issue.
The I/O tends not to keep up with the rate of model output, especially with a very large NMM domain and frequent (30-min or hourly) output.
BobR obtained a fix for a "fast" copygb re-projection of the NMM grid for display.  However, I'd like to confirm whether Bob has implemented the "fast copygb" in EMSv3; it speeds up post-processing of NMM forecasts immensely.

If the fast copygb is already in place in EMSv3, then you might just want to cut back on the number of output fields for the NMM domain.
By default, the EMSv3 wrfpost_cntrl.parm file (in the static directory of the domain of interest) has a LOT of fields/levels turned on at numerous pressure levels.
The more pressure-level data you turn on, the longer the post-processing takes.  Some fields, such as the moisture variables, are redundant and can be derived from a single variable (especially if you use GEMPAK to create graphics).  Also, reflectivity and cloud fractions on pressure surfaces are both turned on by default; you might be able to keep only one of these (or maybe neither?).
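To turn a field off entirely, zero out every digit in its L=() line.  Schematically (the entry name here is illustrative; take the exact name and digit count from your own wrfpost_cntrl.parm):

(CLD FRAC ON PRESS SFCS ) SCAL=( 4.0)
 L=(00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000 00000)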

Finally, you may want to turn off a lot of pressure levels for each field because you probably won't look at 10-mb winds or temperature, for example.
Check out the link that BobR provided at: http://strc.comet.ucar.edu/wrf/wrfems_postcntrl.htm
This defines all the levels that can be turned on/off in wrfpost_cntrl.parm for EMSv3.

By default, the wrfpost_cntrl.parm is set to output height, temperature, RH, u-wind, v-wind, and omega at the following pressure levels:
10,  25, 50, 100, 150, 200, 225, 250, 275, 300, 325, 350, 375, 400, 425, 450, 475, 500, 525, 550, 575, 600, 625, 650, 675, 700, 725, 750, 775, 800, 825, 850, 875, 900, 925, 950, 975, 1000, and 1013.2 mb.
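(With six fields at each of those 39 levels, that is 234 pressure-level records per output time, before counting any surface or derived fields.)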

That seems like a lot of levels by default, so I changed the flags from (temperature, for example):
(TEMP ON PRESS SFCS  ) SCAL=( 4.0)
 L=(11111 11111 11111 11100 10010 01001 00100 10010 01001 00100 10010 01001 00100 10010 01001 00101)

to:

(TEMP ON PRESS SFCS  ) SCAL=( 4.0)
 L=(00010 10101 01010 10100 00010 00001 00000 10000 01001 00100 10010 01001 00100 10010 01001 00101)

in order to cut out most of the upper pressure levels and output at roughly 50-mb intervals instead of every 25 mb.  Each digit in the L=() string toggles one output level on (1) or off (0), in the order defined on the page above.
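
If you want to sanity-check which levels a given L=() string enables, here is a minimal Perl sketch (my own, not part of EMS).  It assumes the digits map one-to-one, in order, to the pressure levels quoted above; the real parm file may include additional non-pressure entries in the ordering, so treat the output as a rough guide and check BobR's page for the authoritative list:

#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical helper, not part of EMS: report which pressure levels an
# L=() string from wrfpost_cntrl.parm enables, assuming each digit maps
# one-to-one, in order, to the levels below.  Only the 39 pressure levels
# quoted in this email are listed; see the wrfems_postcntrl.htm page for
# the full set and the true ordering.
my @levels = (10, 25, 50, 100, 150, 200, 225, 250, 275, 300,
              325, 350, 375, 400, 425, 450, 475, 500, 525, 550,
              575, 600, 625, 650, 675, 700, 725, 750, 775, 800,
              825, 850, 875, 900, 925, 950, 975, 1000, 1013.2);

# Paste the digits from the L=() line of interest here.
my $lstring = "00010 10101 01010 10100 00010 00001 00000 10000";
(my $digits = $lstring) =~ s/\s+//g;   # drop the 5-digit grouping

for my $i (0 .. $#levels) {
    last if $i >= length $digits;
    printf "%7s mb  %s\n", $levels[$i],
        substr($digits, $i, 1) eq '1' ? 'ON' : 'OFF';
}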

If you do this for every field being processed on pressure levels, you'll see a dramatic decrease in the post-processing time!
Sorry for the lengthy reply, but I hope you find it helpful.

Good luck,
Jonathan

From: wrfems-bounces at comet.ucar.edu [mailto:wrfems-bounces at comet.ucar.edu] On Behalf Of Pablo Santos
Sent: Friday, September 18, 2009 1:48 PM
To: paul.shannon at noaa.gov
Cc: WRF Model ListServ
Subject: Re: [Wrfems] AutoPost Processing with V3

That did the trick, but I also had to add Bob's script fix below so that the model did not post-process the output files again at the end of the run.

Add the single line indicated below to the strc/ems_auto/auto_main.pm file:

 switch (&handler(4,&auto_ems::auto_run)) {
     case 53      {$EMSauto{EMSPOST} = &merge($EMSauto{EMSPOST},$EMSauto{AUTOPOST});
                   &ems_utils::emsprint(6,9,96,2,2,"Autopost failed, migrating over to regular post")}
     case {$_[0]} {return 1             }
 }

 $EMSauto{EMSPOST} = 0;       # <--- ADD THIS LINE
 return  if &handler(5,&auto_ems::auto_post);
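
(As I read Bob's fix, zeroing $EMSauto{EMSPOST} here keeps the final &auto_ems::auto_post call from re-running the regular post on files the autopost has already processed.)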

The autopost processing with my ARW is now working well. However, I also have a WRF-NMM domain, and that one runs so fast that the autopost processing just does not work. It does not matter how I set my wait time in the ems_autopost.conf file. I run the model out to 48 hours, and by the time it finishes, the autopost is still only halfway through processing the output files; it does not wait long enough to finish processing all of them. The end result is that I end up with fewer forecast output files than the length of the run. Is there something I am missing here, or is it perhaps best just not to use the autopost processing with this one?

Any ideas are welcome.

Thanks,

Pablo

Paul Shannon wrote:
Pablo,

We had the same problem, and Jon's answer echoes what Bob R. sent me at the time.  That worked, although I don't know why it was set to process them by default.

Paul


Paul,

The wrfpost cannot be used to process the sfcout files, as they should already be in GRIB format. Have you changed the output format
of these files?  You can also turn off the generation of these files by setting HISTORY_INTERVAL = 0 in run_auxhist1.conf.
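
For reference, the relevant line in run_auxhist1.conf is just the following (a zero interval disables the generation of the sfcout files entirely):

HISTORY_INTERVAL = 0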

Bob
Case, Jonathan (MSFC-VP61)[Other] wrote:

Pablo,

I had to turn off the "sfcout" auxiliary fields in run_auxhist1.conf in order to get the autopost working in V3.  I just set HISTORY_INTERVAL = 0.  I have not yet investigated why the autopost fails when the sfcout aux files are written.

Jon

--------------------------------------------------------------------------------------------------
Jonathan Case, ENSCO Inc.
NASA Short-term Prediction Research and Transition Center (aka SPoRT Center)
320 Sparkman Drive, Room 3062
Huntsville, AL 35805
Voice: 256.961.7504
Fax: 256.961.7788
Emails: Jonathan.Case-1 at nasa.gov / case.jonathan at ensco.com
-------------------------------------------------------------------------------------------------

