Monday, April 28, 2014

Using input datasets in real WRF runs

 Here is a summary of the run process for five easily available datasets one can use to drive WRF. There are, of course, others, such as the Japanese reanalysis and forecasts from the NAM model, but these are the datasets I've used. Running MERRA requires some extra programs, and in my experience it doesn't produce particularly great output with WRF; still, if you want to give it a shot, let me know and I can give you further instructions.

1) NCEP Climate Forecast System (CFSR/CFSv2)
    Download location:    CISL RDA
        NCEP Climate Forecast System Reanalysis (CFSR)
        ds093.0 (through 2010); ds094.0 (2011 onward)
        Data available every six hours (interval_seconds = 21600)
    Choose Subsetting
        Download files for desired range using two parameter presets:
            a) WRF Model Input: Vtable.CFSR - Pressure levels
            b) WRF Model Input: Vtable.CFSR - Surface
    WPS Processing
        Use ungrib to unpack the two input file types (requires two runs of ungrib)
            For the pressure level input, use Vtable.CFSR_press_pgbh06
            For surface fields, use Vtable.CFSR_sfc_flxf06
        Run metgrid with both ungrib output file types listed under fg_name in the namelist (see the sketch below)
    Run WRF with the following settings
        p_top_requested            = 1000
        num_metgrid_levels        = 38
        num_metgrid_soil_levels    = 4
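
    For reference, here is a minimal sketch of the relevant namelist.wps fragments for the two-pass ungrib workflow described above (the prefixes PRES and SFC are my own placeholders, not required names):

            &ungrib
             out_format = 'WPS',
             prefix     = 'PRES',         ! pass 1: pressure-level files with Vtable.CFSR_press_pgbh06
            /
            ! pass 2: re-run ungrib with prefix = 'SFC' and Vtable.CFSR_sfc_flxf06

            &metgrid
             fg_name = 'PRES', 'SFC',     ! metgrid reads both sets of intermediate files
            /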
       
2) ERA-Interim Reanalysis
    Download location:    CISL RDA
        ECMWF Interim Reanalysis
        ds627.0
        Data available every six hours (interval_seconds = 21600)
    Choose ERA-Interim Project
        Download files from two categories for desired range (Complete File List)
            a) ERA Interim atmospheric model analysis interpolated to pressure levels
                  (uv and sc files)
            b) ERA Interim atmospheric model analysis for surface (sc files)
    WPS Processing
        Use ungrib with Vtable.ERA-interim.pl (requires one run; link all files at once)
        Run metgrid (see the sketch below)
    Run WRF with the following settings
        p_top_requested            = 1000
        num_metgrid_levels        = 38
        num_metgrid_soil_levels    = 4
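
    For reference, here is a minimal sketch of the single-pass ungrib step (the download path is hypothetical; the point is that all of the downloaded GRIB files are linked at once):

            ./link_grib.csh /path/to/era-interim/*      # link the uv, sc, and surface files together
            ln -sf ungrib/Variable_Tables/Vtable.ERA-interim.pl Vtable
            ./ungrib.exe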
       
3) GFS Final Analysis
    Download location:    CISL RDA
        NCEP Operational Data (WRF Inputs): 1-degree FNLs
        ds083.2
        Data available every six hours (interval_seconds = 21600)
    Choose 6 HOURLY FILES for period
        Use the complete file list (these are now GRIB2 files, interestingly)
    WPS Processing
        Use a GRIB2-capable ungrib build to unpack the input with Vtable.GFS
        Run metgrid (see the sketch below)
    Run WRF with the following settings
        p_top_requested            = 1000
        num_metgrid_levels        = 27
        num_metgrid_soil_levels    = 4
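
    For reference, here is a minimal sketch of the ungrib step (the path and fnl_* file pattern are hypothetical). Because these files are GRIB2, ungrib must come from a WPS build with GRIB2 support; a NO_GRIB2 build will not read them:

            ./link_grib.csh /path/to/fnl/fnl_*          # link the GRIB2 FNL files
            ln -sf ungrib/Variable_Tables/Vtable.GFS Vtable
            ./ungrib.exe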
       
4) North American Regional Reanalysis
    Download location:    CISL RDA
        NCEP North American Regional Reanalysis
        ds608.0
        Data available every three hours (interval_seconds = 10800)
    Choose regrouped 3-hourly files for period
        Need three types of files
            a) 3D files
            b) flx files
            c) sfc files
        Also need the fixed fields file (Constant fields of NARR) rr-fixed.grb
    WPS Processing
        Use ungrib with Vtable.NARR
            Only need to do fixed-fields once
                Link to rr-fixed.grb
                start_date = '1979-11-08_00:00:00',
                end_date = '1979-11-08_00:00:00',
                Run ungrib
            Then link to all running data types (3d,flx,sfc) and run ungrib with different prefix
        Run metgrid with fixed file listed as constants in namelist.wps
            constants_name = './FF_NAME'
            Running data listed under fg_name as usual (see the sketch below)
    Run WRF with the following settings
        p_top_requested            = 10000
        num_metgrid_levels        = 30
        num_metgrid_soil_levels    = 4
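
    For reference, here is a minimal sketch of the namelist.wps fragments for the two ungrib passes and the metgrid constants entry (the prefixes FILE_FIX and NARR are my own placeholders; note that ungrib appends a date stamp to the prefix, so constants_name must point at the file pass 1 actually wrote):

            &share                                  ! pass 1: fixed fields only (link rr-fixed.grb)
             start_date = '1979-11-08_00:00:00',
             end_date   = '1979-11-08_00:00:00',
            /
            &ungrib
             prefix = 'FILE_FIX',
            /

            &ungrib                                 ! pass 2: restore the real dates; link the 3d, flx, and sfc files
             prefix = 'NARR',
            /

            &metgrid
             constants_name = './FILE_FIX:1979-11-08_00',
             fg_name        = 'NARR',
            /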
       
5) NASA MERRA Reanalysis
    Download location:    NASA MDISC Data Subsetter/Mirador
        MERRA Data Subset
        Data available every six hours (interval_seconds = 21600)
    Need to download two types of data (MERRA, and GLDAS for land)
        Download using command 'wget --content-disposition -i [filename]'
        Need two types of MERRA data
            a) 3D files     - inst6_3d_ana_Np
            b) SFC files    - tavg1_2d_slv_Nx
            Note: don't do a spatial subset, and download all variables; download NetCDF format
            Also need the constant fields file for surface data - const_2d_asm_Nx
                Note: the constants file is only available in HDF format, which the converter
                program will not accept. Use a program such as ncl_convert2nc to convert it to netCDF.
        Need GLDAS land data
            Download files from GLDAS_NOAH025SUBP_3H set
            Note: download grib format
    Pre-processing
        a) Fix the GLDAS data (remove missing values under terrain)
            Run the program readgldas_025 for each data file (I use a script)
            Specify input file name and output file name
        b) Ungrib the GLDAS data using ungrib and provided Vtable.GLDAS
        c) Move GLDAS_CONSTANTS_INTERMEDIATE file to WPS folder
            Add file to metgrid namelist section as a constants_name entry
        d) Compile MERRA_SFC.f90 program with location of SFC data set (and constants file)
            Run MERRA_SFC to convert surface data to WPS intermediate format
            Note: be careful; the input file name lengths are hardcoded... they need to be set to the appropriate values!
            Note: for some reason, variable names may not be capitalized in the MERRA subset
            data. You need to change the variable names for the Fortran program to work. Variables
            are still capitalized in the constants file!
        e) Compile MERRA_UPP.f90 program with location of 3D data set
            Run MERRA_UPP to convert the 3D data to WPS intermediate format
            Note: same caveats apply as with MERRA_SFC program
        f) Move both intermediate file types to WPS folder
        g) Replace the gribmap.txt and METGRID.TBL files in the metgrid folder with the new
            versions. Be sure to back up the originals!
        h) Run metgrid to compile GLDAS and MERRA data into met_em files for real.exe
        List all three sources (GLDAS from ungrib, and the MERRA SFC and UPP intermediate
        files) as input data to metgrid (see the sketch below)
    Run WRF with the following settings
        p_top_requested            = 5000
        num_metgrid_levels        = 32
        num_metgrid_soil_levels    = 4
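
    For reference, here is a minimal sketch of the &metgrid section once all of the intermediate files are in the WPS folder (the prefixes GLDAS, MERRA_SFC, and MERRA_UPP are my own placeholders for whatever prefixes ungrib and the two conversion programs actually wrote):

            &metgrid
             fg_name        = 'GLDAS', 'MERRA_SFC', 'MERRA_UPP',   ! land data plus both MERRA sets
             constants_name = './GLDAS_CONSTANTS_INTERMEDIATE',    ! constant surface fields from step c)
            /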

Friday, August 30, 2013

Connect to CU from iPhone, iPad, or iPod

Dear Mac Mobile Users,

I wanted to let you know about something I found useful: if you are using an iPhone, iPad, or iPod, IT now allows you to connect to CU.

See the following site:

http://www.colorado.edu/oit/services/network-internet-services/vpn/help/mobile-devices

Currently, Android devices are not supported. (Sorry.)

Click one of the "Apple Devices" links there. That will let you install the Junos Pulse software, which establishes a VPN connection into CU. Once the connection is established, you may also need a navigation/connection app. I suggest ServerAuditor:

https://serverauditor.com/

It is free and works pretty well.

I would be pleased to hear if anyone else has needed this or has better apps for connecting from such a mobile device.

Happy Computing!


Cheers,

--Paul Quelet

Remote Connections to Desktop Computers

Dear Lundquist Group,

Thanks to the help of Andrew Eppler, we are now able to connect to desktop computers remotely. Here are the brief instructions (as best I can reconstruct them; please try them and let me know if they work).

1. If off campus (not on the network), start the Network Connect software to establish a VPN connection. If needed, see:

http://www.colorado.edu/oit/services/network-internet-services/vpn/help/desktop-applications/network-connect

2. Start the Remote Desktop Connection software (this is for Windows; I am not sure what the Mac equivalent is).

3. In the Computer line, put in the Full Computer Name of your desktop computer.

You can find this in the Windows system information by using:

Windows Start menu --> Computer --> (Right Click) --> Properties --> (look for Full Computer Name)

Example (PTQ computer):

ATOC-RSCH-D-015.ad.colorado.edu

4. Click Connect

5. You will be prompted for a username and password

Username:

ad\your_identikey_username

Password:

identikey_login_password


Note 1: You must add the ad\ before your username for Active Directory.

Note 2: The screen of your desktop computer should come up on the computer you are working on. If you are near your desktop computer, you will see the screen lock, but you are still connected on the other computer.

Note 3: Your desktop computer must be powered on to connect from a remote location. You do not have to be logged in, but a connection cannot be established if it is not on.

Note 4: You can remote desktop into anyone's computer as long as you have their username and password. (This eliminates the need to walk down the hall to get to their computer, say if they are far away.) The best software I know of that lets more than one person edit on the same computer at the same time is VNC Viewer:

http://www.realvnc.com/

http://www.realvnc.com/download/viewer/


I hope that is helpful to many. Now I can work on my desktop computer while sitting at home and still do many of the things I would otherwise have to travel to campus to do.

Apparently, the issue had to do with the VPN and related settings around Duane, which had been changed many times. Hopefully this works for everyone. If problems come up in the future, we can request IT support.

Happy computing!!


Cheers,

--Paul Quelet



Tuesday, August 13, 2013

WRF Job Submission on Janus

It's a poorly kept secret that HPC jobs with smaller wall time requests tend to get through the queue quicker. As a result, if I have a WRF job that I expect will take about 16 hours to complete, I can probably get it done faster using four 4-hour jobs rather than one 16-hour job. If you keep scaling that process up to longer jobs... you can end up with a lot of jobs (I've run 100+ job submissions before). So I wrote a few scripts to automate the submission of multi-part WRF jobs that others may find useful.

Click here to download the Janus job submitter (tar file)

Assuming you follow the suggested WRF folder structure, you should extract the tar file into your base WRF directory (where your WRFV3.X and possibly WPSV3.X folders are located). Inside the resulting WRF/jobs folder, you will find the following:
  • long_nl - a folder that contains the namelist files for the long run. In here, you will find a template namelist file for the LES tutorial case. You should replace the namelist.temp file with one of your own that contains the settings you want for your job (you don't need to change the time settings though; that comes next)
  • gen_namelists - an interactive Fortran program that reads in the namelist.temp file and splits it into user-specified time increments. For example, if I have a six hour run that I split into two-hour increments, this program will create namelist.01, namelist.02, and namelist.03 files.
  • new_project.sh - a short script that creates the folder structure necessary to run the other scripts. Your job output and settings will be stored in these folders.
  • run_wrf_long.temp - a placeholder script. This file needs to be here, but you shouldn't need to edit it (unless you want to!)
  • start_wrf_long.sh - this script submits your split WRF jobs to the Moab scheduler. The necessary command line arguments are described at the top of the script.
So let's say I wanted to start a PBL comparison project and the first WRF run would test the MYJ configuration. I'd first create the project folders.

./new_project.sh PBL_TESTS

Then I'd customize the WRF namelist to my liking and place it inside the long_nl folder as namelist.temp. Then I'd run gen_namelists and specify how I want the run split temporally. After generating the run segment namelist files, I'd be ready to submit the job(s).

./start_wrf_long.sh 1 3 PBL_TESTS MYJ_TEST 4 2 janus-small

That command would submit 3 dependent jobs. It would create a run called MYJ_TEST inside of the PBL_TESTS project folders. The jobs would use 4 nodes (12 cores each, for a total of 48 cores) and each job would request 2 hours of wall time. Finally, the jobs would be submitted to the janus-small queue. There is also an optional argument to the script to set the allocation you want to charge the wall time to. If you don't specify one, your default allocation will be used.

Feel free to give it a try and let me know if anything is unclear or you encounter any problems. The script is designed to cancel subsequent jobs automatically if any one job fails, but I haven't tested that part of it on Janus yet. I hope you find it useful!
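
For the curious, the chaining mechanism can be sketched in a few lines of bash. This is a simplified, hypothetical illustration of the idea (the job script names are made up, and the depend=afterok syntax assumes Moab's msub as used on Janus); it is not the actual contents of start_wrf_long.sh:

    #!/bin/bash
    # Submit WRF segments so that each one starts only if the previous one exits cleanly.
    prev=""
    for i in 01 02 03; do
        if [ -z "$prev" ]; then
            jobid=$(msub run_wrf_${i}.sh)                              # first segment: no dependency
        else
            jobid=$(msub -l depend=afterok:${prev} run_wrf_${i}.sh)    # wait on the previous job's success
        fi
        prev=$(echo $jobid | tr -d '[:space:]')                        # msub output can include stray whitespace
    done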

Tuesday, July 16, 2013

 
Below you'll find Jiwan's instructions for compiling and running WRF on Janus.

Original author: Timothy Dwight Dunn (timothy.dunn@colorado.edu)
File created by: Jiwan Kumar Rana (jiwan.rana@colorado.edu)
Date created:    June 28, 2013

***********************************************************************
NOTE: THESE INSTRUCTIONS ARE FOR WRF 3.5 AND WPS 3.5


// HOW TO COMPILE AND RUN WRF ON JANUS//

1. Copy the contents of the file "dot_my.bash_profile" to your
      ".my.bash_profile" file. (A sketch of what these exports might
      look like appears after step 5.)
      If using a bash shell, copy the commands as they are.
      If using a different shell, change export -> setenv.
         Example: export MPI_CC=icc   (bash)
                  setenv MPI_CC icc   (csh)

2. Download, unzip, and untar the WRF source files on Janus.
      If you need to copy the files from a local computer to Janus, use
        SFTP, SCP, or any other such command.
      Unzip and untar using:
            gunzip <filename.tar.gz>
            tar -xvf <filename.tar>

3. Go to the WRFV3 directory and run the following commands:
       1. ./clean -a
       2. ./configure
       3. Choose Linux x86_64 i486 i586 i686, ifort compiler with icc (dmpar)
          { most likely option # 19 }
       4. Compile for nesting? Choose 1=basic
       5. ./compile -j 12 em_real >& compile.log
       6. View compile.log for any errors

4. Go to the WPS directory and run the following commands:
       1. ./configure
       2. Choose Linux x86_64, Intel compiler (dmpar_NO_GRIB2)
          { most likely option # 20 }
       3. ./compile >& compile.log
       4. View compile.log for any errors

5. Congratulations! You have successfully compiled WRF on Janus.
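
For reference, here is a minimal sketch of the kind of exports step 1 copies into .my.bash_profile (the MPI_CC line comes from the example above; the other variable names and the netCDF path are my assumptions, and the actual dot_my.bash_profile file is authoritative):

       # bash syntax; use setenv instead for csh
       export MPI_CC=icc               # from the step 1 example
       export CC=icc                   # assumed: Intel C compiler
       export FC=ifort                 # assumed: Intel Fortran compiler
       export NETCDF=/path/to/netcdf   # WRF's build system reads $NETCDF (path hypothetical)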

Welcome! This blog should become a useful portal and archive for maintaining group knowledge about our common resources, such as computing on Janus, travel and presentations, abstract reminders, etc. Pictures will come. Feel free to share interesting research results here, since this blog is not publicly searchable. Please also try to use descriptive labels for your posts so that we can search for "Janus", "CWEX", "TODS", "100S", "200S", etc. to find the most relevant posts for desired topics. Short posts on new and interesting articles and useful texts would also be great.