Beginning with the v3.9 release of the WPS, the metgrid.exe program is capable of reading native, unstructured-mesh output in netCDF format from the Model for Prediction Across Scales (MPAS; https://mpas-dev.github.io/). The metgrid.exe program can then horizontally interpolate the MPAS fields directly to any domain defined by the geogrid.exe program, producing output files that are usable by the WRF real.exe program in exactly the same way as metgrid output interpolated from intermediate files. In this way, output from MPAS may be used to provide initial and lateral boundary conditions for WRF. When running an MPAS simulation, an output stream must be set up to contain the minimum set of fields necessary to initialize a WRF simulation; the following output stream should be sufficient with the MPAS v5.x and later code.
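A sketch of such a stream definition in the MPAS streams.atmosphere file is given below; the stream name, output interval, and variable list shown here are an illustrative subset only, and the complete set of required fields should be taken from the WPS User's Guide for your MPAS version.

<stream name="wrf_init"
        type="output"
        filename_template="MPAS.$Y-$M-$D_$h.nc"
        output_interval="6:00:00" >

    <var name="xtime"/>
    <var name="zgrid"/>
    <var name="theta"/>
    <var name="rho"/>
    <var name="qv"/>
    <var name="uReconstructZonal"/>
    <var name="uReconstructMeridional"/>
    <var name="w"/>
    <var name="pressure"/>
    <var name="surface_pressure"/>
    <var name="relhum"/>
    <var name="skintemp"/>
    <!-- ... additional near-surface, soil, snow, and sea-ice fields ... -->
</stream>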
After MPAS has been run with a suitable output stream defined, a set of netCDF files containing fields on the native MPAS mesh will have been produced. Because these files do not contain fields describing the locations, geometry, and connectivity of the MPAS grid cells, this information must be provided to the metgrid program through a “static” file from the MPAS simulation. It is therefore necessary to specify MPAS netCDF files (prefixed with ‘mpas:’) in the &metgrid namelist record with both the constants_name and fg_name variables, e.g.,
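(a minimal sketch; ‘static.nc’ and the ‘MPAS’ prefix are the file names assumed in the example discussed next)

&metgrid
  constants_name = 'mpas:static.nc'
  fg_name        = 'mpas:MPAS'
/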
In the above example, the metgrid.exe program would first read mesh information from the MPAS ‘static.nc’ file and compute remapping weights from the MPAS mesh to the WRF domain defined by the geogrid.exe program; then, all time periods of the MPAS files with a prefix of ‘MPAS’ (and a suffix of YYYY-MM-DD_HH.nc) would be processed. The real.exe program can then be run as usual.
Data from intermediate files created by the ungrib.exe program can be combined with MPAS data by the metgrid program. This may be useful, e.g., to use SST, sea-ice, or land-surface fields from another source. An example of combining MPAS data with ERA-Interim soil fields from intermediate files (with the prefix ‘ERAI_SOIL’) is shown below.
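A sketch of the corresponding &metgrid record, assuming the ERA-Interim soil fields were written by ungrib.exe to intermediate files with the prefix ‘ERAI_SOIL’:

&metgrid
  constants_name = 'mpas:static.nc'
  fg_name        = 'mpas:MPAS', 'ERAI_SOIL'
/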
Because the MPAS ‘zgrid’ field does not change in time, it can be omitted from the MPAS periodic output stream; in this case, however, the ‘zgrid’ field must be placed in its own netCDF file that must also define the dimension ‘Time’ as a netCDF unlimited dimension. Then, this file (say, ‘zgrid.nc’) can be supplied to the metgrid program using the constants_name namelist variable, e.g.,
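(a sketch, reusing the ‘static.nc’ and ‘MPAS’ file names from the earlier examples)

&metgrid
  constants_name = 'mpas:static.nc', 'mpas:zgrid.nc'
  fg_name        = 'mpas:MPAS'
/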
Placing the ‘zgrid’ field in its own file can save considerable space when long MPAS simulations are run, or when the output stream used for WRF initial and boundary conditions is written at high temporal frequency. The Python script below may serve as an example of how to extract the ‘zgrid’ field to its own netCDF file.
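A minimal sketch of such a script, using the netCDF4-python module and assuming an MPAS output file named ‘history.nc’ in which ‘zgrid’ has the dimensions (nCells, nVertLevelsP1):

from netCDF4 import Dataset

fin = Dataset('history.nc')        # any MPAS file containing the 'zgrid' field (name assumed)
fout = Dataset('zgrid.nc', 'w')    # the file to be supplied through constants_name

# metgrid expects the file to define 'Time' as an unlimited dimension,
# even though 'zgrid' itself does not vary in time
fout.createDimension('Time', None)
fout.createDimension('nCells', fin.dimensions['nCells'].size)
fout.createDimension('nVertLevelsP1', fin.dimensions['nVertLevelsP1'].size)

# Copy the 'zgrid' field into the new file
zgrid_out = fout.createVariable('zgrid', 'f', ('nCells', 'nVertLevelsP1'))
zgrid_out[:] = fin.variables['zgrid'][:]

fout.close()
fin.close()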
It is worth noting that the use of native MPAS output with metgrid.exe has not been thoroughly tested for parallel (i.e., “dmpar”) builds of the WPS; it is therefore recommended to run metgrid.exe in serial when processing MPAS datasets.
Please run metgrid.exe on a single core (serially). Otherwise, wrf.exe may fail to produce valid wrfout files even though the dmpar metgrid.exe appears to complete successfully.
sbatch -N 1 -n 1 run_metgrid.sh
For large MPAS meshes
Also, for large MPAS meshes, it may be necessary to increase the values of two constants in the metgrid code that are used to statically allocate several data structures used in computing the remapping weights from the MPAS mesh to the WRF domain. These two constants, shown below, are located in the WPS/metgrid/src/remapper.F file.
! should be at least (earth circumference / minimum grid distance)
integer, parameter :: max_queue_length = 2700

! should be at least (nCells / 32)
integer, parameter :: max_dictionary_size = 82000
After changing the values of these constants, metgrid must be recompiled.
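A sketch of the rebuild step, assuming the WPS has already been configured:

cd WPS
./compile >& compile.log    # rebuilds metgrid.exe after remapper.F is modified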
SIZE MISMATCH: num_metgrid_levels --> 56 (the MPAS default)
d01 2024-05-01_00:00:00 Yes, this special data is acceptable to use: OUTPUT FROM METGRID V4.3
d01 2024-05-01_00:00:00 Input data is acceptable to use: met_em.d01.2024-05-01_00:00:00.nc
metgrid input_wrf.F first_date_input = 2024-05-01_00:00:00
metgrid input_wrf.F first_date_nml = 2024-05-01_00:00:00
d01 2024-05-01_00:00:00 input_wrf.F: SIZE MISMATCH: namelist num_metgrid_levels = 34
d01 2024-05-01_00:00:00 input_wrf.F: SIZE MISMATCH: input file BOTTOM-TOP_GRID_DIMENSION = 56
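In this example the met_em files carry 56 vertical levels, so the fix is to make num_metgrid_levels in the &domains record of namelist.input match (a sketch; other &domains settings omitted):

&domains
 num_metgrid_levels = 56,
 ...
/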
p_top_requested < grid%p_top
rsl.error.0020:25:p_top_requested < grid%p_top possible from data
rsl.out.0016:31: p_top_requested = 1000.00000
rsl.out.0016:32: allowable grid%p_top in data = 1281.68921
rsl.out.0016:35:p_top_requested < grid%p_top possible from data

-------------- FATAL CALLED ---------------
FATAL CALLED FROM FILE: <stdin> LINE: 1219
p_top_requested < grid%p_top possible from data
-------------------------------------------
How to fix this? The requested model top (p_top_requested = 1000 Pa) is above the highest level available in the data (grid%p_top = 1281.7 Pa), so p_top_requested must be raised; try setting p_top_requested = 5000. (left as homework).
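For example, in the &domains record of namelist.input (other settings omitted):

&domains
 p_top_requested = 5000.,
 ...
/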
==> rsl.out.0025 <==
d01 2024-05-01_00:00:00 Input data is acceptable to use:
Tile Strategy is not specified. Assuming 1D-Y
WRF TILE 1 IS 1 IE 20 JS 84 JE 100
WRF NUMBER OF TILES = 1
Flerchinger USEd in NEW version. Iterations= 10
Flerchinger USEd in NEW version. Iterations= 10

==> rsl.error.0028 <==
d01 2024-05-01_00:00:00 Input data is acceptable to use:
Tile Strategy is not specified. Assuming 1D-Y
WRF TILE 1 IS 61 IE 80 JS 84 JE 100
WRF NUMBER OF TILES = 1
Program received signal SIGSEGV: Segmentation fault - invalid memory reference.
Backtrace for this error:
#0 0x1488066c7b4f in ???
==> rsl.out.0000 <==
d01 2024-05-01_00:00:00 14241 points exceeded w_critical_cfl in domain d01 at time 2024-05-01_00:00:00 hours
d01 2024-05-01_00:00:00 Max W: 74 24 2 W: ******* w-cfl: Inf dETA: 0.01
Note: be aware of numerical stability; here the very coarse MPAS 480-km mesh is used, and the CFL violations above indicate that the run is unstable.
Successful run
$ ls wrf*
wrfbdy_d01  wrf.exe  wrfinput_d01  wrfout_d01_2024-05-01_00:00:00  wrfrst_d01_2024-05-01_06:00:0
==> rsl. <==
Timing for Writing restart for domain 1: 14.65449 elapsed seconds
d01 2024-05-01_06:00:00 wrf: SUCCESS COMPLETE WRF