Enable parallel remapping of static fields with arbitrary graph partition files; special CVT partition files are no longer required.
Run the init_atmosphere_model with only one MPI task to create static.nc.
Note that earlier releases required this static-interpolation step to be run serially with the initialization core;
it can now be run in parallel, but keep memory usage in mind: each MPI task reads the full geographical datasets rather than only the partition it actually needs.
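For illustration only (mesh name, task count, and paths are assumptions based on the x1.40962 mesh mentioned later in this post), a parallel static-field run with an ordinary METIS partition file might look like:

```bash
# Generate a 6-way graph partition for the mesh (any METIS graph partition file
# works now; a special CVT partition file is no longer required).
gpmetis x1.40962.graph.info 6          # writes x1.40962.graph.info.part.6

# In namelist.init_atmosphere, &decomposition should point at the partition prefix:
#   config_block_decomp_file_prefix = 'x1.40962.graph.info.part.'

# Run the static interpolation with 6 MPI tasks.
mpirun -np 6 ./init_atmosphere_model
```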
Reset the default for the lower air-temperature extrapolation (config_extrap_airtemp) from 'linear' to 'lapse-rate' in the namelist. This applies to initialization and to lateral boundary condition generation for MPAS-A.
Set the condition for the lower extrapolation of the horizontal velocity so that it returns the value at the lowest analysis level, instead of a linear extrapolation, when the requested level lies below the lowest analysis level.
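A hedged namelist sketch of the new default; the record name &interpolation_control is an assumption, so check the default namelist.init_atmosphere shipped with the release:

```
&interpolation_control
    config_extrap_airtemp = 'lapse-rate'   ! was 'linear' in earlier releases
/
```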
Create a new init case (13) for creating 3-d CAM-MPAS grids.
Changes to the physics include:
Update the Noah land surface scheme to the WRF 4.5 release.
Update the MM5 surface layer scheme to the WRF 4.5 release.
Implement the CCPP-compliant versions of:
the revised MM5 surface layer scheme;
the parameterization of the gravity-wave drag over orography;
the YSU Planetary Boundary Layer scheme;
the scale-aware nTiedtke parameterization of convection; and
the WSM6 cloud microphysics parameterization.
Correct the initialization of the maximum snow albedo over sea ice points (now set to 0.75 instead of 0).
Fix the option that defines the surface albedo over sea ice points. The default option is now set to zero and the default value for the surface albedo over sea ice points is set to 0.65.
In the dynamics, rework the computation of the advective tendency of potential temperature needed as forcing in the nTiedtke and Grell-Freitas parameterizations of deep convection. The advective tendency of potential temperature is now computed the same way for the nTiedtke and Grell-Freitas convection schemes.
IMPORTANT NOTE: The updated physics schemes require new look-up tables, in particular, for the Noah land-surface model. The checkout_data_files.sh script that is run at build time should correctly update these tables, but tables in other run directories will need to be manually updated.
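A minimal sketch of refreshing the tables in an existing run directory, assuming $MPAS_SRC points at an MPAS-Model source tree in which checkout_data_files.sh has already fetched the updated tables (the path below is the conventional location of the tables in the source tree; adjust to your build):

```bash
# Re-link the physics look-up tables (e.g. the Noah LSM *.TBL files)
# from the source tree into an existing run directory.
cd /path/to/existing/run/directory
ln -sf $MPAS_SRC/src/core_atmosphere/physics/physics_wrf/files/* .
```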
Include changes to the initialization and to MPAS-A such that this release can be directly used in CESM/CAM.
Enable parallel remapping of static fields
```bash
##-- For spack init, (it takes longer startup time !!)
. /home/wpsze/spack/share/spack/setup-env.sh
source /home/wpsze/MPAS-A/intel/mpas_env_intel.sh

#ln -sf /home/wpsze/MPAS-A/intel/mpasv821/MPASv821/init_atmosphere_model
ln -sf /home/wpsze/MPAS-A/intel/mpasv822/MPASv822/init_atmosphere_model
```
```
----------------------------------------------------------------------
 Beginning MPAS-init_atmosphere Output Log File for task 0 of 6
    Opened at 2025/03/05 15:54:12
----------------------------------------------------------------------

 MPAS Init-Atmosphere Version 8.2.2

 Output from 'git describe --dirty': unknown

 Compile-time options:
    Build target: gfortran
    OpenMP support: no
    OpenACC support: no
    Default real precision: single
    Compiler flags: debug
    I/O layer: PIO 2.x

 Run-time settings:
    MPI task count: 6

 Reading namelist from file namelist.init_atmosphere
 *** Encountered an issue while attempting to read namelist record physics
     The following values will be used for variables in this record:
         config_tsk_seaice_threshold = 100.000
 *** Encountered an issue while attempting to read namelist record io
     The following values will be used for variables in this record:
         config_pio_num_iotasks = 0
         config_pio_stride = 1

 ----- I/O task configuration: -----
    I/O task count  = 6
    I/O task stride = 1

 Initializing MPAS_streamInfo from file streams.init_atmosphere
 Reading streams configuration from file streams.init_atmosphere
 Found mesh stream with filename template static.nc
 Using io_type Parallel-NetCDF (CDF-5, large variable support) for mesh stream
 ** Attempting to bootstrap MPAS framework using stream: input
 Bootstrapping framework with mesh fields from input file 'static.nc'
```
log_error
With MPAS v8.1.0/8.2.0:

```
free(): invalid next size (normal)

Running init_atmosphere in directory:
At line 183 of file mpas_block_decomp.F
Fortran runtime error: Index '30210' of dimension 1 of array 'global_list' above upper bound of 30209

At line 183 of file mpas_block_decomp.F
Fortran runtime error: Index '40962' of dimension 1 of array 'global_list' above upper bound of 40961
```
This is because with clobber_mode="never_modify", the output file cannot be overwritten. Please delete the file 'x1.40962.init.nc' from your previous run and rerun your case.
I am not sure yet what caused this issue. Please rerun this case with the fix of the 1st issue and let me know if you still get the same error.
Your suggestion fixed the issue. I have also used the "overwrite" option, which works as well.
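For reference, a minimal sketch of the relevant stream definition in streams.init_atmosphere; the clobber_mode attribute is the piece that matters here, and the filename template simply reuses the x1.40962 name from the error above:

```xml
<immutable_stream name="output"
                  type="output"
                  filename_template="x1.40962.init.nc"
                  clobber_mode="overwrite"
                  output_interval="initial_only" />
```

Alternatively, deleting the existing x1.40962.init.nc before rerunning has the same effect, as noted above.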
nSoilComps in static.nc
```
ERROR: At least one fields to be read from the 'input' stream is dimensioned
ERROR: by 'nSoilComps', but the 'nSoilComps' dimension is not defined
ERROR: in the file static.nc
CRITICAL ERROR: Please check the input file(s) to be read by the 'input' input stream.
```
src/core_init_atmosphere/Registry.xml
```xml
        description="The number of atmospheric levels, always one more than the number of layers"/>
<dim name="nSoilComps" definition="8"
     description="The number of soil textures as needed as input in the NOAH-MP land surface model"/>

<!-- SOIL COMPOSITION fields needed for the NOAH-MP land surface scheme -->
<var name="soilcomp" type="real" dimensions="nSoilComps nCells" units="percent"
     description="soil composition needed as input in the NOAH-MP land surface model"/>
```
src/core_atmosphere/physics/Registry_noahmp.xml
```xml
<dim name="nSoilComps" definition="8"
     description="The number of soil textures as needed as input in the NOAH-MP land surface model"/>

<var name="soilcomp" type="real" dimensions="nSoilComps nCells" units="unitless"
     description="soil composition needed as input in the NOAH-MP land surface model"
```
In ./src/core_init_atmosphere, added the initialization of the static variables soilcomp, soilcl1, soilcl2, soilcl3, and soilcl4 needed to run the Noah-MP land surface scheme.
The static.nc file changes once the Noah-MP fields are included (v8.2.2 vs. v8.2.0), so static.nc must be regenerated.
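A quick, hedged check (assuming the netCDF utilities are available) of whether an existing static.nc already carries the Noah-MP soil-composition fields:

```bash
# List the header only and look for the nSoilComps dimension and the soilcomp/soilcl* variables.
ncdump -h static.nc | grep -iE 'nSoilComps|soilcomp|soilcl'
```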
```
----------------------------------------------------------------------
 Beginning MPAS-init_atmosphere Output Log File for task 0 of 6
    Opened at 2025/03/06 15:23:51
----------------------------------------------------------------------

 MPAS Init-Atmosphere Version 8.2.2

 Output from 'git describe --dirty': unknown

 Compile-time options:
    Build target: gfortran
    OpenMP support: no
    OpenACC support: no
    Default real precision: single
    Compiler flags: debug
    I/O layer: PIO 2.x

 Run-time settings:
    MPI task count: 6

 Reading namelist from file namelist.init_atmosphere
 *** Encountered an issue while attempting to read namelist record physics
     The following values will be used for variables in this record:
         config_tsk_seaice_threshold = 100.000
 *** Encountered an issue while attempting to read namelist record io
     The following values will be used for variables in this record:
         config_pio_num_iotasks = 0
         config_pio_stride = 1

 ----- I/O task configuration: -----
    I/O task count  = 6
    I/O task stride = 1

 Initializing MPAS_streamInfo from file streams.init_atmosphere
 Reading streams configuration from file streams.init_atmosphere
 Found mesh stream with filename template static.nc
 Using io_type Parallel-NetCDF (CDF-5, large variable support) for mesh stream
 ** Attempting to bootstrap MPAS framework using stream: input
 Bootstrapping framework with mesh fields from input file 'static.nc'
```
```
----------------------------------------------------------------------
 Beginning MPAS-init_atmosphere Output Log File for task 0 of 6
    Opened at 2025/03/06 17:08:04
----------------------------------------------------------------------

 MPAS Init-Atmosphere Version 8.2.2

 Output from 'git describe --dirty': unknown

 Compile-time options:
    Build target: intel-mpi
    OpenMP support: no
    OpenACC support: no
    Default real precision: single
    Compiler flags: optimize
    I/O layer: PIO 2.x

 Run-time settings:
    MPI task count: 6

 Reading namelist from file namelist.init_atmosphere
 *** Encountered an issue while attempting to read namelist record physics
     The following values will be used for variables in this record:
         config_tsk_seaice_threshold = 100.000
 *** Encountered an issue while attempting to read namelist record io
     The following values will be used for variables in this record:
 ...
 number of seaice cells converted to land cells 2 = 210

 ********************************************************
    Finished running the init_atmosphere core
 ********************************************************

 Timer information:
    Globals are computed across all threads and processors

 Columns:
    total time: Global max of accumulated time spent in timer
    calls: Total number of times this timer was started / stopped.
    min: Global min of time spent in a single start / stop
    max: Global max of time spent in a single start / stop
    avg: Global max of average time spent in a single start / stop
    pct_tot: Percent of the timer at level 1
    pct_par: Percent of the parent timer (one level up)
    par_eff: Parallel efficiency, global average total time / global max total time

 timer_name          total     calls   min       max       avg       pct_tot   pct_par   par_eff
 1 total time        9.53601       1   9.53588   9.53601   9.53594    100.00      0.00      1.00
 2  initialize       0.85724       1   0.85061   0.85724   0.85206      8.99      8.99      0.99
```
An aerosol climatology file QNWFA_QNIFA_SIGMA_MONTHLY.dat
MPAS Version 8.2.0
The aerosol-aware Thompson microphysics (as in WRF v4.1.4) is available by setting config_microp_scheme = 'mp_thompson_aerosols' in the &physics namelist group.
An aerosol climatology file (QNWFA_QNIFA_SIGMA_MONTHLY.dat) is used when running the init_atmosphere_model program to produce initial and lateral boundary conditions for nifa and nwfa.
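A minimal namelist sketch; only config_microp_scheme and the climatology file name come from the notes above, and placing the file in the run directory is an assumption about the usual workflow:

```
&physics
    config_microp_scheme = 'mp_thompson_aerosols'
/
```

The QNWFA_QNIFA_SIGMA_MONTHLY.dat climatology is then expected to be readable from the directory in which init_atmosphere_model is run, for example via a symbolic link, so that the nifa and nwfa initial and lateral boundary conditions can be produced.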