I am not aware of anything specific regarding setting things up for compiling WRF. If you're able to compile successfully, I don't believe the oneAPI installation would be incorrect. Have you verified that you can run your WRF simulation to completion when NOT using oneAPI? I doubt this will help, but I have some notes I've written up for myself for installing Intel oneAPI. I'll paste them here and you can see if there is anything drastically different. These notes are somewhat specific to compiling on an Amazon Web Services cloud instance (just FYI).
For HPC computing, you must install both of the following toolkits, in the order presented:
1. oneAPI Base Toolkit
2. oneAPI HPC Toolkit
First, go to the oneAPI Base Toolkit page and follow the prompts to select the correct version for your system.
1. Select operating system: Linux
2. Select distribution: web and local
3. Select installer: online
4. Follow the "Command Line Download" instructions (enter the 'wget' command to obtain the code, and then the 'sudo bash' command to initiate the installation)
Follow the prompts to continue through the installation (I chose not to install GPU or Eclipse support). The default install location is /opt/intel/oneapi.
After you exit, issue the following commands:
Code:
sudo yum update
sudo yum -y install cmake pkgconfig
sudo yum groupinstall "Development Tools"
which cmake pkg-config make gcc g++

You should get:
/usr/bin/cmake
/usr/bin/pkg-config
/usr/bin/make
/usr/bin/gcc
/usr/bin/g++
Now install the Intel oneAPI HPC Toolkit (which includes the compilers). From the website, choose the same prompts you chose in step 1, and then, from your local machine, issue the given 'wget' and 'sudo bash' commands to obtain the code and initiate installation.
Follow the prompts to continue through the installation (I chose not to install GPU or Eclipse support). The default install location is /opt/intel/oneapi.
Add something similar to the following to your .bashrc. *Note: you will need to modify the paths for your specific environment. *Note: this includes setting a path variable "DIR", which is used just for the sake of simplifying the installation steps below.
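As a rough sketch only (the compiler choices, the oneAPI install path, and the use of $HOME/libs are assumptions you will need to adapt to your own system), the .bashrc additions could look something like this:
Code:
# Illustrative example only -- adjust paths and compiler choices for your environment
source /opt/intel/oneapi/setvars.sh          # puts the Intel compilers/MPI in your environment
export DIR=$HOME/libs                        # where the libraries below will be installed
export CC=icc
export CXX=icpc
export FC=ifort
export F77=ifort
export PATH=$DIR/bin:$PATH
export LD_LIBRARY_PATH=$DIR/lib:$DIR/grib2/lib:$LD_LIBRARY_PATH
export NETCDF=$DIR/netcdf
export JASPERLIB=$DIR/grib2/lib
export JASPERINC=$DIR/grib2/include

The NETCDF variable is what WRF's configure script looks for later, and JASPERLIB/JASPERINC are used for grib2 support.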
Make a directory in which to install all the libraries.
Code:
mkdir libs
mpich
Code:
wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/mpich-3.0.4.tar.gz
tar -xf mpich-3.0.4.tar.gz
cd mpich-3.0.4
./configure --prefix=$DIR
make 2>&1
make install
cd ..
rm -rf mpich*
zlib
Code:
wget https://www2.mmm.ucar.edu/people/duda/files/mpas/sources/zlib-1.2.11.tar.gz
tar xzvf zlib-1.2.11.tar.gz
cd zlib-1.2.11
./configure --prefix=$DIR/grib2
make -j 4
make install
cd ..
rm -rf zlib*
HDF5
Code:
wget https://www2.mmm.ucar.edu/people/duda/files/mpas/sources/hdf5-1.10.5.tar.bz2
tar -xf hdf5-1.10.5.tar.bz2
cd hdf5-1.10.5
./configure --prefix=$DIR --with-zlib=$DIR/grib2 --enable-fortran --enable-shared
make -j 4
make install
cd ..
rm -rf hdf5*
NetCDF-c
Code:
wget https://github.com/Unidata/netcdf-c/archive/v4.7.2.tar.gz
tar -xf v4.7.2.tar.gz
cd netcdf-c-4.7.2
./configure --enable-shared --enable-netcdf4 --disable-filter-testing --disable-dap --prefix=$DIR/netcdf
make -j 4
make install
cd ..
rm -rf v4.7.2.tar.gz netcdf-c*
netcdf-fortran
Code:
export LIBS="-lnetcdf -lhdf5_hl -lhdf5 -lz"
wget https://github.com/Unidata/netcdf-fortran/archive/v4.5.2.tar.gz
tar -xf v4.5.2.tar.gz
cd netcdf-fortran-4.5.2
./configure --enable-shared --prefix=$DIR/netcdf
make -j 4
make install
cd ..
rm -rf netcdf-fortran* v4.5.2.tar.gz
libpng
Code:
wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/libpng-1.2.50.tar.gz
tar xzvf libpng-1.2.50.tar.gz
cd libpng-1.2.50
./configure --prefix=$DIR/grib2
make -j 4
make install
cd ..
rm -rf libpng*
jasper
Code:
wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/jasper-1.900.1.tar.gz
tar xzvf jasper-1.900.1.tar.gz
cd jasper-1.900.1
./configure --prefix=$DIR/grib2
make -j 4
make install
cd ..
rm -rf jasper*
WRF
Code:
git clone --recurse-submodules https://github.com/wrf-model/WRF.git
cd WRF
./configure            # choose options 15 and 1
./compile em_real -j 4 >& log.compile
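Once the compile finishes, a quick way to confirm it succeeded is to check that the executables were created (for an em_real build you should see wrf.exe, real.exe, ndown.exe, and tc.exe):
Code:
ls -ls main/*.exe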
Update: Issue was resolved with a simple git pull. If you're also running into this issue, you may be running an older version of WRF. In my case I'm using serial, so it was option 76/78 (INTEL ifort/icx).
These are some of the exports I use for my icx build in dmpar. Hope it helps.
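Purely as an illustrative sketch (the variable names and values here are assumptions to adapt, and the I_MPI_* settings apply only if you are using Intel MPI), an icx/ifx environment for a dmpar build might look like:
Code:
# Illustrative only -- adapt to your own oneAPI/MPI installation
export CC=icx
export CXX=icpx
export FC=ifx
export F77=ifx
# Only if using Intel MPI: make its wrappers call the LLVM compilers
export I_MPI_CC=icx
export I_MPI_CXX=icpx
export I_MPI_F90=ifx
export I_MPI_F77=ifx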
That is a good question. We have simply found that the advantage of running WPS in parallel isn't that great (or much faster) compared to a simple serial run, as the WPS programs run so quickly anyway. We do find that it is necessary if you have very large domains (thousands by thousands of grid cells). That being said, if you are finding that it speeds up the process for your runs, then it should be perfectly fine to use parallel processing. As I'm sure you already know, geogrid and metgrid are the only programs that can be run in parallel; ungrib must still be run serially.
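For reference, assuming WPS was built with a dmpar option, running the parallel-capable programs is just a matter of launching them with mpirun (the processor count of 4 below is only an example):
Code:
mpirun -np 4 ./geogrid.exe
./ungrib.exe            # ungrib must be run serially
mpirun -np 4 ./metgrid.exe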
Has anyone tried to compile the WRF system and associated libraries with the oneAPI Intel compilers? There is an explanation here on how to compile with the classic Intel compilers, but soon (2023-2024) the oneAPI C/C++ compilers (icx, icpx) will succeed the classic ones (icc, icpc).
The linked example should be a good starting point for using the oneAPI compilers, especially as they share similar flags. You might be able to just replace icc -> icx and ifort -> ifx.
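If you want to try that substitution mechanically, one minimal (hypothetical) approach is to edit the generated configure.wrf after running ./configure, for example:
Code:
# Back up first; a blanket substitution like this is only a starting point
cp configure.wrf configure.wrf.orig
sed -i -e 's/\bicc\b/icx/g' -e 's/\bifort\b/ifx/g' configure.wrf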
Stanzas for the oneAPI compilers already exist in the latest WRF/WPS releases, so once the dependencies are built you should be able to compile WRF.
I previously built WRF on our old Linux cluster using the PGI compiler. We've recently upgraded our cluster and I successfully built WRF v4.6.1 on the new system using the Intel compiler with option 78 (dmpar) – INTEL (ifx/icx): oneAPI LLVM.
These differences are randomly distributed and the values are small over most areas. We did see similar model behavior when running the same case using different numbers of processors and on different machines. From this perspective, I would say that it is acceptable.
We used to have a website that provided test data, NCL scripts, and verification of model output. Unfortunately that page seems to be gone because we are reconstructing the WRF website and updating the documentation. Sorry for the inconvenience this causes.