Frequently Asked Questions

Note

This section is still heavily under construction as we aggregate questions and answers into this document.

Running environment

What are the system requirements for WRF-GC?

A Linux-based system, either a local cluster or a cloud service. Software requirements: a Fortran/C compiler (Intel, or GNU GCC + GFortran), an MPI library, HDF5, and netCDF (with netCDF-Fortran).

Recommended system specifications are at least 6 CPU cores and at least 32 GB of RAM. Storage requirements depend on the resolution of your run, but plan for roughly 2 GB per output file. Input data for GEOS-Chem and WRF meteorology are also necessary and may total hundreds of GB.

Can I run WRF-GC on my own computer?

Generally we recommend running on a computing cluster. If this is not possible, cloud services such as AWS are also an option.

If you want to test WRF-GC locally, we recommend a Linux system with at least 8 recent CPU cores (for consumer CPUs: Intel Skylake / 6th-generation Core or newer; AMD Ryzen Zen 2 / 3xxx-series or newer) and 64 GB of memory. However, running on your own computer may be extremely slow and is only recommended for debugging.

I want to build a cluster for my lab to run WRF-GC. What do you recommend?

Hardware is ever-changing, but both recent AMD EPYC and Intel Xeon processors will work. If you are using an AMD system, you may want to use the GNU compilers for better performance. For storage, more capacity is generally better, because WRF-GC outputs are fairly large: they include the 220+ chemical species and all meteorological fields computed by WRF and GEOS-Chem.

Does WRF-GC require ESMF?

No.

Building

Compiling WRF-GC takes a long time and is stuck on module_first_rk_step_part2.F90

This step usually takes approximately 40 minutes. There is generally no cause for alarm unless compiling WRF-GC takes longer than 2-4 hours, depending on your machine.

Input

What are the typical input files for running WRF-GC?

Configuration:

  • namelist.input - the configuration file (namelist) for WRF-GC. Most options are configured here.

  • HEMCO_Config.rc - the HEMCO configuration file, for chemistry emissions.

  • (GEOS-Chem 12 and 13) input.geos - only used to specify some input file paths for Fast-JX.

  • (GEOS-Chem 14 and above) geoschem_config.yml - only used to specify some input file paths for Fast-JX and, optionally, to enable the complex SOA option.

Input data:

  • met_em.* files for meteorological initial and boundary conditions: generated by the WRF Preprocessing System (WPS) from analysis or reanalysis fields (e.g., GFS, FNL).

  • wrfinput_d01 and wrfbdy_d01: Initial and boundary condition files for WRF-GC (both meteorology and chemistry). Generated by real.exe, then chemical initial/boundary conditions are added to these files using a tool like mozbc. See How do I generate chemical initial/boundary conditions?

  • Emission inputs at the paths specified within HEMCO_Config.rc.

  • Essential chemistry input files specified within input.geos or geoschem_config.yml.

How do I generate chemical initial/boundary conditions?

See Chemical Initial/Boundary Conditions (IC/BC).

Can I nudge the input meteorology data?

Yes. The process is similar to WRF or WRF-Chem. Refer to Nudging meteorology.
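
For orientation, grid (analysis) nudging in WRF is controlled by the &fdda section of namelist.input. The sketch below shows the typical entries with purely illustrative interval and end-hour values; real.exe must be run with grid_fdda enabled so that it produces the wrffdda_d01 file read during nudging. See the WRF documentation and the Nudging meteorology section for the authoritative settings.

&fdda
 grid_fdda           = 1,
 gfdda_inname        = "wrffdda_d<domain>",
 gfdda_interval_m    = 360,
 gfdda_end_h         = 120,
 io_form_gfdda       = 2,
/

Here grid_fdda = 1 turns on analysis nudging for domain 1, gfdda_interval_m is the interval (in minutes) of the nudging input data, gfdda_end_h stops nudging after that many hours, and io_form_gfdda = 2 selects netCDF input.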

Can I use other emission inventories?

Yes. Edit the HEMCO configuration file (HEMCO_Config.rc), just as in GEOS-Chem. Generally, all emission inventories supported by GEOS-Chem/HEMCO are supported in WRF-GC, because the species list is the same.

Emission inventories prepared for WRF-Chem (e.g., by prep_chem_sources) need to be ported to HEMCO to work with WRF-GC. We do not support WRF-Chem-style emissions (i.e., “auxinput05”) within WRF-GC.
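
As a purely schematic sketch (the entry name, file path, variable name, and numeric fields below are placeholders; follow the column layout of the existing entries in your HEMCO_Config.rc), adding a custom anthropogenic NO inventory in the BASE EMISSIONS section would look something like:

0 MY_ANTHRO_NO /path/to/my_inventory/NO_emissions_$YYYY.nc NO_anthro 2010-2019/1-12/1/0 C xy kg/m2/s NO - 1 5

The columns are, in order: extension number, entry name, source file, source variable, time range, cycling flag, source dimensions, units, GEOS-Chem species, scale-factor IDs, category, and hierarchy. See the HEMCO documentation for the full specification.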

Running and configuration

Can I use WRF-GC to run Hg/CH4/CO2 specialty simulations?

Generally not out of the box. WRF-GC only supports chemical mechanisms for full-chemistry simulations. Xu et al. (2022) developed WRF-GC-Hg v1.0 for atmospheric Hg; refer to their GMD paper for more information on the development of the Hg simulation in WRF-GC.

Can I use WRF-GC with tropchem (troposphere-only chemistry)?

Generally not out of the box. By default, the UCX stratospheric chemistry mechanism runs above the tropopause, into the stratosphere. Note that the WRF model top is generally lower than that of GEOS-Chem (which extends to 0.1 hPa). We recommend leaving this at the default configuration; note that recent versions of GEOS-Chem have also retired tropchem.

Can I customize WRF-GC’s vertical grid?

Yes. See the WRF User’s Guide on vertical grid configuration. Specifically, p_top_requested in namelist.input sets the target model top (note that WPS must provide meteorology up to this pressure level), and e_vert specifies the number of vertical levels requested.

If only e_vert and p_top_requested are specified, real.exe will generate the vertical levels for you and report them in its output. You can also specify the grid entirely manually using eta_levels.
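
As a minimal sketch (the values here are illustrative only), the relevant entries in the &domains section of namelist.input are:

&domains
 e_vert              = 50,
 p_top_requested     = 5000,
/

p_top_requested is given in Pa (5000 Pa = 50 hPa), and e_vert is the number of vertical levels; real.exe computes the level distribution unless you list it explicitly via eta_levels.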

Where are the configuration files for WRF-GC?

In the run directory. Generally, WRFV3/run or WRF/run. The configuration files you want to use are namelist.input (WRF namelist - configures both WRF and chemistry options), and HEMCO_Config.rc (to configure HEMCO emissions).

input.geos (or geoschem_config.yml) also holds the paths to some essential chemistry input files. You generally only need to edit the paths in this file.

Note

While input.geos (or geoschem_config.yml), HISTORY.rc, and HEMCO_Diagn.rc files, familiar to GEOS-Chem users, are also in the WRF-GC run directory, they should generally not be modified. To control WRF-GC, use the WRF namelist namelist.input.

Warning

Be careful to back up your configuration files. Every WRF-GC recompile will reset the namelist and configuration files.

Can I run the model in multiple segmented runs?

Yes. WRF generates restart files at the interval set by restart_interval in the namelist.
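
A minimal sketch of the relevant &time_control entries (the interval value is illustrative):

&time_control
 restart            = .false.,
 restart_interval   = 1440,
/

restart_interval is in minutes (1440 writes a wrfrst_d01_* file once per simulation day). To continue a segmented run, set restart = .true. and set the run start time to the timestamp of the restart file you want to resume from.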

What do we do about WRF parameterizations (e.g., cumulus) at higher resolution runs?

This is a research question, but the WRF-GC paper Lin et al., 2020 includes some guidance:

The WRF-GC state conversion module currently supports convective mass flux calculations using the new Tiedtke scheme (Tiedtke, 1989; Zhang et al., 2011; Zhang and Wang, 2017) and the Zhang–McFarlane scheme (Zhang and McFarlane, 1995) (Table 1), because these two cumulus parameterization schemes are more physically compatible with the convective transport algorithm currently in GEOS-Chem. In addition, the users should consider the horizontal resolution of the model when choosing which cumulus parameterization to use. The new Tiedtke scheme and the Zhang–McFarlane schemes are generally recommended for use in simulations at horizontal resolutions larger than 10 km (Skamarock et al., 2008; Arakawa and Jung, 2011). At horizontal resolutions between 2 and 10 km, the so-called “convective grey zone” (Jeworrek et al., 2019), the use of the Grell–Freitas scheme is recommended for the WRF model (Grell and Freitas, 2014), as it allows subsidence to spread to neighboring columns; this option will be implemented in a future WRF-GC version. At horizontal resolutions finer than 2 km, it is assumed that convections are resolved and cumulus parameterizations should not be used (Grell and Freitas, 2014; Jeworrek et al., 2019). The scale dependence of cumulus parameterizations and their impacts on convective mixing of chemical species are an active area of research, which we will explore in the future using WRF-GC.

Output

How can I configure output?

Use history_interval in WRF’s namelist.input.
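
For example (values are illustrative), in the &time_control section of namelist.input:

&time_control
 history_interval   = 60,
 frames_per_outfile = 1,
/

history_interval is in minutes, and frames_per_outfile controls how many output times are written into each wrfout_* file (1 gives one file per output time).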

What is the output format? What are some tools to process them?

The output is written to wrfout_* files in the netCDF format used by WRF and WRF-Chem. As such, tools that process WRF and WRF-Chem outputs are generally useful for WRF-GC, with some species-name modifications.

The outputs are so large! Can I compress them?

You can use standard netCDF tools to extract only the variables you want after output. If you want to customize which variables WRF-GC writes in the first place, see Can I select what variables WRF-GC outputs?

Can I select what variables WRF-GC outputs?

Yes. The WRF documentation describes the runtime I/O field selection mechanism that makes this possible. Add a text file (e.g., outputlist.txt) to the run directory containing the desired options, e.g.,

-:h:0:sala,salc

This removes sala and salc from the output (wrfout_*) files, provided namelist.input’s &time_control section contains the following:

iofields_filename = 'outputlist.txt',

If you want to remove nearly all of the species and keep only the ones you need, you can start from the full species list in Registry/registry.chem (search for chem_opt==233) and use that comma-separated list in the removal line.
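
For illustration only (the species names below are examples; take the actual list from Registry/registry.chem), a removal list can span multiple lines:

-:h:0:no,no2,o3,co
-:h:0:so2,so4,nh4,nit

Each line removes ("-") the listed variables from history stream 0; conversely, a leading "+" adds variables to the stream.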

Can I output GEOS-Chem diagnostics?

Support for this is limited at this time. Generally, only very specific diagnostics, such as wet deposition loss rates, are available. See Additional diagnostics for detailed descriptions and guidance on how to manually output anything that is computed within GEOS-Chem / HEMCO.

Planeflight diagnostics are not available at this time but may be developed in the future.

Advanced

Does WRF-GC support MPI or OpenMP parallelization?

At present, only MPI. OpenMP routines were removed during the development of WRF-GC.

Which MPI does WRF-GC support?

MVAPICH2 was used for development, but OpenMPI and Intel MPI should also work. When configuring WRF-GC, you are asked to specify the MPI library through the ESMF_COMM environment variable; openmpi, mvapich2, and intelmpi are supported.

If you compile with a different MPI implementation, you can try editing these options in WRF/chem/Makefile with the correct linking flags for your MPI:

# Specify MPI-specific options (hplin, 6/23/19)
ifeq ($(ESMF_COMM),openmpi)
        MPI_OPT := $(shell mpif90 --showme:link)
        MPI_OPT += $(shell mpicxx --showme:link)
else ifeq ($(ESMF_COMM),mvapich2)
        MPI_OPT := -lmpich -lmpichf90
else ifeq ($(ESMF_COMM),intelmpi)
        MPI_OPT := -lmpi
else
        $(error Unknown MPI communicator ESMF_COMM, valid are openmpi or mvapich2)
endif

Does WRF-GC support parallel I/O by WRF?

Yes, but HEMCO does not use parallel I/O. You do not need PNETCDF to run WRF-GC normally.