Domain, meteorology, and configuration
This section discusses:
Setting up the domain using WPS's GEOGRID utility
Processing the meteorological initial and boundary conditions by downloading the data and running WPS's UNGRIB and METGRID utilities
Basic configuration of WRF-GC using the namelist.input, input.geos / geoschem_config.yml, and HEMCO configuration files
Running real.exe to prepare input before adding in chemical initial/boundary conditions.
The WRF Preprocessing System (WPS) is best learned from the WRF User's Guide, as it is not specific to chemistry.
An overview of the workflow of the WRF Pre-Processor system (by Xu Feng):
Setting up the domain using GEOGRID
Configuration of WPS is done in the namelist.wps file under the WPS directory.
The first step is to describe your simulation domain. Example entries for the &share and &geogrid sections are below:
&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2016-06-27_00:00:00',
 end_date   = '2016-06-29_00:00:00',
 interval_seconds = 21600,
 io_form_geogrid = 2,
 debug_level = 1,
/

&geogrid
 parent_id         = 1,
 parent_grid_ratio = 1,
 i_parent_start    = 1,
 j_parent_start    = 1,
 e_we              = 245,
 e_sn              = 181,
 geog_data_res     = 'default', 'default',
 dx = 27000,
 dy = 27000,
 map_proj  = 'mercator',
 ref_lat   = 27,
 ref_lon   = 105,
 truelat1  = 27.0,
 stand_lon = 105,
 geog_data_path = '/n/seasasfs02/hplin/geog'
/
The configuration options you need to change are listed below with brief descriptions. This will get you up and running quickly, but we recommend also consulting the WPS User's Guide.
max_dom: Number of domains. 1 = single domain, up to 8 are supported when working with nested domains. We do not discuss multiple domains here for simplicity.
start_date (per-domain): Start date of the simulation
end_date (per-domain): End date of the simulation
e_we, e_sn: Dimensions of the grid in the x/y dimensions.
dx, dy: Grid distance in the x/y dimensions where the map scale factor is 1. In meters when map_proj = 'mercator', in degrees when map_proj = 'lat-lon'.
map_proj: Map projection. Only mercator and lat-lon (unrotated regular latitude-longitude) are supported currently in WRF-GC.
ref_lat, ref_lon, truelat1, stand_lon, etc. are grid location parameters (where your regional grid is located). Refer to the WRF User's Guide.
geog_data_path: Path to the static WPS input data you downloaded in the previous steps.
Once namelist.wps is configured, you can run GEOGRID.
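A minimal invocation from the WPS directory (assuming geogrid.exe was built there; use your system's usual way of launching executables if needed) is:

./geogrid.exe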
This will generate geo_em.d01.nc (1 domain) and other geo_em.d0X.nc files for other domains if you are using multiple domains.
Preview the generated grid using the grid-plotting NCL script that ships with WPS (requires NCL installed); an example of the resulting domain plot is shown below.
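As a sketch (recent WPS versions ship the script as util/plotgrids_new.ncl; the exact name may differ in older releases), the preview can be generated from the WPS directory with:

ncl util/plotgrids_new.ncl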
Downloading meteorological data
Setting up Vtable
Depending on the meteorological data, the appropriate Vtable needs to be linked so the UNGRIB utility can find it. For example, for GFS data:
ln -s ungrib/Variable_Tables/Vtable.GFS Vtable
Running UNGRIB and METGRID
Configure UNGRIB and METGRID in namelist.wps. These should be mostly unchanged:
&ungrib
 out_format = 'WPS',
 prefix     = 'FILE',
/

&metgrid
 fg_name         = 'FILE',
 io_form_metgrid = 2,
/
Link the GRIB files: ./link_grib.csh gfs* (replace gfs* with a pattern pointing to the meteorological input files you downloaded in the previous step).
Run ./ungrib.exe, then ./metgrid.exe. You should now have meteorology data files named met_em.d… in the WPS directory.
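Putting this step together, a minimal command sequence from the WPS directory might look like the following (the gfs* pattern is only an example and should match your downloaded files):

ln -s ungrib/Variable_Tables/Vtable.GFS Vtable   # Vtable matching your meteorological data
./link_grib.csh gfs*                             # link the downloaded GRIB files
./ungrib.exe                                     # decode GRIB files into intermediate FILE:* files
./metgrid.exe                                    # horizontally interpolate onto the model grid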
Configuring WRF-GC - namelist.input
Almost all WRF-GC configuration is performed inside namelist.input. This namelist, located in the WRF run directory, controls most aspects of the simulation.
Not all WRF dynamics and physics options are supported in WRF-GC! This is because, to couple WRF to GEOS-Chem, internal quantities need to be translated into GEOS-Chem's meteorology format (based on GEOS-FP).
The list of supported schemes is available in Lin et al. (2020).
We do not discuss WRF configuration options in detail here and invite you to refer to the WRF User's Guide. The basic options to change in namelist.input are listed below (a sketch of the corresponding &time_control and &physics entries follows this list):
Configure the length of your run in the &time_control section (run duration and start/end dates).
Configure output frequency using history_interval (in minutes), e.g., hourly output: history_interval = 60.
Configure frames per output netCDF file, e.g., frames_per_outfile = 2 with history_interval = 60 means 2 hours will be written per file.
Restarts. If this is a restart run (running from existing wrfrst restart files), set restart = .true. (by default this is set to .false.).
Write out restart files. Set restart_interval (in minutes) to control how often restart files are written.
Microphysics scheme (mp_physics): We recommend the Morrison double-moment scheme (mp_physics = 10).
Cumulus parameterization scheme (cu_physics): We recommend the New Tiedtke scheme (cu_physics = 16).
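As an illustrative sketch only (the dates and intervals below are placeholders taken from the WPS example above, and many required entries are omitted), these options live in namelist.input roughly as follows:

&time_control
 run_days           = 2,
 start_year         = 2016, start_month = 06, start_day = 27, start_hour = 00,
 end_year           = 2016, end_month   = 06, end_day   = 29, end_hour   = 00,
 history_interval   = 60,        ! hourly output
 frames_per_outfile = 2,         ! 2 hours written per output file
 restart            = .false.,
 restart_interval   = 1440,      ! example: write a restart file once per simulated day
/

&physics
 mp_physics = 10,                ! Morrison double-moment microphysics
 cu_physics = 16,                ! New Tiedtke cumulus scheme
/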
Configuration of chemistry is within the &chem section. For WRF-GC chemistry, set chem_opt = 233 (a minimal &chem sketch is shown at the end of this subsection).
You can control individual processes in GEOS-Chem using dedicated switches, for example:
Turbulence / boundary layer mixing
By setting these switches to 0 (off) or 1 (on), you can turn individual GEOS-Chem processes on or off.
To configure some simple GEOS-Chem diagnostics, add options to &chem following the guide in Additional diagnostics.
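For reference, a minimal sketch of the chemistry section (only chem_opt is taken from this guide; the process switches and diagnostic options are omitted here because their exact names are documented elsewhere):

&chem
 chem_opt = 233,    ! run GEOS-Chem chemistry within WRF (WRF-GC)
/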
Configuring WRF-GC - input.geos / geoschem_config.yml
GEOS-Chem versions 12 and 13:
Most input.geos options known to GEOS-Chem users are not configured in input.geos in WRF-GC and are instead controlled by namelist.input. The only exceptions are two data paths, which still need to be specified in input.geos: the root data directory and the FAST-JX directory under CHEM_INPUTS:
Root data directory : /n/holyscratch01/external_repos/GEOS-CHEM/gcgrid/data/ExtData/
%%% PHOTOLYSIS MENU %%% :
FAST-JX directory       : /n/holyscratch01/external_repos/GEOS-CHEM/gcgrid/data/ExtData/CHEM_INPUTS/FAST_JX/v2021-10/
GEOS-Chem version 14 and above:
Most geoschem_config.yml options are controlled by namelist.input, except the file input paths:
root_data_dir: /n/holyscratch01/external_repos/GEOS-CHEM/gcgrid/data/ExtData
chem_inputs_dir: /n/holyscratch01/external_repos/GEOS-CHEM/gcgrid/data/ExtData/CHEM_INPUTS/
...
photolysis:
  input_dir: /n/holyscratch01/external_repos/GEOS-CHEM/gcgrid/data/ExtData/CHEM_INPUTS/FAST_JX/v2021-10/
and the Complex SOA option, which can be enabled; if it is, the Complex SOA species (TSOAx, ASOAx, …) need to be added to the advected species list.
Most other options in input.geos (or geoschem_config.yml) for WRF-GC are ignored.
Configuring WRF-GC - emissions in HEMCO_Config.rc
Configuration of HEMCO is exactly the same as in the GEOS-Chem model. Remember to update the HEMCO data path in this configuration file:
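For illustration (the path below is a placeholder for your own HEMCO data directory), the relevant entry is the ROOT setting near the top of HEMCO_Config.rc:

ROOT: /path/to/ExtData/HEMCO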
A reminder about the namelist.input and other configuration files: these files are replaced every time the WRF model is recompiled (when ./compile em_real is run). Please remember to back up your configuration files!
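For example, a simple backup before recompiling could look like the following sketch (adjust file names to your GEOS-Chem version and setup):

cp namelist.input namelist.input.bak
cp input.geos input.geos.bak            # or geoschem_config.yml for GEOS-Chem 14+
cp HEMCO_Config.rc HEMCO_Config.rc.bak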
After configuring, run real.exe. This is a memory- and compute-intensive operation; if you are on a cluster, you will need to submit a batch job as you would when running other models. Otherwise, run:
mpirun -np 32 ./real.exe
where "32" is the number of cores. The output can be watched with tail -f rsl.out.0000, and any errors will appear in the rsl.error.* log files.
After successfully running real.exe, the initial condition file wrfinput_d<domain> and boundary condition file(s) wrfbdy_d<domain> are generated.
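As a quick sanity check (a sketch; adjust the domain suffix to your configuration and use whatever netCDF tools you have available), you can verify that the files were written:

ls -lh wrfinput_d01 wrfbdy_d01
ncdump -h wrfinput_d01 | head -n 20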