Set up a rectangular regional CESM-MOM6 run#

A typical CrocoDash workflow consists of four main steps:

  1. Generate a regional MOM6 domain.

  2. Create the CESM case.

  3. Prepare ocean forcing data.

  4. Build and run the case.

Section 1: Generate a regional MOM6 domain#

We begin by defining a regional MOM6 domain using CrocoDash. To do so, we first generate a horizontal grid. We then generate the topography by remapping an existing bathymetric dataset to our horizontal grid. Finally, we define a vertical grid.

Step 1.1: Horizontal Grid#

from CrocoDash.grid import Grid

grid = Grid(
    resolution = 0.01,  # grid spacing (degrees)
    xstart = 278.0,     # western edge (degrees east)
    lenx = 1.0,         # zonal extent (degrees)
    ystart = 7.0,       # southern edge (degrees north)
    leny = 1.0,         # meridional extent (degrees)
    name = "panama1",
)
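With a 0.01° spacing over a 1° × 1° box, the grid works out to 100 × 100 cells, matching the NIGLOBAL/NJGLOBAL values CrocoDash writes to user_nl_mom later on. A quick sanity check in plain Python (not a CrocoDash call):

# Expected grid dimensions from the parameters above:
nx = round(1.0 / 0.01)  # lenx / resolution -> 100
ny = round(1.0 / 0.01)  # leny / resolution -> 100
print(nx, ny)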

Step 1.2: Topography#

from CrocoDash.topo import Topo

topo = Topo(
    grid = grid,
    min_depth = 9.5,  # minimum ocean depth (m); becomes MINIMUM_DEPTH in MOM6
)

bathymetry_path = '<BATHY_FILE>'  # path to your bathymetry dataset

topo.interpolate_from_file(
    file_path = bathymetry_path,
    longitude_coordinate_name = "lon",
    latitude_coordinate_name = "lat",
    vertical_coordinate_name = "elevation",
)
If bathymetry setup fails, rerun this function with write_to_file = True
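The coordinate names passed above ("lon", "lat", "elevation") must match the names in your bathymetry source, and they vary between products. A quick inspection with xarray before interpolating can save a failed run (use open_zarr instead if your source is a zarr store):

import xarray as xr

ds = xr.open_dataset(bathymetry_path)
print(ds)  # confirm the longitude, latitude, and elevation/depth variable names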
topo.depth.plot()
[Figure: plan view of the interpolated regional bathymetry (topo.depth)]
# Erase Pacific & Canada Bays
%matplotlib ipympl
from CrocoDash.topo_editor import TopoEditor
topo.depth["units"] = "m"  # label units for the editor's colorbar
TopoEditor(topo)
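TopoEditor opens an interactive widget for hand-editing the depth field. In a non-interactive environment you can edit topo.depth directly instead; a minimal sketch, assuming topo.depth is a 2D xarray DataArray in which NaN marks land (the indices here are purely illustrative):

import numpy as np

# Hypothetical scripted edit: turn a block of cells into land.
topo.depth[0:10, 0:10] = np.nan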

Step 1.3: Vertical Grid#

from CrocoDash.vgrid import VGrid

vgrid = VGrid.hyperbolic(
    nk = 75,                 # number of vertical layers
    depth = topo.max_depth,  # total depth of the grid (m)
    ratio = 20.0             # approx. ratio of thickest (bottom) to thinnest (surface) layer
)
print(vgrid.dz)
[ 3.65535122  3.67844102  3.7057838   3.73815846  3.77648469  3.82184774
  3.87552721  3.93903041  4.01413088  4.10291266  4.20782073  4.33171814
  4.47794994  4.65041395  4.85363759  5.0928596   5.37411432  5.70431463
  6.09132815  6.54403859  7.07238132  7.68733883  8.40087798  9.2258073
 10.17552982 11.26366612 12.50352475 13.90740505 15.48573221 17.24604754
 19.19190787 21.32178462 23.62808815 26.09646895 28.70555098 31.42722598
 34.22757583 37.06840024 39.90922464 42.70957449 45.43124949 48.04033152
 50.50871232 52.81501585 54.9448926  56.89075293 58.65106826 60.22939542
 61.63327572 62.87313435 63.96127066 64.91099317 65.73592249 66.44946164
 67.06441915 67.59276188 68.04547232 68.43248585 68.76268615 69.04394087
 69.28316289 69.48638652 69.65885053 69.80508234 69.92897974 70.03388781
 70.12266959 70.19777006 70.26127326 70.31495273 70.36031578 70.39864201
 70.43101667 70.45835946 70.48144925]
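The thicknesses ramp smoothly from about 3.7 m near the surface to about 70 m at depth, roughly the requested ratio of 20, and sum to topo.max_depth. For intuition, here is a minimal tanh-based sketch that produces a qualitatively similar profile; it is an illustration of the idea, not the CrocoDash implementation:

import numpy as np

nk, depth, ratio = 75, 2780.13, 20.0                 # values from this example
k = np.arange(nk)
ramp = 0.5 * (1 + np.tanh((k - nk / 2) / (nk / 8)))  # smooth 0 -> 1 transition
dz = 1 + (ratio - 1) * ramp                          # relative thickness: 1 -> ratio
dz *= depth / dz.sum()                               # rescale so layers sum to depth
print(dz[0], dz[-1], dz.sum())                       # thin at the top, thick at the bottom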
import matplotlib.pyplot as plt
plt.close()
# Create the plot
for depth in vgrid.z:
    plt.axhline(y=depth, linestyle='-')  # Horizontal lines

plt.ylim(max(vgrid.z) + 10, min(vgrid.z) - 10)  # Invert y-axis so deeper values go down
plt.ylabel("Depth")
plt.title("Vertical Grid")
plt.show()

Section 2: Create the CESM case#

After generating the MOM6 domain, the next step is to create a CESM case using CrocoDash. This process is straightforward and involves instantiating the CrocoDash Case object. The Case object requires the following inputs:

  • CESM Source Directory: A local path to a compatible CESM source copy.

  • Case Name: A unique name for the CESM case.

  • Input Directory: The directory where all necessary input files will be written.

  • MOM6 Domain Objects: The horizontal grid, topography, and vertical grid created in the previous section.

  • Project ID: (Optional) A project ID, if required by the machine.

Step 2.1: Specify case name and directories#

Begin by specifying the case name and the necessary directory paths. Ensure the CESM root directory points to your own local copy of CESM. Below is an example setup:

from pathlib import Path

# CESM case (experiment) name
casename = "panama-1"

# CESM source root (update this path to your local CESM copy!)
cesmroot = "<CESM>"

# Directory where all input files will be written
inputdir = Path.home() / "croc_input" / casename

# CESM case directory
caseroot = Path.home() / "croc_cases" / casename
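Depending on your environment, you may want to create the input directory up front (CrocoDash may also create it for you; shown here for completeness):

inputdir.mkdir(parents=True, exist_ok=True)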

Step 2.2: Create the Case#

To create the CESM case, instantiate the Case object as shown below. The cesmroot argument specifies the path to your local CESM source directory, and caseroot defines the directory where the case will be created. CrocoDash handles all the namelist modifications and XML changes needed to align the case with the MOM6 domain objects generated earlier.

from CrocoDash.case import Case

case = Case(
    cesmroot = cesmroot,
    caseroot = caseroot,
    inputdir = inputdir,
    ocn_grid = grid,
    ocn_vgrid = vgrid,
    ocn_topo = topo,
    project = 'NCGD0011',      # your project ID, if your machine requires one
    override = True,           # recreate the case if it already exists
    machine = "ubuntu-latest"  # update to your machine's CIME name
)
INFO:  csp_solver:CspSolver initialized.
Creating case...

• Updating ccs_config/modelgrid_aliases_nuopc.xml file to include the new resolution "panama-1" consisting of the following component grids.
 atm grid: "TL319", lnd grid: "TL319", ocn grid: "panama1".

• Updating ccs_config/component_grids_nuopc.xml file to include newly generated ocean grid "panama1" with the following properties:
 nx: 100, ny: 100. ocean mesh: /Users/manishrv/croc_input/panama-1/ocnice/ESMF_mesh_panama1_0a7340.nc.

Running the create_newcase tool with the following command:

/Users/manishrv/CrocoGallery/cesm/cime/scripts/create_newcase --compset 1850_DATM%JRA_SLND_SICE_MOM6_SROF_SGLC_SWAV --res panama-1 --case /Users/manishrv/croc_cases/panama-1 --machine ubuntu-latest --run-unsupported --project NCGD0011 

The create_newcase command was successful.

Navigating to the case directory:

cd /Users/manishrv/croc_cases/panama-1

Apply NTASK grid xml changes:

./xmlchange NTASKS_OCN=128

Running the case.setup script with the following command:

./case.setup

Adding parameter changes to user_nl_mom:

  ! Custom Horizontal Grid, Topography, and Vertical Grid
  INPUTDIR = /Users/manishrv/croc_input/panama-1/ocnice
  TRIPOLAR_N = False
  REENTRANT_X = False
  REENTRANT_Y = False
  NIGLOBAL = 100
  NJGLOBAL = 100
  GRID_CONFIG = mosaic
  GRID_FILE = ocean_hgrid_panama1_0a7340.nc
  TOPO_CONFIG = file
  TOPO_FILE = ocean_topog_panama1_0a7340.nc
  MAXIMUM_DEPTH = 2780.1300176568307
  MINIMUM_DEPTH = 9.5
  NK = 75
  COORD_CONFIG = none
  ALE_COORDINATE_CONFIG = FILE:ocean_vgrid_panama1_0a7340.nc
  REGRIDDING_COORDINATE_MODE = Z*

  ! Timesteps (based on grid resolution)
  DT = 25.0
  DT_THERM = 100.0
INFO:	stage:SUCCESS: All stages are complete.
Case created successfully at /Users/manishrv/croc_cases/panama-1.

To further customize, build, and run the case, navigate to the case directory in your terminal. To create another case, restart the notebook.
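From the case directory, CIME's standard xmlquery tool is a quick way to sanity-check the generated settings, for example:

./xmlquery OCN_GRID,NTASKS_OCN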

Section 3: Prepare ocean forcing data#

We need to cut out our ocean forcing. The package expects an initial condition and one time-dependent segment per non-land boundary. The naming convention is "<side>_unprocessed" (e.g. "east_unprocessed") for boundary segments and "ic_unprocessed" for the initial condition.

In this notebook, we force the model with the Copernicus Marine "GLORYS" reanalysis dataset. The CrocoDash function configure_forcings sets this up: it can generate a bash script (get_glorys_data.sh) that downloads the correct boundary forcing files for your experiment, or, as in this notebook, download them directly through the Copernicus Marine API. Either way, you first need to create a Copernicus Marine account and run copernicusmarine login once to store your credentials on your machine.
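The login step is a one-time setup with the copernicusmarine command-line tool:

copernicusmarine login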

Step 3.1: Configure Initial Conditions and Forcings#

case.configure_forcings(
    date_range = ["2020-01-01 00:00:00", "2020-01-09 00:00:00"],  # forcing window
    function_name = "get_glorys_data_from_cds_api"  # download directly via the API
)
INFO - 2025-06-05T16:49:37Z - Selected dataset version: "202311"
INFO - 2025-06-05T16:49:37Z - Selected dataset part: "default"
INFO - 2025-06-05T16:49:46Z - Starting download. Please wait...
INFO - 2025-06-05T16:52:27Z - Successfully downloaded to /var/folders/s5/23r81yrs51q2rlvqmt_qdnl40000gp/T/tmpy0f_752v/test_file.nc
INFO - 2025-06-05T16:52:30Z - Selected dataset version: "202311"
INFO - 2025-06-05T16:52:30Z - Selected dataset part: "default"
INFO - 2025-06-05T16:52:39Z - Starting download. Please wait...
INFO - 2025-06-05T16:54:53Z - Successfully downloaded to /Users/manishrv/croc_input/panama-1/glorys/east_unprocessed.nc
INFO - 2025-06-05T16:54:56Z - Selected dataset version: "202311"
INFO - 2025-06-05T16:54:56Z - Selected dataset part: "default"
INFO - 2025-06-05T16:55:04Z - Starting download. Please wait...
INFO - 2025-06-05T16:56:55Z - Successfully downloaded to /Users/manishrv/croc_input/panama-1/glorys/west_unprocessed.nc
INFO - 2025-06-05T16:57:01Z - Selected dataset version: "202311"
INFO - 2025-06-05T16:57:01Z - Selected dataset part: "default"
INFO - 2025-06-05T16:57:09Z - Starting download. Please wait...
INFO - 2025-06-05T16:58:32Z - Successfully downloaded to /Users/manishrv/croc_input/panama-1/glorys/north_unprocessed.nc
INFO - 2025-06-05T16:58:35Z - Selected dataset version: "202311"
INFO - 2025-06-05T16:58:35Z - Selected dataset part: "default"
INFO - 2025-06-05T16:58:43Z - Starting download. Please wait...
INFO - 2025-06-05T17:02:36Z - Successfully downloaded to /Users/manishrv/croc_input/panama-1/glorys/south_unprocessed.nc
INFO - 2025-06-05T17:02:39Z - Selected dataset version: "202311"
INFO - 2025-06-05T17:02:39Z - Selected dataset part: "default"
INFO - 2025-06-05T17:02:47Z - Starting download. Please wait...
INFO - 2025-06-05T17:05:49Z - Successfully downloaded to /Users/manishrv/croc_input/panama-1/glorys/ic_unprocessed.nc
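Before processing, you can optionally inspect a downloaded file to confirm it covers your domain and date range. GLORYS files typically carry the variables thetao, so, uo, vo, and zos; this quick check (not part of the CrocoDash workflow) assumes the paths used above:

import xarray as xr

ds = xr.open_dataset(inputdir / "glorys" / "ic_unprocessed.nc")
print(ds.data_vars)    # expect thetao, so, uo, vo, zos
print(ds.time.values)  # should match the start of date_range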

Step 3.2: Process forcing data#

In this final forcing step, we call the process_forcings method of CrocoDash to cut out and interpolate the initial condition as well as all boundary segments. CrocoDash also updates the MOM6 runtime parameters and CESM XML variables accordingly.

case.process_forcings()
INFO:regional_mom6.regridding:Getting t points..
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.regridding:Creating Regridder
Setting up Initial Conditions
Regridding Velocities... 
INFO:regional_mom6.rotation:Getting rotation angle
INFO:regional_mom6.rotation:Calculating grid rotation angle
INFO:regional_mom6.regridding:Getting u points..
INFO:regional_mom6.regridding:Getting v points..
Done.
Regridding Tracers... Done.
Regridding Free surface... Done.
Saving outputs... 
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.rotation:Getting rotation angle
INFO:regional_mom6.rotation:Calculating grid rotation angle
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Filling in missing data horizontally, then vertically
INFO:regional_mom6.regridding:Adding time dimension
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in salt_segment_001
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to salt_segment_001
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in temp_segment_001
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to temp_segment_001
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in u_segment_001
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to u_segment_001
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in v_segment_001
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to v_segment_001
INFO:regional_mom6.regridding:Adding perpendicular dimension to eta_segment_001
WARNING:regional_mom6.regridding:All NaNs filled b/c bathymetry wasn't provided to the function. Add bathymetry_path to the segment class to avoid this
INFO:regional_mom6.regridding:Generating encoding dictionary
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.rotation:Getting rotation angle
INFO:regional_mom6.rotation:Calculating grid rotation angle
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Filling in missing data horizontally, then vertically
INFO:regional_mom6.regridding:Adding time dimension
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in salt_segment_002
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to salt_segment_002
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in temp_segment_002
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to temp_segment_002
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in u_segment_002
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to u_segment_002
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in v_segment_002
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to v_segment_002
INFO:regional_mom6.regridding:Adding perpendicular dimension to eta_segment_002
WARNING:regional_mom6.regridding:All NaNs filled b/c bathymetry wasn't provided to the function. Add bathymetry_path to the segment class to avoid this
INFO:regional_mom6.regridding:Generating encoding dictionary
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.rotation:Getting rotation angle
INFO:regional_mom6.rotation:Calculating grid rotation angle
done setting up initial condition.
Processing south boundary velocity & tracers...Done.
Processing north boundary velocity & tracers...Done.
Processing west boundary velocity & tracers...
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Filling in missing data horizontally, then vertically
INFO:regional_mom6.regridding:Adding time dimension
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in salt_segment_003
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to salt_segment_003
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in temp_segment_003
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to temp_segment_003
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in u_segment_003
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to u_segment_003
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in v_segment_003
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to v_segment_003
INFO:regional_mom6.regridding:Adding perpendicular dimension to eta_segment_003
WARNING:regional_mom6.regridding:All NaNs filled b/c bathymetry wasn't provided to the function. Add bathymetry_path to the segment class to avoid this
INFO:regional_mom6.regridding:Generating encoding dictionary
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.rotation:Getting rotation angle
INFO:regional_mom6.rotation:Calculating grid rotation angle
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Filling in missing data horizontally, then vertically
INFO:regional_mom6.regridding:Adding time dimension
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in salt_segment_004
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to salt_segment_004
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in temp_segment_004
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to temp_segment_004
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in u_segment_004
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to u_segment_004
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in v_segment_004
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to v_segment_004
INFO:regional_mom6.regridding:Adding perpendicular dimension to eta_segment_004
WARNING:regional_mom6.regridding:All NaNs filled b/c bathymetry wasn't provided to the function. Add bathymetry_path to the segment class to avoid this
INFO:regional_mom6.regridding:Generating encoding dictionary
Done.
Processing east boundary velocity & tracers...Done.
Adding parameter changes to user_nl_mom:

  ! Initial conditions
  INIT_LAYERS_FROM_Z_FILE = True
  TEMP_SALT_Z_INIT_FILE = init_tracers.nc
  Z_INIT_FILE_PTEMP_VAR = temp
  Z_INIT_ALE_REMAPPING = True
  TEMP_SALT_INIT_VERTICAL_REMAP_ONLY = True
  DEPRESS_INITIAL_SURFACE = True
  SURFACE_HEIGHT_IC_FILE = init_eta.nc
  SURFACE_HEIGHT_IC_VAR = eta_t
  VELOCITY_CONFIG = file
  VELOCITY_FILE = init_vel.nc

  ! Open boundary conditions
  OBC_NUMBER_OF_SEGMENTS = 4
  OBC_FREESLIP_VORTICITY = False
  OBC_FREESLIP_STRAIN = False
  OBC_COMPUTED_VORTICITY = True
  OBC_COMPUTED_STRAIN = True
  OBC_ZERO_BIHARMONIC = True
  OBC_TRACER_RESERVOIR_LENGTH_SCALE_OUT = 3.0E+04
  OBC_TRACER_RESERVOIR_LENGTH_SCALE_IN = 3000.0
  BRUSHCUTTER_MODE = True
  OBC_SEGMENT_001 = "J=0,I=0:N,FLATHER,ORLANSKI,NUDGED,ORLANSKI_TAN,NUDGED_TAN"
  OBC_SEGMENT_001_VELOCITY_NUDGING_TIMESCALES = 0.3, 360.0
  OBC_SEGMENT_001_DATA = "U=file:forcing_obc_segment_001.nc(u),V=file:forcing_obc_segment_001.nc(v),SSH=file:forcing_obc_segment_001.nc(eta),TEMP=file:forcing_obc_segment_001.nc(temp),SALT=file:forcing_obc_segment_001.nc(salt)"
  OBC_SEGMENT_002 = "J=N,I=N:0,FLATHER,ORLANSKI,NUDGED,ORLANSKI_TAN,NUDGED_TAN"
  OBC_SEGMENT_002_VELOCITY_NUDGING_TIMESCALES = 0.3, 360.0
  OBC_SEGMENT_002_DATA = "U=file:forcing_obc_segment_002.nc(u),V=file:forcing_obc_segment_002.nc(v),SSH=file:forcing_obc_segment_002.nc(eta),TEMP=file:forcing_obc_segment_002.nc(temp),SALT=file:forcing_obc_segment_002.nc(salt)"
  OBC_SEGMENT_003 = "I=0,J=N:0,FLATHER,ORLANSKI,NUDGED,ORLANSKI_TAN,NUDGED_TAN"
  OBC_SEGMENT_003_VELOCITY_NUDGING_TIMESCALES = 0.3, 360.0
  OBC_SEGMENT_003_DATA = "U=file:forcing_obc_segment_003.nc(u),V=file:forcing_obc_segment_003.nc(v),SSH=file:forcing_obc_segment_003.nc(eta),TEMP=file:forcing_obc_segment_003.nc(temp),SALT=file:forcing_obc_segment_003.nc(salt)"
  OBC_SEGMENT_004 = "I=N,J=0:N,FLATHER,ORLANSKI,NUDGED,ORLANSKI_TAN,NUDGED_TAN"
  OBC_SEGMENT_004_VELOCITY_NUDGING_TIMESCALES = 0.3, 360.0
  OBC_SEGMENT_004_DATA = "U=file:forcing_obc_segment_004.nc(u),V=file:forcing_obc_segment_004.nc(v),SSH=file:forcing_obc_segment_004.nc(eta),TEMP=file:forcing_obc_segment_004.nc(temp),SALT=file:forcing_obc_segment_004.nc(salt)"

./xmlchange RUN_STARTDATE=2020-01-01

./xmlchange MOM6_MEMORY_MODE=dynamic_symmetric

Case is ready to be built: /Users/manishrv/croc_cases/panama-1

Section 4: Build and run the case#

After completing the previous steps, you are ready to build and run your CESM case. Begin by navigating to the case root directory specified during case creation. Before proceeding, review the user_nl_mom file in the case directory: it contains the MOM6 parameter settings that CrocoDash generated automatically. CrocoDash aims to provide a solid starting point, but these parameters typically need further tuning for your specific use case.

Once you have reviewed and modified the parameters as needed, you can build and execute the case using the following commands:

./case.build
./case.submit
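By default the case uses the compset's standard run length. To match the eight-day forcing window downloaded above, you can adjust the standard CESM XML variables before submitting (the values here are just an example for this case):

./xmlchange STOP_OPTION=ndays,STOP_N=8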