Set up a rectangular regional CESM-MOM6 run#
A typical CrocoDash workflow consists of four main steps:
1. Generate a regional MOM6 domain.
2. Create the CESM case.
3. Prepare ocean forcing data.
4. Build and run the case.
Section 1: Generate a regional MOM6 domain#
We begin by defining a regional MOM6 domain using CrocoDash. To do so, we first generate a horizontal grid. We then generate the topography by remapping an existing bathymetric dataset to our horizontal grid. Finally, we define a vertical grid.
Step 1.1: Horizontal Grid#
from CrocoDash.grid import Grid
grid = Grid(
    resolution = 0.01,
    xstart = 278.0,
    lenx = 1.0,
    ystart = 7.0,
    leny = 1.0,
    name = "panama1",
)
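As a quick sanity check (plain arithmetic, no CrocoDash API assumed), the extent and resolution above imply a 100 × 100 tracer grid, which matches the NIGLOBAL/NJGLOBAL values CrocoDash writes to user_nl_mom later in this notebook:
nx = round(1.0 / 0.01)  # lenx / resolution
ny = round(1.0 / 0.01)  # leny / resolution
print(nx, ny)           # 100 100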
Step 1.2: Topography#
from CrocoDash.topo import Topo
topo = Topo(
    grid = grid,
    min_depth = 9.5,
)
bathymetry_path = 's3://crocodile-cesm/CrocoDash/data/gebco/GEBCO_2024.zarr/'
topo.interpolate_from_file(
    file_path = bathymetry_path,
    longitude_coordinate_name = "lon",
    latitude_coordinate_name = "lat",
    vertical_coordinate_name = "elevation"
)
If bathymetry setup fails, rerun this function with write_to_file = True
Begin regridding bathymetry...
Original bathymetry size: 1.85 Mb
Regridded size: 0.24 Mb
Automatic regridding may fail if your domain is too big! If this process hangs or crashes, make sure the function argument write_to_file = True and, in a terminal with appropriate computational resources, try calling ESMF directly in the input directory None via
`mpirun -np NUMBER_OF_CPUS ESMF_Regrid -s bathymetry_original.nc -d bathymetry_unfinished.nc -m bilinear --src_var depth --dst_var depth --netcdf4 --src_regional --dst_regional`
For details see https://xesmf.readthedocs.io/en/latest/large_problems_on_HPC.html
Afterwards, we run the 'expt.tidy_bathymetry' method to skip the expensive interpolation step and finish the metadata, encoding, and cleanup.
Regridding successful! Now calling `tidy_bathymetry` method for some finishing touches...
setup bathymetry has finished successfully.
Tidy bathymetry: Reading in regridded bathymetry to fix up metadata...done. Filling in inland lakes and channels...
topo.depth.plot()
<matplotlib.collections.QuadMesh at 0x7f0ece942cd0>

# Erase Pacific & Canada Bays
%matplotlib ipympl
from CrocoDash.topo_editor import TopoEditor
topo.depth["units"] = "m"
TopoEditor(topo)
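After editing, a quick scripted inspection can confirm how much of the domain remains ocean. This is a sketch that assumes topo.depth is an xarray.DataArray in which masked-out (land) cells are NaN:
# Count remaining ocean cells (assumption: land cells are NaN after editing).
ocean_cells = int(topo.depth.notnull().sum())
print(f"{ocean_cells} ocean cells out of {topo.depth.size}")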
Step 1.3: Vertical Grid#
from CrocoDash.vgrid import VGrid
vgrid = VGrid.hyperbolic(
    nk = 75,
    depth = topo.max_depth,
    ratio = 20.0
)
print(vgrid.dz)
[ 3.65535122 3.67844102 3.7057838 3.73815846 3.77648469 3.82184774
3.87552721 3.93903041 4.01413088 4.10291266 4.20782073 4.33171814
4.47794994 4.65041395 4.85363759 5.0928596 5.37411432 5.70431463
6.09132815 6.54403859 7.07238132 7.68733883 8.40087798 9.2258073
10.17552982 11.26366612 12.50352475 13.90740505 15.48573221 17.24604754
19.19190787 21.32178462 23.62808815 26.09646895 28.70555098 31.42722598
34.22757583 37.06840024 39.90922464 42.70957449 45.43124949 48.04033152
50.50871232 52.81501585 54.9448926 56.89075293 58.65106826 60.22939542
61.63327572 62.87313435 63.96127066 64.91099317 65.73592249 66.44946164
67.06441915 67.59276188 68.04547232 68.43248585 68.76268615 69.04394087
69.28316289 69.48638652 69.65885053 69.80508234 69.92897974 70.03388781
70.12266959 70.19777006 70.26127326 70.31495273 70.36031578 70.39864201
70.43101667 70.45835946 70.48144925]
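Two quick checks on these thicknesses, assuming vgrid.dz is array-like: the layers should sum to the requested total depth (topo.max_depth, which reappears as MAXIMUM_DEPTH in user_nl_mom below), and the thickest-to-thinnest ratio should be close to the requested ratio of 20:
import numpy as np
dz = np.asarray(vgrid.dz)
print("total depth:", dz.sum())                  # should be ~2780.13 m, i.e. topo.max_depth
print("max/min dz ratio:", dz.max() / dz.min())  # ~19.3, close to ratio = 20.0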
import matplotlib.pyplot as plt
plt.close()
# Create the plot
for depth in vgrid.z:
    plt.axhline(y=depth, linestyle='-')  # Horizontal lines
plt.ylim(max(vgrid.z) + 10, min(vgrid.z) - 10)  # Invert y-axis so deeper values go down
plt.ylabel("Depth")
plt.title("Vertical Grid")
plt.show()
Section 2: Create the CESM case#
After generating the MOM6 domain, the next step is to create a CESM case using CrocoDash. This process is straightforward and involves instantiating the CrocoDash Case object. The Case object requires the following inputs:
CESM Source Directory: A local path to a compatible CESM source copy.
Case Name: A unique name for the CESM case.
Input Directory: The directory where all necessary input files will be written.
MOM6 Domain Objects: The horizontal grid, topography, and vertical grid created in the previous section.
Project ID: (Optional) A project ID, if required by the machine.
Step 2.1: Specify case name and directories:#
Begin by specifying the case name and the necessary directory paths. Ensure the CESM root directory points to your own local copy of CESM. Below is an example setup:
from pathlib import Path
# CESM case (experiment) name
casename = "panama-not"
# CESM source root (Update this path accordingly!!!)
cesmroot = "/home/runner/work/CrocoGallery/CrocoGallery/CESM/"
# Place where all your input files go
inputdir = Path.home() / "croc_input" / casename
# CESM case directory
caseroot = Path.home() / "croc_cases" / casename
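Optionally, create the input directory up front (CrocoDash may handle this itself; this is just a defensive sketch). Note that the case directory itself is created by CESM's create_newcase, so at most create its parent:
# Optional: pre-create directories (CrocoDash/CESM may do this for you).
inputdir.mkdir(parents=True, exist_ok=True)
caseroot.parent.mkdir(parents=True, exist_ok=True)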
Step 2.2: Create the Case#
To create the CESM case, instantiate the Case object as shown below. This will automatically set up the CESM case based on the provided inputs: the cesmroot argument specifies the path to your local CESM source directory, and the caseroot argument defines the directory where the case will be created. CrocoDash will handle all necessary namelist modifications and XML changes to align with the MOM6 domain objects generated earlier.
from CrocoDash.case import Case
import os
os.environ["CIME_MACHINE"] = "ubuntu-latest"
case = Case(
    cesmroot = cesmroot,
    caseroot = caseroot,
    inputdir = inputdir,
    ocn_grid = grid,
    ocn_vgrid = vgrid,
    ocn_topo = topo,
    project = 'NCGD0011',
    override = True,
    machine = "ubuntu-latest",
    compset = "1850_DATM%JRA_SLND_SICE_MOM6_SROF_SGLC_SWAV"
)
WARNING: cime_interface:CIME_OUTPUT_ROOT doesn't exist. Creating it at /home/runner/cesm/scratch
ERROR: cime_interface:DIN_LOC_ROOT doesn't exist: /home/runner/cesm/inputdata
INFO: csp_solver:CspSolver initialized.
Creating case...
• Updating ccs_config/modelgrid_aliases_nuopc.xml file to include the new resolution "panama-not" consisting of the following component grids.
atm grid: "TL319", lnd grid: "TL319", ocn grid: "panama1".
• Updating ccs_config/component_grids_nuopc.xml file to include newly generated ocean grid "panama1" with the following properties:
nx: 100, ny: 100. ocean mesh: /home/runner/croc_input/panama-not/ocnice/ESMF_mesh_panama1_249ef7.nc.
Running the create_newcase tool with the following command:
/home/runner/work/CrocoGallery/CrocoGallery/CESM/cime/scripts/create_newcase --compset 1850_DATM%JRA_SLND_SICE_MOM6_SROF_SGLC_SWAV --res panama-not --case /home/runner/croc_cases/panama-not --machine ubuntu-latest --run-unsupported --project NCGD0011
The create_newcase command was successful.
Navigating to the case directory:
cd /home/runner/croc_cases/panama-not
Apply NTASK grid xml changes:
./xmlchange NTASKS_OCN=128
Running the case.setup script with the following command:
./case.setup
Adding parameter changes to user_nl_mom:
! Custom Horizonal Grid, Topography, and Vertical Grid
INPUTDIR = /home/runner/croc_input/panama-not/ocnice
TRIPOLAR_N = False
REENTRANT_X = False
REENTRANT_Y = False
NIGLOBAL = 100
NJGLOBAL = 100
GRID_CONFIG = mosaic
GRID_FILE = ocean_hgrid_panama1_249ef7.nc
TOPO_CONFIG = file
TOPO_FILE = ocean_topog_panama1_249ef7.nc
MAXIMUM_DEPTH = 2780.130017656834
MINIMUM_DEPTH = 9.5
NK = 75
COORD_CONFIG = none
ALE_COORDINATE_CONFIG = FILE:ocean_vgrid_panama1_249ef7.nc
REGRIDDING_COORDINATE_MODE = Z*
! Timesteps (based on grid resolution)
DT = 25.0
DT_THERM = 100.0
INFO: stage:SUCCESS: All stages are complete.
Case created successfully at /home/runner/croc_cases/panama-not.
To further customize, build, and run the case, navigate to the case directory in your terminal. To create another case, restart the notebook.
Section 3: Prepare ocean forcing data#
We now need to cut out the ocean forcing. The package expects an initial condition and one time-dependent segment per non-land boundary. The naming convention is "east_unprocessed" for segments and "ic_unprocessed" for the initial condition.
In this notebook, we are forcing with the Copernicus Marine “GLORYS” reanalysis dataset. The CrocoDash package provides a function, configure_forcings, that generates a bash script to download the correct boundary forcing files for your experiment. First, you will need to create an account with Copernicus Marine, then call copernicusmarine login to set up your login details on your machine. Afterwards, you can run the get_glorys_data.sh bash script.
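If you prefer to stay in Python rather than the shell, the copernicusmarine package also exposes login as a Python function; a minimal sketch (the credentials are placeholders):
# Store Copernicus Marine credentials from Python instead of the
# `copernicusmarine login` CLI. Username/password are placeholders.
import copernicusmarine
copernicusmarine.login(username="YOUR_USERNAME", password="YOUR_PASSWORD")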
Step 3.1 Configure Initial Conditions and Forcings#
case.configure_forcings(
    date_range = ["2020-01-01 00:00:00", "2020-01-09 00:00:00"],
    function_name = "get_glorys_data_from_cds_api"
)
INFO - 2025-08-04T21:44:09Z - Selected dataset version: "202311"
INFO - 2025-08-04T21:44:09Z - Selected dataset part: "default"
INFO - 2025-08-04T21:44:18Z - Starting download. Please wait...
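Note that configure_forcings accepts more options than shown above; its signature also includes boundaries, tidal_constituents, and product_name, among others. As a purely hypothetical variant (not run in this notebook), a domain whose western edge is land might request only three open-boundary segments; the boundary names below follow the south/north/east convention seen in the processing logs later on:
# Hypothetical sketch: restrict forcing to three open boundaries
# for a domain with land along its western edge.
case.configure_forcings(
    date_range = ["2020-01-01 00:00:00", "2020-01-09 00:00:00"],
    boundaries = ["south", "north", "east"],
    function_name = "get_glorys_data_from_cds_api"
)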
Step 3.2: Process forcing data#
In this final step, we call the process_forcings method of CrocoDash to cut out and interpolate the initial condition as well as all boundaries. CrocoDash also updates MOM6 runtime parameters and CESM XML variables accordingly.
case.process_forcings()
INFO:regional_mom6.regridding:Getting t points..
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.regridding:Creating Regridder
Setting up Initial Conditions
Regridding Velocities...
INFO:regional_mom6.rotation:Getting rotation angle
INFO:regional_mom6.rotation:Calculating grid rotation angle
INFO:regional_mom6.regridding:Getting u points..
INFO:regional_mom6.regridding:Getting v points..
Done.
Regridding Tracers... Done.
Regridding Free surface... Done.
Saving outputs...
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.rotation:Getting rotation angle
INFO:regional_mom6.rotation:Calculating grid rotation angle
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Filling in missing data horizontally, then vertically
INFO:regional_mom6.regridding:Adding time dimension
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in salt_segment_001
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to salt_segment_001
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in temp_segment_001
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to temp_segment_001
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in u_segment_001
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to u_segment_001
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in v_segment_001
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to v_segment_001
INFO:regional_mom6.regridding:Adding perpendicular dimension to eta_segment_001
WARNING:regional_mom6.regridding:All NaNs filled b/c bathymetry wasn't provided to the function. Add bathymetry_path to the segment class to avoid this
INFO:regional_mom6.regridding:Generating encoding dictionary
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.rotation:Getting rotation angle
INFO:regional_mom6.rotation:Calculating grid rotation angle
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Filling in missing data horizontally, then vertically
INFO:regional_mom6.regridding:Adding time dimension
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in salt_segment_002
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to salt_segment_002
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in temp_segment_002
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to temp_segment_002
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in u_segment_002
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to u_segment_002
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in v_segment_002
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to v_segment_002
INFO:regional_mom6.regridding:Adding perpendicular dimension to eta_segment_002
WARNING:regional_mom6.regridding:All NaNs filled b/c bathymetry wasn't provided to the function. Add bathymetry_path to the segment class to avoid this
INFO:regional_mom6.regridding:Generating encoding dictionary
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.rotation:Getting rotation angle
INFO:regional_mom6.rotation:Calculating grid rotation angle
done setting up initial condition.
Processing south boundary velocity & tracers...Done.
Processing north boundary velocity & tracers...Done.
Processing west boundary velocity & tracers...
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Filling in missing data horizontally, then vertically
INFO:regional_mom6.regridding:Adding time dimension
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in salt_segment_003
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to salt_segment_003
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in temp_segment_003
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to temp_segment_003
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in u_segment_003
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to u_segment_003
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in v_segment_003
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to v_segment_003
INFO:regional_mom6.regridding:Adding perpendicular dimension to eta_segment_003
WARNING:regional_mom6.regridding:All NaNs filled b/c bathymetry wasn't provided to the function. Add bathymetry_path to the segment class to avoid this
INFO:regional_mom6.regridding:Generating encoding dictionary
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Creating Regridder
INFO:regional_mom6.rotation:Getting rotation angle
INFO:regional_mom6.rotation:Calculating grid rotation angle
INFO:regional_mom6.regridding:Creating coordinates of the boundary q/u/v points
INFO:regional_mom6.regridding:Filling in missing data horizontally, then vertically
INFO:regional_mom6.regridding:Adding time dimension
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in salt_segment_004
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to salt_segment_004
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in temp_segment_004
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to temp_segment_004
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in u_segment_004
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to u_segment_004
INFO:regional_mom6.regridding:Renaming vertical coordinate to nz_... in v_segment_004
INFO:regional_mom6.regridding:Replacing old depth coordinates with incremental integers
INFO:regional_mom6.regridding:Adding perpendicular dimension to v_segment_004
INFO:regional_mom6.regridding:Adding perpendicular dimension to eta_segment_004
WARNING:regional_mom6.regridding:All NaNs filled b/c bathymetry wasn't provided to the function. Add bathymetry_path to the segment class to avoid this
INFO:regional_mom6.regridding:Generating encoding dictionary
Done.
Processing east boundary velocity & tracers...Done.
Adding parameter changes to user_nl_mom:
! Initial conditions
INIT_LAYERS_FROM_Z_FILE = True
TEMP_SALT_Z_INIT_FILE = init_tracers.nc
Z_INIT_FILE_PTEMP_VAR = temp
Z_INIT_ALE_REMAPPING = True
TEMP_SALT_INIT_VERTICAL_REMAP_ONLY = True
DEPRESS_INITIAL_SURFACE = True
SURFACE_HEIGHT_IC_FILE = init_eta.nc
SURFACE_HEIGHT_IC_VAR = eta_t
VELOCITY_CONFIG = file
VELOCITY_FILE = init_vel.nc
! Open boundary conditions
OBC_NUMBER_OF_SEGMENTS = 4
OBC_FREESLIP_VORTICITY = False
OBC_FREESLIP_STRAIN = False
OBC_COMPUTED_VORTICITY = True
OBC_COMPUTED_STRAIN = True
OBC_ZERO_BIHARMONIC = True
OBC_TRACER_RESERVOIR_LENGTH_SCALE_OUT = 3.0E+04
OBC_TRACER_RESERVOIR_LENGTH_SCALE_IN = 3000.0
BRUSHCUTTER_MODE = True
OBC_SEGMENT_001 = "J=0,I=0:N,FLATHER,ORLANSKI,NUDGED,ORLANSKI_TAN,NUDGED_TAN"
OBC_SEGMENT_001_VELOCITY_NUDGING_TIMESCALES = 0.3, 360.0
OBC_SEGMENT_001_DATA = "U=file:forcing_obc_segment_001.nc(u),V=file:forcing_obc_segment_001.nc(v),SSH=file:forcing_obc_segment_001.nc(eta),TEMP=file:forcing_obc_segment_001.nc(temp),SALT=file:forcing_obc_segment_001.nc(salt)"
OBC_SEGMENT_002 = "J=N,I=N:0,FLATHER,ORLANSKI,NUDGED,ORLANSKI_TAN,NUDGED_TAN"
OBC_SEGMENT_002_VELOCITY_NUDGING_TIMESCALES = 0.3, 360.0
OBC_SEGMENT_002_DATA = "U=file:forcing_obc_segment_002.nc(u),V=file:forcing_obc_segment_002.nc(v),SSH=file:forcing_obc_segment_002.nc(eta),TEMP=file:forcing_obc_segment_002.nc(temp),SALT=file:forcing_obc_segment_002.nc(salt)"
OBC_SEGMENT_003 = "I=0,J=N:0,FLATHER,ORLANSKI,NUDGED,ORLANSKI_TAN,NUDGED_TAN"
OBC_SEGMENT_003_VELOCITY_NUDGING_TIMESCALES = 0.3, 360.0
OBC_SEGMENT_003_DATA = "U=file:forcing_obc_segment_003.nc(u),V=file:forcing_obc_segment_003.nc(v),SSH=file:forcing_obc_segment_003.nc(eta),TEMP=file:forcing_obc_segment_003.nc(temp),SALT=file:forcing_obc_segment_003.nc(salt)"
OBC_SEGMENT_004 = "I=N,J=0:N,FLATHER,ORLANSKI,NUDGED,ORLANSKI_TAN,NUDGED_TAN"
OBC_SEGMENT_004_VELOCITY_NUDGING_TIMESCALES = 0.3, 360.0
OBC_SEGMENT_004_DATA = "U=file:forcing_obc_segment_004.nc(u),V=file:forcing_obc_segment_004.nc(v),SSH=file:forcing_obc_segment_004.nc(eta),TEMP=file:forcing_obc_segment_004.nc(temp),SALT=file:forcing_obc_segment_004.nc(salt)"
./xmlchange RUN_STARTDATE=2020-01-01
./xmlchange MOM6_MEMORY_MODE=dynamic_symmetric
Case is ready to be built: /Users/manishrv/croc_cases/panama-1
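Before moving on, you can optionally confirm that the expected forcing files landed in the ocnice input directory. This check is a sketch, not part of CrocoDash; the file names are taken from the user_nl_mom entries above:
# Optional sanity check: confirm the files written by process_forcings.
for fname in ["init_tracers.nc", "init_eta.nc", "init_vel.nc",
              "forcing_obc_segment_001.nc", "forcing_obc_segment_002.nc",
              "forcing_obc_segment_003.nc", "forcing_obc_segment_004.nc"]:
    path = inputdir / "ocnice" / fname
    print(path, "exists:", path.exists())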
Section 4: Build and run the case#
After completing the previous steps, you are ready to build and run your CESM case. Begin by navigating to the case root directory specified during case creation. Before proceeding, review the user_nl_mom file located in the case directory. This file contains MOM6 parameter settings that were automatically generated by CrocoDash. Carefully examine these parameters and make any necessary adjustments to fine-tune the model for your specific requirements. While CrocoDash aims to provide a solid starting point, further tuning is typically necessary to adapt the model to your use case.
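For example, if the run proves unstable at the timesteps chosen from the grid resolution, you might shorten them in user_nl_mom (the values below are illustrative, not a recommendation):
! Illustrative user_nl_mom tweak (hypothetical values)
DT = 20.0
DT_THERM = 80.0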
Once you have reviewed and modified the parameters as needed, you can build and execute the case using the following commands:
./case.build
./case.submit