MODELS IN OCEANOGRAPHY
Ocean models are numerical models of ocean properties and their circulation. They play a large role in our understanding of the ocean's influence on weather and climate. Robert Stewart wrote in the “Introduction to Physical Oceanography”: Analytic solutions of the equations of motion are impossible to obtain for typical oceanic flows. The problem is due to non-linear terms in the equations of motion, turbulence, and the need for realistic shapes for the sea floor and coastlines. We have also seen how difficult it is to describe the ocean from measurements. Satellites can observe some processes almost everywhere every few days. But they observe only some processes, and only near or at the surface. Ships and floats can measure more variables, and deeper into the water, but the measurements are sparse. Hence, numerical models provide the only useful, global view of ocean currents. [Source: Robert Stewart, “Introduction to Physical Oceanography”, Texas A&M University, 2008]
Numerical models of ocean currents have many advantages. They simulate flows in realistic ocean basins with a realistic sea floor. They include the influence of viscosity and non-linear dynamics. And they can calculate possible future flows in the ocean. Perhaps, most important, they interpolate between sparse observations of the ocean produced by ships, drifters, and satellites.
While ocean models start with the same continuous equations, the discrete equations possess important distinctions that play a role in the simulation features. In particular, different parts of the ocean are most naturally represented using different vertical coordinate systems. Level-coordinate models consider each box at the same level, with geopotential or pressure the common basis for defining levels. These models are relatively easy to code and have been the basis for ocean climate modeling since the 1960s. Such models may also have particular advantages in representing the transition between the poorly stratified mixed layer and the interior ocean where flow is predominantly along density surfaces. In contrast, isopycnal models handle the interior ocean more naturally, and may also have significant advantages in representing thin dense overflows. In shallow shelf regions, where the top and bottom boundary layers are thought to be most important, many oceanographers use terrain-following coordinate systems. [Source: NOAA]
Important Points as of 2008: 1) Numerical models are used to simulate oceanic flows with realistic and useful results. The most recent models include heat fluxes through the surface, wind forcing, mesoscale eddies, realistic coasts and sea-floor features, and more than 20 levels in the vertical. 2) Current models are now so good, with resolution near 0.1 degrees, that they show previously unknown aspects of the ocean circulation. 3) Numerical models can be forced by real-time oceanographic data from ships and satellites to produce forecasts of oceanic conditions, including El Niño in the Pacific, and the position of the Gulf Stream in the Atlantic. 4) Coupled ocean-atmosphere models have much coarser spatial resolution so that they can be integrated for hundreds of years to simulate the natural variability of the climate system and its response to increased CO2 in the atmosphere.
Numerical models are very widely used for many purposes in oceanography. For our purpose we can divide models into two classes: Mechanistic models are simplified models used for studying processes. Because the models are simplified, the output is easier to interpret than output from more complex models. Many different types of simplified models have been developed, including models for describing planetary waves, the interaction of the flow with sea-floor features, or the response of the upper ocean to the wind. These are perhaps the most useful of all models because they provide insight into the physical mechanisms influencing the ocean. The development and use of mechanistic models is, unfortunately, beyond the scope of this book. Simulation models are used for calculating realistic circulation of oceanic regions. The models are often very complex because all important processes are included, and the output is difficult to interpret.
Websites and Resources: National Oceanic and Atmospheric Administration (NOAA) noaa.gov; “Introduction to Physical Oceanography” by Robert Stewart, Texas A&M University, 2008 uv.es/hegigui/Kasper; Woods Hole Oceanographic Institution whoi.edu; Cousteau Society cousteau.org; Monterey Bay Aquarium montereybayaquarium.org
Limitations of Models in Oceanography
Numerical models are not perfect. They solve discrete equations, which are not the same as the equations of motion described in earlier chapters. And numerical models cannot reproduce all the turbulence of the ocean, because the grid points are tens to hundreds of kilometers apart. The influence of turbulent motion over smaller distances must be calculated from theory, and this introduces errors. [Source: Robert Stewart, “Introduction to Physical Oceanography”, Texas A&M University, 2008]
David Berlinski said in 1996: “There is a world of difference between the character of the fundamental laws, on the one hand, and the nature of the computations required to breathe life into them, on the other”. According to the “Introduction to Physical Oceanography”: The models can never give complete descriptions of the oceanic flows even if the equations are integrated accurately. The problems arise from several sources. Analytic solutions of the equations of motion are impossible to obtain for typical oceanic flows. The problem is due to non-linear terms in the equations of motion, turbulence, and the need for realistic shapes for the sea floor and coastlines. We have also seen how difficult it is to describe the ocean from measurements. Satellites can observe some processes almost everywhere every few days. But they observe only some processes, and only near or at the surface. Ships and floats can measure more variables, and deeper into the water, but the measurements are sparse. Hence, numerical models provide the only useful, global view of ocean currents. Let’s look at the accuracy and validity of the models, keeping in mind that although they are only models, they provide a remarkably detailed and realistic view of the ocean.
Given that we cannot do things ‘right’, is it better to do nothing? That is not an option. ‘Nothing’ means applying viscous goo and wishing for an ever bigger computer. Can we do better? For example, can we guess a higher-entropy configuration toward which the eddies tend to drive the ocean (a tendency that competes with the imposed forcing and dissipation)? All models must be run to calculate one to two decades of variability before they can be used to simulate the ocean. This is called spin-up. Spin-up is needed because the initial conditions for density, the fluxes of momentum and heat through the sea surface, and the equations of motion are not all consistent.
Models of the ocean must run on available computers. This means oceanographers must further simplify their models. We use the hydrostatic and Boussinesq approximations, and we often use equations integrated in the vertical (the shallow-water equations). We do this because we cannot yet run the most detailed models of oceanic circulation for thousands of years to understand the role of the ocean in climate.
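The vertically integrated (shallow-water) equations mentioned above can be illustrated with a minimal sketch. The following is a toy 1D linearized version (flat bottom, no rotation) with made-up grid and depth values, not a fragment of any real ocean model: a surface bump splits into two gravity waves travelling at speed √(gH).

```python
import numpy as np

# Minimal 1D linearized shallow-water sketch (flat bottom, no rotation):
#   du/dt = -g dh/dx,   dh/dt = -H du/dx
# All numbers are illustrative assumptions, not values from any real model.
g, H = 9.81, 4000.0               # gravity (m/s^2), mean depth (m)
nx, dx = 200, 10_000.0            # grid points and spacing (m)
dt = 0.5 * dx / np.sqrt(g * H)    # time step obeying the CFL condition

x = np.arange(nx) * dx
h = np.exp(-((x - x.mean()) / (20 * dx)) ** 2)  # initial surface bump (m)
u = np.zeros(nx)

for _ in range(100):              # forward-backward time stepping
    u -= dt * g * np.gradient(h, dx)
    h -= dt * H * np.gradient(u, dx)

print(round(float(h.max()), 3))   # bump has split into two smaller waves
```

Real simulation models solve far richer equations, but the time-step restriction visible in the CFL condition here is the same one that made rigid lids and coarse grids attractive in early models.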
Early Numerical Models in Oceanography
The first simulation model was developed in the late 1960s by Kirk Bryan and Michael Cox at the Geophysical Fluid Dynamics Laboratory in Princeton. According to the “Introduction to Physical Oceanography”: They calculated the 3-dimensional flow in the ocean using the continuity and momentum equation with the hydrostatic and Boussinesq approximations and a simple equation of state. Such models are called primitive equation models because they use the basic, or primitive form of the equations of motion. The equation of state allows the model to calculate changes in density due to fluxes of heat and water through the surface, so the model includes thermodynamic processes. [Source: Robert Stewart, “Introduction to Physical Oceanography”, Texas A&M University, 2008]
The Bryan-Cox model used large horizontal and vertical viscosity and diffusion to eliminate turbulent eddies having diameters smaller than about 500 kilometers, which is a few grid points in the model. It had complex coastlines, smoothed sea-floor features, and a rigid lid. The rigid lid was needed to eliminate ocean-surface waves, such as tides and tsunamis, that move far too fast for the coarse time steps used by all simulation models. The rigid lid had, however, disadvantages. Islands substantially slowed the computation, and the sea-floor features were smoothed to eliminate steep gradients.
The first simulation model was regional. It was quickly followed by a global model with a horizontal resolution of 2 degrees and with 12 levels in the vertical. The model ran far too slowly even on the fastest computers of the day, but it laid the foundation for more recent models. The coarse spatial resolution required that the model have large values for viscosity, and even regional models were too viscous to have realistic western boundary currents or mesoscale eddies.
Since those times, the goal has been to produce models with ever finer resolution, more realistic modeling of physical processes, and better numerical schemes. Computer technology is changing rapidly, and models are evolving rapidly. The output from the most recent models of the north Atlantic, which have a resolution of 0.03 degrees, looks very much like the real ocean. Models of other areas show previously unknown currents near Australia and in the south Atlantic.
Ocean and atmosphere models use very different spacing of grid points. As a result, ocean modeling lags about a decade behind atmosphere modeling. Dominant ocean eddies are 1/30 the size of dominant atmosphere eddies (storms). But ocean features also evolve at 1/30 the rate of features in the atmosphere. Thus ocean models running for, say, one year have 30 × 30 = 900 times as many horizontal grid points as the atmosphere models, but 1/30 the number of time steps. Both have about the same number of grid points in the vertical. As a result, ocean models run about 30 times slower than atmosphere models of the same complexity.
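The scaling argument above reduces to simple arithmetic, sketched here for clarity (a back-of-envelope estimate, not a benchmark of any particular model):

```python
# Ocean eddies are ~1/30 the size of atmospheric storms, so an ocean model
# needs ~30x finer grid in each horizontal direction; ocean features evolve
# ~30x more slowly, allowing ~1/30 as many time steps for the same period.
grid_factor = 30 * 30      # extra horizontal grid points (x and y directions)
time_factor = 1 / 30       # fewer time steps for the same simulated year

relative_cost = grid_factor * time_factor
print(relative_cost)       # -> 30.0, i.e. the ocean model is ~30x slower
```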
Global Ocean Models
Robert Stewart writes in the “Introduction to Physical Oceanography”: Several types of global models are widely used in oceanography. Most have grid points about one tenth of a degree apart, which is sufficient to resolve mesoscale eddies, which have a diameter larger than two to three times the distance between grid points. Vertical resolution is typically around 30 vertical levels. Models include: 1) realistic coasts and bottom features; 2) heat and water fluxes through the surface; 3) eddy dynamics; and 4) the meridional-overturning circulation. Many assimilate satellite and float data. The models range in complexity from those that can run on desktop workstations to those that require the world's fastest computers. Models are started from rest with values of density from the Levitus (1982) atlas and integrated for a decade using mean-annual wind stress, heat fluxes, and water flux. The model may be integrated for several more years using monthly wind stress, heat fluxes, and water fluxes. The Bryan-Cox models evolved into several widely used models which are providing impressive views of the global ocean circulation. [Source: Robert Stewart, “Introduction to Physical Oceanography”, Texas A&M University, 2008]
Parallel Ocean Program Model, produced by Smith and colleagues at Los Alamos National Laboratory in the 1990s, is another widely used model growing out of the original Bryan-Cox code. The model includes improved numerical algorithms, realistic coasts, islands, and unsmoothed bottom features. It has 1280 × 896 equally spaced grid points on a Mercator projection extending from 77 degrees S to 77 degrees N, and 20 levels in the vertical. Thus it has 2.2 × 10^7 points, giving a resolution of 0.28 degrees × 0.28 degrees cos θ, which varies from 0.28 degrees (31.25 kilometers) at the equator to 0.06 degrees (6.5 kilometers) at the highest latitudes. The average resolution is about 0.2 degrees. The model is forced by ECMWF (European Centre for Medium-Range Weather Forecasts) wind stress and surface heat and water fluxes.
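The cos θ dependence of the grid spacing can be sketched as follows. The km-per-degree constant is an approximation chosen here for illustration, which is why the high-latitude value comes out near 7 km rather than the 6.5 km quoted in the text:

```python
import math

# Grid spacing of a Mercator grid that is 0.28 degrees at the equator and
# shrinks as cos(latitude), following the description of the POP grid above.
EQUATOR_SPACING_DEG = 0.28
KM_PER_DEG = 111.6   # approximate km per degree of latitude (an assumption)

def spacing_km(lat_deg):
    """Grid spacing in km at a given latitude."""
    return EQUATOR_SPACING_DEG * math.cos(math.radians(lat_deg)) * KM_PER_DEG

print(round(spacing_km(0.0), 2))   # ~31.25 km at the equator
print(round(spacing_km(77.0), 2))  # ~7 km near the model's poleward limit
```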
All the models just described use x, y, z coordinates. Such a coordinate system has both advantages and disadvantages. It can have high resolution in the surface mixed layer and in shallower regions. But it is less useful in the interior of the ocean. Below the mixed layer, mixing in the ocean is easy along surfaces of constant density, and difficult across such surfaces. A more natural coordinate system in the interior of the ocean uses x, y, ρ, where ρ is density. Such a model is called an isopycnal model. Essentially, ρ(z) is replaced with z(ρ). Because isopycnal surfaces are surfaces of constant density, horizontal mixing is always on constant-density surfaces in this model.
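The swap of ρ(z) for z(ρ) described above can be made concrete with a small sketch. The density profile here is synthetic, purely for illustration; the key requirement is that density increase monotonically with depth so the inversion is well defined:

```python
import numpy as np

# Given a stable (monotonically increasing) density profile rho(z),
# tabulate z(rho) by interpolation -- the essence of an isopycnal coordinate.
z = np.linspace(0.0, 4000.0, 41)                 # depth (m), positive down
rho = 1025.0 + 2.5 * (1.0 - np.exp(-z / 700.0))  # density (kg/m^3), synthetic

def depth_of_isopycnal(rho_target):
    """Depth at which density equals rho_target, i.e. z as a function of rho."""
    return float(np.interp(rho_target, rho, z))  # valid because rho is monotonic

print(round(depth_of_isopycnal(1026.0), 1))      # depth of the 1026 kg/m^3 surface
```

In a real isopycnal model the layers are material surfaces that move with the flow, but this inversion captures why the coordinate is natural: mixing along a model layer is automatically mixing along a constant-density surface.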
Hybrid Coordinate Ocean Model (HYCOM) uses different vertical coordinates in different regions of the ocean, combining the best aspects of z-coordinate and isopycnal-coordinate models. The hybrid model evolved from the Miami Isopycnic-Coordinate Ocean Model. It is a primitive-equation model driven by wind stress and heat fluxes. It has a realistic mixed layer and improved horizontal and vertical mixing schemes that include the influences of internal waves, shear instability, and double-diffusion. The model results from collaborative work among investigators at many oceanographic laboratories.
Modular Ocean Model (MOM)
The Geophysical Fluid Dynamics Laboratory's Modular Ocean Model (MOM) is the canonical large-scale ocean climate model used around the world. MOM4 formed the basis of the CM2.1, CM2M, ESM2M, and CM2.5 climate models used for high-end modeling that contributed to the IPCC AR4 and AR5 assessments. MOM5, featuring the Eulerian-algorithm code, was released in October 2012. [Source: NOAA, 2016]
Modular Ocean Model (MOM) consists of a large set of modules that can be configured to run on many different computers to model many different aspects of the circulation. The source code is open and free, and it is in the public domain. The model is widely used for climate studies and for studying the ocean’s circulation over a wide range of space and time scales. Because MOM is used to investigate processes which cover a wide range of time and space scales, the code and manual are lengthy. However, it is far from necessary for the typical ocean modeler to become acquainted with all of its aspects. Indeed, MOM can be likened to a growing city with many different neighborhoods. Some of the neighborhoods communicate with one another, some are mutually incompatible, and others are basically independent. This diversity is quite a challenge to coordinate and support. [Source: Robert Stewart, “Introduction to Physical Oceanography”, Texas A&M University, 2008]
MOM6 incorporates the Lagrangian vertical coordinate work in GOLD, including a conservative representation of wetting and drying essential for evolving ice shelf grounding lines, along with key physical parameterizations and analysis features of MOM5. In contrast to MOM5, which uses the Arakawa B-grid, MOM6 uses an Arakawa C-grid, which is more appropriate for mesoscale eddy-rich simulations. MOM6 also has a tracer sub-cycling time-stepping scheme that allows for an efficient incorporation of many bio-geochemical constituents. The Lagrangian treatment of the vertical supports a general coordinate capability, including hybrid vertical coordinates. Hybrid coordinates have been shown to reduce spurious mixing in the ocean interior and permit a more faithful representation of overflows, both of which are important for studying ocean climate on centennial and longer time scales. MOM6 is the ocean code used in OM4, the ice-ocean component of the latest GFDL coupled climate model, CM4, the earth system model, ESM4, as well as the seasonal-to-decadal prediction model, SPEAR. Additionally, MOM6 is part of the long-standing MOM community of codes and has adopted the open-development paradigm, thus encouraging participation among government agencies and academic partners.
The Generalized Ocean Layer Dynamics (GOLD) is a generalized Lagrangian-coordinate ocean code. GOLD can be configured using geopotential (z*), terrain-following (σ), layered isopycnal, or a continuous isopycnal (ρ) coordinate for describing the vertical. Nonetheless, the principal use of GOLD has been as an isopycnal-coordinate ocean climate model, following in the lineage of the Hallberg Isopycnal Model (HIM).
Regional Oceanic Modeling System
Stewart writes: Regional Oceanic Modeling System (ROMS) is a regional model that can be embedded in models of much larger regions. It is widely used for studying coastal current systems closely tied to flow further offshore, for example, the California Current. ROMS is a hydrostatic, primitive equation, terrain-following model using stretched vertical coordinates, driven by surface fluxes of momentum, heat, and water. It has improved surface and bottom boundary layers (Shchepetkin and McWilliams, 2004). [Source: Robert Stewart, “Introduction to Physical Oceanography”, Texas A&M University, 2008]
Climate models are used for studies of large-scale hydrographic structure, climate dynamics, and water-mass formation. These models are the same as the eddy-admitting, primitive equation models I have just described except the horizontal resolution is much coarser because they must simulate ocean processes for decades or centuries. As a result, they must have high dissipation for numerical stability, and they cannot simulate mesoscale eddies. Typical horizontal resolutions are 2 degrees to 4 degrees. The models tend, however, to have the high vertical resolution necessary for describing the deep circulation important for climate.
Dartmouth Gulf of Maine Model developed by Lynch et al in the 1990s is a 3- dimensional model of the circulation using a triangular, finite-element grid. The size of the triangles is proportional to both depth and the rate of change of depth. The triangles are small in regions where the bottom slopes are large and the depth is shallow, and they are large in deep water. The variable mesh is especially useful in coastal regions where the depth of water varies greatly. Thus the variable grid gives highest resolution where it is most needed.
The model uses roughly 13,000 triangles to cover the Gulf of Maine and nearby waters of the north Atlantic. Minimum size of the elements is roughly one kilometer. The model has 10 to 40 horizontal layers. The vertical spacing of the layers is not uniform. Layers are closer together near the top and bottom and they are more widely spaced in the interior. Minimum spacing is roughly one meter in the bottom boundary layer.
Stewart writes: Storms coming ashore across wide, shallow, continental shelves drive large changes of sea level at the coast called storm surges. The surges can cause great damage to coasts and coastal structures. Intense storms in the Bay of Bengal have killed hundreds of thousands in a few days in Bangladesh. Because surges are so important, government agencies in many countries have developed models to predict the changes of sea level and the extent of coastal flooding. [Source: Robert Stewart, “Introduction to Physical Oceanography”, Texas A&M University, 2008]
Calculating storm surges is not easy. Here are some reasons, in rough order of importance. 1) The distribution of wind over the ocean is not well known. Numerical weather models calculate wind speed at a constant pressure surface, while storm-surge models need wind at a constant height of 10 meters. Winds in bays and lagoons tend to be weaker than winds just offshore because nearby land distorts the airflow, and this is not included in the weather models. 2) The shoreward extent of the model’s domain changes with time. For example, if sea level rises, water will flood inland, and the boundary between water and sea moves inland with the water. 3) The drag coefficient of wind on water is not well known for hurricane-force winds. 4) The drag coefficient of water on the seafloor is also not well known. 5) The models must include waves and tides, which influence sea level in shallow waters. 6) Storm-surge models must include the currents generated in a stratified, shallow sea by wind.
To reduce errors, models are tuned to give results that match conditions seen in past storms. Unfortunately, those past conditions are not well known. Changes in sea level and wind speed are rarely recorded accurately in storms except at a few, widely spaced locations. Yet storm-surge heights can change by more than a meter over distances of tens of kilometers. Despite these problems, models give very useful results.
SLOSH and ADCIRC
Sea, Lake, and Overland Surges Model (SLOSH) is used by NOAA for forecasting storm surges produced by hurricanes coming ashore along the Atlantic and Gulf coasts of the United States. The model is the result of a lifetime of work by Chester Jelesnianski. In developing the model, Jelesnianski paid careful attention to the relative importance of errors in the model. He worked to reduce the largest errors, and ignored the smaller ones. For example, the distribution of winds in a hurricane is not well known, so it makes little sense to use a spatially varying drag coefficient for the wind. Thus, Jelesnianski used a constant drag coefficient in the air, and a constant eddy stress coefficient in the water.
SLOSH calculates water level from depth-integrated, quasi-linear, shallow-water equations. Thus it ignores stratification. It also ignores river inflow, rain, and tides. The latter may seem strange, but the model is designed for forecasting. The time of landfall cannot be forecast accurately, and hence the height of the tides is mostly unknown. Tides can be added to the calculated surge, but the nonlinear interaction of tides and surge is ignored. The model is forced by idealized hurricane winds. It needs only the atmospheric pressure at the center of the storm, the distance from the center to the area of maximum winds, and the forecast storm track and speed along the track.
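An idealized, axisymmetric wind profile of the kind such models are forced with can be sketched as follows. This is a simple Rankine-style vortex chosen for illustration, not SLOSH's actual formulation, and the v_max and r_max values are assumptions:

```python
# Idealized hurricane wind: solid-body rotation inside the radius of maximum
# winds, 1/r decay outside it. Parameter values are hypothetical.
def wind_speed(r_km, v_max=50.0, r_max=40.0):
    """Tangential wind (m/s) at radius r_km (km) from the storm center."""
    if r_km <= r_max:
        return v_max * r_km / r_max   # solid-body rotation inside the eyewall
    return v_max * r_max / r_km       # 1/r decay outside the eyewall

print(wind_speed(40.0))   # 50.0 m/s at the radius of maximum winds
print(wind_speed(80.0))   # 25.0 m/s at twice that radius
```

The appeal of such parametric winds for forecasting is visible here: the whole field follows from a handful of numbers the forecaster can actually supply, matching the short input list described above.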
In preparation for hurricanes coming ashore near populated areas, the model has been adapted for 27 basins from Boston Harbor Massachusetts to Laguna Madre Texas. The model uses a fixed polar mesh. Mesh spacing begins with a fine mesh near the pole, which is located near the coastal city for which the model is adapted. The grid stretches continuously to a coarse mesh at distant boundaries of a large basin. Such a mesh gives high resolution in bays and near the coast where resolution is most needed. Using measured depths at sea and elevations on land, the model allows flooding of land, overtopping of levees and dunes, and sub-grid flow through channels between offshore islands. Sea level calculated from the model has been compared with heights measured by tide gauges for 13 storms, including Betsy: 1965, Camille: 1969, Donna: 1960, and Carla: 1961. The overall accuracy is ±20 percent.
Advanced Circulation Model (ADCIRC) is an experimental model for forecasting storm surges produced by hurricanes coming ashore along the Atlantic and Gulf coasts of the United States. The model uses a finite-element grid, the Boussinesq approximation, quadratic bottom friction, and vertically integrated continuity and momentum equations for flow on a rotating earth.
It can be run as either a two-dimensional, depth-integrated model or as a three-dimensional model. Because waves contribute to storm surges, the model includes waves calculated from the WAM third-generation wave model. The model is forced by: 1) high-resolution winds and surface pressure obtained by combining weather forecasts from the NOAA National Weather Service and the National Hurricane Center along the official and alternate forecast storm tracks; 2) tides at the open-ocean boundaries of the model; and 3) sea-surface height and currents at the open-ocean boundaries of the model. The model successfully forecast the Hurricane Katrina storm surge, giving values in excess of 6.1 meters near New Orleans.
Stewart writes: The great economic importance of the coastal zone has led to the development of many different numerical models for describing coastal currents, tides, and storm surges. The models extend from the beach to the continental slope, and they can include a free surface, realistic coasts and bottom features, river runoff, and atmospheric forcing. Because the models don’t extend very far into deep water, they need additional information about deep-water currents or conditions at the shelf break. [Source: Robert Stewart, “Introduction to Physical Oceanography”, Texas A&M University, 2008]
The many different coastal models have many different goals, and many different implementations. Several of the models described above, including MOM and ROMS, have been used to model coastal processes. But many other specialized models have also been developed. Heaps (1987), Lynch et al (1996), and Haidvogel and Beckmann (1998) provide good overviews of the subject. Rather than look at a menu of models, let’s look at two typical models.
Princeton Ocean Model, developed by Blumberg and Mellor in the 1980s and 90s, is widely used for describing coastal currents. It includes thermodynamic processes, turbulent mixing, and the Boussinesq and hydrostatic approximations. The Coriolis parameter is allowed to vary using a beta-plane approximation. Because the model must include a wide range of depths, Blumberg and Mellor used a vertical coordinate σ scaled by the depth of the water: σ = (z − η)/(H + η) (15.1), where z = η(x, y, t) is the sea surface and z = −H(x, y) is the bottom. Sub-grid turbulence is parameterized using a closure scheme whereby eddy diffusion coefficients vary with the size of the eddies producing the mixing and the shear of the flow.
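Equation 15.1 maps every water column onto the same fixed interval, however deep the water: σ = 0 at the moving sea surface and σ = −1 at the sea floor. A minimal sketch with illustrative values:

```python
# Terrain-following (sigma) coordinate of equation 15.1:
#   sigma = (z - eta) / (H + eta)
# z is positive upward; eta is sea-surface height; H is water depth.
def sigma(z, eta, H):
    """Terrain-following vertical coordinate (0 at surface, -1 at bottom)."""
    return (z - eta) / (H + eta)

eta, H = 0.5, 100.0            # illustrative surface height and depth (m)
print(sigma(eta, eta, H))      # 0.0 at the free surface (z = eta)
print(sigma(-H, eta, H))       # -1.0 at the sea floor (z = -H)
```

Because the same σ levels follow the terrain everywhere, a shallow shelf and a deep basin both get the full set of vertical levels, which is why the coordinate suits models spanning a wide range of depths.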
The model is driven by wind stress and heat and water fluxes from meteorological models. The model uses known geostrophic, tidal, and Ekman currents at the outer boundary. The model has been used to calculate the three-dimensional distribution of velocity, salinity, sea level, temperature, and turbulence for up to 30 days over a region roughly 100–1000 kilometers on a side with grid spacing of 1–50 kilometers.
Stewart writes: Many of the models I have described so far have output, such as current velocity or surface topography, constrained by oceanic observations of the variables they calculate. Such models are called assimilation models. In this section, I will consider how data can be assimilated into numerical models.
Let’s begin with a primitive-equation, eddy-admitting numerical model used to calculate the position of the Gulf Stream. Let’s assume that the model is driven with real-time surface winds from the ECMWF (European Centre for Medium-Range Weather Forecasts) weather model. Using the model, we can calculate the position of the current and also the sea-surface topography associated with the current. We find that the position of the Gulf Stream wiggles offshore of Cape Hatteras due to instabilities, and the position calculated by the model is just one of many possible positions for the same wind forcing. Which position is correct, that is, what is the position of the current today? We know, from satellite altimetry, the position of the current at a few points a few days ago. Can we use this information to calculate the current’s position today? How do we assimilate this information into the model? Many different approaches are being explored (Malanotte-Rizzoli, 1996). [Source: Robert Stewart, “Introduction to Physical Oceanography”, Texas A&M University, 2008]
Roger Daley (1991) gives a complete description of how data are used with atmospheric models. Andrew Bennett (1992) and Carl Wunsch (1996) describe oceanic applications. The different approaches are necessary because assimilation of data into models is not easy. 1) Data assimilation is an inverse problem: A finite number of observations are used to estimate a continuous field — a function, which has an infinite number of points. The calculated fields, the solution to the inverse problem, are completely under-determined. There are many fields that fit the observations and the model precisely, and the solutions are not unique. In our example, the position of the Gulf Stream is a function. We may not need an infinite number of values to specify the position of the stream if we assume the position is somewhat smooth in space. But we certainly need hundreds of values along the stream’s axis. Yet, we have only a few satellite points to constrain the position of the Stream. To learn more about inverse problems and their solution, read Parker (1994) who gives a very good introduction based on geophysical examples.
2) Ocean dynamics are non-linear, while most methods for calculating solutions to inverse problems depend on linear approximations. For example, the position of the Gulf Stream is a very nonlinear function of the forcing by wind and heat fluxes over the north Atlantic. 3) Both the model and the data are incomplete and both have errors. For example, we have altimeter measurements only along the tracks such as those shown in figure 2.6, and the measurements have errors of ±2 centimeters. 4) Most data available for assimilation into models comes from the surface, such as AVHRR and altimeter data. Surface data obviously constrain the surface geostrophic velocity, and surface velocity is related to deeper velocities. The trick is to couple the surface observations to deeper currents.
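The under-determination in point 1 can be seen in a toy calculation: estimating a "stream position" at ten along-axis locations from only three observations. All numbers here are made up for illustration; the point is that infinitely many fields fit the data, and a least-squares solver simply picks one (the minimum-norm choice):

```python
import numpy as np

# 10 unknowns, 3 observations: a deliberately underdetermined inverse problem.
n_points, n_obs = 10, 3
A = np.zeros((n_obs, n_points))       # each row observes one location exactly
A[0, 1] = A[1, 4] = A[2, 8] = 1.0
y = np.array([0.3, -0.1, 0.5])        # the few "satellite" observations

# lstsq returns the minimum-norm solution among the infinitely many that fit.
x, *_ = np.linalg.lstsq(A, y, rcond=None)
print(x)   # nonzero only at the 3 observed points; elsewhere unconstrained
```

Real assimilation schemes replace the arbitrary minimum-norm choice with physical constraints from the model dynamics, which is exactly what makes the coupling of sparse surface data to the full field hard.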
Coupled Ocean and Atmosphere Models
Stewart writes: Coupled numerical models of the atmosphere and ocean are used to study the climate, its variability, and its response to external forcing. The most important use of the models has been to study how earth’s climate might respond to a doubling of CO2 in the atmosphere. Much of the literature on climate change is based on studies using such models. Other important uses of coupled models include studies of El Niño and the meridional overturning circulation. The former varies over a few years, the latter varies over a few centuries. Development of the coupled models tends to be coordinated through the World Climate Research Program of the World Meteorological Organization. Many coupled ocean and atmosphere models have been developed. Some include only physical processes in the ocean, atmosphere, and the ice-covered polar seas. Others add the influence of land and biological activity in the ocean. [Source: Robert Stewart, “Introduction to Physical Oceanography”, Texas A&M University, 2008]
Climate System Model developed by the National Center for Atmospheric Research (NCAR) includes physical and biogeochemical influences on the climate system (Boville and Gent, 1998). It has atmosphere, ocean, land-surface, and sea-ice components coupled by fluxes between components. The atmospheric component is the NCAR Community Climate Model, and the oceanic component is a modified version of the Princeton Modular Ocean Model, using the Gent and McWilliams (1990) scheme for parameterizing mesoscale eddies. Resolution is approximately 2 degrees × 2 degrees with 45 vertical levels in the ocean. The model has been spun up and integrated for 300 years, the results are realistic, and there is no need for a flux adjustment.
Princeton Coupled Model consists of an atmospheric model with a horizontal resolution of 7.5 degrees longitude by 4.5 degrees latitude and 9 levels in the vertical, an ocean model with a horizontal resolution of 4 degrees and 12 levels in the vertical, and a land-surface model. The ocean and atmosphere are coupled through heat, water, and momentum fluxes. Land and ocean are coupled through river runoff. And land and atmosphere are coupled through water and heat fluxes.
Hadley Center Model is an atmosphere-ocean-ice model that minimizes the need for flux adjustments. The ocean component is based on the Bryan-Cox primitive equation model, with realistic bottom features, vertical mixing coefficients from Pacanowski and Philander (1981). Both the ocean and the atmospheric component have a horizontal resolution of 96 × 73 grid points, the ocean has 20 levels in the vertical.
Image Sources: Wikimedia Commons; YouTube, NOAA
Text Sources: National Oceanic and Atmospheric Administration (NOAA) noaa.gov; “Introduction to Physical Oceanography” by Robert Stewart, Texas A&M University, 2008 uv.es/hegigui/Kasper; Wikipedia, National Geographic, Live Science, BBC, Smithsonian, New York Times, Washington Post, Los Angeles Times, The New Yorker, Reuters, Associated Press, Lonely Planet Guides and various books and other publications.
Last Updated March 2023