lec 23: Modelling Climate Change Flashcards
predictive models
complex math-based computer programs which use past and present conditions to determine likely future conditions
predictive models are the most common models we use in climate science
rationale behind all these types of models
- what has happened in the past and is currently happening
- and why/how it happened
- will probably remain true for future events
the system will remain similar over the time period covered by the predictive model
General Circulation Model (GCM)
utilizes mathematical models to simulate the circulation of energy and/or mass within and between the atmosphere and ocean
GCMs are incredibly complex and involve data collected by thousands of individuals over long time scales
(the average model contains enough code to fill ~18 000 printed pages of text)
output often in the petabytes of data (1 petabyte = 1024 TB)
requires very high computing power (this is often a limiting factor)
variables included in GCMs
air temperature
vapour pressure
solar radiation
air pressure
albedo and more
mainly physics based energy and mass fluxes
applying GCM to the terrestrial surface
the land surface and atmosphere are divided equally into a grid (divisions extend vertically into the atmosphere)
The GCM will simulate climatic conditions per unit of area
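a minimal sketch of the "simulate per unit of area" idea (my own toy illustration, not from the lecture): each grid cell carries its own state and gets its own physics-based energy-balance update; all parameter values below are made up

```python
# Toy sketch of per-grid-cell simulation: each cell gets its own
# energy-balance update. Values are illustrative, not from a real GCM.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant (W m^-2 K^-4)

def step_cell(temp_k, solar_in, albedo, heat_capacity, dt):
    """Advance one grid cell's temperature by one time step."""
    absorbed = solar_in * (1.0 - albedo)  # shortwave energy absorbed
    emitted = SIGMA * temp_k ** 4         # longwave energy emitted
    return temp_k + dt * (absorbed - emitted) / heat_capacity

# A tiny 2 x 2 "planet": one (temperature, albedo) pair per cell.
grid = [[(288.0, 0.3), (250.0, 0.6)],
        [(300.0, 0.1), (288.0, 0.3)]]

for row in grid:
    for temp, albedo in row:
        new_t = step_cell(temp, solar_in=340.0, albedo=albedo,
                          heat_capacity=1e7, dt=3600.0)
        print(f"{temp:.1f} K -> {new_t:.2f} K")
```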
why are most GCMs very broad in scale -> ~100 km grid size
they are designed to give averages for the planet, not details on a specific area
what is the limitation to the global scale being useful for global predictions?
it misses potentially important local trends (e.g. a global warming trend in the oceans since 1970, while some areas have experienced a still-unexplained cooling trend)
downscaling
global-scale data can be converted to local-scale data (downscaling)
Regional climate models (RCMs)
mathematical models used to convert data calibrated for global regions to apply to small-scale, local topography
note that this does not mean that the local models are more accurate
(simply that the data is now calibrated over a smaller geographic area)
RCMs
RCMs are often how we make predictions about the impacts of climate change on the scale of a country or smaller
RCMs -> not generating new data, but changing the scale of existing data to a smaller unit area
each grid is run as a separate simulation
decreasing grid size drastically increases the number of grids which must be simulated
computing power limitation
computing power currently limits the total area that we can downscale
at some point, downscaling creates too many grids to simulate with current computing power (see the sketch below)
(downscaled maps are only of small regions, not the entire planet)
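a rough back-of-envelope sketch of why grid counts explode as cell size shrinks (my own illustration; the surface-area figure is approximate)

```python
# Approximate number of surface grid cells needed to cover Earth
# at different horizontal resolutions. Illustration only.
EARTH_SURFACE_KM2 = 510e6  # ~510 million km^2

for cell_km in (500, 250, 100, 15, 1):
    n_cells = EARTH_SURFACE_KM2 / cell_km ** 2
    print(f"{cell_km:>4} km cells -> ~{n_cells:,.0f} cells")

# Halving the cell edge quadruples the cell count, so a planet-wide
# 15 km (or finer) grid quickly outruns available computing power.
```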
uncertainty
all predictive models include some elements of uncertainty
uncertainty:
- the parts of a model for which we have insufficient data
- uncertainty can be known or unknown
known = we know we do not have the data
unknown = we don't even realize we are missing the data (MUCH more problematic)
common sources of uncertainty
frequently not possible to know all variables that should be part of a model
even if you know all the variables, can be difficult to determine how important each is
some sources of data do not stay the same over the time period of the study (violating an assumption of predictive models)
impact of uncertainty
greater degree of uncertainty = less accurate model
uncertainty in a model is acceptable (not a flaw)
but it's important to know (or at least have an idea of) where your model has greater uncertainty (we often visually display uncertainties in our projections)
uncertainty in current GCMs
we have drastically decreased uncertainty in our GCMs over time (from the first models in the 1970s to today)
most uncertainty has been reduced by adding new global systems to our models (or by better understanding known systems)
current GCMs
most recent GCMs incorporate biogeochemical cycles
(transfer of chemicals between abiotic and biotic reservoirs)
new class of GCMs are called Earth System Models (ESMs)
Integrated assessment models (IAMs)
the final addition to GCMs has been the integration of predictions based on human activities
Representative concentration pathway (RCP) projections
IAMs are responsible for the type of global temperature projections published by the IPCC
RCP models represent different projections for global climate change based on different scenarios for future CO2 emissions
RCP 1.9
limits global warming to below 1.5C
not considered likely by most
often not even found in IPCC graphs anymore
RCP 2.6
CO2 emissions start declining by 2020 and reach zero by 2100
currently considered the best case scenario
1.5-2C at 2100
RCP 4.5
CO2 emissions peak in 2045 with peak oil
believed to be most likely with no mitigation efforts
2.5-3C at 2100
RCP 8.5
CO2 emissions do not peak before 2100
business as usual scenario
5C by 2100
reducing the scale of GCMs over time
greater computing power, more data
decreasing the scale from 500 km to 250 km
requires 10X increase in computing power
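one plausible way to arrive at a factor of roughly 10 (a back-of-envelope assumption, not a derivation from the lecture): halving the horizontal grid spacing quadruples the number of cells, and the time step must also shrink with the cell size, multiplying the cost again

```python
# Back-of-envelope cost scaling when going from 500 km to 250 km cells.
old_km, new_km = 500, 250
cell_factor = (old_km / new_km) ** 2  # 4x more horizontal cells
timestep_factor = old_km / new_km     # ~2x more (shorter) time steps
print(cell_factor * timestep_factor)  # ~8x, i.e. roughly the quoted 10X
```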
how to check your model to see if it’s accurate?
Hindcasting
hindcasting
- Using past environmental (pre-1850) conditions to calibrate a model
- test your calibrated model to see if it can accurately “predict” current (2024) conditions (for a model calibrated with past data -> current conditions represent the future)
- if your model can accurately "predict" conditions in 2024, it will likely be accurate in predicting conditions beyond 2024 (in the future!)
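a toy illustration of the hindcasting workflow (my own sketch with fake data and a trivial stand-in "model"; a real GCM is obviously not a straight-line fit, and the 1950 split year is arbitrary)

```python
import numpy as np

# Toy hindcast: calibrate on early years, then check whether the
# calibrated model "predicts" the held-out recent years.
years = np.arange(1850, 2025)
temps = 0.008 * (years - 1850) + np.random.normal(0, 0.1, years.size)  # fake data

calib = years < 1950                   # "past" data used for calibration
coeffs = np.polyfit(years[calib], temps[calib], deg=1)

predicted = np.polyval(coeffs, years[~calib])  # the "future" from the model's view
error = np.abs(predicted - temps[~calib]).mean()
print(f"mean hindcast error: {error:.3f} degrees")  # small error -> more trust in forward runs
```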
global systems poorly understood or still inherently difficult to model in the GCM framework
clouds
snow and ice formation in the atmosphere
jet stream
unpredictable rare events (major volcanic eruptions …)
Clouds
cover 2/3 of the planet's surface at any one time (but individual clouds can form and break down within minutes)
depending on a cloud's location (globally and within the atmosphere), it can either warm or cool the planet
no relevant historical data on cloud formation to calibrate models (no proxy data and few historical direct observations)
cloud formation (condensation) and breakdown (evaporation or precipitation) are frequently short-term, localized events that cannot be captured at the scale of most GCMs, or even most RCMs
the smallest RCM grids are still ~15 km long (clouds are much smaller than 15 km; some are as small as 1 km)
to attempt to model cloud systems, we must parameterize the dynamics of cloud formation and breakdown
Parameterize
Average a variable over a large area as accurately as possible (use that average as a best estimate for a smaller area)
ex: rate of cloud formation averaged across 4 grids: average of 0.375 applied to all 4 grids
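a minimal sketch of the averaging idea; the four per-grid rates are hypothetical values chosen only to reproduce the 0.375 average from the example

```python
# Parameterization as block averaging: replace four fine-grid values
# with their mean, applied uniformly. The input rates are made up.
fine_grid_rates = [0.2, 0.3, 0.5, 0.5]             # hypothetical cloud-formation rates
avg = sum(fine_grid_rates) / len(fine_grid_rates)  # 0.375
parameterized = [avg] * len(fine_grid_rates)       # same estimate for all 4 grids
print(avg, parameterized)
```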
convective-permitting models
simulate on an exceptionally small scale
few km wide
small enough to model most known mechanisms of cloud formation without the need for parameterization
currently very limited in scope
smaller scale produces MANY more grids to analyse than a larger scale
currently only available at the regional or country scale
better understanding of cloud formation
more accurate modelling of extreme rainfall events
more accurate modelling of changes in precipitation rates
accuracy of GCMs
models are continuously updated as :
- we obtain more data concerning anthropogenic activities
- we gain a better understanding of global systems
- computing power improves
we have been modelling climate with a focus on anthropogenic activity since the 1970s
modern 21st century models have minimized uncertainty as much as possible
(but are they really accurate?)
are they accurate?
recent studies have examined predictions from older models (even models as far back as the 1970s)
the vast majority have proven to accurately predict current global temperatures
older models were created with much less computing power and much less data
in some cases calculations were still conducted with punch cards