Conservation and Quantitative Methods Flashcards
pseudoreplication
The use of inferential statistics to test for treatment effects with data from experiments where either treatments are not replicated (though samples may be) or replicates are not statistically independent.
types of pseudoreplication
simple, sacrificial, and temporal
simple pseudoreplication
samples are grouped together in a way that creates nonrandom differences between groups that are not treatment effects; for example, two separate plots where all experimental organisms are in one plot and all controls are in the other
sacrificial pseudoreplication
data is pooled prior to statistical analyses, OR two or more samples taken from each experimental unit are treated as independent replicates; variance between treatments exists, but is inappropriately mixed with variance within treatments when the replicates are pooled
temporal pseudoreplication
samples are not taken from separate experimental units (as in simple pseudoreplication) but sequentially, creating nonrandom differences between grouped samples; samples taken from the same individuals at different time points are treated as independent samples, even though they will be correlated because they come from the same individual; repeated sampling of experimental units is appropriate in itself; it is only treating the repeated measures as independent data points that is inappropriate
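A minimal sketch of why repeated measures are not independent, using simulated data (the numbers are illustrative, not from any real study; assumes numpy is available):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate 1000 individuals, each measured at two time points.
# Both measurements share the individual's baseline, so they are
# correlated -- treating them as independent replicates would be
# temporal pseudoreplication.
baseline = rng.normal(0.0, 1.0, 1000)        # individual effect
t1 = baseline + rng.normal(0.0, 0.3, 1000)   # measurement at time 1
t2 = baseline + rng.normal(0.0, 0.3, 1000)   # measurement at time 2

r = np.corrcoef(t1, t2)[0, 1]
print(f"correlation between repeated measures: {r:.2f}")
```

Because the individual-level variance dominates the measurement noise, the two time points come out strongly correlated rather than independent.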
evidence that climate change is occurring?
global temperature increases - surface temperatures have risen about 1 degree celsius since the late 1800s, and ocean temperatures about 0.3 degrees celsius since 1969 (Levitus, 2017) - current warming is occurring roughly 10 times faster than the average rate of warming after an ice age; ice cores showing history of temperatures; melting glaciers and ice sheets; satellites show us that snow cover is decreasing; sea levels rising - 20 cm in the past century; extreme weather events are increasing in frequency and intensity; ocean acidity has increased by about 30%
evidence that climate change is human caused?
“Since systematic scientific assessments began in the 1970s, the influence of human activity on the warming of the climate system has evolved from theory to established fact.” (IPCC); increasing greenhouse gases, especially carbon dioxide from the burning of fossil fuels (transportation, industrial/factories) as well as deforestation (which reduces CO2 sinks), but also methane (landfills, agriculture, and natural gas leaks), nitrous oxide (agriculture/fossil fuels/burning vegetation), and chlorofluorocarbons (refrigerants, solvents). These greenhouse gases trap the heat within the atmosphere, preventing it from dispersing at the rate it normally would.
Describe some of the effects of climate change on species distributions, community composition, and ecosystem function
extreme weather events/natural disasters (drought, wildfire, hurricane) increase species losses across ecosystems in a stochastic way; community composition is also changing (alpine bumble bee communities are shifting toward species better suited for warmer temperatures - Scharnhorst et al., 2023; plant communities in most ecoregions in North, Central and South America have experienced thermophilization over the past four decades - Feeley, 2020), as is ecosystem function (kelp forests provide food and shelter for animals in the community, as well as ecosystem services for humans, such as sequestering carbon, reducing the force of storm-driven tides and surges, and acting like a trash fence to retain nearshore sand and prevent erosion, but are declining due to increased ocean temperatures and acidification - Smale, 2019); range shifts (a meta-analysis of 764 species, mostly arthropods, found an average rate of poleward migration of 16.9 km/decade - Chen et al., 2011); phenology mismatches
How could climate change influence evolutionary processes?
Evolutionary adaptation can be rapid and potentially help species counter stressful conditions or realize ecological opportunities arising from climate change; natural selection - changing or increasing selective pressures; gene flow - increasing or decreasing as species shift ranges; genetic drift - if populations become smaller or isolated due to die-offs or dispersal, would affect them more
paleontological example of climate change influencing evolutionary processes
(Simões, 2022) built a time tree for the early evolution of reptiles and their closest relatives to reconstruct how the Permian-Triassic climatic crises shaped their long-term evolutionary trajectory. By combining rates of phenotypic evolution, mode of selection, body size, and global temperature data, the study revealed an intimate association between reptile evolutionary dynamics and climate change in the deep past. The origin and phenotypic radiation of reptiles was not solely driven by ecological opportunity following the end-Permian extinction, as previously thought, but was also the result of multiple adaptive responses to climatic shifts spanning 57 million years. A strongly directional evolutionary regime in archelosaurs at the end of the Permian is associated with an adaptive response to those fast climatic shifts. Combined with ecological opportunity arising from the demise of several groups of early synapsids after the end-Guadalupian and Permian-Triassic extinctions, climate change-driven adaptive evolution resulted in the rapid diversification of the vast diversity of reptile morphotypes that came to characterize worldwide ecosystems later on during the Triassic. Smaller body sizes were favored (their larger surface area-to-volume ratios make smaller animals better capable of heat exchange with the surrounding environment). Other responses included accelerated rates of morphological evolution among large-bodied archosauromorph reptiles, invasion of the marine realm by ichthyosauromorphs and sauropterygians, and maintenance of a small-bodied morphotype in lepidosauromorphs.
contemporary example of climate change influencing evolutionary processes
Brassica rapa blooms nearly 2 days earlier than pre-drought plants in response to a multi-year drought attributed to climate change (Franks, 2007). One of the best examples of plant evolutionary response to an extreme climatic event comes from a resurrection study of the annual field mustard Brassica rapa. The investigators collected a large sample of seeds from two California populations in 1997, after several wet years, and again in 2004 after several years of severe spring drought. They then grew population samples of genotypes collected in 1997 and in 2004 together in a common garden. The 2004 genotypes flowered significantly earlier in the common garden than the 1997 genotypes. Experimental water manipulations showed that early drought onset strongly selected for earlier flowering, evidence that the observed evolutionary change was adaptive. These B. rapa populations also display a genomic signature of temporal drought adaptation. A genome-wide scan for Fst outlier loci found 855 genes with significant temporal differentiation in allele frequencies between the 1997 and 2004 samples. Many had annotations suggesting involvement in flowering time and drought response. However, only 11 genes exhibited parallel shifts in allele frequencies in both populations. Thus, rapid adaptation to drought in the two populations appears to have occurred along largely independent trajectories.
Pros and cons of conserving ecological and evolutionary processes, rather than preserving specific phenotypic variants - Moritz (1999)
Can still help individual species, but focusing more on overall eco and evo processes until extinction rates begin to decline; gene flow (via connecting fragmented habitats) helps populations, especially small ones; increase genetic diversity; certain phenotypic variants may be well suited for their current environment, but if they don’t have sufficient underlying genetic diversity, they will not be able to adapt to environmental changes; however, may lose certain species that are needed, like keystone species, if they aren’t given enough individual attention
fixed effects
variables whose effects are assumed constant across individuals; these variables don’t change, or change at a constant rate, over time; examples: species, feather color, sex
random effects
variables whose levels vary across individuals and can be thought of as a random sample from a larger population (e.g., colony, site); random effects allow us to control for noise caused by randomly chosen groups or populations; we are interested in the variance they contribute, not as much in each group’s individual effect; modeled as a random intercept or random slope
mixed model
mixed effects model is a type of regression model that combines both fixed and random effects. Mixed effects models are useful when there is variation in the effect of a factor across groups or individuals, but some of the variation is systematic (i.e., can be explained by specific variables) and some is random (i.e., cannot be explained by specific variables).
replication
repetition of an experiment or observation in the same or similar conditions. Replication is important because it adds information about the reliability of the conclusions or estimates to be drawn from the data.
pseudoissue
“those who do not see any problems with reducing spatial and temporal scales in order to obtain replication, and those who understand that experiments must be conducted in spatial and temporal scales relevant for the predictions to be tested, and replicate the experiment as well as possible within this constraint” = sometimes the constraints we work in make true replication impossible, or the more ideal approach is to include pseudoreplication; understand, be aware, avoid in experimental design when possible, and correct for it with models and analysis
pros of observational studies
naturally occurring and not manipulated by the researcher, so the results could be considered more realistic. Observational studies can also allow a researcher to gather a broader set of information, not limited to the narrow scope of an experiment. Non-scientists can contribute to large observational datasets via “citizen science” efforts. Observational studies can also be done in situations where experiments are not possible, such as across large temporal or spatial scales; more generalizable across contexts
cons of observational studies
results of observational studies could be considered less reliable, as the variables are not directly controlled and manipulated; however, I would argue that a well-planned observational study is just as reliable as an experiment. Observational studies are limited in that a specific question can only be asked if it naturally occurs
pros of experimental studies
control for nearly all possible variables, often leading to more confidence in the results, can also be designed to explore a specific question
cons of experimental studies
can be challenging and expensive, and are not possible in every situation. Experiments, especially in ecology, lead to very specific results that are often not applicable to broader questions; not always natural or realistic; missing ecological context of species interactions, ecosystem effects, etc.
experimental studies
researcher manipulates the conditions or treatments the subjects receive in a controlled and randomized way
observational studies
researcher observes the effects of naturally occurring conditions or treatments, without manipulating them
Are the data treated in the same way for experiments vs observational studies (i.e. will the same statistical analyses be applied)?
A key goal for most research ventures is determining causality, which is done by statistical analysis. Ultimately, determining causality differs in experiments compared to observational studies. In experiments, determining causality is often simpler. Or, as put by Paul Holland (1986), “it is not that I believe an experiment is the only proper setting for discussing causality, but I do feel that an experiment is the simplest such setting.” In observational studies, causality is not as simple; however, using correlations, causation can be inferred: in some cases, correlation DOES mean causation. Causality can be determined by coefficients of correlation between variables (Simon, 1954); observational studies will involve many more variables and sources of error, and therefore more complex models
natural experiments
individuals are exposed to the experimental and control conditions that are determined by nature or by other factors outside the control of the investigators. The process governing the exposures arguably resembles random assignment.; no manipulative control, but you can still examine the effects of a certain variable; example: after the clear-cutting of a section of the forest, you could compare the community composition between clear cut and control
under what conditions does it become essential to design an experiment to manipulate behavior?
When the question can no longer be answered by observation alone; for example, it is possible to observe bee behavior, and learn what they forage on by observation. You may notice the bees forage on citrus flowers, and you know that citrus flowers have caffeine in them. How does the caffeine affect their behavior? Does it make them move more/faster, learn better, get sick, change their taste perception, etc.? That must be done by experiment. Both approaches were necessary, because we didn’t know bees consumed caffeine until we saw them do it in nature.
ASA p-values statement
- P-values can indicate how incompatible the data are with a specified statistical model.
- P-values do not measure the probability that the studied hypothesis is true, or the probability that the data were produced by random chance alone.
- Scientific conclusions and business or policy decisions should not be based only on whether a p-value passes a specific threshold.
- Proper inference requires full reporting and transparency.
- A p-value, or statistical significance, does not measure the size of an effect or the importance of a result.
- By itself, a p-value does not provide a good measure of evidence regarding a model or hypothesis.
controversy over the use of p-values
the 2016 ASA statement was unusual in that “the ASA has not previously taken positions on specific matters of statistical practice”; it was prompted by widespread misuse and misinterpretation of p-values (such as treating p < 0.05 as proof of an effect) and by concerns about reproducibility
summarize the proper use of p-values within a broader summary discussion of other methods for model and variable selection
“First, you can augment your p-value with information about how confident you are in it, how likely it is that you will get a similar p-value in a replicate study, or the probability that a statistically significant finding is in fact a false positive. Second, you can enhance the information provided by frequentist statistics with a focus on effect sizes and a quantified confidence that those effect sizes are accurate. Third, you can augment or substitute p-values with the Bayes factor to inform on the relative levels of evidence for the null and alternative hypotheses; this approach is particularly appropriate for studies where you wish to keep collecting data until clear evidence for or against your hypothesis has accrued. Finally, specifically where you are using multiple variables to predict an outcome through model building, Akaike information criteria can take the place of the p-value, providing quantified information on what model is best.” (Halsey, 2019)
Similarities between Bayesian and frequentist methods
Both are statistical methods used to estimate parameters and can be used for analysis or prediction. Both rely on the likelihood (the probability of the observed data given the parameters, computed as a product over independent observations), but use it in very different ways
frequentist statistics
a type of statistical inference that draws conclusions from sample data by emphasizing the frequency or proportion of the data
bayesian statistics
an approach to data analysis and parameter estimation based on Bayes’ theorem. Unique to Bayesian statistics is that all observed and unobserved parameters in a statistical model are given a joint probability distribution, termed the prior and data distributions; Bayesian methods have actually been around longer than frequentist statistics, but have only somewhat recently gained popularity
Differences between bayesian and frequentist
bayesian: combines prior information with new data to produce a “posterior” probability distribution, which frequentist methods do not do; in bayesian methods, the posterior distribution directly provides the probability of the outcomes, which is more intuitive and is not done in frequentist methods, where one instead estimates a parameter with a desired confidence level (usually 95%); frequentist statistics accepts or rejects the null hypothesis, while Bayesian statistics estimates the ratio of probabilities of two different hypotheses; “Frequentist statistics never uses or calculates the probability of the hypothesis, while Bayesian uses probabilities of data and probabilities of both hypothesis. Frequentist methods do not demand construction of a prior and depend on the probabilities of observed and unobserved data. On the other hand, Bayesian methods depend on a prior and on the probability of the observed data”; in frequentist statistics, p-values test the probability of obtaining a result at least as extreme as the one actually observed under the assumption of the null hypothesis; bayesian methods are generally more computationally demanding (often requiring Markov chain Monte Carlo methods) than frequentist models
Bayes’ theorem
describes the probability of an event occurrence based on previous knowledge of the conditions associated with this event
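A minimal worked example of Bayes’ theorem in Python (the screening-test numbers are purely illustrative, chosen for the arithmetic):

```python
def posterior(prior, likelihood, marginal):
    """Bayes' theorem: P(H|D) = P(D|H) * P(H) / P(D)."""
    return likelihood * prior / marginal

# Hypothetical screening example (illustrative numbers only):
# P(disease) = 0.01, P(positive | disease) = 0.95,
# P(positive | no disease) = 0.05.
p_disease = 0.01
p_pos_given_disease = 0.95
p_pos_given_healthy = 0.05

# Marginal probability of a positive test, by total probability.
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

p_disease_given_pos = posterior(p_disease, p_pos_given_disease, p_pos)
print(f"P(disease | positive test) = {p_disease_given_pos:.3f}")
```

Even with a fairly accurate test, the posterior probability stays low because the prior (disease prevalence) is low, which is exactly the kind of prior-driven updating that distinguishes Bayesian from frequentist reasoning.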
For what purposes are Bayesian vs frequentist well suited?
Bayesian methods are well suited to more complex studies/models or to small datasets (priors help fill in the gaps); if you have no meaningful priors and a straightforward design, frequentist methods are better
statistical randomness
contains no recognizable patterns or regularities; sequences such as the results of an ideal dice roll or the digits of π exhibit statistical randomness
randomness vs chaos
randomness is non-deterministic, with no underlying pattern; chaos is deterministic but so sensitive to initial conditions that it appears random, and chaotic trajectories are typically bounded (confined to an attractor)
null model
attempt to establish a pattern that represents a baseline for a particular system. Another way of looking at a null model is that it represents a circumstance where all of the variables considered to be of interest have no effect; often difficult to establish a true null model (Harvey, 1983), since, after all, they don’t exist in nature. Because of this, they can be controversial: are they accurate, are they useful?
null model example - evolution
Hardy-Weinberg equilibrium (population without influence from evolutionary forces)
Hardy-Weinberg
in a large, randomly-mating population, allele frequencies will remain constant from generation to generation in the absence of forces like natural selection and genetic drift (Hardy-Weinberg equilibrium). If there are alleles with frequencies p and q, one generation of random mating will lead to genotype frequencies of p^2, 2pq, and q^2, and thus equilibrium for both the alleles and genotypes.
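The genotype frequencies described above can be sketched in a few lines of Python (p = 0.6 is an arbitrary illustrative allele frequency):

```python
def hw_genotype_freqs(p):
    """Expected genotype frequencies under Hardy-Weinberg equilibrium
    for a biallelic locus with allele frequencies p and q = 1 - p."""
    q = 1.0 - p
    return p**2, 2 * p * q, q**2   # (AA, Aa, aa); always sums to 1

aa, het, bb = hw_genotype_freqs(0.6)
print(aa, het, bb)   # approximately 0.36, 0.48, 0.16 for p = 0.6
```

Comparing observed genotype counts against these expectations (e.g., with a chi-square test) is how departures from the null model are detected.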
how null models are important for ecology and evolutionary biology
we must have a place to start from, a reference point, to compare to. Data is meaningless unless we know how it differs from a null model.
null model example - ecology
exponential growth model for population ecology: a constant per-capita growth rate, with no density dependence or other forces acting beyond baseline births, deaths, immigration, and emigration
analytical modeling
use mathematical equations to predict and describe simple, linear components of ecosystems, such as food chains; quantitative in nature, and used to answer a specific question or make a specific design decision; illustrate or understand patterns that exist or how something works, such as energy flow in food webs or how insects flap their wings; processes
examples of analytical modeling
an analytical model of insect wings flapping while they hover, which takes into account aerodynamics, acceleration, how the wing lifts, and the deformation of the wing shape that must happen in order for an insect to be able to hover (Kang, 2014); contrast with simulation models, which use computer algorithms rather than closed-form equations to predict ecosystem dynamics and are considered the most ecologically realistic and accurate
statistical modeling
using statistical methods to understand relationships between variables, make predictions; answer questions; interactions/relationships
benefits of mixed models
allow generalizability across different sites, colonies, etc. by explicitly modeling group-level variation; account for the non-independence of clustered or repeated measurements, which helps avoid pseudoreplication
Analysis of Variance (ANOVA)
used to compare differences among the means of two or more groups; developed by R.A. Fisher
assumptions of ANOVA
The populations from which samples are drawn should be normally distributed; homogeneity of variance: the variance among the groups should be approximately equal; the data are independent (the observations in each group are independent of the observations in every other group, and the observations within each group were obtained by a random sample)
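A one-way ANOVA itself is a one-liner in Python with scipy (assumed available; the three groups are made-up numbers in which group C clearly differs):

```python
from scipy import stats

# Illustrative data only: groups A and B are similar, C is shifted.
group_a = [5.1, 4.9, 5.3, 5.0, 5.2]
group_b = [5.0, 5.2, 4.8, 5.1, 4.9]
group_c = [7.9, 8.1, 8.0, 7.8, 8.2]

# One-way ANOVA: tests whether all group means are equal.
f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.1f}, p = {p_value:.2g}")
```

With between-group variance this large relative to within-group variance, the p-value is far below any conventional threshold.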
How would you determine if a data set meets ANOVA assumptions? - normality of data
Check the assumption visually using histograms (roughly bell-shaped) or Q-Q plots (points follow a straight line); formal statistical tests like Shapiro-Wilk (p < 0.05 suggests the data are not normal)
How would you determine if a data set meets ANOVA assumptions? - approximately equal variance
compare the spread of box plots; Bartlett’s test (or Levene’s test, which is less sensitive to departures from normality)
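Both assumption checks are available in scipy (assumed installed); here is a sketch on simulated normal data with equal variances, so the numbers are illustrative rather than from a real dataset:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(10.0, 2.0, 50)
group_b = rng.normal(12.0, 2.0, 50)

# Normality: Shapiro-Wilk per group (p < 0.05 suggests non-normality).
w, p_norm = stats.shapiro(group_a)

# Homogeneity of variance: Bartlett's test across groups
# (p < 0.05 suggests unequal variances).
b_stat, p_var = stats.bartlett(group_a, group_b)

print(f"Shapiro-Wilk p = {p_norm:.2f}, Bartlett p = {p_var:.2f}")
```

Each test returns a statistic and a p-value; in practice you would run the normality check on each group (or on the model residuals) rather than on a single group.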
variance
the spread of the data around the mean; formally, the average squared deviation from the mean
How would you determine if a data set meets ANOVA assumptions? - independent data
not really something you check after the fact, more something you plan for with experimental design
What do you do if ANOVA assumptions are violated - normality of data
ANOVA is fairly robust to mild violations of normality, but if the violation is severe, you could transform the response values of your data so that the distribution is more normal, or use a non-parametric statistical test, where normality is not assumed
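One common non-parametric alternative to one-way ANOVA is the rank-based Kruskal-Wallis test, available in scipy (assumed installed; the groups below are illustrative made-up numbers):

```python
from scipy import stats

# Illustrative groups; Kruskal-Wallis compares them on ranks,
# so it does not assume normally distributed data.
group_a = [5.1, 4.9, 5.3, 5.0, 5.2]
group_b = [5.0, 5.2, 4.8, 5.1, 4.9]
group_c = [7.9, 8.1, 8.0, 7.8, 8.2]

h_stat, p_value = stats.kruskal(group_a, group_b, group_c)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")
```

Because group C's values all rank above the others, the test detects a clear difference even without any distributional assumption about the raw values.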
What do you do if ANOVA assumptions are violated - equal variance
if severely violated, could use a non-parametric test
What do you do if ANOVA assumptions are violated - independence of data
re-run the experiment with a more careful design; alternatively, use a statistical model that accounts for the dependence (e.g., repeated-measures ANOVA or a mixed model)
Generalized Linear Models
flexible generalization of ordinary linear regression. The GLM generalizes linear regression by allowing the linear model to be related to the response variable via a link function and by allowing the magnitude of the variance of each measurement to be a function of its predicted value.
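The link-function idea can be sketched in a few lines of numpy (assumed available); the design matrix and coefficients below are hypothetical, not fit to real data:

```python
import numpy as np

# GLM skeleton: a linear predictor eta = X @ beta is mapped onto the
# response scale through an inverse link function. Here the logistic
# (logit) link is used, as in logistic regression for binary data.
X = np.array([[1.0, 0.5],
              [1.0, 1.5],
              [1.0, 3.0]])        # first column of 1s = intercept
beta = np.array([-2.0, 1.2])      # hypothetical coefficients

eta = X @ beta                     # linear predictor (any real number)
mu = 1.0 / (1.0 + np.exp(-eta))    # inverse logit: predicted probabilities

print(mu)
```

The inverse link guarantees the predictions stay in (0, 1) no matter what values the linear predictor takes, which is exactly what ordinary linear regression cannot do for binary or count responses.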
Structural equation modeling
a multivariate statistical analysis technique that is used to analyze structural relationships. This technique is the combination of factor analysis and multiple regression analysis, and it is used to analyze the structural relationship between measured variables and latent constructs.
path analysis
a precursor to and subset of structural equation modeling; a method to discern and assess the effects of a set of variables acting on a specified outcome via multiple causal pathways.
SEMs and causality
researchers do not derive causal relations from an SEM. Rather, the SEM represents and relies upon the causal assumptions of the researcher. These assumptions derive from the research design, prior studies, scientific knowledge, logical arguments, temporal priorities, and other evidence that the researcher can marshal in support of them. The credibility of the SEM depends on the credibility of the causal assumptions in each application.
SEMs vs GLMs
random effects are explicitly specified as latent variables (variables that cannot be measured directly but are inferred through a mathematical model from observable variables) in SEM, and relationships between observed/latent variables are explicitly specified as causal or non-causal
Endangered Species Act
1973; The purposes of the ESA are two-fold: to prevent extinction and to recover species to the point where the law’s protections are not needed. It therefore “protect[s] species and the ecosystems upon which they depend” through different mechanisms. For example, section 4 requires the agencies overseeing the Act to designate imperiled species as threatened or endangered. Section 9 prohibits unlawful “take” of such species, which means to “harass, harm, hunt…” Section 7 directs federal agencies to use their authorities to help conserve listed species. The Act also serves as the enacting legislation to carry out the provisions outlined in the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). The Supreme Court found that “the plain intent of Congress in enacting” the ESA “was to halt and reverse the trend toward species extinction, whatever the cost.” The Act is administered by two federal agencies, the United States Fish and Wildlife Service (FWS) and the National Marine Fisheries Service (NMFS); listing, preventing harm, protecting habitat
greatest threats to wildlife, habitats, and conservation of habitats?
land use change/habitat destruction/fragmentation; invasive species; obstacles = humans like to use land for whatever we want, and invasive species are difficult and expensive to manage
solutions to land use change/habitat destruction?
wildlife reserve; connectivity of habitats; mandating and maintaining connected green spaces; people get paid for their land, but it can be seized by the gov. if determined necessary for conservation efforts
solutions to invasive species
effective management efforts; alternative stable state?; community interactions and strong ecosystems; funding and resources
climate sensitive management approaches
regional modeling, vulnerability frameworks, expanding protected areas, protecting the matrix, regional coordination, and sharing of resources; managing invasive species; wildlife corridors
regional modeling (conservation)
modeling climate change as site and species specific as possible (Hannah, Midgley, and Millar, 2002)
vulnerability frameworks
Using geohistorical records, paleoecology, and DNA comparisons (ancient and modern) to understand climate change events in the past, and how and why some species survived and some didn’t. All of this plus current climate change models can help us determine which species are the most vulnerable, and the relative urgency and type of conservation effort needed (Dawson, 2011)
Expanding protected areas and matrix management
range shifts, organisms need somewhere to go
knowledge gaps in management approaches
whole community perspective - how do things interact? Which species are most important? If we save one, are we sacrificing another? (Sage grouse and Pinyon Jay); also, extreme weather events - we don’t know as much about how communities are affected by large, unpredictable weather events like wildfires
most critical information to have about a species and its relationship to other species in order to manage for biodiversity or conservation
life history strategies; population size, effective population size, and what a good population size is; genetic diversity; preferred/necessary habitat; geographic range; what is causing the decline; invasive species/competition we should be aware of?
What information is secondary or possibly non-essential for managing biodiversity or conservation
many behavioral things, like how individuals communicate with each other, predator evasion behaviors, etc.; non-essential genetic data, like mutations they have; some evolutionary data, like how or when certain traits evolved
biodiversity
variety and variability of life on Earth. Biodiversity is a measure of variation at the genetic, species, and ecosystem levels; we are currently experiencing an extreme loss of biodiversity, which could be considered the start of the sixth mass extinction and is almost entirely driven by anthropogenic changes to the Earth
evolutionary significant unit
a population of organisms that is considered distinct for purposes of conservation
SLOSS debate
Whether a single large or several small reserves is better for conserving biodiversity in a fragmented habitat. It was originally thought (based on the theory of island biogeography) that a single large reserve (SL) would be better for preserving biodiversity; however, more recently, and due to a lack of empirical evidence for SL, the several small (SS) side has become more popular.
How has the debate changed?
This debate should be based on beta diversity or species composition and its variation.
Meaning that the questions cannot be answered by comparing species richness on individual patches of different sizes but should instead be addressed by comparing total species richness among sets of patches with the same total area but different numbers and sizes of patches. Simberloff and Abele (1982) found no empirical evidence that SL is better than SS; Quinn and Harrison (1988) found that “[i]n all cases where a consistent effect of subdivision is observed, the more subdivided collection of islands or isolates contains more species”; a review by Fahrig (2020) showed, based on 30 years of data, that SS is superior to SL in terms of conserving biodiversity; connectedness of reserves matters; neither option fits every situation, so each case must be evaluated individually; metapopulation dynamics and stochastic events can make several small better; when loss is dispersed, a single large reserve may be better; oftentimes this argument is completely irrelevant, because with management and conservation you often get the land you get and you work with it.
species translocation
the capture, transport, and release or introduction of species, habitats, or other ecological material from one location to another, in this case generally for conservation purposes; species can be reintroduced to a region where they used to live, introduced to a new region where they have never lived, or added to an existing population that is small
pros of translocations
reintroduce to community it used to be a part of; add new genetic material to reverse inbreeding depression or save species from extinction (genetic rescue)
cons of translocation
did you fix the underlying problem?/threats still present; knock existing population off adaptive peak, or introduce the individuals to an area it is not well-adapted for; diseases; limited genetic diversity in translocated individuals
wolves of Yellowstone translocation example
wolves were killed off in Yellowstone through overhunting and predator-control efforts; in their absence, elk populations increased, having a large overgrazing effect on the plant community and leading to land erosion; coyotes also increased; in the mid-1990s, 30-50 wolves were captured in Canada and translocated to Yellowstone, and their population has grown from there; reintroduction of wolves triggered a trophic cascade, because wolves are a keystone predator in this community. Elk and coyote populations have decreased, plant communities have rebounded, and grizzly bears, beavers, birds, and more have increased. It has not quite been 30 years, so it is still early to know how this will all work out long-term, but the community appears more balanced now than it was, so this could be considered a success. The biggest reasons this worked are that wolves were historically found here, so the community did not take much adjusting for them to fit in, and the original cause of their decline (overhunting) was outlawed in Yellowstone, although they are still legal to hunt outside of the park.
Florida panther translocation example
Florida panther populations shrank due to habitat loss, degradation, and fragmentation; by the early 1990s, there were only 20-25 individuals left, with low genetic diversity and signs of inbreeding depression (deformities, low fecundity, and high rates of disease) (Johnson, 2010); researchers determined that without genetic rescue, there was a high chance of extinction; 8 pumas were translocated from Texas, and historically there had been gene flow between these populations; the panthers interbred, and the effective population size and heterozygosity increased; today, the population has grown to more than 200. This was considered a great success, as it prevented the impending extinction of the Florida panther. However, how successful it was long-term remains to be seen, as this is still a very small population. Some actions were taken to address the original problem of habitat loss, with some protected habitat, highway underpasses to reduce vehicle collisions, and other efforts, but ultimately their habitat has been so reduced and degraded from historical ranges that it is hard to be optimistic long-term.
Audouin’s gulls translocation example
An example of an unsuccessful translocation: chicks were translocated to an uninhabited site to attempt to reinforce a metapopulation, since these birds tend to return to their fledging sites to raise their own chicks. Many chicks did not survive; those that survived did return, but many did not breed, and the new site failed to establish. There are many possible reasons why this was unsuccessful, but it is one example of a failed translocation effort, which is important to be aware of since successful efforts get most of the attention (Oro, 2011).
translocations vs ecological invasions
Translocation is still humans introducing or reintroducing species, and the receiving community may have filled that niche or adapted to life without the species in the meantime, or may never have experienced the species at all; in this way, a translocated species can be received by the community much like an ecological invasion. However, a meta-analysis by Novak (2021) found that while translocations are very popular (“Translocations have been performed, are planned, or are part of continuing recovery actions for 70% (1,112 of 1,580) of listed threatened and endangered taxa”), only one instance of a conservation translocation causing a loss in biodiversity was identified. Translocations can also be for non-conservation purposes, as the same term is used for organisms moved for economic and cultural interests in the absence of conservation-based governance (such as fish stocking for sport and biological control programs for agricultural pests); in these cases, harmful effects were more common and are more comparable to ecological invasions.
Sixth mass extinction
There is definitely a loss of diversity that amounts to a crisis, and many researchers agree that we are in, or at least heading into, the sixth mass extinction event. It is hard to estimate species losses because we lack the data, especially for some groups, such as invertebrates (which make up 95-97% of animal species) and plants; using IUCN Red List extinction data to determine current extinction rates inevitably leads to dramatic under-estimation of rates, except for birds, mammals, and perhaps amphibians (Cowie, 2022)
How do current extinction rates compare to historical rates
E/MSY metric (number of extinctions per million species-years); the background rate is estimated at somewhere between 0.5 and 2 E/MSY, while current species loss rates are estimated at 150-260 E/MSY
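The E/MSY arithmetic can be made concrete with a small sketch. The function below is a generic illustration of the metric's definition, not from any source on this card, and the example counts (100 extinctions among 10,000 species over 100 years) are hypothetical numbers chosen for a round result:

```python
def e_msy(extinctions, n_species, years):
    """Extinctions per million species-years: extinctions observed,
    divided by the total species-years of observation, scaled to one
    million species-years."""
    return extinctions / (n_species * years) * 1_000_000

# Hypothetical: 100 documented extinctions among 10,000 monitored
# species over 100 years = 1,000,000 species-years of observation.
rate = e_msy(100, 10_000, 100)  # -> 100.0 E/MSY

# Fold increase over a background rate of ~1 E/MSY (within the
# 0.5-2 range given above):
background = 1.0
print(rate / background)  # -> 100.0, i.e. 100x the background rate
```

By this logic, the card's estimate of 150-260 E/MSY against a background of 0.5-2 E/MSY implies current rates roughly 75 to 500 times the background rate, depending on which ends of the ranges are compared.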
which species/traits are most threatened
island species (isolated, more vulnerable to changes in environment)
traits that predict survival of mass extinction
wide habitat breadth, large range, fast life-history traits, high dispersal ability
wide habitat breadth - survival of mass extinction vs regular times
extinction: species that can survive in a wide range of habitats are able to adapt more quickly and effectively to disturbances by using whatever resources remain available, even if their original or preferred habitats are no longer viable;
regular: not necessarily more beneficial - habitat generalists are common, but so are specialists. Specialization has evolved independently and repeatedly across the evolutionary tree, and it is a perfectly viable strategy in many environments
large range - survival of mass extinction vs regular times
extinction: complicated, but correlated with species abundance and dispersal, the latter of which is also tied to wide habitat breadth. Higher abundance and higher dispersal ability are both survival traits, and species with large ranges are less vulnerable to stochastic events;
regular: This survival trait does tend to hold true even during times of normal extinction levels, but becomes more pronounced during mass extinction events
fast life history traits - survival of mass extinction vs regular times
extinction: better able to balance high mortality in disturbance situations, and a higher number of generations in a shorter amount of time increases opportunities for adaptation via evolution;
regular: Not necessarily more beneficial - both fast and slow life-history strategies are common, and each have merits depending on the environment, so fast life-history traits are not guaranteed to be beneficial in a normal extinction level environment
high dispersal ability - survival of mass extinction vs regular times
extinction: mostly applicable to invertebrates and plants, as vertebrates are generally well-equipped to disperse; species able to move to a new area are better able to cope with the environmental disturbances often occurring in mass extinctions
regular: Dispersal ability is important even at normal extinction levels, but is much more important during mass extinction events, when widespread disturbances are more common, and environmental stability decreases
novel ecosystem
one that has been heavily influenced by humans but is not under human management; human-built, modified, or engineered niches of the Anthropocene, existing in places that have been altered in structure and function by human agency; characterized by new community compositions and possibly new ecosystem functions; the concept was originally developed to compare the ecosystem function of human-modified ecosystems to that of historical ecosystems
How does novel ecosystem theory potentially pose a challenge to traditional conservation and restoration? Discuss your position on the issue.
The novel ecosystem concept was developed in response to a growing concern that managing and restoring degraded ecosystems had become too costly and potentially not worth the money and time investments. Some ecosystems are truly novel and unable to return to a historical state; in those cases, we have no choice but to manage them in the healthiest possible state we can. However, the novel ecosystem label could be a cop-out when managers don’t want to do the work to restore an ecosystem, or when funders don’t want to provide the necessary resources. It is probably rare for an ecosystem to be truly unsalvageable, but not impossible.