EXAM 1 Flashcards

1
Q

Econometrics

A

The science of using economic theory and statistical techniques to analyze economic data

2
Q

Causality

A

An action is said to cause an outcome if the outcome is the direct
result, or consequence, of that action

3
Q

Controlled Experiment
- control group
- treatment group
- random assignment

A

In a controlled experiment, the control group doesn’t receive the treatment, while the treatment group does. Assignment to each group is random.

4
Q

Causal Effect

A

Effect on an outcome of a given action/treatment

5
Q

Experimental Data

A

data obtained from an experiment designed to evaluate a treatment

6
Q

Observational Data

A

Observes actual behavior outside an experimental setting
- treatments are not assigned

7
Q

Cross sectional data

A

data on different entities for a single period of time

8
Q

Time series data

A

data for a single entity at multiple time periods

9
Q

Panel data

A

data for multiple entities, in which each entity is observed at two or more time periods

10
Q

Probability Theory

A

basic language of uncertainty + forms the basis for statistical inference

11
Q

Outcomes

A

mutually exclusive potential results of a random process

12
Q

Sample Space

A

set of all possible outcomes

13
Q

Event

A

Subset of the sample space; that is, a set of one or more outcomes

14
Q

Probability of Outcome

A

the proportion of the time that the outcome occurs in the long run

15
Q

Probability of Event

A

the sum of the probabilities of the outcomes in the event

16
Q

Random Variable

A

numerical summary of a random outcome

17
Q

Probability Distribution

A

The probability distribution of a discrete random variable is the list of all possible values of the variable and the probability that each value will occur.

18
Q

Cumulative Probability Distribution

A

the probability that the random variable is less than
or equal to a particular value.

19
Q

Bernoulli Distribution

A

The distribution of a binary (0/1) random variable: Pr(Y = 1) = p and Pr(Y = 0) = 1 − p
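
A minimal simulation sketch of a Bernoulli(p) variable (p = 0.3 is an illustrative choice): the long-run frequency of 1s should be close to p.

```python
import random

def bernoulli_draws(p, n, seed=0):
    """Simulate n draws of a Bernoulli(p) variable: 1 with probability p, 0 otherwise."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n)]

draws = bernoulli_draws(p=0.3, n=100_000)
freq = sum(draws) / len(draws)  # long-run frequency of 1s, close to p
```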

20
Q

Probability Density Function (continuous)

A

The area under the probability density function between any two points is the probability that the random variable falls between those two points

21
Q

CDF for continuous

A

the probability that the random variable is less than or equal
to a particular value

22
Q

Expected Value of Discrete RV

A
  • long-run average value of the random variable over many repeated trials
  • weighted average of the possible outcomes of that random variable, where the weights are the outcomes’ probabilities
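
The weighted-average formula can be checked with a short Python sketch (the distribution below is illustrative):

```python
# Illustrative probability distribution of a discrete random variable Y
dist = {0: 0.80, 1: 0.10, 2: 0.06, 3: 0.04}

# E[Y]: weighted average of the possible outcomes,
# where the weights are the outcomes' probabilities
expected_value = sum(y * p for y, p in dist.items())
```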
23
Q

Expected Value of Continuous RV

A

Defined as for a discrete random variable, except that a continuous random variable has uncountably many possible values, so the probability-weighted sum is replaced by an integral over the pdf

24
Q

mean+SD

A

measure the center of the
distribution and its spread

25
Q

Skewness

A

measures the lack of symmetry of a distribution
- symmetric: skewness = 0
- long right tail: + skewness
- long left tail: − skewness

26
Q

Kurtosis

A

how thick or heavy the tails of a distribution are
- the greater the kurtosis, the more likely the outliers
- a normally distributed RV has kurtosis 3

27
Q

Standard Deviation + Variance

A

measures of dispersion of distribution
- the variance is an expected value of the square of the deviation of Y from its mean

28
Q

Moments of Distribution

A
  1. Mean of Y; E[Y] is the first moment
  2. E[Y^2] is the second moment
  3. E[Y^r] is the rth moment
    - variance is a function of the 1st and 2nd moments
    - skewness is a function of the 1st through 3rd moments
    - kurtosis is a function of the 1st through 4th moments
29
Q

Joint Probability Distribution

A

probability that the random variables simultaneously take on certain x and y values

30
Q

Marginal Probability Distribution of Y

A

The probability that Y takes on a specific value, obtained by adding up the joint probabilities over all values of X

31
Q

Conditional Distribution

A

The distribution of Y conditional on X taking a specific value
P(Y|X) = P(X, Y)/P(X)
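
A minimal Python sketch of this formula (the joint probabilities below are made-up illustrative numbers):

```python
# Illustrative joint distribution P(X, Y) for two binary random variables
joint = {(0, 0): 0.15, (0, 1): 0.15, (1, 0): 0.07, (1, 1): 0.63}

# Marginal probability P(X = 1): sum the joint probabilities over y
p_x1 = sum(p for (x, y), p in joint.items() if x == 1)

# Conditional probability P(Y = 1 | X = 1) = P(X = 1, Y = 1) / P(X = 1)
p_y1_given_x1 = joint[(1, 1)] / p_x1
```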

32
Q

Law of Iterated Expectation

A

E[Y] is a weighted average of the conditional expectation E[Y|X = x], weighted by the probability distribution of X

  • the expected value of Y is equal to the expectation of the conditional expectation of Y given X:
    E[Y] = E[E[Y|X]]
33
Q

LIE

A
  • the inner expectation is computed using the conditional distribution of Y given X, and the outer expectation is computed using the marginal distribution of X
  • implies that if the conditional mean of Y given X is zero, then the mean of Y is zero
  • applies to expectations that are conditioned on multiple random variables
34
Q

Independence

A

2 RVs are independent if knowing the value of one variable provides no information about the other

35
Q

Covariance

A

The extent to which two variables move together
- if X & Y are independent, the covariance is zero

36
Q

Correlation

A

Measures the strength of the linear association between two variables: corr(X, Y) = cov(X, Y)/(σX σY)
- X & Y are uncorrelated if corr(X, Y) = 0
- −1 ≤ corr(X, Y) ≤ 1
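
These definitions can be sketched in Python using only the standard library (the data values are illustrative):

```python
import math

# Illustrative paired observations on X and Y
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(x)
mx, my = sum(x) / n, sum(y) / n

# Covariance: the extent to which deviations from the means move together
cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n

# Correlation: covariance rescaled by the standard deviations, so it lies in [-1, 1]
sx = math.sqrt(sum((a - mx) ** 2 for a in x) / n)
sy = math.sqrt(sum((b - my) ** 2 for b in y) / n)
corr = cov / (sx * sy)
```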

37
Q

Fun fact

A

If the conditional mean of Y does not depend on X , then Y and X are
uncorrelated

38
Q

Standard normal distribution

A

the normal distribution with mean 0 and variance 1, N(0, 1); a convenient representation because any normally distributed RV can be standardized into it

39
Q

Multivariate Normal Distribution

FOUR IMPORTANT PROPERTIES

A

Represents the joint distribution of two (bivariate normal) or more normally distributed random variables

40
Q

MULTIVARIATE NORMAL DISTRIBUTION- PROPERTY 1

A

If X and Y have a bivariate normal distribution with covariance σXY, then for constants a and b, aX + bY has the normal distribution with mean aμX + bμY and variance a^2 σ^2_X + b^2 σ^2_Y + 2abσXY.

41
Q

MULTIVARIATE NORMAL DISTRIBUTION- PROPERTY 2

A

If a set of variables has a multivariate normal distribution, then the marginal
distribution of each of the variables is normal.

42
Q

MULTIVARIATE NORMAL DISTRIBUTION- PROPERTY 3

A

If variables with a multivariate normal distribution have covariances that equal zero, then the random variables are independent.
- if X and Y have a bivariate normal distribution with σXY = 0, then X and Y are independent.

  • for jointly normal variables, zero correlation implies independence (this is not true in general)
43
Q

MULTIVARIATE NORMAL DISTRIBUTION- PROPERTY 4

A

If X and Y have a bivariate normal distribution, then the conditional
expectation of Y given X is linear in X that is E [Y |X = x] = a + bx, where a and b are constant.

Joint normality implies linearity of conditional
expectations, but linearity of conditional expectation does not imply joint
normality

44
Q

Chi Squared Distribution

A

the distribution of the sum of M squared independent standard normal RVs
- e.g., let Z1, Z2, and Z3 be independent standard normal random variables. Then Z1^2 + Z2^2 + Z3^2 has a chi-squared distribution with 3 degrees of freedom
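
A quick simulation sketch of this definition (note it is the *squares* of the normals that are summed); a known property is that the mean of a chi-squared(M) variable is M:

```python
import random

rng = random.Random(1)
M = 3  # degrees of freedom

# Each draw is the sum of M *squared* independent standard normals
draws = [sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(M)) for _ in range(50_000)]

avg = sum(draws) / len(draws)  # the mean of a chi-squared(M) variable is M
```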

45
Q

Student T Distribution

A

the distribution of the ratio of a standard normal random variable to the square root of an independently distributed chi-squared random variable with M degrees of freedom divided by M

46
Q

F Distribution

A

with M and N degrees of freedom, denoted by FM,N is
defined to be the distribution of the ratio of a chi-squared random variable
with M degrees of freedom, divided by M, to an independent chi-squared
distribution with N degrees of freedom, divided by N

47
Q

Random Sampling

A

In random sampling, n objects are selected at random from a population.

The observations are denoted Y1, …, Yn, where Y1 is the first observation, Y2 is the second observation, and so forth. Each of these Yi’s is a random variable.

48
Q

Identically distributed

A

each Yi has the same marginal distribution

49
Q

Sample Average

A

Ȳ = (1/n)(Y1 + … + Yn)

50
Q

E[Ȳ] = μY
Var[Ȳ] = σ^2_Ȳ = σ^2_Y / n

A
51
Q

When the distribution of Y is not normal

A

the exact distribution of the sample mean is typically
complicated and depends on the distribution of Y

52
Q

Large Sample Approximation

A

The large-sample approach uses approximations to the sampling distribution that rely on the sample size n being large (a common rule of thumb is n > 30)

53
Q

Asymptotic Distribution

A

the distribution that the approximation approaches as n → ∞; the approximation becomes exact in the limit

54
Q

Law of Large Numbers

A

when the sample size is large, the sample average Ȳ will be very close to the mean μY with very high probability
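
A rough simulation sketch, using Uniform(0, 1) draws as an assumed example distribution (mean 0.5): the sample average gets closer to the mean as n grows.

```python
import random

rng = random.Random(42)
mu = 0.5  # population mean of a Uniform(0, 1) draw

# The sample average concentrates around mu as n grows
averages = {}
for n in (10, 1_000, 100_000):
    averages[n] = sum(rng.random() for _ in range(n)) / n
```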

55
Q

Central Limit Theorem

A

when the sample size is large, the sampling distribution of the standardized sample average is approximately standard normal
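
A simulation sketch, assuming Uniform(0, 1) draws (mean 0.5, variance 1/12): if the CLT approximation holds, about 95% of the standardized sample averages should fall within ±1.96.

```python
import math
import random

rng = random.Random(7)
n, reps = 50, 20_000
mu, var = 0.5, 1 / 12  # mean and variance of a Uniform(0, 1) draw

# Standardized sample averages: (Ybar - mu) / sqrt(var / n)
z = []
for _ in range(reps):
    ybar = sum(rng.random() for _ in range(n)) / n
    z.append((ybar - mu) / math.sqrt(var / n))

# If approximately standard normal, about 95% should fall within +/- 1.96
share_within = sum(abs(v) <= 1.96 for v in z) / reps
```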

56
Q

Asymptotic Theory

A

While exact sampling distributions are complicated and depend on the
distribution of Y , asymptotic distributions are simple.

57
Q

Convergence in Probability

A

Ȳ converges in probability to μY (equivalently, Ȳ is consistent for μY) if the probability that the sample average Ȳ is in the range μY − c to μY + c becomes arbitrarily close to 1 as n increases, for any constant c > 0

58
Q

Statistics

A

science of using data to learn about the world around us

59
Q

Estimator

A

a function of a sample of data drawn from a population

  • is a RV because it’s a function of random sample observations
60
Q

Estimate

A

the numerical value of the estimator when it is actually computed using data from a specific sample

  • not random; it is a fixed number for the given sample
61
Q

ESTIMATION PROPERTY 1: Unbiasedness

A

We say ˆμY is an unbiased estimator of μY if E [ˆμY ] = μY .

  • bias of the estimator is E [ˆμY ] − μY
  • if we compute the value of the estimator for different samples, on average, we get the right number
62
Q

ESTIMATION PROPERTY 2: Consistency

A

Let ˆμY be an estimator of μY . We say ˆμY is a consistent estimator of μY
if ˆμY converges in probability to μY

  • when the sample size is large, the uncertainty about the value of μY arising from random variation in the sample is very small.

Convergence in Probability
A sequence of random variables {Xn} converges in probability to X if, for all ε > 0,
lim (n→∞) Pr(|Xn − X| > ε) = 0.

63
Q

ESTIMATION PROPERTY 3: Efficiency

A

Let ˆμY and ̃μY be unbiased estimators of μY .

We say that ˆμY is more efficient than ̃μY if V[ˆμY] < V[ ̃μY]. In other words, an estimator is more efficient than another if it has a tighter sampling distribution.
SBU Econometrics Fall 2021 8 / 43

64
Q

Properties of the average Y

A
  • The sample mean Ȳ is an unbiased and consistent estimator of μY
  • The sample mean Ȳ is the best linear unbiased estimator (BLUE), where “best” means most efficient
  • The sample mean Ȳ is also the least squares estimator of μY
65
Q

P Value

A

probability of drawing a statistic at least as unfavorable to the null hypothesis as the value actually computed with your data,
assuming that the null hypothesis is true. One often “rejects the null
hypothesis” when the p-value is less than the significance level α

66
Q

Significance level

A

The significance level of a test is a pre-specified probability of incorrectly rejecting the null, when the null is true.
- probability of type 1 error

67
Q

Critical Value

A

the value of the test statistic for which the test just rejects the null hypothesis at the chosen significance level

68
Q

Sample Variance

A

The sample standard deviation is the square root of the sample variance. The sample variance is an unbiased and consistent estimator of the population variance.

69
Q

Standard Error

A

an estimator of the standard deviation of the sample mean Ȳ, denoted SE(Ȳ)

70
Q

n known, variance unknown

A

p-value = 2Φ( −| (Ȳ − μY,0) / SE(Ȳ) | )
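
Assuming the intended formula is the usual two-sided p-value based on the t-statistic, a sketch with made-up numbers:

```python
import math

def phi(z):
    """Standard normal CDF, computed from the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Made-up sample statistics: sample mean, hypothesized mean, standard error
ybar, mu0, se = 22.64, 20.0, 1.28

t = (ybar - mu0) / se        # t-statistic
p_value = 2 * phi(-abs(t))   # two-sided p-value
```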

71
Q

Type 1 Error

A

Rejecting the null when it’s true

72
Q

Type 2 Error

A

Accepting the null when it’s false

73
Q

Rejection Region

A

the set of values of test statistics for which the null hypothesis is rejected

74
Q

Acceptance region

A

the set of values of test statistics for which null hypothesis is not rejected

75
Q

Size of Test

A

probability of type 1 error

76
Q

Power of test

A

probability of rejecting H0 when the alternative is true

77
Q

Confidence Interval

A

an interval that contains the true value of the mean in 95% of repeated samples (for a 95% confidence level)
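
A sketch of a 95% confidence interval for the mean, Ȳ ± 1.96·SE(Ȳ), with made-up sample statistics:

```python
import math

# Made-up sample statistics: mean, standard deviation, sample size
ybar, s, n = 22.64, 12.8, 100
se = s / math.sqrt(n)  # standard error of the sample mean

# 95% confidence interval: Ybar +/- 1.96 * SE(Ybar)
ci_low, ci_high = ybar - 1.96 * se, ybar + 1.96 * se
```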

78
Q

When to use t statistics

A

when the sample size is small and the population distribution is approximately normal

79
Q

The sample covariance is a consistent estimator of the population covariance.

The sample correlation lies between −1 and 1

A

The sample correlation coefficient measures the strength of the linear
association between X and Y in the sample of n observations.