EGD - Exam 2 Flashcards

1
Q

What is the difference in dynamic optimization between optimizing with respect to continuous vs discrete time?

A

A variable measured in continuous time can be differentiated with respect to time. A variable measured in discrete time cannot.
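The contrast can be written out explicitly; a minimal sketch of the two notions of change over time (notation assumed, not from the card itself):

```latex
% Continuous time: the change in k is a time derivative
\dot{k}_t \equiv \frac{dk_t}{dt}

% Discrete time: only first differences are available
\Delta k_t \equiv k_{t+1} - k_t
```

In continuous time the constraint on the state is a differential equation in \dot{k}_t; in discrete time it is a difference equation in \Delta k_t.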

2
Q

What is optimal control?

A

It is an approach to solving dynamic optimization problems in which one of the constraints is a differential equation.

3
Q

Describe a typical optimal control problem in economic growth theory.

A

▶ An economic agent chooses (controls) a sequence of values over time for certain variables: the control variables.

▶ He/she maximizes an objective function (utility/profits) subject to some constraints.

▶ The constraints are dynamic: they describe how the state of the economy evolves over time, captured by the state variables.
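As a concrete illustration (an assumed example, not stated on the card), the Ramsey–Cass–Koopmans consumption problem fits this template:

```latex
\max_{\{c_t\}} \int_0^{\infty} e^{-\rho t}\, u(c_t)\, dt
\quad \text{s.t.} \quad
\dot{k}_t = f(k_t) - c_t - (n+\delta)k_t, \qquad k_0 \text{ given}
```

Here consumption c_t is the control variable, capital per worker k_t is the state variable, and the law of motion of capital is the dynamic constraint.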

4
Q

Define the general problem of dynamic optimization and optimal control.

A

The maximization of the objective V(0), the integral from initial time 0 to final time T of the instantaneous objective function u, which depends on a control variable c_t, a state variable k_t, and time t. The maximization is subject to:
a) the law of motion of the state variable, k(dot)_t, which is a function of k_t, c_t, and t;
b) the initial condition: k_0 is given;
c) a terminal condition on the final value of the state, k_T (typically k_T ≥ 0).
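Written out compactly (the function g and the terminal condition k_T ≥ 0 are a standard completion, assumed here rather than taken from the card):

```latex
\max_{\{c_t\}} V(0) = \int_0^{T} u(k_t, c_t, t)\, dt
\quad \text{s.t.} \quad
\dot{k}_t = g(k_t, c_t, t), \qquad k_0 \text{ given}, \qquad k_T \ge 0
```

The control path {c_t} is chosen freely at each instant; the state path {k_t} then follows from the law of motion and the initial condition.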
