Chapter 1 - Some General Theory of Ordinary Differential Equations Flashcards

1
Q

Existence and Uniqueness Theorem

Theorem

A

-consider the differential equation
y’’ + p(x)y’ + q(x)y = 0
with p and q continuous on some interval I given by a≤x≤b
-let α, β be any two real numbers and xo be any point in the interval I
-then the equation has a unique solution defined on the interval I which satisfies:
y(xo) = α
y’(xo) = β
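The guaranteed solution can be approximated numerically. A minimal Python sketch (not part of the original notes; the RK4 scheme and the name `solve_ivp_rk4` are illustrative choices):

```python
import math

def solve_ivp_rk4(p, q, x0, alpha, beta, x_end, n=1000):
    """Integrate y'' + p(x)y' + q(x)y = 0 with y(x0) = alpha, y'(x0) = beta,
    rewritten as the first-order system (y, v)' = (v, -p v - q y), via classical RK4."""
    h = (x_end - x0) / n
    x, y, v = x0, alpha, beta
    f = lambda x, y, v: (v, -p(x) * v - q(x) * y)
    for _ in range(n):
        k1y, k1v = f(x, y, v)
        k2y, k2v = f(x + h/2, y + h/2 * k1y, v + h/2 * k1v)
        k3y, k3v = f(x + h/2, y + h/2 * k2y, v + h/2 * k2v)
        k4y, k4v = f(x + h, y + h * k3y, v + h * k3v)
        y += h/6 * (k1y + 2*k2y + 2*k3y + k4y)
        v += h/6 * (k1v + 2*k2v + 2*k3v + k4v)
        x += h
    return y

# y'' + y = 0 with y(0) = 0, y'(0) = 1: the theorem says the solution is unique
# (it is sin x), and the integrator should reproduce it
approx = solve_ivp_rk4(lambda x: 0.0, lambda x: 1.0, 0.0, 0.0, 1.0, 1.0)
```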


2
Q

Existence and Uniqueness Theorem

General Solution

A

y’’ + p(x)y’ + q(x)y = 0
-since the equation is linear and homogeneous, its solution set forms a vector space
-since it is second order, the dimension of this space is 2
-any two linearly independent solutions y1(x), y2(x) can be used as a basis
-the general solution is given as the linear superposition:
y(x) = c1 y1(x) + c2 y2(x)
-the existence and uniqueness theorem says that c1 and c2 can be fixed by specifying initial conditions

3
Q

y’’ + p(x)y’ + q(x)y = 0

Boundary Conditions

A
  • it is also possible to fix c1 and c2 by specifying 2-point boundary conditions such as y(0) = y(1) = 0
  • but in this case we do not have such a powerful general theorem telling us that a (nontrivial) solution always exists
4
Q

y’’ + p(x)y’ + q(x)y = 0

Why only two initial conditions?

A
-the equation implies that:
y''(x) = -p(x)y'(x) - q(x)y(x) 
for any x ∈ I 
-in particular:
y''(xo) = -p(xo)y'(xo) - q(xo)y(xo)
-so y''(xo) cannot be independently specified
5
Q

The Wronskian

Derivation

A

given:
y(x) = c1 y1(x) + c2 y2(x)
and
y(xo) = α , y’(xo) = β
-we have a pair of linear equations for c1 and c2:
c1 y1(xo) + c2y2(xo) = α
c1y1’(xo) + c2y2’(xo) =β
-rewrite in matrix form:
AB = C
-where A is the 2x2 matrix with top row y1(xo), y2(xo) and bottom row y1’(xo), y2’(xo)
-B is the 2x1 column vector with entries c1, c2
-C is the 2x1 column vector with entries α, β
-to solve for c1 and c2 we multiply both sides by the inverse of A, so A must be invertible
-the Wronskian (evaluated at xo) is the determinant of A, so if the Wronskian is 0 we cannot find c1 and c2
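Solving the 2x2 system explicitly can be sketched in Python (the example equation y'' - y = 0 and the initial values are my own, not from the notes):

```python
import math

# Example: y'' - y = 0 has solutions y1 = e^x, y2 = e^(-x); impose y(0) = 2, y'(0) = 0
x0, alpha, beta = 0.0, 2.0, 0.0
y1, dy1 = math.exp(x0), math.exp(x0)      # y1(xo), y1'(xo)
y2, dy2 = math.exp(-x0), -math.exp(-x0)   # y2(xo), y2'(xo)

W = y1 * dy2 - dy1 * y2                   # det A = the Wronskian at xo; must be nonzero
c1 = (alpha * dy2 - beta * y2) / W        # Cramer's rule for A B = C
c2 = (y1 * beta - dy1 * alpha) / W
# here c1 = c2 = 1, giving y = e^x + e^(-x) = 2 cosh x, which indeed has y(0)=2, y'(0)=0
```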

6
Q

Wronskian of Two Functions

A

W(x) = W[y1,y2](x)

= y1(x)y2’(x) - y1’(x)y2(x)

7
Q

Abel’s Formula

Formula

A
-let y1(x) and y2(x) be solutions of:
y'' + p(x)y' + q(x)y = 0
Then:
W'(x) = -p(x) W(x)
and so:
W(x) = Wo*exp(-∫p(ξ)dξ)
-where the integral is taken from xo to x
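A quick numerical check of Abel's formula (illustrative Python; the example equation and constants are assumptions, not from the notes):

```python
import math

# Example equation: y'' - 2m y' + m^2 y = 0, so p(x) = -2m (constant)
# Known solutions: y1 = e^(mx), y2 = x e^(mx)
m, x0, x = 1.5, 0.0, 0.7

def W(t):
    """Wronskian y1 y2' - y1' y2 computed directly from the solutions."""
    y1, dy1 = math.exp(m*t), m * math.exp(m*t)
    y2, dy2 = t * math.exp(m*t), (1 + m*t) * math.exp(m*t)
    return y1 * dy2 - dy1 * y2

# Abel: W(x) = Wo * exp(-∫ p dξ) = W(x0) * e^(2m(x - x0))
abel = W(x0) * math.exp(2 * m * (x - x0))
```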
8
Q

Abel’s Formula

Proof

A

-start with the definition of the Wronskian :
W(x) = y1y2’ - y1’y2
-differentiate:
W’(x) = y1’y2’ + y1y2’’ - y1’y2’ - y1’’y2
W’(x) = y1y2’’ - y1’’y2
-from the original ODE: y’’ + p(x)y’ + q(x)y = 0
we can write
y2’’ = -py2’ - qy2
y1’’ = -py1’ - qy1
-sub in and cancel:
W’(x) = -p(x)W(x)
-therefore, defining Wo = W(xo), we have W’(x) = -p(x)W(x), solved by W(x) = Wo*exp(-∫p(ξ)dξ)
-where the integral is taken from xo to x

9
Q

Linear Dependence

Definition

A

-two functions y1(x) and y2(x) are linearly dependent if there exists γ≠0 such that y2 = γ*y1

10
Q

Wronskian of Linearly Dependent Functions

A
  • if two functions y1 and y2 are linearly dependent then y2 can be written as y2 = γ*y1 for some non-zero γ
  • in this case the Wronskian is 0
11
Q

Is this statement true?

‘ W[y1,y2] = 0 implies y1 and y2 are linearly dependent’

A
  • no, this is not always true, and this can be shown by counterexample
  • if y1=x^3 and y2=|x|^3 then y1 and y2 are linearly independent but W[y1,y2]=0
  • however this statement does hold true subject to the condition that y1 and y2 are solutions of a second order homogeneous linear differential equation
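The counterexample can be checked directly (Python sketch, names my own):

```python
# y1 = x^3 and y2 = |x|^3 are linearly independent on [-1, 1]
# (y2 = y1 for x > 0 but y2 = -y1 for x < 0, so no single γ works),
# yet their Wronskian is identically zero
def W(x):
    y1, dy1 = x**3, 3 * x**2
    y2, dy2 = abs(x)**3, 3 * x * abs(x)   # d/dx |x|^3 = 3 x |x|
    return y1 * dy2 - dy1 * y2

vals = [W(k / 10) for k in range(-10, 11)]
```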
12
Q

Linear Dependence and Wronskian Theorem

A

-let y1(x) and y2(x) be two non-zero solutions of:
y’’ + p(x)y’ + q(x)y = 0
-with p(x) and q(x) continuous on some interval I given by a≤x≤b
-then y1(x) and y2(x) are linearly dependent if and only if their Wronskian vanishes identically on I

13
Q

Euler’s Equation

A
-the general form of Euler's equation:
x²y'' + axy' + by = 0
-where a and b are constants
-in this case p(x) = ax/x² = a/x and q(x) = b/x²
-the point x=0 is badly defined so we seek solutions only for x>0
-calculating the Wronskian gives:
W(x) = Wo*x^(-a)
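A numerical spot-check of W(x) ∝ x^(-a) (Python; the specific a, b values and root labels are illustrative choices, not from the notes):

```python
# With a = -2, b = 2 the indicial equation r^2 + (a-1)r + b = r^2 - 3r + 2 = 0
# has roots r1 = 1, r2 = 2, and r1 + r2 = 1 - a; the solutions x^r1, x^r2
# should then have W(x) = (r2 - r1) * x^(r1 + r2 - 1) = Wo * x^(-a)
a = -2.0
r1, r2 = 1.0, 2.0

def W(x):
    y1, dy1 = x**r1, r1 * x**(r1 - 1)
    y2, dy2 = x**r2, r2 * x**(r2 - 1)
    return y1 * dy2 - dy1 * y2

ratio = W(3.0) / 3.0**(-a)   # should equal the constant r2 - r1 = 1
```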
14
Q

Singular Point

Definition

A

-points which cannot be included in the interval I are called singular points
-e.g. if p(x)=1/x² and q(x)=1/x
then x=0 would be a singular point as the functions are not continuous over an interval including x=0

15
Q

Wronskian of Three Functions

A
  • determinant of a 3x3 matrix
  • first row with entries: y1, y2, y3
  • second row with entries: y1’, y2’, y3’
  • third row with entries: y1’’, y2’’, y3’’
16
Q

Wronskian of Three Functions to Find the Homogeneous Linear Differential Equation Whose Solution Space is Spanned by Two Given Functions

A
-two functions y1(x) and y2(x) are given as linearly independent solutions of:
y'' + p(x)*y' + q(x)*y = 0
-let y(x) be a general solution of this equation, then:
y(x) = c1*y1(x) + c2*y2(x) 
-for some c1 and c2
-calculate the Wronskian of all three solutions; since y is a linear combination of y1 and y2:
W[y1,y2,y] = 0
-expanding this determinant along its last column and dividing by W[y1,y2] recovers:
y'' + p(x)y' + q(x)y = 0
-where:
p(x) = -W'(x)/W(x)
q(x) = W[y1',y2'] / W[y1,y2]
17
Q

y’’ + p(x)y’ + q(x)y = 0

Given One Solution, Find the Other

A
-given one solution y1(x), then we have a first order differential equation for y2(x):
y1*y2' - y1'*y2 = W(x)
-this equation is linear in y2 with integrating factor y1^(-2) leading to:
d/dx (y2/y1) = W(x) / y1²
-integrating and rearranging:
y2 = y1 * ∫ W(x)/y1² dx
y2 = y1 * ∫ [exp(-∫p(x)dx)/y1²] dx
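The reduction-of-order integral can be evaluated numerically (Python sketch using a trapezoid rule; the example y'' - y = 0 with y1 = e^x is my own choice):

```python
import math

# y'' - y = 0 with known solution y1 = e^x; here p(x) = 0, so W(x) = 1 (taking Wo = 1)
# and the formula gives y2(x) = e^x * ∫ e^(-2t) dt
def y2(x, n=10000):
    """Evaluate y1 * ∫_0^x W/y1^2 dt with a simple trapezoid rule."""
    h = x / n
    f = lambda t: math.exp(-2 * t)
    s = 0.5 * (f(0.0) + f(x)) + sum(f(k * h) for k in range(1, n))
    return math.exp(x) * h * s

# analytically: e^x * (1 - e^(-2x))/2 = (e^x - e^(-x))/2 = sinh x,
# which is indeed a second, independent solution of y'' - y = 0
```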
18
Q

Constant Coefficient Second Order ODE with Double Root

y’’ - 2my’ + m²y = 0

A
y'' - 2my' + m²y = 0
-using y=e^(λx) gives an auxiliary equation:
(λ-m)² = 0
-so:
y1 = e^(mx)
-calculate W using 
W'=-p(x)W , in this case 
p(x)=-2m :
W' = 2mW
=> W = Wo*e^(2mx) , take Wo = 1
-substitute into
d/dx (y2/y1) = W(x) / y1² :
d/dx (y2/y1) = e^(2mx) / e^(2mx) = 1
=> y2 = x*e^(mx)
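A finite-difference check that y2 = x e^(mx) really solves the ODE (illustrative Python; the values of m, x and the step size are arbitrary):

```python
import math

m, x, h = 0.8, 1.3, 1e-4
y = lambda t: t * math.exp(m * t)               # candidate second solution y2 = x e^(mx)
d1 = (y(x + h) - y(x - h)) / (2 * h)            # central difference for y'
d2 = (y(x + h) - 2 * y(x) + y(x - h)) / h**2    # central difference for y''
residual = d2 - 2 * m * d1 + m**2 * y(x)        # should be ~0 if y2 solves the ODE
```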
19
Q

Basis of Solutions

A

-to find the general solution of;
y’’ + p(x)y’ + q(x)y = 0
-we just need to find two linearly independent solutions
-i.e. solutions y1 and y2 such that W[y1,y2]≠0
-since all other solutions are then just a linear combination of these two, the pair of functions y1 and y2 form a basis in the linear algebra sense

20
Q

Standard Basis for The Solution Space of the Second Order Equation

A

y’’ + p(x)y’ + q(x)y = 0
-just like choosing a convenient basis e1 and e2 for 2-D space, we can choose:
->y1 to be the unique solution satisfying:
y1(xo) = 1
AND y1’(xo) = 0
->y2 to be the unique solution satisfying:
y2(xo) = 0
AND y2’(xo) = 1
-in this case W[y1,y2]=1 so y1 and y2 satisfy linear independence

21
Q

Standard Basis for the Solution Space of a Second Order Equation Proof

A
-for the standard basis we choose:
y1(xo) = 1,  y1'(xo) = 0
AND
y2(xo) = 0, y2'(xo) = 1
-in this case 
W[y1,y2]=1≠0 
so y1 and y2 satisfy linear independence
-these conditions also satisfy the existence and uniqueness theorem:
let y(x) = αy1(x) + βy2(x)
-calculate y(xo):
y(xo) = αy1(xo) + βy2(xo)
= α*1  + 0 = α
-calculate y'(xo):
y'(xo) = αy1'(xo) + βy2'(xo)
= 0 + β*1 = β
22
Q

Change of Basis

A
  • let y1 and y2 be a basis
  • we can form a new basis by non-singular linear transformation
  • i.e. take the 2x1 column vector with entries y1, y2 and multiply by a 2x2 matrix with entries a, b, c, d such that ad-bc≠0
23
Q

Change of Basis
Hyperbolic Function
y’’ - y = 0

A
y'' - y = 0
-this equation has solutions;
y1 = e^x
y2 = e^(-x)
-an alternative basis is:
^y1 = coshx
^y2 = sinhx
-these bases are related by the 2x2 matrix with entries
a=1/2 , b=1/2 , c=1/2 , d=-1/2
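The change of basis can be verified numerically (Python sketch):

```python
import math

# Apply the matrix with a = 1/2, b = 1/2, c = 1/2, d = -1/2 (det = -1/2 ≠ 0)
# to the column vector (e^x, e^(-x))
def new_basis(x):
    y1, y2 = math.exp(x), math.exp(-x)
    return (0.5 * y1 + 0.5 * y2,    # should be cosh x
            0.5 * y1 - 0.5 * y2)    # should be sinh x

c, s = new_basis(0.9)
```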
24
Q

Finding a Solution to Euler’s (Cauchy’s) Equation

Deriving the Indicial Equation

A
x²y'' + axy' + by = 0
-since:
y = x^r
=> x*y' = r*x^r
=> x²*y'' = r(r-1)*x^r
-we have:
x²y'' + axy' + by 
= [ r(r-1) + ar + b ] * x^r = 0
-if r is chosen to satisfy the indicial equation
F(r) = r² + (a-1)r + b = 0
-then y=x^r is a solution of Euler's Equation
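Solving the indicial equation and checking y = x^r numerically (Python; the a, b values are chosen for illustration and assumed to give real roots):

```python
import math

a, b = 4.0, 2.0                                # indicial eq: r^2 + 3r + 2 = 0, roots -1, -2
disc = (a - 1)**2 - 4 * b
r = (-(a - 1) + math.sqrt(disc)) / 2           # one root (assumes disc >= 0); here r = -1

x = 2.0                                        # check at a sample point with x > 0
y, dy, d2y = x**r, r * x**(r - 1), r * (r - 1) * x**(r - 2)
residual = x**2 * d2y + a * x * dy + b * y     # should vanish
```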
25
Q

Finding a Solution to Euler's (Cauchy's) Equation

2 Distinct Real Roots

A

-if the indicial equation has two distinct real roots r = r1, r2
-the general solution (for x>0) is:
y = c1*x^(r1) + c2*x^(r2)
-if r is complex or irrational then we define:
x^r = e^(r*logx)
26
Q

Finding a Solution to Euler's (Cauchy's) Equation

Double Root

A

-if the indicial equation has a double root, (r-r1)² = 0
-we only have one solution, y1 = x^(r1)
-the general solution is of the form:
y = (c1 + c2*logx)*x^(r1)
27
Finding a Solution to Euler's (Cauchy's) Equation | Complex Conjugate Roots
``` r = α ± iβ -writing: x^r = e^((α ± iβ)*logx) =e^(α*logx) [cos(βlogx)+isin(βlogx) ]] -we have: y1 = x^α cos(βlogx) and y2 = x^α sin(βlogx) -as independent solutions ```
28
Q

Reduction of Order Equation

A

d/dx (y2/y1) = W(x) / y1²
29
Q

Euler's Equation

Indicial Equation

A

x²y'' + axy' + by = 0
-using the substitution:
y = x^r * Σan*x^n
=> y' = Σ(n+r)*an*x^(n+r-1)
y'' = Σ(n+r)*(n+r-1)*an*x^(n+r-2)
-sub in to Euler's Equation
-choose n=0 to get the lowest possible power of x: x^r
-equate the coefficient of x^r to 0; since ao≠0, the indicial equation is:
F(r) = r² + (a-1)r + b = 0
30
Q

Defining a Function

A

-many functions arise in some specific context with no reference to differential equations, e.g. trigonometric functions in geometry
-however it is possible to define well-known functions as solutions of differential equations and then derive their familiar properties from the equations themselves
31
Q

Properties of Solutions of y' = y

A

-let E(x) be the solution of the initial value problem:
y' = y , y(0) = 1
-since the equation is autonomous, the function Er(x) = E(x+r) is also a solution
-since the equation is first order and linear, its solution space is one-dimensional, so there exists a constant a such that:
Er(x) = a*E(x)
-setting x=0: Er(0) = a*E(0), so a = E(r)
-we have shown that:
E(x+r) = E(x)*E(r)
-one of the fundamental properties of the exponential function
-we can also show that E(x) is never 0:
->suppose E(s) = 0 for some real s, then for arbitrary x:
E(x+s) = E(x)*E(s) = 0
-so E would vanish identically, contradicting E(0) = 1
-since E is continuous, never 0, and E(0) = 1 > 0, in fact E(x) > 0 for all x
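Since E(x) is the exponential, the derived functional equation can be spot-checked (Python, illustrative values):

```python
import math

x, r = 0.4, 1.1
lhs = math.exp(x + r)                 # E(x + r)
rhs = math.exp(x) * math.exp(r)       # E(x) * E(r)
# the two agree (up to rounding), matching the derived functional equation
```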
32
Q

Properties of the Solutions of y'' + y = 0

A

-if y(x) is any solution of y'' + y = 0
-then y' is also a solution, since differentiating the equation gives:
(y')'' + y' = 0
-take S(x) as the unique solution satisfying S(0) = 0 and S'(0) = 1
-define C(x) = S'(x)
-then C'(x) = S''(x) = -S(x)
-since W' = -p(x)*W and p(x) = 0, W is constant
-calculate the Wronskian:
W[S,C] = S*C' - S'*C = -(S² + C²)
-at x=0, W = -(0 + 1) = -1
-since W is constant:
-(S² + C²) = -1
S² + C² = 1
-we have derived a well-known property of sine and cosine just from a differential equation
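The constant-Wronskian identity can be spot-checked with S = sin, C = cos (Python, illustrative):

```python
import math

def W(x):
    """Wronskian W[S, C] = S C' - S' C with S = sin, C = cos."""
    S, dS = math.sin(x), math.cos(x)
    C, dC = math.cos(x), -math.sin(x)
    return S * dC - dS * C            # equals -(S^2 + C^2)

# W is the constant -1 at every point, so S^2 + C^2 = 1
```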
33
Q

Euler's Equation

Solutions for Different Values of r

A

-where y is the general solution (for x>0):
1) Two Distinct Real Roots:
y = c1*x^(r1) + c2*x^(r2)
2) Double Root, (r-r1)² = 0, y1 = x^(r1):
y = (c1 + c2*logx)*x^(r1)
3) Complex Conjugate Roots, r = α ± iβ:
x^r = e^((α±iβ)logx) = e^(αlogx) * [cos(βlogx) + isin(βlogx)]
=> y1 = x^α * cos(βlogx)
y2 = x^α * sin(βlogx)