3.3: Linear systems of ODEs


    First let us talk about matrix or vector valued functions. Such a function is just a matrix whose entries depend on some variable. If \(t\) is the independent variable, we write a vector valued function \( \vec {x} (t) \) as

    \[ \vec {x} (t) = \begin {bmatrix} x_1(t) \\ x_2 (t) \\ \vdots \\ x_n (t) \end {bmatrix} \nonumber \]

    Similarly a matrix valued function \( A(t) \) is

    \[ A (t) = \begin {bmatrix} a_{11} (t) & a_{12} (t) & \cdots & a_{1n} (t) \\ a_{21} (t) & a_ {22} (t) & \cdots & a_{2n} (t) \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1}(t) & a_{n2}(t) & \cdots & a_{nn}(t) \end {bmatrix} \nonumber \]

    We can talk about the derivative \(A'(t)\) or \( \frac {dA}{dt} \). This is just the matrix valued function whose \(ij^{th}\) entry is \(a'_{ij} (t) \).

    Rules of differentiation of matrix valued functions are similar to rules for normal functions. Let \(A(t)\) and \(B(t)\) be matrix valued functions. Let \(c\) be a scalar and let \(C\) be a constant matrix. Then

    \[\begin{align}\begin{aligned} {(A(t) + B(t))}' &= A' (t) + B' (t) \\ (A(t)B(t))' &= A'(t)B(t) + A(t)B'(t) \\ (cA(t))' &= cA' (t) \\ (CA(t))' &= CA'(t) \\ (A(t)C)' &= A' (t)C \end{aligned}\end{align} \nonumber \]

    Note the order of the multiplication in the last two expressions.
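These rules are easy to sanity check symbolically. Below is a small sympy sketch verifying the product rule \( (A(t)B(t))' = A'(t)B(t) + A(t)B'(t) \); the two matrices are arbitrary examples chosen here, not taken from anything above.

```python
# Symbolic check of the matrix product rule (A(t)B(t))' = A'(t)B(t) + A(t)B'(t).
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[t**2, sp.exp(t)], [1, sp.sin(t)]])   # arbitrary example matrices
B = sp.Matrix([[sp.cos(t), t], [t**3, 1]])

lhs = sp.diff(A * B, t)                       # entrywise derivative of the product
rhs = sp.diff(A, t) * B + A * sp.diff(B, t)   # product rule; note the order of factors

print(sp.simplify(lhs - rhs))                 # zero matrix, so the rule checks out
```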

    A first order linear system of ODEs is a system that can be written as the vector equation

\[ \vec {x}\,' (t) = P(t) \vec {x} (t) + \vec {f} (t) \nonumber \]

where \( P(t) \) is a matrix valued function, and \( \vec {x} (t) \) and \( \vec {f} (t) \) are vector valued functions. We will often suppress the dependence on \(t\) and only write \( \vec {x}\,' = P \vec {x} + \vec {f} \). A solution of the system is a vector valued function \( \vec {x} \) satisfying the vector equation.

    For example, the equations

    \[\begin{align}\begin{aligned} x'_1 &= 2tx_1 + e^tx_2 + t^2 \\ x'_2 &= \frac {x_1}{t} - x_2 + e^t \end{aligned}\end{align} \nonumber \]

    can be written as

\[ \vec {x}\,' = \begin {bmatrix} 2t & e^t \\ \frac {1}{t} & -1 \end {bmatrix} \vec {x} + \begin {bmatrix} t^2 \\ e^t \end {bmatrix} \nonumber \]
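For readers who want to experiment, here is a minimal numerical sketch of this example system using scipy's `solve_ivp`. The initial condition and the time interval are made up purely for illustration (they are not part of the example), and we start at \(t = 1\) to stay away from the \( \frac{1}{t} \) coefficient.

```python
# Minimal numerical sketch of x' = P(t) x + f(t) for the example above.
# Initial condition and interval are illustrative choices only.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, x):
    P = np.array([[2.0 * t, np.exp(t)],
                  [1.0 / t, -1.0]])       # the matrix P(t) from the example
    f = np.array([t**2, np.exp(t)])       # the forcing term f(t)
    return P @ x + f

sol = solve_ivp(rhs, t_span=(1.0, 2.0), y0=[1.0, 0.0], dense_output=True)
print(sol.y[:, -1])   # x1(2), x2(2) for this particular (made up) initial condition
```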

    We will mostly concentrate on equations that are not just linear, but are in fact constant coefficient equations. That is, the matrix \( P\) will be constant; it will not depend on \(t\).

When \( \vec {f} = \vec {0} \) (the zero vector), then we say the system is homogeneous. For homogeneous linear systems we have the principle of superposition, just like for single homogeneous equations.

    Theorem \(\PageIndex{1}\)

    Superposition

Let \( \vec {x}\,' = P \vec {x} \) be a linear homogeneous system of ODEs. Suppose that \( \vec {x}_1, \dots, \vec {x}_n \) are \(n\) solutions of the equation. Then

\[ \vec {x} = c_1 \vec {x}_1 + c_2 \vec {x}_2 + \dots + c_n \vec {x}_n \label{eq:12} \]

is also a solution. Furthermore, if this is a system of \(n\) equations (\(P\) is \(n \times n\)), and \( \vec {x}_1, \dots , \vec {x}_n \) are linearly independent, then every solution can be written as \(\eqref{eq:12}\).

    Linear independence for vector valued functions is the same idea as for normal functions. The vector valued functions \( \vec {x}_1, \vec {x}_2, \dots, \vec {x}_n \) are linearly independent when

\[ c_1 \vec {x}_1 + c_2 \vec {x}_2 + \dots + c_n \vec {x}_n = \vec {0} \nonumber \]

    has only the solution \( c_1 = c_2 = \dots = c_n = 0 \), where the equation must hold for all \(t\).

    Example 3.3.1

\( \vec {x}_1 = \begin {bmatrix} t^2 \\ t \end {bmatrix}, \vec {x}_2 = \begin {bmatrix} 0 \\ 1 + t \end {bmatrix}, \vec {x}_3 = \begin {bmatrix} -t^2 \\ 1 \end {bmatrix} \) are linearly dependent because \( \vec {x}_1 + \vec {x}_3 = \vec {x}_2 \), and this holds for all \(t\). So \( c_1 = 1, c_2 = -1 \) and \( c_3 = 1 \) above will work.

On the other hand, if we change the example just slightly to \( \vec {x}_1 = \begin {bmatrix} t^2 \\ t \end {bmatrix}, \vec {x}_2 = \begin {bmatrix} 0 \\ t \end {bmatrix}, \vec {x}_3 = \begin {bmatrix} -t^2 \\ 1 \end {bmatrix} \), then the functions are linearly independent. First write \( c_1 \vec {x}_1 + c_2 \vec {x}_2 + c_3 \vec {x}_3 = \vec {0} \) and note that it has to hold for all \(t\). We get that

\[ c_1 \vec {x}_1 + c_2 \vec {x}_2 + c_3 \vec {x}_3 = \begin {bmatrix} c_1t^2 - c_3t^2 \\ c_1t + c_2t + c_3 \end {bmatrix} = \begin {bmatrix} 0 \\ 0 \end {bmatrix} \nonumber \]

In other words, \( c_1t^2 - c_3t^2 = 0 \) and \( c_1t + c_2t + c_3 = 0 \) for all \(t\). If we set \(t = 0\), then the second equation becomes \(c_3 = 0\). But then the first equation becomes \(c_1t^2 = 0\) for all \(t\), and so \(c_1 = 0\). Thus the second equation is just \(c_2t = 0\), which means \(c_2 = 0\). So \(c_1 = c_2 = c_3 = 0\) is the only solution and \( \vec {x}_1, \vec {x}_2 \) and \( \vec {x}_3 \) are linearly independent.
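The same argument can be automated. The sympy sketch below collects the coefficients of each entry as a polynomial in \(t\) and checks that the only way to make them all vanish is \(c_1 = c_2 = c_3 = 0\); it is only a verification of the computation above.

```python
# Symbolic linear independence check for the second set of vector valued functions.
import sympy as sp

t, c1, c2, c3 = sp.symbols('t c1 c2 c3')
x1 = sp.Matrix([t**2, t])
x2 = sp.Matrix([0, t])
x3 = sp.Matrix([-t**2, 1])

combo = c1*x1 + c2*x2 + c3*x3

# A polynomial in t vanishes for all t exactly when every coefficient vanishes.
eqs = []
for entry in combo:
    eqs.extend(sp.Poly(entry, t).coeffs())

print(sp.solve(eqs, [c1, c2, c3], dict=True))   # [{c1: 0, c2: 0, c3: 0}] -- only the trivial solution
```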

The linear combination \( c_1 \vec {x}_1 + c_2 \vec {x}_2 + \dots + c_n \vec {x}_n \) can always be written as

    \[ X (t) \vec {c} \nonumber \]

    where \( X (t) \) is the matrix with columns \(\vec {x}_1, \dots , \vec {x}_n \), and \( \vec {c} \) is the column vector with entries \( c_1, \dots , c_n \). The matrix valued function \( X (t) \) is called the fundamental matrix, or the fundamental matrix solution.
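As a quick illustration of this bookkeeping, the sketch below stacks three vector valued functions as columns of \( X(t) \) and checks that \( X(t)\vec{c} \) is the same as the linear combination. It reuses the vectors from Example 3.3.1 purely for concreteness; they are not solutions of any particular system here.

```python
# X(t) c versus the linear combination c1*x1 + c2*x2 + c3*x3.
import sympy as sp

t, c1, c2, c3 = sp.symbols('t c1 c2 c3')
x1 = sp.Matrix([t**2, t])
x2 = sp.Matrix([0, t])
x3 = sp.Matrix([-t**2, 1])

X = sp.Matrix.hstack(x1, x2, x3)     # columns are the vector valued functions
c = sp.Matrix([c1, c2, c3])

print(sp.simplify(X*c - (c1*x1 + c2*x2 + c3*x3)))   # zero vector: same thing
```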

To solve nonhomogeneous first order linear systems, we use the same technique as we applied to solve single linear nonhomogeneous equations.

    Theorem \(\PageIndex{2}\)

    Let \( \vec {x}' = P \vec {x} + \vec {f} \) be a linear system of ODEs. Suppose \( \vec {x}_p\) is one particular solution. Then every solution can be written as

    \[ \vec {x} = \vec {x}_c + \vec {x}_p \nonumber \]

where \( \vec {x}_c \) is a solution to the associated homogeneous equation \( (\vec {x}\,' = P \vec {x}) \).

So the procedure will be the same as for single equations. We find a particular solution to the nonhomogeneous equation, then we find the general solution to the associated homogeneous equation, and finally we add the two together.

Alright, suppose you have found the general solution of \( \vec {x}\,' = P \vec {x} + \vec {f} \). Now you are given an initial condition of the form \[ \vec {x} (t_0) = \vec {b} \nonumber \] for some constant vector \( \vec {b} \). Suppose that \( X (t) \) is the fundamental matrix solution of the associated homogeneous equation (i.e. columns of \( X (t) \) are solutions). The general solution can be written as

    \[ \vec {x} (t) = X (t) \vec {c} + \vec {x}_p (t) \nonumber \]

    We are seeking a vector \(\vec {c} \) such that

    \[ \vec {b} = \vec {x} (t_0) = X (t_0) \vec {c} + \vec {x}_p (t_0) \nonumber \]

In other words, we are solving for \( \vec {c} \) the nonhomogeneous system of linear equations

    \[ X(t_0) \vec {c} = \vec {b} - \vec {x}_p (t_0) \nonumber \]
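Numerically, this last step is just a linear solve. Here is a minimal numpy sketch; the numbers happen to be those of Example 3.3.2 below (where \( \vec{x}_p = \vec{0} \)), but any \( X(t_0) \), \( \vec{b} \) and \( \vec{x}_p(t_0) \) would slot in the same way.

```python
# Solve X(t0) c = b - x_p(t0) for c.
import numpy as np

X0  = np.array([[1.0, 0.0],
                [0.5, 1.0]])      # X(t0)
b   = np.array([1.0, 2.0])        # initial condition vector b
xp0 = np.array([0.0, 0.0])        # particular solution at t0 (zero for a homogeneous system)

c = np.linalg.solve(X0, b - xp0)
print(c)                          # [1.  1.5]
```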

    Example 3.3.2

    In Section 3.1 we solved the system

    \[\begin{align}\begin{aligned} x_1' &= x_1 \\ x'_2 &= x_1 - x_2 \end{aligned}\end{align} \nonumber \]

    with initial conditions \( x_1(0) = 1, x_2 (0) = 2\).

    Solution

This is a homogeneous system, so \( \vec {f} (t) = \vec {0} \). We write the system and the initial conditions as

    \[ \vec {x} ' = \begin {bmatrix} 1 & 0 \\ 1 & -1 \end {bmatrix} \vec{x}, \quad \vec{x}(0) = \begin{bmatrix} 1 \\ 2 \end{bmatrix} \nonumber \]

We found the general solution to be \( x_1 = C_1 e^t \) and \( x_2 = \frac{C_1}{2} e^t + C_2 e^{-t} \). Letting \( C_1 = 1 \) and \( C_2 = 0 \), we obtain the solution \( \begin{bmatrix} e^t \\ \frac{1}{2}e^t \end{bmatrix} \). Letting \( C_1 = 0 \) and \( C_2 = 1 \), we obtain \( \begin{bmatrix} 0 \\ e^{-t} \end{bmatrix} \). These two solutions are linearly independent, as can be seen by setting \( t = 0 \) and noting that the resulting constant vectors are linearly independent. In matrix notation, the fundamental matrix solution is, therefore,

    \[ X(t) = \begin{bmatrix} e^t & 0 \\ \frac{1}{2}e^t & e^{-t} \end{bmatrix} \nonumber \]

Hence, to solve the initial value problem we solve the equation

\[ X(0)\vec{c} = \vec{b} \nonumber \]

    or in other words,

    \[ \begin{bmatrix} 1 & 0 \\ \frac{1}{2} & 1 \end{bmatrix} \vec{c} =\begin{bmatrix} 1 \\ 2 \end{bmatrix} \nonumber \]

Solving for \( \vec{c} \) gives \( \vec{c} = \begin{bmatrix} 1 \\ \frac{3}{2} \end{bmatrix} \), so our solution is

\[ \vec{x}(t) = X(t) \vec{c} = \begin{bmatrix} e^t & 0 \\ \frac{1}{2}e^{t} & e^{-t} \end{bmatrix} \begin{bmatrix} 1 \\ \frac{3}{2} \end{bmatrix} = \begin{bmatrix} e^t \\ \frac{1}{2} e^t + \frac{3}{2} e^{-t} \end{bmatrix} \nonumber \]

    This agrees with our previous solution from Section 3.1.
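As a final check, the short sympy sketch below verifies symbolically that this \( \vec{x}(t) \) satisfies both the system and the initial condition.

```python
# Verify that x(t) solves x' = [[1,0],[1,-1]] x with x(0) = (1, 2).
import sympy as sp

t = sp.symbols('t')
P = sp.Matrix([[1, 0], [1, -1]])
x = sp.Matrix([sp.exp(t),
               sp.Rational(1, 2)*sp.exp(t) + sp.Rational(3, 2)*sp.exp(-t)])

print(sp.simplify(sp.diff(x, t) - P*x))   # zero vector, so the ODE is satisfied
print(x.subs(t, 0))                       # Matrix([[1], [2]]), the initial condition
```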
