# ECE 204: Numerical Methods

## Linear regression

Given a regression \(y=mx+b\) and a data set \((x_{i..n}, y_{i..n})\), the residual is the difference between the actual and regressed data:

\[E_i=y_i-b-mx_i\]

### Method of least squares

This method minimises the sum of the squares of the residuals.

\[\boxed{S_r=\sum^n_{i=1}E_i^2}\]

\(m\) and \(b\) can be found by taking the partial derivative and solving for them:

\[\frac{\partial S_r}{\partial m}=0, \frac{\partial S_r}{\partial b}=0\]

Solving returns the following, where \(\overline x\) and \(\overline y\) are the means of the actual \(x\)- and \(y\)-values:

\[ \boxed{m=\frac{n\sum^n_{i=1}x_iy_i-\sum^n_{i=1}x_i\sum^n_{i=1}y_i}{n\sum^n_{i=1}x_i^2-\left(\sum^n_{i=1}x_i\right)^2}} \\ b=\overline y-m\overline x \]

The total sum of squares around the mean is based on the actual data:

\[\boxed{S_t=\sum(y_i-\overline y)^2}\]

Error is measured with the coefficient of determination \(r^2\) — the closer the value is to 1, the lower the error.

\[ r^2=\frac{S_t-S_r}{S_t} \]
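
A minimal sketch of these formulas in Python (the function name and sample data are made up for illustration):

```python
def linear_regression(x, y):
    """Least-squares slope, intercept, and r^2 from the formulas above."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi ** 2 for xi in x)

    m = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    b = sum_y / n - m * (sum_x / n)

    y_mean = sum_y / n
    S_t = sum((yi - y_mean) ** 2 for yi in y)                   # total sum of squares
    S_r = sum((yi - b - m * xi) ** 2 for xi, yi in zip(x, y))   # sum of squared residuals
    return m, b, (S_t - S_r) / S_t

m, b, r2 = linear_regression([1, 2, 3, 4], [2.1, 4.2, 5.9, 8.1])
```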

If the intercept is at the origin (\(b=0\)), \(m\) reduces to a simpler form:

\[m=\frac{\sum^n_{i=1}x_iy_i}{\sum^n_{i=1}x_i^2}\]

## Non-linear regression

### Exponential regression

Solving for the same partial derivatives returns the same values, although bisection may be required for the exponent coefficient (\(e^{bx}\)). Instead, linearising may make things easier (by taking the natural logarithm of both sides). Afterward, solving as if it were in the form \(y=mx+b\) returns the correct coefficients.

!!! example \(y=ax^b\implies\ln y = \ln a + b\ln x\)
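
As a sketch, the same linearisation in Python (the function name is illustrative and all data are assumed positive); the fitted slope is \(b\) and the intercept recovers \(a\):

```python
import math

def power_fit(x, y):
    """Fit y = a*x^b by solving ln y = ln a + b ln x as a straight line."""
    X = [math.log(xi) for xi in x]
    Y = [math.log(yi) for yi in y]
    n = len(X)
    b = (n * sum(Xi * Yi for Xi, Yi in zip(X, Y)) - sum(X) * sum(Y)) \
        / (n * sum(Xi ** 2 for Xi in X) - sum(X) ** 2)
    c = sum(Y) / n - b * sum(X) / n   # intercept of the linearised fit, ln a
    return math.exp(c), b             # (a, b)
```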

### Polynomial regression

The residual is the offset at the end of the polynomial.

\[y=a+bx+cx^2+E\]

Taking the relevant partial derivatives returns a system of equations which can be solved in a matrix.
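
For example, a quadratic fit: setting the three partial derivatives of \(S_r\) to zero gives a 3×3 system of normal equations. A sketch assuming NumPy is available:

```python
import numpy as np

def quadratic_fit(x, y):
    """Solve the normal equations for y = a + b*x + c*x^2."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    A = np.array([
        [len(x),        x.sum(),       (x**2).sum()],
        [x.sum(),       (x**2).sum(),  (x**3).sum()],
        [(x**2).sum(),  (x**3).sum(),  (x**4).sum()],
    ])
    rhs = np.array([y.sum(), (x * y).sum(), (x**2 * y).sum()])
    return np.linalg.solve(A, rhs)   # [a, b, c]
```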

## Interpolation

Interpolation ensures that the resulting curve passes through every data point.

### Direct method

To interpolate \(n+1\) data points, you need a polynomial of a degree up to \(n\), and points that enclose the desired value. Substituting the \(x\) and \(y\) values forms a system of equations for a polynomial of degree one less than the number of points chosen.
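
A sketch of the direct method (the function name and sample call are illustrative; it needs as many points as coefficients and assumes NumPy):

```python
import numpy as np

def direct_interpolate(xs, ys, x_target):
    """Solve for the polynomial that passes through every chosen point."""
    A = np.vander(xs, increasing=True)   # rows of [1, x, x^2, ...]
    coeffs = np.linalg.solve(A, ys)      # one equation per data point
    return sum(c * x_target ** k for k, c in enumerate(coeffs))

# e.g. a quadratic through three points enclosing the desired value
direct_interpolate([10.0, 15.0, 20.0], [227.04, 362.78, 517.35], 16.0)
```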

### Newton's divided difference method

This method guesses the slope to interpolate. Where \(x_0\) is an existing point:

\[\boxed{f(x)=b_0+b_1(x-x_0)}\]

The constant is an existing y-value and the slope is an average.

\[ \begin{align*} b_0&=f(x_0) \\ b_1&=\frac{f(x_1)-f(x_0)}{x_1-x_0} \end{align*} \]

This extends to a quadratic, where the second coefficient is the divided difference of the first two slopes:

\[\boxed{f(x)=b_0+b_1(x-x_0)+b_2(x-x_0)(x-x_1)}\]

\[ b_2=\frac{\frac{f(x_2)-f(x_1)}{x_2-x_1}-\frac{f(x_1)-f(x_0)}{x_1-x_0}}{x_2-x_0} \]
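
A sketch of the second-order formula (the argument names are illustrative):

```python
def newton_quadratic(x0, x1, x2, f0, f1, f2, x):
    """Second-order Newton's divided difference interpolation."""
    b0 = f0
    b1 = (f1 - f0) / (x1 - x0)                       # first divided difference
    b2 = ((f2 - f1) / (x2 - x1) - b1) / (x2 - x0)    # second divided difference
    return b0 + b1 * (x - x0) + b2 * (x - x0) * (x - x1)
```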

## Derivatives

Derivatives are estimated based on first principles:

\[f'(x)=\frac{f(x+h)-f(x)}{h}\]

### Derivatives of continuous functions

At a desired \(x\) for \(f'(x)\):

  1. Choose an arbitrary \(h\)
  2. Calculate derivative via first principles
  3. Shrink \(h\) and recalculate derivative
  4. If the answer is drastically different, repeat step 3
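
A sketch of this procedure (the starting \(h\), shrink factor, tolerance, and iteration cap are arbitrary choices):

```python
def derivative(f, x, h=0.1, tol=1e-6, max_iter=50):
    """Shrink h until the first-principles estimate stops changing."""
    prev = (f(x + h) - f(x)) / h
    for _ in range(max_iter):
        h /= 2
        curr = (f(x + h) - f(x)) / h
        if abs(curr - prev) < tol:
            return curr
        prev = curr
    return prev

derivative(lambda t: t ** 3, 2.0)   # approaches 12
```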

### Derivatives of discrete functions

Guesses are made based on the average slope between two points.

\[f'(x_i)=\frac{f(x_{i+1})-f(x_i)}{x_{i+1}-x_i}\]

### Divided differences

- Using the next term, or a \(\Delta x > 0\), indicates a forward divided difference (FDD).
- Using the previous term, or a \(\Delta x < 0\), indicates a backward divided difference (BDD).

The central divided difference (CDD) averages the two when the \(h\) (or \(\Delta x\)) of the forward and backward DDs are equal.

\[f'(x)=CDD=\frac{f(x+h)-f(x-h)}{2h}\]
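
As a sketch, the three estimates for a step \(h\):

```python
def fdd(f, x, h):
    return (f(x + h) - f(x)) / h              # forward divided difference

def bdd(f, x, h):
    return (f(x) - f(x - h)) / h              # backward divided difference

def cdd(f, x, h):
    return (f(x + h) - f(x - h)) / (2 * h)    # central: average of FDD and BDD
```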

### Higher order derivatives

Taking the Taylor expansion of the function or discrete set and rearranging it as necessary can return any order of derivative. The same expansion applies for \(x-h\) if the signs of the odd powers of \(h\) are flipped.

\[f(x+h)=f(x)+f'(x)h+\frac{f''(x)}{2!}h^2+\frac{f'''(x)}{3!}h^3\]

!!! example To find second order derivatives:

\[
\begin{align*}
f''(x)&=\frac{2f(x+h)-2f(x)-2f'(x)h}{h^2} \\
&=\frac{2f(x+h)-2f(x)-(f(x+h)-f(x-h))}{h^2} \\
&=\frac{f(x+h)-2f(x)+f(x-h)}{h^2}
\end{align*}
\]

!!! example \(f''(3)\) if \(f(x)=2e^{1.5x}\) and \(h=0.1\):

\[
\begin{align*}
f''(3)&=\frac{f(3.1)-2f(3)+f(2.9)}{0.1^2} \\
&\approx 405.84
\end{align*}
\]

(The exact value is \(f''(3)=4.5e^{4.5}\approx 405.08\).)
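
A quick check of that arithmetic (the exact value comes from differentiating \(2e^{1.5x}\) twice):

```python
import math

f = lambda x: 2 * math.exp(1.5 * x)
h = 0.1
approx = (f(3 + h) - 2 * f(3) + f(3 - h)) / h ** 2   # ~405.84
exact = 4.5 * math.exp(4.5)                          # f''(x) = 4.5 e^{1.5x}, ~405.08
```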

For discrete data:

- If the desired point does not exist, differentiating the surrounding points to create a polynomial interpolation of the derivative may be close enough.

!!! example

| t | 0 | 10 | 15 | 20 | 22.5 | 30 |
| --- | --- | --- | --- | --- | --- | --- |
| v(t) | 0 | 227.04 | 362.78 | 517.35 | 602.47 | 901.67 |

\(v'(16)\) with FDD:

Using points \(t=15\) and \(t=20\):

\[
\begin{align*}
v'(t)&=\frac{v(t+h)-v(t)}{h} \\
&=\frac{v(15+5)-v(15)}{5} \\
&=\frac{517.35-362.78}{5} \\
&=30.914
\end{align*}
\]

\(v'(16)\) with Newton's first-order interpolation:

\[
\begin{align*}
v(t)&=v(15)+\frac{v(20)-v(15)}{20-15}(t-15) \\
&=362.78+30.914(t-15) \\
&=-100.93+30.914t \\
v'(t)&=\frac{v(t+h)-v(t-h)}{2h} \\
v'(16)&=\frac{v(16.1)-v(15.9)}{0.2} \\
&=30.914
\end{align*}
\]
- If the spacing is not equal (making DD impossible), again creating an interpolation may be close enough.
- If the data is noisy, regressing and then differentiating the regression reduces random error.
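
A sketch of that last point using the table above: regress the data to a quadratic, then differentiate the fitted polynomial analytically (NumPy assumed):

```python
import numpy as np

t = np.array([0, 10, 15, 20, 22.5, 30])
v = np.array([0, 227.04, 362.78, 517.35, 602.47, 901.67])

a, b, c = np.polyfit(t, v, 2)    # least-squares fit: v(t) ≈ a t^2 + b t + c
dv_at_16 = 2 * a * 16 + b        # derivative of the fit at t = 16
```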