Solving for the same partial derivatives returns the same values, although bisection may be required for the exponent coefficient ($e^{bx}$). Instead, linearising may make things easier by taking the natural logarithm of both sides. Afterward, solving as if it were in the form $y=mx+b$ returns correct results.
!!! example
    $y=ax^b\implies\ln y = \ln a + b\ln x$
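A minimal sketch of this linearisation, assuming hypothetical data that roughly follows $y=ax^b$:

```python
import numpy as np

# Hypothetical data assumed to roughly follow y = a*x^b.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.9, 8.8, 13.2, 18.0])

# Linearise: ln y = ln a + b ln x, then fit a straight line (y = mx + b).
b, ln_a = np.polyfit(np.log(x), np.log(y), 1)
a = np.exp(ln_a)
print(a, b)  # recovered parameters of y = a*x^b
```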
### Polynomial regression
The residual $E$ is the error term added at the end of the polynomial.
$$y=a+bx+cx^2+E$$
Taking the relevant partial derivatives returns a system of equations which can be solved in a matrix.
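A minimal sketch of this approach, assuming hypothetical data and the quadratic model above; the partial derivatives of the squared error give a $3\times 3$ system that is assembled into a matrix and solved:

```python
import numpy as np

# Hypothetical data to fit with y = a + b*x + c*x^2.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 33.0])

# Normal equations obtained from the partial derivatives of the
# sum of squared residuals with respect to a, b, and c.
A = np.array([
    [len(x),       x.sum(),       (x**2).sum()],
    [x.sum(),      (x**2).sum(),  (x**3).sum()],
    [(x**2).sum(), (x**3).sum(),  (x**4).sum()],
])
rhs = np.array([y.sum(), (x*y).sum(), (x**2*y).sum()])

a, b, c = np.linalg.solve(A, rhs)  # solve the matrix system
print(a, b, c)
```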
## Interpolation
Interpolation ensures that the curve passes through every data point.
### Direct method
To interpolate $n+1$ data points, you need a polynomial of a degree **up to $n$**, and points that enclose the desired value. Substituting the $x$ and $y$ values forms a system of equations for a polynomial of a degree equal to the number of points chosen minus one.
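A minimal sketch of the direct method, assuming three hypothetical points and interpolating at $x=2.5$:

```python
import numpy as np

# Hypothetical points enclosing the value to interpolate at x = 2.5.
xs = np.array([1.0, 2.0, 3.0])
ys = np.array([1.0, 8.0, 27.0])

# Three points -> polynomial of degree 2: y = a0 + a1*x + a2*x^2.
# Substituting each point gives one equation per row.
V = np.vander(xs, N=3, increasing=True)   # columns: 1, x, x^2
a0, a1, a2 = np.linalg.solve(V, ys)

x = 2.5
print(a0 + a1*x + a2*x**2)  # interpolated value
```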
### Newton's divided difference method
This method guesses the slope to interpolate. Where $x_0$ is an existing point:
$$\boxed{f(x)=b_0+b_1(x-x_0)}$$
The constant is an existing y-value and the slope is an average.
$$
\begin{align*}
b_0&=f(x_0) \\
b_1&=\frac{f(x_1)-f(x_0)}{x_1-x_0}
\end{align*}
$$
This extends to a quadratic, where the new coefficient $b_2$ is the divided difference of the first two slopes:
$$
\begin{align*}
f(x)&=b_0+b_1(x-x_0)+b_2(x-x_0)(x-x_1) \\
b_2&=\frac{\frac{f(x_2)-f(x_1)}{x_2-x_1}-\frac{f(x_1)-f(x_0)}{x_1-x_0}}{x_2-x_0}
\end{align*}
$$
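A minimal sketch of this method for an arbitrary number of points, assuming hypothetical data:

```python
# Newton's divided differences for hypothetical points.
def divided_difference_coeffs(xs, ys):
    """Return b0, b1, ... built from successive divided differences."""
    coeffs = list(ys)
    n = len(xs)
    for level in range(1, n):
        # Each pass replaces entries with the next level of divided differences.
        for i in range(n - 1, level - 1, -1):
            coeffs[i] = (coeffs[i] - coeffs[i - 1]) / (xs[i] - xs[i - level])
    return coeffs

def newton_eval(xs, coeffs, x):
    """Evaluate b0 + b1(x-x0) + b2(x-x0)(x-x1) + ..."""
    total, product = 0.0, 1.0
    for b, x0 in zip(coeffs, xs):
        total += b * product
        product *= (x - x0)
    return total

xs = [1.0, 2.0, 3.0]
ys = [1.0, 8.0, 27.0]
print(newton_eval(xs, divided_difference_coeffs(xs, ys), 2.5))
```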
## Differentiation
Derivatives are estimated based on first principles:
$$f'(x)=\frac{f(x+h)-f(x)}{h}$$
### Derivatives of continuous functions
At a desired $x$ for $f'(x)$:
1. Choose an arbitrary $h$
2. Calculate derivative via first principles
3. Shrink $h$ and recalculate derivative
4. If the answer is drastically different, repeat step 3
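A minimal sketch of this procedure, assuming $f(x)=\sin x$ at $x=1$ and a tolerance of $10^{-6}$:

```python
import math

f, x = math.sin, 1.0
h = 0.1                       # 1. arbitrary starting step
prev = (f(x + h) - f(x)) / h  # 2. first-principles estimate
while True:
    h /= 10                   # 3. shrink h and recalculate
    curr = (f(x + h) - f(x)) / h
    if abs(curr - prev) < 1e-6:  # 4. stop once the answer stabilises
        break
    prev = curr
print(curr, math.cos(x))  # estimate vs. exact derivative
```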
### Derivatives of discrete functions
Guesses are made based on the average slope between two points.
$$f'(x_i)=\frac{f(x_{i+1})-f(x_i)}{x_{i+1}-x_i}$$
### Divided differences
- Using the next term, or a $\Delta x > 0$ indicates a **forward divided difference (FDD)**.
- Using the previous term, or a $\Delta x < 0$, indicates a **backward divided difference (BDD)**.
The **central divided difference** averages both if $h$ or $\Delta x$ of the forward and backward DDs are equal.
$$f'(x)=CDD=\frac{f(x+h)-f(x-h)}{2h}$$
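A minimal sketch comparing the three divided differences, assuming $f(x)=e^x$ at $x=1$ with $h=0.1$:

```python
import math

f, x, h = math.exp, 1.0, 0.1

fdd = (f(x + h) - f(x)) / h            # forward divided difference
bdd = (f(x) - f(x - h)) / h            # backward divided difference
cdd = (f(x + h) - f(x - h)) / (2 * h)  # central: average of FDD and BDD

print(fdd, bdd, cdd, math.exp(x))  # CDD is closest to the true derivative
```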
### Higher order derivatives
Taking the Taylor expansion of the function or discrete set and expanding it as necessary can return any order of derivative. The same applies to the expansion at $x-h$, where the signs of the odd-order terms alternate.
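For example, adding the Taylor expansions of $f(x+h)$ and $f(x-h)$ cancels the odd-order terms, leaving an estimate for the second derivative:
$$f''(x)\approx\frac{f(x+h)-2f(x)+f(x-h)}{h^2}$$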
- If the desired point does not exist, differentiating the surrounding points to create a polynomial interpolation of the derivative may be close enough.
## Integration
### Trapezoidal rule
The error for the $i$th trapezoidal segment is $|E_i|=\frac{h^3}{12}\left|f''(x)\right|$. This can be approximated with a maximum value $M$ of $|f''|$ over the interval:
$$\boxed{|E_T|\leq(b-a)\frac{h^2}{12}M}$$
### Simpson's 1/3 rule
This uses a second-order polynomial with **two segments**. Three points are usually used: $a,\frac{a+b}{2},b$. Thus, for two segments:
$$\int^b_af(x)dx\approx\frac h 3\left[f(a)+4f\left(\frac{a+b}{2}\right)+f(b)\right]$$
For an arbitrary number of segments, as long as there are an **even number** of **equal** segments:
$$\int^b_af(x)dx=\frac{b-a}{3n}\left[f(x_0)+4\sum^{n-1}_{\substack{i=1 \\ \text{i is odd}}}f(x_i)+2\sum^{n-2}_{\substack{i=2 \\ \text{i is even}}}f(x_i)+f(x_n)\right]$$
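A minimal sketch of the composite rule above, assuming $f(x)=e^x$ over $[0,1]$ with $n=4$ equal segments:

```python
import math

def simpson_13(f, a, b, n):
    """Composite Simpson's 1/3 rule; n must be even."""
    h = (b - a) / n
    xs = [a + i * h for i in range(n + 1)]
    odd = sum(f(xs[i]) for i in range(1, n, 2))       # odd-indexed interior points
    even = sum(f(xs[i]) for i in range(2, n - 1, 2))  # even-indexed interior points
    return (b - a) / (3 * n) * (f(a) + 4 * odd + 2 * even + f(b))

print(simpson_13(math.exp, 0.0, 1.0, 4), math.e - 1)  # estimate vs. exact
```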
The error is bounded by:
$$|E_T|\leq(b-a)\frac{h^4}{180}M$$
where $M$ is the maximum value of $|f^{(4)}(x)|$ on the interval.
## Ordinary differential equations
### Initial value problems
These problems specify all conditions at a single value of $x$.
**Euler's method** converts the function to the form $y'=f(x,y)$, then steps forward from the initial condition:
$$y_{n+1}=y_n+hf(x_n,y_n)$$
The **Runge-Kutta fourth-order method** is the most accurate of the three methods:
$$y_{n+1}=y_n+\frac 1 6(k_1+2k_2+2k_3+k_4)$$
- $k_1=hf(x_n,y_n)$
- $k_2=hf(x_n+\tfrac 1 2h,y_n+\tfrac 1 2k_1)$
- $k_3=hf(x_n+\tfrac 1 2 h, y_n+\tfrac 1 2 k_2)$
- $k_4=hf(x_n+h,y_n+k_3)$
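A minimal sketch of this method, assuming the test problem $y'=y$, $y(0)=1$ (exact solution $e^x$):

```python
import math

def rk4_step(f, x, y, h):
    """Advance one step of the Runge-Kutta fourth-order method."""
    k1 = h * f(x, y)
    k2 = h * f(x + h / 2, y + k1 / 2)
    k3 = h * f(x + h / 2, y + k2 / 2)
    k4 = h * f(x + h, y + k3)
    return y + (k1 + 2 * k2 + 2 * k3 + k4) / 6

f = lambda x, y: y
x, y, h = 0.0, 1.0, 0.1
for _ in range(10):          # advance from x = 0 to x = 1
    y = rk4_step(f, x, y, h)
    x += h
print(y, math.e)             # RK4 estimate vs. exact value
```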
### Higher order ODEs
Higher order ODEs can be solved by reducing them to a system of first order ODEs. For a second order ODE, let $y'=u$:
$$
\begin{align*}
y'&=u \\
u'&=f(x,y,u)
\end{align*}
$$
For each ODE, any of the above methods can be used (here, Euler's method):
$$
\begin{align*}
y_{n+1}&=y_n+hu_n \\
u_{n+1}&=u_n+hf(x_n,y_n,u_n)
\end{align*}
$$
!!! example
    For $y''+xy'+y=0,y(0)=1,y'(0)=2,h=0.1$:

    $$
    \begin{align*}
    y'&=u \\
    u'&=-xu-y \\
    y_1&=y_0+0.1u_0 \\
    &=1+0.1\times 2 \\
    &=1.2 \\
    \\
    u_1&=u_0+0.1\times f(x_0,y_0,u_0) \\
    &=u_0+0.1(-x_0u_0-y_0) \\
    &=2+0.1(-0\times 2-1) \\
    &=1.9
    \end{align*}
    $$
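A minimal sketch reproducing the example above with Euler's method on the reduced system:

```python
# y'' + x*y' + y = 0 becomes y' = u, u' = -x*u - y.
f = lambda x, y, u: -x * u - y

x, y, u, h = 0.0, 1.0, 2.0, 0.1
for _ in range(2):  # take a couple of steps
    y, u = y + h * u, u + h * f(x, y, u)  # both updates use the old values
    x += h
    print(x, y, u)  # first step gives y = 1.2, u = 1.9 as in the example
```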
### Boundary value problems
The **finite difference method** divides the interval between the boundaries into $n$ sub-intervals, replacing derivatives with their first principles (divided difference) representations. Writing the resulting equation at each of the $n-1$ interior points returns a proper system of equations.
!!! example
    For $y''+2y'+y=x^2, y(0)=0.2,y(1)=0.8,n=4\implies h=0.25$:
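A minimal sketch of this example, assuming central divided differences are used for both $y''$ and $y'$ (one common choice):

```python
import numpy as np

n, h = 4, 0.25
xs = np.linspace(0.0, 1.0, n + 1)
y0, yn = 0.2, 0.8                      # boundary values y(0) and y(1)

A = np.zeros((n - 1, n - 1))
b = np.zeros(n - 1)
for row, i in enumerate(range(1, n)):  # one equation per interior point
    left = 1 / h**2 - 1 / h            # coefficient of y_{i-1}
    mid = -2 / h**2 + 1                # coefficient of y_i
    right = 1 / h**2 + 1 / h           # coefficient of y_{i+1}
    b[row] = xs[i] ** 2
    if i > 1:
        A[row, row - 1] = left
    else:
        b[row] -= left * y0            # known boundary value moves to the RHS
    A[row, row] = mid
    if i < n - 1:
        A[row, row + 1] = right
    else:
        b[row] -= right * yn

interior = np.linalg.solve(A, b)
print(np.concatenate(([y0], interior, [yn])))  # y at x = 0, 0.25, 0.5, 0.75, 1
```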