MFDS II Flashcards
Continuity
A function f(x) is said to be continuous at a point x=a if the following conditions are met:
f(a) is defined
lim(x→a) f(x) exists
lim(x→a) f(x) = f(a)
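A minimal sketch (assuming sympy is installed; the piecewise function is an illustrative choice) that checks all three conditions at a = 1:

```python
# Sketch (assumes sympy): checking the three continuity conditions at a = 1
# for an illustrative piecewise function.
import sympy as sp

x = sp.symbols('x')
a = 1
f = sp.Piecewise(((x**2 - 1) / (x - 1), sp.Ne(x, a)), (2, True))

value = f.subs(x, a)                          # 1) f(a) is defined -> 2
lim = sp.limit((x**2 - 1) / (x - 1), x, a)    # 2) the limit exists -> 2
print(value, lim, value == lim)               # 3) limit equals f(a): True
```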
Differentiability
A function f(x) is said to be differentiable at a point x=a if the derivative f’(a) exists. The derivative is defined as:
f’(a) = lim(h→0) [f(a+h) - f(a)]/h
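A small sketch (again assuming sympy; f(x) = x² and a = 3 are illustrative choices) that evaluates this limit definition directly and compares it with sympy's own derivative:

```python
# Sketch (assumes sympy): the limit definition of the derivative vs sp.diff.
import sympy as sp

x, h = sp.symbols('x h')
f = x**2          # illustrative function
a = 3             # illustrative point

# f'(a) = lim h->0 [f(a+h) - f(a)] / h
by_limit = sp.limit((f.subs(x, a + h) - f.subs(x, a)) / h, h, 0)
by_diff = sp.diff(f, x).subs(x, a)
print(by_limit, by_diff)   # 6 6 -> the two computations agree
```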
Continuity and differentiability relation
If a function is differentiable at a point, it is also continuous at that point.
If a function is continuous at a point, it is not necessarily differentiable at that point (e.g., f(x) = |x| at x = 0; see the sketch below).
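A sketch (assuming sympy) of that counterexample: |x| is continuous at 0, but its one-sided difference quotients disagree there:

```python
# Sketch (assumes sympy): |x| is continuous at 0 but not differentiable.
import sympy as sp

x, h = sp.symbols('x h')
f = sp.Abs(x)

print(sp.limit(f, x, 0))              # 0, equals f(0) -> continuous at 0

quotient = (f.subs(x, h) - f.subs(x, 0)) / h   # difference quotient at a = 0
print(sp.limit(quotient, h, 0, '+'))  # 1  (slope from the right)
print(sp.limit(quotient, h, 0, '-'))  # -1 (slope from the left) -> f'(0) DNE
```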
Real analysis applications
1 Machine learning
Linear regression (OLS estimator)
Gradient descent (optimizing the loss function)
2 Time series analysis
Fourier series (decomposing time series data)
3 Data compression
PCA algorithm
SVD algorithm (see the PCA-via-SVD sketch after this list)
4 Signal processing
Noise reduction by frequency-domain filtering
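As referenced in item 3, a minimal sketch (assuming numpy; the data matrix is made up for illustration) of rank-1 PCA compression via SVD:

```python
# Sketch (assumes numpy): rank-1 PCA compression of a toy data matrix via SVD.
import numpy as np

X = np.array([[2.0, 4.1],
              [1.0, 2.0],
              [3.0, 6.2],
              [2.5, 5.0]])             # toy data, roughly on a line
Xc = X - X.mean(axis=0)                # center the data first

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                            # first principal direction
scores = Xc @ pc1                      # 1-D compressed representation
X_hat = np.outer(scores, pc1) + X.mean(axis=0)   # reconstruct from 1 component
print(np.round(X_hat, 2))              # close to X: most variance retained
```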
Convergence
A sequence converges if the limit of its nth term exists and is finite: lim(n→∞) aₙ = L for some finite L
Monotonic sequence
A monotonic sequence is a sequence whose terms are either consistently increasing or consistently decreasing
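A tiny illustrative sketch tying the two cards together: aₙ = 1 − 1/n is monotonically increasing, bounded above, and converges to 1:

```python
# Sketch: a_n = 1 - 1/n is monotonically increasing and converges to 1.
terms = [1 - 1 / n for n in range(1, 1001)]

print(all(a < b for a, b in zip(terms, terms[1:])))  # True -> increasing
print(terms[-1])   # 0.999 -> approaching the finite limit 1
```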
Gradient
A vector that points in the direction of the maximum rate of increase of a function:
∇f(x,y) = (∂f/∂x, ∂f/∂y)
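A plain-Python sketch (f(x, y) = x² + 3y is an illustrative choice) approximating this gradient with central differences:

```python
# Sketch: central-difference approximation of the gradient of f(x, y).
def grad(f, x, y, eps=1e-6):
    dfdx = (f(x + eps, y) - f(x - eps, y)) / (2 * eps)   # ~ df/dx
    dfdy = (f(x, y + eps) - f(x, y - eps)) / (2 * eps)   # ~ df/dy
    return dfdx, dfdy

f = lambda x, y: x**2 + 3 * y
print(grad(f, 1.0, 2.0))   # ~(2.0, 3.0), the exact gradient at (1, 2)
```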
Divergence
Measures the extent to which a vector field spreads out from a point: ∇·F = ∂Fx/∂x + ∂Fy/∂y + ∂Fz/∂z. It is a scalar quantity and represents the rate of change of the field's density.
Curl
Measures the rotation or circulation of a vector field at a point: ∇×F. The result is another vector that represents the axis and magnitude of the rotation.
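A sketch (assuming sympy.vector; F = xy i + yz j + zx k is an illustrative field) computing both quantities:

```python
# Sketch (assumes sympy.vector): divergence and curl of an illustrative field.
from sympy.vector import CoordSys3D, divergence, curl

N = CoordSys3D('N')
F = N.x * N.y * N.i + N.y * N.z * N.j + N.z * N.x * N.k

print(divergence(F))   # x + y + z: scalar net outflow per unit volume
print(curl(F))         # -y i - z j - x k: vector giving the rotation axis/strength
```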
Directional derivative
The directional derivative of a function is a scalar quantity that represents the rate of change of the function in a specific direction.
Example:
φ = x²yz³ at the point (1, 1, 1)
v = i + 2j − 3k
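Working the card's example with a sympy sketch, using D_v φ = ∇φ · v̂ at the given point:

```python
# Sketch (assumes sympy): directional derivative of phi = x^2*y*z^3
# at (1, 1, 1) along v = i + 2j - 3k.
import sympy as sp

x, y, z = sp.symbols('x y z')
phi = x**2 * y * z**3

grad = sp.Matrix([sp.diff(phi, s) for s in (x, y, z)])   # (2xyz^3, x^2 z^3, 3x^2 y z^2)
grad_p = grad.subs({x: 1, y: 1, z: 1})                   # (2, 1, 3) at the point

v = sp.Matrix([1, 2, -3])
v_hat = v / v.norm()                                     # unit vector, |v| = sqrt(14)

print(sp.simplify(grad_p.dot(v_hat)))   # -5*sqrt(14)/14, i.e. -5/sqrt(14)
```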
Applications of partial derivatives in data science
Gradient descent (minimizing the loss function)
Neural network optimization (gradient of the loss function w.r.t. network weights; see the sketch after this list)
Regression analysis (fitting data by minimizing an error function)
Support vector machines (optimizing SVM hyperparameters)
Expectation-maximization algorithm (iteratively updating parameter estimates)
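As referenced in item 2, a minimal plain-Python sketch of gradient updates for a single linear neuron, where each weight moves against its partial derivative of the squared loss (all values illustrative):

```python
# Sketch: gradient updates for a single linear neuron, pred = w*x + b,
# using the partial derivatives of L = (pred - y)**2 w.r.t. w and b.
def step(w, b, x, y, lr=0.05):
    err = (w * x + b) - y
    dw = 2 * err * x          # dL/dw
    db = 2 * err              # dL/db
    return w - lr * dw, b - lr * db

w, b = 0.0, 0.0
for _ in range(200):
    w, b = step(w, b, x=2.0, y=5.0)
print(w, b)   # converges so that w*2 + b ≈ 5
```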