Showing posts from March, 2024

Linear Regression

Linear Regression: Mathematical Foundations

Linear regression is a fundamental statistical technique used to predict a real-valued output \( y \in \mathbb{R} \) for a given input data point \( x \in \mathbb{R}^D \). It assumes that the expected value of the target variable is a linear function of the input features:

$$ \mathbb{E}[y \mid x] = w^\top x $$

1. Dataset Representation

Let the training dataset be represented by a feature matrix

$$ X \in \mathbb{R}^{N \times D}, $$

where \( N \) is the number of data points and \( D \) is the number of features. The dataset can be written column-wise as

$$ X = [x_1, x_2, \dots, x_D], $$

where each \( x_i \) (for \( i = 1, \dots, D \)) is a column vector holding the values of one feature across all \( N \) samples.

2. Model Formulation

A general polynomial form of regression can be written as:

$$ y = w_0 + w_{11} x_1 + w_{12} x_1^2 + \dots + w_{21} x_2 + w_{22}... $$
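
The excerpt is cut off before the fitting step, but the model above pairs naturally with the ordinary least-squares solution \( w = (X^\top X)^{-1} X^\top y \). Below is a minimal NumPy sketch of that closed-form fit; the data, variable names, and noise level are illustrative assumptions, not taken from the post.

```python
# Minimal least-squares sketch for the linear model E[y | x] = w^T x.
# Assumes the standard squared-error objective; the post's own
# derivation is truncated, so this mirrors the usual treatment.
import numpy as np

rng = np.random.default_rng(0)

N, D = 100, 3                              # N data points, D features
X = rng.normal(size=(N, D))                # feature matrix X in R^{N x D}
w_true = np.array([2.0, -1.0, 0.5])        # hypothetical ground-truth weights
y = X @ w_true + 0.1 * rng.normal(size=N)  # noisy linear targets

# Closed-form least squares: w = (X^T X)^{-1} X^T y.
# np.linalg.lstsq solves the same problem in a numerically stable way.
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

print(w_hat)  # recovers w_true up to noise

# The polynomial model of section 2 fits the same framework: augment X
# with powers of each feature, e.g. [x_1, x_1^2, x_2, x_2^2, ...], and
# solve the identical least-squares problem in the expanded basis.
```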

Basis Change and Matrix Approximation with SVD

Basis Change and Matrix Approximation with SVD

1. Basis Change

In this section, we'll see how a transformation matrix changes when we change the bases of a linear mapping. Let there be a linear transformation

$$ \phi: V \rightarrow W, $$

where \( V = \mathbb{R}^n \) and \( W = \mathbb{R}^m \). Let the ordered bases for \( V \) and \( W \) be

$$ B = (b_1, \dots, b_n), \quad C = (c_1, \dots, c_m), $$

and the new bases be

$$ \tilde{B} = (\tilde{b}_1, \dots, \tilde{b}_n), \quad \tilde{C} = (\tilde{c}_1, \dots, \tilde{c}_m). $$

If \( A \) is the transformation matrix of \( \phi \) with respect to the bases \( (B, C) \), and \( \tilde{A} \) is the corresponding matrix with respect to \( (\tilde{B}, \tilde{C}) \), we aim to relate \( A \) and \( \tilde{A} \).

1.1 Relationship between old and new bases

Each new basis vector in \( \tilde{B} \) can be written as a linear combi...
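
The excerpt breaks off before stating the result, but the standard identity this setup leads to is \( \tilde{A} = T^{-1} A S \), where the columns of \( S \) express the new basis \( \tilde{B} \) in the old basis \( B \), and the columns of \( T \) do the same for \( \tilde{C} \) and \( C \). The NumPy sketch below checks that identity numerically; the matrix names \( S \) and \( T \) and the random test data are assumptions for illustration.

```python
# Numerical check of the basis-change identity A_tilde = T^{-1} A S.
# S: columns are the new basis B~ written in old B-coordinates.
# T: columns are the new basis C~ written in old C-coordinates.
import numpy as np

rng = np.random.default_rng(1)

n, m = 3, 2
A = rng.normal(size=(m, n))   # matrix of phi w.r.t. the old bases (B, C)
S = rng.normal(size=(n, n))   # change-of-basis matrix on V
T = rng.normal(size=(m, m))   # change-of-basis matrix on W
# (random square matrices are invertible with probability 1,
#  so their columns form valid bases)

A_tilde = np.linalg.inv(T) @ A @ S

# Sanity check: mapping a vector through either coordinate route agrees.
x_tilde = rng.normal(size=n)       # coordinates of some v in basis B~
x = S @ x_tilde                    # the same vector in basis B
y = A @ x                          # phi(v) in basis C
y_tilde = np.linalg.inv(T) @ y     # phi(v) in basis C~

assert np.allclose(A_tilde @ x_tilde, y_tilde)
print(A_tilde)
```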