
Showing posts from March, 2024

Linear Regression

Linear regression is used to predict a real-valued output for a given input data point. It establishes a relationship between the dependent variable and the features of the input under the assumption that the expected value of the output (the dependent variable) is a linear function of the input: $E[y \mid x] = w^\top x$. Let's assume our training dataset is $\{(x_i, y_i)\}_{i=1}^{N}$, where $N$ is the number of data points and $D$ is the number of dimensions (features) in our dataset. From now on we will write our dataset as $X = [x_1, \dots, x_N]$, where each $x_i \in \mathbb{R}^D$ for $i = 1, \dots, N$ is a column vector. We can write the output as a polynomial in the input, $y = w_0 + w_1 x + w_2 x^2 + \dots + w_d x^d$, or equivalently in vector form, $y = w^\top \phi(x)$. Before computing the final weights for this equation, we need to decide which degree to choose; we usually select the degree that gives the lowest mean squared error (MSE). The most common form of linear regression is the degree-1 form: $y = w_0 + w_1 x$. There are two ways to estimate the parameters. Normal equation: the weight vector is estimated by matrix multiplication o...
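As a rough sketch of the normal-equation approach described above, the snippet below fits a degree-1 model $y = w_0 + w_1 x$ on made-up data with NumPy. The data, true slope/intercept, and noise level are all illustrative assumptions, not taken from the post:

```python
import numpy as np

# Synthetic data (hypothetical): y = 3x + 2 plus a little Gaussian noise.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=50)
y = 3.0 * x + 2.0 + rng.normal(0, 0.1, size=50)

# Augment with a column of ones so the intercept w0 is learned as well.
X = np.column_stack([np.ones_like(x), x])

# Normal equation: theta = (X^T X)^{-1} X^T y. lstsq solves the same
# least-squares problem but is numerically more stable than an explicit inverse.
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Mean squared error of the fitted model on the training data.
mse = np.mean((X @ theta - y) ** 2)
print(theta, mse)
```

With low noise, `theta` lands close to the true `[2.0, 3.0]` and the MSE is small.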

Basis Change and Matrix Approximation with SVD

Basis Change:  In this blog we will see how the transformation matrix of a linear mapping changes with a change of basis. Let's assume we have a linear mapping $\Phi: V \to W$ with $\dim(V) = n$ and $\dim(W) = m$; the ordered bases for the vector spaces $V$ and $W$ are $B = (b_1, \dots, b_n)$ and $C = (c_1, \dots, c_m)$ respectively, and we are changing these bases to $\tilde{B} = (\tilde{b}_1, \dots, \tilde{b}_n)$ and $\tilde{C} = (\tilde{c}_1, \dots, \tilde{c}_m)$ for the vector spaces $V$ and $W$ respectively. Also assume the transformation matrix in the case of bases $B$ and $C$ is $A_{\Phi}$, and after changing the bases to $\tilde{B}$ and $\tilde{C}$ it becomes $\tilde{A}_{\Phi}$.  In this blog I will use subscripts $B$, $C$, $\tilde{B}$ and $\tilde{C}$ to represent the coordinates of vectors in the bases $B$, $C$, $\tilde{B}$ and $\tilde{C}$ respectively. As we know, basis vectors span the entire vector space, so $B$ will span the entire vector space $V$, and will therefore also span the new basis vectors $\tilde{B}$. So we can say that each new basis vector of $\tilde{B}$ is a linear combination of the basis vectors of $B$, and we can write it as $\tilde{b}_j = \sum_{i=1}^{n} s_{ij}\, b_i$. We store these coefficients in a column-wise manner in a matrix (say $S$), so the entries in the $j$-th column of $S$ will be the coef...