Monday, February 25, 2019

Linear regression: Ridge regression


Why should we add $\lambda I$ to the matrix $X^{T}X$?
1. To reduce the magnitude of the weights $\omega$.
2. To make $X^{T}X$ invertible.
3. To prevent overfitting.
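
Adding $\lambda$ times the identity matrix $I$ to $X^{T}X$ gives the ridge regression closed-form solution:
$$\omega_{ridge}=(X^{T}X+\lambda I)^{-1}X^{T}y$$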

Why can $X^{T}X$ fail to be invertible?
$$\omega_{LMS}=(X^{T}X)^{-1}X^{T}y$$
1. The number of data points (rows of X) is smaller than the dimension of $\omega$.
2. The columns of X are not linearly independent, e.g., a column is a duplicate of one of the other features, or a scaled version of another column. Such columns are linearly dependent on each other.
Example:
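A minimal numpy sketch with made-up values: the third column of X duplicates the first, so $X^{T}X$ is rank-deficient, and adding $\lambda I$ restores invertibility.

```python
import numpy as np

# Made-up design matrix: the third column duplicates the first,
# so the columns are linearly dependent.
X = np.array([[1.0, 2.0, 1.0],
              [2.0, 0.0, 2.0],
              [3.0, 1.0, 3.0],
              [4.0, 5.0, 4.0]])
y = np.array([1.0, 2.0, 3.0, 4.0])

A = X.T @ X
print(np.linalg.matrix_rank(A))  # prints 2 < 3, so A is singular

# Ridge: adding lambda * I makes the matrix invertible
# and shrinks the weights toward zero.
lam = 0.1
w_ridge = np.linalg.solve(A + lam * np.eye(X.shape[1]), X.T @ y)
print(w_ridge)
```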

Sunday, February 24, 2019

Linear regression: Polynomial basis function

From the previous post, a first-order model fitted the data like this:

To fit the data better, we need more information and a more complex model. Here is the first-order model used in the plot above:
$$y=ax+b$$
Let's turn it into a second-order model:
$$y=ax^{2}+bx+c$$
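
As a sketch (with made-up data points), here is how the second-order model reduces to ordinary least squares once each input $x$ is expanded into the basis $[x^{2}, x, 1]$:

```python
import numpy as np

# Made-up 1-D data for illustration.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([0.1, 0.8, 1.7, 3.2, 4.9, 7.1, 9.4])

# Second-order basis expansion: each row is [x^2, x, 1],
# so linear least squares fits y = a*x^2 + b*x + c.
X = np.column_stack([x**2, x, np.ones_like(x)])

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b, c = coef
print(a, b, c)
```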

Saturday, February 23, 2019

Linear regression: Univariate and multivariate models


Here is a simple univariate linear equation:
$$y=ax+b$$
The goal here is to find the coefficients a, b that fit the linear model. As an example, I applied it to the iris data, using only sepal length and petal length: I want to predict petal length from sepal length.

Let sepal length be the input and petal length be the target variable.
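
Here is a minimal sketch of that fit, assuming scikit-learn's bundled iris dataset (the post's own code isn't shown):

```python
import numpy as np
from sklearn.datasets import load_iris

iris = load_iris()
sepal_length = iris.data[:, 0]  # input x
petal_length = iris.data[:, 2]  # target y

# Fit y = a*x + b via the least-squares closed form:
# append a column of ones so b becomes part of the weight vector.
X = np.column_stack([sepal_length, np.ones_like(sepal_length)])
a, b = np.linalg.lstsq(X, petal_length, rcond=None)[0]
print(f"petal_length ~ {a:.2f} * sepal_length + {b:.2f}")
```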

Thursday, February 21, 2019

SIFT: Scale-Invariant Feature Transform

SIFT is a commonly used feature detector in computer vision. Today I'm going to briefly explain the concepts I learned.

Here is an image from which we want to extract keypoints:
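
As a sketch of what keypoint extraction looks like in practice, OpenCV ships a SIFT implementation (the file name below is a placeholder):

```python
import cv2

# Placeholder path: any grayscale image works.
img = cv2.imread("image.jpg", cv2.IMREAD_GRAYSCALE)

# cv2.SIFT_create() is available in OpenCV >= 4.4; older builds
# need opencv-contrib's cv2.xfeatures2d.SIFT_create().
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(img, None)
print(len(keypoints), "keypoints, descriptors shape:", descriptors.shape)

# Draw keypoints with their scale and orientation.
out = cv2.drawKeypoints(
    img, keypoints, None,
    flags=cv2.DRAW_MATCHES_FLAGS_DRAW_RICH_KEYPOINTS)
cv2.imwrite("keypoints.jpg", out)
```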


