Spectral Clustering Example in Python

    Spectral clustering is a technique that uses the spectrum (eigenvalues) of the data's similarity matrix to perform dimensionality reduction before clustering. It is a useful and easy-to-implement clustering method.

    The Scikit-learn API provides the SpectralClustering class to implement the spectral clustering method in Python. SpectralClustering applies the clustering to a projection of the normalized Laplacian. In this tutorial, we'll briefly learn how to cluster and visualize data with SpectralClustering in Python. The tutorial covers:

  1. Preparing the data
  2. Clustering with the SpectralClustering and visualizing
  3. Source code listing
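
    As a quick preview, below is a minimal sketch of the whole workflow. The generated blob data and the parameter values are illustrative assumptions, not the tutorial's actual dataset or settings.

from sklearn.cluster import SpectralClustering
from sklearn.datasets import make_blobs
import matplotlib.pyplot as plt

# Generate simple sample data (illustrative; the tutorial prepares its own data)
x, _ = make_blobs(n_samples=300, centers=4, cluster_std=0.8, random_state=1)

# Cluster by projecting onto the normalized Laplacian and grouping the projection
sc = SpectralClustering(n_clusters=4, affinity="nearest_neighbors", random_state=1)
labels = sc.fit_predict(x)

# Visualize the resulting clusters
plt.scatter(x[:, 0], x[:, 1], c=labels, cmap="viridis", s=20)
plt.title("SpectralClustering result")
plt.show()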

TSNE Visualization Example in Python

    T-distributed Stochastic Neighbor Embedding (t-SNE) is a tool for visualizing high-dimensional data. t-SNE, based on stochastic neighbor embedding, is a nonlinear dimensionality reduction technique for visualizing data in a two- or three-dimensional space.

    The Scikit-learn API provides the TSNE class to visualize data with the t-SNE method. In this tutorial, we'll briefly learn how to fit and visualize data with TSNE in Python. The tutorial covers:

  1. Iris dataset TSNE fitting and visualizing
  2. MNIST dataset TSNE fitting and visualizing
  3. Source code listing
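
    As a quick preview, below is a minimal sketch of fitting and plotting a t-SNE embedding of the Iris data; the perplexity value and plot styling are illustrative choices, not the tutorial's exact settings.

from sklearn.manifold import TSNE
from sklearn.datasets import load_iris
import matplotlib.pyplot as plt

# Load the Iris data (4 features per sample)
iris = load_iris()

# Embed the data into two dimensions with t-SNE
tsne = TSNE(n_components=2, perplexity=30, random_state=1)
x_embedded = tsne.fit_transform(iris.data)

# Visualize the embedding, colored by class label
plt.scatter(x_embedded[:, 0], x_embedded[:, 1], c=iris.target, cmap="viridis", s=20)
plt.title("t-SNE embedding of the Iris dataset")
plt.show()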

Dimensionality Reduction with Sparse, Gaussian Random Projection and PCA in Python

    Dimensionality reduction is used when we deal with large datasets that contain too many features, to increase calculation speed, reduce model size, and visualize huge datasets more effectively. The purpose of this method is to keep the most important information in the data while removing most of the redundant features.

    In this tutorial, we'll briefly learn how to reduce data dimensions with Sparse random projection, Gaussian random projection, and PCA methods in Python. The Scikit-learn API provides the SparseRandomProjection and GaussianRandomProjection classes and the PCA transformer to reduce data dimensions. After reading this tutorial, you'll know how to reduce the dimensionality of a dataset by using those methods. The tutorial covers:

  1. Preparing the data
  2. Gaussian random projection
  3. Sparse random projection
  4. PCA projection
  5. MNIST data projection
  6. Source code listing
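
    As a quick preview, below is a minimal sketch that applies the three methods to the digits data; the dataset and the target dimension of two are illustrative assumptions rather than the tutorial's exact setup.

from sklearn.random_projection import GaussianRandomProjection, SparseRandomProjection
from sklearn.decomposition import PCA
from sklearn.datasets import load_digits

# Load a 64-dimensional dataset to reduce (illustrative choice)
x, _ = load_digits(return_X_y=True)

# Reduce the data to two components with each method
x_grp = GaussianRandomProjection(n_components=2, random_state=1).fit_transform(x)
x_srp = SparseRandomProjection(n_components=2, random_state=1).fit_transform(x)
x_pca = PCA(n_components=2).fit_transform(x)

# Compare the original and reduced shapes
print(x.shape, x_grp.shape, x_srp.shape, x_pca.shape)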

Curve Fitting Example With Nonlinear Least Squares in R

    Nonlinear Least Squares (NLS) estimates the parameters of a nonlinear model. R provides the 'nls' function to fit nonlinear data. The 'nls' function tries to find the best parameters of a given model by iteratively refining the parameter estimates.

    In this tutorial, we'll briefly learn how to fit nonlinear data by using the 'nls' function in R. The 'nls' function comes with the base 'stats' package. The tutorial covers:
  1. Preparing the data
  2. Fitting the model and prediction
  3. Source code listing
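
    As a quick preview, below is a minimal sketch of fitting an assumed exponential decay model with 'nls'; the generated data, model formula, and starting values are illustrative, not the tutorial's actual example.

# Generate sample data from an exponential decay with noise (illustrative)
set.seed(1)
x <- 1:50
y <- 20 * exp(-0.08 * x) + rnorm(50, sd = 0.5)

# Fit y ~ a * exp(b * x) with 'nls', supplying rough starting values
fit <- nls(y ~ a * exp(b * x), start = list(a = 10, b = -0.1))
summary(fit)

# Plot the data and the fitted curve
plot(x, y)
lines(x, predict(fit), col = "red")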

Principal Component Analysis (PCA) Example in Python

    Principal Component Analysis (PCA) is an unsupervised learning approach that transforms the feature data, changing its dimensions and reducing the number of variables in a dataset. No label or response data is considered in this analysis. The Scikit-learn API provides the PCA transformer, which learns the components of the data and projects the input data onto the learned components.

    In this tutorial, we'll briefly learn how to do principal component analysis with the PCA transformer, change data dimensions, and visualize the projected data in Python. The tutorial covers:

  1. Extracting principal components
  2. Dimension changing and visualizing
  3. Iris PCA Example
  4. Source code listing

    We'll start by loading the required libraries and functions.

from sklearn.decomposition import PCA
from sklearn.datasets import load_iris
import matplotlib.pyplot as plt
import numpy as np 
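
    As a quick preview, below is a minimal sketch of the steps that follow, using the libraries loaded above; the choice of two components and the plot styling are illustrative, not the tutorial's exact settings.

# Load the Iris data and learn two principal components
iris = load_iris()
pca = PCA(n_components=2)
x_pca = pca.fit_transform(iris.data)

# Show how much variance each learned component explains
print(pca.explained_variance_ratio_)

# Visualize the data projected onto the learned components
plt.scatter(x_pca[:, 0], x_pca[:, 1], c=iris.target, cmap="viridis", s=20)
plt.title("Iris data projected onto two principal components")
plt.show()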
 

K-Nearest Neighbor Regression Example in R

    K-Nearest Neighbors (KNN) is a supervised machine learning algorithm that can be used for both classification and regression problems. In this algorithm, k is a user-defined constant, and a prediction for a query point is made from the k nearest training samples, found by computing their distances to the query point.

    The 'caret' package provides the 'knnreg' function to apply KNN to regression problems.

    In this tutorial, we'll briefly learn how to fit and predict regression data by using the 'knnreg' function in R. The tutorial covers:
  1. Preparing the data
  2. Fitting the model and prediction
  3. Accuracy checking
  4. Source code listing

    We'll start by loading the required libraries.

library(caret)
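
    As a quick preview, below is a minimal sketch of fitting 'knnreg' and checking its accuracy; the generated data, the k value of 5, and the RMSE check are illustrative assumptions, not the tutorial's actual example.

# Generate sample regression data (illustrative)
set.seed(1)
x <- data.frame(x = runif(100, 0, 10))
y <- sin(x$x) + rnorm(100, sd = 0.2)

# Fit a KNN regression model with k = 5 neighbors
model <- knnreg(x, y, k = 5)

# Predict on the training data and check accuracy with RMSE
pred <- predict(model, x)
rmse <- sqrt(mean((y - pred)^2))
print(rmse)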