Regression Example with XGBoost in R

    XGBoost stands for "Extreme Gradient Boosting" and is an implementation of the gradient boosting trees algorithm. It is a popular supervised machine learning method known for its computation speed, parallelization, and performance. XGBoost is an open-source software library, and you can use it in the R environment by installing the xgboost R package.
    In this tutorial, we'll briefly learn how to fit and predict regression data with the 'xgboost' function. The tutorial covers:
  1. Preparing the data
  2. Fitting the model and prediction
  3. Accuracy checking
  4. Source code listing
We'll start by loading the required libraries.

library(xgboost)   # gradient boosting model
library(caret)     # data splitting and evaluation helpers

Classification Example with Ridge Classifier in Python

   The Ridge Classifier, based on the Ridge regression method, converts the label data into [-1, 1] and solves the problem with a regression approach. The highest predicted value is taken as the target class, and for multiclass data multi-output regression is applied.
   In this tutorial, we'll briefly learn how to classify data by using Scikit-learn's RidgeClassifier class in Python. The tutorial covers:
  1. Preparing the data
  2. Training the model
  3. Predicting and accuracy check
  4. Iris dataset classification example
  5. Source code listing
   We'll start by loading the required libraries.

from sklearn.linear_model import RidgeClassifier
from sklearn.datasets import load_iris
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.model_selection import cross_val_score
from sklearn.metrics import confusion_matrix
from sklearn.metrics import classification_report
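
Below is a minimal sketch, under assumed settings, of how these pieces fit together: a synthetic dataset from make_classification, an 80/20 train-test split, and default RidgeClassifier parameters. The dataset size and split ratio are illustrative choices, not values from the tutorial itself.

# Synthetic multiclass data (illustrative parameters)
x, y = make_classification(n_samples=200, n_features=4, n_informative=3,
                           n_redundant=1, n_classes=3, random_state=1)

# Hold out 20 percent of the data for testing
xtrain, xtest, ytrain, ytest = train_test_split(x, y, test_size=0.2, random_state=1)

# Fit the Ridge classifier and predict the test set
model = RidgeClassifier(alpha=1.0)
model.fit(xtrain, ytrain)
ypred = model.predict(xtest)

# Check the accuracy of the predictions
print(confusion_matrix(ytest, ypred))
print(classification_report(ytest, ypred))
print(cross_val_score(model, x, y, cv=5))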

Regression Example with Linear SVR Method in Python

   Based on the support vector machines method, Linear SVR is an algorithm for solving regression problems. Linear SVR applies a linear kernel and works well with large datasets. Either the L1 or the L2 norm can be specified as the loss function in this model.

   In this tutorial, we'll briefly learn how to fit and predict regression data by using Scikit-learn's LinearSVR class in Python. The tutorial covers:
  1. Preparing the data
  2. Training the model
  3. Predicting and accuracy check
  4. Boston dataset prediction
  5. Source code listing
   We'll start by loading the required libraries.
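
A minimal sketch of the imports and a LinearSVR fit, assuming synthetic data from make_regression and illustrative parameter values, might look like this:

from sklearn.svm import LinearSVR
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic regression data (illustrative size and noise level)
x, y = make_regression(n_samples=200, n_features=5, noise=0.5, random_state=1)
xtrain, xtest, ytrain, ytest = train_test_split(x, y, test_size=0.2, random_state=1)

# Fit Linear SVR with the epsilon-insensitive (L1) loss
model = LinearSVR(loss="epsilon_insensitive", C=1.0, max_iter=10000)
model.fit(xtrain, ytrain)
ypred = model.predict(xtest)
print("MSE:", mean_squared_error(ytest, ypred))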

How to Build Variational Autoencoder and Generate Images in Python

     A classical autoencoder simply learns how to encode the input and decode the output from the given data, passing it through a latent space layer in between. With this method alone, the latent space is not structured, so training the parameters does not give the model much ability to generate new data.

    Variational autoencoders, on the other hand, take a statistical approach: the encoder learns the mean and standard deviation of the latent distribution. The latent mean and variance are updated during training, and sampling from this learned distribution is what improves the generator model.
    
    In this tutorial, we'll learn how to build a Variational Autoencoder (VAE) and generate images with Keras in Python. The tutorial covers:
  1. Preparing the data
  2. Defining the encoder
  3. Defining the decoder
  4. Defining the VAE model
  5. Generating images
  6. Source code listing  
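
As a preview of the encoder part, the piece that distinguishes a VAE from a classical autoencoder is the sampling (reparameterization) step between the encoder and decoder. The sketch below shows one way to write it in Keras, assuming 28x28 grayscale inputs and a latent dimension of 2; these sizes are illustrative, not fixed by the tutorial.

import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

latent_dim = 2  # assumed latent space size for illustration

class Sampling(layers.Layer):
    # Reparameterization trick: z = mean + exp(0.5 * log_var) * epsilon
    def call(self, inputs):
        z_mean, z_log_var = inputs
        epsilon = tf.random.normal(shape=tf.shape(z_mean))
        return z_mean + tf.exp(0.5 * z_log_var) * epsilon

# The encoder outputs a mean and a log-variance for the latent
# distribution, and the Sampling layer draws a latent vector z from it.
encoder_inputs = keras.Input(shape=(28, 28, 1))
x = layers.Flatten()(encoder_inputs)
x = layers.Dense(128, activation="relu")(x)
z_mean = layers.Dense(latent_dim, name="z_mean")(x)
z_log_var = layers.Dense(latent_dim, name="z_log_var")(x)
z = Sampling()([z_mean, z_log_var])
encoder = keras.Model(encoder_inputs, [z_mean, z_log_var, z], name="encoder")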

Regression Example with Nu Support Vector Regression Method in Python

   Based on the support vector machines method, Nu Support Vector Regression (NuSVR) is an algorithm for solving regression problems. The NuSVR algorithm uses the nu parameter in place of the epsilon parameter of the SVR method. Scikit-learn explains that the parameter nu is an upper bound on the fraction of training errors and a lower bound of the fraction of support vectors¹.

   In this tutorial, we'll briefly learn how to fit and predict regression data by using Scikit-learn's NuSVR class in Python. The tutorial covers:
  1. Preparing the data
  2. Training the model
  3. Predicting and accuracy check
  4. Boston dataset prediction
  5. Source code listing
   We'll start by loading the required libraries.
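
A minimal sketch of the imports and a NuSVR fit, again assuming synthetic data from make_regression and illustrative parameter values, might look like this:

from sklearn.svm import NuSVR
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic regression data (illustrative size and noise level)
x, y = make_regression(n_samples=200, n_features=5, noise=0.5, random_state=1)
xtrain, xtest, ytrain, ytest = train_test_split(x, y, test_size=0.2, random_state=1)

# nu bounds the fraction of training errors and support vectors
model = NuSVR(nu=0.5, C=1.0, kernel="rbf")
model.fit(xtrain, ytrain)
ypred = model.predict(xtest)
print("MSE:", mean_squared_error(ytest, ypred))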

Classification Example with Linear SVC in Python

   The Linear Support Vector Classifier (SVC) method applies a linear kernel function to perform classification, and it performs well with a large number of samples. Compared with the SVC model, Linear SVC has additional parameters such as the penalty normalization ('L1' or 'L2') and the loss function. The kernel cannot be changed in Linear SVC, because it is fixed to the linear kernel.

   In this tutorial, we'll briefly learn how to classify data by using Scikit-learn's LinearSVC class in Python. The tutorial covers:
  1. Preparing the data
  2. Training the model
  3. Predicting and accuracy check
  4. Iris dataset classification example
  5. Video tutorial
  6. Source code listing
   We'll start by loading the required libraries.

from sklearn.svm import LinearSVC
from sklearn.datasets import load_iris
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.model_selection import cross_val_score
from sklearn.metrics import confusion_matrix
from sklearn.metrics import classification_report
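
Below is a minimal sketch of how these imports come together on the Iris data; the split ratio and the parameter values are illustrative choices, not values from the tutorial itself.

# Load the Iris data and hold out 20 percent for testing
x, y = load_iris(return_X_y=True)
xtrain, xtest, ytrain, ytest = train_test_split(x, y, test_size=0.2, random_state=1)

# Fit the Linear SVC model and predict the test set
model = LinearSVC(penalty="l2", loss="squared_hinge", C=1.0, max_iter=10000)
model.fit(xtrain, ytrain)
ypred = model.predict(xtest)

# Check the accuracy of the predictions
print(confusion_matrix(ytest, ypred))
print(classification_report(ytest, ypred))
print(cross_val_score(model, x, y, cv=5))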