Discussion of article "Gradient boosting in transductive and active machine learning"

MetaQuotes (Moderator):

New article Gradient boosting in transductive and active machine learning has been published:

In this article, we will consider active machine learning methods on real data, as well as discuss their pros and cons. Perhaps you will find these methods useful and will include them in your arsenal of machine learning models. Transduction was introduced by Vladimir Vapnik, co-inventor of the Support Vector Machine (SVM).

Let us go straight to active learning and test its effectiveness on our data.

There are several active learning libraries for Python, the most popular being:

  • modAL is a simple, easy-to-learn package that acts as a wrapper around the popular machine learning library scikit-learn (the two are fully compatible). It provides the most common active learning methods.
  • Libact uses a multi-armed bandit strategy over existing query strategies to dynamically select the best query.
  • Alipy is a kind of laboratory from its providers, containing a large number of query strategies.

I have selected the modAL library as the most intuitive one and a good way to get acquainted with the active learning philosophy. It offers great freedom in designing models: you can combine its standard building blocks or create your own.
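
To make the modAL workflow concrete, here is a minimal, hypothetical sketch of a pool-based active learning loop with uncertainty sampling. The synthetic data, the RandomForestClassifier estimator, the size of the initial labelled set and the query budget are illustrative assumptions, not the setup used in the article:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from modAL.models import ActiveLearner
    from modAL.uncertainty import uncertainty_sampling

    # Toy data standing in for the article's dataset (assumption, for illustration only)
    X, y = make_classification(n_samples=500, n_features=10, random_state=0)
    X_initial, y_initial = X[:20], y[:20]   # small initial labelled set
    X_pool, y_pool = X[20:], y[20:]         # the rest acts as the unlabelled pool

    # Wrap a scikit-learn estimator into an active learner with a built-in query strategy
    learner = ActiveLearner(
        estimator=RandomForestClassifier(n_estimators=100, random_state=0),
        query_strategy=uncertainty_sampling,
        X_training=X_initial, y_training=y_initial
    )

    # Pool-based loop: query the most uncertain sample, "label" it, retrain, shrink the pool
    for _ in range(30):                     # arbitrary query budget
        query_idx, query_inst = learner.query(X_pool)
        learner.teach(X_pool[query_idx], y_pool[query_idx])
        X_pool = np.delete(X_pool, query_idx, axis=0)
        y_pool = np.delete(y_pool, query_idx, axis=0)

    print("Accuracy on the remaining pool:", learner.score(X_pool, y_pool))

In practice, the estimator, the query strategy and the labelling step would be replaced by whatever model and data the task requires; the loop structure itself stays the same.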

Let us consider the above-described process using the scheme below, which requires no further explanation:

Author: Maxim Dmitrievsky