Diving into data

A blog on machine learning, data mining and visualization

Monthly Archives: December 2014

Selecting good features – Part IV: stability selection, RFE and everything side by side

December 20, 2014

In my previous posts, I looked at univariate methods, linear models and regularization, and random forests for feature selection. In this post, I'll look at two other methods: stability selection and recursive feature elimination (RFE), which can both be considered wrapper methods. They … Continue reading →
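
For a quick taste of what the post covers, here is a minimal sketch (not the post's own code) of RFE and a hand-rolled stability-selection loop, assuming scikit-learn; the synthetic dataset, estimators and parameters below are illustrative choices, not the ones used in the post.

    # Minimal sketch, assuming scikit-learn: recursive feature elimination (RFE)
    # plus a simple stability-selection loop; dataset and parameters are illustrative.
    import numpy as np
    from sklearn.datasets import make_friedman1
    from sklearn.feature_selection import RFE
    from sklearn.linear_model import Lasso, LinearRegression

    # Friedman #1 data: only the first 5 of the 10 features are informative.
    X, y = make_friedman1(n_samples=500, n_features=10, random_state=0)

    # RFE: repeatedly fit the estimator and drop the weakest feature
    # until only n_features_to_select remain.
    rfe = RFE(LinearRegression(), n_features_to_select=5).fit(X, y)
    print("RFE ranking (1 = selected):", rfe.ranking_)

    # Stability selection: fit a sparse model on many random subsamples and
    # count how often each feature ends up with a non-zero coefficient.
    rng = np.random.RandomState(0)
    counts = np.zeros(X.shape[1])
    n_runs = 100
    for _ in range(n_runs):
        idx = rng.choice(len(X), size=len(X) // 2, replace=False)
        counts += Lasso(alpha=0.05).fit(X[idx], y[idx]).coef_ != 0
    print("Stability scores:", counts / n_runs)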

Posted in Feature selection, Machine learning | Replies: 45

Selecting good features – Part III: random forests

December 1, 2014

In my previous posts, I looked at univariate feature selection and linear models and regularization for feature selection. In this post, I'll discuss random forests, another popular approach for feature ranking. Random forest feature importance: Random forests are among the most popular … Continue reading →
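
For a flavour of the impurity-based importance the post starts from, here is a minimal sketch (again not the post's own code), assuming scikit-learn; the dataset and forest settings are illustrative assumptions.

    # Minimal sketch, assuming scikit-learn: impurity-based ("mean decrease impurity")
    # feature importances from a random forest; settings are illustrative.
    from sklearn.datasets import make_friedman1
    from sklearn.ensemble import RandomForestRegressor

    X, y = make_friedman1(n_samples=500, n_features=10, random_state=0)

    # Each tree records how much every split on a feature reduces impurity;
    # feature_importances_ averages those reductions over the whole forest.
    forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    for i, imp in sorted(enumerate(forest.feature_importances_),
                         key=lambda t: t[1], reverse=True):
        print(f"feature {i}: importance {imp:.3f}")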

Posted in Feature selection | Replies: 57

Recent Posts

  • Monotonicity constraints in machine learning
  • Random forest interpretation – conditional feature contributions
  • Histogram intersection for change detection
  • Who are the best MMA fighters of all time? A Bayesian study
  • First Estonian Machine Learning Meetup

Archives

  • September 2018
  • October 2016
  • February 2016
  • December 2015
  • November 2015
  • October 2015
  • August 2015
  • June 2015
  • February 2015
  • December 2014
  • November 2014
  • October 2014
