Building Machine Learning Models

David Landup

Shallow Learning Algorithms

Let's start out with shallow learning - both as a sanity check and as a way to establish the baseline performance of simpler, rule-based algorithms. Although these models are simpler, they can be exceedingly powerful. The strongest contender is the Random Forest, which in many cases achieves Deep Learning-level results in a fraction of the time.

It'll be our main baseline performance indicator throughout the project! Besides the random forest regressor, we'll try out some other regressors as well.
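As a sketch of what that baseline could look like - using synthetic data as a stand-in, since the project's actual feature matrix isn't shown in this section - a `RandomForestRegressor` can be fit and scored in a few lines:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the project's feature matrix and price target
X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# 100 trees is scikit-learn's default and a reasonable starting point
forest = RandomForestRegressor(n_estimators=100, random_state=42)
forest.fit(X_train, y_train)
print(f"Baseline R^2: {forest.score(X_test, y_test):.3f}")
```

The held-out R^2 score this prints is the number the later models would have to beat.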

ElasticNet, Decision Trees and Random Forests

ElasticNets are linear regressors with L1 and L2 priors as regularizers. Linear regression won't get us very far if the relationships between variables are complex, though in our case even a linear regressor might perform decently, given how strongly some of the features correlate with the price. If fewer features correlated so highly, ElasticNet would likely perform significantly worse than it does here.
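A minimal ElasticNet sketch, again on synthetic data rather than the project's features; the `alpha` and `l1_ratio` values are illustrative, not tuned:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# l1_ratio blends the two penalties: 0 is pure L2 (Ridge), 1 is pure L1 (Lasso);
# alpha scales the overall regularization strength
enet = ElasticNet(alpha=1.0, l1_ratio=0.5, random_state=42)
enet.fit(X_train, y_train)
print(f"ElasticNet R^2: {enet.score(X_test, y_test):.3f}")
```

On strongly linear data like this, ElasticNet scores well; on real-world features with nonlinear relationships, the tree-based models tend to pull ahead.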

Decision Trees conceptually seem like a much better fit for our case, and a Random Forest is an ensemble of Decision Trees whose individual predictions are aggregated into a more precise value. Generally speaking, ensemble models perform better than single models, though this isn't always the case. Also generally speaking, Random Forests perform significantly better than single Decision Trees, and are an extremely powerful algorithm.
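To illustrate that single-tree vs. ensemble gap, here's a quick comparison on synthetic data; the exact scores depend on the dataset, but the ordering usually holds:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

tree = DecisionTreeRegressor(random_state=42).fit(X_train, y_train)
forest = RandomForestRegressor(n_estimators=100, random_state=42).fit(X_train, y_train)

# Averaging many decorrelated trees reduces variance, so the forest
# typically generalizes better than any one of its trees
print(f"Single tree R^2:   {tree.score(X_test, y_test):.3f}")
print(f"Random forest R^2: {forest.score(X_test, y_test):.3f}")
```

A single deep tree tends to overfit the noise in the training split, while averaging across the forest smooths those errors out.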


© 2013-2024 Stack Abuse. All rights reserved.