If you have read the previous post, you can probably guess the idea behind this new model. The forest gives it away. Just as a forest in nature is made up of many trees, Random Forest regression is a combination of many Decision Tree regressions: each tree makes its own prediction and the forest averages them into the final answer.
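
Here is a minimal sketch of what that looks like with scikit-learn's RandomForestRegressor. The position/salary numbers are just made up for the example, and n_estimators=100 is simply a common default, not a value from the original post.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Illustrative data: position level vs. salary in thousands (made-up values)
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8], [9], [10]])
y = np.array([45, 50, 60, 80, 110, 150, 200, 300, 500, 1000])

# n_estimators is the number of decision trees in the forest
regressor = RandomForestRegressor(n_estimators=100, random_state=0)
regressor.fit(X, y)

# The prediction is the average of the predictions of all 100 trees
print(regressor.predict([[6.5]]))
```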

Decision trees are used for many purposes, from searching and sorting to classification, but using them for regression is harder because a regression target can take infinitely many possible values. The math behind it is simple and can be pictured as a series of 'if' conditions: the tree splits the data into regions and predicts the average of the training values that fall into each region. This model is available in the sklearn.tree module. Take for example this data.
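
Since the original data isn't reproduced here, the sketch below uses the same made-up position/salary values as above with scikit-learn's DecisionTreeRegressor.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Same illustrative data as in the Random Forest sketch (made-up values)
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8], [9], [10]])
y = np.array([45, 50, 60, 80, 110, 150, 200, 300, 500, 1000])

regressor = DecisionTreeRegressor(random_state=0)
regressor.fit(X, y)

# A regression tree predicts the mean of the training targets in the matching
# leaf, so its output is piecewise constant rather than a smooth curve
print(regressor.predict([[6.5]]))
```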

This is the model to use when the features are non-linearly related to the target. The degree of the polynomial has to be provided when the model object is created, so finding it is a matter of trial and error if it is not known from the start.
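
One common way to set this up in scikit-learn is to expand the features with PolynomialFeatures and then fit an ordinary LinearRegression on top; the data and the choice of degree=4 below are only for illustration.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

# Illustrative non-linear data (made-up values)
X = np.array([[1], [2], [3], [4], [5], [6], [7], [8], [9], [10]])
y = np.array([45, 50, 60, 80, 110, 150, 200, 300, 500, 1000])

# degree is the trial-and-error parameter mentioned above
poly = PolynomialFeatures(degree=4)
X_poly = poly.fit_transform(X)   # adds x^2, x^3 and x^4 columns

model = LinearRegression()
model.fit(X_poly, y)

# New inputs must go through the same polynomial transformation
print(model.predict(poly.transform([[6.5]])))
```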

This is the same idea as Simple Linear Regression, but more powerful because it considers multiple features instead of just one. The model still serves the same purpose: fit a straight line through the training data, just using more than one feature to do it.
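
A minimal sketch with scikit-learn's LinearRegression is below; the three-feature samples are made-up numbers for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative data: each sample has three features (made-up numbers)
X = np.array([[165349, 136897, 471784],
              [162597, 151377, 443898],
              [153441, 101145, 407934],
              [144372, 118671, 383199],
              [142107,  91391, 366168]])
y = np.array([192261, 191792, 191050, 182902, 166187])

model = LinearRegression()
model.fit(X, y)

print(model.coef_)        # one coefficient per feature
print(model.intercept_)
print(model.predict([[150000, 120000, 400000]]))
```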
