Sowing a decision tree

This is my first post - wish me luck! 


OK! Let me speak plainly first - it has gotten so much easier to do data science (duh! look at all the .fit() and .predict() calls that have made our lives easy!) that we have forgotten, or simply don't care, about the math that drives some of these predictions. I can vouch for how many times my brain has frozen on even the simplest of concepts (err... what is supervised learning? Just kidding... but can you tell me what entropy is?). This is one such post! I decided to write about decision trees because I use them so often, and frankly it's embarrassing to know how to drive but not know the primary difference between electric and gas cars! So, getting back to decision trees - I will try to explain the concept and give an overview of the technicalities in layman's terms. If you are not familiar with the following terms, I suggest you google them before reading further: predictive modeling, classification in machine learning, and basic probability.



Tree-based models have been common and widely used for some time now. They have evolved a great deal over the years and have only gotten better with each iteration. A decision tree is a supervised learning algorithm: it splits the data based on certain conditions, and those splits are then used to make predictions. In general, these models are not hard to understand and can be explained to a non-technical audience fairly easily. Decision trees come in two flavors, classification and regression, popularly known as CART (Classification and Regression Trees) models.
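
To make the .fit()/.predict() workflow above concrete, here is a minimal sketch of training a classification tree. It assumes scikit-learn and uses its built-in iris dataset purely for illustration; the dataset and parameter choices are my own, not anything prescribed in this post.

```python
# Minimal sketch: fit a CART classifier and score it on held-out data.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# During fit(), the tree learns a series of if/else splits on the features.
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X_train, y_train)

# Each prediction walks a new sample down those splits to a leaf.
print(clf.score(X_test, y_test))  # accuracy on the held-out test set
```

For a regression problem, the same pattern works with DecisionTreeRegressor instead of DecisionTreeClassifier.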


Limitations






Construction of a decision tree

Decision Tree Pruning



Decision Tree Based Techniques



Decision Tree Parameters



Helpful Links

1. How to calculate ideal Decision Tree depth without overfitting?
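For the question above, one common approach (not necessarily what the linked post describes) is to treat the depth as a hyperparameter and let cross-validation pick it. Below is a hedged sketch assuming scikit-learn, again using the iris dataset only as a stand-in.

```python
# Sketch: choose max_depth by cross-validation instead of guessing it.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Try a range of depths; the one with the best cross-validated score wins.
search = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid={"max_depth": range(1, 11)},
    cv=5,
)
search.fit(X, y)
print(search.best_params_)  # the depth that scored best in cross-validation
```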