The greatest boosting algorithm so far
What do you like best?
It's the best-performing stand-alone algorithm (not counting deep learning algorithms, which are a whole other field), famous for winning many online machine learning competitions. It runs fast and performs better than bagging algorithms because each new tree it builds learns from the mistakes of the previous ones. It can also be tuned for various metrics, so if you want high recall, you can get it with the help of GridSearchCV, as sketched below. It is very efficient compared to the famous Random Forest algorithm.
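For example, here is a minimal sketch of what that tuning might look like, assuming scikit-learn and the xgboost Python package are installed; the synthetic data from make_classification is just a placeholder for a real dataset:

```python
# A minimal sketch of tuning XGBoost for recall with GridSearchCV.
# The data and parameter grid are illustrative assumptions only.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

param_grid = {
    "max_depth": [3, 5, 7],
    "learning_rate": [0.01, 0.1],
    "n_estimators": [100, 300],
}

# scoring="recall" makes the search pick the parameter combination
# that maximizes recall instead of the default accuracy.
search = GridSearchCV(XGBClassifier(eval_metric="logloss"),
                      param_grid, scoring="recall", cv=5)
search.fit(X_train, y_train)
print(search.best_params_, search.best_score_)
```

Any other scikit-learn scorer name (e.g. "f1" or "precision") can be dropped into the scoring argument the same way.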
What do you dislike?
That it is not part of a bigger distribution such as Anaconda, so you have to install it separately. Also, its greatness comes at the cost of overfitting, just like deep neural networks: it learns the training data so well that, after hyperparameter tuning, it overfits more easily than other algorithms.
Recommendations to others considering the product:
Be careful when tuning hyperparameters; the model can overfit even when the train and test data are properly separated. One common safeguard is sketched below.
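As a hedge against that overfitting, here is a sketch (my own setup, not something from the review itself) of holding out a validation set and letting early stopping cut training off when validation loss stops improving:

```python
# A sketch of curbing overfitting with a held-out validation set and
# early stopping; the data and parameter values are assumptions.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

model = XGBClassifier(
    n_estimators=1000,         # upper bound; early stopping trims it
    learning_rate=0.05,
    max_depth=4,               # shallower trees regularize
    early_stopping_rounds=20,  # stop when validation loss stalls
    eval_metric="logloss",
)
model.fit(X_train, y_train, eval_set=[(X_valid, y_valid)])
print(model.best_iteration)
```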
What problems are you solving with the product? What benefits have you realized?
I am solving machine learning problems with XGBoost. It learns and performs very well, both on performance metrics such as accuracy and F-score and in terms of training time.
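For completeness, a small self-contained sketch of checking those two metrics with scikit-learn (the synthetic dataset is again just a stand-in):

```python
# A sketch (assumed setup, not from the review) of evaluating the two
# metrics mentioned above: accuracy and F-score on a held-out test set.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

model = XGBClassifier(eval_metric="logloss").fit(X_train, y_train)
y_pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, y_pred))
print("F1:      ", f1_score(y_test, y_pred))
```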