Feature Crawler used for a Fraud Prevention competition
Updated Aug 2, 2018 - Python
In this project, we analyzed, explored, and processed the data, then developed and evaluated several classification and regression models to propose high-return, low-risk strategies for investors.
Analyzed the features of a breast cancer dataset and predicted the diagnosis using Random Forest and Gradient Boosting Machine (GBM) classifiers.
A prediction model that uses logistic regression and gradient boosting to classify population income.
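The description above pairs a linear baseline with a boosted-tree model for binary income classification. A minimal sketch of that comparison, assuming scikit-learn and a synthetic stand-in for the income data (the real dataset and features are not shown here):

```python
# Hypothetical sketch: logistic regression vs. gradient boosting
# on a synthetic binary classification problem standing in for income data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Linear baseline.
logreg = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Gradient boosting classifier for comparison.
gbm = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

print("logreg accuracy:", logreg.score(X_test, y_test))
print("gbm accuracy:", gbm.score(X_test, y_test))
```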
[SIGE-MII-UGR-2016-17] Kaggle competition: Titanic
Data cleaning and modeling approaches
Attrition Prediction using GBM (Classification)
(National Rank: 22) Analyze This '18, the American Express Data Science Competition
An extension of Py-Boost to probabilistic modelling
Kernels for machine learning problems
R package for automatic hyperparameter tuning and ensembles with deep learning, gradient boosting machines, and random forests. Powered by h2o.
Implementation of Decision Tree and Ensemble Learning algorithms in Python with numpy
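For a flavor of what a numpy-only ensemble implementation like the one above involves, here is a minimal sketch of gradient boosting for regression built from depth-1 "decision stump" base learners. All names are illustrative, not taken from that repository:

```python
# Minimal from-scratch gradient boosting (squared loss) with stump learners.
import numpy as np

def fit_stump(X, residual):
    """Find the single-feature threshold split minimizing squared error."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            if left.all() or (~left).all():
                continue
            pred = np.where(left, residual[left].mean(), residual[~left].mean())
            err = ((residual - pred) ** 2).sum()
            if best is None or err < best[0]:
                best = (err, j, t, residual[left].mean(), residual[~left].mean())
    return best[1:]  # (feature, threshold, left_value, right_value)

def stump_predict(stump, X):
    j, t, left_val, right_val = stump
    return np.where(X[:, j] <= t, left_val, right_val)

def gbm_fit(X, y, n_rounds=50, lr=0.1):
    """Each round fits a stump to the current residual, shrunk by lr."""
    pred = np.full(len(y), y.mean())
    stumps = []
    for _ in range(n_rounds):
        stump = fit_stump(X, y - pred)
        pred += lr * stump_predict(stump, X)
        stumps.append(stump)
    return (y.mean(), stumps, lr)

def gbm_predict(model, X):
    base, stumps, lr = model
    pred = np.full(len(X), base)
    for s in stumps:
        pred += lr * stump_predict(s, X)
    return pred
```

Because each stump minimizes squared error on the residual, the training error is non-increasing across rounds; shrinkage (`lr`) trades per-round progress for smoother fits.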
Blackbox feasibility prediction with machine learning to optimize a CMA-ES algorithm
Linear regression on features transformed by a gradient boosting machine
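The line above alludes to the leaf-index trick: a GBM's trees map each sample to leaf indices, which are one-hot encoded and fed to a linear model. A minimal sketch, assuming scikit-learn and synthetic data (the original repository's code is not shown here):

```python
# Hypothetical sketch: GBM leaf indices as features for linear regression.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import OneHotEncoder

X, y = make_regression(n_samples=500, n_features=8, noise=5.0, random_state=0)

# Fit the GBM, then read off which leaf each sample falls into per tree.
gbm = GradientBoostingRegressor(n_estimators=50, random_state=0).fit(X, y)
leaves = gbm.apply(X)  # shape (n_samples, n_estimators)

# One-hot encode the leaf indices and fit a linear model on top.
enc = OneHotEncoder()
X_leaves = enc.fit_transform(leaves)
linreg = LinearRegression().fit(X_leaves, y)
print("R^2 on transformed features:", linreg.score(X_leaves, y))
```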
An insight to analyzing Titanic survival using decision trees and ensemble methods
One Data Set with All Algorithms
A set of machine learning scripts written for the Kaggle competition "Flavours of Physics".