
Automating Kaggle II: Feature Engineering is Where it’s At


Kaggle competitions push the limits of predictive modeling. When skilled, intelligent people compete and collaborate to solve difficult problems, wonderful things happen. We learn how to engineer better features and how to ensemble predictive models more effectively.

I’m excited to be part of building a product and platform that constructs, tunes, and combines predictive models to produce the best possible results. We are also in the early stages of building automated feature engineering capabilities.
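To make "combining predictive models" concrete, here is a toy sketch of one common ensembling idea: blending the validation probabilities of a few off-the-shelf scikit-learn models. The synthetic dataset and the particular models are illustrative assumptions only, not a description of how our platform actually works under the hood.

```python
# Toy ensemble: average the predicted probabilities of several models.
# The data and model choices here are illustrative, not our platform's internals.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a competition dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

models = [
    LogisticRegression(max_iter=1000),
    RandomForestClassifier(n_estimators=200, random_state=0),
    GradientBoostingClassifier(random_state=0),
]

# Fit each model and blend their validation probabilities by simple averaging.
blended = np.mean(
    [m.fit(X_train, y_train).predict_proba(X_valid)[:, 1] for m in models],
    axis=0,
)
print("Blended validation AUC:", roc_auc_score(y_valid, blended))
```

Real automated systems go further, tuning hyperparameters and weighting or stacking models rather than averaging them naively, but the core idea is the same: diverse models, combined, usually beat any single one.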

Here are some results of our platform on recent Kaggle competitions:

Homesite
Our push-button automated modeling platform scored 134th on the private leaderboard, beating over 92% of the data scientists and teams who had months to engineer features and build models.

Keep in mind that these scores are the result of spending minutes, not hours or days, of user time to train models. In addition, we're blindly modeling the data: no thoughtful, sophisticated feature engineering involved. With a little help from a skilled data scientist, we know the ranking would be even higher. Our vision is a tool that saves data scientists time on model training so they can focus on critical feature engineering, while also helping less experienced practitioners build great models.

BNP
BNP is still a live competition. Once again, the platform is blindly beating about 92% of competitors:

We have released a free beta version of our platform to gather feedback from the public. It builds great models very quickly, but we're only unlocking the full "beast mode" modeling capabilities (with the results you see above) for commercial users. We'd love for you to try automation with our free beta and tell us what you think.

Jason Maughan - Chief Analytics Officer | Data Scientist
