Image classification sample solution for Kaggle competition

We’re doing our best to make our mark in state‑of‑the‑art data science. For many years we have been competing in machine learning challenges, gaining both conceptual and technical expertise. Now we have decided to open source an end‑to‑end image classification sample solution for the ongoing Cdiscount Kaggle competition. In doing so, we hope to encourage data scientists, both seasoned and new, to compete on Kaggle and test their neural nets.


Competing in machine learning challenges is fun, but also a lot of work. Participants must design and implement end‑to‑end solutions, test neural architectures and run dozens of experiments to train deep models properly. But this is only a small part of the story. Strong Kaggle competition solutions also feature advanced data pre‑ and post‑processing, ensembling and validation routines, to name just a few. At this point, competing effectively becomes complex and difficult to manage, which may discourage some data scientists from rolling up their sleeves and jumping in. We believe that Kaggle is a great platform for advanced data science training at any level of expertise. So great, in fact, that we felt compelled to open source an image classification sample solution to the currently open Cdiscount challenge. Below, we describe what we have prepared.


Image classification sample solution overview

When we say our solution is end‑to‑end, we mean that we start with raw input data downloaded directly from the Kaggle site (in BSON format) and finish with a ready‑to‑upload submission file. Here are the components:

  1. data loader
    1. a custom Keras iterator for the BSON file
    2. a label encoder mapping product IDs to class indices, to fit the Keras API
  2. neural network training on n classes and k examples per class, using the following architectures:
    1. MobileNet (Howard et al. ’17)
    2. Inception v3
    3. ensembles of the models mentioned above
  3. model predictions
    1. single-model prediction
    2. ensembling (by averaging) over multiple models
  4. submission generation
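To illustrate the data loader's second component: Cdiscount category IDs are arbitrary large integers, while the Keras API expects contiguous class indices 0..n-1, so the label encoder maps between the two. A minimal sketch, not the repository's actual implementation (the class name and sample IDs below are illustrative):

```python
class ProductLabelEncoder:
    """Maps raw category IDs to contiguous class indices and back."""

    def __init__(self, category_ids):
        # Sort for a deterministic mapping across runs.
        unique_ids = sorted(set(category_ids))
        self.id_to_index = {cid: i for i, cid in enumerate(unique_ids)}
        self.index_to_id = {i: cid for cid, i in self.id_to_index.items()}

    def encode(self, category_id):
        return self.id_to_index[category_id]

    def decode(self, class_index):
        return self.index_to_id[class_index]

# Illustrative category IDs; the real ones come from the BSON records.
encoder = ProductLabelEncoder([1000012776, 1000004141, 1000015539])
```

`decode` is what turns the network's argmax back into a product category ID when generating the submission file.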


For instance, image classification with a MobileNet ensemble combines the models by averaging their per‑class probability predictions.
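A minimal numpy sketch of that averaging step, with toy callables standing in for trained MobileNets (this is not the repository's actual ensemble class):

```python
import numpy as np

def ensemble_predict(models, batch):
    """Average the (batch, n_classes) probability outputs of several models."""
    predictions = [model(batch) for model in models]
    return np.mean(predictions, axis=0)

# Toy stand-ins for trained models: each returns fixed class probabilities.
model_a = lambda batch: np.array([[0.8, 0.2]])
model_b = lambda batch: np.array([[0.6, 0.4]])

averaged = ensemble_predict([model_a, model_b], batch=None)
```

Averaging probabilities rather than hard labels lets confident models outweigh uncertain ones in the final prediction.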


Watch this video for a quick overview of the code.


What if I want to use my own network architecture?

You are encouraged to replace our network with your own. Below you can find a short snippet of code that you simply place in the file:
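The repository's actual file and hook names aren't reproduced here, but the plug-in pattern looks roughly like the sketch below; the registry, decorator and builder names are all illustrative assumptions, and a plain dict stands in for a compiled Keras model so the sketch stays dependency-free:

```python
# A registry of architecture builders the training pipeline can look up by name.
ARCHITECTURES = {}

def register(name):
    """Decorator that adds a model-builder function to the registry."""
    def wrapper(builder):
        ARCHITECTURES[name] = builder
        return builder
    return wrapper

@register('my_net')
def build_my_net(input_shape, n_classes):
    # Build and return your model here; the dict is a stand-in for a
    # compiled Keras model.
    return {'input_shape': input_shape, 'n_classes': n_classes}

# The pipeline would then pick the architecture by name
# (180x180 RGB images and 5,270 categories, as in the Cdiscount data):
model = ARCHITECTURES['my_net']((180, 180, 3), n_classes=5270)
```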

Alternatively, we would suggest extending BasicKerasClassifier or KerasDataLoader with custom augmentations, learning rate schedules and other tricks of your choice.
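As one example of such a trick, here is a step-decay learning rate schedule of the kind you could pass to Keras's LearningRateScheduler callback; the constants are illustrative defaults, not values from the repository:

```python
def step_decay(epoch, initial_lr=0.01, drop=0.5, epochs_per_drop=10):
    """Halve the learning rate every `epochs_per_drop` epochs."""
    return initial_lr * (drop ** (epoch // epochs_per_drop))

# With Keras this would be wired in roughly as:
#   from keras.callbacks import LearningRateScheduler
#   model.fit(..., callbacks=[LearningRateScheduler(step_decay)])
```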

How to get started?

To start using our pipeline, follow these steps:

  1. download the source code from
  2. follow the README instructions to run the code
  3. modify this image classification sample solution to fit your needs
  4. have fun competing on Kaggle!
Image classification sample solution running in Neptune: live charts present log‑loss and accuracy for the running experiment.


Final remarks

Feel free to use, modify and run this code for your own purposes. We ran many of our experiments on Neptune, which you may find useful for managing your own.
