Kaggling: A Journey of Past Competitions — Part1

Here are the techniques I learnt from my past Kaggle competitions. Stay tuned with me for more.

1. Jigsaw Multilingual Toxic Comment Classification Competition

In the Jigsaw competition, cross-validation, preprocessing, and postprocessing played a major role.

  1. Pseudo labeling: we saw a performance improvement when we used test-set predictions as additional training data; the intuition is that it helps the model learn the test-set distribution. Using all test-set predictions as soft labels worked better than any other variant of pseudo-labeling (e.g., hard labels, confidence-thresholded PLs, etc.). Towards the end of the competition this gave a small but material boost on the LB. Pseudo labeling is best explained by Chris, from whom I learned a lot; a minimal sketch of the soft-label variant appears right after this list.
  2. Validation: we used k-fold CV with the validation set as a hold-out, but once we refined the test predictions and trained on pseudo-labels plus the validation set, the validation metric became noisy to the point where we relied primarily on the public LB score.
  3. Postprocessing: we used the history of our submissions to tweak the test-set predictions. Tracking the per-sample delta of predictions across successful submissions, averaging those deltas, and nudging the final predictions in the same direction paved the way for the winning solutions (see the second sketch below).
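
Below is a minimal sketch of the soft-label pseudo-labeling loop described above, assuming pre-computed features and a tiny PyTorch classifier; the feature dimensions, model, and training loop are placeholders for illustration, not the actual competition setup.

```python
import torch
import torch.nn as nn

# Illustrative sketch only: a tiny linear classifier over pre-computed features.
# Feature extraction (e.g. transformer embeddings) is assumed to happen elsewhere.
torch.manual_seed(0)
n_feat = 128
X_train = torch.randn(1000, n_feat)               # labelled training features (placeholder)
y_train = torch.randint(0, 2, (1000, 1)).float()  # binary toxicity labels (placeholder)
X_test = torch.randn(4000, n_feat)                # unlabelled test features (placeholder)

def make_model():
    return nn.Sequential(nn.Linear(n_feat, 64), nn.ReLU(), nn.Linear(64, 1))

def train(model, X, y, epochs=5):
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.BCEWithLogitsLoss()              # accepts soft (float) targets
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    return model

# 1) Train on labelled data only, then predict soft labels for ALL test rows.
base = train(make_model(), X_train, y_train)
with torch.no_grad():
    soft_labels = torch.sigmoid(base(X_test))     # keep probabilities, no thresholding

# 2) Retrain from scratch on train + test, using the soft labels as targets.
X_all = torch.cat([X_train, X_test])
y_all = torch.cat([y_train, soft_labels])
final = train(make_model(), X_all, y_all)
```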

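And a rough sketch of the submission-history postprocessing: take the per-sample deltas between successful submissions, average them, and nudge the latest predictions in that direction. The synthetic prediction arrays and the nudge factor are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch: nudge current test predictions using the history of
# submissions whose LB score improved. The arrays stand in for prediction
# columns loaded from past submission files; the nudge factor is assumed.
rng = np.random.default_rng(0)
n_test = 10
# Prediction history from three successively better submissions (placeholders).
sub_history = [np.clip(rng.random(n_test) + 0.02 * i, 0, 1) for i in range(3)]

# Per-sample deltas between consecutive successful submissions, averaged.
deltas = [sub_history[i + 1] - sub_history[i] for i in range(len(sub_history) - 1)]
mean_delta = np.mean(deltas, axis=0)

# Nudge the latest predictions a small step in the historical direction,
# keeping probabilities inside [0, 1].
nudge = 0.1   # assumed strength, tuned against the public LB in practice
final_preds = np.clip(sub_history[-1] + nudge * mean_delta, 0.0, 1.0)
print(final_preds)
```
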
In the next competition, reading the MRI data was a bit tedious, so preprocessing the data into constructed features and postprocessing the predictions played a major role; a short TL;DR closes the section.

  1. Preprocessing: adding a bias (offset) to certain columns of the test set to bring them closer to the train set. Linear models performed well in this competition, so it was expected that adding biases would help a lot (at least for linear models). There are many ways to figure out suitable biases: we minimized the Kolmogorov-Smirnov (KS) test statistic between the train and test distributions of each column; a sketch of this offset search appears right after this list. To know more about the KS test statistic, here is a notebook.
  2. Postprocessing: postprocessing used the same logic as preprocessing, i.e., calculating the KS statistic and finding the best-fitting offset, this time for the predictions. The effect of postprocessing was small, though: only about 5e-5 on both the public and private scores.
  3. TL;DR: Incremental PCA for 3D images; offsets for test features (like we did in ION). A short Incremental PCA sketch also follows below.
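
A minimal sketch of the KS-based offset search, assuming a single numeric train/test column and a simple grid of candidate offsets; the data and grid are placeholders.

```python
import numpy as np
from scipy.stats import ks_2samp

# Illustrative sketch: find the additive offset that makes a test column's
# distribution match the train column's, by minimising the KS statistic.
rng = np.random.default_rng(0)
train_col = rng.normal(loc=0.0, scale=1.0, size=5000)   # placeholder train column
test_col = rng.normal(loc=0.3, scale=1.0, size=5000)    # shifted test column

def best_offset(train_vals, test_vals, grid=np.linspace(-1, 1, 201)):
    """Return the offset that minimises the KS statistic between the two samples."""
    stats = [ks_2samp(train_vals, test_vals + b).statistic for b in grid]
    return grid[int(np.argmin(stats))]

b = best_offset(train_col, test_col)
test_col_adjusted = test_col + b
print(f"best offset: {b:+.3f}, "
      f"KS before: {ks_2samp(train_col, test_col).statistic:.4f}, "
      f"KS after: {ks_2samp(train_col, test_col_adjusted).statistic:.4f}")
```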

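And a sketch of Incremental PCA over flattened 3D volumes, which fits batch by batch so the whole image set never has to sit in memory; the volume shape, batch size, and loader are made up for illustration.

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

# Illustrative sketch: reduce flattened 3D volumes with IncrementalPCA,
# fitting batch by batch so the whole dataset never sits in memory at once.
rng = np.random.default_rng(0)
n_subjects, vol_shape = 200, (32, 32, 32)      # placeholder volume size
batch_size, n_components = 50, 20              # each batch must have >= n_components samples

def load_batch(indices):
    # Stand-in for reading a batch of MRI volumes from disk.
    return rng.random((len(indices), *vol_shape)).astype(np.float32)

ipca = IncrementalPCA(n_components=n_components)
for start in range(0, n_subjects, batch_size):
    idx = range(start, min(start + batch_size, n_subjects))
    batch = load_batch(idx)
    ipca.partial_fit(batch.reshape(len(batch), -1))   # flatten each 3D volume

# Project a batch of volumes into the reduced feature space.
features = ipca.transform(load_batch(range(batch_size)).reshape(batch_size, -1))
print(features.shape)   # (50, 20)
```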