Anomaly Detection using Autoencoders

Learn what autoencoders are, how they work, and how they are used, then implement an autoencoder for anomaly detection to perform fraud detection in TensorFlow
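The common thread in the articles below is one idea: an autoencoder trained only on normal data reconstructs it well, so a large reconstruction error flags a likely anomaly (e.g. a fraudulent transaction). A minimal NumPy sketch of that scoring loop, using a fixed linear encoder/decoder as a stand-in for a trained TensorFlow model and fully synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Normal "transactions" lie near a 1-D line in 4-D feature space.
direction = np.array([0.5, 0.5, 0.5, 0.5])
t = rng.normal(size=(200, 1))
normal = t * direction + rng.normal(scale=0.05, size=(200, 4))

# Anomalies sit far off that line.
anomalies = rng.normal(loc=[3.0, -3.0, 3.0, -3.0], scale=0.05, size=(5, 4))

# Stand-in for a trained autoencoder: encode each sample to a single
# number (the bottleneck), then decode back. A real model learns this.
def reconstruct(X):
    codes = X @ direction / (direction @ direction)  # encoder
    return np.outer(codes, direction)                # decoder

def reconstruction_error(X):
    return np.mean((X - reconstruct(X)) ** 2, axis=1)

# Flag anything whose error exceeds the worst error seen on normal data.
threshold = reconstruction_error(normal).max()
flags = reconstruction_error(anomalies) > threshold
```

In practice the threshold is tuned on a validation set (e.g. a high percentile of the normal-data error) to trade off false positives against missed anomalies.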

Autoencoders and The Denoising Feature: From Theory to Practice…

This tutorial will be a theoretical and practical introduction to the concept of Autoencoders and more specifically, to Denoising…

Credit Card Customer Clustering with Autoencoder and K-means

A further dig into business intelligence for customer marketing with improved models. In this article, we continue the journey on customer clustering using an autoencoder and k-means. As usual, it is split into four parts.

Building Autoencoders on Sparse, One Hot Encoded Data

A hands-on review of loss functions suitable for embedding sparse one-hot-encoded data in PyTorch. In this article, I’ll briefly discuss One Hot Encoding (OHE) data and general autoencoders.
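One point that review turns on is the choice of reconstruction loss for one-hot features: treating each categorical feature as a distribution and using softmax plus cross-entropy usually suits sparse one-hot data better than plain MSE. A small NumPy illustration with hypothetical logits (the cited article works in PyTorch; the arithmetic is the same):

```python
import numpy as np

# One-hot-encoded target: a single categorical feature with 4 levels.
target = np.array([0.0, 0.0, 1.0, 0.0])

# Hypothetical decoder logits for that feature.
logits = np.array([0.1, -1.2, 2.3, 0.4])

# Softmax turns the logits into a probability distribution over levels.
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# Cross-entropy penalizes low probability on the true level.
cross_entropy = -np.sum(target * np.log(probs))

# The alternative being compared: mean squared error on the same outputs.
mse = np.mean((probs - target) ** 2)
```

With many one-hot features, the decoder typically gets one softmax group per original categorical column, and the losses are summed.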

Autoencoders for Dimensionality Reduction

In this post, we will provide a concrete example of how we can apply Autoencoders for Dimensionality Reduction. We will work with Python and TensorFlow 2.x.

Autoencoders: Overview of Research and Applications

Autoencoders are neural network models designed to learn complex non-linear relationships between data points. In this post, I will try to give an overview of the various types of autoencoders developed over the years and their applications.

Deep Stacking Network (DSN)

Neural networks have been around for a while, but capabilities such as automatic feature extraction have made their use far more viable

The Deep Autoencoder in Action: Digit Reconstruction

I am going to use the MNIST handwritten digit dataset, in which each image sample is 28 by 28 pixels. Each image is then flattened, so we have 784 values to represent it.
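The flattening step described above is a single reshape in each direction. A sketch with a fake digit in place of a real MNIST sample:

```python
import numpy as np

# A fake grayscale "digit" standing in for an MNIST sample (28 x 28 pixels).
image = np.zeros((28, 28), dtype=np.float32)
image[10:18, 10:18] = 1.0

# Flatten to a 784-dimensional vector, as a dense autoencoder expects.
flat = image.reshape(784)  # 28 * 28 = 784 values

# After the decoder produces 784 outputs, reshape back for display.
restored = flat.reshape(28, 28)
```

Reshaping is lossless, so `restored` is pixel-for-pixel identical to `image`; only the layout changes.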

Create your own Mini-Word-Embedding from Scratch.

On a lighter note, the embedding of a particular word is nothing but a dense vector representation of that word in a lower dimension, derived from its higher-dimensional one-hot form.
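Concretely, multiplying a one-hot vector by an embedding matrix just selects one row, so an embedding layer is a table lookup. A toy sketch with a made-up three-word vocabulary and random (untrained) weights:

```python
import numpy as np

rng = np.random.default_rng(0)

vocab = ["cat", "dog", "car"]  # toy vocabulary (one-hot dimension = 3)
index = {word: i for i, word in enumerate(vocab)}

embedding_dim = 2  # the lower target dimension
# One dense row per word. In a real model these weights are learned;
# here they are random placeholders.
E = rng.normal(size=(len(vocab), embedding_dim))

def embed(word):
    # One-hot(word) @ E reduces to selecting the word's row of E.
    return E[index[word]]

vec = embed("dog")
```

Training (e.g. with an autoencoder or a skip-gram objective) then nudges these rows so that related words end up with nearby vectors.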

Artificial Neural Networks- An intuitive approach Part 5

A continuation of an earlier article. A machine learning model can contain millions of parameters, or dimensions, so the cost function has to be optimized over millions of dimensions. The goal is to find a global minimum of the function, which gives the best possible parameter values for the chosen evaluation metric.
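The optimization described above is typically done by gradient descent: repeatedly step each parameter against the gradient of the cost. A minimal sketch on a made-up convex quadratic cost in three dimensions; the same update rule applies unchanged in millions of dimensions:

```python
import numpy as np

# Toy cost: mean squared distance from a target parameter vector.
target = np.array([3.0, -2.0, 0.5])

def cost(theta):
    return np.mean((theta - target) ** 2)

def grad(theta):
    return 2.0 * (theta - target) / theta.size

theta = np.zeros(3)  # start far from the minimum
learning_rate = 0.5
for _ in range(200):
    theta = theta - learning_rate * grad(theta)  # gradient descent step
```

Because this toy cost is convex, the iterates converge to the unique global minimum; real neural-network losses are non-convex, which is why reaching a global minimum is the hard part.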

De-Blurring images using Convolutional Neural Networks with code

Ever wished you could bring low-quality pictures lacking detail back to life? Deep Learning comes to…

Contrastive Learning: Effective Anomaly Detection with Auto-Encoders

How to improve autoencoder performance on anomaly detection tasks with contrastive learning and Keras

Reconstruct corrupted data using Denoising Autoencoder(Python code)

This article will help you demystify denoising with autoencoders in a few minutes!
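The denoising trick is entirely in how the training pairs are built: the input is a corrupted copy of each sample, but the target is the clean original. A NumPy sketch of that data preparation, with synthetic samples standing in for images (the `model.fit` line is illustrative, not a real model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Clean training samples (e.g. flattened images, values in [0, 1]).
clean = rng.uniform(0.0, 1.0, size=(100, 16))

# Corrupt the inputs with Gaussian noise; the target stays clean.
noise_level = 0.2
noisy = clean + rng.normal(scale=noise_level, size=clean.shape)

# A denoising autoencoder is trained on (noisy, clean) pairs:
#     model.fit(noisy, clean)   # rather than model.fit(clean, clean)
# so it must learn to strip the noise while reconstructing the signal.
```

Other corruption schemes work the same way, e.g. masking noise that zeroes out a random subset of inputs instead of adding Gaussian noise.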

Anomaly detection with Autoencoders

Anomalies occur rarely in systems. Validation layers stand guard over correctness by catching anomalies and eliminating them from the process.

Unconventional Deep Learning Techniques for Tabular Data

Learn Dimensionality Reduction, Denoising and Synthetic Data Generation for Tabular Data using Deep Learning. In recent years, Deep Learning has made huge strides.

An Interactive Visualization of Autoencoders, Built with Tensorflow.js

Introducing Anomagram - An interactive tool that lets you train and evaluate an autoencoder for the task of anomaly detection on ECG data. It lets you visualize…