Reverse Engineering Graph Convolutional Networks

This blog post summarises the paper “Simplifying Graph Convolutional Networks” [1], which reverse engineers Graph Convolutional Networks and distils them into a simplified linear model called Simple Graph Convolution (SGC). So, let us evolve Graph Convolutional Networks backward.

Graphs are pervasive models of structure. They are everywhere, from social networks to molecules in chemistry, and many kinds of data can be represented as graphs. However, applying machine learning to these structures did not come about the usual way. Most ideas in machine learning started as a small, simple model that was made more complex over time as the need arose: the Perceptron evolved into the Multi-Layer Perceptron, and simple image filters evolved into non-linear CNNs. Graph Convolutional Networks (GCNs), by contrast, were derived directly from existing ideas and had a more complex starting point. To demystify GCNs, the paper reverse engineers them and proposes a simplified linear model called Simple Graph Convolution (SGC). When applied, SGC gives performance comparable to GCNs and is faster even than FastGCN.
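As a preview of where the paper lands, below is a minimal sketch of the SGC idea: strip the non-linearities out of a GCN, pre-compute K steps of feature propagation with the normalised adjacency matrix, and train a plain linear classifier on the result. The notation (A, D, X) is defined in the next section; the toy graph, random features, and the use of scikit-learn's LogisticRegression here are illustrative assumptions, not the authors' code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def sgc_features(A, X, K=2):
    """Pre-compute S^K X, the only graph-dependent part of SGC."""
    n = A.shape[0]
    A_tilde = A + np.eye(n)                    # add self-loops
    deg = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    S = D_inv_sqrt @ A_tilde @ D_inv_sqrt      # symmetrically normalised adjacency
    for _ in range(K):                         # K propagation steps, no non-linearity
        X = S @ X
    return X

# Toy example: 4 nodes, 3 random features each, two classes (illustrative only).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.rand(4, 3)
y = np.array([0, 0, 1, 1])

X_sgc = sgc_features(A, X, K=2)
clf = LogisticRegression().fit(X_sgc, y)       # the "model" is just logistic regression
print(clf.predict(X_sgc))
```

Because the propagation S^K X contains no learned parameters, it can be computed once as a pre-processing step, which is where SGC gets its speed advantage.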

Basic Notations

Inputs to a Graph Convolutional Network are:

1. Node features (along with labels for the training nodes)

2. Adjacency matrix

Adjacency matrix: The adjacency matrix A is an n x n matrix, where n is the number of nodes, with a(i, j) = 1 if node i is connected to node j and a(i, j) = 0 otherwise. If an edge is weighted, then a(i, j) is the edge weight.

Diagonal degree matrix: The diagonal matrix D is an n x n matrix with d(i, i) equal to the sum of the ith row of the adjacency matrix, i.e., the degree of node i.

Input features: X is the input feature matrix of size n x d, where d is the number of features per node.
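To make these definitions concrete, here is a small sketch (using NumPy; the toy graph and the choice of d = 3 features are made up for illustration) that builds A, D, and X for a 4-node undirected graph.

```python
import numpy as np

# Toy undirected graph with n = 4 nodes and edges (0-1, 0-2, 1-2, 2-3).
n = 4
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

# Adjacency matrix A: a(i, j) = 1 if node i is connected to node j, else 0.
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1            # undirected graph, so A is symmetric

# Diagonal (degree) matrix D: d(i, i) = sum of the ith row of A.
D = np.diag(A.sum(axis=1))

# Input feature matrix X of size n x d (here d = 3 made-up features per node).
X = np.random.rand(n, 3)

print(A)
print(D)
print(X.shape)   # (4, 3)
```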

Let us see how GCNs actually work before reverse engineering them.

