Vision Transformers: Beginning of the end for CNNs?

1. Introduction Natural Language Processing (NLP) has popularised self-attention-based Transformers as the go-to architecture for scalable NLP tasks. Transformers have largely replaced RNN (Recurrent Neural Network) based architectures in NLP. Recently, Deep Learning...
Interpretability in AI

In this post we will cover the concept of model interpretability. We will discuss its growing importance as AI-driven systems are increasingly adopted in critical, high-impact scenarios. After the introduction, we will look at some popular approaches...
Deep Dive into Masked Autoencoder (MADE)

In this post I will talk about the Masked Autoencoder for Distribution Estimation (MADE), introduced in a 2015 paper as linked above. I will follow the implementation from UC Berkeley's Deep Unsupervised Learning course, which can be found here....
Introduction to Deep Reinforcement Learning

1. Introduction Reinforcement Learning (RL) is a subfield of Machine Learning and one of the fastest-growing disciplines helping make AI practical. Combining Deep Learning with Reinforcement Learning has led to many significant advances that are increasingly getting...