Resources for essential topics in Machine Learning and Deep Learning, including Natural Language Processing (NLP), Computer Vision (CV), Reinforcement Learning (RL), Self-Supervised Learning (SSL), etc.

Read more »

Sources:

  1. The spelled-out intro to neural networks and backpropagation: building micrograd
  2. Stanford CS231n, Lecture 4

Other useful resources:

  1. StatQuest's Neural Networks videos

  2. Efficient BackProp. You need to download it via

    wget http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf

The Python script and Jupyter notebook used in this article can be found here.

This article is a step-by-step explanation of neural networks, which are extensively used in machine learning. It only covers the most basic case, where the input of a neuron is a vector (1-D tensor) and the output of a neuron is a scalar (0-D tensor). But the idea holds for higher-dimensional tensors; please refer to Derivatives, Backpropagation, and Vectorization for details.
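To make that scope concrete, here is a minimal sketch (not the article's notebook): a single neuron with a vector input and a scalar output, together with the gradients that backpropagation gives for it. The sigmoid activation and squared-error loss used here are assumptions chosen only for illustration.

    import numpy as np

    np.random.seed(0)

    x = np.random.randn(3)        # input: a vector (1-D tensor)
    w = np.random.randn(3)        # weights of the neuron
    b = 0.1                       # bias
    y_true = 1.0                  # target value (illustrative)

    # Forward pass: z = w . x + b, a = sigmoid(z), L = (a - y_true)^2
    z = np.dot(w, x) + b
    a = 1.0 / (1.0 + np.exp(-z))  # output: a scalar (0-D tensor)
    loss = (a - y_true) ** 2

    # Backward pass via the chain rule
    dL_da = 2.0 * (a - y_true)
    da_dz = a * (1.0 - a)         # derivative of the sigmoid
    dL_dz = dL_da * da_dz
    dL_dw = dL_dz * x             # gradient w.r.t. the weight vector
    dL_db = dL_dz                 # gradient w.r.t. the bias

    print("loss:", loss)
    print("dL/dw:", dL_dw, "dL/db:", dL_db)

A gradient-descent step would then move w and b a small amount in the directions of -dL_dw and -dL_db.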

Read more »

Ref:

  1. EE 376A: Information Theory. Winter 2018. Lecture 6. - Stanford University
  2. EE 376A: Information Theory. Winter 2017. Lecture 4. - Stanford University
  3. Elements of Information Theory
Read more »